Sample records for flow rank based

  1. An auxiliary method to reduce potential adverse impacts of projected land developments: subwatershed prioritization.

    PubMed

    Kalin, Latif; Hantush, Mohamed M

    2009-02-01

    An index based method is developed that ranks the subwatersheds of a watershed based on their relative impacts on watershed response to anticipated land developments, and then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrologic model show that computed low- and high-flow frequencies at the main outlet increase significantly with the projected landscape changes in the watershed. The developed index is utilized to prioritize areas in the urbanizing watershed based on their contributions to alterations in the magnitude of selected flow characteristics at two spatial resolutions. The low-flow measure, 7Q10, rankings are shown to mimic the spatial trend of groundwater recharge rates, whereas average annual maximum daily flow, QAMAX, and average monthly median of daily flows, QMMED, rankings are influenced by both recharge and proximity to watershed outlet. Results indicate that, especially with the higher resolution, areas having quicker responses are not necessarily the more critical areas for high-flow scenarios. Subwatershed rankings are shown to vary slightly with the location of water quality/quantity criteria enforcement. It is also found that rankings of subwatersheds upstream from the site of interest, which could be the main outlet or any interior point in the watershed, may be influenced by the time scale of the hydrologic processes.

  2. PageRank versatility analysis of multilayer modality-based network for exploring the evolution of oil-water slug flow.

    PubMed

    Gao, Zhong-Ke; Dang, Wei-Dong; Li, Shan; Yang, Yu-Xuan; Wang, Hong-Tao; Sheng, Jing-Ran; Wang, Xiao-Fan

    2017-07-14

    Numerous irregular flow structures exist in complicated multiphase flow and give rise to many disparate spatial dynamical flow behaviors. Vertical oil-water slug flow continually attracts considerable research interest on account of its significant importance. Based on the spatial transient flow information acquired through our designed double-layer distributed-sector conductance sensor, we construct a multilayer modality-based network to encode the intricate spatial flow behavior. In particular, we calculate the PageRank versatility and the multilayer weighted clustering coefficient to quantitatively explore the inferred multilayer modality-based networks. Our analysis allows characterizing the complicated evolution of oil-water slug flow, from the initial formation of oil slugs, to the subsequent collision and coalescence among oil slugs, and then to the dispersed oil bubbles. These properties render our developed method particularly powerful for mining the essential flow features from the multilayer sensor measurements.

  3. Adaptive protection algorithm and system

    DOEpatents

    Hedrick, Paul [Pittsburgh, PA]; Toms, Helen L [Irwin, PA]; Miller, Roger M [Mars, PA]

    2009-04-28

    An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
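
    The patent abstract above describes the idea only at a high level. As a rough, hypothetical illustration of "trace the flow, rank each breaker, derive trip set points from rank", the sketch below ranks the breakers of a toy radial feeder by the load served downstream of each; the graph, load values, and the 1.25x margin rule are invented for illustration and are not the patented algorithm.

```python
# Hypothetical sketch of rank-based trip set points for a radial feeder.
# The feeder graph, loads, and the 1.25x margin rule are illustrative assumptions.
import networkx as nx

feeder = nx.DiGraph()                      # edges point in the direction of power flow
feeder.add_edges_from([("sub", "b1"), ("b1", "b2"), ("b1", "b3"), ("b3", "b4")])
load_kw = {"b2": 40.0, "b3": 10.0, "b4": 25.0}   # load connected at each node

def downstream_load(node):
    """Load served through `node`: its own load plus everything fed below it."""
    nodes = nx.descendants(feeder, node) | {node}
    return sum(load_kw.get(n, 0.0) for n in nodes)

breakers = [n for n in feeder.nodes if n != "sub"]
ranked = sorted(breakers, key=downstream_load, reverse=True)

for rank, brk in enumerate(ranked, start=1):
    trip_kw = 1.25 * downstream_load(brk)        # simple margin above the served load
    print(f"rank {rank}: breaker {brk}, trip set point ~{trip_kw:.0f} kW")
```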

  4. Toward two-dimensional search engines

    NASA Astrophysics Data System (ADS)

    Ermann, L.; Chepelianskii, A. D.; Shepelyansky, D. L.

    2012-07-01

    We study the statistical properties of various directed networks using ranking of their nodes based on the dominant vectors of the Google matrix known as PageRank and CheiRank. On average PageRank orders nodes proportionally to a number of ingoing links, while CheiRank orders nodes proportionally to a number of outgoing links. In this way, the ranking of nodes becomes two dimensional which paves the way for the development of two-dimensional search engines of a new type. Statistical properties of information flow on the PageRank-CheiRank plane are analyzed for networks of British, French and Italian universities, Wikipedia, Linux Kernel, gene regulation and other networks. A special emphasis is done for British universities networks using the large database publicly available in the UK. Methods of spam links control are also analyzed.
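
    A minimal sketch of the two-dimensional ranking described above: PageRank is computed on the directed graph, and CheiRank as PageRank of the same graph with all links reversed (the toy graph is illustrative; networkx is assumed).

```python
import networkx as nx

G = nx.DiGraph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "a"), ("d", "c")])

pagerank = nx.pagerank(G, alpha=0.85)            # favors nodes with many ingoing links
cheirank = nx.pagerank(G.reverse(), alpha=0.85)  # PageRank of the inverted graph: outgoing links

# Each node gets a 2D coordinate (PageRank index, CheiRank index) on the ranking plane.
K  = {n: r for r, n in enumerate(sorted(G, key=pagerank.get, reverse=True), 1)}
Ks = {n: r for r, n in enumerate(sorted(G, key=cheirank.get, reverse=True), 1)}
for n in G:
    print(n, (K[n], Ks[n]))
```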

  5. Multi-year microbial source tracking study characterizing fecal contamination in an urban watershed

    USGS Publications Warehouse

    Bushon, Rebecca N.; Brady, Amie M. G.; Christensen, Eric D.; Stelzer, Erin A.

    2017-01-01

    Microbiological and hydrological data were used to rank tributary stream contributions of bacteria to the Little Blue River in Independence, Missouri. Concentrations, loadings and yields of E. coli and microbial source tracking (MST) markers were characterized during base flow and storm events in five subbasins within Independence, as well as sources entering and leaving the city through the river. The E. coli water quality threshold was exceeded in 29% of base-flow and 89% of storm-event samples. The total contribution of E. coli and MST markers from tributaries within Independence to the Little Blue River, regardless of streamflow, did not significantly increase the median concentrations leaving the city. Daily loads and yields of E. coli and MST markers were used to rank the subbasins according to their contribution of each constituent to the river. The ranking methodology used in this study may prove useful in prioritizing remediation in the different subbasins.

  6. Discovering urban mobility patterns with PageRank based traffic modeling and prediction

    NASA Astrophysics Data System (ADS)

    Wang, Minjie; Yang, Su; Sun, Yi; Gao, Jun

    2017-11-01

    An urban transportation system can be viewed as a complex network in which time-varying traffic flows act as links connecting adjacent regions as networked nodes. By computing urban traffic evolution on such a temporal complex network with PageRank, it is found that for most regions there exists a linear relation between the traffic congestion measure at the present time step and the PageRank value at the previous one. Since the PageRank measure of a region results from the mutual interactions of the whole network, this implies that the traffic state of a local region does not evolve independently but is affected by the evolution of the whole network. As a result, PageRank values can act as signatures for predicting upcoming traffic congestion. We observe these relations experimentally using one month of trajectory data from 12,000 taxis in Beijing.
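
    The reported relation is linear between a region's congestion at time t and its PageRank at time t-1. The sketch below shows how such a per-region fit could be set up; the snapshot graphs and congestion values are random placeholders standing in for the taxi-derived data.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
regions = list(range(6))
T = 30

# Placeholder congestion measure and flow-weighted snapshot graphs, one per time step.
congestion = rng.random((T, len(regions)))
pr = np.empty((T, len(regions)))
for t in range(T):
    W = rng.random((len(regions), len(regions)))     # placeholder traffic-flow weights
    G = nx.from_numpy_array(W, create_using=nx.DiGraph)
    p = nx.pagerank(G, weight="weight")
    pr[t] = [p[r] for r in regions]

# Per-region linear fit: congestion[t] ~ a * pagerank[t-1] + b
for r in regions:
    a, b = np.polyfit(pr[:-1, r], congestion[1:, r], deg=1)
    print(f"region {r}: congestion[t] ~ {a:.2f} * pagerank[t-1] + {b:.2f}")
```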

  7. Nexus Between Protein–Ligand Affinity Rank-Ordering, Biophysical Approaches, and Drug Discovery

    PubMed Central

    2013-01-01

    The confluence of computational and biophysical methods to accurately rank-order the binding affinities of small molecules and determine structures of macromolecular complexes is a potentially transformative advance in the work flow of drug discovery. This viewpoint explores the impact that advanced computational methods may have on the efficacy of small molecule drug discovery and optimization, particularly with respect to emerging fragment-based methods. PMID:24900579

  8. A new mutually reinforcing network node and link ranking algorithm

    PubMed Central

    Wang, Zhenghua; Dueñas-Osorio, Leonardo; Padgett, Jamie E.

    2015-01-01

    This study proposes a novel Normalized Wide network Ranking algorithm (NWRank) that has the advantage of ranking nodes and links of a network simultaneously. This algorithm combines the mutual reinforcement feature of Hypertext Induced Topic Selection (HITS) and the weight normalization feature of PageRank. Relative weights are assigned to links based on the degree of the adjacent neighbors and the Betweenness Centrality instead of assigning the same weight to every link as assumed in PageRank. Numerical experiment results show that NWRank performs consistently better than HITS, PageRank, eigenvector centrality, and edge betweenness from the perspective of network connectivity and approximate network flow, which is also supported by comparisons with the expensive N-1 benchmark removal criteria based on network efficiency. Furthermore, it can avoid some problems, such as the Tightly Knit Community effect, which exists in HITS. NWRank provides a new inexpensive way to rank nodes and links of a network, which has practical applications, particularly to prioritize resource allocation for upgrade of hierarchical and distributed networks, as well as to support decision making in the design of networks, where node and link importance depend on a balance of local and global integrity. PMID:26492958
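
    NWRank itself is defined in the paper; the sketch below is only a rough stand-in for the two ingredients named in the abstract (HITS-style mutual reinforcement between node and link scores, link weights built from neighbor degree and edge betweenness, and PageRank-style normalization). The update rules are simplified guesses, not the published algorithm.

```python
import numpy as np
import networkx as nx

G = nx.karate_club_graph()                       # any small undirected test network
edges = list(G.edges())
ebc = nx.edge_betweenness_centrality(G)

# Link weights from adjacent-node degree and edge betweenness (simplified assumption).
w = np.array([(G.degree(u) + G.degree(v)) * (1.0 + ebc[(u, v)]) for u, v in edges])

node_score = {n: 1.0 for n in G}
link_score = np.ones(len(edges))

for _ in range(50):                              # mutually reinforcing updates
    # Node score: normalized sum of the scores of incident links.
    new_node = {n: 0.0 for n in G}
    for (u, v), s in zip(edges, link_score):
        new_node[u] += s
        new_node[v] += s
    total = sum(new_node.values())
    node_score = {n: s / total for n, s in new_node.items()}

    # Link score: weighted sum of endpoint node scores, normalized.
    link_score = np.array([w[i] * (node_score[u] + node_score[v])
                           for i, (u, v) in enumerate(edges)])
    link_score /= link_score.sum()

print("top nodes:", sorted(G, key=node_score.get, reverse=True)[:5])
print("top links:", [edges[i] for i in np.argsort(link_score)[::-1][:5]])
```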

  9. Stochastic constructions of flows of rank 1

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, A. A.

    2001-12-01

    Automorphisms of rank 1 appeared in the well-known papers of Chacon (1965), who constructed an example of a weakly mixing automorphism not having the strong mixing property, and Ornstein (1970), who proved the existence of mixing automorphisms without a square root. Ornstein's construction is essentially stochastic, since its parameters are chosen in a "sufficiently random manner" according to a certain random law. In the present article it is shown that mixing flows of rank 1 exist. The construction given is also stochastic and is based to a large extent on ideas in Ornstein's paper. At the same time it complements Ornstein's paper and makes it more transparent. The construction can be used also to obtain automorphisms with various approximation and statistical properties. It is established that the new examples of dynamical systems are not isomorphic to Ornstein automorphisms, that is, they are qualitatively new.

  10. Stochastic constructions of flows of rank 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prikhod'ko, A A

    2001-12-31

    Automorphisms of rank 1 appeared in the well-known papers of Chacon (1965), who constructed an example of a weakly mixing automorphism not having the strong mixing property, and Ornstein (1970), who proved the existence of mixing automorphisms without a square root. Ornstein's construction is essentially stochastic, since its parameters are chosen in a 'sufficiently random manner' according to a certain random law. In the present article it is shown that mixing flows of rank 1 exist. The construction given is also stochastic and is based to a large extent on ideas in Ornstein's paper. At the same time it complements Ornstein's paper and makes it more transparent. The construction can be used also to obtain automorphisms with various approximation and statistical properties. It is established that the new examples of dynamical systems are not isomorphic to Ornstein automorphisms, that is, they are qualitatively new.

  11. Streamflow of 2016—Water year summary

    USGS Publications Warehouse

    Jian, Xiaodong; Wolock, David M.; Lins, Harry F.; Brady, Steven J.

    2017-09-26

    The maps and graphs in this summary describe national streamflow conditions for water year 2016 (October 1, 2015, to September 30, 2016) in the context of streamflow ranks relative to the 87-year period of 1930–2016, unless otherwise noted. The illustrations are based on observed data from the U.S. Geological Survey’s (USGS) National Streamflow Network. The period of 1930–2016 was used because the number of streamgages before 1930 was too small to provide representative data for computing statistics for most regions of the country. In the summary, reference is made to the term “runoff,” which is the depth to which a river basin, State, or other geographic area would be covered with water if all the streamflow within the area during a specified period was uniformly distributed on it. Runoff quantifies the magnitude of water flowing through the Nation’s rivers and streams in measurement units that can be compared from one area to another. In all the graphics, a rank of 1 indicates the highest flow of all years analyzed and 87 indicates the lowest flow of all years. Rankings of streamflow are grouped into much below normal, below normal, normal, above normal, and much above normal based on percentiles of flow (less than 10 percent, 10–24 percent, 25–75 percent, 76–90 percent, and greater than 90 percent, respectively). Some of the data used to produce the maps and graphs are provisional and subject to change.
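
    The ranking and classification rules stated in the summary translate directly into code; a minimal sketch (thresholds taken from the text above):

```python
def flow_category(percentile):
    """Map a streamflow percentile (0-100) to the summary's five classes."""
    if percentile < 10:
        return "much below normal"
    if percentile < 25:
        return "below normal"
    if percentile <= 75:
        return "normal"
    if percentile <= 90:
        return "above normal"
    return "much above normal"

def rank_years(annual_flows):
    """Rank years so that rank 1 is the highest flow, as in the report's graphics."""
    order = sorted(range(len(annual_flows)), key=lambda i: annual_flows[i], reverse=True)
    return {year_index: rank for rank, year_index in enumerate(order, start=1)}

print(flow_category(8), flow_category(50), flow_category(95))
print(rank_years([120.0, 340.0, 95.0, 210.0]))
```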

  12. Base-flow characteristics of streams in the Valley and Ridge, Blue Ridge, and Piedmont physiographic provinces of Virginia

    USGS Publications Warehouse

    Nelms, D.L.; Harlow, G.E.; Hayes, Donald C.

    1995-01-01

    Growth within the Valley and Ridge, Blue Ridge, and Piedmont Physiographic Provinces of Virginia has focused concern about allocation of surface-water flow and increased demands on the ground-water resources. The purpose of this report is to (1) describe the base-flow characteristics of streams, (2) identify regional differences in these flow characteristics, and (3) describe, if possible, the potential surface-water and ground-water yields of basins on the basis of the base-flow characteristics. Base-flow characteristics are presented for streams in the Valley and Ridge, Blue Ridge, and Piedmont Physiographic Provinces of Virginia. The provinces are separated into five regions: (1) Valley and Ridge, (2) Blue Ridge, (3) Piedmont/Blue Ridge transition, (4) Piedmont northern, and (5) Piedmont southern. Different flow statistics, which represent streamflows predominantly comprised of base flow, were determined for 217 continuous-record streamflow-gaging stations from historical mean daily discharge and for 192 partial-record streamflow-gaging stations by means of correlation of discharge measurements. Variability of base flow is represented by a duration ratio developed during this investigation. Effective recharge rates were also calculated. Median values for the different flow statistics range from 0.05 cubic foot per second per square mile for the 90-percent discharge on the streamflow-duration curve to 0.61 cubic foot per second per square mile for mean base flow. An excellent estimator of mean base flow for the Piedmont/Blue Ridge transition region and Piedmont southern region is the 50-percent discharge on the streamflow-duration curve, but it tends to underestimate mean base flow for the remaining regions. The base-flow variability index ranges from 0.07 to 2.27, with a median value of 0.55. Effective recharge rates range from 0.07 to 33.07 inches per year, with a median value of 8.32 inches per year. Differences in the base-flow characteristics exist between regions. The median discharges for the Valley and Ridge, Blue Ridge, and Piedmont/Blue Ridge transition regions are higher than those for the Piedmont regions. Results from statistical analysis indicate that the regions can be ranked in terms of base-flow characteristics from highest to lowest as follows: (1) Piedmont/Blue Ridge transition, (2) Valley and Ridge and Blue Ridge, (3) Piedmont southern, and (4) Piedmont northern. The flow statistics are consistently higher and the values for base-flow variability are lower for basins within the Piedmont/Blue Ridge transition region relative to those from the other regions, whereas the basins within the Piedmont northern region show the opposite pattern. The group rankings of the base-flow characteristics were used to designate the potential surface-water yield for the regions. In addition, an approach developed for this investigation assigns a rank for potential surface-water yield to a basin according to the quartiles in which the values for the base-flow characteristics are located. Both procedures indicate that the Valley and Ridge, Blue Ridge, and Piedmont/Blue Ridge transition regions have moderate-to-high potential surface-water yield and the Piedmont regions have low-to-moderate potential surface-water yield. In order to indicate potential ground-water yield from base-flow characteristics, aquifer properties for 51 streamflow-gaging stations with continuous record of streamflow data were determined by methods that use streamflow records and basin characteristics.
Areal diffusivity ranges from 17,100 to 88,400 feet squared per day, with a median value of 38,400 feet squared per day. Areal transmissivity ranges from 63 to 830 feet squared per day, with a median value of 270 feet squared per day. Storage coefficients, which were estimated by dividing areal transmissivity by areal diffusivity, range from approximately 0.001 to 0.019 (dimensionless), with a median value of 0.007. The median value for areal diffus

  13. Improved efficacy of soluble human receptor activator of nuclear factor kappa B (RANK) fusion protein by site-directed mutagenesis.

    PubMed

    Son, Young Jun; Han, Jihye; Lee, Jae Yeon; Kim, HaHyung; Chun, Taehoon

    2015-06-01

    Soluble human receptor activator of nuclear factor kappa B fusion immunoglobulin (hRANK-Ig) has been considered as one of the therapeutic agents to treat osteoporosis or diseases associated with bone destruction by blocking the interaction between RANK and the receptor activator of nuclear factor kappa B ligand (RANKL). However, no scientific record showing critical amino acid residues within the structural interface between the human RANKL and RANK complex is yet available. In this study, we produced several mutants of hRANK-Ig by replacement of amino acid residue(s) and tested whether the mutants had increased binding affinity to human RANKL. Based on the results from flow cytometry and surface plasmon resonance analyses, the replacement of E125 with D125, or E125 and C127 with D125 and F127 within loop 3 of cysteine-rich domain 3 of hRANK-Ig increases binding affinity to human RANKL over the wild-type hRANK-Ig. This result may provide the first example of improvement in the efficacy of hRANK-Ig by protein engineering and may give additional information to understand a more defined structural interface between hRANK and RANKL.

  14. Selection of the open pit mining cut-off grade strategy under price uncertainty using a risk based multi-criteria ranking system / Wybór strategii określania warunku opłacalności wydobycia w kopalniach odkrywkowych w warunkach niepewności cen w oparciu o wielokryterialny system rankingowy z uwzględnieniem czynników ryzyka

    NASA Astrophysics Data System (ADS)

    Azimi, Yousef; Osanloo, Morteza; Esfahanipour, Akbar

    2012-12-01

    Cut-off Grade Strategy (COGS) is a concept that directly influences the financial, technical, economic, environmental, and legal issues related to exploitation of a mineral resource. A decision-making system is proposed to select the best technically feasible COGS under price uncertainty. In the proposed system, both conventional discounted cash flow and modern simulation-based real option valuations are used to evaluate the alternative strategies. The conventional expected value criterion and a multiple-criteria ranking system are then used to rank the strategies under the two valuation methods. In the multiple-criteria ranking system, besides the expected value, other stochastic orders are considered that express the ability of strategies to produce extra profits, minimize losses, and achieve the predefined goals of the exploitation strategy. Finally, the best strategy is selected based on the overall average rank of the strategies across all ranking systems. The proposed system was examined using data from the Sungun Copper Mine. To better assess the merits of the alternatives, the ranking process was carried out under both high (prevailing economic condition) and low price conditions. The ranking results revealed that different price conditions and valuation methods lead to different results. These differences are attributed to the different behavior of the embedded option to close the mine early, which is more likely to be exercised under the low price condition than under the high price condition. The proposed system enhances the quality of the decision-making process by providing a more informative and more certain platform for project evaluation.
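
    The final selection step described above, picking the strategy with the best overall average rank across the ranking systems, can be illustrated with a small sketch; the strategies, scores, and ranking systems below are invented placeholders.

```python
import numpy as np
from scipy.stats import rankdata

# Rows: candidate cut-off grade strategies; columns: ranking systems
# (e.g. DCF expected value, real-option expected value, a multi-criteria score).
# Higher score = better. All numbers are illustrative placeholders.
scores = np.array([
    [105.0,  98.0, 0.62],
    [ 99.0, 104.0, 0.71],
    [112.0,  95.0, 0.55],
    [101.0, 101.0, 0.66],
])

# Rank within each system (1 = best), then average the ranks across systems.
ranks = np.column_stack([rankdata(-scores[:, j], method="min")
                         for j in range(scores.shape[1])])
avg_rank = ranks.mean(axis=1)
best = int(np.argmin(avg_rank))
print("average ranks:", avg_rank, "-> best strategy index:", best)
```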

  15. System for ranking relative threats of U.S. volcanoes

    USGS Publications Warehouse

    Ewert, J.W.

    2007-01-01

    A methodology to systematically rank volcanic threat was developed as the basis for prioritizing volcanoes for long-term hazards evaluations, monitoring, and mitigation activities. A ranking of 169 volcanoes in the United States and the Commonwealth of the Northern Mariana Islands (U.S. volcanoes) is presented based on scores assigned for various hazard and exposure factors. Fifteen factors define the hazard: Volcano type, maximum known eruptive explosivity, magnitude of recent explosivity within the past 500 and 5,000 years, average eruption-recurrence interval, presence or potential for a suite of hazardous phenomena (pyroclastic flows, lahars, lava flows, tsunami, flank collapse, hydrothermal explosion, primary lahar), and deformation, seismic, or degassing unrest. Nine factors define exposure: a measure of ground-based human population in hazard zones, past fatalities and evacuations, a measure of airport exposure, a measure of human population on aircraft, the presence of power, transportation, and developed infrastructure, and whether or not the volcano forms a significant part of a populated island. The hazard score and exposure score for each volcano are multiplied to give its overall threat score. Once scored, the ordered list of volcanoes is divided into five overall threat categories from very high to very low. © 2007 ASCE.
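
    The scoring arithmetic described above (hazard score times exposure score, then binning the ordered list into five threat categories) is simple to sketch; the scores below are made up, and the quintile cut points are a stand-in for the thresholds actually used in the published ranking.

```python
# Illustrative only: hazard/exposure scores are invented, and the category
# cut points here are simple quintiles of the ordered list.
volcanoes = {
    "Volcano A": (12.0, 9.5),
    "Volcano B": (6.0, 2.0),
    "Volcano C": (15.0, 14.0),
    "Volcano D": (3.0, 1.0),
    "Volcano E": (9.0, 4.0),
}

threat = {name: hazard * exposure for name, (hazard, exposure) in volcanoes.items()}
ordered = sorted(threat, key=threat.get, reverse=True)

labels = ["very high", "high", "moderate", "low", "very low"]
group_size = max(1, len(ordered) // len(labels))
for i, name in enumerate(ordered):
    category = labels[min(i // group_size, len(labels) - 1)]
    print(f"{name}: threat={threat[name]:.1f} ({category})")
```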

  16. Expanding the landscape of $\mathcal{N}$ = 2 rank 1 SCFTs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argyres, Philip C.; Lotito, Matteo; Lu, Yongchao

    Here, we refine our previous proposal [1-3] for systematically classifying 4d rank-1 N = 2 SCFTs by constructing their possible Coulomb branch geometries. Four new recently discussed rank-1 theories [4, 5], including novel N = 3 SCFTs, sit beautifully in our refined classification framework. By arguing for the consistency of their RG flows we can make a strong case for the existence of at least four additional rank-1 SCFTs, nearly doubling the number of known rank-1 SCFTs. The refinement consists of relaxing the assumption that the flavor symmetries of the SCFTs have no discrete factors. This results in an enlarged (but finite) set of possible rank-1 SCFTs. Their existence can be further constrained using consistency of their central charges and RG flows.

  17. Expanding the landscape of $\mathcal{N}$ = 2 rank 1 SCFTs

    DOE PAGES

    Argyres, Philip C.; Lotito, Matteo; Lu, Yongchao; ...

    2016-05-16

    Here, we refine our previous proposal [1-3] for systematically classifying 4d rank-1 N = 2 SCFTs by constructing their possible Coulomb branch geometries. Four new recently discussed rank-1 theories [4, 5], including novel N = 3 SCFTs, sit beautifully in our refined classification framework. By arguing for the consistency of their RG flows we can make a strong case for the existence of at least four additional rank-1 SCFTs, nearly doubling the number of known rank-1 SCFTs. The refinement consists of relaxing the assumption that the flavor symmetries of the SCFTs have no discrete factors. This results in an enlarged (but finite) set of possible rank-1 SCFTs. Their existence can be further constrained using consistency of their central charges and RG flows.

  18. Explosive percolation on directed networks due to monotonic flow of activity

    NASA Astrophysics Data System (ADS)

    Waagen, Alex; D'Souza, Raissa M.; Lu, Tsai-Ching

    2017-07-01

    An important class of real-world networks has directed edges, and in addition, some rank ordering on the nodes, for instance the popularity of users in online social networks. Yet, nearly all research related to explosive percolation has been restricted to undirected networks. Furthermore, information on such rank-ordered networks typically flows from higher-ranked to lower-ranked individuals, such as follower relations, replies, and retweets on Twitter. Here we introduce a simple percolation process on an ordered, directed network where edges are added monotonically with respect to the rank ordering. We show with a numerical approach that the emergence of a dominant strongly connected component appears to be discontinuous. Large-scale connectivity occurs at very high density compared with most percolation processes, and this holds not just for the strongly connected component structure but for the weakly connected component structure as well. We present analysis with branching processes, which explains this unusual behavior and gives basic intuition for the underlying mechanisms. We also show that before the emergence of a dominant strongly connected component, multiple giant strongly connected components may exist simultaneously. By adding a competitive percolation rule with a small bias to link users of similar rank, we show this leads to formation of two distinct components, one of high-ranked users, and one of low-ranked users, with little flow between the two components.
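
    A toy version of the process described above: nodes carry a rank ordering, directed edges are added in an order that respects that ranking, and the largest strongly and weakly connected components are tracked as density grows. The edge-eligibility rule used here is a simplified stand-in for the paper's model, not its exact edge-addition rule.

```python
import random
import networkx as nx

random.seed(1)
n = 400
nodes = list(range(n))                  # node index = rank (0 = highest ranked)

# Candidate directed edges in both directions between distinct nodes.
candidates = [(u, v) for u in nodes for v in nodes if u != v]
random.shuffle(candidates)
# "Monotonic with respect to the rank ordering" (simplified assumption):
# an edge becomes eligible only once its lowest-ranked endpoint is reached,
# so early edges connect only top-ranked nodes.
candidates.sort(key=lambda e: max(e))

G = nx.DiGraph()
G.add_nodes_from(nodes)
for step, edge in enumerate(candidates, 1):
    G.add_edge(*edge)
    if step % 20000 == 0:
        scc = max(len(c) for c in nx.strongly_connected_components(G))
        wcc = max(len(c) for c in nx.weakly_connected_components(G))
        print(f"edge fraction {step / len(candidates):.2f}: "
              f"largest SCC={scc}, largest WCC={wcc}")
```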

  19. A Variational Approach to Video Registration with Subspace Constraints.

    PubMed

    Garg, Ravi; Roussos, Anastasios; Agapito, Lourdes

    2013-01-01

    This paper addresses the problem of non-rigid video registration, or the computation of optical flow from a reference frame to each of the subsequent images in a sequence, when the camera views deformable objects. We exploit the high correlation between 2D trajectories of different points on the same non-rigid surface by assuming that the displacement of any point throughout the sequence can be expressed in a compact way as a linear combination of a low-rank motion basis. This subspace constraint effectively acts as a trajectory regularization term leading to temporally consistent optical flow. We formulate it as a robust soft constraint within a variational framework by penalizing flow fields that lie outside the low-rank manifold. The resulting energy functional can be decoupled into the optimization of the brightness constancy and spatial regularization terms, leading to an efficient optimization scheme. Additionally, we propose a novel optimization scheme for the case of vector valued images, based on the dualization of the data term. This allows us to extend our approach to deal with colour images which results in significant improvements on the registration results. Finally, we provide a new benchmark dataset, based on motion capture data of a flag waving in the wind, with dense ground truth optical flow for evaluation of multi-frame optical flow algorithms for non-rigid surfaces. Our experiments show that our proposed approach outperforms state of the art optical flow and dense non-rigid registration algorithms.
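
    The subspace constraint above assumes that stacked 2D point trajectories lie near a low-rank motion basis. A small sketch of that idea, projecting a synthetic trajectory matrix onto its leading singular vectors (this is not the paper's variational solver, just the low-rank projection it penalizes against):

```python
import numpy as np

rng = np.random.default_rng(0)
F, P, r = 60, 200, 3                      # frames, tracked points, assumed basis rank

# Synthetic 2F x P trajectory matrix: low-rank motion plus noise.
basis = rng.normal(size=(2 * F, r))       # columns span the low-rank motion basis
coeff = rng.normal(size=(r, P))
W = basis @ coeff + 0.05 * rng.normal(size=(2 * F, P))

# Project trajectories onto the rank-r subspace spanned by the leading left singular vectors.
U, S, Vt = np.linalg.svd(W, full_matrices=False)
W_lowrank = U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]

residual = np.linalg.norm(W - W_lowrank) / np.linalg.norm(W)
print(f"relative energy outside the rank-{r} subspace: {residual:.3f}")
```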

  20. Optical interconnection network for parallel access to multi-rank memory in future computing systems.

    PubMed

    Wang, Kang; Gu, Huaxi; Yang, Yintang; Wang, Kun

    2015-08-10

    With the number of cores increasing, there is an emerging need for a high-bandwidth low-latency interconnection network, serving core-to-memory communication. In this paper, aiming at the goal of simultaneous access to multi-rank memory, we propose an optical interconnection network for core-to-memory communication. In the proposed network, the wavelength usage is delicately arranged so that cores can communicate with different ranks at the same time and broadcast for flow control can be achieved. A distributed memory controller architecture that works in a pipeline mode is also designed for efficient optical communication and transaction address processes. The scaling method and wavelength assignment for the proposed network are investigated. Compared with traditional electronic bus-based core-to-memory communication, the simulation results based on the PARSEC benchmark show that the bandwidth enhancement and latency reduction are apparent.

  1. Simulated effects of ground-water pumpage on stream-aquifer flow in the vicinity of federally protected species of freshwater mussels in the lower Apalachicola-Chattahoochee-Flint River basin (Subarea 4), southeastern Alabama, northwestern Florida, and southwestern Georgia

    USGS Publications Warehouse

    Albertson, Phillip N.; Torak, Lynn J.

    2002-01-01

    Simulation results indicate that ground-water withdrawal in the lower Apalachicola-Chattahoochee-Flint River basin during times of drought could reduce stream-aquifer flow and cause specific stream reaches to go dry. Of the 37 reaches that were studied, 8 reaches ranked highly sensitive to pumpage, 13 reaches ranked medium, and 16 reaches ranked low. Of the eight reaches that ranked high, seven contain at least one federally protected mussel species. Small tributary streams such as Gum, Jones, Muckalee, Spring, and Cooleewahee Creeks would go dry at lower pumping rates than needed to dry up larger streams. Other streams that were ranked high may go dry depending on the amount of upstream flow entering the reach; this condition is indicated for some reaches on Spring Creek. A dry stream condition is of particular concern to water and wildlife managers because adequate streamflow is essential for mussel survival.

  2. Structure-preserving model reduction of large-scale logistics networks. Applications for supply chains

    NASA Astrophysics Data System (ADS)

    Scholz-Reiter, B.; Wirth, F.; Dashkovskiy, S.; Makuschewitz, T.; Schönlein, M.; Kosmykov, M.

    2011-12-01

    We investigate the problem of model reduction with a view to large-scale logistics networks, specifically supply chains. Such networks are modeled by means of graphs, which describe the structure of material flow. An aim of the proposed model reduction procedure is to preserve important features within the network. As a new methodology we introduce the LogRank as a measure for the importance of locations, which is based on the structure of the flows within the network. We argue that these properties reflect relative importance of locations. Based on the LogRank we identify subgraphs of the network that can be neglected or aggregated. The effect of this is discussed for a few motifs. Using this approach we present a meta algorithm for structure-preserving model reduction that can be adapted to different mathematical modeling frameworks. The capabilities of the approach are demonstrated with a test case, where a logistics network is modeled as a Jackson network, i.e., a particular type of queueing network.
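
    The abstract does not give the LogRank formula, so the sketch below uses flow-weighted PageRank purely as a placeholder importance measure and then aggregates low-importance locations into their main supplier, only to illustrate the "rank, then reduce while preserving structure" idea; the toy network and the threshold are invented.

```python
import networkx as nx

# Toy material-flow graph: edge weights are average flow volumes (illustrative).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("supplier", "plant", 100), ("plant", "hub", 90),
    ("hub", "retail_1", 40), ("hub", "retail_2", 35),
    ("hub", "retail_3", 5),  ("retail_3", "kiosk", 2),
])

# Placeholder importance measure (the paper defines its own LogRank).
importance = nx.pagerank(G, weight="weight")
threshold = 0.05
minor = [n for n, s in importance.items() if s < threshold]

# Aggregate each minor node into its largest-inflow predecessor (structure-preserving merge).
H = G.copy()
for node in minor:
    preds = list(H.predecessors(node))
    if preds:
        target = max(preds, key=lambda p: H[p][node]["weight"])
        H = nx.contracted_nodes(H, target, node, self_loops=False)

print("importance:", {n: round(s, 3) for n, s in importance.items()})
print("reduced network nodes:", list(H.nodes))
```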

  3. Critical review of methods for risk ranking of food-related hazards, based on risks for human health.

    PubMed

    Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2018-01-22

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of applications. It was concluded there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.

  4. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.

  5. Optimal ranking regime analysis of TreeFlow dendrohydrological reconstructions

    USDA-ARS?s Scientific Manuscript database

    The Optimal Ranking Regime (ORR) method was used to identify 6-100 year time windows containing significant ranking sequences in 55 western U.S. streamflow reconstructions, and reconstructions of the level of the Great Salt Lake and San Francisco Bay salinity during 1500-2007. The method’s ability t...

  6. Detecting determinism with improved sensitivity in time series: rank-based nonlinear predictability score.

    PubMed

    Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G

    2014-09-01

    The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
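
    A heavily simplified stand-in for the idea (not the published score): delay-embed the series, make nearest-neighbor predictions, and judge them with a rank (Spearman) correlation rather than an amplitude-based error. The embedding parameters and test signals below are illustrative.

```python
import numpy as np
from scipy.stats import spearmanr

def delay_embed(x, dim=3, lag=5):
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def rank_predictability(x, dim=3, lag=5, horizon=5, k=5):
    """Spearman correlation between observed and nearest-neighbor-predicted futures."""
    emb = delay_embed(x, dim, lag)
    m = len(emb) - horizon
    preds, obs = [], []
    for i in range(m):
        d = np.linalg.norm(emb[:m] - emb[i], axis=1)
        d[max(0, i - 10) : i + 11] = np.inf          # exclude temporally close points
        nbrs = np.argsort(d)[:k]
        preds.append(x[nbrs + (dim - 1) * lag + horizon].mean())
        obs.append(x[i + (dim - 1) * lag + horizon])
    rho, _ = spearmanr(preds, obs)
    return rho

rng = np.random.default_rng(0)
deterministic = np.sin(0.2 * np.arange(2000)) + 0.1 * rng.normal(size=2000)
noise = rng.normal(size=2000)
print("noisy sine:", round(rank_predictability(deterministic), 2))
print("white noise:", round(rank_predictability(noise), 2))
```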

  7. Detecting determinism with improved sensitivity in time series: Rank-based nonlinear predictability score

    NASA Astrophysics Data System (ADS)

    Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G.

    2014-09-01

    The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).

  8. Uncertainty Quantification of Turbulence Model Closure Coefficients for Transonic Wall-Bounded Flows

    NASA Technical Reports Server (NTRS)

    Schaefer, John; West, Thomas; Hosder, Serhat; Rumsey, Christopher; Carlson, Jan-Renee; Kleb, William

    2015-01-01

    The goal of this work was to quantify the uncertainty and sensitivity of commonly used turbulence models in Reynolds-Averaged Navier-Stokes codes due to uncertainty in the values of closure coefficients for transonic, wall-bounded flows and to rank the contribution of each coefficient to uncertainty in various output flow quantities of interest. Specifically, uncertainty quantification of turbulence model closure coefficients was performed for transonic flow over an axisymmetric bump at zero degrees angle of attack and the RAE 2822 transonic airfoil at a lift coefficient of 0.744. Three turbulence models were considered: the Spalart-Allmaras Model, Wilcox (2006) k-ω Model, and the Menter Shear-Stress Transport Model. The FUN3D code developed by NASA Langley Research Center was used as the flow solver. The uncertainty quantification analysis employed stochastic expansions based on non-intrusive polynomial chaos as an efficient means of uncertainty propagation. Several integrated and point-quantities are considered as uncertain outputs for both CFD problems. All closure coefficients were treated as epistemic uncertain variables represented with intervals. Sobol indices were used to rank the relative contributions of each closure coefficient to the total uncertainty in the output quantities of interest. This study identified a number of closure coefficients for each turbulence model for which more information will reduce the amount of uncertainty in the output significantly for transonic, wall-bounded flows.
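
    The study propagates closure-coefficient intervals with non-intrusive polynomial chaos; purely to illustrate how Sobol indices rank inputs, the sketch below uses a generic Monte Carlo (Saltelli-type) estimator on a stand-in output function with invented coefficient intervals.

```python
import numpy as np

def sobol_first_order(model, bounds, n=20000, seed=0):
    """Monte Carlo estimate of first-order Sobol indices (pick-freeze estimator)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    d = len(bounds)
    A = lo + (hi - lo) * rng.random((n, d))
    B = lo + (hi - lo) * rng.random((n, d))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]                      # freeze coefficient i from sample B
        S[i] = np.mean(yB * (model(AB) - yA)) / var
    return S

# Stand-in for a CFD output quantity as a function of closure coefficients.
def toy_output(x):
    return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

bounds = [(0.8, 1.2), (0.05, 0.15), (1.0, 3.0)]   # illustrative coefficient intervals
S = sobol_first_order(toy_output, bounds)
print("first-order indices:", np.round(S, 3), "ranking:", np.argsort(S)[::-1])
```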

  9. Ensemble forecasting of short-term system scale irrigation demands using real-time flow data and numerical weather predictions

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; Robertson, David E.; George, Biju; Nawarathna, Bandara

    2016-06-01

    Irrigation demands fluctuate in response to weather variations and a range of irrigation management decisions, which creates challenges for water supply system operators. This paper develops a method for real-time ensemble forecasting of irrigation demand and applies it to irrigation command areas of various sizes for lead times of 1 to 5 days. The ensemble forecasts are based on a deterministic time series model coupled with ensemble representations of the various inputs to that model. Forecast inputs include past flow, precipitation, and potential evapotranspiration. These inputs are variously derived from flow observations from a modernized irrigation delivery system; short-term weather forecasts derived from numerical weather prediction models and observed weather data available from automatic weather stations. The predictive performance for the ensemble spread of irrigation demand was quantified using rank histograms, the mean continuous rank probability score (CRPS), the mean CRPS reliability and the temporal mean of the ensemble root mean squared error (MRMSE). The mean forecast was evaluated using root mean squared error (RMSE), Nash-Sutcliffe model efficiency (NSE) and bias. The NSE values for evaluation periods ranged between 0.96 (1 day lead time, whole study area) and 0.42 (5 days lead time, smallest command area). Rank histograms and comparison of MRMSE, mean CRPS, mean CRPS reliability and RMSE indicated that the ensemble spread is generally a reliable representation of the forecast uncertainty for short lead times but underestimates the uncertainty for long lead times.
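
    Two of the verification measures named above, the rank histogram and the ensemble CRPS, have simple empirical forms; a minimal sketch with synthetic forecasts (a roughly flat rank histogram would indicate a reliable ensemble spread):

```python
import numpy as np

rng = np.random.default_rng(0)
T, M = 500, 20                          # forecast days, ensemble members
obs = rng.normal(size=T)
ens = obs[:, None] + rng.normal(size=(T, M))      # synthetic ensemble forecasts

# Rank histogram: position of the observation within each ensemble.
ranks = (ens < obs[:, None]).sum(axis=1)          # 0..M
hist = np.bincount(ranks, minlength=M + 1)

# Empirical CRPS per forecast: E|X - y| - 0.5 * E|X - X'|, averaged over time.
def crps(ensemble, y):
    term1 = np.mean(np.abs(ensemble - y))
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

mean_crps = np.mean([crps(ens[t], obs[t]) for t in range(T)])
print("rank histogram counts:", hist)
print("mean CRPS:", round(mean_crps, 3))
```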

  10. Postwildfire debris-flow hazard assessment of the area burned by the 2013 West Fork Fire Complex, southwestern Colorado

    USGS Publications Warehouse

    Verdin, Kristine L.; Dupree, Jean A.; Stevens, Michael R.

    2013-01-01

    This report presents a preliminary emergency assessment of the debris-flow hazards from drainage basins burned by the 2013 West Fork Fire Complex near South Fork in southwestern Colorado. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of debris-flow occurrence, potential volume of debris flows, and the combined debris-flow hazard ranking along the drainage network within and just downstream from the burned area, and to estimate the same for 54 drainage basins of interest within the perimeter of the burned area. Input data for the debris-flow models included topographic variables, soil characteristics, burn severity, and rainfall totals and intensities for a (1) 2-year-recurrence, 1-hour-duration rainfall, referred to as a 2-year storm; (2) 10-year-recurrence, 1-hour-duration rainfall, referred to as a 10-year storm; and (3) 25-year-recurrence, 1-hour-duration rainfall, referred to as a 25-year storm. Estimated debris-flow probabilities at the pour points of the 54 drainage basins of interest ranged from less than 1 to 65 percent in response to the 2-year storm; from 1 to 77 percent in response to the 10-year storm; and from 1 to 83 percent in response to the 25-year storm. Twelve of the 54 drainage basins of interest have a 30-percent probability or greater of producing a debris flow in response to the 25-year storm. Estimated debris-flow volumes for all rainfalls modeled range from a low of 2,400 cubic meters to a high of greater than 100,000 cubic meters. Estimated debris-flow volumes increase with basin size and distance along the drainage network, but some smaller drainages also were predicted to produce substantial debris flows. One of the 54 drainage basins of interest had the highest combined hazard ranking, while 9 other basins had the second highest combined hazard ranking. Of these 10 basins with the 2 highest combined hazard rankings, 7 basins had predicted debris-flow volumes exceeding 100,000 cubic meters, while 3 had predicted probabilities of debris flows exceeding 60 percent. The 10 basins with high combined hazard ranking include 3 tributaries in the headwaters of Trout Creek, four tributaries to the West Fork San Juan River, Hope Creek draining toward a county road on the eastern edge of the burn, Lake Fork draining to U.S. Highway 160, and Leopard Creek on the northern edge of the burn. The probabilities and volumes for the modeled storms indicate a potential for debris-flow impacts on structures, reservoirs, roads, bridges, and culverts located within and immediately downstream from the burned area. U.S. Highway 160, on the eastern edge of the burn area, also is susceptible to impacts from debris flows.

  11. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow

    NASA Astrophysics Data System (ADS)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.

    2016-11-01

    Hydro-abrasive resistance is an important property requirement for hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance. This can be achieved by performing accelerated and representative laboratory tests. In the case of severe erosion induced by penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to the ones generated at the penstock lower generatrix in actual flow conditions. Thirteen preselected coating solutions were first tested in a 45-hour erosion test. A ranking of the thirteen coating solutions was then determined after characterisation. To complete this first evaluation and to determine the wear kinetics of the four best coating solutions, additional erosion tests were conducted with a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, some trial tests were carried out on penstock samples to check the application method of selected coating systems. The paper gives some perspectives related to erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  12. Potential flood hazard assessment by integration of ALOS PALSAR and ASTER GDEM: a case study for the Hoa Chau commune, Hoa Vang district, in central Vietnam

    NASA Astrophysics Data System (ADS)

    Huong, Do Thi Viet; Nagasawa, Ryota

    2014-01-01

    The potential flood hazard was assessed for the Hoa Chau commune in central Vietnam in order to identify the high flood hazard zones for the decision makers who will execute future rural planning. A new approach for deriving the potential flood hazard based on integration of inundation and flow direction maps is described. Areas inundated in the historical flood event of 2007 were extracted from Advanced Land Observing Satellite (ALOS) phased array L-band synthetic aperture radar (PALSAR) images, while flow direction characteristics were derived from the ASTER GDEM to extract the depressed surfaces. Past flood experience and the flow direction were then integrated to analyze and rank the potential flood hazard zones. The land use/cover map extracted from LANDSAT TM and flood depth point records from field surveys were utilized to check the possibility of susceptible inundated areas, extracting data from ALOS PALSAR and ranking the potential flood hazard. The estimation of potential flood hazard areas revealed that 17.43% and 17.36% of Hoa Chau had high and medium potential flood hazards, respectively. The flow direction and ALOS PALSAR data were effectively integrated for determining the potential flood hazard when hydrological and meteorological data were inadequate and remote sensing images taken during flood times were not available or were insufficient.

  13. Applicability of ranked Regional Climate Models (RCM) to assess the impact of climate change on Ganges: A case study.

    NASA Astrophysics Data System (ADS)

    Anand, Jatin; Devak, Manjula; Gosain, Ashvani Kumar; Khosa, Rakesh; Dhanya, Ct

    2017-04-01

    The negative impact of climate change is felt over a wide range of spatial scales, from small basins to large watersheds, and can possibly outweigh the benefits of the natural water system. General Circulation Models (GCMs) have been widely used as inputs to hydrological models (HMs) to simulate different hydrological components of a river basin. However, the coarse scale of GCMs and their spatio-temporal biases restrict their use at finer resolutions. Downscaling adds one more level of uncertainty, i.e., downscaling uncertainty on top of model and scenario uncertainty. Outputs from Regional Climate Models (RCMs) may help reduce the uncertainties arising from GCMs, as RCMs are miniatures of GCMs. However, RCMs have some inherent systematic biases, hence bias correction is a prerequisite before their output is fed to HMs. RCMs, which take input from GCMs at their lateral boundaries, also take the topography of the area into account. Hence, RCMs need to be ranked a priori. In this study, the impact of climate change on the Ganga basin, India, is assessed using the ranked RCMs. First, bias correction of 14 RCMs is done using quantile-quantile mapping and the equidistant cumulative distribution method for the historic (1990-2004) and future (2021-2100) scenarios, respectively. Runoff simulations from the Soil Water Assessment Tool (SWAT) for the historic scenario are used for ranking the RCMs. The entropy and PROMETHEE-2 methods are employed to rank the RCMs based on five performance indicators, namely Nash-Sutcliffe efficiency (NSE), coefficient of determination (R2), normalised root mean square error (NRMSE), absolute normalised mean bias error (ANMBE), and average absolute relative error (AARE). The results illustrate that each of the performance indicators behaves differently for different RCMs. RCA4 (CNRM-CERFACS) is found to be the best model, with the highest value (0.85), followed by RCA4 (MIROC) and RCA4 (ICHEC) with values of 0.80 and 0.53, respectively, for the Ganga basin. Flow-duration curves and long-term averages of streamflow for the ranked RCMs confirm that the SWAT model is efficient in capturing the hydrology of the basin. For the monsoon months (June, July, August and September), future annual mean surface runoff decreases substantially (-50% to -10%), while base flow for October, November and December is projected to increase (10-20%). Analysis of snow-melt hydrology indicates that snow-melt is projected to increase during the months of November to March, with the maximum increase (400%) shown by RCA4 (CNRM-CERFACS) and the least by RCA4 (ICHEC) (15%). Further, all the RCMs project a higher frequency of dry monsoons and a lower frequency of wet monsoons. The analysis of simulated base flow and recharge illustrates that the change varies from +100% to -500% and from +97% to -600%, respectively, with the central part of the basin undergoing a major loss in recharge. Hence, this research provides important insights into the response of surface runoff to climate change projections; better administration and management of available resources is therefore necessary. Keywords: Climate change, uncertainty, Soil Water Assessment Tool (SWAT), General Circulation Model (GCM), Regional Climate Models (RCM), bias correction.

  14. Panel acoustic contribution analysis.

    PubMed

    Wu, Sean F; Natarajan, Logesh Kumar

    2013-02-01

    Formulations are derived to analyze the relative panel acoustic contributions of a vibrating structure. The essence of this analysis is to correlate the acoustic power flow from each panel to the radiated acoustic pressure at any field point. The acoustic power is obtained by integrating the normal component of the surface acoustic intensity, which is the product of the surface acoustic pressure and normal surface velocity reconstructed by using the Helmholtz equation least squares based nearfield acoustical holography, over each panel. The significance of this methodology is that it enables one to analyze and rank relative acoustic contributions of individual panels of a complex vibrating structure to acoustic radiation anywhere in the field based on a single set of the acoustic pressures measured in the near field. Moreover, this approach is valid for both interior and exterior regions. Examples of using this method to analyze and rank the relative acoustic contributions of a scaled vehicle cabin are demonstrated.

  15. Capturing spatiotemporal variation in wildfires for improving postwildfire debris-flow hazard assessments: Chapter 20

    USGS Publications Warehouse

    Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.

    2017-01-01

    Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.

  16. Affinity ranking of antibodies using flow cytometry: application in antibody phage display-based target discovery.

    PubMed

    Geuijen, Cecilia A W; Clijsters-van der Horst, Marieke; Cox, Freek; Rood, Pauline M L; Throsby, Mark; Jongeneelen, Mandy A C; Backus, Harold H J; van Deventer, Els; Kruisbeek, Ada M; Goudsmit, Jaap; de Kruif, John

    2005-07-01

    Application of antibody phage display to the identification of cell surface antigens with restricted expression patterns is often complicated by the inability to demonstrate specific binding to a certain cell type. The specificity of an antibody can only be properly assessed when the antibody is of sufficiently high affinity to detect low-density antigens on cell surfaces. Therefore, a robust and simple assay for the prediction of relative antibody affinities was developed and compared to data obtained using surface plasmon resonance (SPR) technology. A panel of eight anti-CD46 antibody fragments with different affinities was selected from phage display libraries and reformatted into complete human IgG1 molecules. SPR was used to determine KD values for these antibodies. The association and dissociation of the antibodies for binding to CD46 expressed on cell surfaces were analysed using FACS-based assays. We show that ranking of the antibodies based on FACS data correlates well with ranking based on KD values as measured by SPR and can therefore be used to discriminate between high- and low-affinity antibodies. Finally, we show that a low-affinity antibody may only detect high expression levels of a surface marker while failing to detect lower expression levels of this molecule, which may lead to a false interpretation of antibody specificity.

  17. Equipment management risk rating system based on engineering endpoints.

    PubMed

    James, P J

    1999-01-01

    The equipment management risk rating system outlined here offers two significant departures from current practice: risk classifications are based on intrinsic device risks, and the risk rating system is based on engineering endpoints. Intrinsic device risks are categorized as physical, clinical and technical, and these flow from the incoming equipment assessment process. Engineering risk management is based on verification of engineering endpoints such as clinical measurements or energy delivery. This practice eliminates the ambiguity associated with ranking risk in terms of physiologic and higher-level outcome endpoints such as no significant hazards, low significance, injury, or mortality.

  18. Development of a risk-ranking framework to evaluate potential high-threat microorganisms, toxins, and chemicals in food.

    PubMed

    Newsome, R; Tran, N; Paoli, G M; Jaykus, L A; Tompkin, B; Miliotis, M; Ruthman, T; Hartnett, E; Busta, F F; Petersen, B; Shank, F; McEntire, J; Hotchkiss, J; Wagner, M; Schaffner, D W

    2009-03-01

    Through a cooperative agreement with the U.S. Food and Drug Administration, the Institute of Food Technologists developed a risk-ranking framework prototype to enable comparison of microbiological and chemical hazards in foods and to assist policy makers, risk managers, risk analysts, and others in determining the relative public health impact of specific hazard-food combinations. The prototype is a bottom-up system based on assumptions that incorporate expert opinion/insight with a number of exposure and hazard-related risk criteria variables, which are propagated forward with food intake data to produce risk-ranking determinations. The prototype produces a semi-quantitative comparative assessment of food safety hazards and the impacts of hazard control measures. For a specific hazard-food combination the prototype can produce a single metric: a final risk value expressed as annual pseudo-disability adjusted life years (pDALY). The pDALY is a harmonization of the very different dose-response relationships observed for chemicals and microbes. The prototype was developed on 2 platforms, a web-based user interface and an Analytica(R) model (Lumina Decision Systems, Los Gatos, Calif., U.S.A.). Written in Visual Basic, the web-based platform facilitates data input and allows concurrent use from multiple locations. The Analytica model facilitates visualization of the logic flow, interrelationship of input and output variables, and calculations/algorithms comprising the prototype. A variety of sortable risk-ranking reports and summary information can be generated for hazard-food pairs, showing hazard and dose-response assumptions and data, per capita consumption by population group, and annual pDALY.

  19. Phosphorus loss from an agricultural watershed as a function of storm size.

    PubMed

    Sharpley, Andrew N; Kleinman, Peter J A; Heathwaite, A Louise; Gburek, William J; Folmar, Gordon J; Schmidt, John P

    2008-01-01

    Phosphorus (P) loss from agricultural watersheds is generally greater in storm rather than base flow. Although fundamental to P-based risk assessment tools, few studies have quantified the effect of storm size on P loss. Thus, the loss of P as a function of flow type (base and storm flow) and size was quantified for a mixed-land use watershed (FD-36; 39.5 ha) from 1997 to 2006. Storm size was ranked by return period (<1, 1-3, 3-5, 5-10, and >10 yr), where increasing return period represents storms with greater peak and total flow. From 1997 to 2006, storm flow accounted for 32% of watershed discharge yet contributed 65% of dissolved reactive P (DP) (107 g ha(-1) yr(-1)) and 80% of total P (TP) exported (515 g ha(-1) yr(-1)). Of 248 storm flows during this period, 93% had a return period of <1 yr, contributing most of the 10-yr flow (6507 m(3) ha(-1); 63%) and export of DP (574 g ha(-1); 54%) and TP (2423 g ha(-1); 47%). Two 10-yr storms contributed 23% of P exported between 1997 and 2006. A significant increase in storm flow DP concentration with storm size (0.09-0.16 mg L(-1)) suggests that P release from soil and/or area of the watershed producing runoff increase with storm size. Thus, implementation of P-based Best Management Practice needs to consider what level of risk management is acceptable.

  20. Navy Nurse Corps manpower management model.

    PubMed

    Kinstler, Daniel P; Johnson, Raymond W; Richter, Anke; Kocher, Kathryn

    2008-01-01

    The Navy Nurse Corps is part of a team of professionals that provides high quality, economical health care to approximately 700,000 active duty Navy and Marine Corps members, as well as 2.6 million retired and family members. Navy Nurse Corps manpower management efficiency is critical to providing this care. This paper aims to focus on manpower planning in the Navy Nurse Corps. The Nurse Corps manages personnel primarily through the recruitment process, drawing on multiple hiring sources. Promotion rates at the lowest two ranks are mandated, but not at the higher ranks. Retention rates vary across pay grades. Using these promotion and attrition rates, a Markov model was constructed to model the personnel flow of junior nurse corps officers. Hiring sources were shown to have a statistically significant effect on promotion and retention rates. However, these effects were not found to be practically significant in the Markov model. Only small improvements in rank imbalances are possible given current recruiting guidelines. With greater flexibility in recruiting practices, hiring fewer recruits would generate a 25 percent reduction in rank imbalances but would result in understaffing. Recruiting different ranks at entry would generate a 65 percent reduction in rank imbalances without understaffing issues. Policies adjusting promotion and retention rates are more powerful in controlling personnel flows than adjusting hiring sources. These policies are the only means for addressing the fundamental sources of rank imbalances in the Navy Nurse Corps arising from current manpower guidelines. The paper shows that modeling to improve manpower management may enable the Navy Nurse Corps to more efficiently fulfill its mandate for high-quality healthcare.
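
    The Markov personnel-flow calculation described above can be illustrated with a small numerical sketch. The promotion, attrition, and hiring figures below are hypothetical placeholders, not the rates estimated in the study; the point is only to show how an inventory vector is propagated through a rank-transition matrix year by year.

```python
import numpy as np

# Hypothetical annual transition probabilities between three junior ranks
# (O-1, O-2, O-3): stay, get promoted, or leave the Corps (attrition).
P = np.array([
    [0.20, 0.70, 0.00],   # O-1: 70% promoted to O-2, 10% attrition
    [0.00, 0.55, 0.35],   # O-2: 35% promoted to O-3, 10% attrition
    [0.00, 0.00, 0.85],   # O-3: 15% attrition; promotions beyond O-3 not modeled
])

hires = np.array([300.0, 20.0, 5.0])     # assumed annual accessions by rank
state = np.array([800.0, 700.0, 900.0])  # assumed current inventory by rank

# Propagate the personnel inventory forward year by year.
for year in range(1, 11):
    state = state @ P + hires
    print(f"year {year:2d}: " + "  ".join(f"{n:7.1f}" for n in state))
```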

  1. Dual Dynamically Orthogonal approximation of incompressible Navier Stokes equations with random boundary conditions

    NASA Astrophysics Data System (ADS)

    Musharbash, Eleonora; Nobile, Fabio

    2018-02-01

    In this paper we propose a method for the strong imposition of random Dirichlet boundary conditions in the Dynamical Low Rank (DLR) approximation of parabolic PDEs and, in particular, incompressible Navier Stokes equations. We show that the DLR variational principle can be set in the constrained manifold of all S-rank random fields with a prescribed value on the boundary, expressed in low rank format, with rank smaller than S. We characterize the tangent space to the constrained manifold by means of a Dual Dynamically Orthogonal (Dual DO) formulation, in which the stochastic modes are kept orthonormal and the deterministic modes satisfy suitable boundary conditions, consistent with the original problem. The Dual DO formulation is also convenient to include the incompressibility constraint, when dealing with incompressible Navier Stokes equations. We show the performance of the proposed Dual DO approximation on two numerical test cases: the classical benchmark of a laminar flow around a cylinder with random inflow velocity, and a biomedical application for simulating blood flow in a realistic carotid artery reconstructed from MRI data with random inflow conditions coming from Doppler measurements.

  2. Learning of Rule Ensembles for Multiple Attribute Ranking Problems

    NASA Astrophysics Data System (ADS)

    Dembczyński, Krzysztof; Kotłowski, Wojciech; Słowiński, Roman; Szeląg, Marcin

    In this paper, we consider the multiple attribute ranking problem from a Machine Learning perspective. We propose two approaches to statistical learning of an ensemble of decision rules from decision examples provided by the Decision Maker in terms of pairwise comparisons of some objects. The first approach consists in learning a preference function defining a binary preference relation for a pair of objects. The result of application of this function on all pairs of objects to be ranked is then exploited using the Net Flow Score procedure, giving a linear ranking of objects. The second approach consists in learning a utility function for single objects. The utility function also gives a linear ranking of objects. In both approaches, the learning is based on the boosting technique. The presented approaches to Preference Learning share good properties of the decision rule preference model and have good performance in the massive-data learning problems. As Preference Learning and Multiple Attribute Decision Aiding share many concepts and methodological issues, in the introduction, we review some aspects bridging these two fields. To illustrate the two approaches proposed in this paper, we apply them to a toy example concerning the ranking of a set of cars evaluated by multiple attributes. Then, we perform a large data experiment on real data sets. The first data set concerns credit rating. Since recent research in the field of Preference Learning is motivated by the increasing role of modeling preferences in recommender systems and information retrieval, we chose two other massive data sets from this area - one comes from the movie recommender system MovieLens, and the other concerns ranking of text documents from the 20 Newsgroups data set.
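
    The Net Flow Score exploitation step mentioned above admits a compact illustration. In the sketch below the pairwise preference values are invented stand-ins for the output of a learned preference function; the Net Flow Score of an object is its total outgoing preference minus its total incoming preference, and sorting by that score yields the linear ranking.

```python
import numpy as np

objects = ["car_A", "car_B", "car_C", "car_D"]

# pref[i, j] is the learned degree to which object i is preferred to object j
# (values are illustrative, standing in for the output of the rule ensemble).
pref = np.array([
    [0.0, 0.8, 0.6, 0.9],
    [0.2, 0.0, 0.4, 0.7],
    [0.4, 0.6, 0.0, 0.8],
    [0.1, 0.3, 0.2, 0.0],
])

# Net Flow Score: outgoing preference minus incoming preference for each object.
nfs = pref.sum(axis=1) - pref.sum(axis=0)

# Higher score first gives the linear ranking of objects.
for obj, score in sorted(zip(objects, nfs), key=lambda t: -t[1]):
    print(f"{obj}: {score:+.2f}")
```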

  3. SibRank: Signed bipartite network analysis for neighbor-based collaborative ranking

    NASA Astrophysics Data System (ADS)

    Shams, Bita; Haratizadeh, Saman

    2016-09-01

    Collaborative ranking is an emerging field of recommender systems that utilizes users' preference data rather than rating values. Unfortunately, neighbor-based collaborative ranking has gained little attention despite its greater flexibility and justifiability. This paper proposes a novel framework, called SibRank, that seeks to improve the state of the art neighbor-based collaborative ranking methods. SibRank represents users' preferences as a signed bipartite network, and finds similar users through a novel personalized ranking algorithm in signed networks.

  4. Systematic Differences in Signal Emitting and Receiving Revealed by PageRank Analysis of a Human Protein Interactome

    PubMed Central

    Li, Xiu-Qing

    2012-01-01

    Most protein PageRank studies do not use signal flow direction information in protein interactions because this information was not readily available in large protein databases until recently. Therefore, four questions have yet to be answered: A) What is the general difference between signal emitting and receiving in a protein interactome? B) Which proteins are among the top ranked in directional ranking? C) Are high ranked proteins more evolutionarily conserved than low ranked ones? D) Do proteins with similar ranking tend to have similar subcellular locations? In this study, we address these questions using the forward, reverse, and non-directional PageRank approaches to rank an information-directional network of human proteins and study their evolutionary conservation. The forward ranking gives credit to information receivers, reverse ranking to information emitters, and non-directional ranking mainly to the number of interactions. The protein lists generated by the forward and non-directional rankings are highly correlated, but those by the reverse and non-directional rankings are not. The results suggest that the signal emitting/receiving system is characterized by key-emittings and relatively even receivings in the human protein interactome. Signaling pathway proteins are frequent in top ranked ones. Eight proteins are both informational top emitters and top receivers. Top ranked proteins, except a few species-related novel-function ones, are evolutionarily well conserved. Protein-subunit ranking position reflects subunit function. These results demonstrate the usefulness of different PageRank approaches in characterizing protein networks and provide insights to protein interaction in the cell. PMID:23028653
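
    A minimal sketch of the three ranking variants described in the abstract, using networkx on an invented toy signaling network (the edge list below is illustrative, not the interactome used in the study): forward ranking is PageRank with edges as given (crediting receivers), reverse ranking is PageRank on the reversed graph (crediting emitters), and non-directional ranking is PageRank on the undirected graph.

```python
import networkx as nx

# Toy directed "signal flow" network: an edge u -> v means u signals to v.
edges = [("EGFR", "GRB2"), ("GRB2", "SOS1"), ("SOS1", "RAS"),
         ("RAS", "RAF1"), ("RAF1", "MEK1"), ("MEK1", "ERK2"),
         ("ERK2", "ELK1"), ("ERK2", "RSK1")]
G = nx.DiGraph(edges)

forward = nx.pagerank(G)                       # credits information receivers
reverse = nx.pagerank(G.reverse(copy=True))    # credits information emitters
undirected = nx.pagerank(G.to_undirected())    # mainly reflects number of interactions

for name, ranking in [("forward", forward), ("reverse", reverse),
                      ("non-directional", undirected)]:
    top = sorted(ranking, key=ranking.get, reverse=True)[:3]
    print(name, "top 3:", top)
```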

  5. Streamflow of 2015—Water year national summary

    USGS Publications Warehouse

    Jian, Xiaodong; Wolock, David M.; Lins, Harry F.; Brady, Steve

    2016-08-30

    Introduction: The maps and graphs in this summary describe national streamflow conditions for water year 2015 (October 1, 2014, to September 30, 2015) in the context of the 86-year period 1930–2015, unless otherwise noted. The illustrations are based on observed data from the U.S. Geological Survey’s (USGS) National Streamflow Information Program (http://water.usgs.gov/nsip). The period 1930–2015 was used because prior to 1930, the number of streamgages was too small to provide representative data for computing statistics for most regions of the country. In the summary, reference is made to the term “runoff,” which is the depth to which a river basin, State, or other geographic area would be covered with water if all the streamflow within the area during a specified time period was uniformly distributed upon it. Runoff quantifies the magnitude of water flowing through the Nation's rivers and streams in measurement units that can be compared from one area to another. Each of the maps and graphs can be expanded to a larger view by clicking on the image. In all of the graphics, a rank of 1 indicates the highest flow of all years analyzed. Rankings of streamflow are grouped into much-below normal, below normal, normal, above normal, and much-above normal, based on percentiles of flow (greater than 90 percent, 76–90 percent, 25–75 percent, 10–24 percent, and less than 10 percent, respectively) (http://waterwatch.usgs.gov/?id=ww_current). Some data used to produce maps and graphs are provisional and subject to change.
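
    A small sketch of the percentile-based streamflow classification described above. The historical runoff values are synthetic, percentiles are computed here so that higher values correspond to higher flows, and the breakpoints assume the conventional reading in which the lowest percentiles correspond to much-below-normal flow.

```python
import numpy as np
from scipy import stats

def flow_category(current, historical):
    """Classify a flow value against a historical record using five percentile
    classes (here, higher percentile means higher flow)."""
    pct = stats.percentileofscore(historical, current)
    if pct < 10:
        return "much-below normal"
    elif pct <= 24:
        return "below normal"
    elif pct <= 75:
        return "normal"
    elif pct <= 90:
        return "above normal"
    return "much-above normal"

# Illustrative annual runoff values (mm) for 1930-2014 and a hypothetical 2015 value.
rng = np.random.default_rng(0)
historical = rng.gamma(shape=4.0, scale=60.0, size=85)
print(flow_category(310.0, historical))
```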

  6. Non-Gaussian elliptic-flow fluctuations in PbPb collisions at $\sqrt{s_{\mathrm{NN}}} = 5.02$ TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirunyan, Albert M; et al.

    Event-by-event fluctuations in the elliptic-flow coefficient $v_2$ are studied in PbPb collisions at $\sqrt{s_{\mathrm{NN}}} = 5.02$ TeV using the CMS detector at the CERN LHC. Elliptic-flow probability distributions $p(v_2)$ for charged particles with transverse momentum $0.3 < p_\mathrm{T} < 3.0$ GeV and pseudorapidity $|\eta| < 1.0$ are determined for different collision centrality classes. The moments of the $p(v_2)$ distributions are used to calculate the $v_2$ coefficients based on cumulant orders 2, 4, 6, and 8. A rank ordering of the higher-order cumulant results and nonzero standardized skewness values obtained for the $p(v_2)$ distributions indicate non-Gaussian initial-state fluctuation behavior. Bessel-Gaussian and elliptic power fits to the flow distributions are studied to characterize the initial-state spatial anisotropy.

  7. RANK Expression and Osteoclastogenesis in Human Monocytes in Peripheral Blood from Rheumatoid Arthritis Patients.

    PubMed

    Nanke, Yuki; Kobashigawa, Tsuyoshi; Yago, Toru; Kawamoto, Manabu; Yamanaka, Hisashi; Kotake, Shigeru

    2016-01-01

    Rheumatoid arthritis (RA) appears as inflammation of synovial tissue and joint destruction. Receptor activator of NF-κB (RANK) is a member of the TNF receptor superfamily and a receptor for the RANK ligand (RANKL). In this study, we examined the expression of RANK(high) and CCR6 on CD14+ monocytes from patients with RA and healthy volunteers. Peripheral blood samples were obtained from both the RA patients and the healthy volunteers. Osteoclastogenesis from monocytes was induced by RANKL and M-CSF in vitro. To study the expression of RANK(high) and CCR6 on CD14+ monocytes, two-color flow cytometry was performed. Levels of expression of RANK on monocytes were significantly correlated with the level of osteoclastogenesis in the healthy volunteers. The expression of RANK(high) on CD14+ monocytes in RA patients without treatment was elevated and that in those receiving treatment was decreased. In addition, the high-level expression of RANK on CD14+ monocytes was correlated with the high-level expression of CCR6 in healthy volunteers. Monocytes expressing both RANK and CCR6 differentiate into osteoclasts. The expression of CD14+RANK(high) in untreated RA patients was elevated. RANK and CCR6 expressed on monocytes may be novel targets for the regulation of bone resorption in RA and osteoporosis.

  8. Google matrix analysis of the multiproduct world trade network

    NASA Astrophysics Data System (ADS)

    Ermann, Leonardo; Shepelyansky, Dima L.

    2015-04-01

    Using the United Nations COMTRADE database [United Nations Commodity Trade Statistics Database, available at: http://comtrade.un.org/db/. Accessed November (2014)] we construct the Google matrix G of multiproduct world trade between the UN countries and analyze the properties of trade flows on this network for years 1962-2010. This construction, based on Markov chains, treats all countries on equal democratic grounds independently of their richness, while at the same time considering the contributions of trade products proportionally to their trade volume. We consider the trade with 61 products for up to 227 countries. The obtained results show that the trade contribution of products is asymmetric: some of them are export oriented while others are import oriented even if the ranking by their trade volume is symmetric with respect to export and import after averaging over all world countries. The construction of the Google matrix allows one to investigate the sensitivity of the trade balance with respect to price variations of products, e.g. petroleum and gas, taking into account the world connectivity of trade links. The trade balance based on PageRank and CheiRank probabilities highlights the leading role of China and other BRICS countries in the world trade in recent years. We also show that the eigenstates of G with large eigenvalues select specific trade communities.
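
    The basic Google matrix construction referred to above can be sketched in a few lines. The paper's full procedure (product-resolved flows, the "democratic" treatment of countries, and the precise CheiRank definition) is more elaborate than this; the toy trade matrix and the damping factor alpha = 0.85 below are assumptions for illustration only.

```python
import numpy as np

def google_matrix(flows, alpha=0.85):
    """Build a column-stochastic Google matrix from a flow matrix.

    flows[i, j] is taken as the flow from node j to node i, so each column j
    is normalized by the total outflow of j; dangling columns get uniform links.
    """
    n = flows.shape[0]
    col_sums = flows.sum(axis=0)
    S = np.where(col_sums > 0,
                 flows / np.where(col_sums == 0, 1, col_sums),
                 1.0 / n)
    return alpha * S + (1 - alpha) / n

def pagerank(G, tol=1e-12):
    """Power iteration for the leading eigenvector of a column-stochastic matrix."""
    n = G.shape[0]
    p = np.full(n, 1.0 / n)
    while True:
        p_next = G @ p
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy 4-country trade volumes: trade[i, j] = exports from country j to country i.
trade = np.array([[0, 5, 2, 1],
                  [3, 0, 4, 2],
                  [1, 6, 0, 3],
                  [2, 1, 5, 0]], dtype=float)

pr = pagerank(google_matrix(trade))          # sensitive to import flows
cheirank = pagerank(google_matrix(trade.T))  # inverted links: sensitive to exports
print("PageRank :", np.round(pr, 3))
print("CheiRank :", np.round(cheirank, 3))
```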

  9. A stable systemic risk ranking in China's banking sector: Based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing

    2018-02-01

    In this paper, we compare five popular systemic risk rankings, and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to factor loadings of the first component, the PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical a method as fundamentals-based ones. This PCA combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and prompts banks to take measures in advance to prevent and cope with financial crises.
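
    A minimal sketch of the PCA combination step described above, with synthetic numbers standing in for the five systemic risk measures; the first principal component of the standardized measures is used as the combined score and the banks are ranked by it.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
banks = [f"bank_{i}" for i in range(10)]

# Synthetic stand-ins for five systemic risk measures (columns), one row per bank.
risk_measures = rng.normal(size=(10, 5)) + rng.normal(size=(10, 1))

# Standardize, then use the first principal component as a combined risk score.
scores = StandardScaler().fit_transform(risk_measures)
pca = PCA(n_components=1)
combined = pca.fit_transform(scores).ravel()

# Crude orientation heuristic: flip the sign (which PCA leaves arbitrary) so
# that larger combined scores mean more systemic risk.
if pca.components_[0].sum() < 0:
    combined = -combined

for bank, score in sorted(zip(banks, combined), key=lambda t: -t[1]):
    print(f"{bank}: {score:+.2f}")
```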

  10. Understanding the topological characteristics and flow complexity of urban traffic congestion

    NASA Astrophysics Data System (ADS)

    Wen, Tzai-Hung; Chin, Wei-Chien-Benny; Lai, Pei-Chun

    2017-05-01

    For a growing number of developing cities, the capacities of streets cannot meet the rapidly growing demand of cars, causing traffic congestion. Understanding the spatial-temporal process of traffic flow and detecting traffic congestion are important issues associated with developing sustainable urban policies to resolve congestion. Therefore, the objective of this study is to propose a flow-based ranking algorithm for investigating traffic demands in terms of the attractiveness of street segments and flow complexity of the street network based on turning probability. Our results show that, by analyzing the topological characteristics of streets and volume data for a small fraction of street segments in Taipei City, the most congested segments of the city were identified successfully. The identified congested segments are significantly close to the potential congestion zones, including the officially announced most congested streets, the segments with slow moving speeds at rush hours, and the areas near significant landmarks. The identified congested segments also captured congestion-prone areas concentrated in the business districts and industrial areas of the city. Identifying the topological characteristics and flow complexity of traffic congestion provides network topological insights for sustainable urban planning, and these characteristics can be used to further understand congestion propagation.

  11. A Gaussian-based rank approximation for subspace clustering

    NASA Astrophysics Data System (ADS)

    Xu, Fei; Peng, Chong; Hu, Yunhong; He, Guoping

    2018-04-01

    Low-rank representation (LRR) has been shown successful in seeking low-rank structures of data relationships in a union of subspaces. Generally, LRR and LRR-based variants need to solve the nuclear norm-based minimization problems. Beyond the success of such methods, it has been widely noted that the nuclear norm may not be a good rank approximation because it simply adds all singular values of a matrix together and thus large singular values may dominate the weight. This results in a far from satisfactory rank approximation and may degrade the performance of low-rank models based on the nuclear norm. In this paper, we propose a novel nonconvex rank approximation based on the Gaussian distribution function, which has desirable properties that make it a better rank approximation than the nuclear norm. Then a low-rank model is proposed based on the new rank approximation with application to motion segmentation. Experimental results have shown significant improvements and verified the effectiveness of our method.
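
    The record does not reproduce the paper's exact Gaussian-based surrogate, so the sketch below uses an assumed saturating Gaussian penalty on each singular value purely to illustrate the contrast with the nuclear norm: the surrogate saturates toward the true rank, whereas the nuclear norm grows with the magnitude of the singular values.

```python
import numpy as np

def nuclear_norm(X):
    """Sum of singular values: large singular values dominate the penalty."""
    return np.linalg.svd(X, compute_uv=False).sum()

def gaussian_rank_surrogate(X, gamma=1.0):
    """Assumed Gaussian-style rank surrogate: each singular value contributes
    at most 1, so the value saturates toward the true rank."""
    s = np.linalg.svd(X, compute_uv=False)
    return np.sum(1.0 - np.exp(-s**2 / (2.0 * gamma**2)))

rng = np.random.default_rng(2)
low_rank = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 50))  # a rank-3 matrix

print("true rank            :", np.linalg.matrix_rank(low_rank))
print("nuclear norm         :", round(nuclear_norm(low_rank), 2))
print("Gaussian surrogate   :", round(gaussian_rank_surrogate(low_rank), 2))
print("Gaussian, 10x scaled :", round(gaussian_rank_surrogate(10 * low_rank), 2))
```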

  12. Evaluation and ranking of candidate ceramic wafer engine seal materials

    NASA Technical Reports Server (NTRS)

    Steinetz, Bruce M.

    1991-01-01

    Modern engineered ceramics offer high temperature capabilities not found in even the best superalloy metals. The high temperature properties of several selected ceramics including aluminum oxide, silicon carbide, and silicon nitride are reviewed as they apply to hypersonic engine seal design. A ranking procedure is employed to objectively differentiate among four different monolithic ceramic materials considered, including: a cold-pressed and sintered aluminum oxide; a sintered alpha-phase silicon carbide; a hot-isostatically pressed silicon nitride; and a cold-pressed and sintered silicon nitride. This procedure is used to narrow the wide range of potential ceramics considered to an acceptable number for future detailed and costly analyses and tests. The materials are numerically scored according to their high temperature flexural strength; high temperature thermal conductivity; resistance to crack growth; resistance to high heating rates; fracture toughness; Weibull modulus; and finally according to their resistance to leakage flow, where materials having coefficients of thermal expansion closely matching the engine panel material resist leakage flow best. The cold-pressed and sintered silicon nitride (Kyocera SN-251) ranked the highest in the overall ranking, especially when implemented in engine panels made of low expansion rate materials being considered for the engine, including Incoloy and titanium alloys.

  13. Post-fire debris-flow hazard assessment of the area burned by the 2013 Beaver Creek Fire near Hailey, central Idaho

    USGS Publications Warehouse

    Skinner, Kenneth D.

    2013-01-01

    A preliminary hazard assessment was developed for debris-flow hazards in the 465 square-kilometer (115,000 acres) area burned by the 2013 Beaver Creek fire near Hailey in central Idaho. The burn area covers all or part of six watersheds and selected basins draining to the Big Wood River and is at risk of substantial post-fire erosion, such as that caused by debris flows. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the Intermountain Region in Western United States were used to estimate the probability of debris-flow occurrence, potential volume of debris flows, and the combined debris-flow hazard ranking along the drainage network within the burn area and to estimate the same for analyzed drainage basins within the burn area. Input data for the empirical models included topographic parameters, soil characteristics, burn severity, and rainfall totals and intensities for a (1) 2-year-recurrence, 1-hour-duration rainfall, referred to as a 2-year storm (13 mm); (2) 10-year-recurrence, 1-hour-duration rainfall, referred to as a 10-year storm (19 mm); and (3) 25-year-recurrence, 1-hour-duration rainfall, referred to as a 25-year storm (22 mm). Estimated debris-flow probabilities for drainage basins upstream of 130 selected basin outlets ranged from less than 1 to 78 percent with the probabilities increasing with each increase in storm magnitude. Probabilities were high in three of the six watersheds. For the 25-year storm, probabilities were greater than 60 percent for 11 basin outlets and ranged from 50 to 60 percent for an additional 12 basin outlets. Probability estimates for stream segments within the drainage network can vary within a basin. For the 25-year storm, probabilities for stream segments within 33 basins were higher than the basin outlet, emphasizing the importance of evaluating the drainage network as well as basin outlets. Estimated debris-flow volumes for the three modeled storms range from a minimal debris-flow volume of 10 cubic meters (m3) to greater than 100,000 m3. Estimated debris-flow volumes increased with basin size and distance downstream. For the 25-year storm, estimated debris-flow volumes were greater than 100,000 m3 for 4 basins and between 50,000 and 100,000 m3 for 10 basins. The debris-flow hazard rankings did not result in the highest hazard ranking of 5, indicating that none of the basins had a high probability of debris-flow occurrence and a high debris-flow volume estimate. The hazard ranking was 4 for one basin using the 10-year-recurrence storm model and for three basins using the 25-year-recurrence storm model. The maps presented herein may be used to prioritize areas where post-wildfire remediation efforts should take place within the 2- to 3-year period of increased erosional vulnerability.

  14. Dominance-based ranking functions for interval-valued intuitionistic fuzzy sets.

    PubMed

    Chen, Liang-Hsuan; Tu, Chien-Cheng

    2014-08-01

    The ranking of interval-valued intuitionistic fuzzy sets (IvIFSs) is difficult since they include the interval values of membership and nonmembership. This paper proposes ranking functions for IvIFSs based on the dominance concept. The proposed ranking functions consider the degree to which an IvIFS dominates and is not dominated by other IvIFSs. Based on the bivariate framework and the dominance concept, the functions incorporate not only the boundary values of membership and nonmembership, but also the relative relations among IvIFSs in comparisons. The dominance-based ranking functions include bipolar evaluations with a parameter that allows the decision-maker to reflect his actual attitude in allocating the various kinds of dominance. The relationship for two IvIFSs that satisfy the dual couple is defined based on four proposed ranking functions. Importantly, the proposed ranking functions can achieve a full ranking for all IvIFSs. Two examples are used to demonstrate the applicability and distinctiveness of the proposed ranking functions.

  15. Software-based on-site estimation of fractional flow reserve using standard coronary CT angiography data.

    PubMed

    De Geer, Jakob; Sandstedt, Mårten; Björkholm, Anders; Alfredsson, Joakim; Janzon, Magnus; Engvall, Jan; Persson, Anders

    2016-10-01

    The significance of a coronary stenosis can be determined by measuring the fractional flow reserve (FFR) during invasive coronary angiography. Recently, methods have been developed which claim to be able to estimate FFR using image data from standard coronary computed tomography angiography (CCTA) exams. To evaluate the accuracy of non-invasively computed fractional flow reserve (cFFR) from CCTA. A total of 23 vessels in 21 patients who had undergone both CCTA and invasive angiography with FFR measurement were evaluated using a cFFR software prototype. The cFFR results were compared to the invasively obtained FFR values. Correlation was calculated using Spearman's rank correlation, and agreement using intraclass correlation coefficient (ICC). Sensitivity, specificity, accuracy, negative predictive value, and positive predictive value for significant stenosis (defined as both FFR ≤0.80 and FFR ≤0.75) were calculated. The mean cFFR value for the whole group was 0.81 and the corresponding mean invFFR value was 0.84. The cFFR sensitivity for significant stenosis (FFR ≤0.80/0.75) on a per-lesion basis was 0.83/0.80, specificity was 0.76/0.89, and accuracy 0.78/0.87. The positive predictive value was 0.56/0.67 and the negative predictive value was 0.93/0.94. The Spearman rank correlation coefficient was ρ = 0.77 (P < 0.001) and ICC = 0.73 (P < 0.001). This particular CCTA-based cFFR software prototype allows for a rapid, non-invasive on-site evaluation of cFFR. The results are encouraging and cFFR may in the future be of help in the triage to invasive coronary angiography. © The Foundation Acta Radiologica 2015.
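
    The summary statistics reported above are straightforward to reproduce on paired per-lesion data. In the sketch below the invasive FFR and cFFR values are synthetic placeholders; it computes the Spearman rank correlation and the per-lesion diagnostic metrics at the FFR <= 0.80 threshold.

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic paired per-lesion values standing in for invasive FFR and
# CT-derived cFFR (the study's actual measurements are not reproduced here).
inv_ffr = np.array([0.92, 0.85, 0.74, 0.68, 0.81, 0.90, 0.77, 0.62, 0.88, 0.79])
cffr    = np.array([0.90, 0.82, 0.76, 0.70, 0.78, 0.88, 0.73, 0.65, 0.91, 0.83])

rho, p_value = spearmanr(inv_ffr, cffr)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# Per-lesion diagnostic performance at the FFR <= 0.80 threshold.
truth = inv_ffr <= 0.80   # significant stenosis by invasive FFR
pred = cffr <= 0.80       # significant stenosis by computed FFR
tp = np.sum(pred & truth)
tn = np.sum(~pred & ~truth)
fp = np.sum(pred & ~truth)
fn = np.sum(~pred & truth)
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("accuracy   :", (tp + tn) / len(truth))
```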

  16. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows.

    PubMed

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by collective successive behaviors of users. Knowing how people allocate limited attention across different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high dimensional Euclidean space based on the open flow network model of a large number of Chinese users' collective attention flows, which both considers the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate on the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy a disproportionately large share of the attention flows (40%), and the sites (34%) in the interim attract 40%, whereas other sites (65%) only take 20% of the flows. What's more, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites with similar contents and topics are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other.

  17. A Citation-Based Ranking of German-Speaking Researchers in Business Administration with Data of Google Scholar

    ERIC Educational Resources Information Center

    Dilger, Alexander; Müller, Harry

    2013-01-01

    Rankings of academics can be constructed in two different ways, either based on journal rankings or based on citations. Although citation-based rankings promise some fundamental advantages they are still not common in German-speaking business administration. However, the choice of the underlying database is crucial. This article argues that for…

  18. Froude space: An aquatic currency derived from remote sensing data for assessing ecological potential of river floodplains

    NASA Astrophysics Data System (ADS)

    Lorang, M. S.; Stanford, J.; Steele, B.

    2009-12-01

    In this research we take a systems ecology approach to the evaluation of river floodplains by ranking them according to their energetic complexity at or near base flow conditions. The underlying hypothesis is that energetic complexity equates to a higher potential for sustaining maximum biological diversity, in particular as it relates to Salmonids. The Froude (Fr) number is a hydraulic index of relative specific energy in a flowing water column, ranging from calm, no-flow conditions at Fr = 0, through 0.8 at the onset of rapids, to values approaching or exceeding 1 at locations of breaking waves and hydraulic jumps. Most of the water flowing in a gravel-bed river exists in the transition range of Fr = 0.1 to 0.8, creating a complex array of potential hydrologic habitat commonly described through observation as riffles, runs, pools, eddies, and so on. We use 1.6 m2 resolution multispectral satellite imagery to predict and map water depth (h), mean flow velocity (V) and Froude number (Fr=V/(gh)^0.5) by using a distribution-free statistical learner and error analysis approach. This approach links measures of V and h made from a raft deploying an acoustic Doppler profiler (ADP) and GPS with the reflectance characteristics from the satellite imagery (4 bands) that correspond to each ADP profile. This analysis of Fr space in combination with independent classification of depth and velocity provides physical metrics related to the energetic state of flow in the river at the time of image acquisition. We use these metrics, determined from a suite of 23 floodplains spread across the rim of the North Pacific (including British Columbia, Alaska and the Kamchatka Peninsula of Russia) and covering the range in fluvial geomorphic type from braided to meandering, to rank them in terms of energetic complexity.
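
    A small sketch of the Froude number calculation given above (Fr = V/(gh)^0.5) applied to illustrative depth and velocity values, with a crude habitat classification following the Fr ranges quoted in the abstract; the actual study derives h and V per pixel from imagery and ADP calibration data.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def froude(velocity, depth):
    """Froude number Fr = V / sqrt(g * h) for mean velocity V (m/s) and depth h (m)."""
    return velocity / np.sqrt(G * depth)

def habitat(fr):
    """Crude classification based on the Fr ranges quoted in the abstract."""
    if fr < 0.1:
        return "calm / pool-like"
    if fr < 0.8:
        return "transitional (riffle/run/eddy)"
    return "rapids / breaking waves"

# Illustrative per-pixel depth (m) and mean velocity (m/s), standing in for the
# values predicted from the multispectral imagery.
depth = np.array([2.5, 1.2, 0.6, 0.3, 0.15])
velocity = np.array([0.2, 0.6, 1.1, 1.4, 1.3])

for h, v in zip(depth, velocity):
    fr = froude(v, h)
    print(f"h={h:4.2f} m  V={v:3.1f} m/s  Fr={fr:4.2f}  {habitat(fr)}")
```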

  19. Quantum probability ranking principle for ligand-based virtual screening.

    PubMed

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-04-01

    Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between physical quantum experiments and the ranking of molecular structures represented by 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, a new framework of molecular representation connects chemical compounds with a mathematical quantum space; second, the similarity between chemical libraries and reference structures is estimated with a quantum-based similarity searching method; finally, the molecules are ranked using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  20. Quantum probability ranking principle for ligand-based virtual screening

    NASA Astrophysics Data System (ADS)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-04-01

    Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between physical quantum experiments and the ranking of molecular structures represented by 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, a new framework of molecular representation connects chemical compounds with a mathematical quantum space; second, the similarity between chemical libraries and reference structures is estimated with a quantum-based similarity searching method; finally, the molecules are ranked using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  1. Gene differentiation among ten endogamous groups of West Bengal, India.

    PubMed

    Chakraborty, R; Walter, H; Mukherjee, B N; Malhotra, K C; Sauber, P; Banerjee, S; Roy, M

    1986-11-01

    Ten endogamous populations of West Bengal, India have been surveyed for genetic variation in 12 systems. These populations encompass all social ranks in the caste hierarchy and cover almost the entire geographic area of the state. Gene diversity analysis suggests that these groups exhibit significant allele frequency variation at all but three loci. The overall genetic difference is not, however, in accord with the classification based on caste. Two low-ranking scheduled caste groups are, in fact, in close proximity with the high-caste ones, suggesting evidence of past generations of gene flow among them. Three different clusters of groups emerge from the present data, providing support for the anthropologic assertion that in Bengal Proto-Australoid, Caucasoid, and Mongoloid racial elements generally coexist. However, these three components are not uniformly present in all groups. Geographic separation of the groups is a strong determinant of the gene differentiation that exists among these populations.

  2. Relative Influence of Professional Counseling Journals

    ERIC Educational Resources Information Center

    Fernando, Delini M.; Barrio Minton, Casey A.

    2011-01-01

    The authors used social network analysis of citation data to study the flow of information and relative influence of 17 professional counseling journals. Although the "Journal of Counseling & Development" ranked very highly in all measures of journal influence, several division journals emerged as key players in the flow of information within the…

  3. Consistency of patterns in concentration‐discharge plots

    USGS Publications Warehouse

    Chanat, Jeffrey G.; Rice, Karen C.; Hornberger, George M.

    2002-01-01

    Concentration‐discharge (c‐Q) plots have been used to infer how flow components such as event water, soil water, and groundwater mix to produce the observed episodic hydrochemical response of small catchments. Because c‐Q plots are based only on observed streamflow and solute concentration, their interpretation requires assumptions about the relative volume, hydrograph timing, and solute concentration of the streamflow end‐members. Evans and Davies [1998] present a taxonomy of c‐Q loops resulting from three‐component conservative mixing. Their analysis, based on a fixed template of end‐member hydrograph volume, timing, and concentration, suggests a unique relationship between c‐Q loop form and the rank order of end‐member concentrations. Many catchments exhibit variability in component contributions to storm flow in response to antecedent conditions or rainfall characteristics, but the effects of such variation on c‐Q relationships have not been studied systematically. Starting with a “baseline” condition similar to that assumed by Evans and Davies [1998], we use a simple computer model to characterize the variability in c‐Q plot patterns resulting from variation in end‐member volume, timing, and solute concentration. Variability in these three factors can result in more than one c‐Q loop shape for a given rank order of end‐member solute concentrations. The number of resulting hysteresis patterns and their relative frequency depends on the rank order of solute concentrations and on their separation in absolute value. In ambiguous cases the c‐Q loop shape is determined by the relative “prominence” of the event water versus soil water components. This “prominence” is broadly defined as a capacity to influence the total streamflow concentration and may result from a combination of end‐member volume, timing, or concentration. The modeling results indicate that plausible hydrological variability in field situations can confound the interpretation of c‐Q plots, even when fundamental end‐member mixing assumptions are satisfied.
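
    A minimal sketch of the three-component conservative mixing idea discussed above: the end-member hydrograph shapes, timings, and concentrations below are invented, and plotting concentration against discharge traces out a c-Q hysteresis loop whose form depends on those choices.

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 48, 200)  # hours since the start of the event

def pulse(t, rise, recession):
    """Simple skewed storm pulse (arbitrary shape, for illustration only)."""
    return (t / rise) ** 2 * np.exp(-(t - rise) / recession)

# Assumed end-member hydrographs (volume and timing) and concentrations (mg/L).
q_event = 4.0 * pulse(t, rise=6.0, recession=4.0)    # early, flashy event water
q_soil  = 2.0 * pulse(t, rise=14.0, recession=8.0)   # delayed soil water
q_gw    = np.full_like(t, 1.0)                        # steady groundwater base flow
c_event, c_soil, c_gw = 2.0, 8.0, 5.0

Q = q_event + q_soil + q_gw
C = (c_event * q_event + c_soil * q_soil + c_gw * q_gw) / Q  # conservative mixing

plt.plot(Q, C)
plt.xlabel("discharge Q")
plt.ylabel("concentration c")
plt.title("c-Q hysteresis from three-component mixing (illustrative)")
plt.show()
```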

  4. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houweling, Antonetta C., E-mail: A.Houweling@umcutrecht.n; Philippens, Marielle E.P.; Dijkema, Tim

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.
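
    The preferred mean dose model lends itself to a one-function sketch. A common parameterization is a probit function of the mean dose; the record gives a TD50 of about 39 Gy, while the slope parameter m below is an assumed illustrative value rather than one reported in the study.

```python
from math import erf, sqrt

def ntcp_mean_dose(mean_dose_gy, td50=39.0, m=0.40):
    """Probit-style mean dose NTCP model.

    td50 = 39 Gy follows the abstract; the slope parameter m is an assumed
    illustrative value, and the study's exact functional form may differ.
    """
    t = (mean_dose_gy - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

for dose in (10, 20, 30, 39, 50, 60):
    print(f"mean parotid dose {dose:2d} Gy -> NTCP = {ntcp_mean_dose(dose):.2f}")
```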

  5. Research on B Cell Algorithm for Learning to Rank Method Based on Parallel Strategy.

    PubMed

    Tian, Yuling; Zhang, Hongxian

    2016-01-01

    For the purposes of information retrieval, users must find highly relevant documents from within a system (and often a quite large one comprised of many individual documents) based on an input query. Ranking the documents according to their relevance within the system to meet user needs is a challenging endeavor, and a hot research topic; there already exist several rank-learning methods based on machine learning techniques which can generate ranking functions automatically. This paper proposes a parallel B cell algorithm, RankBCA, for rank learning which utilizes a clonal selection mechanism based on biological immunity. The novel algorithm is compared with traditional rank-learning algorithms through experimentation and shown to outperform the others with respect to accuracy, learning time, and convergence rate; taken together, the experimental results show that the proposed algorithm indeed effectively and rapidly identifies optimal ranking functions.

  6. Research on B Cell Algorithm for Learning to Rank Method Based on Parallel Strategy

    PubMed Central

    Tian, Yuling; Zhang, Hongxian

    2016-01-01

    For the purposes of information retrieval, users must find highly relevant documents from within a system (and often a quite large one comprised of many individual documents) based on an input query. Ranking the documents according to their relevance within the system to meet user needs is a challenging endeavor, and a hot research topic; there already exist several rank-learning methods based on machine learning techniques which can generate ranking functions automatically. This paper proposes a parallel B cell algorithm, RankBCA, for rank learning which utilizes a clonal selection mechanism based on biological immunity. The novel algorithm is compared with traditional rank-learning algorithms through experimentation and shown to outperform the others with respect to accuracy, learning time, and convergence rate; taken together, the experimental results show that the proposed algorithm indeed effectively and rapidly identifies optimal ranking functions. PMID:27487242

  7. Enhancement of automated blood flow estimates (ENABLE) from arterial spin-labeled MRI.

    PubMed

    Shirzadi, Zahra; Stefanovic, Bojana; Chappell, Michael A; Ramirez, Joel; Schwindt, Graeme; Masellis, Mario; Black, Sandra E; MacIntosh, Bradley J

    2018-03-01

    To validate a multiparametric automated algorithm, ENhancement of Automated Blood fLow Estimates (ENABLE), that identifies useful and poor arterial spin-labeled (ASL) difference images in multiple postlabeling delay (PLD) acquisitions and thereby improve clinical ASL. ENABLE is a sort/check algorithm that uses a linear combination of ASL quality features. ENABLE uses simulations to determine quality weighting factors based on an unconstrained nonlinear optimization. We acquired a set of 6-PLD ASL images with 1.5T or 3.0T systems among 98 healthy elderly and adults with mild cognitive impairment or dementia. We contrasted signal-to-noise ratio (SNR) of cerebral blood flow (CBF) images obtained with ENABLE vs. conventional ASL analysis. In a subgroup, we validated our CBF estimates with single-photon emission computed tomography (SPECT) CBF images. ENABLE produced significantly increased SNR compared to a conventional ASL analysis (Wilcoxon signed-rank test, P < 0.0001). We also found the similarity between ASL and SPECT was greater when using ENABLE vs. conventional ASL analysis (n = 51, Wilcoxon signed-rank test, P < 0.0001) and this similarity was strongly related to ASL SNR (t = 24, P < 0.0001). These findings suggest that ENABLE improves CBF image quality from multiple PLD ASL in dementia cohorts at either 1.5T or 3.0T, achieved by multiparametric quality features that guided postprocessing of dementia ASL. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:647-655. © 2017 International Society for Magnetic Resonance in Medicine.

  8. Matching visual and nonvisual signals: evidence for a mechanism to discount optic flow during locomotion

    NASA Astrophysics Data System (ADS)

    Thurrell, Adrian; Pelah, Adar

    2005-03-01

    We report on recent experiments to investigate the Arthrovisual Locomotor Effect (ALE), a mechanism based on non-visual signals postulated to discount or remove the self-generated visual motion signals during locomotion. It is shown that perceptual matches made by standing subjects to a constant motion optic flow stimulus that is viewed while walking on a treadmill are linearly reduced by walking speed, a measure of the reported ALE. The degree of reduction in perceived speed depends on the similarity of the motor activity to natural locomotion, thus for the four activities tested, ALE strength is ranked as follows: Walking > Cycling > Hand Pedalling > Finger Tapping = 0. Other variations and important controls for the ALE are described.

  9. PEGDA hydrogels as a replacement for animal tissues in mucoadhesion testing.

    PubMed

    Eshel-Green, Tal; Eliyahu, Shaked; Avidan-Shlomovich, Shlomit; Bianco-Peled, Havazelet

    2016-06-15

    Utilization of animal parts in ex-vivo mucoadhesion assays is a common approach that presents many difficulties due to animal rights issues and large variance between animals. This study examines the suitability of two PEGDA (poly(ethylene glycol) diacrylate) based hydrogels to serve as tissue mimetics for mucoadhesion evaluation. One hydrogel, termed PEGDA-QT, was composed of pentaerythritol tetrakis (3-mercaptopropionate) and PEG and contained free thiol groups mimicking those found in natural mucosa. The other hydrogel was formed by UV (ultraviolet) curing of PEGDA and mimicked the mechanical properties of mucosa but not its chemical composition. When ranking different first generation mucoadhesive polymers using a tensile assay, both hydrogels showed good agreement with the ranking achieved for porcine small intestine. However, only PEGDA-QT and porcine small intestine shared a similar displacement curve. The same ranking for PEGDA-QT and porcine small intestine was also observed when comparing a second-generation mucoadhesive polymer, thiolated alginate, to native alginate. Our findings suggest that PEGDA-QT could serve as a replacement for porcine small intestine in both mucoadhesion evaluations using a tensile machine and the flow-through method for first and second-generation mucoadhesive polymers. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Variation in feeding, aggression, and position choice between hatchery and wild cutthroat trout in an artificial stream

    USGS Publications Warehouse

    Mesa, Matthew G.

    1991-01-01

    I compared feeding, aggressive behavior, and spatial distribution of differently ranked individuals of hatchery and wild coastal cutthroat trout Oncorhynchus clarki clarki in an artificial stream. Both hatchery and wild groups established stable dominance hierarchies that seemed to be based on size differences. Hatchery and wild fish within a hierarchical rank fed at similar rates. Hatchery fish were more aggressive than their wild conspecifics, irrespective of rank. Dominant hatchery fish were evenly distributed in pools and riffles, whereas dominant wild fish were three times more often in pools than in riffles. In both groups, socially intermediate fish were almost evenly distributed between pools and riffles, and subordinate fish spent most of their time in pools. On average, hatchery fish spent 57% of their time in pools and 43% in riffles, whereas wild fish spent 71% of their time in pools and 29% in riffles. These results support the hypothesis that excessive expenditure of energy for unnecessary aggression, use of fast-flowing water, or other purposes contributes to poor survival of hatchery fish after they are stocked in streams. Poor survival would reduce the efficacy of using hatchery stocks to supplement wild production.

  11. Rank-based pooling for deep convolutional neural networks.

    PubMed

    Shi, Zenglin; Ye, Yangdong; Wu, Yunpeng

    2016-11-01

    Pooling is a key mechanism in deep convolutional neural networks (CNNs) which helps to achieve translation invariance. Numerous studies, both empirically and theoretically, show that pooling consistently boosts the performance of the CNNs. The conventional pooling methods operate on activation values. In this work, we alternatively propose rank-based pooling. It is derived from the observation that the ranking list is invariant under changes of activation values in a pooling region, and thus a rank-based pooling operation may achieve more robust performance. In addition, the reasonable usage of rank can avoid the scale problems encountered by value-based methods. The novel pooling mechanism can be regarded as an instance of weighted pooling where a weighted sum of activations is used to generate the pooling output. This pooling mechanism can also be realized as rank-based average pooling (RAP), rank-based weighted pooling (RWP) and rank-based stochastic pooling (RSP) according to different weighting strategies. As another major contribution, we present a novel criterion to analyze the discriminant ability of various pooling methods, which is heavily under-researched in the machine learning and computer vision communities. Experimental results on several image benchmarks show that rank-based pooling outperforms the existing pooling methods in classification performance. We further demonstrate better performance on CIFAR datasets by integrating RSP into Network-in-Network. Copyright © 2016 Elsevier Ltd. All rights reserved.
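
    A minimal NumPy sketch of the rank-based average pooling (RAP) idea described above: within each pooling region the activations are ranked and the top-ranked values are averaged. The pool size and the number of retained ranks are arbitrary here, and the weighted (RWP) and stochastic (RSP) variants would map ranks to weights differently.

```python
import numpy as np

def rank_based_average_pool(x, pool=2, top_t=2):
    """Rank-based average pooling on a 2D activation map.

    In each non-overlapping pool x pool region, activations are ranked and the
    top_t highest-ranked values are averaged (a simple instance of rank-based
    pooling; weighted and stochastic variants use the ranks differently).
    """
    h, w = x.shape
    out = np.zeros((h // pool, w // pool))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            region = x[i * pool:(i + 1) * pool, j * pool:(j + 1) * pool].ravel()
            top = np.sort(region)[::-1][:top_t]   # rank by activation value
            out[i, j] = top.mean()
    return out

rng = np.random.default_rng(3)
feature_map = rng.normal(size=(4, 4))
print(rank_based_average_pool(feature_map, pool=2, top_t=2))
```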

  12. Motor skill failure or flow-experience? Functional brain asymmetry and brain connectivity in elite and amateur table tennis players.

    PubMed

    Wolf, Sebastian; Brölz, Ellen; Keune, Philipp M; Wesa, Benjamin; Hautzinger, Martin; Birbaumer, Niels; Strehl, Ute

    2015-02-01

    Functional hemispheric asymmetry is assumed to constitute one underlying neurophysiological mechanism of flow-experience and skilled psycho-motor performance in table tennis athletes. We hypothesized that when initiating motor execution during motor imagery, elite table tennis players show higher right- than left-hemispheric temporal activity and stronger right temporal-premotor than left temporal-premotor theta coherence compared to amateurs. We additionally investigated whether less pronounced left temporal cortical activity is associated with more world rank points and more flow-experience. To this aim, electroencephalographic data were recorded in 14 experts and 15 amateur table tennis players. Subjects watched videos of an opponent serving a ball and were instructed to imagine themselves responding with a specific table tennis stroke. Alpha asymmetry scores were calculated by subtracting left from right hemispheric 8-13 Hz alpha power. 4-7 Hz theta coherence was calculated between temporal (T3/T4) and premotor (Fz) cortex. Experts showed a significantly stronger shift towards lower relative left-temporal brain activity compared to amateurs and a significantly stronger right temporal-premotor coherence than amateurs. The shift towards lower relative left-temporal brain activity in experts was associated with more flow-experience and lower relative left temporal activity was correlated with more world rank points. The present findings suggest that skilled psycho-motor performance in elite table tennis players reflects less desynchronized brain activity in the left hemisphere and more coherent brain activity between fronto-temporal and premotor oscillations in the right hemisphere. This pattern probably reflects less interference from irrelevant verbal-analytical communication with motor-control mechanisms, which is associated with flow-experience and predicts world rank in experts. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Is there an association between flow diverter fish mouthing and delayed-type hypersensitivity to metals?-a case-control study.

    PubMed

    Kocer, Naci; Mondel, Prabath Kumar; Yamac, Elif; Kavak, Ayse; Kizilkilic, Osman; Islak, Civan

    2017-11-01

    Flow diverters are increasingly used in the treatment of complex and giant intracranial aneurysms. However, they are associated with complications like late aneurysmal rupture. Additionally, flow diverters show focal structural decrease in luminal diameter without any intimal hyperplasia. This resembles a "fish mouth" when viewed en face. In this pilot study, we tested the hypothesis of a possible association between flow diverter fish-mouthing and delayed-type hypersensitivity to its metal constituents. We retrospectively reviewed patient records from our center between May 2010 and November 2015. A total of nine patients had flow diverter fish mouthing. A control group of 25 patients was selected. All study participants underwent prospective patch test to detect hypersensitivity to flow diverter metal constituents. Analysis was performed using logistic regression analysis and Wilcoxon sign rank sum test. Univariate and multivariate analyses were performed to test variables to predict flow diverter fish mouthing. The association between flow diverter fish mouthing and positive patch test was not statistically significant. In multivariate analysis, history of allergy and maximum aneurysm size category was associated with flow diverter fish mouthing. This was further confirmed on Wilcoxon sign rank sum test. The study showed statistically significant association between flow diverter fish mouthing and history of contact allergy and a small aneurysmal size. Further large-scale studies are needed to detect a statistically significant association between flow diverter fish mouthing and patch test. We recommend early and more frequent follow-up imaging in patients with contact allergy to detect flow diverter fish mouthing and its subsequent evolution.

  14. Google matrix of Twitter

    NASA Astrophysics Data System (ADS)

    Frahm, K. M.; Shepelyansky, D. L.

    2012-10-01

    We construct the Google matrix of the entire Twitter network, dated by July 2009, and analyze its spectrum and eigenstate properties including the PageRank and CheiRank vectors and 2DRanking of all nodes. Our studies show much stronger inter-connectivity between top PageRank nodes for the Twitter network compared to the networks of Wikipedia and British Universities studied previously. Our analysis allows us to locate the top Twitter users who control the information flow on the network. We argue that this small fraction of the whole number of users, which can be viewed as the social network elite, plays the dominant role in the process of opinion formation on the network.
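
    For readers unfamiliar with the underlying machinery, the sketch below illustrates PageRank by power iteration on a toy directed graph, with CheiRank obtained as the PageRank of the same graph with all links reversed. It is an illustrative approximation of the Google matrix construction on a four-node example, not the Twitter-scale computation reported above.

```python
# Illustrative sketch: PageRank by power iteration on a toy directed graph, and
# CheiRank as PageRank of the graph with all links reversed. The Google matrix is
# G = alpha * S + (1 - alpha) / N, with dangling columns replaced by uniform columns.
import numpy as np

def google_matrix(adj, alpha=0.85):
    n = adj.shape[0]
    col_sums = adj.sum(axis=0)
    # Normalize each column; columns with no out-links become uniform.
    S = np.where(col_sums > 0, adj / np.where(col_sums == 0, 1, col_sums), 1.0 / n)
    return alpha * S + (1 - alpha) / n

def pagerank(adj, alpha=0.85, tol=1e-10, max_iter=1000):
    G = google_matrix(adj, alpha)
    p = np.full(adj.shape[0], 1.0 / adj.shape[0])
    for _ in range(max_iter):
        p_new = G @ p
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p / p.sum()

# adj[i, j] = 1 if there is a link from node j to node i (column = source node).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)

print("PageRank:", pagerank(adj))
print("CheiRank:", pagerank(adj.T))   # same graph with reversed links
```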

  15. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low-rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all use low rankness or sparsity in the input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework to allow nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to the kernel framework with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low-rank formulation. The algorithm consists of manifold learning using the kernel, low-rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experiment results show that the proposed method surpasses the conventional low-rank-modeled approaches for dMRI.

  16. Rankings of Economics Faculties and Representation on Editorial Boards of Top Journals.

    ERIC Educational Resources Information Center

    Gibbons, Jean D.; Fish, Mary

    1991-01-01

    Presents rankings of U.S. university economics departments. Explains that the rankings are based upon representation of the departments on the editorial boards of leading economics journals. Reports that results are similar to rankings based upon other criteria. (DK)

  17. A fast sorting algorithm for a hypersonic rarefied flow particle simulation on the connection machine

    NASA Technical Reports Server (NTRS)

    Dagum, Leonardo

    1989-01-01

    The data parallel implementation of a particle simulation for hypersonic rarefied flow described by Dagum associates a single parallel data element with each particle in the simulation. The simulated space is divided into discrete regions called cells containing a variable and constantly changing number of particles. The implementation requires a global sort of the parallel data elements so as to arrange them in an order that allows immediate access to the information associated with cells in the simulation. Described here is a very fast algorithm for performing the necessary ranking of the parallel data elements. The performance of the new algorithm is compared with that of the microcoded instruction for ranking on the Connection Machine.
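
    The essential idea of the ranking step can be stated serially: count the particles in each cell, prefix-sum the counts into cell offsets, and assign each particle a rank equal to its cell's offset plus its position within the cell, so that permuting by rank yields cell-contiguous storage. The sketch below is a serial Python illustration of that idea under these assumptions, not the data-parallel Connection Machine implementation described above.

```python
# Serial sketch of the particle ranking step: group particles by cell index so that
# each cell's particles occupy a contiguous block (a counting-sort style rank).
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_particles = 5, 12
cell_of_particle = rng.integers(0, n_cells, size=n_particles)

# Count particles per cell, then prefix-sum to get the start offset of each cell.
counts = np.bincount(cell_of_particle, minlength=n_cells)
starts = np.concatenate(([0], np.cumsum(counts)[:-1]))

# Assign each particle its rank: offset of its cell plus its position within the cell.
rank = np.empty(n_particles, dtype=int)
next_slot = starts.copy()
for p, c in enumerate(cell_of_particle):
    rank[p] = next_slot[c]
    next_slot[c] += 1

# Permuting particle data by 'rank' yields cell-contiguous storage.
order = np.argsort(rank)
print("cells in ranked order:", cell_of_particle[order])
```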

  18. Improving chemical species tomography of turbulent flows using covariance estimation.

    PubMed

    Grauer, Samuel J; Hadwin, Paul J; Daun, Kyle J

    2017-05-01

    Chemical species tomography (CST) experiments can be divided into limited-data and full-rank cases. Both require solving ill-posed inverse problems, and thus the measurement data must be supplemented with prior information to carry out reconstructions. The Bayesian framework formalizes the role of additional information, expressed as the mean and covariance of a joint-normal prior probability density function. We present techniques for estimating the spatial covariance of a flow under limited-data and full-rank conditions. Our results show that incorporating a covariance estimate into CST reconstruction via a Bayesian prior increases the accuracy of instantaneous estimates. Improvements are especially dramatic in real-time limited-data CST, which is directly applicable to many industrially relevant experiments.
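
    As an illustration of how a spatial covariance estimate enters a Bayesian reconstruction, the sketch below computes the MAP estimate for a linear measurement model b = Ax + e with a joint-normal prior. The ray-sum matrix, prior covariance and noise level are synthetic placeholders rather than CST data, and the covariance-estimation techniques of the paper are not reproduced.

```python
# Sketch of a Bayesian (MAP) reconstruction with a joint-normal prior, as used in
# linear tomography problems: b = A x + e, prior x ~ N(x_pr, C_pr), noise e ~ N(0, C_e).
# A, b and the covariances below are synthetic placeholders, not CST data.
import numpy as np

rng = np.random.default_rng(2)
n_pixels, n_rays = 36, 12
A = rng.random((n_rays, n_pixels))          # path-length (ray-sum) matrix
x_true = rng.random(n_pixels)
b = A @ x_true + 0.01 * rng.standard_normal(n_rays)

x_pr = np.full(n_pixels, x_true.mean())     # prior mean
# Prior covariance with spatial correlation (squared-exponential on a 6x6 grid).
xy = np.stack(np.meshgrid(np.arange(6), np.arange(6)), -1).reshape(-1, 2)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
C_pr = 0.1 * np.exp(-(d / 2.0) ** 2) + 1e-4 * np.eye(n_pixels)
C_e = 1e-4 * np.eye(n_rays)                 # measurement-noise covariance

# MAP estimate: x = (A^T C_e^-1 A + C_pr^-1)^-1 (A^T C_e^-1 b + C_pr^-1 x_pr)
Ce_inv, Cpr_inv = np.linalg.inv(C_e), np.linalg.inv(C_pr)
x_map = np.linalg.solve(A.T @ Ce_inv @ A + Cpr_inv, A.T @ Ce_inv @ b + Cpr_inv @ x_pr)
print("relative reconstruction error:",
      np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true))
```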

  19. Seiberg-Witten geometries for Coulomb branch chiral rings which are not freely generated

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argyres, Philip C.; Lü, Yongchao; Martone, Mario

    Coulomb branch chiral rings of N = 2 SCFTs are conjectured to be freely generated. While no counter-example is known, no direct evidence for the conjecture is known either. We initiate a systematic study of SCFTs with Coulomb branch chiral rings satisfying non-trivial relations, restricting our analysis to rank 1. The main result of our study is that (rank-1) SCFTs with non-freely generated CB chiral rings, when deformed by relevant deformations, always flow to theories with non-freely generated CB rings. This implies that if they exist, they must thus form a distinct subset under RG flows. We also find many interesting characteristic properties that these putative theories satisfy which may be helpful in proving or disproving their existence using other methods.

  20. Neophilia Ranking of Scientific Journals.

    PubMed

    Packalen, Mikko; Bhattacharya, Jay

    2017-01-01

    The ranking of scientific journals is important because of the signal it sends to scientists about what is considered most vital for scientific progress. Existing ranking systems focus on measuring the influence of a scientific paper (citations)-these rankings do not reward journals for publishing innovative work that builds on new ideas. We propose an alternative ranking based on the proclivity of journals to publish papers that build on new ideas, and we implement this ranking via a text-based analysis of all published biomedical papers dating back to 1946. In addition, we compare our neophilia ranking to citation-based (impact factor) rankings; this comparison shows that the two ranking approaches are distinct. Prior theoretical work suggests an active role for our neophilia index in science policy. Absent an explicit incentive to pursue novel science, scientists underinvest in innovative work because of a coordination problem: for work on a new idea to flourish, many scientists must decide to adopt it in their work. Rankings that are based purely on influence thus do not provide sufficient incentives for publishing innovative work. By contrast, adoption of the neophilia index as part of journal-ranking procedures by funding agencies and university administrators would provide an explicit incentive for journals to publish innovative work and thus help solve the coordination problem by increasing scientists' incentives to pursue innovative work.

  1. Flat-Plate Solar-Collector Performance Evaluation with a Solar Simulator as a Basis for Collector Selection and Performance Prediction

    NASA Technical Reports Server (NTRS)

    Simon, F. F.

    1975-01-01

    The use of a solar simulator for performance determination permits collector testing under standard conditions of wind, ambient temperature, flow rate and sun. The performance results determined with the simulator have been found to be in good agreement with outdoor performance results. The measured thermal efficiency and evaluation of 23 collectors are reported which differ according to absorber material (copper, aluminum, steel), absorber coating (nonselective black paint, selective copper oxide, selective black nickel, selective black chrome), type of glazing material (glass, Tedlar, Lexan, antireflection glass), the use of honeycomb material and the use of vacuum to prevent thermal convection losses. The collectors were given performance rankings based on noon-hour solar conditions and all-day solar conditions. The determination with the simulator of an all-day collector performance was made possible by tests at different incident angles. The solar performance rankings were made based on whether the collector is to be used for pool heating, hot water, absorption air conditioning, heating, or for a solar Rankine machine.

  2. A Network Analysis of Countries’ Export Flows: Firm Grounds for the Building Blocks of the Economy

    PubMed Central

    Caldarelli, Guido; Cristelli, Matthieu; Gabrielli, Andrea; Pietronero, Luciano; Scala, Antonio; Tacchella, Andrea

    2012-01-01

    In this paper we analyze the bipartite network of countries and products from UN data on country production. We define the country-country and product-product projected networks and introduce a novel method of filtering information based on elements’ similarity. As a result we find that country clustering reveals unexpected socio-geographic links among the most competing countries. On the same footing, the product clustering can be efficiently used for a bottom-up classification of produced goods. Furthermore we mathematically reformulate the “reflections method” introduced by Hidalgo and Hausmann as a fixpoint problem; such formulation highlights some conceptual weaknesses of the approach. To overcome such an issue, we introduce an alternative methodology (based on biased Markov chains) that allows us to rank countries in a conceptually consistent way. Our analysis uncovers a strong non-linear interaction between the diversification of a country and the ubiquity of its products, thus suggesting the possible need to move towards more efficient and direct non-linear fixpoint algorithms to rank countries and products in the global market. PMID:23094044

  3. Manpower planning using Markov Chain model

    NASA Astrophysics Data System (ADS)

    Saad, Syafawati Ab; Adnan, Farah Adibah; Ibrahim, Haslinda; Rahim, Rahela

    2014-07-01

    Manpower planning is a planning model for understanding the flow of manpower in response to policy changes. For this purpose, numerous attempts have been made by researchers to develop models to track the movements of lecturers at various universities. Given the huge number of lecturers in a university, it is difficult to track their movement, and there has been no quantitative way of doing so. This research aims to determine an appropriate manpower model to understand the flow of lecturers in a university in Malaysia by determining the probability and mean time that lecturers remain in the same status rank. In addition, this research is also intended to estimate the number of lecturers in different status ranks (lecturer, senior lecturer and associate professor). Of the several methods applied in manpower planning models in previous studies, the appropriate method used in this research is the Markov Chain model. Results obtained from this study indicate that the manpower planning model used is validated by comparison with actual data; a smaller margin of error gives a better result, meaning that the projection is closer to the actual data. These results provide suggestions for the university in planning the hiring of lecturers and budgeting for the future.
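
    A minimal version of such a model is easy to state: a row-stochastic transition matrix over the status ranks (plus an absorbing "left" state) projects headcounts forward, and the fundamental matrix gives the expected time spent in each rank. The sketch below uses invented transition probabilities purely for illustration and is not the model fitted in the paper.

```python
# Toy sketch of a Markov-chain manpower model with three academic ranks plus an
# absorbing "left" state. The transition probabilities are invented for illustration.
import numpy as np

states = ["lecturer", "senior lecturer", "assoc. professor", "left"]
# Row i -> probability of moving to each state in one year.
P = np.array([
    [0.80, 0.12, 0.00, 0.08],   # lecturer
    [0.00, 0.82, 0.10, 0.08],   # senior lecturer
    [0.00, 0.00, 0.94, 0.06],   # associate professor
    [0.00, 0.00, 0.00, 1.00],   # left (absorbing)
])

# Project staff numbers five years ahead from an initial headcount vector.
n0 = np.array([120.0, 60.0, 25.0, 0.0])
n5 = n0 @ np.linalg.matrix_power(P, 5)
print(dict(zip(states, np.round(n5, 1))))

# Expected number of years spent in each transient rank before leaving:
# fundamental matrix N = (I - Q)^-1, where Q is the transient-to-transient block.
Q = P[:3, :3]
N = np.linalg.inv(np.eye(3) - Q)
print("expected years in each rank, starting as lecturer:", np.round(N[0], 2))
```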

  4. Escape the Black Hole of Lecturing: Put Collaborative Ranking Tasks on Your Event Horizon

    NASA Astrophysics Data System (ADS)

    Hudgins, D. W.; Prather, E. E.; Grayson, D. J.

    2005-05-01

    At the University of Arizona, we have been developing and testing a new type of introductory astronomy curriculum material called Ranking Tasks. Ranking Tasks are a form of conceptual exercise that presents students with four to six physical situations, usually by pictures or diagrams, and asks students to rank order the situations based on some resulting effect. Our study developed design guidelines for Ranking Tasks based on learning theory and classroom pilot studies. Our research questions were: Do in-class collaborative Ranking Task exercises result in student conceptual gains when used in conjunction with traditional lecture-based instruction? And are these gains sufficient to justify implementing them into the astronomy classroom? We conducted a single-group repeated measures experiment across eight core introductory astronomy topics with 250 students at the University of Arizona in the Fall of 2004. The study found that traditional lecture-based instruction alone produced statistically significant gains - raising test scores to 61% post-lecture from 32% on the pretest. While significant, we find these gains to be unsatisfactory from a teaching and learning perspective. The study data show that adding a collaborative learning component to the class structured around Ranking Task exercises helped students achieve statistically significant gains - with post-Ranking Task scores over the eight astronomy topics rising to 77%. Interestingly, we found that the normalized gain from the Ranking Tasks was equal to the entire previous gain from traditional instruction. Further analysis of the data revealed that Ranking Tasks equally benefited both genders; they also equally benefited both high- and low-scoring median groups based on their pretest scores. Based on these results, we conclude that adding collaborative Ranking Task exercises to traditional lecture-based instruction can significantly improve student conceptual understanding of core topics in astronomy.

  5. Solving the influence maximization problem reveals regulatory organization of the yeast cell cycle.

    PubMed

    Gibbs, David L; Shmulevich, Ilya

    2017-06-01

    The Influence Maximization Problem (IMP) aims to discover the set of nodes with the greatest influence on network dynamics. The problem has previously been applied in epidemiology and social network analysis. Here, we demonstrate the application to cell cycle regulatory network analysis for Saccharomyces cerevisiae. Fundamentally, gene regulation is linked to the flow of information. Therefore, our implementation of the IMP was framed as an information theoretic problem using network diffusion. Utilizing more than 26,000 regulatory edges from YeastMine, gene expression dynamics were encoded as edge weights using time lagged transfer entropy, a method for quantifying information transfer between variables. By picking a set of source nodes, a diffusion process covers a portion of the network. The size of the network cover relates to the influence of the source nodes. The set of nodes that maximizes influence is the solution to the IMP. By solving the IMP over different numbers of source nodes, an influence ranking on genes was produced. The influence ranking was compared to other metrics of network centrality. Although the top genes from each centrality ranking contained well-known cell cycle regulators, there was little agreement and no clear winner. However, it was found that influential genes tend to directly regulate or sit upstream of genes ranked by other centrality measures. The influential nodes act as critical sources of information flow, potentially having a large impact on the state of the network. Biological events that affect influential nodes and thereby affect information flow could have a strong effect on network dynamics, potentially leading to disease. Code and data can be found at: https://github.com/gibbsdavidl/miergolf.
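
    The generic greedy strategy for the IMP is to repeatedly add the node with the largest marginal gain in coverage. The sketch below illustrates this on a small weighted digraph, using simple threshold reachability as a stand-in for the information-diffusion objective; the graph and weights are invented, and this is not the authors' transfer-entropy network or their released code.

```python
# Generic greedy sketch of the Influence Maximization Problem on a small weighted
# digraph: the influence of a seed set is measured here simply as the number of
# nodes reachable along edges whose weight exceeds a threshold.
from collections import deque

edges = {  # node -> {neighbor: weight}; weights are invented
    "A": {"B": 0.9, "C": 0.2}, "B": {"D": 0.8}, "C": {"D": 0.7},
    "D": {"E": 0.6}, "E": {}, "F": {"E": 0.9, "G": 0.8}, "G": {},
}

def reachable(seeds, threshold=0.5):
    """Nodes reachable from the seed set via edges with weight > threshold."""
    seen, queue = set(seeds), deque(seeds)
    while queue:
        u = queue.popleft()
        for v, w in edges[u].items():
            if w > threshold and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def greedy_imp(k):
    """Greedily add the seed with the largest marginal gain in coverage."""
    seeds = set()
    for _ in range(k):
        best = max((n for n in edges if n not in seeds),
                   key=lambda n: len(reachable(seeds | {n})))
        seeds.add(best)
    return seeds

print("top-2 influential seeds:", greedy_imp(2))
```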

  6. Does resident ranking during recruitment accurately predict subsequent performance as a surgical resident?

    PubMed

    Fryer, Jonathan P; Corcoran, Noreen; George, Brian; Wang, Ed; Darosa, Debra

    2012-01-01

    While the primary goal of ranking applicants for surgical residency training positions is to identify the candidates who will subsequently perform best as surgical residents, the effectiveness of the ranking process has not been adequately studied. We evaluated our general surgery resident recruitment process between 2001 and 2011 inclusive, to determine if our recruitment ranking parameters effectively predicted subsequent resident performance. We identified 3 candidate ranking parameters (United States Medical Licensing Examination [USMLE] Step 1 score, unadjusted ranking score [URS], and final adjusted ranking [FAR]), and 4 resident performance parameters (American Board of Surgery In-Training Examination [ABSITE] score, PGY1 resident evaluation grade [REG], overall REG, and independent faculty rating ranking [IFRR]), and assessed whether the former were predictive of the latter. Analyses utilized the Spearman correlation coefficient. We found that the URS, which is based on objective and criterion-based parameters, was a better predictor of subsequent performance than the FAR, which is a modification of the URS based on subsequent determinations of the resident selection committee. USMLE score was a reliable predictor of ABSITE scores only. However, when we compared our worst resident performances with the performances of the other residents in this evaluation, the data did not produce convincing evidence that poor resident performances could be reliably predicted by any of the recruitment ranking parameters. Finally, stratifying candidates based on their rank range did not effectively define a ranking cut-off beyond which resident performance would drop off. Based on these findings, we suggest that surgery programs may be better served by utilizing a more structured resident ranking process and that subsequent adjustments to the rank list generated by this process should be undertaken with caution. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  7. Relative toxicity of products of pyrolysis and combustion of polymeric materials using various test conditions

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.

    1976-01-01

    Relative toxicity data for a large number of natural and synthetic polymeric materials are presented which were obtained by 11 pyrolysis and three flaming-combustion test methods. The materials tested include flexible and rigid polyurethane foams, different kinds of fabrics and woods, and a variety of commodity polymers such as polyethylene. Animal exposure chambers of different volumes containing mice, rats, or rabbits were used in the tests, which were performed over the temperature range from ambient to 800 C with and without air flow or recirculation. The test results are found to be sensitive to such variables as exposure mode, temperature, air flow and dilution, material concentration, and animal species, but relative toxicity rankings appear to be similar for many methods and materials. It is concluded that times to incapacitance and to death provide a more suitable basis for relative toxicity rankings than percent mortality alone, that temperature is the most important variable in the tests reported, and that variables such as chamber volume and animal species may not significantly affect the rankings.

  8. Extensive investigation of the sap flow of maize plants in an oasis farmland in the middle reach of the Heihe River, Northwest China.

    PubMed

    Zhao, Liwen; He, Zhibin; Zhao, Wenzhi; Yang, Qiyue

    2016-09-01

    A better understanding of the sap flow characteristics of maize plants is critical for improving irrigation water-use efficiency, especially for regions facing water resource shortages. In this study, sap flow rates, related soil-physics and plant-growth parameters, and meteorological factors, were simultaneously monitored in a maize field in two consecutive years, 2011 and 2012, and the sap flow rates of the maize plants were extensively analyzed based on the monitored data. Seasonal and daily variational characteristics were identified at different growth stages and under different weather conditions, respectively. The analyses on the relationships between sap flow rate and reference evapotranspiration (ET0), as well as several plant-growth parameters, indicate that the irrigation schedule can exert an influence on sap flow, and can consequently affect crop yield. The ranking of the main meteorological factors affecting the sap flow rate was: net radiation > air temperature > vapor pressure deficit > wind speed. For a quick estimation of sap flow rates, an empirical formula based on the two top influencing factors was put forward and verified to be reliable. The sap flow rate appeared to show little response to irrigation when the water content was relatively high, implying that some of the irrigation in recent years may have been wasted. These results may help to reveal the bio-physical processes of maize plants related to plant transpiration, which could be beneficial for establishing an efficient irrigation management system in this region and also for providing a reference for other maize-planting regions.

  9. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows

    PubMed Central

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by collective successive behaviors of users. Knowing how people allocate limited attention on different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high dimensional Euclidean space based on the open flow network model of a large number of Chinese users’ collective attention flows, which both considers the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate on the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy the largest share of the attention flows (40%), the sites in the interim (34%) attract 40%, whereas the other sites (65%) only take 20% of the flows. What’s more, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites with similar content and topics are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other. PMID:27812133

  10. Web Image Search Re-ranking with Click-based Similarity and Typicality.

    PubMed

    Yang, Xiaopeng; Mei, Tao; Zhang, Yong Dong; Liu, Jie; Satoh, Shin'ichi

    2016-07-20

    In image search re-ranking, besides the well known semantic gap, the intent gap, which is the gap between the representation of users' query/demand and the real intent of the users, is becoming a major problem restricting the development of image retrieval. To reduce human effects, in this paper, we use image click-through data, which can be viewed as the "implicit feedback" from users, to help overcome the intent gap, and further improve the image search performance. Generally, the hypothesis that visually similar images should be close in a ranking list and the strategy that images with higher relevance should be ranked higher than others are widely accepted. To obtain satisfying search results, image similarity and the level of relevance typicality are thus the determining factors. However, when measuring image similarity and typicality, conventional re-ranking approaches only consider visual information and initial ranks of images, while overlooking the influence of click-through data. This paper presents a novel re-ranking approach, named spectral clustering re-ranking with click-based similarity and typicality (SCCST). First, to learn an appropriate similarity measurement, we propose a click-based multi-feature similarity learning algorithm (CMSL), which conducts metric learning based on click-based triplet selection, and integrates multiple features into a unified similarity space via multiple kernel learning. Then, based on the learnt click-based image similarity measure, we conduct spectral clustering to group visually and semantically similar images into the same clusters, and obtain the final re-ranked list by calculating click-based cluster typicality and within-cluster click-based image typicality in descending order. Our experiments conducted on two real-world query-image datasets with diverse representative queries show that our proposed re-ranking approach can significantly improve initial search results, and outperform several existing re-ranking approaches.

  11. A novel three-stage distance-based consensus ranking method

    NASA Astrophysics Data System (ADS)

    Aghayi, Nazila; Tavana, Madjid

    2018-05-01

    In this study, we propose a three-stage weighted sum method for identifying the group ranks of alternatives. In the first stage, a rank matrix, similar to the cross-efficiency matrix, is obtained by computing the individual rank position of each alternative based on importance weights. In the second stage, a secondary goal is defined to limit the vector of weights since the vector of weights obtained in the first stage is not unique. Finally, in the third stage, the group rank position of alternatives is obtained based on a distance of individual rank positions. The third stage determines a consensus solution for the group so that the ranks obtained have a minimum distance from the ranks acquired by each alternative in the previous stage. A numerical example is presented to demonstrate the applicability and exhibit the efficacy of the proposed method and algorithms.
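
    The third-stage idea of minimizing the distance between the group ranking and the individual rank positions can be illustrated by brute force for a handful of alternatives, as in the sketch below. It uses the Spearman footrule distance and invented individual ranks; the authors' weighted-sum and secondary-goal stages are not reproduced.

```python
# Brute-force illustration of a distance-based consensus: choose the group ranking
# that minimizes the total (Spearman footrule) distance to the individual rank
# positions. Feasible only for a handful of alternatives; the ranks are invented.
from itertools import permutations

alternatives = ["A", "B", "C", "D"]
# Rank position of each alternative according to three decision makers (1 = best).
individual_ranks = {
    "A": [1, 2, 1],
    "B": [2, 1, 3],
    "C": [3, 4, 2],
    "D": [4, 3, 4],
}

def total_distance(order):
    """Sum over decision makers of |group rank - individual rank| for all alternatives."""
    group_rank = {alt: i + 1 for i, alt in enumerate(order)}
    return sum(abs(group_rank[a] - r)
               for a, ranks in individual_ranks.items() for r in ranks)

best = min(permutations(alternatives), key=total_distance)
print("consensus ranking:", best, "total distance:", total_distance(best))
```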

  12. Neophilia Ranking of Scientific Journals

    PubMed Central

    Packalen, Mikko; Bhattacharya, Jay

    2017-01-01

    The ranking of scientific journals is important because of the signal it sends to scientists about what is considered most vital for scientific progress. Existing ranking systems focus on measuring the influence of a scientific paper (citations)—these rankings do not reward journals for publishing innovative work that builds on new ideas. We propose an alternative ranking based on the proclivity of journals to publish papers that build on new ideas, and we implement this ranking via a text-based analysis of all published biomedical papers dating back to 1946. In addition, we compare our neophilia ranking to citation-based (impact factor) rankings; this comparison shows that the two ranking approaches are distinct. Prior theoretical work suggests an active role for our neophilia index in science policy. Absent an explicit incentive to pursue novel science, scientists underinvest in innovative work because of a coordination problem: for work on a new idea to flourish, many scientists must decide to adopt it in their work. Rankings that are based purely on influence thus do not provide sufficient incentives for publishing innovative work. By contrast, adoption of the neophilia index as part of journal-ranking procedures by funding agencies and university administrators would provide an explicit incentive for journals to publish innovative work and thus help solve the coordination problem by increasing scientists' incentives to pursue innovative work. PMID:28713181

  13. On the ranking of chemicals based on their PBT characteristics: comparison of different ranking methodologies using selected POPs as an illustrative example.

    PubMed

    Sailaukhanuly, Yerbolat; Zhakupbekova, Arai; Amutova, Farida; Carlsen, Lars

    2013-01-01

    Knowledge of the environmental behavior of chemicals is a fundamental part of the risk assessment process. The present paper discusses various methods of ranking a series of persistent organic pollutants (POPs) according to their persistence, bioaccumulation and toxicity (PBT) characteristics. Traditionally, ranking has been done as an absolute (total) ranking applying various multicriteria data analysis methods such as simple additive ranking (SAR) or various utility function (UF) based rankings. An attractive alternative to these ranking methodologies appears to be partial order ranking (POR). The present paper compares different ranking methods like SAR, UF and POR. Significant discrepancies between the rankings are noted and it is concluded that partial order ranking, as a method without any pre-assumptions concerning possible relations between the single parameters, appears to be the most attractive ranking methodology. In addition to the initial ranking, partial order methodology offers a wide variety of analytical tools to elucidate the interplay between the objects to be ranked and the ranking parameters. The present study also includes an analysis of the relative importance of the single P, B and T parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
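
    The dominance relation at the heart of partial order ranking is simple to state: one chemical is ranked above another only if it scores at least as high on every P, B and T criterion and strictly higher on at least one; otherwise the pair is incomparable. The sketch below illustrates this with invented scores and generic chemical names, not the POP data analyzed in the paper.

```python
# Minimal sketch of the dominance relation behind partial order ranking (POR):
# x is ranked above y only if x is at least as high on every P, B, T score and
# strictly higher on at least one; otherwise the pair is incomparable.
chemicals = {
    # name: (persistence, bioaccumulation, toxicity) -- higher = more hazardous
    "POP-1": (0.9, 0.8, 0.7),
    "POP-2": (0.6, 0.9, 0.5),
    "POP-3": (0.5, 0.4, 0.3),
    "POP-4": (0.7, 0.7, 0.6),
}

def dominates(x, y):
    return all(a >= b for a, b in zip(x, y)) and any(a > b for a, b in zip(x, y))

for name_x, px in chemicals.items():
    for name_y, py in chemicals.items():
        if name_x != name_y and dominates(px, py):
            print(f"{name_x} > {name_y}")
        # Report each incomparable pair once (name_x < name_y avoids duplicates).
        elif name_x < name_y and not dominates(px, py) and not dominates(py, px):
            print(f"{name_x} and {name_y} are incomparable")
```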

  14. Ranking nano-enabled hybrid media for simultaneous removal of contaminants with different chemistries: Pseudo-equilibrium sorption tests versus column tests.

    PubMed

    Custodio, Tomas; Garcia, Jose; Markovski, Jasmina; McKay Gifford, James; Hristovski, Kiril D; Olson, Larry W

    2017-12-15

    The underlying hypothesis of this study was that pseudo-equilibrium and column testing conditions would provide the same sorbent ranking trends although the values of sorbents' performance descriptors (e.g. sorption capacity) may vary because of different kinetics and competition effects induced by the two testing approaches. To address this hypothesis, nano-enabled hybrid media were fabricated and their removal performance was assessed for two model contaminants under multi-point batch pseudo-equilibrium and continuous-flow conditions. Calculation of simultaneous removal capacity indices (SRC) demonstrated that the more resource-demanding continuous-flow tests are able to generate the same performance rankings as the ones obtained by conducting the simpler pseudo-equilibrium tests. Furthermore, continuous overlap between the 98% confidence boundaries for each SRC index trend not only validated the hypothesis that both testing conditions provide the same ranking trends, but also indicated that SRC indices are statistically the same for each medium, regardless of the employed method. In scenarios where rapid screening of new media is required to obtain the best performing synthesis formulation, use of pseudo-equilibrium tests proved to be reliable. Considering that kinetics-induced effects on sorption capacity must not be neglected, the more resource-demanding column tests could be conducted only with the top-performing media that exhibit the highest sorption capacity. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from the semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by the CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms the traditional rule-based method, achieving state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.

  16. RANK Expression and Osteoclastogenesis in Human Monocytes in Peripheral Blood from Rheumatoid Arthritis Patients

    PubMed Central

    Kobashigawa, Tsuyoshi

    2016-01-01

    Rheumatoid arthritis (RA) appears as inflammation of synovial tissue and joint destruction. Receptor activator of NF-κB (RANK) is a member of the TNF receptor superfamily and a receptor for the RANK ligand (RANKL). In this study, we examined the expression of RANKhigh and CCR6 on CD14+ monocytes from patients with RA and healthy volunteers. Peripheral blood samples were obtained from both the RA patients and the healthy volunteers. Osteoclastogenesis from monocytes was induced by RANKL and M-CSF in vitro. To study the expression of RANKhigh and CCR6 on CD14+ monocytes, two-color flow cytometry was performed. Levels of expression of RANK on monocytes were significantly correlated with the level of osteoclastogenesis in the healthy volunteers. The expression of RANKhigh on CD14+ monocyte in RA patients without treatment was elevated and that in those receiving treatment was decreased. In addition, the high-level expression of RANK on CD14+ monocytes was correlated with the high-level expression of CCR6 in healthy volunteers. Monocytes expressing both RANK and CCR6 differentiate into osteoclasts. The expression of CD14+RANKhigh in untreated RA patients was elevated. RANK and CCR6 expressed on monocytes may be novel targets for the regulation of bone resorption in RA and osteoporosis. PMID:27822475

  17. Seiberg-Witten geometries for Coulomb branch chiral rings which are not freely generated

    DOE PAGES

    Argyres, Philip C.; Lü, Yongchao; Martone, Mario

    2017-06-27

    Coulomb branch chiral rings of N = 2 SCFTs are conjectured to be freely generated. While no counter-example is known, no direct evidence for the conjecture is known either. We initiate a systematic study of SCFTs with Coulomb branch chiral rings satisfying non-trivial relations, restricting our analysis to rank 1. The main result of our study is that (rank-1) SCFTs with non-freely generated CB chiral rings, when deformed by relevant deformations, always flow to theories with non-freely generated CB rings. This implies that if they exist, they must thus form a distinct subset under RG flows. We also find many interesting characteristic properties that these putative theories satisfy which may be helpful in proving or disproving their existence using other methods.

  18. Latent cardiac dysfunction as assessed by echocardiography in bed-bound patients following cerebrovascular accidents: comparison with nutritional status.

    PubMed

    Masugata, Hisashi; Senda, Shoichi; Goda, Fuminori; Yoshihara, Yumiko; Yoshikawa, Kay; Fujita, Norihiro; Himoto, Takashi; Okuyama, Hiroyuki; Taoka, Teruhisa; Imai, Masanobu; Kohno, Masakazu

    2007-07-01

    The aim of this study was to elucidate the cardiac function in bed-bound patients following cerebrovascular accidents. In accord with the criteria for activities of daily living (ADL) of the Japanese Ministry of Health, Labour and Welfare, 51 age-matched poststroke patients without heart disease were classified into 3 groups: rank A (house-bound) (n = 16, age, 85 +/- 6 years), rank B (chair-bound) (n = 16, age, 84 +/- 8 years), and rank C (bed-bound) (n = 19, age, 85 +/- 9 years). Using echocardiography, the left ventricular (LV) diastolic function was assessed by the ratio of early filling (E) and atrial contraction (A) transmitral flow velocities (E/A) of LV inflow. LV systolic function was assessed by LV ejection fraction (LVEF), and the Tei index was also measured to assess both LV systolic and diastolic function. No difference was observed in the E/A and LVEF among the 3 groups. The Tei index was higher in rank C (0.56 +/- 0.17) than in rank A (0.39 +/- 0.06) and rank B (0.48 +/- 0.17), and a statistically significant difference was observed between rank A and rank C (P < 0.05). Serum albumin and blood hemoglobin were significantly lower in rank C (3.1 +/- 0.4 and 10.6 +/- 1.8 g/dL) than in rank A (4.1 +/- 0.3 and 12.4 +/- 1.2 g/dL) (P < 0.001 and P < 0.05, respectively). These results indicate that latent cardiac dysfunction and poor nutritional status may exist in bed-bound patients (rank C) following cerebrovascular accidents. The Tei index may be a useful index of cardiac dysfunction in bed-bound patients because it is independent of the cardiac loading condition.

  19. Extreme learning machine for ranking: generalization analysis and applications.

    PubMed

    Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin

    2014-05-01

    The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on the combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of hypothesis space. Empirical results on the benchmark datasets show the competitive performance of the ELMRank over the state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
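
    A minimal pointwise illustration of ranking with an ELM is given below: a fixed random hidden layer followed by ridge-regularized least-squares output weights fitted to relevance scores, with test items ranked by predicted score. This is a simplified stand-in for the regularized ELMRank formulation analyzed in the paper, using synthetic data.

```python
# Pointwise sketch of ranking with an extreme learning machine: a fixed random
# hidden layer plus ridge-regularized least-squares output weights fitted to
# relevance scores; test items are then ranked by predicted score. Synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n_train, n_features, n_hidden = 200, 5, 50

X = rng.standard_normal((n_train, n_features))
relevance = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n_train)

# Random input weights and biases are fixed, never trained (the ELM idea).
W = rng.standard_normal((n_features, n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)                        # hidden-layer activations

lam = 1e-2                                    # ridge regularization strength
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ relevance)

X_test = rng.standard_normal((10, n_features))
scores = np.tanh(X_test @ W + b) @ beta
print("ranked test items (best first):", np.argsort(-scores))
```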

  20. Spectral properties of Google matrix of Wikipedia and other networks

    NASA Astrophysics Data System (ADS)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2013-05-01

    We study the properties of eigenvalues and eigenvectors of the Google matrix of the Wikipedia articles hyperlink network and other real networks. With the help of the Arnoldi method, we analyze the distribution of eigenvalues in the complex plane and show that eigenstates with significant eigenvalue modulus are located on well defined network communities. We also show that the correlator between PageRank and CheiRank vectors distinguishes different organizations of information flow on BBC and Le Monde web sites.

  1. Hazard-Ranking of Agricultural Pesticides for Chronic Health Effects in Yuma County, Arizona

    PubMed Central

    Sugeng, Anastasia J.; Beamer, Paloma I.; Lutz, Eric A.; Rosales, Cecilia B.

    2013-01-01

    With thousands of pesticides registered by the United States Environmental Protection Agency, it is not feasible to sample for all pesticides applied in agricultural communities. Hazard-ranking pesticides based on use, toxicity, and exposure potential can help prioritize community-specific pesticide hazards. This study applied hazard-ranking schemes for cancer, endocrine disruption, and reproductive/developmental toxicity in Yuma County, Arizona. An existing cancer hazard-ranking scheme was modified, and novel schemes for endocrine disruption and reproductive/developmental toxicity were developed to rank pesticide hazards. The hazard-ranking schemes accounted for pesticide use, toxicity, and exposure potential based on chemical properties of each pesticide. Pesticides were ranked as hazards with respect to each health effect, as well as overall chronic health effects. The highest hazard-ranked pesticides for overall chronic health effects were maneb, metam sodium, trifluralin, pronamide, and bifenthrin. The relative pesticide rankings were unique for each health effect. The highest hazard-ranked pesticides differed from those most heavily applied, as well as from those previously detected in Yuma homes over a decade ago. The most hazardous pesticides for cancer in Yuma County, Arizona were also different from a previous hazard-ranking applied in California. Hazard-ranking schemes that take into account pesticide use, toxicity, and exposure potential can help prioritize pesticides of greatest health risk in agricultural communities. This study is the first to provide pesticide hazard-rankings for endocrine disruption and reproductive/developmental toxicity based on use, toxicity, and exposure potential. These hazard-ranking schemes can be applied to other agricultural communities for prioritizing community-specific pesticide hazards to help decrease health risk. PMID:23783270

  2. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-22

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and was generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.
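
    The sampling-based workflow behind such analyses (Latin hypercube sampling of the uncertain inputs followed by rank, i.e. Spearman, correlations between inputs and an output) can be sketched compactly. The toy function below stands in for the two-phase-flow simulator, and the variable names are illustrative only.

```python
# Small sketch of the sampling-based sensitivity workflow: Latin hypercube sampling
# of uncertain inputs followed by Spearman (rank) correlations with a model output.
# The "model" below is a toy function, not the WIPP two-phase-flow simulator.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
n_samples, n_vars = 100, 3   # e.g. borehole permeability, gas-generation rate, porosity

# Latin hypercube sample on [0, 1]^3: one point per stratum, strata randomly permuted.
u = (rng.permuted(np.tile(np.arange(n_samples), (n_vars, 1)), axis=1).T
     + rng.random((n_samples, n_vars))) / n_samples

def toy_model(x):
    # Output dominated by the first input, mildly affected by the second.
    return 10 * x[:, 0] ** 2 + 2 * x[:, 1] + 0.1 * rng.standard_normal(x.shape[0])

y = toy_model(u)
for j, name in enumerate(["borehole permeability", "gas generation rate", "porosity"]):
    rho, _ = spearmanr(u[:, j], y)
    print(f"Spearman rank correlation with output, {name}: {rho:+.2f}")
```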

  3. Sensitivity ranking for freshwater invertebrates towards hydrocarbon contaminants.

    PubMed

    Gerner, Nadine V; Cailleaud, Kevin; Bassères, Anne; Liess, Matthias; Beketov, Mikhail A

    2017-11-01

    Hydrocarbons are of utmost economic importance but may also cause substantial ecological impacts due to accidents or inadequate transportation and use. Currently, freshwater biomonitoring methods lack an indicator that can unequivocally reflect the impacts caused by hydrocarbons while being independent of the effects of other stressors. The aim of the present study was to develop a sensitivity ranking for freshwater invertebrates towards hydrocarbon contaminants, which can be used in hydrocarbon-specific bioindicators. We employed the Relative Sensitivity method and developed the sensitivity ranking S_hydrocarbons based on literature ecotoxicological data supplemented with rapid and mesocosm test results. A first validation of the sensitivity ranking based on an earlier field study has been conducted and revealed the S_hydrocarbons ranking to be promising for application in sensitivity-based indicators. Thus, the first results indicate that the ranking can serve as the core component of future hydrocarbon-specific and sensitivity-trait-based bioindicators.

  4. $\mathcal{N}=1$ deformations and RG flows of $\mathcal{N}=2$ SCFTs

    DOE PAGES

    Maruyoshi, Kazunobu; Song, Jaewon

    2017-02-14

    Here, we study certain $\mathcal{N}=1$ preserving deformations of four-dimensional $\mathcal{N}=2$ superconformal field theories (SCFTs) with non-abelian flavor symmetry. The deformation is described by adding an $\mathcal{N}=1$ chiral multiplet transforming in the adjoint representation of the flavor symmetry with a superpotential coupling, and giving a nilpotent vacuum expectation value to the chiral multiplet which breaks the flavor symmetry. This triggers a renormalization group flow to an infrared SCFT. Remarkably, we find classes of theories that flow to enhanced $\mathcal{N}=2$ supersymmetric fixed points in the infrared under the deformation. They include generalized Argyres-Douglas theories and rank-one SCFTs with non-abelian flavor symmetries. Most notably, we find renormalization group flows from the deformed conformal SQCDs to the $(A_1, A_n)$ Argyres-Douglas theories. From these "Lagrangian descriptions," we compute the full superconformal indices of the $(A_1, A_n)$ theories and find agreement with previous results. Furthermore, we study the cases, including the $T_N$ and $R_{0,N}$ theories of class S and some rank-one SCFTs, where the deformation gives genuine $\mathcal{N}=1$ fixed points.

  5. $\mathcal{N}=1$ deformations and RG flows of $\mathcal{N}=2$ SCFTs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maruyoshi, Kazunobu; Song, Jaewon

    Here, we study certain $\mathcal{N}=1$ preserving deformations of four-dimensional $\mathcal{N}=2$ superconformal field theories (SCFTs) with non-abelian flavor symmetry. The deformation is described by adding an $\mathcal{N}=1$ chiral multiplet transforming in the adjoint representation of the flavor symmetry with a superpotential coupling, and giving a nilpotent vacuum expectation value to the chiral multiplet which breaks the flavor symmetry. This triggers a renormalization group flow to an infrared SCFT. Remarkably, we find classes of theories that flow to enhanced $\mathcal{N}=2$ supersymmetric fixed points in the infrared under the deformation. They include generalized Argyres-Douglas theories and rank-one SCFTs with non-abelian flavor symmetries. Most notably, we find renormalization group flows from the deformed conformal SQCDs to the $(A_1, A_n)$ Argyres-Douglas theories. From these "Lagrangian descriptions," we compute the full superconformal indices of the $(A_1, A_n)$ theories and find agreement with previous results. Furthermore, we study the cases, including the $T_N$ and $R_{0,N}$ theories of class S and some rank-one SCFTs, where the deformation gives genuine $\mathcal{N}=1$ fixed points.

  6. Where Are the Global Rankings Leading Us? An Analysis of Recent Methodological Changes and New Developments

    ERIC Educational Resources Information Center

    Rauhvargers, Andrejs

    2014-01-01

    This article is based on the analysis of the changes in global university rankings and the new "products" based on rankings data in the period since mid-2011. It is a summary and continuation of the European University Association (EUA)-commissioned report "Global University Rankings Their Impact, Report II" which was launched…

  7. Reconsidering the use of rankings in the valuation of health states: a model for estimating cardinal values from ordinal data

    PubMed Central

    Salomon, Joshua A

    2003-01-01

    Background In survey studies on health-state valuations, ordinal ranking exercises often are used as precursors to other elicitation methods such as the time trade-off (TTO) or standard gamble, but the ranking data have not been used in deriving cardinal valuations. This study reconsiders the role of ordinal ranks in valuing health and introduces a new approach to estimate interval-scaled valuations based on aggregate ranking data. Methods Analyses were undertaken on data from a previously published general population survey study in the United Kingdom that included rankings and TTO values for hypothetical states described using the EQ-5D classification system. The EQ-5D includes five domains (mobility, self-care, usual activities, pain/discomfort and anxiety/depression) with three possible levels on each. Rank data were analysed using a random utility model, operationalized through conditional logit regression. In the statistical model, probabilities of observed rankings were related to the latent utilities of different health states, modeled as a linear function of EQ-5D domain scores, as in previously reported EQ-5D valuation functions. Predicted valuations based on the conditional logit model were compared to observed TTO values for the 42 states in the study and to predictions based on a model estimated directly from the TTO values. Models were evaluated using the intraclass correlation coefficient (ICC) between predictions and mean observations, and the root mean squared error of predictions at the individual level. Results Agreement between predicted valuations from the rank model and observed TTO values was very high, with an ICC of 0.97, only marginally lower than for predictions based on the model estimated directly from TTO values (ICC = 0.99). Individual-level errors were also comparable in the two models, with root mean squared errors of 0.503 and 0.496 for the rank-based and TTO-based predictions, respectively. Conclusions Modeling health-state valuations based on ordinal ranks can provide results that are similar to those obtained from more widely analyzed valuation techniques such as the TTO. The information content in aggregate ranking data is not currently exploited to full advantage. The possibility of estimating cardinal valuations from ordinal ranks could also simplify future data collection dramatically and facilitate wider empirical study of health-state valuations in diverse settings and population groups. PMID:14687419
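
    A compact way to see how cardinal values can be estimated from ordinal ranks is the rank-ordered ("exploded") conditional logit: each observed ranking contributes a product of choice probabilities over successive positions, with latent utilities linear in item attributes. The sketch below fits such a model to synthetic rankings by maximum likelihood; it is an illustration of this model class under those assumptions, not the EQ-5D analysis reported above.

```python
# Compact sketch of a rank-ordered ("exploded") conditional logit: latent utilities
# are linear in item attributes, and each observed ranking contributes the product
# of choice probabilities over successive positions. Synthetic data only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n_items, n_attrs, n_rankings = 6, 3, 300
X = rng.standard_normal((n_items, n_attrs))     # item attributes (e.g. domain scores)
beta_true = np.array([1.5, -1.0, 0.5])

def sample_ranking(util):
    """Sample a full ranking via Gumbel-perturbed utilities (Plackett-Luce model)."""
    noisy = util + rng.gumbel(size=util.size)
    return list(np.argsort(-noisy))

rankings = [sample_ranking(X @ beta_true) for _ in range(n_rankings)]

def neg_log_likelihood(beta):
    util = X @ beta
    nll = 0.0
    for ranking in rankings:
        remaining = list(ranking)
        while len(remaining) > 1:
            chosen = remaining[0]
            # P(chosen | remaining) = exp(u_chosen) / sum_remaining exp(u)
            nll -= util[chosen] - np.log(np.exp(util[remaining]).sum())
            remaining = remaining[1:]
    return nll

fit = minimize(neg_log_likelihood, x0=np.zeros(n_attrs), method="BFGS")
print("true beta:     ", beta_true)
print("estimated beta:", np.round(fit.x, 2))
```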

  8. Ranking Surgical Residency Programs: Reputation Survey or Outcomes Measures?

    PubMed

    Wilson, Adam B; Torbeck, Laura J; Dunnington, Gary L

    2015-01-01

    The release of general surgery residency program rankings by Doximity and U.S. News & World Report accentuates the need to define and establish measurable standards of program quality. This study evaluated the extent to which program rankings based solely on peer nominations correlated with familiar program outcomes measures. Publicly available data were collected for all 254 general surgery residency programs. To generate a rudimentary outcomes-based program ranking, surgery programs were rank-ordered according to an average percentile rank that was calculated using board pass rates and the prevalence of alumni publications. A Kendall τ-b rank correlation quantified the association between program rankings based on reputation alone and those derived from outcomes measures, to validate whether reputation was a reasonable surrogate for globally judging program quality. For the 218 programs with complete data eligible for analysis, the mean board pass rate was 72% with a standard deviation of 14%. A total of 60 programs were placed in the 75th percentile or above for the number of publications authored by program alumni. The correlational analysis reported a significant correlation of 0.428, indicating only a moderate association between programs ranked by outcomes measures and those ranked according to reputation. Seventeen programs that were ranked in the top 30 according to reputation were also ranked in the top 30 based on outcomes measures. This study suggests that reputation alone does not fully capture a representative snapshot of a program's quality. Rather, the use of multiple quantifiable indicators and attributes unique to programs ought to be given more consideration when assigning ranks to denote program quality. It is advised that the interpretation and subsequent use of program rankings be met with caution until further studies can rigorously demonstrate best practices for awarding program standings. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
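
    The correlation in question is a standard Kendall τ-b between two rank orderings, as in the short illustration below; the ranks shown are invented, not the study data.

```python
# Quick illustration of the Kendall tau-b rank correlation between two program
# rankings (reputation-based vs. outcomes-based). The ranks below are invented.
from scipy.stats import kendalltau

reputation_rank = [1, 2, 3, 4, 5, 6, 7, 8]
outcomes_rank   = [2, 1, 5, 3, 4, 8, 6, 7]

tau, p_value = kendalltau(reputation_rank, outcomes_rank)
print(f"Kendall tau-b = {tau:.3f}, p = {p_value:.3f}")
```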

  9. DockRank: Ranking docked conformations using partner-specific sequence homology-based protein interface prediction

    PubMed Central

    Xue, Li C.; Jordan, Rafael A.; EL-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2015-01-01

    Selecting near-native conformations from the immense number of conformations generated by docking programs remains a major challenge in molecular docking. We introduce DockRank, a novel approach to scoring docked conformations based on the degree to which the interface residues of the docked conformation match a set of predicted interface residues. DockRank uses interface residues predicted by partner-specific sequence homology-based protein–protein interface predictor (PS-HomPPI), which predicts the interface residues of a query protein with a specific interaction partner. We compared the performance of DockRank with several state-of-the-art docking scoring functions using Success Rate (the percentage of cases that have at least one near-native conformation among the top m conformations) and Hit Rate (the percentage of near-native conformations that are included among the top m conformations). In cases where it is possible to obtain partner-specific (PS) interface predictions from PS-HomPPI, DockRank consistently outperforms both (i) ZRank and IRAD, two state-of-the-art energy-based scoring functions (improving Success Rate by up to 4-fold); and (ii) Variants of DockRank that use predicted interface residues obtained from several protein interface predictors that do not take into account the binding partner in making interface predictions (improving success rate by up to 39-fold). The latter result underscores the importance of using partner-specific interface residues in scoring docked conformations. We show that DockRank, when used to re-rank the conformations returned by ClusPro, improves upon the original ClusPro rankings in terms of both Success Rate and Hit Rate. DockRank is available as a server at http://einstein.cs.iastate.edu/DockRank/. PMID:23873600

  10. DockRank: ranking docked conformations using partner-specific sequence homology-based protein interface prediction.

    PubMed

    Xue, Li C; Jordan, Rafael A; El-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2014-02-01

    Selecting near-native conformations from the immense number of conformations generated by docking programs remains a major challenge in molecular docking. We introduce DockRank, a novel approach to scoring docked conformations based on the degree to which the interface residues of the docked conformation match a set of predicted interface residues. DockRank uses interface residues predicted by partner-specific sequence homology-based protein-protein interface predictor (PS-HomPPI), which predicts the interface residues of a query protein with a specific interaction partner. We compared the performance of DockRank with several state-of-the-art docking scoring functions using Success Rate (the percentage of cases that have at least one near-native conformation among the top m conformations) and Hit Rate (the percentage of near-native conformations that are included among the top m conformations). In cases where it is possible to obtain partner-specific (PS) interface predictions from PS-HomPPI, DockRank consistently outperforms both (i) ZRank and IRAD, two state-of-the-art energy-based scoring functions (improving Success Rate by up to 4-fold); and (ii) Variants of DockRank that use predicted interface residues obtained from several protein interface predictors that do not take into account the binding partner in making interface predictions (improving success rate by up to 39-fold). The latter result underscores the importance of using partner-specific interface residues in scoring docked conformations. We show that DockRank, when used to re-rank the conformations returned by ClusPro, improves upon the original ClusPro rankings in terms of both Success Rate and Hit Rate. DockRank is available as a server at http://einstein.cs.iastate.edu/DockRank/. Copyright © 2013 Wiley Periodicals, Inc.

  11. Impact of Doximity Residency Rankings on Emergency Medicine Applicant Rank Lists.

    PubMed

    Peterson, William J; Hopson, Laura R; Khandelwal, Sorabh; White, Melissa; Gallahue, Fiona E; Burkhardt, John; Rolston, Aimee M; Santen, Sally A

    2016-05-01

    This study investigates the impact of the Doximity rankings on the rank list choices made by residency applicants in emergency medicine (EM). We sent an 11-item survey by email to all students who applied to EM residency programs at four different institutions representing diverse geographical regions. Students were asked questions about their perception of Doximity rankings and how it may have impacted their rank list decisions. Response rate was 58% of 1,372 opened electronic surveys. This study found that a majority of medical students applying to residency in EM were aware of the Doximity rankings prior to submitting rank lists (67%). One-quarter of these applicants changed the number of programs and ranks of those programs when completing their rank list based on the Doximity rankings (26%). Though the absolute number of programs changed on the rank lists was small, the results demonstrate that the EM Doximity rankings impact applicant decision-making in ranking residency programs. While applicants do not find the Doximity rankings to be important compared to other factors in the application process, the Doximity rankings result in a small change in residency applicant ranking behavior. This unvalidated ranking, based principally on reputational data rather than objective outcome criteria, thus has the potential to be detrimental to students, programs, and the public. We feel it important for specialties to develop consensus around measurable training outcomes and provide freely accessible metrics for candidate education.

  12. Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.

    PubMed

    Ćwik, Michał; Józefczyk, Jerzy

    2018-01-01

    An uncertain version of the permutation flow-shop problem with unlimited buffers and the makespan as the criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty, and consequently a minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion's value, as this calculation is itself an NP-hard problem. Moreover, a lower bound is used instead of solving the internal deterministic flow-shop problem. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with previously developed heuristic algorithms based on the evolutionary and the middle-interval approaches. The computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.
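
    For readers unfamiliar with the criterion, the makespan of a permutation flow shop can be computed with the standard completion-time recurrence, as in the minimal Python sketch below; the processing times are nominal illustrative values, not the interval data or regret evaluation used in the paper.

        # Minimal makespan computation for a permutation flow shop.
        # p[j][m] is the processing time of job j on machine m (nominal values here;
        # the paper works with interval-valued times and a minmax-regret criterion).

        def makespan(permutation, p):
            n_machines = len(p[0])
            completion = [0.0] * n_machines          # completion times per machine
            for job in permutation:
                for m in range(n_machines):
                    previous = completion[m - 1] if m > 0 else 0.0
                    completion[m] = max(completion[m], previous) + p[job][m]
            return completion[-1]

        # Three jobs on two machines (illustrative data).
        p = [[3, 2],
             [1, 4],
             [2, 2]]
        print(makespan([1, 0, 2], p))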

  13. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    PubMed

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, effectively and efficiently leveraging all of this information to improve ranking performance has become a new and challenging problem. Previous methods utilize only part of this information and rank graph nodes with link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit the rich heterogeneous information of the graph to improve ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), and then simultaneously optimize the parameters and the ranking scores of the graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms algorithms that consider such graph information only partially.
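
    For background on the link-based baseline mentioned above, a plain PageRank power iteration over an adjacency list is sketched below in Python; the toy graph, damping factor, and tolerance are illustrative choices, and the semi-supervised parameterization of SSP is not reproduced.

        # Plain PageRank by power iteration over an adjacency list (illustrative only;
        # the SSP method additionally learns parameters from heterogeneous features).

        def pagerank(out_links, damping=0.85, tol=1e-10, max_iter=200):
            nodes = list(out_links)
            rank = {v: 1.0 / len(nodes) for v in nodes}
            for _ in range(max_iter):
                new_rank = {v: (1.0 - damping) / len(nodes) for v in nodes}
                for v in nodes:
                    targets = out_links[v]
                    if targets:                       # distribute rank along out-links
                        share = damping * rank[v] / len(targets)
                        for t in targets:
                            new_rank[t] += share
                    else:                             # dangling node: spread uniformly
                        share = damping * rank[v] / len(nodes)
                        for t in nodes:
                            new_rank[t] += share
                if sum(abs(new_rank[v] - rank[v]) for v in nodes) < tol:
                    return new_rank
                rank = new_rank
            return rank

        toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
        print(pagerank(toy_graph))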

  14. Estimation of distribution algorithm with path relinking for the blocking flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2018-05-01

    This article presents an effective estimation of distribution algorithm, named P-EDA, to solve the blocking flow-shop scheduling problem (BFSP) with the makespan criterion. In the P-EDA, a Nawaz-Enscore-Ham (NEH)-based heuristic and the random method are combined to generate the initial population. Based on several superior individuals provided by a modified linear rank selection, a probabilistic model is constructed to describe the probabilistic distribution of the promising solution space. The path relinking technique is incorporated into EDA to avoid blindness of the search and improve the convergence property. A modified referenced local search is designed to enhance the local exploitation. Moreover, a diversity-maintaining scheme is introduced into EDA to avoid deterioration of the population. Finally, the parameters of the proposed P-EDA are calibrated using a design of experiments approach. Simulation results and comparisons with some well-performing algorithms demonstrate the effectiveness of the P-EDA for solving BFSP.
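
    The linear rank selection step mentioned above can be sketched as follows; the selection-pressure parameter and the toy fitness values are assumptions rather than the calibrated settings of P-EDA.

        # Sketch of linear rank selection: selection probability depends on a
        # solution's rank rather than its raw fitness (illustrative parameters only).
        import random

        def linear_rank_probabilities(population_size, pressure=1.5):
            """pressure in [1, 2]; higher values favour better-ranked individuals."""
            n = population_size
            # rank 0 = worst individual, rank n-1 = best individual
            return [((2 - pressure) + 2 * (pressure - 1) * r / (n - 1)) / n
                    for r in range(n)]

        fitness = [12.0, 7.5, 9.1, 15.3, 8.8]             # e.g. makespans (lower is better)
        order = sorted(range(len(fitness)), key=lambda i: fitness[i], reverse=True)
        probs = linear_rank_probabilities(len(fitness))   # aligned with worst-to-best order
        selected = random.choices(order, weights=probs, k=3)
        print(selected)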

  15. RANWAR: rank-based weighted association rule mining from gene expression and methylation data.

    PubMed

    Mallik, Saurav; Mukhopadhyay, Anirban; Maulik, Ujjwal

    2015-01-01

    Ranking of association rules is currently an interesting topic in data mining and bioinformatics. The huge number of rules over items (or genes) produced by association rule mining (ARM) algorithms can overwhelm the decision maker. In this article, we propose a weighted rule-mining technique, RANWAR (rank-based weighted association rule mining), that ranks the rules using two novel rule-interestingness measures, rank-based weighted condensed support (wcs) and weighted condensed confidence (wcc), to address this problem. These measures depend on the ranks of the items (genes); using the rank, we assign a weight to each item. RANWAR generates far fewer frequent itemsets than state-of-the-art association rule mining algorithms, which reduces execution time. We run RANWAR on gene expression and methylation datasets. The genes of the top rules are biologically validated by Gene Ontologies (GOs) and KEGG pathway analyses. Many top-ranked rules extracted by RANWAR that rank poorly under traditional Apriori are highly biologically significant for the related diseases. Finally, the top rules found by RANWAR that are not found by Apriori are reported.

  16. Hazard-ranking of agricultural pesticides for chronic health effects in Yuma County, Arizona.

    PubMed

    Sugeng, Anastasia J; Beamer, Paloma I; Lutz, Eric A; Rosales, Cecilia B

    2013-10-01

    With thousands of pesticides registered by the United States Environmental Protection Agency, it is not feasible to sample for all pesticides applied in agricultural communities. Hazard-ranking pesticides based on use, toxicity, and exposure potential can help prioritize community-specific pesticide hazards. This study applied hazard-ranking schemes for cancer, endocrine disruption, and reproductive/developmental toxicity in Yuma County, Arizona. An existing cancer hazard-ranking scheme was modified, and novel schemes for endocrine disruption and reproductive/developmental toxicity were developed to rank pesticide hazards. The hazard-ranking schemes accounted for pesticide use, toxicity, and exposure potential based on the chemical properties of each pesticide. Pesticides were ranked as hazards with respect to each health effect, as well as overall chronic health effects. The highest hazard-ranked pesticides for overall chronic health effects were maneb, metam-sodium, trifluralin, pronamide, and bifenthrin. The relative pesticide rankings were unique for each health effect. The highest hazard-ranked pesticides differed from those most heavily applied, as well as from those previously detected in Yuma homes over a decade ago. The most hazardous pesticides for cancer in Yuma County, Arizona were also different from those in a previous hazard-ranking applied in California. Hazard-ranking schemes that take into account pesticide use, toxicity, and exposure potential can help prioritize the pesticides of greatest health risk in agricultural communities. This study is the first to provide pesticide hazard-rankings for endocrine disruption and reproductive/developmental toxicity based on use, toxicity, and exposure potential. These hazard-ranking schemes can be applied to other agricultural communities to prioritize community-specific pesticide hazards and reduce health risk. Copyright © 2013 Elsevier B.V. All rights reserved.
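
    As a purely hypothetical illustration of combining use, toxicity, and exposure potential into a single hazard rank (not the study's actual scheme), consider the following sketch.

        # Hypothetical composite hazard score: use x toxicity x exposure potential.
        # The pesticide names, scores, and multiplicative aggregation are
        # illustrative assumptions only.

        pesticides = {
            # name: (annual use score, chronic toxicity score, exposure potential score)
            "pesticide_A": (3, 5, 2),
            "pesticide_B": (5, 2, 4),
            "pesticide_C": (2, 4, 5),
        }

        def hazard_score(use, toxicity, exposure):
            return use * toxicity * exposure

        ranking = sorted(pesticides.items(),
                         key=lambda item: hazard_score(*item[1]),
                         reverse=True)
        for rank, (name, scores) in enumerate(ranking, start=1):
            print(rank, name, hazard_score(*scores))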

  17. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    PubMed

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In a second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare svm-based survival models based on ranking constraints, based on regression constraints, and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have significantly different survival from patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that svm-based models using regression constraints perform significantly better than svm-based models based on ranking constraints. Our experiments show comparable performance for methods including only regression or both regression and ranking constraints on clinical data. On high-dimensional data, the former model performs better. However, this approach does not have a theoretical link with standard statistical models for survival data. This link can be made by means of transformation models when ranking constraints are included. Copyright © 2011 Elsevier B.V. All rights reserved.
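
    The concordance index that underlies the ranking formulation can be computed for right-censored data as in the sketch below; the toy survival times, event indicators, and risk scores are hypothetical.

        # Concordance index (c-index) for right-censored survival data: the fraction
        # of comparable pairs whose predicted risk ordering agrees with the observed
        # ordering of event times. Toy data are illustrative.

        def concordance_index(times, events, risk_scores):
            concordant, comparable = 0.0, 0
            n = len(times)
            for i in range(n):
                for j in range(n):
                    # pair is comparable if subject i had an event before time j
                    if events[i] == 1 and times[i] < times[j]:
                        comparable += 1
                        if risk_scores[i] > risk_scores[j]:
                            concordant += 1.0
                        elif risk_scores[i] == risk_scores[j]:
                            concordant += 0.5
            return concordant / comparable if comparable else float("nan")

        times = [5, 8, 12, 3, 9]          # follow-up times
        events = [1, 0, 1, 1, 1]          # 1 = event observed, 0 = censored
        risk = [2.1, 0.4, 0.2, 3.0, 1.0]  # higher predicted risk should mean earlier event
        print(concordance_index(times, events, risk))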

  18. Bayesian Inference of High-Dimensional Dynamical Ocean Models

    NASA Astrophysics Data System (ADS)

    Lin, J.; Lermusiaux, P. F. J.; Lolla, S. V. T.; Gupta, A.; Haley, P. J., Jr.

    2015-12-01

    This presentation addresses a holistic set of challenges in high-dimension ocean Bayesian nonlinear estimation: i) predict the probability distribution functions (pdfs) of large nonlinear dynamical systems using stochastic partial differential equations (PDEs); ii) assimilate data using Bayes' law with these pdfs; iii) predict the future data that optimally reduce uncertainties; and (iv) rank the known and learn the new model formulations themselves. Overall, we allow the joint inference of the state, equations, geometry, boundary conditions and initial conditions of dynamical models. Examples are provided for time-dependent fluid and ocean flows, including cavity, double-gyre and Strait flows with jets and eddies. The Bayesian model inference, based on limited observations, is illustrated first by the estimation of obstacle shapes and positions in fluid flows. Next, the Bayesian inference of biogeochemical reaction equations and of their states and parameters is presented, illustrating how PDE-based machine learning can rigorously guide the selection and discovery of complex ecosystem models. Finally, the inference of multiscale bottom gravity current dynamics is illustrated, motivated in part by classic overflows and dense water formation sites and their relevance to climate monitoring and dynamics. This is joint work with our MSEAS group at MIT.

  19. Hydrologic Drought Decision Support System (HyDroDSS)

    USGS Publications Warehouse

    Granato, Gregory E.

    2014-01-01

    The hydrologic drought decision support system (HyDroDSS) was developed by the U.S. Geological Survey (USGS) in cooperation with the Rhode Island Water Resources Board (RIWRB) for use in the analysis of hydrologic variables that may indicate the risk for streamflows to be below user-defined flow targets at a designated site of interest, which is defined herein as data-collection site on a stream that may be adversely affected by pumping. Hydrologic drought is defined for this study as a period of lower than normal streamflows caused by precipitation deficits and (or) water withdrawals. The HyDroDSS is designed to provide water managers with risk-based information for balancing water-supply needs and aquatic-habitat protection goals to mitigate potential effects of hydrologic drought. This report describes the theory and methods for retrospective streamflow-depletion analysis, rank correlation analysis, and drought-projection analysis. All three methods are designed to inform decisions made by drought steering committees and decisionmakers on the basis of quantitative risk assessment. All three methods use estimates of unaltered streamflow, which is the measured or modeled flow without major withdrawals or discharges, to approximate a natural low-flow regime. Retrospective streamflow-depletion analysis can be used by water-resource managers to evaluate relations between withdrawal plans and the potential effects of withdrawal plans on streams at one or more sites of interest in an area. Retrospective streamflow-depletion analysis indicates the historical risk of being below user-defined flow targets if different pumping plans were implemented for the period of record. Retrospective streamflow-depletion analysis also indicates the risk for creating hydrologic drought conditions caused by use of a pumping plan. Retrospective streamflow-depletion analysis is done by calculating the net streamflow depletions from withdrawals and discharges and applying these depletions to a simulated record of unaltered streamflow. Rank correlation analysis in the HyDroDSS indicates the persistence of hydrologic measurements from month to month for the prediction of developing hydrologic drought conditions and quantitatively indicates which hydrologic variables may be used to indicate the onset of hydrologic drought conditions. Rank correlation analysis also indicates the potential use of each variable for estimating the monthly minimum unaltered flow at a site of interest for use in the drought-projection analysis. Rank correlation analysis in the HyDroDSS is done by calculating Spearman’s rho for paired samples and the 95-percent confidence limits of this rho value. Rank correlation analysis can be done by using precipitation, groundwater levels, measured streamflows, and estimated unaltered streamflows. Serial correlation analysis, which indicates relations between current and future values, can be done for a single site. Cross correlation analysis, which indicates relations among current values at one site and current and future values at a second site, also can be done. Drought-projection analysis in the HyDroDSS indicates the risk for being in a hydrologic drought condition during the current month and the five following months with and without pumping. 
Drought-projection analysis also indicates the potential effectiveness of water-conservation methods for mitigating the effect of withdrawals in the coming months on the basis of the amount of depletion caused by different pumping plans and on the risk of unaltered flows being below streamflow targets. Drought-projection analysis in the HyDroDSS is done with Monte Carlo methods by using the position analysis method. In this method the initial value of estimated unaltered streamflows is calculated by correlation to a measured hydrologic variable (monthly precipitation, groundwater levels, or streamflows from an index station identified with the rank correlation analysis). Then a pseudorandom number generator is used to create 251 six-month-long flow traces by using a bootstrap method. Serial correlation of the estimated unaltered monthly minimum streamflows determined from the rank correlation analysis is preserved within each flow trace. The sample of unaltered streamflows indicates the risk of being below flow targets in the coming months under simulated natural conditions (without historic withdrawals). The streamflow-depletion algorithms are then used to estimate risks of flow being below targets if selected pumping plans are used. This report also describes the implementation of the HyDroDSS. The HyDroDSS was developed as a Microsoft Access® database application to facilitate storage, handling, and use of hydrologic datasets with a simple graphical user interface. The program is implemented in the database by using the Visual Basic for Applications® (VBA) programming language. Program source code for the analytical techniques is provided in the HyDroDSS and in electronic text files accompanying this report. Program source code for the graphical user interface and for data-handling code, which is specific to Microsoft Access® and the HyDroDSS, is provided in the database. An installation package with a run-time version of the software is available with this report for potential users who do not have a compatible copy of Microsoft Access®. Administrative rights are needed to install this version of the HyDroDSS. A case study, to demonstrate the use of HyDroDSS and interpretation of results for a site of interest, is detailed for the USGS streamgage on the Hunt River (station 01117000) near East Greenwich in central Rhode Island. The Hunt River streamgage was used because it has a long record of streamflow and is in a well-studied basin with a substantial amount of hydrologic and water-use data including groundwater pumping for municipal water supply.
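
    Since the rank correlation analysis described above rests on Spearman's rho, a minimal computation of rho from paired monthly values is sketched below; the sample data are hypothetical, ties are handled with average ranks, and the confidence-limit calculation used in the HyDroDSS is not reproduced.

        # Spearman's rank correlation for paired hydrologic samples (illustrative).
        # Ties receive average ranks.

        def average_ranks(values):
            order = sorted(range(len(values)), key=lambda i: values[i])
            ranks = [0.0] * len(values)
            i = 0
            while i < len(order):
                j = i
                while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                    j += 1
                mean_rank = (i + j) / 2.0 + 1.0       # ranks are 1-based
                for k in range(i, j + 1):
                    ranks[order[k]] = mean_rank
                i = j + 1
            return ranks

        def spearman_rho(x, y):
            rx, ry = average_ranks(x), average_ranks(y)
            n = len(x)
            mean = (n + 1) / 2.0
            num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
            den = (sum((a - mean) ** 2 for a in rx) *
                   sum((b - mean) ** 2 for b in ry)) ** 0.5
            return num / den

        precip = [3.1, 4.0, 2.2, 5.6, 1.9, 3.3]        # e.g. monthly precipitation
        low_flow = [0.8, 1.1, 0.6, 1.5, 0.5, 0.9]      # e.g. monthly minimum streamflow
        print(spearman_rho(precip, low_flow))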

  20. A Case-Based Reasoning Method with Rank Aggregation

    NASA Astrophysics Data System (ADS)

    Sun, Jinhua; Du, Jiao; Hu, Jian

    2018-03-01

    To improve the accuracy of case-based reasoning (CBR), this paper proposes a new CBR framework built on the principle of rank aggregation. First, ranking methods are defined in each attribute subspace of the cases, yielding an ordering relation between cases on each attribute and, from these orderings, a ranking matrix. Second, similar-case retrieval from the ranking matrix is cast as a rank aggregation optimization problem solved with the Kemeny optimal criterion. On this basis, a rank aggregation case-based reasoning algorithm, named RA-CBR, is designed. Experiments on UCI data sets show that the case retrieval accuracy of RA-CBR is higher than that of Euclidean-distance CBR and Mahalanobis-distance CBR, so we conclude that RA-CBR can increase the performance and efficiency of CBR.
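
    A brute-force version of the Kemeny-optimal aggregation step can be sketched as follows (feasible only for a handful of cases); the per-attribute orderings are hypothetical.

        # Brute-force Kemeny-optimal rank aggregation over per-attribute orderings.
        # Practical only for small numbers of cases; the input rankings are illustrative.
        from itertools import permutations

        def kendall_tau_distance(ranking_a, ranking_b):
            """Number of pairs ordered differently by the two rankings."""
            pos_a = {item: i for i, item in enumerate(ranking_a)}
            pos_b = {item: i for i, item in enumerate(ranking_b)}
            items = list(ranking_a)
            return sum(1
                       for i in range(len(items))
                       for j in range(i + 1, len(items))
                       if (pos_a[items[i]] - pos_a[items[j]]) *
                          (pos_b[items[i]] - pos_b[items[j]]) < 0)

        def kemeny_aggregate(rankings):
            items = rankings[0]
            best, best_cost = None, None
            for candidate in permutations(items):
                cost = sum(kendall_tau_distance(candidate, r) for r in rankings)
                if best_cost is None or cost < best_cost:
                    best, best_cost = candidate, cost
            return list(best), best_cost

        # Orderings of four stored cases by similarity on three different attributes.
        attribute_rankings = [["c1", "c2", "c3", "c4"],
                              ["c2", "c1", "c3", "c4"],
                              ["c1", "c3", "c2", "c4"]]
        print(kemeny_aggregate(attribute_rankings))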

  1. Using Trained Pixel Classifiers to Select Images of Interest

    NASA Technical Reports Server (NTRS)

    Mazzoni, D.; Wagstaff, K.; Castano, R.

    2004-01-01

    We present a machine-learning-based approach to ranking images based on learned priorities. Unlike previous methods for image evaluation, which typically assess the value of each image based on the presence of predetermined specific features, this method involves using two levels of machine-learning classifiers: one level is used to classify each pixel as belonging to one of a group of rather generic classes, and another level is used to rank the images based on these pixel classifications, given some example rankings from a scientist as a guide. Initial results indicate that the technique works well, producing new rankings that match the scientist's rankings significantly better than would be expected by chance. The method is demonstrated for a set of images collected by a Mars field-test rover.

  2. An Efficient Rank Based Approach for Closest String and Closest Substring

    PubMed Central

    2012-01-01

    This paper aims to present a new genetic approach that uses rank distance for solving two known NP-hard problems, and to compare rank distance with other distance measures for strings. The two NP-hard problems we are trying to solve are closest string and closest substring. For each problem we build a genetic algorithm and we describe the genetic operations involved. Both genetic algorithms use a fitness function based on rank distance. We compare our algorithms with other genetic algorithms that use different distance measures, such as Hamming distance or Levenshtein distance, on real DNA sequences. Our experiments show that the genetic algorithms based on rank distance have the best results. PMID:22675483
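
    One common formulation of rank distance between strings, which may differ in details from the variant used in the paper, is sketched below: each symbol occurrence is indexed, shared indexed symbols contribute the difference of their positions, and unmatched ones contribute their own positions.

        # Hedged sketch of rank distance between two strings: index each symbol
        # occurrence (a -> a1, a2, ...), sum position differences for shared indexed
        # symbols, and add the positions of unmatched ones.

        def indexed_positions(s):
            seen, positions = {}, {}
            for i, ch in enumerate(s, start=1):
                seen[ch] = seen.get(ch, 0) + 1
                positions[(ch, seen[ch])] = i
            return positions

        def rank_distance(u, v):
            pu, pv = indexed_positions(u), indexed_positions(v)
            shared = pu.keys() & pv.keys()
            dist = sum(abs(pu[k] - pv[k]) for k in shared)
            dist += sum(pu[k] for k in pu.keys() - shared)
            dist += sum(pv[k] for k in pv.keys() - shared)
            return dist

        print(rank_distance("ACGGT", "AGGCT"))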

  3. Desirability-based methods of multiobjective optimization and ranking for global QSAR studies. Filtering safe and potent drug candidates from combinatorial libraries.

    PubMed

    Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa

    2008-01-01

    Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature. However, none of them report the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function that allows conducting global QSAR studies, simultaneously considering the potency, bioavailability, and safety of a set of drug candidates, is introduced. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking drug candidates with unknown pharmaceutical properties from combinatorial libraries according to the degree of similarity with the previously determined optimal candidate. Application of this method will make it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method of the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by its application to a library of 95 fluoroquinolones, reporting their gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
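
    A minimal sketch of a one-sided Derringer desirability function and the geometric-mean aggregation it supports is given below; the thresholds, exponents, and example property values are hypothetical.

        # One-sided Derringer desirability: 0 below a lower bound, 1 above a target,
        # a power curve in between; individual desirabilities are combined by a
        # geometric mean. Bounds, exponents and example values are hypothetical.

        def desirability_larger_is_better(y, low, target, s=1.0):
            if y <= low:
                return 0.0
            if y >= target:
                return 1.0
            return ((y - low) / (target - low)) ** s

        def overall_desirability(values):
            d = 1.0
            for v in values:
                d *= v
            return d ** (1.0 / len(values))

        # Hypothetical candidate with predicted potency, bioavailability and safety scores.
        d_potency = desirability_larger_is_better(7.2, low=5.0, target=9.0)
        d_bioavail = desirability_larger_is_better(0.62, low=0.3, target=0.8)
        d_safety = desirability_larger_is_better(0.9, low=0.5, target=1.0)
        print(overall_desirability([d_potency, d_bioavail, d_safety]))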

  4. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.

  5. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    PubMed

    Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang

    2017-12-01

    Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset than the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.

  6. Rehabbing the Rankings: Fool's Errand or the Lord's Work?

    ERIC Educational Resources Information Center

    Kuh, George D.

    2011-01-01

    For better or worse, rankings shape public conceptions of collegiate quality. This paper reviews the history of rankings, analyzes what they represent, explores recent efforts to employ indicators in addition to institutional resources and reputation on which the most popular rankings are based, and evaluates the extent to which rankings serve…

  7. Image Re-Ranking Based on Topic Diversity.

    PubMed

    Qian, Xueming; Lu, Dan; Wang, Yaxiong; Zhu, Li; Tang, Yuan Yan; Wang, Meng

    2017-08-01

    Social media sharing websites allow users to annotate images with free tags, which significantly contribute to the development of web image retrieval. Tag-based image search is an important method for finding images shared by users in social networks. However, making the top-ranked results both relevant and diverse is challenging. In this paper, we propose a topic-diverse ranking approach for tag-based image retrieval that also promotes topic coverage. First, we construct a tag graph based on the similarity between tags. Then, community detection is applied to mine the topic community of each tag. After that, inter-community and intra-community ranking are introduced to obtain the final retrieval results. In the inter-community ranking process, an adaptive random walk model is employed to rank the communities based on multiple kinds of information about each topic community. In addition, we build an inverted index structure for images to accelerate the search process. Experimental results on the Flickr and NUS-WIDE data sets show the effectiveness of the proposed approach.

  8. Ranking of healthcare programmes based on health outcome, health costs and safe delivery of care in hospital pharmacy practice.

    PubMed

    Brisseau, Lionel; Bussières, Jean-François; Bois, Denis; Vallée, Marc; Racine, Marie-Claude; Bonnici, André

    2013-02-01

    To establish a consensual and coherent ranking of healthcare programmes that involve the presence of ward-based and clinic-based clinical pharmacists, based on health outcome, health costs and safe delivery of care. This descriptive study was derived from a structured dialogue (Delphi technique) among directors of pharmacy departments. We established a quantitative profile of healthcare programmes at five sites that involved the provision of ward-based and clinic-based pharmaceutical care. A summary table of evidence established a unique quality rating per inpatient (clinic-based) or outpatient (ward-based) healthcare programme. Each director rated the perceived impact of pharmaceutical care per inpatient or outpatient healthcare programme on three fields: health outcome, health costs and safe delivery of care. They agreed by consensus on the final ranking of healthcare programmes. A ranking was assigned for each of the 18 healthcare programmes for outpatient care and the 17 healthcare programmes for inpatient care involving the presence of pharmacists, based on health outcome, health costs and safe delivery of care. There was a good correlation between the ranking based on data from a 2007-2008 Canadian report on hospital pharmacy practice and the ranking proposed by the directors of pharmacy departments. Given the often limited human and financial resources, managers should consider the best evidence available on a profession's impact to plan healthcare services within an organization. Few data exist on ranking healthcare programmes to prioritize which would benefit most from the delivery of pharmaceutical care by ward-based and clinic-based pharmacists. © 2012 The Authors. IJPP © 2012 Royal Pharmaceutical Society.

  9. Learning to rank using user clicks and visual features for image retrieval.

    PubMed

    Yu, Jun; Tao, Dacheng; Wang, Meng; Rui, Yong

    2015-04-01

    The inconsistency between textual features and visual contents can cause poor image search results. To solve this problem, click features, which are more reliable than textual information in justifying the relevance between a query and clicked images, are adopted in image ranking models. However, existing ranking models cannot integrate visual features, which are effective in refining click-based search results. In this paper, we propose a novel ranking model based on the learning to rank framework. Visual features and click features are simultaneously utilized to obtain the ranking model. Specifically, the proposed approach is based on large-margin structured output learning, and the visual consistency is integrated with the click features through a hypergraph regularizer term. In accordance with the fast alternating linearization method, we design a novel algorithm to optimize the objective function. This algorithm alternately minimizes two different approximations of the original objective function by keeping one function unchanged and linearizing the other. We conduct experiments on a large-scale dataset collected from the Microsoft Bing image search engine, and the results demonstrate that the proposed learning to rank models based on visual features and user clicks outperform state-of-the-art algorithms.

  10. AptRank: an adaptive PageRank model for protein function prediction on bi-relational graphs.

    PubMed

    Jiang, Biaobin; Kloster, Kyle; Gleich, David F; Gribskov, Michael

    2017-06-15

    Diffusion-based network models are widely used for protein function prediction using protein network data and have been shown to outperform neighborhood-based and module-based methods. Recent studies have shown that integrating the hierarchical structure of the Gene Ontology (GO) data dramatically improves prediction accuracy. However, previous methods usually either used the GO hierarchy to refine the prediction results of multiple classifiers, or flattened the hierarchy into a function-function similarity kernel. No study has taken the GO hierarchy into account together with the protein network as a two-layer network model. We first construct a Bi-relational graph (Birg) model comprised of both protein-protein association and function-function hierarchical networks. We then propose two diffusion-based methods, BirgRank and AptRank, both of which use PageRank to diffuse information on this two-layer graph model. BirgRank is a direct application of traditional PageRank with fixed decay parameters. In contrast, AptRank utilizes an adaptive diffusion mechanism to improve the performance of BirgRank. We evaluate the ability of both methods to predict protein function on yeast, fly and human protein datasets, and compare with four previous methods: GeneMANIA, TMC, ProteinRank and clusDCA. We design four different validation strategies: missing function prediction, de novo function prediction, guided function prediction and newly discovered function prediction to comprehensively evaluate predictability of all six methods. We find that both BirgRank and AptRank outperform the previous methods, especially in missing function prediction when using only 10% of the data for training. The MATLAB code is available at https://github.rcac.purdue.edu/mgribsko/aptrank . gribskov@purdue.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  11. Review assessment support in Open Journal System using TextRank

    NASA Astrophysics Data System (ADS)

    Manalu, S. R.; Willy; Sundjaja, A. M.; Noerlina

    2017-01-01

    In this paper, a review assessment support in Open Journal System (OJS) using TextRank is proposed. OJS is an open-source journal management platform that provides a streamlined journal publishing workflow. TextRank is an unsupervised, graph-based ranking model commonly used as extractive auto summarization of text documents. This study applies the TextRank algorithm to summarize 50 article reviews from an OJS-based international journal. The resulting summaries are formed using the most representative sentences extracted from the reviews. The summaries are then used to help OJS editors in assessing a review’s quality.

  12. Design of Complex Systems to Achieve Passive Safety: Natural Circulation Cooling of Liquid Salt Pebble Bed Reactors

    NASA Astrophysics Data System (ADS)

    Scarlat, Raluca Olga

    This dissertation treats system design, modeling of transient system response, and characterization of individual phenomena and demonstrates a framework for integration of these three activities early in the design process of a complex engineered system. A system analysis framework for prioritization of experiments, modeling, and development of detailed design is proposed. Two fundamental topics in thermal-hydraulics are discussed, which illustrate the integration of modeling and experimentation with nuclear reactor design and safety analysis: thermal-hydraulic modeling of heat generating pebble bed cores, and scaled experiments for natural circulation heat removal with Boussinesq liquids. The case studies used in this dissertation are derived from the design and safety analysis of a pebble bed fluoride salt cooled high temperature nuclear reactor (PB-FHR), currently under development in the United States at the university and national laboratories level. In the context of the phenomena identification and ranking table (PIRT) methodology, new tools and approaches are proposed and demonstrated here, which are specifically relevant to technology in the early stages of development, and to analysis of passive safety features. A system decomposition approach is proposed. Definition of system functional requirements complements identification and compilation of the current knowledge base for the behavior of the system. Two new graphical tools are developed for ranking of phenomena importance: a phenomena ranking map, and a phenomena identification and ranking matrix (PIRM). The functional requirements established through this methodology were used for the design and optimization of the reactor core, and for the transient analysis and design of the passive natural circulation driven decay heat removal system for the PB-FHR. A numerical modeling approach for heat-generating porous media, with multi-dimensional fluid flow is presented. The application of this modeling approach to the PB-FHR annular pebble bed core cooled by fluoride salt mixtures generated a model that is called Pod. Pod. was used to show the resilience of the PB-FHR core to generation of hot spots or cold spots, due to the effect of buoyancy on the flow and temperature distribution in the packed bed. Pod. was used to investigate the PB-FHR response to ATWS transients. Based on the functional requirements for the core, Pod. was used to generate an optimized design of the flow distribution in the core. An analysis of natural circulation loops cooled by single-phase Boussinesq fluids is presented here, in the context of reactor design that relies on natural circulation decay heat removal, and design of scaled experiments. The scaling arguments are established for a transient natural circulation loop, for loops that have long fluid residence time, and negligible contribution of fluid inertia to the momentum equation. The design of integral effects tests for the loss of forced circulation (LOFC) for PB-FHR is discussed. The special case of natural circulation decay heat removal from a pebble bed reactor was analyzed. A way to define the Reynolds number in a multi-dimensional pebble bed was identified. The scaling methodology for replicating pebble bed friction losses using an electrically resistance heated annular pipe and a needle valve was developed. The thermophysical properties of liquid fluoride salts lead to design of systems with low flow velocities, and hence long fluid residence times. 
A comparison among liquid coolants for the performance of steady-state natural circulation heat removal from a pebble bed was performed. Transient natural circulation experimental data with simulant fluids for fluoride salts are presented here. The low flow velocity and the relatively high viscosity of the fluoride salts lead to low Reynolds number flows, and a low Reynolds number in conjunction with a sufficiently high coefficient of thermal expansion makes the system susceptible to local buoyancy effects. Experiments indicate that slow exchange of stagnant fluid in static legs can play a significant role in the transient response of natural circulation loops. The effect of non-linear temperature profiles on the hot or cold legs or other segments of the flow loop, which may develop during transient scenarios, should be considered when modeling the performance of natural circulation loops. The data provided here can be used for validation of the application of thermal-hydraulic systems codes to the modeling of heat removal by natural circulation with liquid fluoride salts and their simulant fluids.

  13. Can streamlined multi-criteria decision analysis be used to implement shared decision making for colorectal cancer screening?

    PubMed Central

    Dolan, James G.; Boohaker, Emily; Allison, Jeroan; Imperiale, Thomas F.

    2013-01-01

    Background Current US colorectal cancer screening guidelines that call for shared decision making regarding the choice among several recommended screening options are difficult to implement. Multi-criteria decision analysis (MCDA) is an established methodology well suited for supporting shared decision making. Our study goal was to determine if a streamlined form of MCDA using rank order based judgments can accurately assess patients’ colorectal cancer screening priorities. Methods We converted priorities for four decision criteria and three sub-criteria regarding colorectal cancer screening obtained from 484 average risk patients using the Analytic Hierarchy Process (AHP) in a prior study into rank order-based priorities using rank order centroids. We compared the two sets of priorities using Spearman rank correlation and non-parametric Bland-Altman limits of agreement analysis. We assessed the differential impact of using the rank order-based versus the AHP-based priorities on the results of a full MCDA comparing three currently recommended colorectal cancer screening strategies. Generalizability of the results was assessed using Monte Carlo simulation. Results Correlations between the two sets of priorities for the seven criteria ranged from 0.55 to 0.92. The proportions of absolute differences between rank order-based and AHP-based priorities that were more than ± 0.15 ranged from 1% to 16%. Differences in the full MCDA results were minimal and the relative rankings of the three screening options were identical more than 88% of the time. The Monte Carlo simulation results were similar. Conclusion Rank order-based MCDA could be a simple, practical way to guide individual decisions and assess population decision priorities regarding colorectal cancer screening strategies. Additional research is warranted to further explore the use of these methods for promoting shared decision making. PMID:24300851
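
    The rank order centroid weights used to convert ordinal judgments into priorities have a simple closed form, sketched below for an illustrative four-criterion case.

        # Rank order centroid (ROC) weights: for n criteria, the criterion ranked k-th
        # (k = 1 is most important) receives w_k = (1/n) * sum_{i=k..n} 1/i.

        def rank_order_centroids(n):
            return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

        weights = rank_order_centroids(4)     # e.g. four ranked decision criteria
        print(weights, sum(weights))          # weights sum to 1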

  14. Clinical evaluation of flowable resins in non-carious cervical lesions: two-year results.

    PubMed

    Celik, Cigdem; Ozgünaltay, Gül; Attar, Nuray

    2007-01-01

    This study evaluated the two-year clinical performance of one microhybrid composite and three different types of flowable resin materials in non-carious cervical lesions. A total of 252 noncarious cervical lesions were restored in 37 patients (12 male, 25 female) with Admira Flow, Dyract Flow, Filtek Flow and Filtek Z250, according to manufacturers' instructions. All the restorations were placed by one operator, and two other examiners evaluated the restorations clinically within one week after placement and after 6, 12, 18 and 24 months, using modified USPHS criteria. At the end of 24 months, 172 restorations were evaluated in 26 patients, with a recall rate of 68%. Statistical analysis was completed using the Pearson Chi-square and Fisher-Freeman-Halton tests (p < 0.05). Additionally, survival rates were analyzed with the Kaplan-Meier estimator and the Log-Rank test (p < 0.05). The Log-Rank test indicated statistically significant differences between the survival rates of Dyract Flow/Admira Flow and Dyract Flow/Filtek Z250 (p < 0.05). While there was a statistically significant difference between Dyract Flow and the other materials for color match at 12 and 18 months, no significant difference was observed among all of the materials tested at 24 months. Significant differences were revealed between Filtek Z250 and the other materials for marginal adaptation at 18 and 24 months (p < 0.05). With respect to marginal discoloration, secondary caries, surface texture and anatomic form, no significant differences were found between the resin materials (p > 0.05). It was concluded that different types of resin materials demonstrated acceptable clinical performance in non-carious cervical lesions, except for the retention rates of the Dyract Flow restorations.

  15. Improving the Rank Precision of Population Health Measures for Small Areas with Longitudinal and Joint Outcome Models

    PubMed Central

    Athens, Jessica K.; Remington, Patrick L.; Gangnon, Ronald E.

    2015-01-01

    Objectives The University of Wisconsin Population Health Institute has published the County Health Rankings since 2010. These rankings use population-based data to highlight health outcomes and the multiple determinants of these outcomes and to encourage in-depth health assessment for all United States counties. A significant methodological limitation, however, is the uncertainty of rank estimates, particularly for small counties. To address this challenge, we explore the use of longitudinal and pooled outcome data in hierarchical Bayesian models to generate county ranks with greater precision. Methods In our models we used pooled outcome data for three measure groups: (1) Poor physical and poor mental health days; (2) percent of births with low birth weight and fair or poor health prevalence; and (3) age-specific mortality rates for nine age groups. We used the fixed and random effects components of these models to generate posterior samples of rates for each measure. We also used time-series data in longitudinal random effects models for age-specific mortality. Based on the posterior samples from these models, we estimate ranks and rank quartiles for each measure, as well as the probability of a county ranking in its assigned quartile. Rank quartile probabilities for univariate, joint outcome, and/or longitudinal models were compared to assess improvements in rank precision. Results The joint outcome model for poor physical and poor mental health days resulted in improved rank precision, as did the longitudinal model for age-specific mortality rates. Rank precision for low birth weight births and fair/poor health prevalence based on the univariate and joint outcome models were equivalent. Conclusion Incorporating longitudinal or pooled outcome data may improve rank certainty, depending on characteristics of the measures selected. For measures with different determinants, joint modeling neither improved nor degraded rank precision. This approach suggests a simple way to use existing information to improve the precision of small-area measures of population health. PMID:26098858

  16. Applying a "Big Data" Literature System to Recommend Antihypertensive Drugs for Hypertension Patients with Diabetes Mellitus.

    PubMed

    Shu, Jing-Xian; Li, Ying; He, Ting; Chen, Ling; Li, Xue; Zou, Lin-Lin; Yin, Lu; Li, Xiao-Hui; Wang, An-Li; Liu, Xing; Yuan, Hong

    2018-01-07

    BACKGROUND The explosive increase in medical literature has changed therapeutic strategies, but it is challenging for physicians to keep up to date with the medical literature. Scientific literature data mining on a large scale can be used to refresh physician knowledge and improve the quality of disease treatment. MATERIAL AND METHODS This paper reports on a reformulated version of a data mining method called MedRank, a network-based algorithm that ranks therapies for a target disease based on the MEDLINE literature database. The MedRank algorithm input for this study was a clear definition of the disease model; the algorithm output was the recommendation of antihypertensive drugs. Hypertension with diabetes mellitus was chosen as the input disease model. The ranking output of antihypertensive drugs is based on the Joint National Committee (JNC) guidelines, one through eight, and the publication dates, ≤1977, ≤1980, ≤1984, ≤1988, ≤1993, ≤1997, ≤2003, and ≤2013. McNemar's test was used to evaluate the efficacy of MedRank based on specific JNC guidelines. RESULTS The ranking order of antihypertensive drugs changed with the date of the published literature, and the MedRank algorithm drug recommendations had excellent consistency with the JNC guidelines in 2013 (P=1.00 from McNemar's test, Kappa=0.78, P=1.00). Moreover, the Kappa index increased over time. Sensitivity was better than specificity for MedRank; in addition, sensitivity was maintained at a high level, and specificity increased from 1997 to 2013. CONCLUSIONS The use of MedRank in ranking medical literature on hypertension with diabetes mellitus in our study suggests possible application in clinical practice; it is a potential method for supporting antihypertensive drug-prescription decisions.

  17. Hot corrosion studies of four nickel-base superalloys - B-1900, NASA-TRW VIA, 713C and IN738

    NASA Technical Reports Server (NTRS)

    Fryburg, G. C.; Kohl, F. J.; Stearns, C. A.

    1976-01-01

    The susceptibility to hot corrosion of four nickel-base superalloys has been studied at 900 and 1000 °C in one atmosphere of slowly flowing oxygen. Hot corrosion was induced by coating the samples with known doses of Na2SO4 and oxidizing the coated samples isothermally on a sensitive microbalance. In order of descending susceptibility to hot corrosion, the alloys were ranked: B-1900, 713C, NASA-TRW VIA, IN738. This order corresponds to the order of decreasing molybdenum content of the alloys. Chemical evidence for B-1900 indicates that hot corrosion is instigated by acid fluxing of the protective Al2O3 coating by MoO3.

  18. MiRNA-TF-gene network analysis through ranking of biomolecules for multi-informative uterine leiomyoma dataset.

    PubMed

    Mallik, Saurav; Maulik, Ujjwal

    2015-10-01

    Gene ranking is an important problem in bioinformatics. Here, we propose a new framework for ranking biomolecules (viz., miRNAs, transcription factors/TFs and genes) in a multi-informative uterine leiomyoma dataset having both gene expression and methylation data, using a (statistical) eigenvector centrality-based approach. First, genes that are both differentially expressed and methylated are identified using the Limma statistical test. A network comprising these genes, the corresponding TFs from the TRANSFAC and ITFP databases, and the targeting miRNAs from the miRWalk database is then built. The biomolecules are then ranked based on eigenvector centrality. Our proposed method provides better average accuracy in hub gene and non-hub gene classification than other methods. Furthermore, pre-ranked Gene Set Enrichment Analysis is applied to the pathway database as well as the GO-term databases of the Molecular Signatures Database, providing a pre-ranked gene list based on different centrality values to compare the ranking methods. Finally, top novel potential gene markers for uterine leiomyoma are provided. Copyright © 2015 Elsevier Inc. All rights reserved.
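
    The eigenvector centrality ranking referred to above can be computed by power iteration on the adjacency matrix, as in the sketch below; the toy network, node names, and edges are illustrative only.

        # Eigenvector centrality by power iteration over an adjacency structure
        # (toy miRNA-TF-gene style network; nodes and edges are illustrative).

        def eigenvector_centrality(adjacency, nodes, iterations=200, tol=1e-10):
            score = {v: 1.0 for v in nodes}
            for _ in range(iterations):
                new_score = {v: sum(score[u] for u in adjacency.get(v, ())) for v in nodes}
                norm = sum(s * s for s in new_score.values()) ** 0.5 or 1.0
                new_score = {v: s / norm for v, s in new_score.items()}
                if sum(abs(new_score[v] - score[v]) for v in nodes) < tol:
                    return new_score
                score = new_score
            return score

        edges = [("miR-21", "TP53"), ("TP53", "MDM2"), ("miR-21", "PTEN"), ("PTEN", "TP53")]
        nodes = sorted({v for e in edges for v in e})
        adjacency = {v: [] for v in nodes}
        for a, b in edges:                       # undirected toy network
            adjacency[a].append(b)
            adjacency[b].append(a)

        centrality = eigenvector_centrality(adjacency, nodes)
        ranked = sorted(nodes, key=centrality.get, reverse=True)
        print(ranked)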

  19. Deriving preference order of post-mining land-uses through MLSA framework: application of an outranking technique

    NASA Astrophysics Data System (ADS)

    Soltanmohammadi, Hossein; Osanloo, Morteza; Aghajani Bazzazi, Abbas

    2009-08-01

    This study takes advantage of a previously developed framework for mined land suitability analysis (MLSA), consisting of economic, social, technical and mine-site factors, to achieve both a partial and a complete pre-order of feasible post-mining land-uses. An outranking multi-attribute decision-making (MADM) technique, PROMETHEE (preference ranking organization method for enrichment evaluation), was chosen because of its clear advantages in the field of MLSA compared with MADM ranking techniques. Application of the proposed approach to a mined land proceeds through several successive steps. First, the performance of the MLSA attributes is scored locally by each individual decision maker (DM). Then the assigned performance scores are normalized and the deviation amplitudes of the non-dominated alternatives are calculated. Weights of the attributes are calculated in a separate procedure by another MADM technique, the analytical hierarchy process (AHP). Using the Gaussian preference function together with the weights, the preference indexes of the land-use alternatives are obtained. Calculating the outgoing and entering flows of the alternatives and comparing these values pairwise leads to a partial pre-order, while calculating the net flows leads to a ranked preference for each land-use. In the final step, a consensual ranking can be derived using the PROMETHEE group decision support system, which incorporates the judgments of all the DMs. In this paper, the preference order of post-mining land-uses for a hypothetical mined land is derived from the judgments of one DM to demonstrate the applicability of the proposed approach.
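
    To make the flow calculations concrete, a minimal PROMETHEE II sketch with a Gaussian preference function is given below; the land-use alternatives, criterion scores, weights, and the s parameter are hypothetical.

        # Minimal PROMETHEE II sketch with a Gaussian preference function.
        # Alternatives, criterion scores, weights and the s parameter are hypothetical.
        import math

        def gaussian_preference(d, s=1.0):
            return 0.0 if d <= 0 else 1.0 - math.exp(-d * d / (2.0 * s * s))

        def net_flows(scores, weights, s=1.0):
            names = list(scores)
            n = len(names)
            # pairwise weighted preference indexes pi(a, b)
            pi = {(a, b): sum(w * gaussian_preference(scores[a][k] - scores[b][k], s)
                              for k, w in enumerate(weights))
                  for a in names for b in names if a != b}
            phi_plus = {a: sum(pi[(a, b)] for b in names if b != a) / (n - 1) for a in names}
            phi_minus = {a: sum(pi[(b, a)] for b in names if b != a) / (n - 1) for a in names}
            return {a: phi_plus[a] - phi_minus[a] for a in names}

        # Hypothetical post-mining land-use alternatives scored on three normalized criteria.
        scores = {"pasture":    [0.7, 0.4, 0.6],
                  "forestry":   [0.5, 0.8, 0.5],
                  "recreation": [0.6, 0.6, 0.9]}
        weights = [0.5, 0.3, 0.2]    # e.g. from an AHP weighting step

        flows = net_flows(scores, weights, s=0.3)
        for name in sorted(flows, key=flows.get, reverse=True):
            print(name, round(flows[name], 3))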

  20. Determining hospital performance based on rank ordering: is it appropriate?

    PubMed

    Anderson, Judy; Hackman, Mark; Burnich, Jeff; Gurgiolo, Thomas R

    2007-01-01

    An increasing number of "pay for performance" initiatives for hospitals and physicians ascribe performance by ranking hospitals or physicians on quality of care measures. Payment is subsequently based on where a hospital or physician ranks among peers. This study examines the variability of ranking hospitals on quality of care measures and its impact on comparing hospital performance. Variability in the ranks of 3 quality of care measures was examined: discharge instruction for congestive heart failure, use of beta-blockers at discharge for heart attack, and timing of initial antibiotic therapy within 4 hours of admission to the hospital for pneumonia. The data are available on the Centers for Medicare and Medicaid Services Web site as part of the Hospital Quality Alliance project. We found that considerable uncertainty exists in ranking of hospitals on these measures, which calls into question the use of rank ordering as a determinant of performance.

  1. Embedded feature ranking for ensemble MLP classifiers.

    PubMed

    Windeatt, Terry; Duangsoithong, Rakkrit; Smith, Raymond

    2011-06-01

    A feature ranking scheme for multilayer perceptron (MLP) ensembles is proposed, along with a stopping criterion based upon the out-of-bootstrap estimate. To solve multi-class problems feature ranking is combined with modified error-correcting output coding. Experimental results on benchmark data demonstrate the versatility of the MLP base classifier in removing irrelevant features.

  2. Clinical Psychology Ph.D. Program Rankings: Evaluating Eminence on Faculty Publications and Citations

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Malone, Carrie J.; Gonzalez, Melissa L.; McClure, David R.; Laud, Rinita B.; Minshawi, Noha F.

    2005-01-01

    Program rankings and their visibility have taken on greater and greater significance. Rarely is the accuracy of these rankings, which are typically based on a small subset of university faculty impressions, questioned. This paper presents a more comprehensive survey method based on quantifiable measures of faculty publications and citations. The…

  3. GeoSearcher: Location-Based Ranking of Search Engine Results.

    ERIC Educational Resources Information Center

    Watters, Carolyn; Amoudi, Ghada

    2003-01-01

    Discussion of Web queries with geospatial dimensions focuses on an algorithm that assigns location coordinates dynamically to Web sites based on the URL. Describes a prototype search system that uses the algorithm to re-rank search engine results for queries with a geospatial dimension, thus providing an alternative ranking order for search engine…

  4. Generating Performance Models for Irregular Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  5. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.

  6. DrugE-Rank: improving drug–target interaction prediction of new candidate drugs or targets by ensemble learning to rank

    PubMed Central

    Yuan, Qingjun; Gao, Junning; Wu, Dongliang; Zhang, Shihua; Mamitsuka, Hiroshi; Zhu, Shanfeng

    2016-01-01

    Motivation: Identifying drug–target interactions is an important task in drug discovery. To reduce heavy time and financial cost in experimental way, many computational approaches have been proposed. Although these approaches have used many different principles, their performance is far from satisfactory, especially in predicting drug–target interactions of new candidate drugs or targets. Methods: Approaches based on machine learning for this problem can be divided into two types: feature-based and similarity-based methods. Learning to rank is the most powerful technique in the feature-based methods. Similarity-based methods are well accepted, due to their idea of connecting the chemical and genomic spaces, represented by drug and target similarities, respectively. We propose a new method, DrugE-Rank, to improve the prediction performance by nicely combining the advantages of the two different types of methods. That is, DrugE-Rank uses LTR, for which multiple well-known similarity-based methods can be used as components of ensemble learning. Results: The performance of DrugE-Rank is thoroughly examined by three main experiments using data from DrugBank: (i) cross-validation on FDA (US Food and Drug Administration) approved drugs before March 2014; (ii) independent test on FDA approved drugs after March 2014; and (iii) independent test on FDA experimental drugs. Experimental results show that DrugE-Rank outperforms competing methods significantly, especially achieving more than 30% improvement in Area under Prediction Recall curve for FDA approved new drugs and FDA experimental drugs. Availability: http://datamining-iip.fudan.edu.cn/service/DrugE-Rank Contact: zhusf@fudan.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27307615

  7. DrugE-Rank: improving drug-target interaction prediction of new candidate drugs or targets by ensemble learning to rank.

    PubMed

    Yuan, Qingjun; Gao, Junning; Wu, Dongliang; Zhang, Shihua; Mamitsuka, Hiroshi; Zhu, Shanfeng

    2016-06-15

    Identifying drug-target interactions is an important task in drug discovery. To reduce heavy time and financial cost in experimental way, many computational approaches have been proposed. Although these approaches have used many different principles, their performance is far from satisfactory, especially in predicting drug-target interactions of new candidate drugs or targets. Approaches based on machine learning for this problem can be divided into two types: feature-based and similarity-based methods. Learning to rank is the most powerful technique in the feature-based methods. Similarity-based methods are well accepted, due to their idea of connecting the chemical and genomic spaces, represented by drug and target similarities, respectively. We propose a new method, DrugE-Rank, to improve the prediction performance by nicely combining the advantages of the two different types of methods. That is, DrugE-Rank uses LTR, for which multiple well-known similarity-based methods can be used as components of ensemble learning. The performance of DrugE-Rank is thoroughly examined by three main experiments using data from DrugBank: (i) cross-validation on FDA (US Food and Drug Administration) approved drugs before March 2014; (ii) independent test on FDA approved drugs after March 2014; and (iii) independent test on FDA experimental drugs. Experimental results show that DrugE-Rank outperforms competing methods significantly, especially achieving more than 30% improvement in Area under Prediction Recall curve for FDA approved new drugs and FDA experimental drugs. http://datamining-iip.fudan.edu.cn/service/DrugE-Rank zhusf@fudan.edu.cn Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  8. Biological relevance of streamflow metrics: Regional and national perspectives

    USGS Publications Warehouse

    Carlisle, Daren M.; Grantham, Theodore E.; Eng, Kenny; Wolock, David M.

    2017-01-01

    Protecting the health of streams and rivers requires identifying ecologically significant attributes of the natural flow regime. Streamflow regimes are routinely quantified using a plethora of hydrologic metrics (HMs), most of which have unknown relevance to biological communities. At regional and national scales, we evaluated which of 509 commonly used HMs were associated with biological indicators of fish and invertebrate community integrity. We quantified alteration of each HM by using statistical models to predict site-specific natural baseline values for each of 728 sites across the USA where streamflow monitoring data were available concurrent with assessments of invertebrate or fish community integrity. We then ranked HMs according to their individual association with biological integrity based on random forest models that included HMs and other relevant covariates, such as land cover and stream chemistry. HMs were generally the most important predictors of biological integrity relative to the covariates. At a national scale, the most influential HMs were measures of depleted high flows, homogenization of flows, and erratic flows. Unique combinations of biologically relevant HMs were apparent among regions. We discuss the implications of our findings to the challenge of selecting HMs for streamflow research and management.

  9. Bayesian Inference of Natural Rankings in Incomplete Competition Networks

    PubMed Central

    Park, Juyong; Yook, Soon-Hyung

    2014-01-01

    Competition between a complex system's constituents and a corresponding reward mechanism based on it have profound influence on the functioning, stability, and evolution of the system. But determining the dominance hierarchy or ranking among the constituent parts from the strongest to the weakest – essential in determining reward and penalty – is frequently an ambiguous task due to the incomplete (partially filled) nature of competition networks. Here we introduce the “Natural Ranking,” an unambiguous ranking method applicable to a round robin tournament, and formulate an analytical model based on the Bayesian formula for inferring the expected mean and error of the natural ranking of nodes from an incomplete network. We investigate its potential and uses in resolving important issues of ranking by applying it to real-world competition networks. PMID:25163528

  10. Bayesian Inference of Natural Rankings in Incomplete Competition Networks

    NASA Astrophysics Data System (ADS)

    Park, Juyong; Yook, Soon-Hyung

    2014-08-01

    Competition between a complex system's constituents and a corresponding reward mechanism based on it have profound influence on the functioning, stability, and evolution of the system. But determining the dominance hierarchy or ranking among the constituent parts from the strongest to the weakest - essential in determining reward and penalty - is frequently an ambiguous task due to the incomplete (partially filled) nature of competition networks. Here we introduce the "Natural Ranking," an unambiguous ranking method applicable to a round robin tournament, and formulate an analytical model based on the Bayesian formula for inferring the expected mean and error of the natural ranking of nodes from an incomplete network. We investigate its potential and uses in resolving important issues of ranking by applying it to real-world competition networks.

  11. A Recursive Partitioning Method for the Prediction of Preference Rankings Based Upon Kemeny Distances.

    PubMed

    D'Ambrosio, Antonio; Heiser, Willem J

    2016-09-01

    Preference rankings usually depend on the characteristics of both the individuals judging a set of objects and the objects being judged. This topic has been handled in the literature with log-linear representations of the generalized Bradley-Terry model and, recently, with distance-based tree models for rankings. A limitation of these approaches is that they only work with full rankings or with a pre-specified pattern governing the presence of ties, and/or they are based on quite strict distributional assumptions. To overcome these limitations, we propose a new prediction tree method for ranking data that is totally distribution-free. It combines Kemeny's axiomatic approach to define a unique distance between rankings with the CART approach to find a stable prediction tree. Furthermore, our method is not limited by any particular design of the pattern of ties. The method is evaluated in an extensive full-factorial Monte Carlo study with a new simulation design.
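
    As an illustration of the distance underlying this approach, the short Python sketch below computes Kemeny's distance (the number of pairwise disagreements) between two rankings with ties; the half-credit convention for pairs tied in only one ranking is one common choice and is an assumption here, not necessarily the authors'.

        from itertools import combinations

        def kemeny_distance(r1, r2):
            """Count pairwise disagreements between two rankings.

            r1, r2 map each object to its rank position (1 = best); ties allowed.
            A pair ordered oppositely in the two rankings counts 1; a pair tied in
            exactly one of the rankings counts 0.5 (a common convention).
            """
            items = sorted(r1)
            d = 0.0
            for a, b in combinations(items, 2):
                s1 = (r1[a] > r1[b]) - (r1[a] < r1[b])   # -1, 0, or +1
                s2 = (r2[a] > r2[b]) - (r2[a] < r2[b])
                if s1 != s2:
                    d += 0.5 if (s1 == 0 or s2 == 0) else 1.0
            return d

        # Two judges ranking four objects (1 = most preferred); judge 2 ties b and c.
        judge1 = {"a": 1, "b": 2, "c": 3, "d": 4}
        judge2 = {"a": 2, "b": 1, "c": 1, "d": 3}
        print(kemeny_distance(judge1, judge2))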

  12. Social class rank, essentialism, and punitive judgment.

    PubMed

    Kraus, Michael W; Keltner, Dacher

    2013-08-01

    Recent evidence suggests that perceptions of social class rank influence a variety of social cognitive tendencies, from patterns of causal attribution to moral judgment. In the present studies we tested the hypotheses that upper-class rank individuals would be more likely to endorse essentialist lay theories of social class categories (i.e., that social class is founded in genetically based, biological differences) than would lower-class rank individuals and that these beliefs would decrease support for restorative justice--which seeks to rehabilitate offenders, rather than punish unlawful action. Across studies, higher social class rank was associated with increased essentialism of social class categories (Studies 1, 2, and 4) and decreased support for restorative justice (Study 4). Moreover, manipulated essentialist beliefs decreased preferences for restorative justice (Study 3), and the association between social class rank and class-based essentialist theories was explained by the tendency to endorse beliefs in a just world (Study 2). Implications for how class-based essentialist beliefs potentially constrain social opportunity and mobility are discussed.

  13. 14 CFR 1214.1105 - Final ranking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Aeronautics and Space, NATIONAL AERONAUTICS AND SPACE ADMINISTRATION, SPACE FLIGHT, NASA Astronaut Candidate Recruitment and Selection Program, § 1214.1105 Final ranking. Final rankings will be based on a combination of...

  14. Ranking Quality in Higher Education: Guiding or Misleading?

    ERIC Educational Resources Information Center

    Bergseth, Brita; Petocz, Peter; Abrandt Dahlgren, Madeleine

    2014-01-01

    The study examines two different models of measuring, assessing and ranking quality in higher education. Do different systems of quality assessment lead to equivalent conclusions about the quality of education? This comparative study is based on the rankings of 24 Swedish higher education institutions. Two ranking actors have independently…

  15. Entropy and generalized least square methods in assessment of the regional value of streamgages

    USGS Publications Warehouse

    Markus, M.; Vernon, Knapp H.; Tasker, Gary D.

    2003-01-01

    The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One of the important aspects of the study was to assess the regional value of each station through an assessment of the information transfer among gaging records for low, average, and high flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method, developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.

  16. Landscape and flow metrics affecting the distribution of a federally-threatened fish: Improving management, model fit, and model transferability

    USGS Publications Warehouse

    Brewer, Shannon K.; Worthington, Thomas A.; Zhang, Tianjioa; Logue, Daniel R.; Mittelstet, Aaron R.

    2016-01-01

    Truncated distributions of pelagophilic fishes have been observed across the Great Plains of North America, with water use and landscape fragmentation implicated as contributing factors. Developing conservation strategies for these species is hindered by the existence of multiple competing flow regime hypotheses related to species persistence. Our primary study objective was to compare the predicted distributions of one pelagophil, the Arkansas River Shiner Notropis girardi, constructed using different flow regime metrics. Further, we investigated different approaches for improving temporal transferability of the species distribution model (SDM). We compared four hypotheses: mean annual flow (a baseline), the 75th percentile of daily flow, the number of zero-flow days, and the number of days above 55th percentile flows, to examine the relative importance of flows during the spawning period. Building on an earlier SDM, we added covariates that quantified wells in each catchment, point source discharges, and non-native species presence to a structured variable framework. We assessed the effects on model transferability and fit by reducing multicollinearity using Spearman’s rank correlations, variance inflation factors, and principal component analysis, as well as altering the regularization coefficient (β) within MaxEnt. The 75th percentile of daily flow was the most important flow metric related to structuring the species distribution. The number of wells and point source discharges were also highly ranked. At the default level of β, model transferability was improved using all methods to reduce collinearity; however, at higher levels of β, the correlation method performed best. Using β = 5 provided the best model transferability, while retaining the majority of variables that contributed 95% to the model. This study provides a workflow for improving model transferability and also presents water-management options that may be considered to improve the conservation status of pelagophils.
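
    The collinearity screen mentioned above (Spearman's rank correlations among candidate covariates) can be sketched as follows; the variable names, data, and the |rho| >= 0.7 cutoff are illustrative assumptions, not values taken from the study.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)

        # Hypothetical flow/landscape covariates for 200 stream segments.
        n = 200
        q75 = rng.gamma(2.0, 10.0, n)                    # 75th percentile of daily flow
        mean_flow = q75 * 0.8 + rng.normal(0, 2, n)      # strongly related to q75
        zero_flow_days = rng.poisson(5, n)               # number of zero-flow days
        wells = rng.poisson(12, n)                       # wells per catchment
        X = np.column_stack([q75, mean_flow, zero_flow_days, wells])
        names = ["q75", "mean_flow", "zero_flow_days", "wells"]

        rho, _ = spearmanr(X)                            # pairwise rank-correlation matrix

        # Greedy filter: keep a variable unless it correlates strongly with one kept earlier.
        threshold = 0.7
        kept = []
        for j in range(X.shape[1]):
            if all(abs(rho[j, k]) < threshold for k in kept):
                kept.append(j)

        print("retained covariates:", [names[j] for j in kept])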

  17. 14 CFR 1214.1105 - Final ranking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Recruitment and Selection Program § 1214.1105 Final ranking. Final rankings will be based on a combination of the selection board's initial evaluations and the results of the interview process. Veteran's...

  18. Sync-rank: Robust Ranking, Constrained Ranking and Rank Aggregation via Eigenvector and SDP Synchronization

    DTIC Science & Technology

    2015-04-28

    the players. In addition, we compare the algorithms on three real data sets: the outcome of soccer games in the English Premier League, a Microsoft...Premier League soccer games, a Halo 2 game tournament and NCAA College Basketball games), which show that our proposed method compares favorably to...information on the ground truth rank of a subset of players, and propose an algorithm based on SDP which is able to recover the ranking of the remaining

  19. An Analytic Hierarchy Process-based Method to Rank the Critical Success Factors of Implementing a Pharmacy Barcode System.

    PubMed

    Alharthi, Hana; Sultana, Nahid; Al-Amoudi, Amjaad; Basudan, Afrah

    2015-01-01

    Pharmacy barcode scanning is used to reduce errors during the medication dispensing process. However, this technology has rarely been used in hospital pharmacies in Saudi Arabia. This article describes the barriers to successful implementation of a barcode scanning system in Saudi Arabia. A literature review was conducted to identify the relevant critical success factors (CSFs) for a successful dispensing barcode system implementation. Twenty-eight pharmacists from a local hospital in Saudi Arabia were interviewed to obtain their perception of these CSFs. In this study, planning (process flow issues and training requirements), resistance (fear of change, communication issues, and negative perceptions about technology), and technology (software, hardware, and vendor support) were identified as the main barriers. The analytic hierarchy process (AHP), one of the most widely used tools for decision making in the presence of multiple criteria, was used to compare and rank these identified CSFs. The results of this study suggest that resistance barriers have a greater impact than planning and technology barriers. In particular, fear of change is the most critical factor, and training is the least critical factor.
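
    For readers unfamiliar with the AHP weighting step, the sketch below derives priority weights and a consistency ratio from a hypothetical 3 x 3 pairwise comparison matrix for the three barrier groups named above; the judgment values are invented for illustration and are not taken from the study.

        import numpy as np

        # Hypothetical pairwise comparisons (Saaty 1-9 scale) for three barrier groups:
        # resistance vs planning = 3, resistance vs technology = 5, planning vs technology = 2.
        A = np.array([
            [1.0,   3.0,  5.0],   # resistance
            [1/3.,  1.0,  2.0],   # planning
            [1/5.,  1/2., 1.0],   # technology
        ])

        # Priority weights = principal right eigenvector, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency ratio (random index RI = 0.58 for a 3 x 3 matrix).
        lam_max = eigvals[k].real
        ci = (lam_max - len(A)) / (len(A) - 1)
        cr = ci / 0.58

        for name, weight in zip(["resistance", "planning", "technology"], w):
            print(f"{name:10s} {weight:.3f}")
        print(f"consistency ratio = {cr:.3f}")   # values below ~0.10 are usually acceptable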

  20. On Painlevé/gauge theory correspondence

    NASA Astrophysics Data System (ADS)

    Bonelli, Giulio; Lisovyy, Oleg; Maruyoshi, Kazunobu; Sciarappa, Antonio; Tanzini, Alessandro

    2017-12-01

    We elucidate the relation between Painlevé equations and four-dimensional rank one N = 2 theories by identifying the connection associated with Painlevé isomonodromic problems with the oper limit of the flat connection of the Hitchin system associated with gauge theories and by studying the corresponding renormalization group flow. Based on this correspondence, we provide long-distance expansions at various canonical rays for all Painlevé τ-functions in terms of magnetic and dyonic Nekrasov partition functions for N = 2 SQCD and Argyres-Douglas theories at self-dual Omega background ε_1 + ε_2 = 0, or equivalently in terms of c = 1 irregular conformal blocks.

  1. Iterative procedures for space shuttle main engine performance models

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1989-01-01

    Performance models of the Space Shuttle Main Engine (SSME) contain iterative strategies for determining approximate solutions to nonlinear equations reflecting fundamental mass, energy, and pressure balances within engine flow systems. Both univariate and multivariate Newton-Raphson algorithms are employed in the current version of the engine Test Information Program (TIP). Computational efficiency and reliability of these procedures are examined. A modified trust region form of the multivariate Newton-Raphson method is implemented and shown to be superior for off-nominal engine performance predictions. A heuristic form of Broyden's Rank One method is also tested and favorable results based on this algorithm are presented.

  2. Low-rank separated representation surrogates of high-dimensional stochastic functions: Application in Bayesian inference

    NASA Astrophysics Data System (ADS)

    Validi, AbdoulAhad

    2014-03-01

    This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternative least-square regression with Tikhonov regularization using a roughening matrix computing the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation by an order of magnitude equal to the vector size. The performance of the method is studied through its application to three numerical examples including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow.

  3. Relationship between Journal-Ranking Metrics for a Multidisciplinary Set of Journals

    ERIC Educational Resources Information Center

    Perera, Upeksha; Wijewickrema, Manjula

    2018-01-01

    Ranking of scholarly journals is important to many parties. Studying the relationships among various ranking metrics is key to understanding the significance of one metric based on another. This research investigates the relationship among four major journal-ranking indicators: the impact factor (IF), the Eigenfactor score (ES), the "h."…

  4. Ranking Support Vector Machine with Kernel Approximation

    PubMed Central

    Dou, Yong

    2017-01-01

    Learning to rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, computational biology, and so forth. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been favorably used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms. PMID:28293256

  5. Ranking Support Vector Machine with Kernel Approximation.

    PubMed

    Chen, Kai; Li, Rongchun; Dou, Yong; Liang, Zhengfa; Lv, Qi

    2017-01-01

    Learning to rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, computational biology, and so forth. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been favorably used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms.
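
    A minimal sketch of the idea in the two records above, on synthetic data: approximate an RBF kernel with random Fourier features and minimize a pairwise squared hinge (L2) loss. Plain gradient descent is substituted here for the papers' primal truncated Newton solver to keep the example short.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic ranking data: relevance grows nonlinearly with the features.
        n, d = 300, 5
        X = rng.normal(size=(n, d))
        y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

        # Random Fourier features approximating an RBF kernel with bandwidth gamma.
        gamma, D = 0.5, 200
        W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
        b = rng.uniform(0, 2 * np.pi, size=D)
        Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)      # n x D feature map

        # Preference pairs (i, j) with y[i] > y[j].
        idx = rng.integers(0, n, size=(4000, 2))
        idx = idx[y[idx[:, 0]] > y[idx[:, 1]]]
        Zi, Zj = Z[idx[:, 0]], Z[idx[:, 1]]

        # Pairwise squared hinge loss: mean max(0, 1 - w.(zi - zj))^2 + lam/2 ||w||^2.
        w = np.zeros(D)
        lam, lr = 1e-3, 0.1
        diff = Zi - Zj
        for _ in range(200):
            margin = diff @ w
            viol = np.maximum(0.0, 1.0 - margin)
            grad = lam * w - 2.0 * (diff * viol[:, None]).sum(axis=0) / len(diff)
            w -= lr * grad

        scores = Z @ w
        # Fraction of the training preference pairs ordered correctly by the learned scores.
        acc = np.mean(scores[idx[:, 0]] > scores[idx[:, 1]])
        print(f"pairwise accuracy on training pairs: {acc:.3f}")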

  6. Solutions of interval type-2 fuzzy polynomials using a new ranking method

    NASA Astrophysics Data System (ADS)

    Rahman, Nurhakimah Ab.; Abdullah, Lazim; Ghani, Ahmad Termimi Ab.; Ahmad, Noor'Ani

    2015-10-01

    A few years ago, a ranking method was introduced for fuzzy polynomial equations. The ranking method is used to find the actual roots of fuzzy polynomials (if they exist). Fuzzy polynomials are transformed into a system of crisp polynomials using a ranking method based on three parameters, namely Value, Ambiguity and Fuzziness. However, it was found that solutions based on these three parameters are quite inefficient at producing answers. Therefore, in this study a new ranking method has been developed with the aim of overcoming this inherent weakness. The new ranking method, which has four parameters, is then applied to interval type-2 fuzzy polynomials, covering the interval type-2 fuzzy polynomial equation, dual fuzzy polynomial equations, and systems of fuzzy polynomials. The efficiency of the new ranking method is then assessed numerically on triangular fuzzy numbers and trapezoidal fuzzy numbers. Finally, the approximate solutions produced in the numerical examples indicate that the new ranking method successfully produces actual roots for interval type-2 fuzzy polynomials.

  7. Reduction from cost-sensitive ordinal ranking to weighted binary classification.

    PubMed

    Lin, Hsuan-Tien; Li, Ling

    2012-05-01

    We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
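
    A compact sketch of the reduction described above, in the absolute-cost setting where every extended example receives unit weight; logistic regression stands in for "any binary classification algorithm", and the data are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)

        # Synthetic ordinal data with K = 5 ordered labels.
        n, d, K = 500, 4, 5
        X = rng.normal(size=(n, d))
        latent = X @ np.array([1.0, -0.5, 0.8, 0.3]) + 0.3 * rng.normal(size=n)
        y = np.digitize(latent, np.quantile(latent, [0.2, 0.4, 0.6, 0.8])) + 1  # ranks 1..5

        # Step 1: extended examples -- one copy of x per threshold k = 1..K-1,
        # with the threshold encoded as extra features and binary label [y > k].
        def extend(X, y=None, K=5):
            rows, labels = [], []
            for k in range(1, K):
                thresh = np.zeros(K - 1)
                thresh[k - 1] = 1.0
                rows.append(np.hstack([X, np.tile(thresh, (len(X), 1))]))
                if y is not None:
                    labels.append((y > k).astype(int))
            Xe = np.vstack(rows)
            return (Xe, np.concatenate(labels)) if y is not None else Xe

        Xe, ye = extend(X, y, K)

        # Step 2: any binary classifier on the extended examples (unit weights = absolute cost).
        clf = LogisticRegression(max_iter=1000).fit(Xe, ye)

        # Step 3: the ranker counts how many thresholds each example passes.
        def predict_rank(X, K=5):
            votes = np.zeros(len(X), dtype=int)
            for k in range(1, K):
                thresh = np.zeros(K - 1)
                thresh[k - 1] = 1.0
                Xk = np.hstack([X, np.tile(thresh, (len(X), 1))])
                votes += clf.predict(Xk)
            return 1 + votes

        print("mean absolute rank error:", np.mean(np.abs(predict_rank(X, K) - y)))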

  8. Prospective mixture risk assessment and management prioritizations for river catchments with diverse land uses

    PubMed Central

    Brown, Colin D.; de Zwart, Dick; Diamond, Jerome; Dyer, Scott D.; Holmes, Christopher M.; Marshall, Stuart; Burton, G. Allen

    2018-01-01

    Abstract Ecological risk assessment increasingly focuses on risks from chemical mixtures and multiple stressors because ecosystems are commonly exposed to a plethora of contaminants and nonchemical stressors. To simplify the task of assessing potential mixture effects, we explored 3 land use–related chemical emission scenarios. We applied a tiered methodology to judge the implications of the emissions of chemicals from agricultural practices, domestic discharges, and urban runoff in a quantitative model. The results showed land use–dependent mixture exposures, clearly discriminating downstream effects of land uses, with unique chemical “signatures” regarding composition, concentration, and temporal patterns. Associated risks were characterized in relation to the land‐use scenarios. Comparisons to measured environmental concentrations and predicted impacts showed relatively good similarity. The results suggest that the land uses imply exceedances of regulatory protective environmental quality standards, varying over time in relation to rain events and associated flow and dilution variation. Higher‐tier analyses using ecotoxicological effect criteria confirmed that species assemblages may be affected by exposures exceeding no‐effect levels and that mixture exposure could be associated with predicted species loss under certain situations. The model outcomes can inform various types of prioritization to support risk management, including a ranking across land uses as a whole, a ranking on characteristics of exposure times and frequencies, and various rankings of the relative role of individual chemicals. Though all results are based on in silico assessments, the prospective land use–based approach applied in the present study yields useful insights for simplifying and assessing potential ecological risks of chemical mixtures and can therefore be useful for catchment‐management decisions. Environ Toxicol Chem 2018;37:715–728. © 2017 The Authors. Environmental Toxicology Chemistry Published by Wiley Periodicals, Inc. PMID:28845901

  9. Behind the Curtain of the Beauty Pageant: An Investigation of U.S. News Undergraduate Business Program Rankings

    ERIC Educational Resources Information Center

    Perry, Pam

    2010-01-01

    The undergraduate business program rankings in USNWR are based solely on peer assessments from deans and associate deans of AACSB accredited U.S. business schools. Often these reputation-based rankings are discounted and likened to a beauty pageant because the process lacks transparent input data. In this study, ten deans and ten associate…

  10. Efficient l1-norm-based low-rank matrix approximations for large-scale problems using alternating rectified gradient method.

    PubMed

    Kim, Eunwoo; Lee, Minsik; Choi, Chong-Ho; Kwak, Nojun; Oh, Songhwai

    2015-02-01

    Low-rank matrix approximation plays an important role in the area of computer vision and image processing. Most of the conventional low-rank matrix approximation methods are based on the l2-norm (Frobenius norm) with principal component analysis (PCA) being the most popular among them. However, this can give a poor approximation for data contaminated by outliers (including missing data), because the l2-norm exaggerates the negative effect of outliers. Recently, to overcome this problem, various methods based on the l1-norm, such as robust PCA methods, have been proposed for low-rank matrix approximation. Despite the robustness of the methods, they require heavy computational effort and substantial memory for high-dimensional data, which is impractical for real-world problems. In this paper, we propose two efficient low-rank factorization methods based on the l1-norm that find proper projection and coefficient matrices using the alternating rectified gradient method. The proposed methods are applied to a number of low-rank matrix approximation problems to demonstrate their efficiency and robustness. The experimental results show that our proposals are efficient in both execution time and reconstruction performance unlike other state-of-the-art methods.
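
    The objective in question is min over U, V of ||X - UV||_1 with U of size n x r and V of size r x m. The sketch below illustrates its robustness to sparse outliers using a simple alternating iteratively reweighted least-squares scheme, which is a substitute for, not an implementation of, the paper's alternating rectified gradient method.

        import numpy as np

        rng = np.random.default_rng(3)

        # Ground-truth rank-2 matrix plus sparse large outliers.
        n, m, r = 60, 40, 2
        U0, V0 = rng.normal(size=(n, r)), rng.normal(size=(r, m))
        X = U0 @ V0
        outliers = rng.random((n, m)) < 0.05
        X_noisy = X + outliers * rng.normal(scale=10.0, size=(n, m))

        def l1_lowrank(X, r, iters=50, eps=1e-6):
            """Approximate argmin_{U,V} ||X - U V||_1 by alternating weighted least squares.

            IRLS weights 1 / max(|residual|, eps) turn the l1 objective into a sequence
            of weighted l2 problems solved column-wise / row-wise in closed form.
            """
            n, m = X.shape
            U = rng.normal(size=(n, r))
            V = rng.normal(size=(r, m))
            for _ in range(iters):
                W = 1.0 / np.maximum(np.abs(X - U @ V), eps)   # IRLS weights
                for j in range(m):                             # update V column by column
                    Wj = np.diag(W[:, j])
                    V[:, j] = np.linalg.solve(U.T @ Wj @ U + 1e-8 * np.eye(r), U.T @ Wj @ X[:, j])
                W = 1.0 / np.maximum(np.abs(X - U @ V), eps)
                for i in range(n):                             # update U row by row
                    Wi = np.diag(W[i, :])
                    U[i, :] = np.linalg.solve(V @ Wi @ V.T + 1e-8 * np.eye(r), V @ Wi @ X[i, :])
            return U, V

        U, V = l1_lowrank(X_noisy, r)

        # Compare against the l2-optimal (truncated SVD) fit of the corrupted matrix.
        Us, s, Vs = np.linalg.svd(X_noisy, full_matrices=False)
        X_l2 = (Us[:, :r] * s[:r]) @ Vs[:r, :]
        print("l1-IRLS error vs clean X  :", np.abs(X - U @ V).mean())
        print("plain SVD error vs clean X:", np.abs(X - X_l2).mean())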

  11. Social norms and rank-based nudging: Changing willingness to pay for healthy food.

    PubMed

    Aldrovandi, Silvio; Brown, Gordon D A; Wood, Alex M

    2015-09-01

    People's evaluations in the domain of healthy eating are at least partly determined by the choice context. We systematically test reference level and rank-based models of relative comparisons against each other and explore their application to social norms nudging, an intervention that aims at influencing consumers' behavior by addressing their inaccurate beliefs about their consumption relative to the consumption of others. Study 1 finds that the rank of a product or behavior among others in the immediate comparison context, rather than its objective attributes, influences its evaluation. Study 2 finds that when a comparator is presented in isolation the same rank-based process occurs based on information retrieved from memory. Study 3 finds that telling people how their consumption ranks within a normative comparison sample increases willingness to pay for a healthy food by over 30% relative to the normal social norms intervention that tells them how they compare to the average. We conclude that social norms interventions should present rank information (e.g., "you are in the most unhealthy 10% of eaters") rather than information relative to the average (e.g., "you consume 500 calories more than the average person"). (c) 2015 APA, all rights reserved.

  12. Automatically identifying health outcome information in MEDLINE records.

    PubMed

    Demner-Fushman, Dina; Few, Barbara; Hauser, Susan E; Thoma, George

    2006-01-01

    Understanding the effect of a given intervention on the patient's health outcome is one of the key elements in providing optimal patient care. This study presents a methodology for automatic identification of outcomes-related information in medical text and evaluates its potential in satisfying clinical information needs related to health care outcomes. An annotation scheme based on an evidence-based medicine model for critical appraisal of evidence was developed and used to annotate 633 MEDLINE citations. Textual, structural, and meta-information features essential to outcome identification were learned from the created collection and used to develop an automatic system. Accuracy of automatic outcome identification was assessed in an intrinsic evaluation and in an extrinsic evaluation, in which ranking of MEDLINE search results obtained using PubMed Clinical Queries relied on identified outcome statements. The accuracy and positive predictive value of outcome identification were calculated. Effectiveness of the outcome-based ranking was measured using mean average precision and precision at rank 10. Automatic outcome identification achieved 88% to 93% accuracy. The positive predictive value of individual sentences identified as outcomes ranged from 30% to 37%. Outcome-based ranking improved retrieval accuracy, tripling mean average precision and achieving 389% improvement in precision at rank 10. Preliminary results in outcome-based document ranking show potential validity of the evidence-based medicine-model approach in timely delivery of information critical to clinical decision support at the point of service.
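
    For reference, the two retrieval metrics quoted here (mean average precision and precision at rank 10) are straightforward to compute from ranked lists of binary relevance judgments; the relevance vectors below are made up for illustration.

        import numpy as np

        def average_precision(rels):
            """Average precision for one ranked list of 0/1 relevance judgments."""
            rels = np.asarray(rels)
            hits = np.flatnonzero(rels)
            if hits.size == 0:
                return 0.0
            precisions = [(i + 1) / (pos + 1) for i, pos in enumerate(hits)]
            return float(np.mean(precisions))

        def precision_at_k(rels, k=10):
            return float(np.mean(np.asarray(rels)[:k]))

        # Hypothetical relevance of the top results for two queries (1 = relevant).
        ranked_runs = [
            [1, 0, 1, 1, 0, 0, 1, 0, 0, 0],
            [0, 1, 0, 0, 1, 1, 0, 0, 0, 1],
        ]
        map_score = np.mean([average_precision(r) for r in ranked_runs])
        p10 = np.mean([precision_at_k(r, 10) for r in ranked_runs])
        print(f"MAP = {map_score:.3f}, P@10 = {p10:.3f}")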

  13. A network-based dynamical ranking system for competitive sports

    NASA Astrophysics Data System (ADS)

    Motegi, Shun; Masuda, Naoki

    2012-12-01

    From the viewpoint of networks, a ranking system for players or teams in sports is equivalent to a centrality measure for sports networks, whereby a directed link represents the result of a single game. Previously proposed network-based ranking systems are derived from static networks, i.e., aggregation of the results of games over time. However, the score of a player (or team) fluctuates over time. Defeating a renowned player at peak performance is intuitively more rewarding than defeating the same player in other periods. To account for this factor, we propose a dynamic variant of such a network-based ranking system and apply it to professional men's tennis data. We derive a set of linear online update equations for the score of each player. The proposed ranking system predicts the outcome of future games with a higher accuracy than the static counterparts.
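
    The flavor of such a dynamic score can be conveyed with a deliberately simplified online update (a generic exponentially discounted, opponent-weighted win score, not the specific linear update equations derived in the paper): each win adds one point plus a fraction of the loser's current score, and all scores decay between games.

        from collections import defaultdict
        import math

        def dynamic_scores(games, decay=0.01, alpha=0.5):
            """Toy dynamic win score on a stream of (time, winner, loser) results.

            Scores decay exponentially with elapsed time; beating a currently
            high-scoring opponent is worth more than beating a low-scoring one.
            """
            score = defaultdict(float)
            last_time = None
            for t, winner, loser in sorted(games):
                if last_time is not None:
                    factor = math.exp(-decay * (t - last_time))
                    for p in score:
                        score[p] *= factor
                score[winner] += 1.0 + alpha * score[loser]
                last_time = t
            return dict(score)

        # Hypothetical match results: (day, winner, loser).
        games = [(1, "A", "B"), (2, "B", "C"), (3, "A", "C"), (10, "C", "A"), (11, "B", "A")]
        for player, s in sorted(dynamic_scores(games).items(), key=lambda kv: -kv[1]):
            print(player, round(s, 3))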

  14. Weighted Discriminative Dictionary Learning based on Low-rank Representation

    NASA Astrophysics Data System (ADS)

    Chang, Heyou; Zheng, Hao

    2017-01-01

    Low-rank representation has been widely used in the field of pattern classification, especially when both training and testing images are corrupted with large noise. Dictionary plays an important role in low-rank representation. With respect to the semantic dictionary, the optimal representation matrix should be block-diagonal. However, traditional low-rank representation based dictionary learning methods cannot effectively exploit the discriminative information between data and dictionary. To address this problem, this paper proposed weighted discriminative dictionary learning based on low-rank representation, where a weighted representation regularization term is constructed. The regularization associates label information of both training samples and dictionary atoms, and encourages to generate a discriminative representation with class-wise block-diagonal structure, which can further improve the classification performance where both training and testing images are corrupted with large noise. Experimental results demonstrate advantages of the proposed method over the state-of-the-art methods.

  15. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.

    PubMed

    Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric

    2010-07-20

    Accurate protein loop structure models are important to understand functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length using a functional space composed of several carefully-selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD < 0.5 Å from the native) as top-ranked, and selecting at least one near-native model in the top-5-ranked models, respectively. Similar effectiveness of the POC method is also found in the decoy sets from membrane protein loops. Furthermore, the POC method outperforms the other popularly-used consensus strategies in model ranking, such as rank-by-number, rank-by-rank, rank-by-vote, and regression-based methods. By integrating multiple knowledge- and physics-based scoring functions based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.
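
    Step 1 of the procedure, finding the decoys on the Pareto-optimal front of several scoring functions, can be sketched as follows (lower score taken as better by convention here; the scores are random stand-ins for Rosetta, DOPE, and the other functions, and the simple domination count used to order the front is a stand-in for the fuzzy dominance ranking of step 2).

        import numpy as np

        rng = np.random.default_rng(4)

        # 500 decoys x 5 scoring functions (lower is better); random stand-ins for
        # Rosetta, DOPE, DFIRE, OPLS-AA and a backbone-dihedral potential.
        scores = rng.normal(size=(500, 5))

        def pareto_front(scores):
            """Indices of models not dominated by any other model.

            Model j dominates model i if it is no worse on every scoring function
            and strictly better on at least one.
            """
            n = len(scores)
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                dominated = np.all(scores <= scores[i], axis=1) & np.any(scores < scores[i], axis=1)
                if dominated.any():
                    keep[i] = False
            return np.flatnonzero(keep)

        front = pareto_front(scores)
        print(f"{len(front)} of {len(scores)} decoys are Pareto-optimal")

        # A simple (non-fuzzy) stand-in for step 2: order front members by how many
        # other decoys each one dominates.
        dom_counts = [
            int(np.sum(np.all(scores[i] <= scores, axis=1) & np.any(scores[i] < scores, axis=1)))
            for i in front
        ]
        ranked_front = front[np.argsort(dom_counts)[::-1]]
        print("top-5 decoys:", ranked_front[:5])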

  16. Classifying short genomic fragments from novel lineages using composition and homology

    PubMed Central

    2011-01-01

    Background The assignment of taxonomic attributions to DNA fragments recovered directly from the environment is a vital step in metagenomic data analysis. Assignments can be made using rank-specific classifiers, which assign reads to taxonomic labels from a predetermined level such as named species or strain, or rank-flexible classifiers, which choose an appropriate taxonomic rank for each sequence in a data set. The choice of rank typically depends on the optimal model for a given sequence and on the breadth of taxonomic groups seen in a set of close-to-optimal models. Homology-based (e.g., LCA) and composition-based (e.g., PhyloPythia, TACOA) rank-flexible classifiers have been proposed, but there is at present no hybrid approach that utilizes both homology and composition. Results We first develop a hybrid, rank-specific classifier based on BLAST and Naïve Bayes (NB) that has comparable accuracy and a faster running time than the current best approach, PhymmBL. By substituting LCA for BLAST or allowing the inclusion of suboptimal NB models, we obtain a rank-flexible classifier. This hybrid classifier outperforms established rank-flexible approaches on simulated metagenomic fragments of length 200 bp to 1000 bp and is able to assign taxonomic attributions to a subset of sequences with few misclassifications. We then demonstrate the performance of different classifiers on an enhanced biological phosphorous removal metagenome, illustrating the advantages of rank-flexible classifiers when representative genomes are absent from the set of reference genomes. Application to a glacier ice metagenome demonstrates that similar taxonomic profiles are obtained across a set of classifiers which are increasingly conservative in their classification. Conclusions Our NB-based classification scheme is faster than the current best composition-based algorithm, Phymm, while providing equally accurate predictions. The rank-flexible variant of NB, which we term ε-NB, is complementary to LCA and can be combined with it to yield conservative prediction sets of very high confidence. The simple parameterization of LCA and ε-NB allows for tuning of the balance between more predictions and increased precision, allowing the user to account for the sensitivity of downstream analyses to misclassified or unclassified sequences. PMID:21827705

  17. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method

    PubMed Central

    2010-01-01

    Background Accurate protein loop structure models are important to understand functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. Results We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length using a functional space composed of several carefully-selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of ~20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD < 0.5 Å from the native) as top-ranked, and selecting at least one near-native model in the top-5-ranked models, respectively. Similar effectiveness of the POC method is also found in the decoy sets from membrane protein loops. Furthermore, the POC method outperforms the other popularly-used consensus strategies in model ranking, such as rank-by-number, rank-by-rank, rank-by-vote, and regression-based methods. Conclusions By integrating multiple knowledge- and physics-based scoring functions based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set. PMID:20642859

  18. Does the Introduction of the Ranking Task in Valuation Studies Improve Data Quality and Reduce Inconsistencies? The Case of the EQ-5D-5L.

    PubMed

    Ramos-Goñi, Juan M; Rand-Hendriksen, Kim; Pinto-Prades, Jose Luis

    2016-06-01

    Time trade-off (TTO)-based valuation studies for the three-level version of the EuroQol five-dimensional questionnaire (EQ-5D) typically started off with a ranking task (ordering the health states by preference). This was not included in the protocol for the five-level EQ-5D (EQ-5D-5L) valuation study. To test whether reintroducing a ranking task before the composite TTO (C-TTO) could help to reduce inconsistencies in C-TTO responses and improve the data quality. Respondents were randomly assigned to three study arms. The control arm was the present EQ-5D-5L study protocol, without ranking. The second arm (ranking without sorting) preceded the present protocol by asking respondents to rank the target health states using physical cards. The states were then valued in random order using C-TTO. In the third arm (ranking and sorting), the ranked states remained visible through the C-TTO tasks and the order of valuation was determined by the ranking. The study used only 10 EQ-5D-5L health states. We compared the C-TTO-based inconsistent pairs of health states and ties. The final sample size was 196 in the control arm, 205 in the ranking without sorting arm, and 199 in the ranking and sorting arm. The percentages of ties by respondents were 15.1%, 12.5%, and 12.6% for the control arm, the ranking without sorting arm, and the ranking and sorting arm, respectively. The extra cost for adding the ranking task was about 15%. The benefit does not justify the effort involved in the ranking task. For this reason, the addition of the ranking task to the present EQ-5D-5L valuation protocol is not an attractive option. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Geologic controls on thermal maturity patterns in Pennsylvanian coal-bearing rocks in the Appalachian basin

    USGS Publications Warehouse

    Ruppert, L.F.; Hower, J.C.; Ryder, R.T.; Levine, J.R.; Trippi, M.H.; Grady, W.C.

    2010-01-01

    Thermal maturation patterns of Pennsylvanian strata in the Appalachian basin were determined by compiling and contouring published and unpublished vitrinite reflectance (VR) measurements. VR isograd values range from 0.6% in eastern Ohio and eastern Kentucky (western side of the East Kentucky coal field) to greater than 5.5% in eastern Pennsylvania (Southern Anthracite field, Schuylkill County), corresponding to ASTM coal rank classes of high volatile C bituminous to meta-anthracite. VR isograds show that thermal maturity of Pennsylvanian coals generally increases from west to east across the basin. The isograd patterns, which are indicative of maximum temperatures during burial, can be explained by variations in paleodepth of burial, paleogeothermal gradient, or a combination of both. However, there are at least four areas of unusually high-rank coal in the Appalachian basin that depart from the regional trends and are difficult to explain by depth of burial alone: 1) a west-northwestward salient centered in southwestern Pennsylvania; 2) an elliptically-shaped, northeast-trending area centered in southern West Virginia and western Virginia; 3) the eastern part of Black Warrior coal field, Alabama; and 4) the Pennsylvania Anthracite region, in eastern Pennsylvania. High-rank excursions in southwest Pennsylvania, the Black Warrior coal field, and the Pennsylvania Anthracite region are interpreted here to represent areas of higher paleo-heat flow related to syntectonic movement of hot fluids towards the foreland, associated with Alleghanian deformation. In addition to higher heat flow from fluids, the Pennsylvania Anthracite region also experienced greater depth of burial. The high-rank excursion in southwest Virginia was probably primarily controlled by overburden thickness, but may also have been influenced by higher geothermal gradients.

  20. Toward an Economic Mobility Ranking of U.S. Colleges. Evidence Speaks Reports, Vol 1, #6

    ERIC Educational Resources Information Center

    Chingos, Matthew M.; Blagg, Kristin

    2015-01-01

    The release of institution-level earnings information as part of the Obama Administration's new College Scorecard data has already spawned new "value-added" rankings based on the economic outcomes of students who attended similar institutions. These emerging rankings are an improvement on simple unadjusted rankings, but the wide variance…

  1. Yager’s ranking method for solving the trapezoidal fuzzy number linear programming

    NASA Astrophysics Data System (ADS)

    Karyati; Wutsqa, D. U.; Insani, N.

    2018-03-01

    In previous research, the authors studied the fuzzy simplex method for trapezoidal fuzzy number linear programming based on Maleki's ranking function. We established results on the optimality conditions of the fuzzy simplex method, the fuzzy Big-M method, the fuzzy two-phase method, and the sensitivity analysis. In this research, we study the fuzzy simplex method based on another ranking function, Yager's ranking function, and again investigate the optimality conditions. Based on the results, Yager's ranking function does not behave like Maleki's ranking function: using Yager's function, the simplex method cannot work as well as it does with Maleki's function. With Yager's function, the subtraction of two equal fuzzy numbers is not equal to zero. This condition means that the optimal fuzzy simplex table cannot be detected. As a result, the fuzzy simplex procedure stalls and does not reach the optimum solution.
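
    For context, one ranking index commonly attributed to Yager averages the midpoints of the alpha-cuts, Y(A) = integral over alpha in [0, 1] of (1/2)(a_alpha^L + a_alpha^U), which for a trapezoidal fuzzy number (a, b, c, d) reduces to (a + b + c + d)/4. The tiny sketch below assumes that index; the paper may use a different variant.

        def yager_index_trapezoid(a, b, c, d):
            """Average alpha-cut midpoint of a trapezoidal fuzzy number (a, b, c, d).

            Integrating (lower(alpha) + upper(alpha)) / 2 over alpha in [0, 1]
            gives (a + b + c + d) / 4.
            """
            return (a + b + c + d) / 4.0

        # Two trapezoidal fuzzy coefficients; the one with the larger index ranks higher.
        A = (1.0, 2.0, 3.0, 4.0)
        B = (0.5, 2.5, 3.5, 3.8)
        print(yager_index_trapezoid(*A), yager_index_trapezoid(*B))
        print("A >= B" if yager_index_trapezoid(*A) >= yager_index_trapezoid(*B) else "B > A")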

  2. Rank-based testing of equal survivorship based on cross-sectional survival data with or without prospective follow-up.

    PubMed

    Chan, Kwun Chuen Gary; Qin, Jing

    2015-10-01

    Existing linear rank statistics cannot be applied to cross-sectional survival data without follow-up since all subjects are essentially censored. However, partial survival information is available from backward recurrence times and is frequently collected from health surveys without prospective follow-up. Under length-biased sampling, a class of linear rank statistics is proposed based only on backward recurrence times without any prospective follow-up. When follow-up data are available, the proposed rank statistic and a conventional rank statistic that utilizes follow-up information from the same sample are shown to be asymptotically independent. We discuss four ways to combine these two statistics when follow-up is present. Simulations show that all combined statistics have substantially improved power compared with conventional rank statistics, and a Mantel-Haenszel test performed the best among the proposed statistics. The method is applied to a cross-sectional health survey without follow-up and a study of Alzheimer's disease with prospective follow-up. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Comparison of Mixing Calculations for Reacting and Non-Reacting Flows in a Cylindrical Duct

    NASA Technical Reports Server (NTRS)

    Oechsle, V. L.; Mongia, H. C.; Holdeman, J. D.

    1994-01-01

    A production 3-D elliptic flow code has been used to calculate non-reacting and reacting flow fields in an experimental mixing section relevant to a rich burn/quick mix/lean burn (RQL) combustion system. A number of test cases have been run to assess the effects of the variation in the number of orifices, mass flow ratio, and rich-zone equivalence ratio on the flow field and mixing rates. The calculated normalized temperature profiles for the non-reacting flow field agree qualitatively well with the normalized conserved variable isopleths for the reacting flow field indicating that non-reacting mixing experiments are appropriate for screening and ranking potential rapid mixing concepts. For a given set of jet momentum-flux ratio, mass flow ratio, and density ratio (J, MR, and DR), the reacting flow calculations show a reduced level of mixing compared to the non-reacting cases. In addition, the rich-zone equivalence ratio has noticeable effect on the mixing flow characteristics for reacting flows.

  4. Evaluating user reputation in online rating systems via an iterative group-based ranking method

    NASA Astrophysics Data System (ADS)

    Gao, Jian; Zhou, Tao

    2017-05-01

    Reputation is a valuable asset in online social lives and it has drawn increased attention. Due to the existence of noisy ratings and spamming attacks, how to evaluate user reputation in online rating systems is especially significant. However, most of the previous ranking-based methods either follow a debatable assumption or have unsatisfied robustness. In this paper, we propose an iterative group-based ranking method by introducing an iterative reputation-allocation process into the original group-based ranking method. More specifically, the reputation of users is calculated based on the weighted sizes of the user rating groups after grouping all users by their rating similarities, and the high reputation users' ratings have larger weights in dominating the corresponding user rating groups. The reputation of users and the user rating group sizes are iteratively updated until they become stable. Results on two real data sets with artificial spammers suggest that the proposed method has better performance than the state-of-the-art methods and its robustness is considerably improved comparing with the original group-based ranking method. Our work highlights the positive role of considering users' grouping behaviors towards a better online user reputation evaluation.
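
    A stripped-down sketch of the iterative idea (a simplification, not the authors' exact formulation): for each item, users giving the same rating form a group; a user's reputation is the average reputation-weighted share of the groups the user falls into; and the two quantities are updated until the reputations stabilize.

        import numpy as np

        rng = np.random.default_rng(6)

        # Ratings matrix: 100 users x 30 items, integer ratings 1-5, ~40% observed.
        n_users, n_items = 100, 30
        true_quality = rng.integers(1, 6, n_items)
        ratings = np.tile(true_quality, (n_users, 1)).astype(float)
        ratings += rng.integers(-1, 2, size=ratings.shape)        # honest users: small noise
        ratings[:10] = rng.integers(1, 6, size=(10, n_items))     # first 10 users rate at random (spammers)
        ratings = np.clip(ratings, 1, 5)
        mask = rng.random((n_users, n_items)) < 0.4               # observed entries

        def iterative_group_reputation(ratings, mask, iters=30):
            rep = np.ones(ratings.shape[0])
            for _ in range(iters):
                new_rep = np.zeros_like(rep)
                counts = np.zeros_like(rep)
                for j in range(ratings.shape[1]):
                    users = np.flatnonzero(mask[:, j])
                    if users.size == 0:
                        continue
                    total = rep[users].sum()
                    for value in np.unique(ratings[users, j]):
                        group = users[ratings[users, j] == value]
                        share = rep[group].sum() / total       # reputation-weighted group size
                        new_rep[group] += share
                        counts[group] += 1
                rep = np.where(counts > 0, new_rep / np.maximum(counts, 1), rep)
            return rep

        rep = iterative_group_reputation(ratings, mask)
        print("mean reputation, spammers :", rep[:10].mean().round(3))
        print("mean reputation, honest   :", rep[10:].mean().round(3))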

  5. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations were normality of the dependent variable is required.
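
    The recommended ordering is easy to demonstrate numerically: apply the rank-based INT first, then regress out covariates. The sketch below uses the offset (rank - 0.5)/n inside the probit (offsets vary across software) and reports the residual correlation with the covariate under both orderings on simulated skewed data.

        import numpy as np
        from scipy.stats import norm, rankdata, pearsonr

        rng = np.random.default_rng(5)

        def rank_int(x):
            """Rank-based inverse normal transform, Phi^{-1}((rank - 0.5) / n)."""
            r = rankdata(x)
            return norm.ppf((r - 0.5) / len(x))

        # Skewed phenotype that depends on a covariate (e.g., age).
        n = 2000
        age = rng.uniform(20, 70, n)
        pheno = np.exp(0.02 * age + rng.normal(0, 0.5, n))   # right-skewed

        def residualize(y, covariate):
            X = np.column_stack([np.ones_like(covariate), covariate])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return y - X @ beta

        # Ordering A (problematic): regress out the covariate, then INT the residuals.
        resid_then_int = rank_int(residualize(pheno, age))
        # Ordering B (recommended): INT the phenotype, then regress out the covariate.
        int_then_resid = residualize(rank_int(pheno), age)

        print("corr(age, INT(residuals))      = %.3f" % pearsonr(age, resid_then_int)[0])
        print("corr(age, residuals of INT(y)) = %.3f" % pearsonr(age, int_then_resid)[0])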

  6. Tour Recommendation Guide- Personalized travel sequence recommendation

    NASA Astrophysics Data System (ADS)

    Sivakumar, Akshitha; Prabadevi, B.

    2017-11-01

    This paper presents a personalized travel sequence for a given area that the individual wants to visit. It not only helps to personalize the travel but also recommends a travel sequence based on the area mentioned. First, frequently visited routes are ranked; then the top-ranked routes are chosen based on previous travel records. The data are collected using data mining, and the famous routes are ranked based on the user and the route. This helps in bridging the gap between user travel preferences and routes.

  7. A Method for Assessing Material Flammability for Micro-Gravity Environments

    NASA Technical Reports Server (NTRS)

    Steinhaus, T.; Olenick, S. M.; Sifuentes, A.; Long, R. T.; Torero, J. L.

    1999-01-01

    On a spacecraft, one of the greatest fears during a mission is the outbreak of a fire. Since spacecraft are enclosed spaces and depend highly on technical electronics, a small fire could cause a large amount of damage. NASA uses upward flame spread as a "worst case scenario" evaluation for materials and the Heat and Visible Smoke Release Rates Test to assess the damage potential of a fire. Details of these tests and the protocols followed are provided by the "Flammability, Odor, Offgassing, and Compatibility Requirements and Test Procedures for Materials in Environments that Support Combustion" document. As pointed out by Ohlemiller and Villa, the upward flame spread test does not address the effect of external radiation on ignition and spread. External radiation, such as that coming from an overheated electrical component, is a plausible fire scenario in a space facility and could result in a reversal of the flammability rankings derived from the upward flame spread test. The "Upward Flame Propagation Test" has been the subject of strong criticism in the last few years. In many cases, theoretical exercises and experimental results have demonstrated the possibility of a reversal in the material flammability rankings from normal to micro-gravity. Furthermore, the need to incorporate information on the effects of external radiation and opposed flame spread when ranking materials based on their potential to burn in micro-gravity has been emphasized. Experiments conducted in a 2.2 second drop tower with an ethane burner in an air cross flow have emphasized that burning at the trailing edge is deterred in micro-gravity due to the decreased oxygen transport. For very low air flow velocities (U < 0.005 m/s) the flame envelops the burner, and a slight increase in velocity results in extinction of the trailing edge (U > 0.01 m/s). Only for U > 0.1 m/s is extinction observed at the leading edge (blow-off). Three-dimensional numerical calculations performed for thin cellulose centrally ignited with an axisymmetric source have shown that under the presence of a forced flow slower than 0.035 m/s, flames spread only in opposition to the flow. Extinction is observed at the trailing edge with no concurrent propagation. Experiments conducted by the same authors at the JAMIC 10 second drop tower verified these calculations. Reducing the oxygen supply to the flame also results in a decrease of the Damköhler number, which might lead to extinction. Greyson et al. and Ferkul conducted experiments in micro-gravity (5 second drop tower) with thin paper and observed that at very low flow velocities concurrent flame spread will stop propagating and the flame will reduce in size and extinguish. They noted that quenching differs significantly from blow-off in that the upstream leading edge will remain anchored to the burn-out edge.

  8. Composite multi-parameter ranking of real and virtual compounds for design of MC4R agonists: renaissance of the Free-Wilson methodology.

    PubMed

    Nilsson, Ingemar; Polla, Magnus O

    2012-10-01

    Drug design is a multi-parameter task present in the analysis of experimental data for synthesized compounds and in the prediction of new compounds with desired properties. This article describes the implementation of a binned scoring and composite ranking scheme for 11 experimental parameters that were identified as key drivers in the MC4R project. The composite ranking scheme was implemented in an AstraZeneca tool for analysis of project data, thereby providing an immediate re-ranking as new experimental data was added. The automated ranking also highlighted compounds overlooked by the project team. The successful implementation of a composite ranking on experimental data led to the development of an equivalent virtual score, which was based on Free-Wilson models of the parameters from the experimental ranking. The individual Free-Wilson models showed good to high predictive power with a correlation coefficient between 0.45 and 0.97 based on the external test set. The virtual ranking adds value to the selection of compounds for synthesis but error propagation must be controlled. The experimental ranking approach adds significant value, is parameter independent and can be tuned and applied to any drug discovery project.
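
    As a concrete illustration of the Free-Wilson methodology invoked above, the sketch below fits an additive substituent-contribution model by least squares and scores a virtual compound. It is only a minimal sketch: the R-group labels, activities, and the virtual combination are hypothetical, and the article's 11-parameter composite ranking is not reproduced.

      import numpy as np

      # Hypothetical data: each compound is described by the substituent at two R-group
      # positions plus a measured activity (e.g. pIC50). All values are invented.
      compounds = [
          ({"R1": "Me", "R2": "Cl"}, 6.1),
          ({"R1": "Me", "R2": "OMe"}, 5.4),
          ({"R1": "Et", "R2": "Cl"}, 6.7),
          ({"R1": "Et", "R2": "OMe"}, 6.0),
          ({"R1": "iPr", "R2": "Cl"}, 7.2),
      ]

      # One indicator column per (position, substituent) pair, plus an intercept column.
      features = sorted({(pos, sub) for subs, _ in compounds for pos, sub in subs.items()})
      X = np.array([[1.0] + [1.0 if subs.get(p) == s else 0.0 for p, s in features]
                    for subs, _ in compounds])
      y = np.array([act for _, act in compounds])

      # Free-Wilson fit: additive substituent contributions (minimum-norm least squares).
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      # Predict an unsynthesized (virtual) combination, e.g. R1 = iPr, R2 = OMe.
      virtual = {"R1": "iPr", "R2": "OMe"}
      x_new = np.array([1.0] + [1.0 if virtual.get(p) == s else 0.0 for p, s in features])
      print("predicted activity of virtual compound:", round(float(x_new @ coef), 2))

    In the article's setting one such model would be fitted per experimental parameter, and the virtual ranking would then be assembled from the predicted values.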

  9. Impacts of phylogenetic nomenclature on the efficacy of the U.S. Endangered Species Act.

    PubMed

    Leslie, Matthew S

    2015-02-01

    Cataloging biodiversity is critical to conservation efforts because accurate taxonomy is often a precondition for protection under laws designed for species conservation, such as the U.S. Endangered Species Act (ESA). Traditional nomenclatural codes governing the taxonomic process have recently come under scrutiny because taxon names are more closely linked to hierarchical ranks than to the taxa themselves. A new approach to naming biological groups, called phylogenetic nomenclature (PN), explicitly names taxa by defining their names in terms of ancestry and descent. PN has the potential to increase nomenclatural stability and decrease confusion induced by the rank-based codes. But proponents of PN have struggled with whether species and infraspecific taxa should be governed by the same rules as other taxa or should have special rules. Some proponents advocate the wholesale abandonment of rank labels (including species); this could have consequences for the implementation of taxon-based conservation legislation. I examined the principles of PN as embodied in the PhyloCode (an alternative to traditional rank-based nomenclature that names biological groups based on the results of phylogenetic analyses and does not associate taxa with ranks) and assessed how this novel approach to naming taxa might affect the implementation of species-based legislation by providing a case study of the ESA. The latest version of the PhyloCode relies on the traditional rank-based codes to name species and infraspecific taxa; thus, little will change regarding the main targets of the ESA because they will retain rank labels. For this reason, and because knowledge of evolutionary relationships is of greater importance than nomenclatural procedures for initial protection of endangered taxa under the ESA, I conclude that PN under the PhyloCode will have little impact on implementation of the ESA. © 2014 Society for Conservation Biology.

  10. What Does Professional Rank Mean to Teachers? A Survey of the Multiple Impacts of Professional Rank on Urban and Rural Compulsory Education Teachers

    ERIC Educational Resources Information Center

    Yuyou, Qin; Wenjing, Zeng

    2018-01-01

    Professional rank is an important indicator of the professional capacity of compulsory education teachers. A rational professional rank evaluation system plays an important role in mobilizing the enthusiasm of teachers, improving the overall quality of teachers, and promoting the development of education. Based on stratified random sample data…

  11. Effect of Er,Cr:YSGG laser on human dentin fluid flow.

    PubMed

    Al-Omari, Wael M; Palamara, Joseph E

    2013-11-01

    The aim of the current investigation was to assess the rate and magnitude of dentin fluid flow of dentinal surfaces irradiated with Er,Cr:YSGG laser. Twenty extracted third molars were sectioned, mounted, and irradiated with Er,Cr:YSGG laser at 3.5 and 4.5 W power settings. Specimens were connected to an automated fluid flow measurement apparatus (Flodec). The rate, magnitude, and direction of dentin fluid flow were recorded at baseline and after irradiation. Nonparametric Wilcoxon signed ranks repeated measure t test revealed a statistically significant reduction in fluid flow for all the power settings. The 4.5-W power output reduced the flow significantly more than the 3.5 W. The samples showed a baseline outward flow followed by inward flow due to irradiation then followed by decreased outward flow. It was concluded that Er,Cr:YSGG laser irradiation at 3.5 and 4.5 W significantly reduced dentinal fluid flow rate. The reduction was directly proportional to power output.

  12. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    PubMed

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
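
    The constraint-based backbone referred to here, Flux Balance Analysis, reduces to a linear program: maximize a flux objective subject to steady-state mass balance and flux bounds. The toy three-reaction network below is invented purely for illustration; TFA would add thermodynamic (Gibbs-energy) constraints on top of the same formulation.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: metabolites A, B; reactions v1 (uptake -> A), v2 (A -> B), v3 (B -> out).
      S = np.array([[1, -1,  0],    # mass balance for A
                    [0,  1, -1]])   # mass balance for B
      bounds = [(0, 10), (0, 8), (0, None)]       # (lower, upper) flux bounds per reaction

      # FBA: maximize v3 subject to S v = 0 and the bounds (linprog minimizes, so negate).
      c = np.array([0.0, 0.0, -1.0])
      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print("optimal fluxes:", res.x)             # v3 is limited by the tighter bound v2 <= 8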

  13. Multi-energy CT based on a prior rank, intensity and sparsity model (PRISM).

    PubMed

    Gao, Hao; Yu, Hengyong; Osher, Stanley; Wang, Ge

    2011-11-01

    We propose a compressive sensing approach for multi-energy computed tomography (CT), namely the prior rank, intensity and sparsity model (PRISM). To further compress the multi-energy image for allowing the reconstruction with fewer CT data and less radiation dose, the PRISM models a multi-energy image as the superposition of a low-rank matrix and a sparse matrix (with row dimension in space and column dimension in energy), where the low-rank matrix corresponds to the stationary background over energy that has a low matrix rank, and the sparse matrix represents the rest of distinct spectral features that are often sparse. Distinct from previous methods, the PRISM utilizes the generalized rank, e.g., the matrix rank of tight-frame transform of a multi-energy image, which offers a way to characterize the multi-level and multi-filtered image coherence across the energy spectrum. Besides, the energy-dependent intensity information can be incorporated into the PRISM in terms of the spectral curves for base materials, with which the restoration of the multi-energy image becomes the reconstruction of the energy-independent material composition matrix. In other words, the PRISM utilizes prior knowledge on the generalized rank and sparsity of a multi-energy image, and intensity/spectral characteristics of base materials. Furthermore, we develop an accurate and fast split Bregman method for the PRISM and demonstrate the superior performance of the PRISM relative to several competing methods in simulations.
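
    The low-rank-plus-sparse split at the heart of the PRISM can be illustrated with a generic alternating proximal scheme (singular-value thresholding for the low-rank part, soft thresholding for the sparse part). This is a sketch of the decomposition idea on synthetic data, not the paper's split Bregman solver or its tight-frame generalized rank.

      import numpy as np

      def soft(x, t):
          """Elementwise soft thresholding."""
          return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

      def svt(x, t):
          """Singular-value thresholding: proximal operator of the nuclear norm."""
          u, s, vt = np.linalg.svd(x, full_matrices=False)
          return u @ np.diag(soft(s, t)) @ vt

      def low_rank_plus_sparse(m, lam_lr=1.0, lam_sp=0.1, n_iter=200):
          """Alternating minimization of ||M - L - S||_F^2 / 2 + lam_lr*||L||_* + lam_sp*||S||_1."""
          low, sparse = np.zeros_like(m), np.zeros_like(m)
          for _ in range(n_iter):
              low = svt(m - sparse, lam_lr)      # stationary background across energy bins
              sparse = soft(m - low, lam_sp)     # distinct spectral features
          return low, sparse

      # Synthetic "multi-energy image": rows = pixels, columns = energy bins.
      rng = np.random.default_rng(0)
      background = np.outer(rng.random(200), rng.random(8))               # low-rank over energy
      spikes = np.zeros((200, 8))
      spikes[rng.integers(0, 200, 30), rng.integers(0, 8, 30)] = 2.0      # sparse features
      low, sparse = low_rank_plus_sparse(background + spikes)
      print("rank of recovered background:", np.linalg.matrix_rank(low, tol=1e-3))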

  14. Financing Nonappropriated Fund (NAF) Major Construction.

    DTIC Science & Technology

    1985-03-09

    …the compound interest equation. Exhibit 4-1 shows an example of this type of cash flow. It can be noticed in the sample cash flow that there are two… I.R.R. method for the ranking of investment opportunities. M. G. Wright suggests an effective way to overcome this problem by compounding one of the…

  15. An approach to solve group-decision-making problems with ordinal interval numbers.

    PubMed

    Fan, Zhi-Ping; Liu, Yang

    2010-10-01

    The ordinal interval number is a form of uncertain preference information in group decision making (GDM), while it is seldom discussed in the existing research. This paper investigates how the ranking order of alternatives is determined based on preference information of ordinal interval numbers in GDM problems. When ranking a large quantity of ordinal interval numbers, the efficiency and accuracy of the ranking process are critical. A new approach is proposed to rank alternatives using ordinal interval numbers when every ranking ordinal in an ordinal interval number is thought to be uniformly and independently distributed in its interval. First, we give the definition of possibility degree on comparing two ordinal interval numbers and the related theory analysis. Then, to rank alternatives, by comparing multiple ordinal interval numbers, a collective expectation possibility degree matrix on pairwise comparisons of alternatives is built, and an optimization model based on this matrix is constructed. Furthermore, an algorithm is also presented to rank alternatives by solving the model. Finally, two examples are used to illustrate the use of the proposed approach.
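
    Under the stated assumption that every ranking ordinal is uniformly and independently distributed in its interval, the possibility degree of one alternative preceding another can be checked by simple Monte Carlo sampling. The sketch below is only such a check; the paper's closed-form possibility degrees and its optimization model for the final ranking are not reproduced.

      import numpy as np

      def possibility_degree(a, b, n_samples=200_000, seed=0):
          """Estimate P(a rank drawn from interval a is better, i.e. smaller, than one from b).

          a and b are ordinal intervals (lo, hi), inclusive; ties count as one half.
          """
          rng = np.random.default_rng(seed)
          ra = rng.integers(a[0], a[1] + 1, n_samples)
          rb = rng.integers(b[0], b[1] + 1, n_samples)
          return float(np.mean(ra < rb) + 0.5 * np.mean(ra == rb))

      # Three alternatives whose possible rank positions are given as ordinal intervals.
      alternatives = {"A": (1, 3), "B": (2, 4), "C": (4, 5)}
      names = list(alternatives)
      P = np.array([[possibility_degree(alternatives[i], alternatives[j]) if i != j else 0.5
                     for j in names] for i in names])
      print(P.round(3))
      # Row sums give a crude expected-dominance score; the paper instead feeds the
      # collective possibility-degree matrix into an optimization model.
      print(dict(zip(names, P.sum(axis=1).round(3))))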

  16. RRCRank: a fusion method using rank strategy for residue-residue contact prediction.

    PubMed

    Jing, Xiaoyang; Dong, Qiwen; Lu, Ruqian

    2017-09-02

    In structural biology area, protein residue-residue contacts play a crucial role in protein structure prediction. Some researchers have found that the predicted residue-residue contacts could effectively constrain the conformational search space, which is significant for de novo protein structure prediction. In the last few decades, related researchers have developed various methods to predict residue-residue contacts, especially, significant performance has been achieved by using fusion methods in recent years. In this work, a novel fusion method based on rank strategy has been proposed to predict contacts. Unlike the traditional regression or classification strategies, the contact prediction task is regarded as a ranking task. First, two kinds of features are extracted from correlated mutations methods and ensemble machine-learning classifiers, and then the proposed method uses the learning-to-rank algorithm to predict contact probability of each residue pair. First, we perform two benchmark tests for the proposed fusion method (RRCRank) on CASP11 dataset and CASP12 dataset respectively. The test results show that the RRCRank method outperforms other well-developed methods, especially for medium and short range contacts. Second, in order to verify the superiority of ranking strategy, we predict contacts by using the traditional regression and classification strategies based on the same features as ranking strategy. Compared with these two traditional strategies, the proposed ranking strategy shows better performance for three contact types, in particular for long range contacts. Third, the proposed RRCRank has been compared with several state-of-the-art methods in CASP11 and CASP12. The results show that the RRCRank could achieve comparable prediction precisions and is better than three methods in most assessment metrics. The learning-to-rank algorithm is introduced to develop a novel rank-based method for the residue-residue contact prediction of proteins, which achieves state-of-the-art performance based on the extensive assessment.
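
    Independently of which strategy produces the contact scores, assessment follows the CASP convention alluded to above: sort residue pairs by predicted probability and report precision among the top L/k long-range pairs. A generic sketch with a synthetic contact map and synthetic scores standing in for real predictions:

      import numpy as np

      def top_l_precision(scores, contacts, seq_len, k=5, min_sep=24):
          """Precision of the top seq_len/k ranked pairs among long-range pairs (|i-j| >= min_sep)."""
          pairs = [(i, j) for i in range(seq_len) for j in range(i + min_sep, seq_len)]
          ranked = sorted(pairs, key=lambda p: scores[p], reverse=True)[: seq_len // k]
          return sum(bool(contacts[p]) for p in ranked) / len(ranked)

      rng = np.random.default_rng(0)
      length = 150
      contacts = np.triu(rng.random((length, length)) < 0.02, k=1)    # synthetic "true" contacts
      scores = rng.random((length, length)) + 2.0 * contacts          # predictions correlated with truth
      print("top-L/5 long-range precision:", round(top_l_precision(scores, contacts, length), 3))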

  17. How to Rank Journals

    PubMed Central

    Bradshaw, Corey J. A.; Brook, Barry W.

    2016-01-01

    There are now many methods available to assess the relative citation performance of peer-reviewed journals. Regardless of their individual faults and advantages, citation-based metrics are used by researchers to maximize the citation potential of their articles, and by employers to rank academic track records. The absolute value of any particular index is arguably meaningless unless compared to other journals, and different metrics result in divergent rankings. To provide a simple yet more objective way to rank journals within and among disciplines, we developed a κ-resampled composite journal rank incorporating five popular citation indices: Impact Factor, Immediacy Index, Source-Normalized Impact Per Paper, SCImago Journal Rank and Google 5-year h-index; this approach provides an index of relative rank uncertainty. We applied the approach to six sample sets of scientific journals from Ecology (n = 100 journals), Medicine (n = 100), Multidisciplinary (n = 50); Ecology + Multidisciplinary (n = 25), Obstetrics & Gynaecology (n = 25) and Marine Biology & Fisheries (n = 25). We then cross-compared the κ-resampled ranking for the Ecology + Multidisciplinary journal set to the results of a survey of 188 publishing ecologists who were asked to rank the same journals, and found a 0.68–0.84 Spearman’s ρ correlation between the two rankings datasets. Our composite index approach therefore approximates relative journal reputation, at least for that discipline. Agglomerative and divisive clustering and multi-dimensional scaling techniques applied to the Ecology + Multidisciplinary journal set identified specific clusters of similarly ranked journals, with only Nature & Science separating out from the others. When comparing a selection of journals within or among disciplines, we recommend collecting multiple citation-based metrics for a sample of relevant and realistic journals to calculate the composite rankings and their relative uncertainty windows. PMID:26930052

  18. How to Rank Journals.

    PubMed

    Bradshaw, Corey J A; Brook, Barry W

    2016-01-01

    There are now many methods available to assess the relative citation performance of peer-reviewed journals. Regardless of their individual faults and advantages, citation-based metrics are used by researchers to maximize the citation potential of their articles, and by employers to rank academic track records. The absolute value of any particular index is arguably meaningless unless compared to other journals, and different metrics result in divergent rankings. To provide a simple yet more objective way to rank journals within and among disciplines, we developed a κ-resampled composite journal rank incorporating five popular citation indices: Impact Factor, Immediacy Index, Source-Normalized Impact Per Paper, SCImago Journal Rank and Google 5-year h-index; this approach provides an index of relative rank uncertainty. We applied the approach to six sample sets of scientific journals from Ecology (n = 100 journals), Medicine (n = 100), Multidisciplinary (n = 50); Ecology + Multidisciplinary (n = 25), Obstetrics & Gynaecology (n = 25) and Marine Biology & Fisheries (n = 25). We then cross-compared the κ-resampled ranking for the Ecology + Multidisciplinary journal set to the results of a survey of 188 publishing ecologists who were asked to rank the same journals, and found a 0.68-0.84 Spearman's ρ correlation between the two rankings datasets. Our composite index approach therefore approximates relative journal reputation, at least for that discipline. Agglomerative and divisive clustering and multi-dimensional scaling techniques applied to the Ecology + Multidisciplinary journal set identified specific clusters of similarly ranked journals, with only Nature & Science separating out from the others. When comparing a selection of journals within or among disciplines, we recommend collecting multiple citation-based metrics for a sample of relevant and realistic journals to calculate the composite rankings and their relative uncertainty windows.
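
    The gist of a composite journal rank with an uncertainty window can be imitated by averaging per-metric ranks and resampling; the sketch below uses a plain bootstrap over metrics rather than the paper's κ-resampling, and both the journals and the metric values are invented.

      import numpy as np

      rng = np.random.default_rng(1)
      journals = ["J1", "J2", "J3", "J4", "J5"]
      # Rows = journals, columns = five citation metrics (higher = better); values are made up.
      metrics = np.array([[4.1, 0.9, 1.6, 2.0, 35],
                          [2.7, 0.5, 1.1, 1.2, 22],
                          [5.3, 1.4, 2.0, 2.8, 41],
                          [1.9, 0.3, 0.8, 0.9, 15],
                          [3.3, 0.7, 1.3, 1.5, 28]])

      def composite_ranks(m):
          # Rank journals within each metric (1 = best), then average across metrics.
          per_metric = (-m).argsort(axis=0).argsort(axis=0) + 1
          return per_metric.mean(axis=1)

      observed = composite_ranks(metrics)
      # Bootstrap over metrics to attach a rough uncertainty window to each composite rank.
      boot = np.array([composite_ranks(metrics[:, rng.integers(0, 5, 5)]) for _ in range(2000)])
      for j, name in enumerate(journals):
          lo, hi = np.percentile(boot[:, j], [2.5, 97.5])
          print(f"{name}: composite rank {observed[j]:.1f} (window {lo:.1f}-{hi:.1f})")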

  19. Extremal states of positive partial transpose in a system of three qubits

    NASA Astrophysics Data System (ADS)

    Steensgaard Garberg, Øyvind; Irgens, Børge; Myrheim, Jan

    2013-03-01

    We have studied mixed states in the system of three qubits with the property that all their partial transposes are positive; these are called PPT states. We classify a PPT state by the ranks of the state itself and its three single partial transposes. In random numerical searches, we find entangled PPT states with a large variety of rank combinations. For ranks equal to five or higher, we find both extremal and nonextremal PPT states of nearly every rank combination, with the restriction that the square sum of the four ranks of an extremal PPT state can be at most 193. We have studied especially the rank-four entangled PPT states, which are found to have rank four for every partial transpose. These states are all extremal because of the previously known result that every PPT state of rank three or less is separable. We find two distinct classes of rank-(4,4,4,4) entangled PPT states, identified by a real-valued quadratic expression invariant under local SL(2,C) transformations, mathematically equivalent to Lorentz transformations. This quadratic Lorentz invariant is nonzero for one class of states (type I in our terminology) and zero for the other class (type II). The previously known states based on unextendible product bases are a nongeneric subclass of the type-I states. We present analytical constructions of states of both types, general enough to reproduce all the rank-(4,4,4,4) PPT states we have found numerically. We cannot exclude the possibility that there exist nongeneric rank-four PPT states that we do not find in our random numerical searches.
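
    The PPT property and the rank combinations used above are straightforward to check numerically: build the density matrix, take the partial transpose over each qubit, and inspect ranks and eigenvalues. A small sketch (the example state is a generic random rank-four mixture, which, unlike the paper's constructions, is generally not PPT):

      import numpy as np

      def partial_transpose(rho, qubit, n_qubits=3):
          """Transpose the row/column indices of one qubit in an n-qubit density matrix."""
          dims = [2] * n_qubits
          t = rho.reshape(dims + dims)                  # (row indices..., column indices...)
          t = np.swapaxes(t, qubit, qubit + n_qubits)
          return t.reshape(2 ** n_qubits, 2 ** n_qubits)

      def rank_combination(rho, tol=1e-10):
          ranks = [np.linalg.matrix_rank(rho, tol=tol)]
          is_ppt = True
          for q in range(3):
              pt = partial_transpose(rho, q)
              ranks.append(np.linalg.matrix_rank(pt, tol=tol))
              is_ppt &= np.linalg.eigvalsh(pt).min() > -tol   # PPT iff every partial transpose >= 0
          return ranks, is_ppt

      # Random rank-4 mixed state of three qubits.
      rng = np.random.default_rng(0)
      v = rng.normal(size=(8, 4)) + 1j * rng.normal(size=(8, 4))
      rho = v @ v.conj().T
      rho /= np.trace(rho).real
      print(rank_combination(rho))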

  20. Multiple graph regularized protein domain ranking.

    PubMed

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-11-19

    Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  1. Multiple graph regularized protein domain ranking

    PubMed Central

    2012-01-01

    Background Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. PMID:23157331
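
    The single-graph baseline that MultiG-Rank generalizes is classical manifold ranking: relevance to a query is propagated over a similarity graph by solving f = (I - αS)⁻¹y, with S the symmetrically normalized affinity matrix. The sketch below implements only that baseline on toy descriptors; MultiG-Rank additionally learns weights over several candidate graphs, which is not reproduced here.

      import numpy as np

      def manifold_ranking(features, query_idx, sigma=1.0, alpha=0.9):
          """Rank items by propagating relevance from a query over a Gaussian affinity graph."""
          d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
          W = np.exp(-d2 / (2 * sigma ** 2))
          np.fill_diagonal(W, 0.0)
          d_inv_sqrt = np.diag(1.0 / np.sqrt(W.sum(axis=1)))
          S = d_inv_sqrt @ W @ d_inv_sqrt               # symmetrically normalized affinity
          y = np.zeros(len(features))
          y[query_idx] = 1.0                            # query indicator vector
          f = np.linalg.solve(np.eye(len(features)) - alpha * S, y)
          return np.argsort(-f)                         # items ordered by relevance to the query

      # Toy "domain" descriptors: two clusters of 10 items each; the query is item 0.
      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 0.3, (10, 5)), rng.normal(3, 0.3, (10, 5))])
      print("ranking for query 0:", manifold_ranking(X, query_idx=0)[:10])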

  2. An R package for analyzing and modeling ranking data

    PubMed Central

    2013-01-01

    Background In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty’s and Koczkodaj’s inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Results Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians’ preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as “internal/external”), and the second dimension can be interpreted as their overall variance (labeled as “push/pull factors”). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman’s footrule distance. Conclusions In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose that which is most suitable to their specific situations. PMID:23672645

  3. An R package for analyzing and modeling ranking data.

    PubMed

    Lee, Paul H; Yu, Philip L H

    2013-05-14

    In medical informatics, psychology, market research and many other fields, researchers often need to analyze and model ranking data. However, there is no statistical software that provides tools for the comprehensive analysis of ranking data. Here, we present pmr, an R package for analyzing and modeling ranking data with a bundle of tools. The pmr package enables descriptive statistics (mean rank, pairwise frequencies, and marginal matrix), Analytic Hierarchy Process models (with Saaty's and Koczkodaj's inconsistencies), probability models (Luce model, distance-based model, and rank-ordered logit model), and the visualization of ranking data with multidimensional preference analysis. Examples of the use of package pmr are given using a real ranking dataset from medical informatics, in which 566 Hong Kong physicians ranked the top five incentives (1: competitive pressures; 2: increased savings; 3: government regulation; 4: improved efficiency; 5: improved quality care; 6: patient demand; 7: financial incentives) to the computerization of clinical practice. The mean rank showed that item 4 is the most preferred item and item 3 is the least preferred item, and a significant difference was found between physicians' preferences with respect to their monthly income. A multidimensional preference analysis identified two dimensions that explain 42% of the total variance. The first can be interpreted as the overall preference of the seven items (labeled as "internal/external"), and the second dimension can be interpreted as their overall variance (labeled as "push/pull factors"). Various statistical models were fitted, and the best were found to be weighted distance-based models with Spearman's footrule distance. In this paper, we presented the R package pmr, the first package for analyzing and modeling ranking data. The package provides insight to users through descriptive statistics of ranking data. Users can also visualize ranking data by applying multidimensional preference analysis. Various probability models for ranking data are also included, allowing users to choose that which is most suitable to their specific situations.
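
    The descriptive statistics the package starts from (mean rank, pairwise frequencies, marginal matrix) are easy to reproduce outside R. The toy rankings below are invented, not the Hong Kong physician data:

      import numpy as np

      # Each row is one respondent's ranking of 4 items (entry = rank given, 1 = most preferred).
      rankings = np.array([[1, 2, 3, 4],
                           [2, 1, 3, 4],
                           [1, 3, 2, 4],
                           [3, 1, 2, 4],
                           [1, 2, 4, 3]])
      n_items = rankings.shape[1]

      # Mean rank per item (lower = more preferred overall).
      print("mean ranks:", rankings.mean(axis=0))

      # Pairwise frequencies: how often item i is ranked ahead of item j.
      pairwise = np.array([[(rankings[:, i] < rankings[:, j]).sum() if i != j else 0
                            for j in range(n_items)] for i in range(n_items)])
      print("pairwise 'i before j' counts:\n", pairwise)

      # Marginal matrix: how often each item receives each rank position.
      marginal = np.array([[(rankings[:, i] == r + 1).sum() for r in range(n_items)]
                           for i in range(n_items)])
      print("marginal matrix (rows = items, columns = rank positions):\n", marginal)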

  4. Preliminary Metallogenic Map of North America; a listing of deposits by commodity

    USGS Publications Warehouse

    Lee, Michael P.; Guild, Philip White; Schruben, Paul G.

    1987-01-01

    The 4,215 ore deposits shown on the Preliminary Metallogenic Map of North America and contained in the Metallogenic Map file have been sorted by their principal (first-listed) commodities and grouped into metallic and nonmetallic categories. Deposit listings for 56 individual metals and minerals have been assembled using the data base and are arranged alphabetically by country, political subdivision (for the larger countries), and deposit name. Map numbers, major and minor constituents, geographic coordinates, and a geologic code are given for each deposit; additionally, the relative size and deposit class have been derived from the code and are listed separately. The frequencies of individual commodities and commodity groups by type, geographic distribution, and geologic occurrence are summarized in tables, and the relationships of associated commodities to principal commodities in the data base are emphasized in both tables and brief texts. In all, 49 metals and minerals are listed as principal (first or only) commodities and 7 more are shown as 'major' but not principal commodities. (Commodities listed as 'minor' in the data base were not sorted or tabulated separately.) Metals, divided into six subgroups, predominate over nonmetallic minerals by a ratio of about 7 to 1, although in terms of quantities and value the disparity is not so great. Within the metals group, the ranking according to frequency is as follows: base, precious, iron and alloying, other (antimony, beryllium, and others), nuclear-fuel, and light metals. The most frequently occurring commodity in the Metallogenic Map file is gold. Copper is ranked second, both in number of occurrences and as the principal commodity in deposits. Silver is ranked third in frequency of occurrence; lead and zinc are ranked fourth and fifth, respectively. Iron, ranked sixth in frequency of occurrence as a major commodity, is the third most reported principal commodity in the data base, ahead of silver (ranked fourth), lead (ranked fifth), and zinc (ranked sixth).

  5. Student Practices, Learning, and Attitudes When Using Computerized Ranking Tasks

    NASA Astrophysics Data System (ADS)

    Lee, Kevin M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS

    2011-01-01

    Ranking Tasks are a novel type of conceptual exercise based on a technique called rule assessment. Ranking Tasks present students with a series of four to eight icons that describe slightly different variations of a basic physical situation. Students are then asked to identify the order, or ranking, of the various situations based on some physical outcome or result. The structure of Ranking Tasks makes it difficult for students to rely strictly on memorized answers and mechanical substitution of formulae. In addition, by changing the presentation of the different scenarios (e.g., photographs, line diagrams, graphs, tables, etc.) we find that Ranking Tasks require students to develop mental schema that are more flexible and robust. Ranking tasks may be implemented on the computer which requires students to order the icons through drag-and-drop. Computer implementation allows the incorporation of background material, grading with feedback, and providing additional similar versions of the task through randomization so that students can build expertise through practice. This poster will summarize the results of a study of student usage of computerized ranking tasks. We will investigate 1) student practices (How do they make use of these tools?), 2) knowledge and skill building (Do student scores improve with iteration and are there diminishing returns?), and 3) student attitudes toward using computerized Ranking Tasks (Do they like using them?). This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  6. Identification of significant features by the Global Mean Rank test.

    PubMed

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2014-01-01

    With the introduction of omics-technologies such as transcriptomics and proteomics, numerous methods for the reliable identification of significantly regulated features (genes, proteins, etc.) have been developed. Experimental practice requires these tests to successfully deal with conditions such as small numbers of replicates, missing values, non-normally distributed expression levels, and non-identical distributions of features. With the MeanRank test we aimed at developing a test that performs robustly under these conditions, while favorably scaling with the number of replicates. The test proposed here is a global one-sample location test, which is based on the mean ranks across replicates, and internally estimates and controls the false discovery rate. Furthermore, missing data is accounted for without the need of imputation. In extensive simulations comparing MeanRank to other frequently used methods, we found that it performs well with small and large numbers of replicates, feature dependent variance between replicates, and variable regulation across features on simulation data and a recent two-color microarray spike-in dataset. The tests were then used to identify significant changes in the phosphoproteomes of cancer cells induced by the kinase inhibitors erlotinib and 3-MB-PP1 in two independently published mass spectrometry-based studies. MeanRank outperformed the other global rank-based methods applied in this study. Compared to the popular Significance Analysis of Microarrays and Linear Models for Microarray methods, MeanRank performed similar or better. Furthermore, MeanRank exhibits more consistent behavior regarding the degree of regulation and is robust against the choice of preprocessing methods. MeanRank does not require any imputation of missing values, is easy to understand, and yields results that are easy to interpret. The software implementing the algorithm is freely available for academic and commercial use.
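
    The central statistic, the mean rank of each feature across replicates, is simple to compute. The sketch below ranks features within each replicate, averages, and attaches permutation p-values; it is a deliberate simplification and not the published test with its internal false-discovery-rate control or missing-value handling.

      import numpy as np
      from scipy.stats import rankdata

      rng = np.random.default_rng(0)
      n_features, n_reps = 1000, 4
      # Simulated log-ratios: most features unregulated, the first 20 shifted upwards.
      data = rng.normal(0.0, 1.0, (n_features, n_reps))
      data[:20] += 2.0

      # Rank features within each replicate (largest value -> rank 1), then average across replicates.
      ranks = np.apply_along_axis(lambda col: rankdata(-col), 0, data)
      mean_rank = ranks.mean(axis=1)

      # Permutation null: shuffle the rank columns independently and recompute mean ranks.
      null = np.concatenate([
          np.column_stack([rng.permutation(ranks[:, r]) for r in range(n_reps)]).mean(axis=1)
          for _ in range(200)])

      # Empirical p-value for a small (i.e. consistently high-ranking) mean rank.
      p = np.searchsorted(np.sort(null), mean_rank, side="right") / null.size
      print("features called at p < 0.01:", int((p < 0.01).sum()))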

  7. Collaborative hierarchy maintains cooperation in asymmetric games.

    PubMed

    Antonioni, Alberto; Pereda, María; Cronin, Katherine A; Tomassini, Marco; Sánchez, Angel

    2018-03-29

    The interplay of social structure and cooperative behavior is under much scrutiny lately as behavior in social contexts becomes increasingly relevant for everyday life. Earlier experimental work showed that the existence of a social hierarchy, earned through competition, was detrimental for the evolution of cooperative behaviors. Here, we study the case in which individuals are ranked in a hierarchical structure based on their performance in a collective effort by having them play a Public Goods Game. In the first treatment, participants are ranked according to group earnings while, in the second treatment, their rankings are based on individual earnings. Subsequently, participants play asymmetric Prisoner's Dilemma games where higher-ranked players gain more than lower ones. Our experiments show that there are no detrimental effects of the hierarchy formed based on group performance, yet when ranking is assigned individually we observe a decrease in cooperation. Our results show that different levels of cooperation arise from the fact that subjects are interpreting rankings as a reputation which carries information about which subjects were cooperators in the previous phase. Our results demonstrate that noting the manner in which a hierarchy is established is essential for understanding its effects on cooperation.

  8. Ranking structures and rank-rank correlations of countries: The FIFA and UEFA cases

    NASA Astrophysics Data System (ADS)

    Ausloos, Marcel; Cloots, Rudi; Gadomski, Adam; Vitanov, Nikolay K.

    2014-04-01

    Ranking of agents competing with each other in complex systems may lead to paradoxes according to the pre-chosen different measures. A discussion is presented on such rank-rank, similar or not, correlations based on the case of European countries ranked by UEFA and FIFA from different soccer competitions. The first question to be answered is whether an empirical and simple law is obtained for such (self-) organizations of complex sociological systems with such different measuring schemes. It is found that the power law form is not the best description contrary to many modern expectations. The stretched exponential is much more adequate. Moreover, it is found that the measuring rules lead to some inner structures in both cases.
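
    The comparison between a power law and a stretched exponential for a rank-ordered score list is easy to repeat on any dataset. The sketch below fits both forms to synthetic data with SciPy; the parametrisation f(r) = a·exp(-(r/b)^c) is one common convention and not necessarily the exact one used in the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def stretched_exp(r, a, b, c):
          return a * np.exp(-(r / b) ** c)

      def power_law(r, a, k):
          return a * r ** (-k)

      # Synthetic "country scores" ordered by rank 1..50, generated from a stretched exponential.
      rng = np.random.default_rng(0)
      ranks = np.arange(1, 51, dtype=float)
      scores = stretched_exp(ranks, 1500.0, 20.0, 0.8) * rng.lognormal(0.0, 0.05, ranks.size)

      for name, f, p0 in [("stretched exponential", stretched_exp, (1000.0, 10.0, 1.0)),
                          ("power law", power_law, (1000.0, 1.0))]:
          popt, _ = curve_fit(f, ranks, scores, p0=p0, maxfev=10000)
          rss = float(((scores - f(ranks, *popt)) ** 2).sum())
          print(f"{name}: parameters {np.round(popt, 3)}, residual sum of squares {rss:.1f}")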

  9. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in the multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out sensitivity analysis test for the multi-criteria decision making problem. The paper aims to present a sensitivity analysis for some ranking techniques based on the distance measures in multi-criteria decision making. Two types of uncertainties are considered for the sensitivity analysis test. The first uncertainty is related to the input data, while the second uncertainty is towards the Decision Maker preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance and trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and antiideal solutions. In turn, the trade-off ranking calculates a distance of an alternative to the extreme solutions and other alternatives. Several test cases are considered to study the performance of each ranking technique in both types of uncertainties.
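
    TOPSIS, the most standard of the three distance-based techniques compared, makes a convenient reference point for such sensitivity tests: rank alternatives by relative closeness to the ideal solution. A minimal sketch with an invented decision matrix and weights:

      import numpy as np

      def topsis(matrix, weights, benefit):
          """Rank alternatives (rows) over criteria (columns); benefit[j] is True if larger is better."""
          v = matrix / np.linalg.norm(matrix, axis=0) * weights       # weighted, vector-normalized matrix
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_plus = np.linalg.norm(v - ideal, axis=1)
          d_minus = np.linalg.norm(v - anti_ideal, axis=1)
          closeness = d_minus / (d_plus + d_minus)
          return closeness, np.argsort(-closeness)                    # higher closeness = better rank

      matrix = np.array([[250.0, 16.0, 12.0, 5.0],
                         [200.0, 16.0,  8.0, 3.0],
                         [300.0, 32.0, 16.0, 4.0],
                         [275.0, 32.0,  8.0, 4.0]])
      weights = np.array([0.25, 0.25, 0.25, 0.25])
      benefit = np.array([False, True, True, True])                   # the first criterion is a cost
      closeness, order = topsis(matrix, weights, benefit)
      print("closeness:", closeness.round(3), "ranking (best first):", order)

    A sensitivity test of the kind studied in the paper then reruns the same call with perturbed matrix entries or weights and checks how stable the resulting order remains.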

  10. Rank-preserving regression: a more robust rank regression model against outliers.

    PubMed

    Chen, Tian; Kowalski, Jeanne; Chen, Rui; Wu, Pan; Zhang, Hui; Feng, Changyong; Tu, Xin M

    2016-08-30

    Mean-based semi-parametric regression models such as the popular generalized estimating equations are widely used to improve robustness of inference over parametric models. Unfortunately, such models are quite sensitive to outlying observations. The Wilcoxon-score-based rank regression (RR) provides more robust estimates over generalized estimating equations against outliers. However, the RR and its extensions do not sufficiently address missing data arising in longitudinal studies. In this paper, we propose a new approach to address outliers under a different framework based on the functional response models. This functional-response-model-based alternative not only addresses limitations of the RR and its extensions for longitudinal data, but, with its rank-preserving property, even provides more robust estimates than these alternatives. The proposed approach is illustrated with both real and simulated data. Copyright © 2016 John Wiley & Sons, Ltd.
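
    For orientation, the Wilcoxon-score rank regression used as the baseline above can be written as minimization of Jaeckel's rank-based dispersion of the residuals. The sketch below does that numerically on synthetic data with one gross outlier; it only illustrates the estimator family, not the functional-response extension proposed in the paper.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import rankdata

      def jaeckel_dispersion(beta, x, y):
          """Sum of a(R(e_i)) * e_i with Wilcoxon scores a(i) = sqrt(12) * (i/(n+1) - 1/2)."""
          e = y - x @ beta
          scores = np.sqrt(12.0) * (rankdata(e) / (e.size + 1) - 0.5)
          return float(np.sum(scores * e))

      rng = np.random.default_rng(0)
      n = 100
      x = np.column_stack([np.ones(n), rng.normal(size=n)])
      y = x @ np.array([1.0, 2.0]) + rng.normal(size=n)
      y[0] += 50.0                                          # one gross outlier

      ols = np.linalg.lstsq(x, y, rcond=None)[0]
      rank_fit = minimize(jaeckel_dispersion, ols, args=(x, y), method="Nelder-Mead").x
      print("least-squares slope:", round(ols[1], 3), " rank-regression slope:", round(rank_fit[1], 3))
      # The rank-based slope stays near 2.0; the intercept is not identified by the dispersion
      # alone and is usually estimated separately (e.g. from the median of the residuals).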

  11. When drug discovery meets web search: Learning to Rank for ligand-based virtual screening.

    PubMed

    Zhang, Wei; Ji, Lijuan; Chen, Yanan; Tang, Kailin; Wang, Haiping; Zhu, Ruixin; Jia, Wei; Cao, Zhiwei; Liu, Qi

    2015-01-01

    The rapid increase in the emergence of novel chemical substances presents a substantial demand for more sophisticated computational methodologies for drug discovery. In this study, the idea of Learning to Rank from web search was applied to drug virtual screening, offering two unique capabilities: 1) identifying compounds for novel targets when there is not enough training data available for these targets, and 2) integrating heterogeneous data when compound affinities are measured on different platforms. A standard pipeline was designed to carry out Learning to Rank in virtual screening. Six Learning to Rank algorithms were investigated based on two public datasets collected from Binding Database and the newly-published Community Structure-Activity Resource benchmark dataset. The results have demonstrated that Learning to Rank is an efficient computational strategy for drug virtual screening, particularly due to its novel use in cross-target virtual screening and heterogeneous data integration. To the best of our knowledge, we have introduced here the first application of Learning to Rank in virtual screening. The experiment workflow and algorithm assessment designed in this study will provide a standard protocol for other similar studies. All the datasets as well as the implementations of Learning to Rank algorithms are available at http://www.tongji.edu.cn/~qiliu/lor_vs.html. Graphical Abstract: The analogy between web search and ligand-based drug discovery.
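
    The simplest Learning-to-Rank flavour used in such pipelines is pairwise: train a classifier on descriptor differences of compound pairs whose relative affinity order is known, then score and sort new compounds. A generic sketch with synthetic descriptors and affinities (not the Binding Database or CSAR sets analysed in the paper):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n_compounds, n_descriptors = 200, 10
      X = rng.normal(size=(n_compounds, n_descriptors))       # synthetic compound descriptors
      affinity = X @ rng.normal(size=n_descriptors) + rng.normal(scale=0.5, size=n_compounds)

      # Pairwise training data: descriptor differences labeled by which compound binds better.
      pairs = rng.integers(0, n_compounds, size=(3000, 2))
      pairs = pairs[pairs[:, 0] != pairs[:, 1]]
      X_pair = X[pairs[:, 0]] - X[pairs[:, 1]]
      y_pair = (affinity[pairs[:, 0]] > affinity[pairs[:, 1]]).astype(int)
      ranker = LogisticRegression(max_iter=1000).fit(X_pair, y_pair)

      # Score unseen compounds with the learned weight vector and rank them for screening.
      X_new = rng.normal(size=(20, n_descriptors))
      scores = X_new @ ranker.coef_.ravel()
      print("virtual-screening order (best first):", np.argsort(-scores))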

  12. Health Needs of Urban Blacks.

    ERIC Educational Resources Information Center

    Blackwell, James E.; And Others

    Interviews were conducted with 237 adult blacks in the Boston area to determine their most urgent needs and the most urgent needs of blacks in general, to characterize the information flow from health services agencies, and to characterize access to and utilization of health services. The respondents ranked better economic conditions, family and…

  13. Evaluating Snow Data Assimilation Framework for Streamflow Forecasting Applications Using Hindcast Verification

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2012-12-01

    Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather System River Forecasting System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow. Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid flow and low flow categories.
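
    Two of the verification measures listed, RMSE and the Nash-Sutcliffe efficiency, are worth writing out, since NSE is the standard skill score for flow simulations. A short sketch with made-up observed and simulated flows:

      import numpy as np

      def rmse(obs, sim):
          return float(np.sqrt(np.mean((sim - obs) ** 2)))

      def nse(obs, sim):
          """Nash-Sutcliffe efficiency: 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2); 1 is perfect."""
          return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2))

      obs = np.array([12.0, 30.0, 55.0, 80.0, 60.0, 35.0, 20.0, 14.0])   # observed flows (m^3/s)
      sim = np.array([15.0, 28.0, 50.0, 90.0, 58.0, 30.0, 22.0, 13.0])   # simulated flows (m^3/s)
      print("RMSE:", round(rmse(obs, sim), 2), " NSE:", round(nse(obs, sim), 3))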

  14. Effect of The Receptor Activator of Nuclear Factor кB and RANK Ligand on In Vitro Differentiation of Cord Blood CD133(+) Hematopoietic Stem Cells to Osteoclasts.

    PubMed

    Kalantari, Nasim; Abroun, Saeid; Soleimani, Masoud; Kaviani, Saeid; Azad, Mehdi; Eskandari, Fatemeh; Habibi, Hossein

    2016-01-01

    Receptor activator of nuclear factor-kappa B ligand (RANKL) appears to be an osteoclast-activating factor, bearing an important role in the pathogenesis of multiple myeloma. Some studies demonstrated that the U-266 myeloma cell line and primary myeloma cells expressed RANK and RANKL. It had been reported that the expression of myeloid and monocytoid markers was increased by co-culturing myeloma cells with hematopoietic stem cells (HSCs). This study also attempted to show the molecular mechanism by which RANK and RANKL affect the differentiation of human cord blood HSCs into osteoclasts, as well as the expression of calcitonin receptor (CTR) on the cord blood HSC surface. In this experimental study, CD133(+) hematopoietic stem cells were isolated from umbilical cord blood and cultured in the presence of macrophage colony-stimulating factor (M-CSF) and RANKL. Osteoclast differentiation was characterized by using tartrate-resistant acid phosphatase (TRAP) staining, Giemsa staining, immunophenotyping, and reverse transcription-polymerase chain reaction (RT-PCR) assay for specific genes. Hematopoietic stem cells expressed RANK before and after differentiation into osteoclasts. Compared to the control group, flow cytometric results showed an increased expression of RANK after differentiation. CTR mRNA was expressed, and the TRAP reaction was positive in some differentiated cells, including osteoclasts. The presence of RANKL and M-CSF in bone marrow could induce HSC differentiation into osteoclasts.

  15. Status change during adulthood: life-history by-product or kin selection based on reproductive value?

    PubMed Central

    Combes, S. L.; Altmann, J.

    2001-01-01

    When dominance status predicts fitness, most adaptive models of dominance relationships among cercopithecine primate females predict lifetime maintenance of status. These models and alternative ones positing rank decline as a non-adaptive by-product have remained largely untested, however, because lifetime status of older adults has been virtually unknown for natural populations. In a 25-year study of adult female savannah baboons (Papio cynocephalus), in each of three social groups, rank losses were common among the 66 females that lived past median adult age. These losses were not accounted for by loss in relative rank from group growth or by loss in absolute rank from reversals in rank between members of different maternal families or between sisters. Rather, females that had mature daughters experienced loss of dominance status to these offspring, a characteristic of all but the top-ranking matriline of each group. Among proposed hypotheses for rank reversals between adults, that of kin selection based on relative reproductive value is most clearly supported by these data. In contrast, observed patterns of rank loss are not consistent with alternative models that postulate that changes during adult lifespan are a product of accumulated risk, physical decline during ageing, or coalitionary support among females within or between matrilines. PMID:11429136

  16. The importance of fluvial hydraulics to fish-habitat restoration in low-gradient alluvial streams

    USGS Publications Warehouse

    Rabeni, Charles F.; Jacobson, Robert B.

    1993-01-01

    1. A major cause of degradation and loss of stream fish is alteration of physical habitat within and adjacent to the channel. We describe a potentially efficient approach to fish restoration based upon the relationship between fluvial hydraulics, geomorphology, and those habitats important to fish.2. The aquatic habitat in a low-gradient, alluvial stream in the Ozark Plateaus physiographical province was classified according to location in the channel, patterns of water flow, and structures that control flow. The resulting habitat types were ranked in terms of their temporal stability and ability to be manipulated.3. Delineation and quantification of discrete physical spaces in a stream, termed hydraulic habitat units, are shown to be useful in stream restoration programmes if the ecological importance of each habitat unit is known, and if habitats are defined by fluvial dynamics so that restoration is aided by natural forces.4. Examples, using different taxa, are given to illustrate management options.

  17. Description and detection of burst events in turbulent flows

    NASA Astrophysics Data System (ADS)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
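
    The first two steps described above, delay embedding of a scalar signal and estimation of state-transition probabilities between coarse phase-space regions, can be prototyped in a few lines. The clustering and the signal below are generic stand-ins (k-means on a noisy sine), not the repeated graph-community machinery of the paper:

      import numpy as np
      from sklearn.cluster import KMeans

      def delay_embed(x, dim, tau):
          """Stack delayed copies of a scalar series into phase-space points."""
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

      rng = np.random.default_rng(0)
      signal = np.sin(np.linspace(0, 60 * np.pi, 6000)) + 0.3 * rng.normal(size=6000)

      # Embed, cluster phase-space points into coarse states, and count state transitions.
      points = delay_embed(signal, dim=3, tau=5)
      states = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(points)

      k = states.max() + 1
      transitions = np.zeros((k, k))
      for a, b in zip(states[:-1], states[1:]):
          transitions[a, b] += 1
      transitions /= transitions.sum(axis=1, keepdims=True)     # row-normalize to probabilities

      # States with low self-transition probability are transitory: candidate burst precursors.
      print("self-transition probabilities:", np.round(np.diag(transitions), 3))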

  18. Big U. S. gas producing states fan prorationing controversy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This paper reports that the biggest gas producing states in the U.S. are embracing wholesale changes in prorationing rules. And their actions have stirred controversy and warnings among key energy lawmakers in Washington. The Texas Railroad Commission last week unanimously approved new rules for setting production volumes for gas wells. Texas is the country's No. 1 gas producer. TRC commissioners eliminated a system that prorated production based on monthly nominations by gas purchasers. The Office of Conservation in Louisiana, No. 2 ranked producer, recently conducted a hearing on a prorationing proposal supported by Gov. Edwin Edwards. Conservation Commissioner Herbert W. Thompson is expected to disclose the state's plans by the end of this month. No. 3 Oklahoma enacted a law last March restricting production from unallocated wells in the 4 winter months to the greater of 1 MMcfd or 40% of the calculated absolute open flow and the rest of the year to the greater of 750 Mcfd or 25% of the calculated absolute open flow.

  19. Searching for superspreaders of information in real-world social media.

    PubMed

    Pei, Sen; Muchnik, Lev; Andrade, José S; Zheng, Zhiming; Makse, Hernán A

    2014-07-03

    A number of predictors have been suggested to detect the most influential spreaders of information in online social media across various domains such as Twitter or Facebook. In particular, degree, PageRank, k-core and other centralities have been adopted to rank the spreading capability of users in information dissemination media. So far, validation of the proposed predictors has been done by simulating the spreading dynamics rather than following real information flow in social networks. Consequently, only model-dependent contradictory results have been achieved so far for the best predictor. Here, we address this issue directly. We search for influential spreaders by following the real spreading dynamics in a wide range of networks. We find that the widely-used degree and PageRank fail in ranking users' influence. We find that the best spreaders are consistently located in the k-core across dissimilar social platforms such as Twitter, Facebook, Livejournal and scientific publishing in the American Physical Society. Furthermore, when the complete global network structure is unavailable, we find that the sum of the nearest neighbors' degree is a reliable local proxy for user's influence. Our analysis provides practical instructions for optimal design of strategies for "viral" information dissemination in relevant applications.

  20. Searching for superspreaders of information in real-world social media

    NASA Astrophysics Data System (ADS)

    Pei, Sen; Muchnik, Lev; Andrade, José S., Jr.; Zheng, Zhiming; Makse, Hernán A.

    2014-07-01

    A number of predictors have been suggested to detect the most influential spreaders of information in online social media across various domains such as Twitter or Facebook. In particular, degree, PageRank, k-core and other centralities have been adopted to rank the spreading capability of users in information dissemination media. So far, validation of the proposed predictors has been done by simulating the spreading dynamics rather than following real information flow in social networks. Consequently, only model-dependent contradictory results have been achieved so far for the best predictor. Here, we address this issue directly. We search for influential spreaders by following the real spreading dynamics in a wide range of networks. We find that the widely-used degree and PageRank fail in ranking users' influence. We find that the best spreaders are consistently located in the k-core across dissimilar social platforms such as Twitter, Facebook, Livejournal and scientific publishing in the American Physical Society. Furthermore, when the complete global network structure is unavailable, we find that the sum of the nearest neighbors' degree is a reliable local proxy for user's influence. Our analysis provides practical instructions for optimal design of strategies for ``viral'' information dissemination in relevant applications.

  1. Searching for superspreaders of information in real-world social media

    PubMed Central

    Pei, Sen; Muchnik, Lev; Andrade, Jr., José S.; Zheng, Zhiming; Makse, Hernán A.

    2014-01-01

    A number of predictors have been suggested to detect the most influential spreaders of information in online social media across various domains such as Twitter or Facebook. In particular, degree, PageRank, k-core and other centralities have been adopted to rank the spreading capability of users in information dissemination media. So far, validation of the proposed predictors has been done by simulating the spreading dynamics rather than following real information flow in social networks. Consequently, only model-dependent contradictory results have been achieved so far for the best predictor. Here, we address this issue directly. We search for influential spreaders by following the real spreading dynamics in a wide range of networks. We find that the widely-used degree and PageRank fail in ranking users' influence. We find that the best spreaders are consistently located in the k-core across dissimilar social platforms such as Twitter, Facebook, Livejournal and scientific publishing in the American Physical Society. Furthermore, when the complete global network structure is unavailable, we find that the sum of the nearest neighbors' degree is a reliable local proxy for user's influence. Our analysis provides practical instructions for optimal design of strategies for “viral” information dissemination in relevant applications. PMID:24989148
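
    Both quantities the study ends up recommending, the k-core (k-shell) index and, as a local proxy, the sum of the nearest neighbors' degrees, are one-liners with networkx. A small sketch on a stand-in random graph (a real social-network edge list could be substituted):

      import networkx as nx

      # Stand-in network; replace with a real social graph, e.g. via nx.read_edgelist(path).
      G = nx.barabasi_albert_graph(n=1000, m=3, seed=42)

      core = nx.core_number(G)                                   # k-shell index of every node
      neighbor_degree_sum = {v: sum(G.degree(u) for u in G[v]) for v in G}

      # Rank candidate spreaders by k-core, breaking ties with the local proxy.
      ranked = sorted(G, key=lambda v: (core[v], neighbor_degree_sum[v]), reverse=True)
      print("top 10 candidate spreaders:", ranked[:10])
      print("top 10 by plain degree:   ", sorted(G, key=G.degree, reverse=True)[:10])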

  2. Association of patient case-mix adjustment, hospital process performance rankings, and eligibility for financial incentives.

    PubMed

    Mehta, Rajendra H; Liang, Li; Karve, Amrita M; Hernandez, Adrian F; Rumsfeld, John S; Fonarow, Gregg C; Peterson, Eric D

    2008-10-22

    While most comparisons of hospital outcomes adjust for patient characteristics, process performance comparisons typically do not. To evaluate the degree to which hospital process performance ratings and eligibility for financial incentives are altered after accounting for hospitals' patient demographics, clinical characteristics, and mix of treatment opportunities. Using data from the American Heart Association's Get With the Guidelines program between January 2, 2000, and March 28, 2008, we analyzed hospital process performance based on the Centers for Medicare & Medicaid Services' defined core measures for acute myocardial infarction. Hospitals were initially ranked based on crude composite process performance and then ranked again after accounting for hospitals' patient demographics, clinical characteristics, and eligibility for measures using a hierarchical model. We then compared differences in hospital performance rankings and pay-for-performance financial incentive categories (top 20%, middle 60%, and bottom 20% institutions). Hospital process performance ranking and pay-for-performance financial incentive categories. A total of 148,472 acute myocardial infarction patients met the study criteria from 449 centers. Hospitals for which crude composite acute myocardial infarction performance was in the bottom quintile (n = 89) were smaller nonacademic institutions that treated a higher percentage of patients from racial or ethnic minority groups and also patients with greater comorbidities than hospitals ranked in the top quintile (n = 90). Although there was overall agreement on hospital rankings based on observed vs adjusted composite scores (weighted kappa, 0.74), individual hospital ranking changed with adjustment (median, 22 ranks; range, 0-214; interquartile range, 9-40). Additionally, 16.5% of institutions (n = 74) changed pay-for-performance financial status categories after accounting for patient and treatment opportunity mix. Our findings suggest that accounting for hospital differences in patient characteristics and treatment opportunities is associated with modest changes in hospital performance rankings and eligibility for financial benefits in pay-for-performance programs for treatment of myocardial infarction.

  3. BridgeRank: A novel fast centrality measure based on local structure of the network

    NASA Astrophysics Data System (ADS)

    Salavati, Chiman; Abdollahpouri, Alireza; Manbari, Zhaleh

    2018-04-01

    Ranking nodes in complex networks has become an important task in many application domains. In a complex network, influential nodes are those that have the most spreading ability. Thus, identifying influential nodes based on their spreading ability is a fundamental task in applications such as viral marketing. One of the most important centrality measures for ranking nodes is closeness centrality, which is effective but suffers from high computational complexity, O(n^3). This paper improves on closeness centrality by utilizing the local structure of nodes and presents a new ranking algorithm, called BridgeRank centrality. The proposed method computes a local centrality value for each node. For this purpose, communities are first detected and the relationships between communities are ignored. Then, by applying a centrality measure within each community, only one best critical node is extracted from each community. Finally, the nodes are ranked by computing the sum of their shortest path lengths to the obtained critical nodes. We have also modified the proposed method by weighting the original BridgeRank and selecting several nodes from each community based on the density of that community. Our method can find the best nodes with high spreading ability and low time complexity, which makes it applicable to large-scale networks. To evaluate the performance of the proposed method, we use the SIR diffusion model. Finally, experiments on real and artificial networks show that our method identifies influential nodes efficiently and achieves better performance than other recent methods.
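
    The following sketch illustrates the BridgeRank idea as summarized above, under assumptions of our own: networkx's greedy modularity communities and per-community closeness stand in for the authors' choices, and the karate-club graph is a placeholder.

        # Detect communities, keep one "critical" node per community, then rank
        # every node by the sum of its shortest-path distances to those nodes.
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.karate_club_graph()                       # toy example network

        critical = []
        for community in greedy_modularity_communities(G):
            sub = G.subgraph(community)
            # one representative per community: the node with highest closeness there
            critical.append(max(nx.closeness_centrality(sub).items(), key=lambda kv: kv[1])[0])

        def bridgerank_score(u):
            # smaller total distance to the critical nodes -> more central
            return sum(nx.shortest_path_length(G, u, c) for c in critical)

        ranking = sorted(G.nodes(), key=bridgerank_score)
        print("critical nodes:", critical)
        print("top 10 nodes:", ranking[:10])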

  4. Irreducible Representations of Oscillatory and Swirling Flows in Active Soft Matter

    NASA Astrophysics Data System (ADS)

    Ghose, Somdeb; Adhikari, R.

    2014-03-01

    Recent experiments imaging fluid flow around swimming microorganisms have revealed complex time-dependent velocity fields that differ qualitatively from the stresslet flow commonly employed in theoretical descriptions of active matter. Here we obtain the most general flow around a finite sized active particle by expanding the surface stress in irreducible Cartesian tensors. This expansion, whose first term is the stresslet, must include, respectively, third-rank polar and axial tensors to minimally capture crucial features of the active oscillatory flow around translating Chlamydomonas and the active swirling flow around rotating Volvox. The representation provides explicit expressions for the irreducible symmetric, antisymmetric, and isotropic parts of the continuum active stress. Antisymmetric active stresses do not conserve orbital angular momentum and our work thus shows that spin angular momentum is necessary to restore angular momentum conservation in continuum hydrodynamic descriptions of active soft matter.

  5. Regularized matrix regression

    PubMed Central

    Zhou, Hua; Li, Lexin

    2014-01-01

    Summary Modern technologies are producing a wealth of data with complex structures. For instance, in two-dimensional digital imaging, flow cytometry and electroencephalography, matrix-type covariates frequently arise when measurements are obtained for each combination of two underlying variables. To address scientific questions arising from those data, new regression methods that take matrices as covariates are needed, and sparsity or other forms of regularization are crucial owing to the ultrahigh dimensionality and complex structure of the matrix data. The popular lasso and related regularization methods hinge on the sparsity of the true signal in terms of the number of its non-zero coefficients. However, for the matrix data, the true signal is often of, or can be well approximated by, a low rank structure. As such, the sparsity is frequently in the form of low rank of the matrix parameters, which may seriously violate the assumption of the classical lasso. We propose a class of regularized matrix regression methods based on spectral regularization. A highly efficient and scalable estimation algorithm is developed, and a degrees-of-freedom formula is derived to facilitate model selection along the regularization path. Superior performance of the method proposed is demonstrated on both synthetic and real examples. PMID:24648830
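
    A hedged illustration of spectral (nuclear-norm) regularized matrix regression in the spirit of the abstract, not the authors' estimation algorithm: proximal gradient descent with singular-value soft-thresholding on simulated data; the penalty strength lam and all data are invented.

        # Minimize 0.5 * sum_i (y_i - <X_i, B>)^2 + lam * ||B||_* by proximal gradient.
        import numpy as np

        rng = np.random.default_rng(0)
        n, p, q, true_rank = 200, 10, 8, 2
        B_true = rng.normal(size=(p, true_rank)) @ rng.normal(size=(true_rank, q))
        X = rng.normal(size=(n, p, q))
        y = np.einsum("ipq,pq->i", X, B_true) + 0.1 * rng.normal(size=n)

        def svt(M, tau):
            # singular-value soft-thresholding: the proximal operator of tau * ||.||_*
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return (U * np.maximum(s - tau, 0.0)) @ Vt

        lam = 30.0                                             # regularization strength (illustrative)
        step = 1.0 / np.linalg.norm(X.reshape(n, -1), 2) ** 2  # 1 / Lipschitz constant of the loss
        B = np.zeros((p, q))
        for _ in range(500):
            resid = np.einsum("ipq,pq->i", X, B) - y          # residuals of the linear trace model
            grad = np.einsum("i,ipq->pq", resid, X)           # gradient of the squared loss
            B = svt(B - step * grad, step * lam)              # proximal gradient step

        print("singular values of the estimate:", np.round(np.linalg.svd(B, compute_uv=False), 2))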

  6. Adaptive laser link reconfiguration using constraint propagation

    NASA Technical Reports Server (NTRS)

    Crone, M. S.; Julich, P. M.; Cook, L. M.

    1993-01-01

    This paper describes Harris AI research performed on the Adaptive Link Reconfiguration (ALR) study for Rome Lab, and focuses on the application of constraint propagation to the problem of link reconfiguration for the proposed space based Strategic Defense System (SDS) Brilliant Pebbles (BP) communications system. According to the concept of operations at the time of the study, laser communications will exist between BP's and to ground entry points. Long-term links typical of RF transmission will not exist. This study addressed an initial implementation of BP's based on the Global Protection Against Limited Strikes (GPALS) SDI mission. The number of satellites and rings studied was representative of this problem. An orbital dynamics program was used to generate line-of-sight data for the modeled architecture. This was input into a discrete event simulation implemented in the Harris developed COnstraint Propagation Expert System (COPES) Shell, developed initially on the Rome Lab BM/C3 study. Using a model of the network and several heuristics, the COPES shell was used to develop the Heuristic Adaptive Link Ordering (HALO) Algorithm to rank and order potential laser links according to probability of communication. A reduced set of links based on this ranking would then be used by a routing algorithm to select the next hop. This paper includes an overview of Constraint Propagation as an Artificial Intelligence technique and its embodiment in the COPES shell. It describes the design and implementation of both the simulation of the GPALS BP network and the HALO algorithm in COPES. This is described using a Data Flow Diagram, State Transition Diagrams, and Structured English PDL. It describes a laser communications model and the heuristics involved in rank-ordering the potential communication links. The generation of simulation data is described along with its interface via COPES to the Harris developed View Net graphical tool for visual analysis of communications networks. Conclusions are presented, including a graphical analysis of results depicting the ordered set of links versus the set of all possible links based on the computed Bit Error Rate (BER). Finally, future research is discussed which includes enhancements to the HALO algorithm, network simulation, and the addition of an intelligent routing algorithm for BP.

  7. Does an expert-based evaluation allow us to go beyond the Impact Factor? Experiences from building a ranking of national journals in Poland.

    PubMed

    Kulczycki, Emanuel; Rozkosz, Ewa A

    2017-01-01

    This article discusses the Polish Journal Ranking, which is used in the research evaluation system in Poland. In 2015, the ranking, which represents all disciplines, allocated 17,437 journals into three lists: A, B, and C. The B list constitutes a ranking of Polish journals that are indexed neither in the Web of Science nor the European Reference Index for the Humanities. This ranking was built by evaluating journals in three dimensions: formal, bibliometric, and expert-based. We have analysed data on 2035 Polish journals from the B list. Our study aims to determine how an expert-based evaluation influenced the results of final evaluation. In our study, we used structural equation modelling, which is regression based, and we designed three pairs of theoretical models for three fields of science: (1) humanities, (2) social sciences, and (3) engineering, natural sciences, and medical sciences. Each pair consisted of the full model and the reduced model (i.e., the model without the expert-based evaluation). Our analysis revealed that the multidimensional evaluation of local journals should not rely only on the bibliometric indicators, which are based on the Web of Science or Scopus. Moreover, we have shown that the expert-based evaluation plays a major role in all fields of science. We conclude with recommendations that the formal evaluation should be reduced to verifiable parameters and that the expert-based evaluation should be based on common guidelines for the experts.

  8. A Ranking Approach to Genomic Selection.

    PubMed

    Blondel, Mathieu; Onogi, Akio; Iwata, Hiroyoshi; Ueda, Naonori

    2015-01-01

    Genomic selection (GS) is a recent selective breeding method which uses predictive models based on whole-genome molecular markers. Until now, existing studies formulated GS as the problem of modeling an individual's breeding value for a particular trait of interest, i.e., as a regression problem. To assess predictive accuracy of the model, the Pearson correlation between observed and predicted trait values was used. In this paper, we propose to formulate GS as the problem of ranking individuals according to their breeding value. Our proposed framework allows us to employ machine learning methods for ranking which had previously not been considered in the GS literature. To assess ranking accuracy of a model, we introduce a new measure originating from the information retrieval literature called normalized discounted cumulative gain (NDCG). NDCG rewards more strongly models which assign a high rank to individuals with high breeding value. Therefore, NDCG reflects a prerequisite objective in selective breeding: accurate selection of individuals with high breeding value. We conducted a comparison of 10 existing regression methods and 3 new ranking methods on 6 datasets, consisting of 4 plant species and 25 traits. Our experimental results suggest that tree-based ensemble methods including McRank, Random Forests and Gradient Boosting Regression Trees achieve excellent ranking accuracy. RKHS regression and RankSVM also achieve good accuracy when used with an RBF kernel. Traditional regression methods such as Bayesian lasso, wBSR and BayesC were found less suitable for ranking. Pearson correlation was found to correlate poorly with NDCG. Our study suggests two important messages. First, ranking methods are a promising research direction in GS. Second, NDCG can be a useful evaluation measure for GS.
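
    A small sketch of NDCG as used above for ranking candidates by breeding value; the gain and discount conventions are common defaults and may differ from the paper's exact definition, and the simulated values are placeholders.

        # NDCG@k rewards rankings that place truly high-value individuals near the top.
        import numpy as np

        def ndcg_at_k(y_true, y_score, k):
            order = np.argsort(y_score)[::-1][:k]             # predicted top-k
            gains = y_true[order]
            discounts = 1.0 / np.log2(np.arange(2, k + 2))    # 1 / log2(rank + 1)
            dcg = np.sum(gains * discounts)
            ideal = np.sort(y_true)[::-1][:k]                 # best possible ordering
            idcg = np.sum(ideal * discounts)
            return dcg / idcg if idcg > 0 else 0.0

        rng = np.random.default_rng(1)
        breeding_value = rng.gamma(2.0, 1.0, size=100)                  # "observed" trait values
        predicted = breeding_value + rng.normal(scale=1.0, size=100)    # noisy predictions
        print("NDCG@10:", round(ndcg_at_k(breeding_value, predicted, 10), 3))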

  9. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.

  10. The Functions and Dysfunctions of College Rankings: An Analysis of Institutional Expenditure

    ERIC Educational Resources Information Center

    Kim, Jeongeun

    2018-01-01

    College rankings have become a powerful influence in higher education. While the determinants of educational quality are not clearly defined, college rankings designate an institution's standing in a numerical order based on quantifiable measurements that focus primarily on institutional resources. Previous research has identified the…

  11. The New "LJ" Index

    ERIC Educational Resources Information Center

    Lance, Keith Curry; Lyons, Ray

    2008-01-01

    As published critics of Hennen's American Public Library Ratings (HAPLR), the authors propose a new ranking system that focuses more transparently on ranking libraries based on their performance. These annual rankings are intended to contribute to self-evaluation and peer comparison, prompt questions about the statistics and how to improve them,…

  12. Error analysis of stochastic gradient descent ranking.

    PubMed

    Chen, Hong; Tang, Yi; Li, Luoqing; Yuan, Yuan; Li, Xuelong; Tang, Yuanyan

    2013-06-01

    Ranking is always an important task in machine learning and information retrieval, e.g., collaborative filtering, recommender systems, drug discovery, etc. A kernel-based stochastic gradient descent algorithm with the least squares loss is proposed for ranking in this paper. The implementation of this algorithm is simple, and an expression of the solution is derived via a sampling operator and an integral operator. An explicit convergence rate for leaning a ranking function is given in terms of the suitable choices of the step size and the regularization parameter. The analysis technique used here is capacity independent and is novel in error analysis of ranking learning. Experimental results on real-world data have shown the effectiveness of the proposed algorithm in ranking tasks, which verifies the theoretical analysis in ranking error.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    Rank distributions are collections of positive sizes ordered either increasingly or decreasingly. Many decreasing rank distributions, formed by the collective collaboration of human actions, follow an inverse power-law relation between ranks and sizes. This remarkable empirical fact is termed Zipf’s law, and one of its quintessential manifestations is the demography of human settlements — which exhibits a harmonic relation between ranks and sizes. In this paper we present a comprehensive statistical-physics analysis of rank distributions, establish that power-law and exponential rank distributions stand out as optimal in various entropy-based senses, and unveil the special role of the harmonic relation between ranks and sizes. Our results extend the contemporary entropy-maximization view of Zipf’s law to a broader, panoramic, Gibbsian perspective of increasing and decreasing power-law and exponential rank distributions — of which Zipf’s law is one out of four pillars.
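
    An illustrative sketch (not from the paper) of the harmonic rank-size relation: generate sizes that follow an inverse power law and recover the exponent from the decreasing rank-size data; Zipf's law corresponds to an exponent near 1.

        import numpy as np

        alpha = 1.0                                   # Zipf's harmonic rank-size relation
        ranks = np.arange(1, 1001)
        noise = np.exp(0.05 * np.random.default_rng(2).normal(size=ranks.size))
        sizes = 1e6 * ranks**(-alpha) * noise         # synthetic "settlement sizes"

        # Fit log(size) = c - alpha_hat * log(rank)
        slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
        print("estimated exponent:", round(-slope, 3))   # expect ~1 for Zipf's law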

  14. Validation of Microcapillary Flow Cytometry for Community-Based CD4+ T Lymphocyte Enumeration in Remote Burkina Faso

    PubMed Central

    Renault, Cybèle A; Traore, Arouna; Machekano, Rhoderick N; Israelski, Dennis M

    2010-01-01

    Background: CD4+ T lymphocyte enumeration plays a critical role in the initiation and monitoring of HIV-infected patients on antiretroviral therapy. There is an urgent need for low-cost CD4+ enumeration technologies, particularly for use in dry, dusty climates characteristic of many small cities in Sub-Saharan Africa. Design: Cross-sectional study. Methods: Blood samples from 98 HIV-infected patients followed in a community HIV clinic in Ouahigouya, Burkina Faso were obtained for routine CD4+ T lymphocyte count monitoring. The blood samples were divided into two aliquots, on which parallel CD4+ measurements were performed using microcapillary (Guava EasyCD4) and dedicated (Becton Dickinson FACSCount) CD4+ enumeration systems. Spearman rank correlation coefficient was calculated, and the sensitivity, specificity and positive predictive value (PPV) for EasyCD4 <200 cells/µL were determined compared to the reference standard FACSCount CD4 <200 cells/µL. Results: Mean CD4 counts for the EasyCD4 and FACSCount were 313.75 cells/µL and 303.47 cells/µL, respectively. The Spearman rank correlation coefficient was 0.92 (p<0.001). Median values using EasyCD4 were higher than those with the FACSCount (p=0.004). For a CD4<350 cells/uL, sensitivity of the EasyCD4 was 93.9% (95%CI 85.2-98.3%), specificity was 90.6% (95% CI 75.0-98.0%), and PPV was 95.4% (95%CI 87.1-99.0%). Conclusion: Use of the EasyCD4 system was feasible and highly accurate in the harsh conditions of this remote city in Sub-Saharan Africa, demonstrating acceptable sensitivity and specificity compared to a standard operating system. Microcapillary flow cytometry offers a cost-effective alternative for community-based, point-of-care CD4+ testing and could play a substantial role in scaling up HIV care in remote, resource-limited settings. PMID:21253463

  15. An Automated Approach for Ranking Journals to Help in Clinician Decision Support

    PubMed Central

    Jonnalagadda, Siddhartha R.; Moosavinasab, Soheil; Nath, Chinmoy; Li, Dingcheng; Chute, Christopher G.; Liu, Hongfang

    2014-01-01

    Point of care access to knowledge from full text journal articles supports decision-making and decreases medical errors. However, it is an overwhelming task to search through full text journal articles and find quality information needed by clinicians. We developed a method to rate journals for a given clinical topic, Congestive Heart Failure (CHF). Our method enables filtering of journals and ranking of journal articles based on source journal in relation to CHF. We also obtained a journal priority score, which automatically rates any journal based on its importance to CHF. Comparing our ranking with data gathered by surveying 169 cardiologists, who publish on CHF, our best Multiple Linear Regression model showed a correlation of 0.880, based on five-fold cross validation. Our ranking system can be extended to other clinical topics. PMID:25954382

  16. Generalization Performance of Regularized Ranking With Multiscale Kernels.

    PubMed

    Zhou, Yicong; Chen, Hong; Lan, Rushi; Pan, Zhibin

    2016-05-01

    The regularized kernel method for the ranking problem has attracted increasing attentions in machine learning. The previous regularized ranking algorithms are usually based on reproducing kernel Hilbert spaces with a single kernel. In this paper, we go beyond this framework by investigating the generalization performance of the regularized ranking with multiscale kernels. A novel ranking algorithm with multiscale kernels is proposed and its representer theorem is proved. We establish the upper bound of the generalization error in terms of the complexity of hypothesis spaces. It shows that the multiscale ranking algorithm can achieve satisfactory learning rates under mild conditions. Experiments demonstrate the effectiveness of the proposed method for drug discovery and recommendation tasks.

  17. Wilcoxon's signed-rank statistic: what null hypothesis and why it matters.

    PubMed

    Li, Heng; Johnson, Terri

    2014-01-01

    In statistical literature, the term 'signed-rank test' (or 'Wilcoxon signed-rank test') has been used to refer to two distinct tests: a test for symmetry of distribution and a test for the median of a symmetric distribution, sharing a common test statistic. To avoid potential ambiguity, we propose to refer to those two tests by different names, as 'test for symmetry based on signed-rank statistic' and 'test for median based on signed-rank statistic', respectively. The utility of such terminological differentiation should become evident through our discussion of how those tests connect and contrast with sign test and one-sample t-test. Published 2014. This article is a U.S. Government work and is in the public domain in the USA. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
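
    A short example of the shared test statistic discussed above, using scipy; the paired data are simulated, and the two readings of the result are spelled out in the trailing comment.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        before = rng.normal(loc=120, scale=10, size=30)
        after = before - rng.normal(loc=2, scale=5, size=30)   # paired measurements
        diff = after - before

        stat, p = stats.wilcoxon(diff)        # signed-rank statistic on the differences
        print(f"signed-rank statistic={stat:.1f}, p={p:.4f}")

        # Interpretation depends on assumptions: without symmetry this is evidence
        # against symmetry of `diff` about 0; with symmetry assumed, it is evidence
        # that the median of `diff` differs from 0.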

  18. Rank Diversity of Languages: Generic Behavior in Computational Linguistics

    PubMed Central

    Cocho, Germinal; Flores, Jorge; Gershenson, Carlos; Pineda, Carlos; Sánchez, Sergio

    2015-01-01

    Statistical studies of languages have focused on the rank-frequency distribution of words. Instead, we introduce here a measure of how word ranks change in time and call this distribution rank diversity. We calculate this diversity for books published in six European languages since 1800, and find that it follows a universal lognormal distribution. Based on the mean and standard deviation associated with the lognormal distribution, we define three different word regimes of languages: “heads” consist of words which almost do not change their rank in time, “bodies” are words of general use, while “tails” are comprised by context-specific words and vary their rank considerably in time. The heads and bodies reflect the size of language cores identified by linguists for basic communication. We propose a Gaussian random walk model which reproduces the rank variation of words in time and thus the diversity. Rank diversity of words can be understood as the result of random variations in rank, where the size of the variation depends on the rank itself. We find that the core size is similar for all languages studied. PMID:25849150
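
    A toy sketch of the rank-diversity measure described above: for each rank, count how many distinct words occupy it across time slices, normalized by the number of slices; the simulated frequencies stand in for yearly word counts.

        import numpy as np

        rng = np.random.default_rng(4)
        vocab = [f"w{i}" for i in range(500)]
        n_slices = 50

        rankings = []
        for t in range(n_slices):
            # simulated frequencies drift over time; real input would be yearly counts
            freq = np.arange(len(vocab), 0, -1) + rng.normal(
                scale=5 + 0.5 * np.arange(len(vocab)), size=len(vocab))
            rankings.append([vocab[i] for i in np.argsort(freq)[::-1]])

        rank_diversity = {
            k: len({ranking[k] for ranking in rankings}) / n_slices
            for k in range(len(vocab))
        }
        # head ranks change rarely, tail ranks change often
        print("diversity at ranks 1, 10, 100:",
              rank_diversity[0], rank_diversity[9], rank_diversity[99])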

  19. Rank diversity of languages: generic behavior in computational linguistics.

    PubMed

    Cocho, Germinal; Flores, Jorge; Gershenson, Carlos; Pineda, Carlos; Sánchez, Sergio

    2015-01-01

    Statistical studies of languages have focused on the rank-frequency distribution of words. Instead, we introduce here a measure of how word ranks change in time and call this distribution rank diversity. We calculate this diversity for books published in six European languages since 1800, and find that it follows a universal lognormal distribution. Based on the mean and standard deviation associated with the lognormal distribution, we define three different word regimes of languages: "heads" consist of words which almost do not change their rank in time, "bodies" are words of general use, while "tails" are comprised by context-specific words and vary their rank considerably in time. The heads and bodies reflect the size of language cores identified by linguists for basic communication. We propose a Gaussian random walk model which reproduces the rank variation of words in time and thus the diversity. Rank diversity of words can be understood as the result of random variations in rank, where the size of the variation depends on the rank itself. We find that the core size is similar for all languages studied.

  20. Case-Mix Adjusting Performance Measures in a Veteran Population: Pharmacy- and Diagnosis-Based Approaches

    PubMed Central

    Liu, Chuan-Fen; Sales, Anne E; Sharp, Nancy D; Fishman, Paul; Sloan, Kevin L; Todd-Stenberg, Jeff; Nichol, W Paul; Rosen, Amy K; Loveland, Susan

    2003-01-01

    Objective To compare the rankings for health care utilization performance measures at the facility level in a Veterans Health Administration (VHA) health care delivery network using pharmacy- and diagnosis-based case-mix adjustment measures. Data Sources/Study Setting The study included veterans who used inpatient or outpatient services in Veterans Integrated Service Network (VISN) 20 during fiscal year 1998 (October 1997 to September 1998; N=126,076). Utilization and pharmacy data were extracted from VHA national databases and the VISN 20 data warehouse. Study Design We estimated concurrent regression models using pharmacy or diagnosis information in the base year (FY1998) to predict health service utilization in the same year. Utilization measures included bed days of care for inpatient care and provider visits for outpatient care. Principal Findings Rankings of predicted utilization measures across facilities vary by case-mix adjustment measure. There is greater consistency within the diagnosis-based models than between the diagnosis- and pharmacy-based models. The eight facilities were ranked differently by the diagnosis- and pharmacy-based models. Conclusions Choice of case-mix adjustment measure affects rankings of facilities on performance measures, raising concerns about the validity of profiling practices. Differences in rankings may reflect differences in comparability of data capture across facilities between pharmacy and diagnosis data sources, and unstable estimates due to small numbers of patients in a facility. PMID:14596393

  1. Plus Disease in Retinopathy of Prematurity: Improving Diagnosis by Ranking Disease Severity and Using Quantitative Image Analysis.

    PubMed

    Kalpathy-Cramer, Jayashree; Campbell, J Peter; Erdogmus, Deniz; Tian, Peng; Kedarisetti, Dharanish; Moleta, Chace; Reynolds, James D; Hutcheson, Kelly; Shapiro, Michael J; Repka, Michael X; Ferrone, Philip; Drenser, Kimberly; Horowitz, Jason; Sonmez, Kemal; Swan, Ryan; Ostmo, Susan; Jonas, Karyn E; Chan, R V Paul; Chiang, Michael F

    2016-11-01

    To determine expert agreement on relative retinopathy of prematurity (ROP) disease severity and whether computer-based image analysis can model relative disease severity, and to propose consideration of a more continuous severity score for ROP. We developed 2 databases of clinical images of varying disease severity (100 images and 34 images) as part of the Imaging and Informatics in ROP (i-ROP) cohort study and recruited expert physician, nonexpert physician, and nonphysician graders to classify and perform pairwise comparisons on both databases. Six participating expert ROP clinician-scientists, each with a minimum of 10 years of clinical ROP experience and 5 ROP publications, and 5 image graders (3 physicians and 2 nonphysician graders) who analyzed images that were obtained during routine ROP screening in neonatal intensive care units. Images in both databases were ranked by average disease classification (classification ranking), by pairwise comparison using the Elo rating method (comparison ranking), and by correlation with the i-ROP computer-based image analysis system. Interexpert agreement (weighted κ statistic) compared with the correlation coefficient (CC) between experts on pairwise comparisons and correlation between expert rankings and computer-based image analysis modeling. There was variable interexpert agreement on diagnostic classification of disease (plus, preplus, or normal) among the 6 experts (mean weighted κ, 0.27; range, 0.06-0.63), but good correlation between experts on comparison ranking of disease severity (mean CC, 0.84; range, 0.74-0.93) on the set of 34 images. Comparison ranking provided a severity ranking that was in good agreement with ranking obtained by classification ranking (CC, 0.92). Comparison ranking on the larger dataset by both expert and nonexpert graders demonstrated good correlation (mean CC, 0.97; range, 0.95-0.98). The i-ROP system was able to model this continuous severity with good correlation (CC, 0.86). Experts diagnose plus disease on a continuum, with poor absolute agreement on classification but good relative agreement on disease severity. These results suggest that the use of pairwise rankings and a continuous severity score, such as that provided by the i-ROP system, may improve agreement on disease severity in the future. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
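
    A minimal Elo-style sketch of the pairwise comparison ranking mentioned above; the latent severities, the comparison schedule, and the K-factor are invented for illustration and are not the study's parameters.

        import numpy as np

        rng = np.random.default_rng(5)
        n_images = 34
        severity = rng.normal(size=n_images)              # hidden "true" severity
        rating = np.full(n_images, 1500.0)                # Elo ratings
        K = 16.0                                          # update step

        for _ in range(3000):
            i, j = rng.choice(n_images, size=2, replace=False)
            expected_i = 1.0 / (1.0 + 10 ** ((rating[j] - rating[i]) / 400.0))
            outcome_i = 1.0 if severity[i] > severity[j] else 0.0   # "i looks worse"
            rating[i] += K * (outcome_i - expected_i)
            rating[j] += K * ((1.0 - outcome_i) - (1.0 - expected_i))

        elo_order = np.argsort(rating)[::-1]              # continuous severity ordering
        true_order = np.argsort(severity)[::-1]
        print("top 5 by Elo:", elo_order[:5], " top 5 true:", true_order[:5])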

  2. Google matrix of business process management

    NASA Astrophysics Data System (ADS)

    Abel, M. W.; Shepelyansky, D. L.

    2011-12-01

    Development of efficient business process models and determination of their characteristic properties are subject of intense interdisciplinary research. Here, we consider a business process model as a directed graph. Its nodes correspond to the units identified by the modeler and the link direction indicates the causal dependencies between units. It is of primary interest to obtain the stationary flow on such a directed graph, which corresponds to the steady-state of a firm during the business process. Following the ideas developed recently for the World Wide Web, we construct the Google matrix for our business process model and analyze its spectral properties. The importance of nodes is characterized by PageRank and recently proposed CheiRank and 2DRank, respectively. The results show that this two-dimensional ranking gives a significant information about the influence and communication properties of business model units. We argue that the Google matrix method, described here, provides a new efficient tool helping companies to make their decisions on how to evolve in the exceedingly dynamic global market.
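
    A small sketch of the two-dimensional ranking used above, assuming networkx: PageRank on the directed graph and CheiRank, i.e. PageRank of the graph with all links reversed; the toy graph merely stands in for a business-process model.

        import networkx as nx

        G = nx.DiGraph([("order", "invoice"), ("order", "ship"),
                        ("invoice", "payment"), ("ship", "delivery"),
                        ("payment", "archive"), ("delivery", "archive"),
                        ("archive", "order")])

        pagerank = nx.pagerank(G, alpha=0.85)                     # importance as a sink of flow
        cheirank = nx.pagerank(G.reverse(copy=True), alpha=0.85)  # importance as a source of flow

        for node in G:
            print(f"{node:9s} PageRank={pagerank[node]:.3f} CheiRank={cheirank[node]:.3f}")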

  3. Rank-based decompositions of morphological templates.

    PubMed

    Sussner, P; Ritter, G X

    2000-01-01

    Methods for matrix decomposition have found numerous applications in image processing, in particular for the problem of template decomposition. Since existing matrix decomposition techniques are mainly concerned with the linear domain, we consider it timely to investigate matrix decomposition techniques in the nonlinear domain with applications in image processing. The mathematical basis for these investigations is the new theory of rank within minimax algebra. Thus far, only minimax decompositions of rank 1 and rank 2 matrices into outer product expansions are known to the image processing community. We derive a heuristic algorithm for the decomposition of matrices having arbitrary rank.

  4. Prediction of Fontan-Associated Liver Disease Using a Novel Cine Magnetic Resonance Imaging "Vortex Flow Map" in the Right Atrium.

    PubMed

    Ishizaki, Umiko; Nagao, Michinobu; Shiina, Yumi; Fukushima, Kenji; Takahashi, Tatsunori; Shimomiya, Yamato; Matsuo, Yuka; Inai, Kei; Sakai, Shuji

    2018-05-18

    Long-term hepatic dysfunction is an increasingly recognized complication of the Fontan operation for univentricular hearts. The purpose of this study was to determine whether Fontan-associated liver disease (FALD) could be predicted by flow dynamics in the right atrium (RA) of Fontan circulation. Methods and Results: Cardiac MRI and the serum levels of total bilirubin (TBil) and hyaluronic acid (HA) were analyzed in 36 patients who underwent an atriopulmonary connection type of Fontan operation. The mean follow-up period was 53 months. Three views (axial, coronal, and sagittal) of the cine images were scanned for the maximum cross-section of the RA obtained with a 1.5-Tesla scanner. We developed a "vortex flow map" to demonstrate the ratio of the circumferential voxel movement in each phase to the total movement throughout a cardiac cycle towards the center of the RA. The maximum ratio was used as the magnitude of vortex flow (MVF%) in the 3 views of the RA cine imaging. Patients with coronal MVF ≥13.6% had significantly lower free rates of TBil ≥1.8 mg/dL than those with coronal MVF <13.6% (log-rank value=4.50; P<0.05; hazard ratio=4.54). Patients with sagittal MVF ≥14.0% had significantly lower free rates of HA ≥50 ng/mL than those with sagittal MVF <14.0% (log-rank value=4.40; P<0.05; hazard ratio=4.12). A reduced vortex flow in the RA during the late phase of the Fontan operation was associated with the development of FALD. MVF can be used as an imaging biomarker to predict FALD.

  5. Optimizing Estimated Loss Reduction for Active Sampling in Rank Learning

    DTIC Science & Technology

    2008-01-01

    active learning framework for SVM-based and boosting-based rank learning. Our approach suggests sampling based on maximizing the estimated loss differential over unlabeled data. Experimental results on two benchmark corpora show that the proposed model substantially reduces the labeling effort, and achieves superior performance rapidly with as much as 30% relative improvement over the margin-based sampling

  6. Prioritising and planning of urban stormwater treatment in the Alna watercourse in Oslo.

    PubMed

    Nordeidet, B; Nordeide, T; Astebøl, S O; Hvitved-Jacobsen, T

    2004-12-01

    The Oslo municipal Water and Sewage Works (VAV) intends to improve the water quality in the Alna watercourse, in particular, with regards to the biological diversity. In order to reduce existing discharges of polluted urban stormwater, a study has been carried out to rank subcatchment areas in descending order of magnitude and to assess possible measures. An overall ranking methodology was developed in order to identify and select the most suitable subcatchment areas for further assessment studies (74 subcatchment/drainage areas). The municipality's comprehensive geographical information system (GIS) was applied as a base for the ranking. A weighted ranking based on three selected parameters was chosen from several major influencing factors, namely total yearly discharge (kg pollution/year), specific pollution discharge (kg/area/year) and existing stormwater system (pipe lengths/area). Results show that the highest 15 ranked catchment areas accounted for 70% of the total calculated pollution load of heavy metals. The highest ranked areas are strongly influenced by three major highways. Based on the results from similar field studies, it would be possible to remove 75-85% of total solids and about 50-80% of heavy metals using wet detention ponds as Best Available Technology (BAT). Based on the final ranking, two subcatchment areas were selected for further practical assessment of possible measures. VAV plans to use wet detention ponds, in combination with other measures when relevant, to treat the urban runoff. Using calculated loading and aerial photographs (all done in the same GIS environment), a preliminary sketch design and location of ponds were performed. The resulting GIS methodology for urban stormwater management will be used as input to a holistic and long-term planning process for the management of the watercourse, taking into account future urban development and other pollution sources.
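
    A hedged sketch of the weighted three-parameter ranking described above; the weights, the min-max standardization, and the small table are invented for illustration and do not reproduce the study's values.

        import pandas as pd

        df = pd.DataFrame({
            "subcatchment": ["A", "B", "C", "D"],
            "total_load_kg_yr": [420.0, 150.0, 610.0, 90.0],     # yearly pollutant discharge
            "specific_load_kg_ha_yr": [3.1, 4.8, 2.2, 1.0],      # discharge per unit area
            "pipe_length_per_area": [0.8, 0.3, 0.9, 0.2],        # existing stormwater system
        })

        weights = {"total_load_kg_yr": 0.5, "specific_load_kg_ha_yr": 0.3,
                   "pipe_length_per_area": 0.2}                  # assumed weights

        # min-max standardize each parameter, then combine with the weights
        score = sum(
            w * (df[col] - df[col].min()) / (df[col].max() - df[col].min())
            for col, w in weights.items()
        )
        ranked = df.assign(score=score).sort_values("score", ascending=False)
        print(ranked[["subcatchment", "score"]])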

  7. Multi-dimensional Rankings, Program Termination, and Complexity Bounds of Flowchart Programs

    NASA Astrophysics Data System (ADS)

    Alias, Christophe; Darte, Alain; Feautrier, Paul; Gonnord, Laure

    Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Previous algorithms based on affine rankings either are applicable only to simple loops (i.e., single-node flowcharts) and rely on enumeration, or are not complete in the sense that they are not guaranteed to find a ranking in the class of functions they consider, if one exists. Our first contribution is to propose an efficient algorithm to compute ranking functions: It can handle flowcharts of arbitrary structure, the class of candidate rankings it explores is larger, and our method, although greedy, is provably complete. Our second contribution is to show how to use the ranking functions we generate to get upper bounds for the computational complexity (number of transitions) of the source program. This estimate is a polynomial, which means that we can handle programs with more than linear complexity. We applied the method on a collection of test cases from the literature. We also show the links and differences with previous techniques based on the insertion of counters.

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or timing of cash flows are uncertain and are not fixed under § 436.14, Federal agencies may examine the impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order...

  9. Validation of an image-based technique to assess the perceptual quality of clinical chest radiographs with an observer study

    NASA Astrophysics Data System (ADS)

    Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan

    2014-03-01

    We previously proposed a novel image-based quality assessment technique [1] to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study, i.e., lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row and observers were asked to rank the images only based on a designated quality and disregard the other qualities. A range slider on the top of the images was used for observers to indicate the acceptable range based on the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study on a DICOM-calibrated diagnostic display workstation and under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders completely accorded with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.

  10. Functional renormalization group analysis of tensorial group field theories on R^d

    NASA Astrophysics Data System (ADS)

    Geloun, Joseph Ben; Martini, Riccardo; Oriti, Daniele

    2016-07-01

    Rank-d tensorial group field theories are quantum field theories (QFTs) defined on a group manifold G^{×d}, which represent a nonlocal generalization of standard QFT and a candidate formalism for quantum gravity, since, when endowed with appropriate data, they can be interpreted as defining a field theoretic description of the fundamental building blocks of quantum spacetime. Their renormalization analysis is crucial both for establishing their consistency as quantum field theories and for studying the emergence of continuum spacetime and geometry from them. In this paper, we study the renormalization group flow of two simple classes of tensorial group field theories (TGFTs), defined for the group G = R for arbitrary rank, both without and with gauge invariance conditions, by means of functional renormalization group techniques. The issue of IR divergences is tackled by the definition of a proper thermodynamic limit for TGFTs. We map the phase diagram of such models, in a simple truncation, and identify both UV and IR fixed points of the RG flow. Encouragingly, for all the models we study, we find evidence for the existence of a phase transition of condensation type.

  11. Disease Specific Productivity of American Cancer Hospitals

    PubMed Central

    Goldstein, Jeffery A.; Prasad, Vinay

    2015-01-01

    Context: Research-oriented cancer hospitals in the United States treat and study patients with a range of diseases. Measures of disease specific research productivity, and comparison to overall productivity, are currently lacking. Hypothesis: Different institutions are specialized in research of particular diseases. Objective: To report disease specific productivity of American cancer hospitals, and propose a summary measure. Method: We conducted a retrospective observational survey of the 50 highest ranked cancer hospitals in the 2013 US News and World Report rankings. We performed an automated search of PubMed and Clinicaltrials.gov for published reports and registrations of clinical trials (respectively) addressing specific cancers between 2008 and 2013. We calculated the summed impact factor for the publications. We generated a summary measure of productivity based on the number of Phase II clinical trials registered and the impact factor of Phase II clinical trials published for each institution and disease pair. We generated rankings based on this summary measure. Results: We identified 6076 registered trials and 6516 published trials with a combined impact factor of 44280.4, involving 32 different diseases over the 50 institutions. Using a summary measure based on registered and published clinical trials, we ranked institutions in specific diseases. As expected, different institutions were highly ranked in disease-specific productivity for different diseases. 43 institutions appeared in the top 10 ranks for at least 1 disease (vs 10 in the overall list), while 6 different institutions were ranked number 1 in at least 1 disease (vs 1 in the overall list). Conclusion: Research productivity varies considerably among the sample. Overall cancer productivity conceals great variation between diseases. Disease specific rankings identify sites of high academic productivity, which may be of interest to physicians, patients and researchers. PMID:25781329

  12. Disease specific productivity of american cancer hospitals.

    PubMed

    Goldstein, Jeffery A; Prasad, Vinay

    2015-01-01

    Research-oriented cancer hospitals in the United States treat and study patients with a range of diseases. Measures of disease specific research productivity, and comparison to overall productivity, are currently lacking. Different institutions are specialized in research of particular diseases. To report disease specific productivity of American cancer hospitals, and propose a summary measure. We conducted a retrospective observational survey of the 50 highest ranked cancer hospitals in the 2013 US News and World Report rankings. We performed an automated search of PubMed and Clinicaltrials.gov for published reports and registrations of clinical trials (respectively) addressing specific cancers between 2008 and 2013. We calculated the summed impact factor for the publications. We generated a summary measure of productivity based on the number of Phase II clinical trials registered and the impact factor of Phase II clinical trials published for each institution and disease pair. We generated rankings based on this summary measure. We identified 6076 registered trials and 6516 published trials with a combined impact factor of 44280.4, involving 32 different diseases over the 50 institutions. Using a summary measure based on registered and published clinical trials, we ranked institutions in specific diseases. As expected, different institutions were highly ranked in disease-specific productivity for different diseases. 43 institutions appeared in the top 10 ranks for at least 1 disease (vs 10 in the overall list), while 6 different institutions were ranked number 1 in at least 1 disease (vs 1 in the overall list). Research productivity varies considerably among the sample. Overall cancer productivity conceals great variation between diseases. Disease specific rankings identify sites of high academic productivity, which may be of interest to physicians, patients and researchers.

  13. Iterative deblending of simultaneous-source data using a coherency-pass shaping operator

    NASA Astrophysics Data System (ADS)

    Zu, Shaohuan; Zhou, Hui; Mao, Weijian; Zhang, Dong; Li, Chao; Pan, Xiao; Chen, Yangkang

    2017-10-01

    Simultaneous-source acquisition offers substantial economic savings, but it introduces the challenge of removing crosstalk interference from the recorded seismic data. In this paper, we propose a novel iterative method to separate the simultaneous source data based on a coherency-pass shaping operator. The coherency-pass filter is used to constrain the model, that is, the unblended data to be estimated, in the shaping regularization framework. In the simultaneous source survey, the incoherent interference from adjacent shots greatly increases the rank of the frequency domain Hankel matrix that is formed from the blended record. Thus, the method based on rank reduction is capable of separating the blended record to some extent. However, the shortcoming is that it may cause residual noise when there is strong blending interference. We propose to cascade the rank reduction and thresholding operators to deal with this issue. In the initial iterations, we adopt a small rank to severely separate the blended interference and a large thresholding value as strong constraints to remove the residual noise in the time domain. In the later iterations, since more and more events have been recovered, we weaken the constraint by increasing the rank and shrinking the threshold to recover weak events and to guarantee the convergence. In this way, the combined rank reduction and thresholding strategy acts as a coherency-pass filter, which only passes the coherent high-amplitude component after rank reduction instead of passing both signal and noise in traditional rank reduction based approaches. Two synthetic examples are tested to demonstrate the performance of the proposed method. In addition, the application on two field data sets (common receiver gathers and stacked profiles) further validates the effectiveness of the proposed method.
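
    A simplified sketch of the cascaded rank-reduction-plus-thresholding ("coherency-pass") idea described above, applied to a synthetic panel; it omits the frequency-domain Hankel construction and the full deblending workflow, and all parameters are invented.

        import numpy as np

        rng = np.random.default_rng(6)
        nt, nx = 200, 60
        signal = rng.normal(size=(nt, 3)) @ rng.normal(size=(3, nx))             # coherent (low-rank) events
        spikes = 5.0 * rng.normal(size=(nt, nx)) * (rng.random((nt, nx)) < 0.1)  # incoherent blending noise
        blended = signal + spikes

        estimate = np.zeros_like(blended)
        n_iter = 10
        for k in range(n_iter):
            update = estimate + 0.5 * (blended - estimate)            # simple data-consistency step
            rank_k = min(5, 1 + k)                                    # small rank first, relaxed later
            U, s, Vt = np.linalg.svd(update, full_matrices=False)
            low_rank = (U[:, :rank_k] * s[:rank_k]) @ Vt[:rank_k, :]  # rank-reduction pass
            thresh = (1.0 - k / n_iter) * np.percentile(np.abs(low_rank), 95)    # large threshold first
            estimate = np.where(np.abs(low_rank) > thresh, low_rank, 0.0)        # thresholding pass

        err = np.linalg.norm(estimate - signal) / np.linalg.norm(signal)
        print("relative error of the recovered coherent component:", round(float(err), 3))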

  14. Podium: Ranking Data Using Mixed-Initiative Visual Analytics.

    PubMed

    Wall, Emily; Das, Subhajit; Chawla, Ravish; Kalidindi, Bharath; Brown, Eli T; Endert, Alex

    2018-01-01

    People often rank and order data points as a vital part of making decisions. Multi-attribute ranking systems are a common tool used to make these data-driven decisions. Such systems often take the form of a table-based visualization in which users assign weights to the attributes representing the quantifiable importance of each attribute to a decision, which the system then uses to compute a ranking of the data. However, these systems assume that users are able to quantify their conceptual understanding of how important particular attributes are to a decision. This is not always easy or even possible for users to do. Rather, people often have a more holistic understanding of the data. They form opinions that data point A is better than data point B but do not necessarily know which attributes are important. To address these challenges, we present a visual analytic application to help people rank multi-variate data points. We developed a prototype system, Podium, that allows users to drag rows in the table to rank order data points based on their perception of the relative value of the data. Podium then infers a weighting model using Ranking SVM that satisfies the user's data preferences as closely as possible. Whereas past systems help users understand the relationships between data points based on changes to attribute weights, our approach helps users to understand the attributes that might inform their understanding of the data. We present two usage scenarios to describe some of the potential uses of our proposed technique: (1) understanding which attributes contribute to a user's subjective preferences for data, and (2) deconstructing attributes of importance for existing rankings. Our proposed approach makes powerful machine learning techniques more usable to those who may not have expertise in these areas.
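
    A hedged sketch of the weight-inference step described above: pairwise preferences derived from a user-supplied row order are fed to a linear Ranking-SVM-style model; the data, attributes, and "mental model" weights are invented for illustration.

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(7)
        X = rng.random((20, 4))                      # 20 data points, 4 attributes
        hidden_w = np.array([0.6, 0.1, 0.25, 0.05])  # weights behind the user's ranking
        user_order = np.argsort(X @ hidden_w)[::-1]  # rows as the user dragged them (best first)

        # Build pairwise difference vectors: x_i - x_j labeled +1 when i is ranked above j.
        pairs, labels = [], []
        for a in range(len(user_order)):
            for b in range(a + 1, len(user_order)):
                i, j = user_order[a], user_order[b]
                pairs.append(X[i] - X[j]); labels.append(1)
                pairs.append(X[j] - X[i]); labels.append(-1)

        svm = LinearSVC(C=1.0, fit_intercept=False, max_iter=10000).fit(np.array(pairs), labels)
        w = svm.coef_.ravel()
        print("recovered attribute weights (normalized):", np.round(w / np.abs(w).sum(), 2))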

  15. Understanding perceived availability and importance of tobacco control interventions to inform European adoption of a UK economic model: a cross-sectional study.

    PubMed

    Kulchaitanaroaj, Puttarin; Kaló, Zoltán; West, Robert; Cheung, Kei Long; Evers, Silvia; Vokó, Zoltán; Hiligsmann, Mickael; de Vries, Hein; Owen, Lesley; Trapero-Bertran, Marta; Leidl, Reiner; Pokhrel, Subhash

    2018-02-14

    The evidence on the extent to which stakeholders in different European countries agree with availability and importance of tobacco-control interventions is limited. This study assessed and compared stakeholders' views from five European countries and compared the perceived ranking of interventions with evidence-based ranking using cost-effectiveness data. An interview survey (face-to-face, by phone or Skype) was conducted between April and July 2014 with five categories of stakeholders - decision makers, service purchasers, service providers, evidence generators and health promotion advocates - from Germany, Hungary, the Netherlands, Spain, and the United Kingdom. A list of potential stakeholders drawn from the research team's contacts and snowballing served as the sampling frame. An email invitation was sent to all stakeholders in this list and recruitment was based on positive replies. Respondents were asked to rate availability and importance of 30 tobacco control interventions. Kappa coefficients assessed agreement of stakeholders' views. A mean importance score for each intervention was used to rank the interventions. This ranking was compared with the ranking based on cost-effectiveness data from a published review. Ninety-three stakeholders (55.7% response rate) completed the survey: 18.3% were from Germany, 17.2% from Hungary, 30.1% from the Netherlands, 19.4% from Spain, and 15.1% from the UK. Of those, 31.2% were decision makers, 26.9% evidence generators, 19.4% service providers, 15.1% health-promotion advocates, and 7.5% purchasers of services/pharmaceutical products. Smoking restrictions in public areas were rated as the most important intervention (mean score = 1.89). The agreement on availability of interventions between the stakeholders was very low (kappa = 0.098; 95% CI = [0.085, 0.111] but the agreement on the importance of the interventions was fair (kappa = 0.239; 95% CI = [0.208, 0.253]). A correlation was found between availability and importance rankings for stage-based interventions. The importance ranking was not statistically concordant with the ranking based on published cost-effectiveness data (Kendall rank correlation coefficient = 0.40; p-value = 0.11; 95% CI = [- 0.09, 0.89]). The intrinsic differences in stakeholder views must be addressed while transferring economic evidence Europe-wide. Strong engagement with stakeholders, focussing on better communication, has a potential to mitigate this challenge.

  16. The Marketing of Canadian University Rankings: A Misadventure Now 24 Years Old

    ERIC Educational Resources Information Center

    Cramer, Kenneth M.; Page, Stewart; Burrows, Vanessa; Lamoureux, Chastine; Mackay, Sarah; Pedri, Victoria; Pschibul, Rebecca

    2016-01-01

    Based on analyses of Maclean's ranking data pertaining to Canadian universities published over the last 24 years, we present a summary of statistical findings of annual ranking exercises, as well as discussion about their current status and the effects upon student welfare. Some illustrative tables are also presented. Using correlational and…

  17. International Students' and Employers' Use of Rankings: A Cross-National Analysis

    ERIC Educational Resources Information Center

    Souto-Otero, Manuel; Enders, Jürgen

    2017-01-01

    The article examines, primarily based on large-scale survey data, the functionalist proposition that HE customers, students and employers, demand rankings to be able to adopt informed decisions on where to study and who to recruit respectively. This is contrasted to a Weberian "conflict" perspective on rankings in which positional…

  18. Who Should Rank Our Journals...And Based on What?

    ERIC Educational Resources Information Center

    Cherkowski, Sabre; Currie, Russell; Hilton, Sandy

    2012-01-01

    Purpose: This study aims to establish the use of active scholar assessment (ASA) in the field of education leadership as a new methodology in ranking administration and leadership journals. The secondary purpose of this study is to respond to the paucity of research on journal ranking in educational administration and leadership.…

  19. A scoring mechanism for the rank aggregation of network robustness

    NASA Astrophysics Data System (ADS)

    Yazdani, Alireza; Dueñas-Osorio, Leonardo; Li, Qilin

    2013-10-01

    To date, a number of metrics have been proposed to quantify inherent robustness of network topology against failures. However, each single metric usually only offers a limited view of network vulnerability to different types of random failures and targeted attacks. When applied to certain network configurations, different metrics rank network topology robustness in different orders, which is rather inconsistent, and no single metric fully characterizes network robustness against different modes of failure. To overcome such inconsistency, this work proposes a multi-metric approach as the basis of evaluating an aggregate ranking of network topology robustness. This is based on simultaneous utilization of a minimal set of distinct robustness metrics that are standardized so as to allow a direct comparison of vulnerability across networks with different sizes and configurations, hence leading to an initial scoring of inherent topology robustness. Subsequently, based on this initial scoring, a rank aggregation method is employed to allocate an overall ranking of robustness to each network topology. A discussion is presented in support of the presented multi-metric approach and its applications to more realistically assess and rank network topology robustness.
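
    A small sketch of the multi-metric aggregation idea, under assumptions of our own: three networkx robustness-related metrics and a mean-rank (Borda-style) aggregation stand in for the paper's metric set and aggregation rule.

        import networkx as nx

        networks = {
            "erdos_renyi": nx.erdos_renyi_graph(100, 0.08, seed=0),
            "barabasi_albert": nx.barabasi_albert_graph(100, 4, seed=0),
            "watts_strogatz": nx.watts_strogatz_graph(100, 8, 0.1, seed=0),
        }

        metrics = {
            "node_connectivity": nx.node_connectivity,      # resistance to targeted removal
            "average_clustering": nx.average_clustering,    # local redundancy
            "global_efficiency": nx.global_efficiency,      # overall reachability
        }

        scores = {name: {m: f(G) for m, f in metrics.items()} for name, G in networks.items()}

        # Rank the topologies under each metric, then aggregate by mean rank.
        mean_rank = {name: 0.0 for name in networks}
        for m in metrics:
            ordered = sorted(networks, key=lambda name: scores[name][m], reverse=True)
            for r, name in enumerate(ordered, start=1):
                mean_rank[name] += r / len(metrics)

        for name, r in sorted(mean_rank.items(), key=lambda kv: kv[1]):
            print(f"{name:16s} mean rank = {r:.2f}")   # lower = more robust overall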

  20. Comparative Case Studies on Indonesian Higher Education Rankings

    NASA Astrophysics Data System (ADS)

    Kurniasih, Nuning; Hasyim, C.; Wulandari, A.; Setiawan, M. I.; Ahmar, A. S.

    2018-01-01

    The quality of higher education is the result of a continuous process. Many indicators can be used to assess the quality of a higher education institution, and different indicators yield different university rankings. This research aims to find variables that connect the ranking indicators used by the Indonesian Ministry of Research, Technology, and Higher Education with those used by international rankings, taking two ranking systems, Webometrics and 4icu, as cases. The research uses a qualitative method with a comparative case studies approach. The results show that, to bridge the indicators used by the Indonesian Ministry of Research, Technology, and Higher Education with web-based ranking systems such as Webometrics and 4icu, Indonesian higher education institutions need to open access to their scientific and non-scientific output in a web-based environment. One strategy for improving openness of and access to a university's scientific work is to engage in open science and collaboration.

  1. Emergency assessment of postwildfire debris-flow hazards for the 2011 Motor Fire, Sierra and Stanislaus National Forests, California

    USGS Publications Warehouse

    Cannon, Susan H.; Michael, John A.

    2011-01-01

    This report presents an emergency assessment of potential debris-flow hazards from basins burned by the 2011 Motor fire in the Sierra and Stanislaus National Forests, Calif. Statistical-empirical models are used to estimate the probability and volume of debris flows that may be produced from burned drainage basins as a function of different measures of basin burned extent, gradient, and soil physical properties, and in response to a 30-minute-duration, 10-year-recurrence rainstorm. Debris-flow probability and volume estimates are then combined to form a relative hazard ranking for each basin. This assessment provides critical information for issuing warnings, locating and designing mitigation measures, and planning evacuation timing and routes within the first two years following the fire.

  2. Ranking Specific Sets of Objects.

    PubMed

    Maly, Jan; Woltran, Stefan

    2017-01-01

    Ranking sets of objects based on an order between the single elements has been thoroughly studied in the literature. In particular, it has been shown that it is in general impossible to find a total ranking - jointly satisfying properties as dominance and independence - on the whole power set of objects. However, in many applications certain elements from the entire power set might not be required and can be neglected in the ranking process. For instance, certain sets might be ruled out due to hard constraints or are not satisfying some background theory. In this paper, we treat the computational problem whether an order on a given subset of the power set of elements satisfying different variants of dominance and independence can be found, given a ranking on the elements. We show that this problem is tractable for partial rankings and NP-complete for total rankings.

  3. Bias and Stability of Single Variable Classifiers for Feature Ranking and Selection

    PubMed Central

    Fakhraei, Shobeir; Soltanian-Zadeh, Hamid; Fotouhi, Farshad

    2014-01-01

    Feature rankings are often used for supervised dimension reduction especially when discriminating power of each feature is of interest, dimensionality of dataset is extremely high, or computational power is limited to perform more complicated methods. In practice, it is recommended to start dimension reduction via simple methods such as feature rankings before applying more complex approaches. Single Variable Classifier (SVC) ranking is a feature ranking based on the predictive performance of a classifier built using only a single feature. While benefiting from capabilities of classifiers, this ranking method is not as computationally intensive as wrappers. In this paper, we report the results of an extensive study on the bias and stability of such feature ranking method. We study whether the classifiers influence the SVC rankings or the discriminative power of features themselves has a dominant impact on the final rankings. We show the common intuition of using the same classifier for feature ranking and final classification does not always result in the best prediction performance. We then study if heterogeneous classifiers ensemble approaches provide more unbiased rankings and if they improve final classification performance. Furthermore, we calculate an empirical prediction performance loss for using the same classifier in SVC feature ranking and final classification from the optimal choices. PMID:25177107

  4. Bias and Stability of Single Variable Classifiers for Feature Ranking and Selection.

    PubMed

    Fakhraei, Shobeir; Soltanian-Zadeh, Hamid; Fotouhi, Farshad

    2014-11-01

    Feature rankings are often used for supervised dimension reduction especially when discriminating power of each feature is of interest, dimensionality of dataset is extremely high, or computational power is limited to perform more complicated methods. In practice, it is recommended to start dimension reduction via simple methods such as feature rankings before applying more complex approaches. Single Variable Classifier (SVC) ranking is a feature ranking based on the predictive performance of a classifier built using only a single feature. While benefiting from capabilities of classifiers, this ranking method is not as computationally intensive as wrappers. In this paper, we report the results of an extensive study on the bias and stability of such feature ranking method. We study whether the classifiers influence the SVC rankings or the discriminative power of features themselves has a dominant impact on the final rankings. We show the common intuition of using the same classifier for feature ranking and final classification does not always result in the best prediction performance. We then study if heterogeneous classifiers ensemble approaches provide more unbiased rankings and if they improve final classification performance. Furthermore, we calculate an empirical prediction performance loss for using the same classifier in SVC feature ranking and final classification from the optimal choices.
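
    A minimal sketch of Single Variable Classifier ranking as described above: each feature is scored by the cross-validated accuracy of a classifier trained on that feature alone, and features are ranked by that score. The choice of logistic regression and the example dataset are assumptions for illustration, not the classifiers or data used in the study.

```python
# Minimal SVC feature-ranking sketch (classifier and dataset are illustrative choices).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def svc_feature_ranking(X, y, cv=5):
    scores = []
    for j in range(X.shape[1]):
        # Train and evaluate a classifier on feature j alone.
        acc = cross_val_score(LogisticRegression(max_iter=1000), X[:, [j]], y, cv=cv).mean()
        scores.append(acc)
    return np.argsort(scores)[::-1]   # feature indices, best single-feature accuracy first

X, y = load_breast_cancer(return_X_y=True)
print(svc_feature_ranking(X, y)[:10])
```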

  5. Maryland's high cancer mortality rate: a review of contributing demographic factors.

    PubMed

    Freedman, D M

    1999-01-01

    For many years, Maryland has ranked among the top states in cancer mortality. This study analyzed mortality data from the National Center for Health Statistics (CDC-Wonder) to help explain Maryland's cancer rate and rank. Age-adjusted rates are based on deaths per 100,000 population from 1991 through 1995. Rates and ranks overall, and stratified by age, are calculated for total cancer mortality, as well as for four major sites: lung, breast, prostate, and colorectal. Because states differ in their racial/gender mix, race/gender rates among states are also compared. Although Maryland ranks seventh in overall cancer mortality, its rates and rank by race and gender subpopulation are less high. For those under 75, white men ranked 26th, black men ranked 20th, and black and white women ranked 12th and 10th, respectively. Maryland's overall rank, as with any state, is a function of the rates of its racial and gender subpopulations and the relative size of these groups in the state. Many of the disparities between Maryland's overall high cancer rank and its lower rank by subpopulation also characterize the major cancer sites. Although a stratified presentation of cancer rates and ranks may be more favorable to Maryland, it should not be used to downplay the attention cancer mortality in Maryland deserves.

  6. Minimizing the semantic gap in biomedical content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Guan, Haiying; Antani, Sameer; Long, L. Rodney; Thoma, George R.

    2010-03-01

    A major challenge in biomedical Content-Based Image Retrieval (CBIR) is to achieve meaningful mappings that minimize the semantic gap between the high-level biomedical semantic concepts and the low-level visual features in images. This paper presents a comprehensive learning-based scheme toward meeting this challenge and improving retrieval quality. The article presents two algorithms: a learning-based feature selection and fusion algorithm and the Ranking Support Vector Machine (Ranking SVM) algorithm. The feature selection algorithm aims to select 'good' features and fuse them using different similarity measurements to provide a better representation of the high-level concepts with the low-level image features. Ranking SVM is applied to learn the retrieval rank function and associate the selected low-level features with query concepts, given the ground-truth ranking of the training samples. The proposed scheme addresses four major issues in CBIR to improve the retrieval accuracy: image feature extraction, selection and fusion, similarity measurements, the association of the low-level features with high-level concepts, and the generation of the rank function to support high-level semantic image retrieval. It models the relationship between semantic concepts and image features, and enables retrieval at the semantic level. We apply it to the problem of vertebra shape retrieval from a digitized spine x-ray image set collected by the second National Health and Nutrition Examination Survey (NHANES II). The experimental results show an improvement of up to 41.92% in the mean average precision (MAP) over conventional image similarity computation methods.

  7. Learning to Rank the Severity of Unrepaired Cleft Lip Nasal Deformity on 3D Mesh Data.

    PubMed

    Wu, Jia; Tse, Raymond; Shapiro, Linda G

    2014-08-01

    Cleft lip is a birth defect that results in deformity of the upper lip and nose. Its severity is widely variable and the results of treatment are influenced by the initial deformity. Objective assessment of severity would help to guide prognosis and treatment. However, most assessments are subjective. The purpose of this study is to develop and test quantitative computer-based methods of measuring cleft lip severity. In this paper, a grid-patch based measurement of symmetry is introduced, with which a computer program learns to rank the severity of cleft lip on 3D meshes of human infant faces. Three computer-based methods to define the midfacial reference plane were compared to two manual methods. Four different symmetry features were calculated based upon these reference planes, and evaluated. The result shows that the rankings predicted by the proposed features were highly correlated with the ranking orders provided by experts that were used as the ground truth.

  8. A learning framework for age rank estimation based on face images with scattering transform.

    PubMed

    Chang, Kuang-Yu; Chen, Chu-Song

    2015-03-01

    This paper presents a cost-sensitive ordinal hyperplanes ranking algorithm for human age estimation based on face images. The proposed approach exploits relative-order information among the age labels for rank prediction. In our approach, the age rank is obtained by aggregating a series of binary classification results, where cost sensitivities among the labels are introduced to improve the aggregating performance. In addition, we give a theoretical analysis on designing the cost of individual binary classifier so that the misranking cost can be bounded by the total misclassification costs. An efficient descriptor, scattering transform, which scatters the Gabor coefficients and pooled with Gaussian smoothing in multiple layers, is evaluated for facial feature extraction. We show that this descriptor is a generalization of conventional bioinspired features and is more effective for face-based age inference. Experimental results demonstrate that our method outperforms the state-of-the-art age estimation approaches.
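
    The sketch below illustrates the general idea of obtaining an age rank by aggregating binary classifiers, one per threshold, as described above; it omits the paper's cost sensitivities and scattering-transform features, and the synthetic data, threshold scheme, and base classifier are assumptions.

```python
# Rank estimation by aggregating "is rank > k" binary classifiers (illustrative sketch).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                                # fake face features
ranks = np.digitize(X[:, 0], bins=[-1.5, -0.5, 0.5, 1.5])    # fake age ranks 0..4

# One binary classifier per threshold k.
classifiers = [LogisticRegression().fit(X, (ranks > k).astype(int))
               for k in range(ranks.max())]

def predict_rank(x):
    # Aggregate the binary votes; the paper additionally weights them by misranking costs.
    return int(sum(c.predict(x.reshape(1, -1))[0] for c in classifiers))

print(predict_rank(X[0]), ranks[0])
```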

  9. Assessment of increased sampling pump flow rates in a disposable, inhalable aerosol sampler

    PubMed Central

    Stewart, Justin; Sleeth, Darrah K.; Handy, Rod G.; Pahler, Leon F.; Anthony, T. Renee; Volckens, John

    2017-01-01

    A newly designed, low-cost, disposable inhalable aerosol sampler was developed to assess workers' personal exposure to inhalable particles. This sampler was originally designed to operate at 10 L/min to increase sample mass and, therefore, improve analytical detection limits for filter-based methods. Computational fluid dynamics modeling revealed that sampler performance (relative to aerosol inhalability criteria) would not differ substantially at sampler flows of 2 and 10 L/min. With this in mind, the newly designed inhalable aerosol sampler was tested in a wind tunnel, simultaneously, at flows of 2 and 10 L/min. A mannequin was equipped with 6 sampler/pump assemblies (three pumps operated at 2 L/min and three pumps at 10 L/min) inside a wind tunnel operated at 0.2 m/s, which has been shown to be a typical indoor workplace wind speed. In separate tests, four different particle sizes were injected to determine if the sampler's performance at the new 10 L/min flow rate differed significantly from that at 2 L/min. A comparison of inhalable mass concentrations using a Wilcoxon signed rank test found no significant difference in the concentration of particles sampled at 10 and 2 L/min for all particle sizes tested. Our results suggest that this new aerosol sampler is a versatile tool that can improve exposure assessment capabilities for the practicing industrial hygienist by improving the limit of detection and allowing for shorter sampling times. PMID:27676440

  10. Assessment of increased sampling pump flow rates in a disposable, inhalable aerosol sampler.

    PubMed

    Stewart, Justin; Sleeth, Darrah K; Handy, Rod G; Pahler, Leon F; Anthony, T Renee; Volckens, John

    2017-03-01

    A newly designed, low-cost, disposable inhalable aerosol sampler was developed to assess workers' personal exposure to inhalable particles. This sampler was originally designed to operate at 10 L/min to increase sample mass and, therefore, improve analytical detection limits for filter-based methods. Computational fluid dynamics modeling revealed that sampler performance (relative to aerosol inhalability criteria) would not differ substantially at sampler flows of 2 and 10 L/min. With this in mind, the newly designed inhalable aerosol sampler was tested in a wind tunnel, simultaneously, at flows of 2 and 10 L/min. A mannequin was equipped with 6 sampler/pump assemblies (three pumps operated at 2 L/min and three pumps at 10 L/min) inside a wind tunnel operated at 0.2 m/s, which has been shown to be a typical indoor workplace wind speed. In separate tests, four different particle sizes were injected to determine if the sampler's performance at the new 10 L/min flow rate differed significantly from that at 2 L/min. A comparison of inhalable mass concentrations using a Wilcoxon signed rank test found no significant difference in the concentration of particles sampled at 10 and 2 L/min for all particle sizes tested. Our results suggest that this new aerosol sampler is a versatile tool that can improve exposure assessment capabilities for the practicing industrial hygienist by improving the limit of detection and allowing for shorter sampling times.
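
    A minimal sketch of the statistical comparison reported above: a Wilcoxon signed-rank test on paired concentrations measured at the two flow rates. The numbers are made-up placeholders, not the study's measurements.

```python
# Paired nonparametric comparison of concentrations at 2 L/min vs. 10 L/min
# (hypothetical values; a large p-value would indicate no detectable flow-rate effect).
from scipy.stats import wilcoxon

conc_2lpm = [1.8, 2.1, 1.6, 2.4, 2.0, 1.9]     # mg/m^3, placeholder data
conc_10lpm = [1.9, 2.0, 1.7, 2.3, 2.1, 1.8]    # mg/m^3, placeholder data

stat, p = wilcoxon(conc_2lpm, conc_10lpm)
print(f"W = {stat:.1f}, p = {p:.3f}")
```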

  11. Potential gene flow from transgenic rice (Oryza sativa L.) to different weedy rice (Oryza sativa f. spontanea) accessions based on reproductive compatibility.

    PubMed

    Song, Xiaoling; Liu, Linli; Wang, Zhou; Qiang, Sheng

    2009-08-01

    The possibility of gene flow from transgenic crops to wild relatives may be affected by reproductive capacity between them. The potential gene flow from two transgenic rice lines containing the bar gene to five accessions of weedy rice (WR1-WR5) was determined through examination of reproductive compatibility under controlled pollination. The pollen grain germination of two transgenic rice lines on the stigma of all weedy rice, rice pollen tube growth down the style and entry into the weedy rice ovary were similar to self-pollination in weedy rice. However, delayed double fertilisation and embryo abortion in crosses between WR2 and Y0003 were observed. Seed sets between transgenic rice lines and weedy rice varied from 8 to 76%. Although repeated pollination increased seed set significantly, the rank of the seed set between the weedy rice accessions and rice lines was not changed. The germination rates of F1 hybrids were similar to or greater than those of the respective females. All F1 plants expressed glufosinate resistance in the presence of glufosinate selection pressure. The frequency of gene flow between different weedy rice accessions and transgenic herbicide-resistant rice may differ owing to different reproductive compatibility. This result suggests that, when wild relatives are selected as experimental materials for assessing the gene flow of transgenic rice, it is necessary to address the compatibility between transgenic rice and wild relatives.

  12. RankExplorer: Visualization of Ranking Changes in Large Time Series Data.

    PubMed

    Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin

    2012-12-01

    For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.

  13. The selection of construction sub-contractors using the fuzzy sets theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krzemiński, Michał

    The paper presents an algorithm for the selection of sub-contractors. The main area of the author's interest is scheduling flow models. The ranking task aims at an execution time that is as short as possible; brigade downtime should also be as small as possible. These targets are, however, subject to significant obsolescence. Time and cost are therefore not used as selection criteria for sub-contractors; it is assumed that all sub-contractors meet them. The decision should instead be made with regard to factors that are difficult to measure, for whose assessment fuzzy sets theory is well suited. The paper presents a set of evaluation criteria, part of the knowledge base, and a description of the output variable.

  14. AUCTSP: an improved biomarker gene pair class predictor.

    PubMed

    Kagaris, Dimitri; Khamesipour, Alireza; Yiannoutsos, Constantin T

    2018-06-26

    The Top Scoring Pair (TSP) classifier, based on the concept of relative ranking reversals in the expressions of pairs of genes, has been proposed as a simple, accurate, and easily interpretable decision rule for classification and class prediction of gene expression profiles. The idea that differences in gene expression ranking are associated with presence or absence of disease is compelling and has strong biological plausibility. Nevertheless, the TSP formulation ignores significant available information which can improve classification accuracy and is vulnerable to selecting genes which do not have differential expression in the two conditions ("pivot" genes). We introduce the AUCTSP classifier as an alternative rank-based estimator of the magnitude of the ranking reversals involved in the original TSP. The proposed estimator is based on the Area Under the Receiver Operating Characteristic (ROC) Curve (AUC) and as such, takes into account the separation of the entire distribution of gene expression levels in gene pairs under the conditions considered, as opposed to comparing gene rankings within individual subjects as in the original TSP formulation. Through extensive simulations and case studies involving classification in ovarian, leukemia, colon, breast and prostate cancers and diffuse large b-cell lymphoma, we show the superiority of the proposed approach in terms of improving classification accuracy, avoiding overfitting and being less prone to selecting non-informative (pivot) genes. The proposed AUCTSP is a simple yet reliable and robust rank-based classifier for gene expression classification. While the AUCTSP works by the same principle as TSP, its ability to determine the top scoring gene pair based on the relative rankings of two marker genes across all subjects as opposed to each individual subject results in significant performance gains in classification accuracy. In addition, the proposed method tends to avoid selection of non-informative (pivot) genes as members of the top-scoring pair.
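
    As a rough sketch of the AUC-based pair scoring idea, the code below rates a gene pair by how well the difference between the two genes' expression values separates the two classes, measured by the area under the ROC curve, and picks the best-scoring pair. This is one plausible reading of the approach for illustration; the data are synthetic and the scoring may differ in detail from the published AUCTSP estimator.

```python
# Illustrative AUC-based scoring of gene pairs (synthetic data, simplified estimator).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
labels = np.array([0] * 50 + [1] * 50)
expr = rng.normal(size=(100, 20))
expr[labels == 1, 3] += 1.0    # make genes 3 and 7 a synthetic informative pair
expr[labels == 1, 7] -= 1.0

def pair_auc(expr, labels, i, j):
    score = expr[:, i] - expr[:, j]       # separation of the whole expression distribution
    auc = roc_auc_score(labels, score)
    return max(auc, 1 - auc)              # direction-agnostic

best = max(((i, j) for i in range(20) for j in range(i + 1, 20)),
           key=lambda p: pair_auc(expr, labels, *p))
print(best, round(pair_auc(expr, labels, *best), 3))
```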

  15. A Universal Rank-Size Law

    PubMed Central

    2016-01-01

    A mere hyperbolic law, like the Zipf power-law function, is often inadequate to describe rank-size relationships. An alternative theoretical distribution is proposed based on theoretical physics arguments starting from the Yule-Simon distribution. A model is proposed that leads to a universal form. A theoretical suggestion for the “best (or optimal) distribution” is provided through an entropy argument. The ranking of areas by their number of cities in various countries, together with some sport competition rankings, serves as illustration. PMID:27812192
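
    For context, the sketch below estimates the exponent of the plain Zipf power law by a log-log linear fit on synthetic rank-size data; this is the simple hyperbolic baseline that the proposed universal form is meant to improve upon, not the paper's distribution itself.

```python
# Baseline Zipf rank-size fit on synthetic data (placeholder sizes, not the paper's data).
import numpy as np

sizes = np.sort(np.random.default_rng(2).pareto(1.5, 200) + 1)[::-1]   # fake "city counts"
ranks = np.arange(1, sizes.size + 1)

slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"estimated Zipf exponent: {-slope:.2f}")   # size ~ C / rank**exponent
```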

  16. Methods for the Emergency Assessment of Debris-Flow Hazards from Basins Burned by the Fires of 2007, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.

    2007-01-01

    This report describes the approach used to assess potential debris-flow hazards from basins burned by the Buckweed, Santiago, Canyon, Poomacha, Ranch, Harris, Witch, Rice, Ammo, Slide, Grass Valley and Cajon Fires of 2007 in southern California. The assessments will be presented as a series of maps showing a relative ranking of the predicted volume of debris flows that can issue from basin outlets in response to a 3-hour duration rainstorm with a 10-year return period. Potential volumes of debris flows are calculated using a multiple-regression model that describes debris-flow volume at a basin outlet as a function of measures of basin gradient, burn extent, and storm rainfall. This assessment provides critical information for issuing basin-specific warnings, locating and designing mitigation measures, and planning of evacuation timing and routes.

  17. Querying and Ranking XML Documents.

    ERIC Educational Resources Information Center

    Schlieder, Torsten; Meuss, Holger

    2002-01-01

    Discussion of XML, information retrieval, precision, and recall focuses on a retrieval technique that adopts the similarity measure of the vector space model, incorporates the document structure, and supports structured queries. Topics include a query model based on tree matching; structured queries and term-based ranking; and term frequency and…

  18. Index of stations: surface-water data-collection network of Texas, September 1999

    USGS Publications Warehouse

    Gandara, Susan C.; Barbie, Dana L.

    2001-01-01

    As of September 30, 1999, the surface-water data-collection network of Texas (table 1) included 321 continuous-record streamflow stations (D), 20 continuous-record gage-height only stations (G), 24 crest-stage partial-record stations (C), 40 floodhydrograph partial-record stations (H), 25 low-flow partial-record stations (L), 1 continuous-record temperature station (M1), 25 continuous-record temperature and specific conductance stations (M2), 17 continuous-record temperature, specific conductance, dissolved oxygen, and pH stations (M4), 4 daily water-quality stations (Qd), 115 periodic water-quality stations (Qp), 17 reservoir/lake surveys for water quality stations (Qs), 85 continuous or daily reservoircontent stations (R), and 10 daily precipitation stations (Pd). Plate 1 identifies the major river basins in Texas and shows the location of the stations listed in table 1. Table 1 shows the station number and name, latitude and longitude, type of station, and office responsible for the collection of the data and maintenance of the record. An 8-digit permanent numerical designation for all gaging stations has been adopted on a nationwide basis; stations are numbered and listed in downstream order. In the downstream direction along the main stem, all stations on a tributary entering between two main-stem stations are listed between these two stations. A similar order is followed in listing stations by first rank, second rank, and other ranks of tributaries. The rank of any tributary, with respect to the stream to which it is an immediate tributary, is indicated by an indention in the table. Each indention represents one rank. This downstream order and system of indention shows which gaging stations are on tributaries between any two stations on a main stem and the rank of the tributary on which each gaging station is situated.

  19. Paradigm for Distributive & Procedural Justice in Equitable Apportionment of Transboundary Ganges Waters Under Changing Climate & Landuse

    NASA Astrophysics Data System (ADS)

    Tyagi, H.; Gosain, A. K.; Khosa, R.; Anand, J.

    2015-12-01

    Rivers have no regard for human demarcated boundaries. Besides, ever increasing demand-supply gap & vested riparian interests, fuel transboundary water conflicts. For resolving such disputes, appropriation doctrines advocating equity & fairness have received endorsement in the Helsinki Rules-1966 & UN Convention-1997. Thus, current study proposes the principle of equitable apportionment for sharing Ganges waters as it balances the interests & deservedness of all stakeholders, namely, India & its 11 states, Bangladesh, Nepal, & China. The study endeavors to derive a reasonable share of each co-basin state by operationalizing the vague concepts of fairness & equity through an objective & quantitative framework encompassing proportionality & egalitarianism for distributive & procedural justice. Equal weightage factors reflecting hydrology, geography & water use potential are chosen for fair share computation, wherein each contender ranks these factors to maximize his entitlement. If cumulative claims exceed the water availability, each claimant puts forth next ranked factor & this process continues till the claims match availability. Due to inter-annual variability in few factors, scenarios for Rabi & Kharif seasons are considered apart from cases for maximum, upper quartile, median, lower quartile & minimum. Possibility of spatial homogeneity & heterogeneity in factors is also recognized. Sometimes lack of technical information hinders transboundary dispute resolution via legal mechanisms. Hence, the study also attempts to bridge this gap between law & technology through GIS-based SWAT hydrologic model by estimating the Ganges water yield, & consequent share of each riparian for range of flows incorporating e-flows as well, under present & future climate & landuse scenarios. 82% of India's territory lies within interstate rivers, & therefore this research is very pertinent as it can facilitate the decision makers in effective interstate water conflict resolution.

  20. The metrics and correlates of physician migration from Africa.

    PubMed

    Arah, Onyebuchi A

    2007-05-17

    Physician migration from poor to rich countries is considered an important contributor to the growing health workforce crisis in the developing world. This is particularly true for Africa. The perceived magnitude of such migration for each source country might, however, depend on the choice of metrics used in the analysis. This study examined the influence of the choice of migration metrics on the rankings of African countries that suffered the most physician migration, and investigated the correlates of physician migration. Ranking and correlational analyses were conducted on African physician migration data adjusted for bilateral net flows, and supplemented with developmental, economic and health system data. The setting was the 53 African birth countries of African-born physicians working in nine wealthier destination countries. Three metrics of physician migration were used: total number of physician émigrés; emigration fraction defined as the proportion of the potential physician pool working in destination countries; and physician migration density defined as the number of physician émigrés per 1000 population of the African source country. Rankings based on any of the migration metrics differed substantially from those based on the other two metrics. Although the emigration fraction and physician migration density metrics gave proportionality to the migration crisis, only the latter was consistently associated with source countries' workforce capacity, health, health spending, economic and development characteristics. As such, higher physician migration density was seen among African countries with relatively higher health workforce capacity (0.401 ≤ r ≤ 0.694, p ≤ 0.011), health status, health spending, and development. The perceived magnitude of physician migration is sensitive to the choice of metrics. Complementing the emigration fraction, the physician migration density is a metric which gives a different but proportionate picture of which African countries stand to lose relatively more of their physicians with unchecked migration. The nature of health policies geared at health-worker migration can be expected to depend on the choice of migration metrics.
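
    The three metrics defined above can be computed directly; the sketch below does so for a single hypothetical source country, with placeholder figures rather than the study's data.

```python
# Three physician-migration metrics for one hypothetical source country (placeholder figures).
emigres = 1200                      # physicians born in the country but working abroad
domestic = 2800                     # physicians still practising in the country
population = 18_000_000             # source-country population

total_emigres = emigres
emigration_fraction = emigres / (emigres + domestic)    # share of the potential pool abroad
migration_density = emigres / (population / 1000)       # émigrés per 1000 population

print(total_emigres, round(emigration_fraction, 3), round(migration_density, 3))
```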

  1. A machine learning approach for ranking clusters of docked protein‐protein complexes by pairwise cluster comparison

    PubMed Central

    Pfeiffenberger, Erik; Chaleil, Raphael A.G.; Moal, Iain H.

    2017-01-01

    Reliable identification of near-native poses of docked protein-protein complexes is still an unsolved problem. The intrinsic heterogeneity of protein-protein interactions is challenging for traditional biophysical or knowledge-based potentials, and the identification of many false positive binding sites is not unusual. Often, ranking protocols are based on initial clustering of docked poses followed by the application of an energy function to rank each cluster according to its lowest energy member. Here, we present an approach to cluster ranking based not on a single molecular descriptor (e.g., an energy function) but on a large number of descriptors integrated in a machine learning model: an extremely randomized tree classifier trained on 109 molecular descriptors. The protocol first locally enriches clusters with additional poses; the clusters are then characterized using features describing the distribution of molecular descriptors within the cluster, and these features are combined into a pairwise cluster comparison model to discriminate near-native from incorrect clusters. The results show that our approach is able to identify clusters containing near-native protein-protein complexes. In addition, we present an analysis of the descriptors with respect to their power to discriminate near-native from incorrect clusters and how data transformations and recursive feature elimination can improve the ranking performance. Proteins 2017; 85:528–543. © 2016 Wiley Periodicals, Inc. PMID:27935158
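
    The sketch below mimics the pairwise cluster comparison step with an extremely randomized tree classifier: each training example is the difference between two clusters' descriptor summaries, labelled by which cluster is closer to native. The synthetic descriptors, the tiny descriptor count, and the win-counting ranking are illustrative assumptions, not the published protocol or its 109 descriptors.

```python
# Pairwise cluster comparison with an extremely randomized tree classifier (illustrative).
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(3)
n_clusters, n_descriptors = 40, 10                     # the paper uses 109 descriptors
cluster_features = rng.normal(size=(n_clusters, n_descriptors))
quality = cluster_features[:, 0] + 0.1 * rng.normal(size=n_clusters)   # fake nearness to native

pairs, labels = [], []
for i in range(n_clusters):
    for j in range(n_clusters):
        if i != j:
            pairs.append(cluster_features[i] - cluster_features[j])
            labels.append(int(quality[i] > quality[j]))   # 1 if cluster i is the better one

clf = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(pairs, labels)

# Rank clusters by how often the model prefers them in pairwise comparisons.
wins = np.zeros(n_clusters)
for i in range(n_clusters):
    others = [cluster_features[i] - cluster_features[j] for j in range(n_clusters) if j != i]
    wins[i] = clf.predict(others).sum()
print("top-ranked cluster:", int(np.argmax(wins)))
```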

  2. Rank One Strange Attractors in Periodically Kicked Predator-Prey System with Time-Delay

    NASA Astrophysics Data System (ADS)

    Yang, Wenjie; Lin, Yiping; Dai, Yunxian; Zhao, Huitao

    2016-06-01

    This paper is devoted to the study of rank one strange attractors in a periodically kicked predator-prey system with time delay. Our discussion is based on the theory of rank one maps formulated by Wang and Young. First, we extend the rank one chaos theory to delayed systems. It is shown that strange attractors occur when the delayed system undergoes a Hopf bifurcation and encounters an external periodic force. We then apply the theory to the periodically kicked predator-prey system with delay, deriving the conditions for Hopf bifurcation and rank one chaos along with the results of numerical simulations.

  3. The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.

    ERIC Educational Resources Information Center

    Filinov, Nikolay B.; Ruchkina, Svetlana

    2002-01-01

    The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…

  4. Comparative Analysis of Rank Aggregation Techniques for Metasearch Using Genetic Algorithm

    ERIC Educational Resources Information Center

    Kaur, Parneet; Singh, Manpreet; Singh Josan, Gurpreet

    2017-01-01

    Rank Aggregation techniques have found wide applications for metasearch along with other streams such as Sports, Voting System, Stock Markets, and Reduction in Spam. This paper presents the optimization of rank lists for web queries put by the user on different MetaSearch engines. A metaheuristic approach such as Genetic algorithm based rank…

  5. Highlighting entanglement of cultures via ranking of multilingual Wikipedia articles.

    PubMed

    Eom, Young-Ho; Shepelyansky, Dima L

    2013-01-01

    How do different cultures evaluate a person? Is a person who is important in one culture also important in another culture? We address these questions via the ranking of multilingual Wikipedia articles. With three ranking algorithms based on the network structure of Wikipedia, we assign rankings to all articles in 9 multilingual editions of Wikipedia and investigate the general ranking structure of PageRank, CheiRank and 2DRank. In particular, we focus on articles related to persons, identify the top 30 persons for each rank among the different editions, and analyze the distinctions of their distributions over activity fields such as politics, art, science, religion, and sport for each edition. We find that local heroes are dominant, but global heroes also exist and create an effective network representing the entanglement of cultures. The Google matrix analysis of the network of cultures shows signs of a Zipf law distribution. This approach allows us to examine the diversity and shared characteristics of knowledge organization between cultures. The developed computational, data-driven approach highlights cultural interconnections from a new perspective. Dated: June 26, 2013.

  6. Highlighting Entanglement of Cultures via Ranking of Multilingual Wikipedia Articles

    PubMed Central

    Eom, Young-Ho; Shepelyansky, Dima L.

    2013-01-01

    How do different cultures evaluate a person? Is a person who is important in one culture also important in another culture? We address these questions via the ranking of multilingual Wikipedia articles. With three ranking algorithms based on the network structure of Wikipedia, we assign rankings to all articles in 9 multilingual editions of Wikipedia and investigate the general ranking structure of PageRank, CheiRank and 2DRank. In particular, we focus on articles related to persons, identify the top 30 persons for each rank among the different editions, and analyze the distinctions of their distributions over activity fields such as politics, art, science, religion, and sport for each edition. We find that local heroes are dominant, but global heroes also exist and create an effective network representing the entanglement of cultures. The Google matrix analysis of the network of cultures shows signs of a Zipf law distribution. This approach allows us to examine the diversity and shared characteristics of knowledge organization between cultures. The developed computational, data-driven approach highlights cultural interconnections from a new perspective. Dated: June 26, 2013 PMID:24098338
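
    A minimal sketch of the network-ranking ingredient used above: PageRank on a toy directed link graph, with CheiRank taken as PageRank of the graph with all links reversed. The articles and links are stand-ins, not the Wikipedia editions analyzed in the study.

```python
# PageRank and (reversed-graph) CheiRank on a toy link graph with networkx.
import networkx as nx

G = nx.DiGraph([("ArticleA", "ArticleB"), ("ArticleB", "ArticleC"),
                ("ArticleC", "ArticleA"), ("ArticleA", "ArticleC"),
                ("ArticleD", "ArticleA")])

pagerank = nx.pagerank(G, alpha=0.85)               # importance from incoming links
cheirank = nx.pagerank(G.reverse(), alpha=0.85)     # "communicativity" from outgoing links

print(sorted(pagerank, key=pagerank.get, reverse=True))
print(sorted(cheirank, key=cheirank.get, reverse=True))
```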

  7. Selection of suitable e-learning approach using TOPSIS technique with best ranked criteria weights

    NASA Astrophysics Data System (ADS)

    Mohammed, Husam Jasim; Kasim, Maznah Mat; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    This paper compares the performance of four rank-based weighting techniques, Rank Sum (RS), Rank Reciprocal (RR), Rank Exponent (RE), and Rank Order Centroid (ROC), on five identified e-learning criteria in order to select the best weighting method. A total of 35 experts in a public university in Malaysia were asked to rank the criteria and to evaluate five e-learning approaches: blended learning, flipped classroom, ICT-supported face-to-face learning, synchronous learning, and asynchronous learning. The best-ranked criteria weights, defined as the weights with the smallest total absolute difference from the geometric mean of all weights, were then used to select the most suitable e-learning approach with the TOPSIS method. The results show that the RR weights are the best, while the flipped classroom is the most suitable approach. The paper develops a decision framework to aid decision makers (DMs) in choosing the most suitable weighting method for solving MCDM problems.
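
    The sketch below reproduces the general recipe described above: derive criterion weights from expert ranks (Rank Sum, Rank Reciprocal, and Rank Order Centroid shown; Rank Exponent needs an extra parameter and is omitted) and feed one set of weights into a standard TOPSIS ranking. The decision matrix and criterion ranks are illustrative placeholders, not the survey data.

```python
# Rank-based criterion weights followed by a standard TOPSIS ranking (illustrative data).
import numpy as np

def rank_sum(r):                      # r: 1 = most important criterion
    w = len(r) - r + 1
    return w / w.sum()

def rank_reciprocal(r):
    w = 1.0 / r
    return w / w.sum()

def rank_order_centroid(r):
    n = len(r)
    return np.array([np.sum(1.0 / np.arange(k, n + 1)) / n for k in r])

def topsis(matrix, weights, benefit):
    # benefit[j] is True when larger values of criterion j are better.
    v = matrix / np.sqrt((matrix**2).sum(axis=0)) * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.sqrt(((v - ideal)**2).sum(axis=1))
    d_minus = np.sqrt(((v - anti)**2).sum(axis=1))
    return d_minus / (d_plus + d_minus)               # closeness: higher is better

ranks = np.array([1, 2, 3, 4, 5])                     # expert ranking of 5 criteria
scores = np.array([[7, 8, 6, 7, 9],                   # rows: alternatives, columns: criteria
                   [8, 7, 7, 6, 8],
                   [6, 9, 8, 8, 7]], dtype=float)
closeness = topsis(scores, rank_order_centroid(ranks), benefit=np.ones(5, dtype=bool))
print(np.argsort(-closeness))                         # alternatives, best first
```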

  8. Contextual effects on the perceived health benefits of exercise: the exercise rank hypothesis.

    PubMed

    Maltby, John; Wood, Alex M; Vlaev, Ivo; Taylor, Michael J; Brown, Gordon D A

    2012-12-01

    Many accounts of social influences on exercise participation describe how people compare their behaviors to those of others. We develop and test a novel hypothesis, the exercise rank hypothesis, of how this comparison can occur. The exercise rank hypothesis, derived from evolutionary theory and the decision by sampling model of judgment, suggests that individuals' perceptions of the health benefits of exercise are influenced by how individuals believe the amount of exercise ranks in comparison with other people's amounts of exercise. Study 1 demonstrated that individuals' perceptions of the health benefits of their own current exercise amounts were as predicted by the exercise rank hypothesis. Study 2 demonstrated that the perceptions of the health benefits of an amount of exercise can be manipulated by experimentally changing the ranked position of the amount within a comparison context. The discussion focuses on how social norm-based interventions could benefit from using rank information.

  9. Postwildfire preliminary debris flow hazard assessment for the area burned by the 2011 Las Conchas Fire in north-central New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Darr, Michael J.; Cannon, Susan H.; Michael, John A.

    2011-01-01

    The Las Conchas Fire during the summer of 2011 was the largest in recorded history for the state of New Mexico, burning 634 square kilometers in the Jemez Mountains of north-central New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 321 basins burned by the Las Conchas Fire. A pair of empirical hazard-assessment models developed using data from recently burned basins throughout the intermountain western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows at the outlets of selected drainage basins within the burned area. The models incorporate measures of burn severity, topography, soils, and storm rainfall to estimate the probability and volume of debris flows following the fire. In response to a design storm of 28.0 millimeters of rain in 30 minutes (10-year recurrence interval), the probabilities of debris flows estimated for basins burned by the Las Conchas Fire were greater than 80 percent for two-thirds (67 percent) of the modeled basins. Basins with a high (greater than 80 percent) probability of debris-flow occurrence were concentrated in tributaries to Santa Clara and Rio del Oso Canyons in the northeastern part of the burned area; some steep areas in the Valles Caldera National Preserve, Los Alamos, and Guaje Canyons in the east-central part of the burned area; tributaries to Peralta, Colle, Bland, and Cochiti canyons in the southwestern part of the burned area; and tributaries to Frijoles, Alamo, and Capulin Canyons in the southeastern part of the burned area (within Bandelier National Monument). Estimated debris-flow volumes ranged from 400 cubic meters to greater than 72,000 cubic meters. The largest volumes (greater than 40,000 cubic meters) were estimated for basins in Santa Clara, Los Alamos, and Water Canyons, and for two basins at the northeast edge of the burned area tributary to Rio del Oso and Vallecitos Creek. The Combined Relative Debris-Flow Hazard Rankings identify the areas of highest probability of the largest debris flows. Basins with high Combined Relative Debris-Flow Hazard Rankings include upper Santa Clara Canyon in the northern section of the burn scar, and portions of Peralta, Colle, Bland, Cochiti, Capulin, Alamo, and Frijoles Canyons in the southern section of the burn scar. Three basins with high Combined Relative Debris-Flow Hazard Rankings also occur in areas upstream from the city of Los Alamos—the city is home to and surrounded by numerous technical sites for the Los Alamos National Laboratory. Potential debris flows in the burned area could affect the water supply for Santa Clara Pueblo and several recreational lakes, as well as recreational and archeological resources in Bandelier National Monument. Debris flows could damage bridges and culverts along State Highway 501 and other roadways. Additional assessment is necessary to determine if the estimated volume of material is sufficient to travel into areas downstream from the modeled basins along the valley floors, where they could affect human life, property, agriculture, and infrastructure in those areas. Additionally, further investigation is needed to assess the potential for debris flows to affect structures at or downstream from basin outlets and to increase the threat of flooding downstream by damaging or blocking flood mitigation structures. 
The maps presented here may be used to prioritize areas where erosion mitigation or other protective measures may be necessary within a 2- to 3-year window of vulnerability following the Las Conchas Fire.

  10. Asymmetric flow field flow fractionation with light scattering detection - an orthogonal sensitivity analysis.

    PubMed

    Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S

    2016-11-18

    Asymmetric flow field flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effects estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
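
    The sketch below shows how a 2^(5-1) fractional factorial design of the kind used above can be generated: a full two-level factorial in four factors, with the fifth factor's level set by a generator. The factor names and the particular generator (E = ABCD) are assumptions for illustration.

```python
# Generate a 2^(5-1) two-level fractional factorial design (16 runs instead of 32).
from itertools import product

factors = ["cross_flow", "ramp_time", "focus_flow", "injection_volume", "buffer_conc"]
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d                 # generator E = ABCD, i.e. defining relation I = ABCDE
    runs.append(dict(zip(factors, (a, b, c, d, e))))

print(len(runs), "runs")
print(runs[0])
```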

  11. CFD simulation of vertical linear motion mixing in anaerobic digester tanks.

    PubMed

    Meroney, Robert N; Sheker, Robert E

    2014-09-01

    Computational fluid dynamics (CFD) was used to simulate the mixing characteristics of a small circular anaerobic digester tank (diameter 6 m) equipped sequentially with 13 different plunger-type vertical linear motion mixers and two different types of internal draft-tube mixers. Mixing rates for step injections of tracers were computed, from which active volume (AV) and hydraulic retention time (HRT) could be calculated. Washout characteristics were compared to analytic formulae to detect any partial mixing, dead volume, short-circuiting, or piston flow. Active volumes were also estimated based on tank regions that exceeded minimum velocity criteria. The mixers were ranked based on an ad hoc criterion related to the ratio of AV to unit power (UP), or AV/UP. The best plunger mixers were found to behave about the same as the conventional draft-tube mixers of similar UP.

  12. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

    The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. This model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
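
    A minimal sketch of Latin hypercube sampling for this kind of uncertainty analysis: draw an LHS design, map it to distributions of the basic variables, and propagate the samples through a performance function. The variable names, distributions, and the toy stage function are assumptions, not the HEC-RAS-based model or the UNCSAM package used in the paper.

```python
# Latin hypercube sampling pushed through a toy performance function (illustrative only).
import numpy as np
from scipy.stats import qmc, norm

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=1000)                        # uniform LHS design in [0, 1)^3

# Map to hypothetical basic variables: Manning's n, discharge, bridge opening width.
n_manning = norm(0.035, 0.005).ppf(u[:, 0])
discharge = norm(250.0, 40.0).ppf(u[:, 1])        # m^3/s
opening = norm(30.0, 2.0).ppf(u[:, 2])            # m

# Toy stage function standing in for the hydraulic model.
stage = 0.5 + 0.002 * discharge + 40 * n_manning - 0.01 * opening
print("mean stage:", stage.mean(), "95th percentile:", np.quantile(stage, 0.95))
```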

  13. Health information on internet: quality, importance, and popularity of persian health websites.

    PubMed

    Samadbeik, Mahnaz; Ahmadi, Maryam; Mohammadi, Ali; Mohseni Saravi, Beniamin

    2014-04-01

    The Internet has provided great opportunities for disseminating both accurate and inaccurate health information. The quality of that information is therefore a widespread concern affecting human life. Despite the substantial growth in the number of users, Persian health websites, and the proportion of internet-using patients, little is known about the quality of Persian medical and health websites. The current study aimed first to assess the quality, popularity and importance of websites providing Persian health-related information, and second to evaluate the correlation of the popularity and importance rankings with quality score. The sample websites were identified by entering health-related keywords into the four search engines most popular with Iranian users, based on the Alexa ranking at the time of the study. Each selected website was assessed using three assessment tools: the Bomba and Land Index, Google PageRank and the Alexa ranking. The evaluated sites' characteristics (ownership structure, database, scope and objective) did not have an effect on the Alexa traffic global rank, Alexa traffic rank in Iran, Google PageRank or Bomba total score. Most websites (78.9 percent, n = 56) were in the moderate category (8 ≤ x ≤ 11.99) based on their quality levels. There was no statistically significant association between Google PageRank and the Bomba index variables or Alexa traffic global rank (P > 0.05). The Persian health websites had better Bomba quality scores for the availability and usability guidelines than for the other guidelines. Google PageRank did not properly reflect the real quality of the evaluated websites, and Internet users seeking online health information should not rely on it for any kind of prejudgment about Persian health websites. However, they can use the Iran Alexa rank as a primary filtering tool for these websites. Designing search engines dedicated to exploring accredited Persian health-related websites could therefore be an effective way to access high-quality Persian health websites.

  14. Historical perspective of statewide streamflows during the 2002 and 1977 droughts in Colorado

    USGS Publications Warehouse

    Kuhn, Gerhard

    2005-01-01

    Since 1890, Colorado has experienced a number of widespread drought periods; the most recent statewide drought began during 1999 and includes 2002, a year characterized by precipitation, snowpack accumulation, and streamflows that were much lower than normal. Because the drought of 2002 had a substantial effect on streamflows in Colorado, the U.S. Geological Survey, in cooperation with the Colorado Water Conservation Board, began a study in 2004 to analyze statewide streamflows during 2002 and develop a historical perspective of those streamflows. The purpose of this report is to describe an analysis of streamflows recorded throughout Colorado during the drought of 2002, as well as other drought years such as 1977, and to provide some historical perspective of drought-diminished streamflows in Colorado. Because most streamflows in Colorado are derived from melting of mountain snowpacks during April through July, streamflows primarily were analyzed for the snowmelt (high-flow) period, but streamflows also were analyzed for the winter (low-flow) period. The snowmelt period is defined as April 1 through September 30 and the winter period is defined as October 1 through March 31. Historical daily average streamflows were analyzed on the basis of 7, 30, 90, and 180 consecutive-day periods (N-day) for 154 selected stations in Colorado. Methods used for analysis of the N-day snowmelt and winter streamflows include evaluation of trends in the historical streamflow records, computation of the rank of each annual N-day streamflow value for each station, analysis for years other than 2002 and 1977 with drought-diminished streamflows, and frequency analysis (on the basis of nonexceedance probability) of the 180-day streamflows. Ranking analyses for the N-day snowmelt streamflows indicated that streamflows during 2002 were ranked as the lowest or second lowest historical values at 114-123 stations, or about 74-80 percent of the stations; by comparison, the N-day snowmelt streamflows during 1977 were ranked as the lowest or second lowest historical values at 69-87 stations, or about 47-59 percent of the stations. Many of the stations in the mountainous headwaters where snowmelt streamflows were ranked lowest during 2002 were ranked second lowest during 1977. These results indicate that snowmelt streamflows during 2002 were considerably more diminished than those during 1977. The 180-day snowmelt streamflows were ranked among the five lowest historical values at about 90 percent of the stations during 2002 and were ranked among the five lowest historical values at about 77 percent of the stations during 1977. Other years during which the 180-day snowmelt streamflows were ranked among the five lowest values at a substantial percentage of stations include 1934, 1954, 1963, and 1981, but the percentages of stations with 180-day snowmelt streamflows ranked among the five lowest values were smaller during those years than during 2002 and 1977. Frequency analysis of snowmelt streamflows indicated that recurrence intervals for the 180-day snowmelt streamflows during 2002 were greater than 50 years for about 57 percent of the stations and were more than 100 years for about 14 percent of the stations. By comparison, recurrence intervals for the 180-day snowmelt streamflows during 1977 were greater than 50 years only for about 15 percent of the stations and were more than 100 years only for about 1 percent of the stations. 
Generally, snowmelt streamflows during 2002 were more diminished and have higher recurrence intervals than snowmelt streamflows during 1977. The N-day winter streamflows during 2002 and 1977 were not ranked among the five lowest historical values at about 86-103 stations, or about 58-70 percent of the stations, compared to about 10-27 percent of the stations for the N-day snowmelt streamflows. These results indicate that winter streamflows during the 2002 and 1977 droughts were diminished to a lesser extent than t

  15. Clinical prognostic significance and pro-metastatic activity of RANK/RANKL via the AKT pathway in endometrial cancer.

    PubMed

    Wang, Jing; Liu, Yao; Wang, Lihua; Sun, Xiao; Wang, Yudong

    2016-02-02

    RANK/RANKL plays a key role in metastasis of certain malignant tumors, which makes it a promising target for developing novel therapeutic strategies for cancer. However, the prognostic value and pro-metastatic activity of RANK in endometrial cancer (EC) remain to be determined. Thus, the present study investigated the effect of RANK on the prognosis of EC patients, as well as the pro-metastatic activity of EC cells. The results indicated that those with high expression of RANK showed decreased overall survival and progression-free survival. Statistical analysis revealed the positive correlations between RANK/RANKL expression and metastasis-related factors. Additionally, RANK/RANKL significantly promoted cell migration/invasion via activating AKT/β-catenin/Snail pathway in vitro. However, RANK/RANKL-induced AKT activation could be suppressed after osteoprotegerin (OPG) treatment. Furthermore, the combination of medroxyprogesterone acetate (MPA) and RANKL could in turn attenuate the effect of RANKL alone. Similarly, MPA could partially inhibit the RANK-induced metastasis in an orthotopic mouse model via suppressing AKT/β-catenin/Snail pathway. Therefore, therapeutic inhibition of MPA in RANK/RANKL-induced metastasis was mediated by AKT/β-catenin/Snail pathway both in vitro and in vivo, suggesting a potential target of RANK for gene-based therapy for EC.

  16. Thermal state of the Arkoma Basin and the Anadarko Basin, Oklahoma

    NASA Astrophysics Data System (ADS)

    Lee, Youngmin

    1999-12-01

    One of the most fundamental physical processes that affects virtually all geologic phenomena in sedimentary basins is the flow of heat from the Earth's interior. The Arkoma Basin and the Anadarko Basin, Oklahoma, are prolific producers of both oil and natural gas. Both basins also host important geologic phenomena. Understanding the thermal state of these basins is crucial to understanding the timing and extent of hydrocarbon generation, the genesis of Mississippi Valley-type ore deposits, and the origin of overpressures in the Anadarko Basin. In chapter one, heat flow and heat production in the Arkoma Basin and Oklahoma Platform are discussed. Results of this study are not generally supportive of theories which invoke topographically driven regional groundwater flow from the Arkoma Basin in Late Pennsylvanian-Early Permian time (~290 Ma) to explain the genesis of geologic phenomena. In chapter two, different types of thermal conductivity temperature corrections that are commonly applied in terrestrial heat flow studies are evaluated. The invariance of the relative rankings with respect to rock porosity suggests the rankings may be valid with respect to in situ conditions. Chapter three addresses the heat flow and thermal history of the Anadarko Basin and the western Oklahoma Platform. We found no evidence that heat flow increases significantly from the Anadarko Basin in the south to the Oklahoma Platform in the north. In chapter four, overpressures in the Anadarko Basin, southwestern Oklahoma, are discussed. Using scale analyses and a simple numerical model, we evaluated two end-member hypotheses (compaction disequilibrium and hydrocarbon generation) as possible causes of overpressuring. Geopressure models which invoke compaction disequilibrium do not appear to apply to the Anadarko Basin. The Anadarko Basin belongs to a group of cratonic basins which are tectonically quiescent and are characterized by the association of abnormal pressures with natural gas. (Abstract shortened by UMI.)

  17. Are anticoagulant independent mechanical valves within reach-fast prototype fabrication and in vitro testing of innovative bi-leaflet valve models.

    PubMed

    Scotten, Lawrence N; Siegel, Rolland

    2015-08-01

    Exploration for causes of prosthetic valve thrombogenicity has frequently focused on forward or post-closure flow detail. In prior laboratory studies, we uncovered high-amplitude flow velocities of short duration close to valve closure, implying potential for substantial shear stress with subsequent initiation of blood coagulation pathways. This may be relevant to the widely accepted clinical disparity between mechanical and tissue valves vis-à-vis thrombogenicity. With a series of prototype bi-leaflet mechanical valves, we attempt to reduce closure-related velocities with the objective of identifying a prototype valve with thrombogenic potential similar to our tissue valve control. This iterative design approach may find application in preclinical assessment of valves for anticoagulation independence. Tested valves included prototype mechanical bi-leaflet BVs (n=56), controls (n=2) and patented early prototype mechanicals (n=2) from other investigators. Pulsatile and quasi-steady flow systems were used for testing. Projected dynamic valve area (PDVA) was measured using previously described novel technology. Flow velocity over the open and closing periods was determined by volumetric flow rate/PDVA. For the closed valve interval, use was made of data obtained from quasi-steady back pressure/flow tests. Performance was ranked by a proposed thrombogenicity potential index (TPI) relative to tissue and mechanical control valves. Optimization of the prototype valve designs led to a 3-D printed model (BV3D). For the mitral/aortic site, BV3D has lower TPI (1.10/1.47) relative to the control mechanical valve (3.44/3.93) and similar to the control tissue valve (ideal TPI ≤1.0). Using unique technology, rapid prototyping and thrombogenicity ranking, optimization of experimental valves for reduced thrombogenic potential was expedited and simplified. Innovative mechanical valve configurations were identified that merit consideration for further development, which may bring the anticoagulation-independent mechanical valve within reach.

  18. Are anticoagulant independent mechanical valves within reach—fast prototype fabrication and in vitro testing of innovative bi-leaflet valve models

    PubMed Central

    Siegel, Rolland

    2015-01-01

    Background Exploration for causes of prosthetic valve thrombogenicity has frequently focused on forward or post-closure flow detail. In prior laboratory studies, we uncovered high-amplitude flow velocities of short duration close to valve closure, implying potential for substantial shear stress with subsequent initiation of blood coagulation pathways. This may be relevant to the widely accepted clinical disparity between mechanical and tissue valves vis-à-vis thrombogenicity. With a series of prototype bi-leaflet mechanical valves, we attempt to reduce closure-related velocities with the objective of identifying a prototype valve with thrombogenic potential similar to our tissue valve control. This iterative design approach may find application in preclinical assessment of valves for anticoagulation independence. Methods Tested valves included prototype mechanical bi-leaflet BVs (n=56), controls (n=2) and patented early prototype mechanicals (n=2) from other investigators. Pulsatile and quasi-steady flow systems were used for testing. Projected dynamic valve area (PDVA) was measured using previously described novel technology. Flow velocity over the open and closing periods was determined by volumetric flow rate/PDVA. For the closed valve interval, use was made of data obtained from quasi-steady back pressure/flow tests. Performance was ranked by a proposed thrombogenicity potential index (TPI) relative to tissue and mechanical control valves. Results Optimization of the prototype valve designs led to a 3-D printed model (BV3D). For the mitral/aortic site, BV3D has lower TPI (1.10/1.47) relative to the control mechanical valve (3.44/3.93) and similar to the control tissue valve (ideal TPI ≤1.0). Conclusions Using unique technology, rapid prototyping and thrombogenicity ranking, optimization of experimental valves for reduced thrombogenic potential was expedited and simplified. Innovative mechanical valve configurations were identified that merit consideration for further development, which may bring the anticoagulation-independent mechanical valve within reach. PMID:26417581

  19. Emergency Assessment of Postfire Debris-Flow Hazards for the 2009 Station Fire, San Gabriel Mountains, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Rupert, Michael G.; Michael, John A.; Staley, Dennis M.; Worstell, Bruce B.

    2009-01-01

    This report presents an emergency assessment of potential debris-flow hazards from basins burned by the 2009 Station fire in Los Angeles County, southern California. Statistical-empirical models developed for postfire debris flows are used to estimate the probability and volume of debris-flow production from 678 drainage basins within the burned area and to generate maps of areas that may be inundated along the San Gabriel mountain front by the estimated volume of material. Debris-flow probabilities and volumes are estimated as combined functions of different measures of basin burned extent, gradient, and material properties in response to both a 3-hour-duration, 1-year-recurrence thunderstorm and to a 12-hour-duration, 2-year recurrence storm. Debris-flow inundation areas are mapped for scenarios where all sediment-retention basins are empty and where the basins are all completely full. This assessment provides critical information for issuing warnings, locating and designing mitigation measures, and planning evacuation timing and routes within the first two winters following the fire. Tributary basins that drain into Pacoima Canyon, Big Tujunga Canyon, Arroyo Seco, West Fork of the San Gabriel River, and Devils Canyon were identified as having probabilities of debris-flow occurrence greater than 80 percent, the potential to produce debris flows with volumes greater than 100,000 m3, and the highest Combined Relative Debris-Flow Hazard Ranking in response to both storms. The predicted high probability and large magnitude of the response to such short-recurrence storms indicates the potential for significant debris-flow impacts to any buildings, roads, bridges, culverts, and reservoirs located both within these drainages and downstream from the burned area. These areas will require appropriate debris-flow mitigation and warning efforts. Probabilities of debris-flow occurrence greater than 80 percent, debris-flow volumes between 10,000 and 100,000 m3, and high Combined Relative Debris-Flow Hazard Rankings were estimated in response to both short recurrence-interval (1- and 2-year) storms for all but the smallest basins along the San Gabriel mountain front between Big Tujunga Canyon and Arroyo Seco. The combination of high probabilities and large magnitudes determined for these basins indicates significant debris-flow hazards for neighborhoods along the mountain front. When the capacity of sediment-retention basins is exceeded, debris flows may be deposited in neighborhoods and streets and impact infrastructure between the mountain front and Foothill Boulevard. In addition, debris flows may be deposited in neighborhoods immediately below unprotected basins. Hazards to neighborhoods and structures at risk from these events will require appropriate debris-flow mitigation and warning efforts.

  20. Composite Flood Risk for Virgin Island

    EPA Pesticide Factsheets

    The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the National Hurricane Center SLOSH model for storm surge inundation for category 1, 2, and 3 hurricanes. Geographic areas are represented by a grid of 10 by 10 meter cells, and each cell has a ranking based on variation in exposure to flooding hazards: Moderate, High and Extreme exposure. Geographic areas in each input layer are ranked based on their probability of flood risk exposure. The logic was such that areas exposed to flooding on a more frequent basis were given a higher ranking; thus the ranking incorporates the probability of the area being flooded. For example, even though a Category 3 storm surge has higher flooding elevations, the likelihood of its occurrence is lower than that of a Category 1 storm surge, and therefore the Category 3 flood area is given a lower exposure ranking. Extreme exposure areas are those areas that are exposed to relatively frequent flooding. The ranked input layers are then converted to a raster for the creation of the composite risk layer by using cell statistics in spatial analysis. The highest exposure ranking for a given cell in any of the three input layers is assigned to the corresponding cell in the composite layer. For example, if an area (a cell) is ranked as Moderate in the FEMA layer, Moderate in the SLOSH layer, but Extreme in the SCF layer, the cell will be considered Extreme in the composite layer.
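    A minimal sketch of the cell-statistics compositing rule described above, assuming three already-ranked rasters on a common grid (the 0-3 exposure coding, array sizes, and values are hypothetical stand-ins for the FEMA, SLOSH, and shallow coastal flooding layers):

```python
import numpy as np

# 0 = not exposed, 1 = Moderate, 2 = High, 3 = Extreme (assumed coding)
fema  = np.array([[0, 1, 2], [1, 1, 0]])
slosh = np.array([[1, 1, 1], [2, 0, 0]])
scf   = np.array([[0, 3, 0], [0, 2, 1]])

# Each composite cell takes the highest exposure ranking found in any input layer.
composite = np.maximum.reduce([fema, slosh, scf])
print(composite)   # [[1 3 2]
                   #  [2 2 1]]
```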

  1. Leonardite char adsorbents

    DOEpatents

    Knudson, Curtis L.

    1993-01-01

    A process of preparing lignite (low rank) coal filter material, suitable for use in lieu of more expensive activated carbon filter materials, is disclosed. The process comprises size reducing Leonardite coal material to a suitable filtering effective size, and thereafter heating the size reduced Leonardite preferably to at least 750 °C in the presence of a flow of an inert gas.

  2. Leonardite char adsorbents

    DOEpatents

    Knudson, C.L.

    1993-10-19

    A process of preparing lignite (low rank) coal filter material, suitable for use in lieu of more expensive activated carbon filter materials, is disclosed. The process comprises size reducing Leonardite coal material to a suitable filtering effective size, and thereafter heating the size reduced Leonardite preferably to at least 750 C in the presence of a flow of an inert gas. 1 figure.

  3. Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train.

    PubMed

    Bengua, Johann A; Phien, Ho N; Tuan, Hoang Duong; Do, Minh N

    2017-05-01

    This paper proposes a novel approach to tensor completion, which recovers missing entries of data represented by tensors. The approach is based on the tensor train (TT) rank, which is able to capture hidden information from tensors thanks to its definition from a well-balanced matricization scheme. Accordingly, new optimization formulations for tensor completion are proposed, as well as two new algorithms for their solution. The first one, called simple low-rank tensor completion via TT (SiLRTC-TT), is intimately related to minimizing a nuclear norm based on the TT rank. The second one, called tensor completion by parallel matrix factorization via TT (TMac-TT), is based on a multilinear matrix factorization model that approximates the TT rank of a tensor. A tensor augmentation scheme that transforms a low-order tensor to higher orders is also proposed to enhance the effectiveness of SiLRTC-TT and TMac-TT. Simulation results for color image and video recovery show the clear advantage of our method over all other methods.
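    The TT rank that both SiLRTC-TT and TMac-TT build on can be illustrated with the standard TT-SVD decomposition, sketched below for a dense, fully observed tensor (this is generic background, not the paper's completion algorithms; the truncation tolerance `tol` is an assumed parameter):

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Decompose `tensor` into TT cores; ranks are chosen by singular-value truncation."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = max(1, int(np.sum(s > tol * s[0])))
        cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
        mat = (np.diag(s[:new_rank]) @ vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

cores = tt_svd(np.random.rand(4, 5, 6, 7))
print([c.shape for c in cores])   # TT ranks appear as the first/last dimensions of the cores
```

    The first and last dimensions of consecutive cores expose the TT ranks obtained from the sequence of matricizations.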

  4. Monte Carlo simulations guided by imaging to predict the in vitro ranking of radiosensitizing nanoparticles

    PubMed Central

    Retif, Paul; Reinhard, Aurélie; Paquot, Héna; Jouan-Hureaux, Valérie; Chateau, Alicia; Sancey, Lucie; Barberi-Heyob, Muriel; Pinel, Sophie; Bastogne, Thierry

    2016-01-01

    This article addresses the in silico–in vitro prediction issue of organometallic nanoparticle (NP)-based radiosensitization enhancement. The goal was to carry out computational experiments to quickly identify efficient nanostructures and then to preferentially select the most promising ones for subsequent in vivo studies. To this aim, this interdisciplinary article introduces a new theoretical Monte Carlo computational ranking method and tests it using 3 organometallic NPs that differ in size and composition. While the ranking predicted in a classical theoretical scenario did not fit the reference results at all, we showed for the first time how our accelerated in silico virtual screening method, based on basic in vitro experimental data (which takes into account the NPs' cell biodistribution), was able to predict a relevant ranking in accordance with in vitro clonogenic efficiency. This corroborates the pertinence of such a prior ranking method, which could speed up the preclinical development of NPs in radiation therapy. PMID:27920524

  5. Monte Carlo simulations guided by imaging to predict the in vitro ranking of radiosensitizing nanoparticles.

    PubMed

    Retif, Paul; Reinhard, Aurélie; Paquot, Héna; Jouan-Hureaux, Valérie; Chateau, Alicia; Sancey, Lucie; Barberi-Heyob, Muriel; Pinel, Sophie; Bastogne, Thierry

    This article addresses the in silico-in vitro prediction issue of organometallic nanoparticle (NP)-based radiosensitization enhancement. The goal was to carry out computational experiments to quickly identify efficient nanostructures and then to preferentially select the most promising ones for subsequent in vivo studies. To this aim, this interdisciplinary article introduces a new theoretical Monte Carlo computational ranking method and tests it using 3 organometallic NPs that differ in size and composition. While the ranking predicted in a classical theoretical scenario did not fit the reference results at all, we showed for the first time how our accelerated in silico virtual screening method, based on basic in vitro experimental data (which takes into account the NPs' cell biodistribution), was able to predict a relevant ranking in accordance with in vitro clonogenic efficiency. This corroborates the pertinence of such a prior ranking method, which could speed up the preclinical development of NPs in radiation therapy.

  6. A ranking method for the concurrent learning of compounds with various activity profiles.

    PubMed

    Dörr, Alexander; Rosenbaum, Lars; Zell, Andreas

    2015-01-01

    In this study, we present an SVM-based ranking algorithm for the concurrent learning of compounds with different activity profiles and their varying prioritization. To this end, a specific labeling of each compound was elaborated in order to infer virtual screening models against multiple targets. We compared the method with several state-of-the-art SVM classification techniques that are capable of inferring multi-target screening models on three chemical data sets (cytochrome P450s, dehydrogenases, and a trypsin-like protease data set) containing three different biological targets each. The experiments show that ranking-based algorithms exhibit increased performance for single- and multi-target virtual screening. Moreover, compounds that do not completely fulfill the desired activity profile are still ranked higher than decoys or compounds with an entirely undesired profile, compared to other multi-target SVM methods. SVM-based ranking methods constitute a valuable approach for virtual screening in multi-target drug design. They are most helpful when dealing with compounds with various activity profiles and when many ligands with an already perfectly matching activity profile cannot be expected.
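    A hedged sketch of the general pairwise ranking-SVM idea referred to above, using synthetic descriptors and a latent activity score in place of real compound data (the labeling scheme is not the authors' multi-target construction):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                                    # compound descriptors (synthetic)
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)     # latent "activity"

# Build pairwise difference examples: label +1 if the first compound should rank higher.
pairs, labels = [], []
for i, j in rng.choice(len(X), size=(500, 2)):
    if y[i] == y[j]:
        continue
    pairs.append(X[i] - X[j])
    labels.append(1 if y[i] > y[j] else -1)

clf = LinearSVC(C=1.0).fit(np.array(pairs), np.array(labels))

scores = X @ clf.coef_.ravel()     # ranking scores for individual compounds
ranking = np.argsort(-scores)      # compounds ordered from most to least promising
print(ranking[:10])
```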

  7. Development of a multicriteria assessment model for ranking biomass feedstock collection and transportation systems.

    PubMed

    Kumar, Amit; Sokhansanj, Shahab; Flynn, Peter C

    2006-01-01

    This study details a multicriteria assessment methodology that integrates economic, social, environmental, and technical factors in order to rank alternatives for biomass collection and transportation systems. Ranking of biomass collection systems is based on the cost of delivered biomass, the quality of biomass supplied, emissions during collection, energy input to the chain operations, and the maturity of supply system technologies. The assessment methodology is used to evaluate alternatives for collecting 1.8 x 10^6 dry t/yr based on assumptions made about the performance of various assemblies of biomass collection systems. A proposed collection option using a loafer/stacker was shown to be the best option, followed by ensiling and baling. Ranking of biomass transport systems is based on the cost of biomass transport, emissions during transport, traffic congestion, and the maturity of different technologies. At a capacity of 4 x 10^6 dry t/yr, rail transport was shown to be the best option, followed by truck transport and pipeline transport, respectively. These rankings depend highly on the assumed maturity of technologies and the scale of utilization. They may change if technologies such as loafing or ensiling (wet storage) methods prove to be infeasible for large-scale collection systems.
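    A minimal weighted-sum multicriteria ranking sketch; the criteria weights and normalized scores below are hypothetical placeholders, not the values used in the study:

```python
import numpy as np

alternatives = ["loafer/stacker", "ensiling", "baling"]
criteria = ["cost", "quality", "emissions", "energy input", "maturity"]

# Hypothetical scores normalized to [0, 1], higher is better after normalization.
scores = np.array([
    [0.9, 0.8, 0.7, 0.8, 0.6],
    [0.7, 0.7, 0.8, 0.7, 0.5],
    [0.6, 0.9, 0.6, 0.6, 0.9],
])
weights = np.array([0.35, 0.20, 0.15, 0.15, 0.15])   # assumed weights, sum to 1

totals = scores @ weights
for name, total in sorted(zip(alternatives, totals), key=lambda t: -t[1]):
    print(f"{name}: {total:.3f}")
```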

  8. Low-Rank Correction Methods for Algebraic Domain Decomposition Preconditioners

    DOE PAGES

    Li, Ruipeng; Saad, Yousef

    2017-08-01

    This study presents a parallel preconditioning method for distributed sparse linear systems, based on an approximate inverse of the original matrix, that adopts a general framework of distributed sparse matrices and exploits domain decomposition (DD) and low-rank corrections. The DD approach decouples the matrix and, once inverted, a low-rank approximation is applied by exploiting the Sherman-Morrison-Woodbury formula, which yields two variants of the preconditioning methods. The low-rank expansion is computed by the Lanczos procedure with reorthogonalizations. Numerical experiments indicate that, when combined with Krylov subspace accelerators, this preconditioner can be efficient and robust for solving symmetric sparse linear systems. Comparisons with pARMS, a DD-based parallel incomplete LU (ILU) preconditioning method, are presented for solving Poisson's equation and linear elasticity problems.
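    The low-rank correction rests on the Sherman-Morrison-Woodbury identity; a generic sketch of an SMW solve is shown below (not the authors' parallel DD preconditioner), assuming a cheaply invertible base matrix and low-rank factors U, V:

```python
import numpy as np

def smw_solve(solve_A, U, V, b):
    """Solve (A + U V^T) x = b given solve_A(r) ~= A^{-1} r and low-rank factors U, V."""
    AinvB = solve_A(b)
    AinvU = solve_A(U)
    k = U.shape[1]
    capacitance = np.eye(k) + V.T @ AinvU          # small k x k system
    return AinvB - AinvU @ np.linalg.solve(capacitance, V.T @ AinvB)

rng = np.random.default_rng(1)
n, k = 200, 5
d = rng.uniform(1.0, 2.0, n)                       # diagonal "base" matrix A = diag(d)
U = rng.normal(size=(n, k))
V = rng.normal(size=(n, k))
b = rng.normal(size=n)

solve_A = lambda r: (r.T / d).T                    # A is diagonal, so A^{-1} r is an elementwise divide
x = smw_solve(solve_A, U, V, b)
print(np.linalg.norm((np.diag(d) + U @ V.T) @ x - b))   # residual should be near machine precision
```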

  9. Low-Rank Correction Methods for Algebraic Domain Decomposition Preconditioners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ruipeng; Saad, Yousef

    This study presents a parallel preconditioning method for distributed sparse linear systems, based on an approximate inverse of the original matrix, that adopts a general framework of distributed sparse matrices and exploits domain decomposition (DD) and low-rank corrections. The DD approach decouples the matrix and, once inverted, a low-rank approximation is applied by exploiting the Sherman-Morrison-Woodbury formula, which yields two variants of the preconditioning methods. The low-rank expansion is computed by the Lanczos procedure with reorthogonalizations. Numerical experiments indicate that, when combined with Krylov subspace accelerators, this preconditioner can be efficient and robust for solving symmetric sparse linear systems. Comparisons with pARMS, a DD-based parallel incomplete LU (ILU) preconditioning method, are presented for solving Poisson's equation and linear elasticity problems.

  10. Potential postwildfire debris-flow hazards: a prewildfire evaluation for the Sandia and Manzano Mountains and surrounding areas, central New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Haas, Jessica R.; Miller, Lara W.; Scott, Joe H.; Thompson, Matthew P.

    2014-01-01

    Wildfire can drastically increase the probability of debris flows, a potentially hazardous and destructive form of mass wasting, in landscapes that have otherwise been stable throughout recent history. Although there is no way to know the exact location, extent, and severity of a wildfire, or the subsequent rainfall intensity and duration, before it happens, probabilities of fire and debris-flow occurrence for different locations can be estimated with geospatial analysis and modeling efforts. The purpose of this report is to provide information on which watersheds might constitute the most serious potential debris-flow hazards in the event of a large-scale wildfire and subsequent rainfall in the Sandia and Manzano Mountains. Potential probabilities and estimated volumes of postwildfire debris flows in the unburned Sandia and Manzano Mountains and surrounding areas were estimated using empirical debris-flow models developed by the U.S. Geological Survey in combination with fire behavior and burn probability models developed by the U.S. Department of Agriculture Forest Service. The locations of the greatest debris-flow hazards correlate with the areas of steepest slopes and simulated crown-fire behavior. The four subbasins with the highest computed debris-flow probabilities (greater than 98 percent) were all in the Manzano Mountains, two flowing east and two flowing west. Volumes in sixteen subbasins were greater than 50,000 cubic meters, and most of these were in the central Manzanos and on the west-facing slopes of the Sandias. Five subbasins on the west-facing slopes of the Sandia Mountains, four of which have downstream reaches that lead into the outskirts of the City of Albuquerque, are among subbasins in the 98th percentile of integrated relative debris-flow hazard rankings. The bulk of the remaining subbasins in the 98th percentile of integrated relative debris-flow hazard rankings are located along the highest and steepest slopes of the Manzano Mountains. One of these subbasins is several miles upstream from the community of Tajique and another is several miles upstream from the community of Manzano, both on the eastern slopes of the Manzano Mountains. This prewildfire assessment approach is valuable to resource managers because the analysis of the debris-flow threat is made before a wildfire occurs, which facilitates prewildfire management, planning, and mitigation. In northern New Mexico, widespread watershed restoration efforts are being carried out to safeguard vital watersheds against the threat of catastrophic wildfire. This study was initiated to help select locations where restoration efforts could have the best return on investment.

  11. Minority Student Academic Performance under the Uniform Admission Law: Evidence from the University of Texas at Austin

    ERIC Educational Resources Information Center

    Niu, Sunny X.; Tienda, Marta

    2010-01-01

    The University of Texas at Austin administrative data between 1990 and 2003 are used to evaluate claims that students granted automatic admission based on top 10% class rank underperform academically relative to lower ranked students who graduate from highly competitive high schools. Compared with White students ranked at or below the third…

  12. A Ranking Analysis of the Management Schools in Greater China (2000-2010): Evidence from the SSCI Database

    ERIC Educational Resources Information Center

    Hou, Mingjun; Fan, Peihua; Liu, Heng

    2014-01-01

    The authors rank the management schools in Greater China (including Mainland China, Hong Kong, Taiwan, and Macau) based on their academic publications in the Social Sciences Citation Index management and business journals from 2000 to 2010. Following K. Ritzberger's (2008) and X. Yu and Z. Gao's (2010) ranking method, the authors develop six…

  13. Investigating a Judgemental Rank-Ordering Method for Maintaining Standards in UK Examinations

    ERIC Educational Resources Information Center

    Black, Beth; Bramley, Tom

    2008-01-01

    A new judgemental method of equating raw scores on two tests, based on rank-ordering scripts from both tests, has been developed by Bramley. The rank-ordering method has potential application as a judgemental standard-maintaining mechanism, because given a mark on one test (e.g. the A grade boundary mark), the equivalent mark (i.e. at the same…

  14. Determination of Career Planning Profiles of Turkish Athletes Who Are Ranked in the Olympics

    ERIC Educational Resources Information Center

    Hulya, Bingol; Cemal, Gundogdu; Sukru, Bingol

    2012-01-01

    This study researched the level of career planning of Turkish athletes who ranked in the Olympics, both during the time they were active in sports and after they retired. This study, which aimed to determine the career planning efficiency of Turkish athletes ranked in the Olympics based on the viewpoints of the athletes holding an Olympic degree, is scanning…

  15. CT Image Sequence Restoration Based on Sparse and Low-Rank Decomposition

    PubMed Central

    Gou, Shuiping; Wang, Yueyue; Wang, Zhilong; Peng, Yong; Zhang, Xiaopeng; Jiao, Licheng; Wu, Jianshe

    2013-01-01

    Blurry organ boundaries and soft tissue structures present a major challenge in biomedical image restoration. In this paper, we propose a low-rank decomposition-based method for computed tomography (CT) image sequence restoration, where the CT image sequence is decomposed into a sparse component and a low-rank component. A new point spread function (PSF) for the Wiener filter is employed to efficiently remove blur in the sparse component, and Wiener filtering with a Gaussian PSF is used to recover the average image of the low-rank component. The recovered CT image sequence is then obtained by combining the recovered low-rank image with the recovered sparse image sequence. Our method achieves restoration results with higher contrast, sharper organ boundaries and richer soft tissue structure information, compared with existing CT image restoration methods. The robustness of our method was assessed with numerical experiments using three different low-rank models: Robust Principal Component Analysis (RPCA), Linearized Alternating Direction Method with Adaptive Penalty (LADMAP) and Go Decomposition (GoDec). Experimental results demonstrated that the RPCA model was the most suitable for CT images with small noise, whereas the GoDec model was the best for CT images with large noise. PMID:24023764
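    A simplified alternating sparse-plus-low-rank split in the spirit of GoDec/RPCA is sketched below on synthetic data (illustrative only; the rank, threshold, and iteration count are assumptions, and this is not the exact solver compared in the paper):

```python
import numpy as np

def sparse_lowrank_split(X, rank=2, sparsity_thresh=0.1, iters=30):
    """Alternate: L = rank-`rank` SVD approximation of X - S; S = soft-threshold of X - L."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        R = X - L
        S = np.sign(R) * np.maximum(np.abs(R) - sparsity_thresh, 0.0)   # soft thresholding
    return L, S

rng = np.random.default_rng(0)
true_L = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 80))            # low-rank "background"
true_S = (rng.random((60, 80)) < 0.05) * rng.normal(scale=5.0, size=(60, 80))  # sparse outliers
L, S = sparse_lowrank_split(true_L + true_S, rank=2, sparsity_thresh=0.5)
print(np.linalg.norm(L - true_L) / np.linalg.norm(true_L))              # relative recovery error
```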

  16. Postwildfire debris-flow hazard assessment of the area burned by the 2012 Little Bear Fire, south-central New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Matherne, Anne Marie

    2013-01-01

    A preliminary hazard assessment was developed of the debris-flow potential from 56 drainage basins burned by the Little Bear Fire in south-central New Mexico in June 2012. The Little Bear Fire burned approximately 179 square kilometers (km2) (44,330 acres), including about 143 km2 (35,300 acres) of National Forest System lands of the Lincoln National Forest. Within the Lincoln National Forest, about 72 km2 (17,664 acres) of the White Mountain Wilderness were burned. The burn area also included about 34 km2 (8,500 acres) of private lands. Burn severity was high or moderate on 53 percent of the burn area. The area burned is at risk of substantial postwildfire erosion, such as that caused by debris flows and flash floods. A postwildfire debris-flow hazard assessment of the area burned by the Little Bear Fire was performed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture Forest Service, Lincoln National Forest. A set of two empirical hazard-assessment models developed by using data from recently burned drainage basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burn area drainage network and for selected drainage basins within the burn area. The models incorporate measures of areal burn extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. Relative hazard rankings of postwildfire debris flows were produced by summing the estimated probability and volume ranking to illustrate those areas with the highest potential occurrence of debris flows with the largest volumes. The probability that a drainage basin could produce debris flows and the volume of a possible debris flow at the basin outlet were estimated for three design storms: (1) a 2-year-recurrence, 30-minute-duration rainfall of 27 millimeters (mm) (a 50 percent chance of occurrence in any given year); (2) a 10-year-recurrence, 30-minute-duration rainfall of 42 mm (a 10 percent chance of occurrence in any given year); and (3) a 25-year-recurrence, 30-minute-duration rainfall of 51 mm (a 4 percent chance of occurrence in any given year). Thirty-nine percent of the 56 drainage basins modeled have a high (greater than 80 percent) probability of debris flows in response to the 2-year design storm; 80 percent of the modeled drainage basins have a high probability of debris flows in response to the 25-year design storm. For debris-flow volume, 7 percent of the modeled drainage basins have an estimated debris-flow volume greater than 100,000 cubic meters (m3) in response to the 2-year design storm; 9 percent of the drainage basins are included in the greater than 100,000 m3 category for both the 10-year and the 25-year design storms. Drainage basins in the greater than 100,000 m3 volume category also received the highest combined hazard ranking. The maps presented herein may be used to prioritize areas where emergency erosion mitigation or other protective measures may be needed prior to rainstorms within these drainage basins, their outlets, or areas downstream from these drainage basins within the 2- to 3-year period of vulnerability. This work is preliminary and is subject to revision. The assessment herein is provided on the condition that neither the U.S. Geological Survey nor the U.S. Government may be held liable for any damages resulting from the authorized or unauthorized use of the assessment.

  17. A Methodology for Evaluating and Ranking Water Quantity Indicators in Support of Ecosystem-Based Management

    NASA Astrophysics Data System (ADS)

    James, C. Andrew; Kershner, Jessi; Samhouri, Jameal; O'Neill, Sandra; Levin, Phillip S.

    2012-03-01

    Ecosystem-based Management (EBM) is an approach that includes different management priorities and requires a balance between anthropogenic and ecological resource demands. Indicators can be used to monitor ecosystem status and trends, and to assess whether projects and/or programs are leading to the achievement of management goals. As such, the careful selection of a suite of indicators is a crucial exercise. In this paper we describe an indicator evaluation and selection process designed to support the EBM approach in Puget Sound. The first step in this process was the development of a general framework for selecting indicators. The framework, designed to transparently include both scientific and policy considerations in the selection and evaluation process, was developed and then utilized in the organization and determination of a preliminary set of indicators. Next, the indicators were assessed against a set of nineteen distinct criteria that describe the model characteristics of an indicator. A literature review was performed for each indicator to determine the extent to which it satisfied each of the evaluation criteria. The result of each literature review was summarized in a numerical matrix, allowing comparison and demonstrating the extent of scientific reliability. Finally, an approach for ranking indicators was developed to explore the effects of intended purpose on indicator selection. We identified several sets of scientifically valid and policy-relevant indicators that included metrics such as the annual 7-day low flow and water system reliability, which are supportive of the EBM approach in the Puget Sound.

  18. Quantum Entanglement and Reduced Density Matrices

    NASA Astrophysics Data System (ADS)

    Purwanto, Agus; Sukamto, Heru; Yuwana, Lila

    2018-05-01

    We investigate entanglement and separability criteria of a multipartite (n-partite) state by examining the ranks of its reduced density matrices. First, we construct a general formula to determine the criterion. The rank of the original density matrix always equals one, whereas the reduced density matrices can have various ranks. Next, the separability or entanglement of a multipartite state is determined by calculating the ranks of its reduced density matrices. In this article we classify multipartite states into completely entangled states, completely separable states, and compound states, i.e. sub-entangled states and sub-entangled-separable states. Furthermore, we also shorten the calculation proposed by previous research to determine the separability of a multipartite state and extend the method to distinguish multipartite states based on the criteria above.
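    For a bipartite pure state, the rank criterion reduces to checking the rank of one reduced density matrix (rank 1 implies separable, rank greater than 1 implies entangled). A minimal sketch of that basic check, which does not reproduce the paper's full multipartite classification:

```python
import numpy as np

def reduced_rank(psi, dimA, dimB, tol=1e-12):
    """Rank of rho_A = Tr_B |psi><psi| for a pure state psi on C^dimA (x) C^dimB."""
    M = psi.reshape(dimA, dimB)          # coefficient matrix of the state
    rhoA = M @ M.conj().T                # partial trace over subsystem B
    eigvals = np.linalg.eigvalsh(rhoA)
    return int(np.sum(eigvals > tol))

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
product = np.kron([1, 0], [0, 1])            # |0> tensor |1>
print(reduced_rank(bell, 2, 2))              # 2 -> entangled
print(reduced_rank(product, 2, 2))           # 1 -> separable
```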

  19. Research Ranking of Iranian Universities of Medical Sciences Based on International Indicators: An Experience From I.R. of Iran.

    PubMed

    Baradaran Eftekhari, Monir; Sobhani, Zahra; Eltemasi, Masoumeh; Ghalenoee, Elham; Falahat, Katayoun; Habibi, Elham; Djalalinia, Shirin; Paykari, Niloofar; Ebadifar, Asghar; Akhondzadeh, Shahin

    2017-11-01

    In recent years, international ranking systems have been used by diverse users for various purposes. In most of these rankings, different aspects of the performance of universities and research institutes, especially scientific performance, have been evaluated and ranked. In this article, we report the results of a research ranking of Iranian universities of medical sciences (UMSs) based on some international indicators in 2015. In this study, after reviewing the research indicators of the majority of international ranking systems, and with the participation of key stakeholders, we selected eight research indicators, including research output, high-quality publications, leadership, total citations, citations per paper in 2015, papers per faculty member and h-index. The main sources for data gathering were Scopus, PubMed, and ISI Web of Science. Data were extracted and normalized for Iranian governmental UMSs for 2015. A total of 18023 articles with UMS affiliations were indexed in Scopus in 2015. Almost 17% of all articles were published in top journals and 15% were published with international collaborations. The maximum h-index (h-index = 110) belonged to Tehran University of Medical Sciences. The average number of papers per faculty member was 1.14 (Max = 2.5, Min = 0.13). The mean citation per article published in Scopus was 0.33. Research ranking of Iranian UMSs can create favorable competition among them towards knowledge production.

  20. Accurate template-based modeling in CASP12 using the IntFOLD4-TS, ModFOLD6, and ReFOLD methods.

    PubMed

    McGuffin, Liam J; Shuid, Ahmad N; Kempster, Robert; Maghrabi, Ali H A; Nealon, John O; Salehe, Bajuna R; Atkins, Jennifer D; Roche, Daniel B

    2018-03-01

    Our aim in CASP12 was to improve our Template-Based Modeling (TBM) methods through better model selection, accuracy self-estimate (ASE) scores and refinement. To meet this aim, we developed two new automated methods, which we used to score, rank, and improve upon the provided server models. Firstly, the ModFOLD6_rank method, for improved global Quality Assessment (QA), model ranking and the detection of local errors. Secondly, the ReFOLD method for fixing errors through iterative QA guided refinement. For our automated predictions we developed the IntFOLD4-TS protocol, which integrates the ModFOLD6_rank method for scoring the multiple-template models that were generated using a number of alternative sequence-structure alignments. Overall, our selection of top models and ASE scores using ModFOLD6_rank was an improvement on our previous approaches. In addition, it was worthwhile attempting to repair the detected errors in the top selected models using ReFOLD, which gave us an overall gain in performance. According to the assessors' formula, the IntFOLD4 server ranked 3rd/5th (average Z-score > 0.0/-2.0) on the server only targets, and our manual predictions (McGuffin group) ranked 1st/2nd (average Z-score > -2.0/0.0) compared to all other groups. © 2017 Wiley Periodicals, Inc.

  1. Local Knowledge When Ranking Journals: Reproductive Effects and Resistant Possibilities

    ERIC Educational Resources Information Center

    Canagarajah, Suresh

    2014-01-01

    This article is based on the engagement of a US-based scholar and faculty members in a non-Western university in a mentoring exercise on publishing. It demonstrates how the "list" constructed in a particular academic department in the university for ranking relevant journals for publication has reproductive effects on knowledge…

  2. Ranking Institutional Settings Based on Publications in Community Psychology Journals

    ERIC Educational Resources Information Center

    Jason, Leonard A.; Pokorny, Steven B.; Patka, Mazna; Adams, Monica; Morello, Taylor

    2007-01-01

    Two primary outlets for community psychology research, the "American Journal of Community Psychology" and the "Journal of Community Psychology", were assessed to rank institutions based on publication frequency and scientific influence of publications over a 32-year period. Three specific periods were assessed (1973-1983, 1984-1994, 1995-2004).…

  3. Ignition and Combustion of Pulverized Coal and Biomass under Different Oxy-fuel O2/N2 and O2/CO2 Environments

    NASA Astrophysics Data System (ADS)

    Khatami Firoozabadi, Seyed Reza

    This work studied the ignition and combustion of pulverized coal and biomass particles under either conventional combustion in air or oxy-fuel combustion conditions. Oxy-fuel combustion is a 'clean-coal' process that takes place in O2/CO2 environments, which are achieved by removing nitrogen from the intake gases and recirculating large amounts of flue gases to the boiler. Removal of nitrogen from the combustion gases generates a high CO2-content, sequestration-ready gas at the boiler effluent. Flue gas recirculation moderates the high temperatures caused by the elevated oxygen partial pressure in the boiler. In this study, combustion of the fuels took place in a laboratory laminar-flow drop-tube furnace (DTF), electrically heated to 1400 K, in environments containing various mole fractions of oxygen in either nitrogen or carbon-dioxide background gases. The experiments were conducted at two different gas conditions inside the furnace: (a) a quiescent gas condition (i.e., no flow or inactive flow) and (b) an active gas flow condition in both the injector and furnace. Eight coals of different ranks (anthracite, semi-anthracite, three bituminous, subbituminous and two lignites) and four biomasses from different sources were utilized in this work to study the ignition and combustion characteristics of solid fuels in O2/N2 or O2/CO2 environments. The main objective is to study the effects of replacing background N2 with CO2, of increasing the O2 mole fraction, and of fuel type and rank on a number of qualitative and quantitative parameters such as ignition/combustion mode, ignition temperature, ignition delay time, combustion temperatures, burnout times and envelope flame soot volume fractions. Regarding ignition, in the quiescent gas condition, bituminous and sub-bituminous coal particles experienced homogeneous ignition in both O2/N2 and O2/CO2 atmospheres, while in the active gas flow condition, heterogeneous ignition was evident in O2/CO2. Anthracite, semi-anthracite and lignites mostly experienced heterogeneous ignition in either O2/N2 or O2/CO2 atmospheres in both flow conditions. Replacing the N2 by CO2 slightly increased the ignition temperature (30-40 K). Ignition temperatures increased with coal rank in either air or oxy-fuel combustion conditions. However, increasing the oxygen mole fraction decreased the ignition temperature for all coals. The ignition delay of coal particles was prolonged in the slow-heating O2/CO2 atmospheres, relative to the faster-heating O2/N2 atmospheres, particularly at high diluent mole fractions. At higher O2 mole fractions, ignition delays decreased in both environments. Higher-rank fuels such as anthracite and semi-anthracite experienced longer ignition delays, while lower-rank fuels such as lignite and the biomasses experienced shorter ignition delay times. In combustion, fuel particles were observed to burn in either a two-mode or a one-mode fashion, depending on their rank and the furnace conditions. Strong tendencies were observed for all fuels to burn in one mode when N2 was replaced by CO2, and when the O2 mole fraction increased in both environments. Moreover, increasing the coal rank, from lignite to bituminous, enhanced the tendency of coal particles to exhibit a two-mode combustion behavior. Particle luminosity, fragmentation and deduced temperatures were higher in O2/N2 than in O2/CO2 atmospheres, and corresponding burnout times were shorter, at the same O2 mole fractions. Particle luminosity and temperatures increased with increasing O2 mole fractions in both N2 and CO2 background gases, and corresponding burnout times decreased with increasing O2 mole fractions. Bituminous coal particles swelled, whereas sub-bituminous coal particles exhibited limited fragmentation prior to and during the early stages of combustion. Lignite coal particles fragmented extensively and burned in one mode regardless of the O2 mole fraction and the background gas. The timing of fragmentation (before or after ignition) and the number of fragments depended on the type of the lignite and on the particle shape. Temperatures and burnout times of particles were also affected by the combustion mode. In nearly all bituminous coal and biomass particle combustion, sooty envelope flames were formed around the particles. Replacement of background N2 by CO2 gas decreased the average soot volume fraction, fv, whereas increasing O2 from 20% to 30-40% increased fv and further increasing O2 to 100% decreased the soot volume fraction drastically. Bituminous coal particle flames generated lower soot volume fractions, in the range 2x10^-5 to 9x10^-5, depending on the O2 mole fraction. Moreover, biomass particle flames were optically thin and of equal size at all O2 mole fractions. (Abstract shortened by UMI.)

  4. Combining groundwater quality analysis and a numerical flow simulation for spatially establishing utilization strategies for groundwater and surface water in the Pingtung Plain

    NASA Astrophysics Data System (ADS)

    Jang, Cheng-Shin; Chen, Ching-Fang; Liang, Ching-Ping; Chen, Jui-Sheng

    2016-02-01

    Overexploitation of groundwater is a common problem in the Pingtung Plain area of Taiwan, resulting in substantial drawdown of groundwater levels as well as the occurrence of severe seawater intrusion and land subsidence. Measures need to be taken to preserve these valuable groundwater resources. This study seeks to spatially determine the most suitable locations on this plain for using surface water, instead of extracted groundwater, for drinking, irrigation, and aquaculture purposes, based on information obtained by combining a groundwater quality analysis with a numerical flow simulation, under the assumption that planned manmade lakes and reservoirs increase the water supply. The multivariate indicator kriging method is first used to estimate occurrence probabilities and to rank townships as suitable or unsuitable for groundwater utilization according to water quality standards for drinking, irrigation, and aquaculture. A numerical model of groundwater flow (MODFLOW) is adopted, after model calibration, to quantify the recovery of groundwater levels in townships when groundwater for drinking and agricultural demands has been replaced by surface water. Finally, townships in the Pingtung Plain with poor groundwater quality and significant increases in groundwater levels are prioritized for groundwater conservation planning based on the combined assessment of groundwater quality and quantity. The results of this study indicate that the integration of groundwater quality analysis and numerical flow simulation is capable of establishing sound strategies for the joint use of groundwater and surface water. Six southeastern townships are found to be suitable locations for replacing groundwater with surface water from manmade lakes or reservoirs to meet drinking, irrigation, and aquaculture demands.

  5. Production layout improvement by using line balancing and Systematic Layout Planning (SLP) at PT. XYZ

    NASA Astrophysics Data System (ADS)

    Buchari; Tarigan, U.; Ambarita, M. B.

    2018-02-01

    PT. XYZ is a wood processing company that produces semi-finished wood with a make-to-order production system. In the production process, it can be seen that the production line is not balanced. The imbalance of the production line is caused by differences in cycle time between work stations. In addition, there is another issue, namely an irregular material flow pattern that results in backtracking and long displacement distances. This study aimed to obtain an allocation of work elements to specific work stations and to propose an improved production layout based on the results of the line balancing. The method used for balancing is Ranked Positional Weight (RPW), also known as the Helgeson-Birnie method, while the method used to improve the layout is Systematic Layout Planning (SLP). Using Ranked Positional Weight (RPW), line efficiency increases to 84.86% and balance delay decreases to 15.14%. Improving the layout using Systematic Layout Planning (SLP) also gives good results, reducing the path length from 213.09 meters to 133.82 meters, a decrease of 37.2%.
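    A minimal Ranked Positional Weight (Helgeson-Birnie) sketch with hypothetical task times, precedence relations, and cycle time (not PT. XYZ's actual data):

```python
from collections import defaultdict

task_time = {"A": 5, "B": 3, "C": 4, "D": 6, "E": 2, "F": 4}
precedes = {"A": ["C"], "B": ["C"], "C": ["D", "E"], "D": ["F"], "E": ["F"], "F": []}
cycle_time = 10

def successors(task, seen=None):
    """All tasks that (transitively) follow `task` in the precedence graph."""
    seen = set() if seen is None else seen
    for nxt in precedes[task]:
        if nxt not in seen:
            seen.add(nxt)
            successors(nxt, seen)
    return seen

# Positional weight = own time + times of all transitive successors.
pw = {t: task_time[t] + sum(task_time[s] for s in successors(t)) for t in task_time}
order = sorted(task_time, key=lambda t: -pw[t])

preds = defaultdict(set)
for t, nxts in precedes.items():
    for nxt in nxts:
        preds[nxt].add(t)

stations, assigned = [], set()
while len(assigned) < len(task_time):
    load, station, progress = 0, [], True
    while progress:
        progress = False
        for t in order:   # highest positional weight first
            if t in assigned or not preds[t] <= assigned or load + task_time[t] > cycle_time:
                continue
            station.append(t); assigned.add(t); load += task_time[t]
            progress = True
            break
    stations.append((station, load))

for i, (tasks, load) in enumerate(stations, 1):
    print(f"Station {i}: {tasks} (load {load}/{cycle_time})")
```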

  6. Arbitrage model for optimal capital transactions in petroleum reserves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ten Eyck, D.K.

    1986-01-01

    This dissertation provides a methodology for identifying price differentials in the market for petroleum reserves, enabling petroleum-producing firms to engage in a variation of classical arbitrage. This approach enables the petroleum-producing firm to evaluate and rank reserve-replacement projects from the three principal sources listed below in order to maximize the return on invested capital. The methodology is based on the discounted cash flow approach to valuation of the oil and gas reserves obtained (1) by exploration, (2) by direct purchase of reserves, and (3) by acquisition of an entire petroleum firm. The reserve-replacement projects are evaluated and ranked to determine an optimal portfolio of reserve-replacement projects. Cost per barrel alone is shown to be ineffective as an evaluation tool because it may lead to economic decisions that do not maximize the value of the firm. When used with other economic decision criteria, cost per barrel is useful as a downside economic indicator by showing which projects will fare better under unfavorable price scenarios. Important factors affecting the valuation of an acquisition (in addition to the oil and gas reserves) are shown by this study to be purchase price, other assets including cash, future tax savings from operating losses carried forward, and liabilities, primarily long-term debt.
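    A minimal discounted-cash-flow sketch for ranking reserve-replacement projects by net present value; the project names, cash flows, and discount rate are hypothetical, not the dissertation's data:

```python
# Cash flows in millions, year 0 = initial outlay (hypothetical values).
projects = {
    "exploration":      [-120.0, 20.0, 45.0, 60.0, 70.0],
    "reserve purchase": [-100.0, 30.0, 35.0, 40.0, 40.0],
    "firm acquisition": [-250.0, 60.0, 80.0, 90.0, 100.0],
}
rate = 0.10   # assumed discount rate

def npv(cash_flows, r):
    """Net present value of a cash-flow stream discounted at rate r."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

for name, cfs in sorted(projects.items(), key=lambda kv: -npv(kv[1], rate)):
    print(f"{name}: NPV = {npv(cfs, rate):.1f}")
```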

  7. Geometric constraints on the space of N = 2 SCFTs. Part I: physical constraints on relevant deformations

    NASA Astrophysics Data System (ADS)

    Argyres, Philip; Lotito, Matteo; Lü, Yongchao; Martone, Mario

    2018-02-01

    We initiate a systematic study of four dimensional N = 2 superconformal field theories (SCFTs) based on the analysis of their Coulomb branch geometries. Because these SCFTs are not uniquely characterized by their scale-invariant Coulomb branch geometries, we also need information on their deformations. We construct all inequivalent such deformations preserving N = 2 supersymmetry and additional physical consistency conditions in the rank 1 case. These include not only all the deformations previously predicted by S-duality, but also 16 additional deformations satisfying all the known N = 2 low energy consistency conditions. All but two of these additional deformations have recently been identified with new rank 1 SCFTs; these identifications are briefly reviewed. Some novel ingredients which are important for this study include: a discussion of RG-flows in the presence of a moduli space of vacua; a classification of local N = 2 supersymmetry-preserving deformations of unitary N = 2 SCFTs; and an analysis of charge normalizations and the Dirac quantization condition on Coulomb branches. This paper is the first in a series of three. The second paper [1] gives the details of the explicit construction of the Coulomb branch geometries discussed here, while the third [2] discusses the computation of central charges of the associated SCFTs.

  8. Highly accurate fast lung CT registration

    NASA Astrophysics Data System (ADS)

    Rühaak, Jan; Heldmann, Stefan; Kipshagen, Till; Fischer, Bernd

    2013-03-01

    Lung registration in thoracic CT scans has received much attention in the medical imaging community. Possible applications range from follow-up analysis, motion correction for radiation therapy, monitoring of air flow and pulmonary function to lung elasticity analysis. In a clinical environment, runtime is always a critical issue, ruling out quite a few excellent registration approaches. In this paper, a highly efficient variational lung registration method based on minimizing the normalized gradient fields distance measure with curvature regularization is presented. The method ensures diffeomorphic deformations by an additional volume regularization. Supplemental user knowledge, like a segmentation of the lungs, may be incorporated as well. The accuracy of our method was evaluated on 40 test cases from clinical routine. In the EMPIRE10 lung registration challenge, our scheme ranks third, with respect to various validation criteria, out of 28 algorithms with an average landmark distance of 0.72 mm. The average runtime is about 1:50 min on a standard PC, making it by far the fastest approach of the top-ranking algorithms. Additionally, the ten publicly available DIR-Lab inhale-exhale scan pairs were registered to subvoxel accuracy at computation times of only 20 seconds. Our method thus combines very attractive runtimes with state-of-the-art accuracy in a unique way.

  9. Time-Aware Service Ranking Prediction in the Internet of Things Environment

    PubMed Central

    Huang, Yuze; Huang, Jiwei; Cheng, Bo; He, Shuqing; Chen, Junliang

    2017-01-01

    With the rapid development of the Internet of things (IoT), building IoT systems with high quality of service (QoS) has become an urgent requirement in both academia and industry. During the procedures of building IoT systems, QoS-aware service selection is an important concern, which requires the ranking of a set of functionally similar services according to their QoS values. In reality, however, it is quite expensive and even impractical to evaluate all geographically-dispersed IoT services at a single client to obtain such a ranking. Nevertheless, distributed measurement and ranking aggregation have to deal with the high dynamics of QoS values and the inconsistency of partial rankings. To address these challenges, we propose a time-aware service ranking prediction approach named TSRPred for obtaining the global ranking from the collection of partial rankings. Specifically, a pairwise comparison model is constructed to describe the relationships between different services, where the partial rankings are obtained by time series forecasting on QoS values. The comparisons of IoT services are formulated by random walks, and thus, the global ranking can be obtained by sorting the steady-state probabilities of the underlying Markov chain. Finally, the efficacy of TSRPred is validated by simulation experiments based on large-scale real-world datasets. PMID:28448451
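    The random-walk aggregation step can be illustrated with a generic Markov-chain ranking sketch; the pairwise win counts and damping factor below are hypothetical, and this is not the exact TSRPred model:

```python
import numpy as np

# wins[i, j] = number of partial rankings in which service i beat service j (hypothetical counts).
wins = np.array([
    [0, 3, 4, 5],
    [1, 0, 3, 4],
    [0, 1, 0, 3],
    [0, 0, 1, 0],
], dtype=float)

n = wins.shape[0]
# Column-stochastic transition matrix: from service j, walk toward services that beat j.
P = wins / wins.sum(axis=0, keepdims=True)
P = 0.85 * P + 0.15 / n                      # damping keeps the chain irreducible

# Steady-state distribution = eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = np.abs(pi) / np.abs(pi).sum()

print(np.argsort(-pi))                       # services ordered from best to worst
```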

  10. Time-Aware Service Ranking Prediction in the Internet of Things Environment.

    PubMed

    Huang, Yuze; Huang, Jiwei; Cheng, Bo; He, Shuqing; Chen, Junliang

    2017-04-27

    With the rapid development of the Internet of things (IoT), building IoT systems with high quality of service (QoS) has become an urgent requirement in both academia and industry. During the procedures of building IoT systems, QoS-aware service selection is an important concern, which requires the ranking of a set of functionally similar services according to their QoS values. In reality, however, it is quite expensive and even impractical to evaluate all geographically-dispersed IoT services at a single client to obtain such a ranking. Nevertheless, distributed measurement and ranking aggregation have to deal with the high dynamics of QoS values and the inconsistency of partial rankings. To address these challenges, we propose a time-aware service ranking prediction approach named TSRPred for obtaining the global ranking from the collection of partial rankings. Specifically, a pairwise comparison model is constructed to describe the relationships between different services, where the partial rankings are obtained by time series forecasting on QoS values. The comparisons of IoT services are formulated by random walks, and thus, the global ranking can be obtained by sorting the steady-state probabilities of the underlying Markov chain. Finally, the efficacy of TSRPred is validated by simulation experiments based on large-scale real-world datasets.

  11. An effective PSO-based memetic algorithm for flow shop scheduling.

    PubMed

    Liu, Bo; Wang, Ling; Jin, Yi-Hui

    2007-02-01

    This paper proposes an effective particle swarm optimization (PSO)-based memetic algorithm (MA) for the permutation flow shop scheduling problem (PFSSP) with the objective to minimize the maximum completion time, which is a typical non-deterministic polynomial-time (NP) hard combinatorial optimization problem. In the proposed PSO-based MA (PSOMA), both PSO-based searching operators and some special local searching operators are designed to balance the exploration and exploitation abilities. In particular, the PSOMA applies the evolutionary searching mechanism of PSO, which is characterized by individual improvement, population cooperation, and competition to effectively perform exploration. On the other hand, the PSOMA utilizes several adaptive local searches to perform exploitation. First, to make PSO suitable for solving PFSSP, a ranked-order value rule based on random key representation is presented to convert the continuous position values of particles to job permutations. Second, to generate an initial swarm with certain quality and diversity, the famous Nawaz-Enscore-Ham (NEH) heuristic is incorporated into the initialization of population. Third, to balance the exploration and exploitation abilities, after the standard PSO-based searching operation, a new local search technique named NEH_1 insertion is probabilistically applied to some good particles selected by using a roulette wheel mechanism with a specified probability. Fourth, to enrich the searching behaviors and to avoid premature convergence, a simulated annealing (SA)-based local search with multiple different neighborhoods is designed and incorporated into the PSOMA. Meanwhile, an effective adaptive meta-Lamarckian learning strategy is employed to decide which neighborhood to be used in SA-based local search. Finally, to further enhance the exploitation ability, a pairwise-based local search is applied after the SA-based search. Simulation results based on benchmarks demonstrate the effectiveness of the PSOMA. Additionally, the effects of some parameters on optimization performances are also discussed.
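    The ranked-order value (random key) decoding mentioned above amounts to ranking the components of a particle's continuous position; a minimal sketch with assumed position values and processing times, together with the standard permutation flow shop makespan recursion:

```python
import numpy as np

position = np.array([0.72, 0.15, 0.98, 0.33, 0.54])   # continuous PSO position (example values)
permutation = np.argsort(position)                     # smallest value -> first job
print(permutation)                                     # -> [1 3 4 0 2]

def makespan(permutation, proc_times):
    """Completion time of the last job on the last machine for a permutation flow shop."""
    n_machines = proc_times.shape[1]
    finish = np.zeros(n_machines)
    for job in permutation:
        for m in range(n_machines):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc_times[job, m]
    return finish[-1]

# Hypothetical processing times: 5 jobs x 3 machines.
proc = np.array([[3, 2, 4], [2, 5, 1], [4, 1, 3], [2, 3, 3], [1, 4, 2]], dtype=float)
print(makespan(permutation, proc))
```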

  12. Ranking of predictor variables based on effect size criterion provides an accurate means of automatically classifying opinion column articles

    NASA Astrophysics Data System (ADS)

    Legara, Erika Fille; Monterola, Christopher; Abundo, Cheryl

    2011-01-01

    We demonstrate an accurate procedure based on linear discriminant analysis that allows automatic authorship classification of opinion column articles. First, we extract the following stylometric features of 157 column articles from four authors: statistics on high-frequency words, the number of words per sentence, and the number of sentences per paragraph. Then, by systematically ranking these features based on an effect size criterion, we show that we can achieve an average classification accuracy of 93% for the test set. In comparison, frequency-based ranking has an average accuracy of 80%. The highest possible average classification accuracy on our data relying merely on chance is ∼31%. By carrying out a sensitivity analysis, we show that the effect size criterion is superior to frequency ranking because there exist low-frequency words that significantly contribute to successful author discrimination. Consistent results are seen when the procedure is applied to classifying the undisputed Federalist papers of Alexander Hamilton and James Madison. To the best of our knowledge, this work is the first attempt at classifying opinion column articles, which, by virtue of being shorter (as compared to novels or short stories), are more prone to over-fitting issues. The near-perfect classification for the longer papers supports this claim. Our results provide an important insight on authorship attribution that has been overlooked in previous studies: that ranking discriminant variables based on word frequency counts is not necessarily an optimal procedure.
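    A hedged sketch of effect-size based feature ranking (Cohen's d between two authors) on synthetic word-frequency features; the paper's full multi-author discriminant pipeline is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = articles, columns = relative frequencies of candidate function words (synthetic).
author_a = rng.normal(loc=[0.030, 0.012, 0.020, 0.005], scale=0.004, size=(40, 4))
author_b = rng.normal(loc=[0.024, 0.012, 0.026, 0.005], scale=0.004, size=(40, 4))

def cohens_d(x, y):
    """Absolute standardized mean difference per feature (pooled standard deviation)."""
    pooled = np.sqrt((x.var(axis=0, ddof=1) + y.var(axis=0, ddof=1)) / 2.0)
    return np.abs(x.mean(axis=0) - y.mean(axis=0)) / pooled

d = cohens_d(author_a, author_b)
ranking = np.argsort(-d)          # features ordered by discriminative effect size
print(ranking, np.round(d[ranking], 2))
```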

  13. Compressed sparse tensor based quadrature for vibrational quantum mechanics integrals

    DOE PAGES

    Rai, Prashant; Sargsyan, Khachik; Najm, Habib N.

    2018-03-20

    A new method for fast evaluation of high dimensional integrals arising in quantum mechanics is proposed. Here, the method is based on sparse approximation of a high dimensional function followed by a low-rank compression. In the first step, we interpret the high dimensional integrand as a tensor in a suitable tensor product space and determine its entries by a compressed sensing based algorithm using only a few function evaluations. In the second step, we implement a rank reduction strategy to compress this tensor in a suitable low-rank tensor format using standard tensor compression tools. This allows representing a high dimensional integrand function as a small sum of products of low dimensional functions. Finally, a low dimensional Gauss–Hermite quadrature rule is used to integrate this low-rank representation, thus alleviating the curse of dimensionality. Numerical tests on synthetic functions, as well as on energy correction integrals for water and formaldehyde molecules, demonstrate the efficiency of this method using very few function evaluations as compared to other integration strategies.
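
    The computational payoff is that, once the integrand is a small sum of products of one-dimensional functions, the d-dimensional Gauss–Hermite integral factorizes into products of one-dimensional quadratures. The toy rank-2, three-dimensional integrand below is invented purely to show this factorization; it is not one of the paper's test cases.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

# Toy rank-2 integrand on R^3: f(x) = sum_r prod_d g_{r,d}(x_d),
# integrated against the Gauss-Hermite weight exp(-x_d^2) in each dimension.
g = [
    [np.cos, lambda x: 1.0 + x**2, np.exp],                 # rank-1 term
    [np.sin, lambda x: x**4, lambda x: 1.0 / (1.0 + x**2)], # rank-2 term
]

nodes, weights = hermgauss(40)   # 40-point 1-D rule reused in every dimension

# For a sum-of-products integrand the d-dimensional integral factorizes:
# I = sum_r prod_d ( sum_i w_i * g_{r,d}(x_i) )
integral = sum(np.prod([np.dot(weights, gd(nodes)) for gd in term]) for term in g)
print(integral)
```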

  15. Ranking and averaging independent component analysis by reproducibility (RAICAR).

    PubMed

    Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping

    2008-06-01

    Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data. Copyright 2007 Wiley-Liss, Inc.
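
    The reproducibility idea can be prototyped in a few lines: run ICA several times with different initializations, match components across realizations by absolute correlation, and rank them by how reproducible they are. The sketch below is a much-simplified stand-in (synthetic sources, greedy best-match scoring), not the full RAICAR alignment and averaging pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
# Synthetic data: 3 latent sources mixed into 10 observed channels, 500 samples
S = np.c_[np.sin(np.linspace(0, 30, 500)),
          np.sign(np.sin(np.linspace(0, 45, 500))),
          rng.normal(size=500)]
X = S @ rng.normal(size=(3, 10)) + 0.05 * rng.normal(size=(500, 10))

# Repeat ICA with different initializations
runs = [FastICA(n_components=3, random_state=s, max_iter=1000).fit_transform(X)
        for s in range(5)]

# Reproducibility of each component of run 0: best absolute correlation with
# any component of every other run, averaged (a simplified stand-in for RAICAR)
ref = runs[0]
scores = []
for k in range(ref.shape[1]):
    best = [max(abs(np.corrcoef(ref[:, k], other[:, j])[0, 1]) for j in range(other.shape[1]))
            for other in runs[1:]]
    scores.append(np.mean(best))

ranking = np.argsort(scores)[::-1]
print("components ranked by reproducibility:", ranking, np.round(scores, 3))
```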

  16. PageRank as a method to rank biomedical literature by importance.

    PubMed

    Yates, Elliot J; Dixon, Louise C

    2015-01-01

    Optimal ranking of literature importance is vital in overcoming article overload. Existing ranking methods are typically based on raw citation counts, giving a sum of 'inbound' links with no consideration of citation importance. PageRank, an algorithm originally developed for ranking webpages at the search engine, Google, could potentially be adapted to bibliometrics to quantify the relative importance weightings of a citation network. This article seeks to validate such an approach on the freely available, PubMed Central open access subset (PMC-OAS) of biomedical literature. On-demand cloud computing infrastructure was used to extract a citation network from over 600,000 full-text PMC-OAS articles. PageRanks and citation counts were calculated for each node in this network. PageRank is highly correlated with citation count (R = 0.905, P < 0.01) and we thus validate the former as a surrogate of literature importance. Furthermore, the algorithm can be run in trivial time on cheap, commodity cluster hardware, lowering the barrier of entry for resource-limited open access organisations. PageRank can be trivially computed on commodity cluster hardware and is linearly correlated with citation count. Given its putative benefits in quantifying relative importance, we suggest it may enrich the citation network, thereby overcoming the existing inadequacy of citation counts alone. We thus suggest PageRank as a feasible supplement to, or replacement of, existing bibliometric ranking methods.
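
    The comparison reported above can be reproduced in miniature with networkx: compute PageRank and raw inbound citation counts on a citation graph and correlate them. The random graph below is a toy stand-in for the PMC-OAS network.

```python
import networkx as nx
from scipy.stats import pearsonr

# Toy directed citation graph: an edge u -> v means "u cites v"
G = nx.gnp_random_graph(300, 0.02, seed=42, directed=True)

pagerank = nx.pagerank(G, alpha=0.85)          # importance weighting of each article
citations = dict(G.in_degree())                # raw inbound citation counts

nodes = list(G.nodes())
r, p = pearsonr([pagerank[n] for n in nodes], [citations[n] for n in nodes])
print(f"Pearson correlation between PageRank and citation count: r={r:.3f}, p={p:.3g}")
```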

  17. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  18. The Wilcoxon signed rank test for paired comparisons of clustered data.

    PubMed

    Rosner, Bernard; Glynn, Robert J; Lee, Mei-Ling T

    2006-03-01

    The Wilcoxon signed rank test is a frequently used nonparametric test for paired data (e.g., consisting of pre- and posttreatment measurements) based on independent units of analysis. This test cannot be used for paired comparisons arising from clustered data (e.g., if paired comparisons are available for each of two eyes of an individual). To incorporate clustering, a generalization of the randomization test formulation for the signed rank test is proposed, where the unit of randomization is at the cluster level (e.g., person), while the individual paired units of analysis are at the subunit within cluster level (e.g., eye within person). An adjusted variance estimate of the signed rank test statistic is then derived, which can be used for either balanced (same number of subunits per cluster) or unbalanced (different number of subunits per cluster) data, with an exchangeable correlation structure, with or without tied values. The resulting test statistic is shown to be asymptotically normal as the number of clusters becomes large, if the cluster size is bounded. Simulation studies are performed based on simulating correlated ranked data from a signed log-normal distribution. These studies indicate appropriate type I error for data sets with ≥20 clusters and a superior power profile compared with either the ordinary signed rank test based on the average cluster difference score or the multivariate signed rank test of Puri and Sen. Finally, the methods are illustrated with two data sets, (i) an ophthalmologic data set involving a comparison of electroretinogram (ERG) data in retinitis pigmentosa (RP) patients before and after undergoing an experimental surgical procedure, and (ii) a nutritional data set based on a randomized prospective study of nutritional supplements in RP patients where vitamin E intake outside of study capsules is compared before and after randomization to monitor compliance with nutritional protocols.
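
    The abstract describes a closed-form variance adjustment; the sketch below instead implements the underlying randomization idea directly, flipping the signs of all paired differences within a cluster at once so that the cluster is the unit of randomization. It is a permutation analogue for illustration only, not the authors' adjusted-variance statistic, and the eye-within-person data are hypothetical.

```python
import numpy as np
from scipy.stats import rankdata

def clustered_signed_rank_pvalue(diffs, clusters, n_perm=10000, seed=0):
    """Two-sided permutation p-value for paired differences with clustering.

    Signed ranks are computed over all subunits, but the sign flips of the
    randomization are applied per cluster, so the cluster (not the subunit)
    is the unit of randomization."""
    rng = np.random.default_rng(seed)
    diffs = np.asarray(diffs, dtype=float)
    clusters = np.asarray(clusters)
    signed_ranks = np.sign(diffs) * rankdata(np.abs(diffs))
    observed = signed_ranks.sum()

    cluster_ids = np.unique(clusters)
    # Sum of signed ranks within each cluster; flipping a cluster's sign
    # flips that whole cluster's contribution at once.
    per_cluster = np.array([signed_ranks[clusters == c].sum() for c in cluster_ids])

    flips = rng.choice([-1.0, 1.0], size=(n_perm, len(cluster_ids)))
    null = flips @ per_cluster
    return np.mean(np.abs(null) >= abs(observed))

# Hypothetical data: two eyes (subunits) per person (cluster), pre/post differences
diffs    = [0.4, 0.1, 0.5, 0.3, -0.1, 0.2, 0.6, 0.2]
clusters = [1, 1, 2, 2, 3, 3, 4, 4]
print(clustered_signed_rank_pvalue(diffs, clusters))
```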

  19. Convergence analysis of evolutionary algorithms that are based on the paradigm of information geometry.

    PubMed

    Beyer, Hans-Georg

    2014-01-01

    The convergence behaviors of so-called natural evolution strategies (NES) and of the information-geometric optimization (IGO) approach are considered. After a review of the NES/IGO ideas, which are based on information geometry, the implications of this philosophy w.r.t. optimization dynamics are investigated considering the optimization performance on the class of positive quadratic objective functions (the ellipsoid model). Exact differential equations describing the approach to the optimizer are derived and solved. It is rigorously shown that the original NES philosophy optimizing the expected value of the objective functions leads to very slow (i.e., sublinear) convergence toward the optimizer. This is the real reason why state-of-the-art implementations of IGO algorithms optimize the expected value of transformed objective functions, for example, by utility functions based on ranking. It is shown that these utility functions are localized fitness functions that change during the IGO flow. The governing differential equations describing this flow are derived. In the case of convergence, the solutions to these equations exhibit an exponentially fast approach to the optimizer (i.e., linear convergence order). Furthermore, it is proven that the IGO philosophy leads to an adaptation of the covariance matrix that equals, in the asymptotic limit and up to a scalar factor, the inverse of the Hessian of the objective function considered.
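
    To make the rank-based utility idea concrete, the sketch below runs a deliberately minimal isotropic NES on the ellipsoid model: samples are ranked by fitness, fixed utility weights replace raw objective values, and the mean and step size are updated from the weighted samples. Learning rates, population size, and the scalar step-size rule are illustrative choices, not those analyzed in the paper.

```python
import numpy as np

def ellipsoid(x):
    """Positive definite quadratic test function, f(x) = sum_i i * x_i^2."""
    return np.sum(np.arange(1, len(x) + 1) * x**2)

def simple_nes(f, dim=10, iters=400, pop=20, lr_mean=1.0, lr_sigma=0.1, seed=0):
    """Minimal isotropic NES with rank-based utilities (a sketch, not a tuned
    state-of-the-art IGO implementation)."""
    rng = np.random.default_rng(seed)
    mean, sigma = rng.normal(size=dim), 1.0

    # Standard rank-based utility weights (best sample gets the largest weight)
    ranks = np.arange(1, pop + 1)
    u = np.maximum(0.0, np.log(pop / 2 + 1) - np.log(ranks))
    u = u / u.sum() - 1.0 / pop

    for _ in range(iters):
        z = rng.normal(size=(pop, dim))          # standardized samples
        x = mean + sigma * z
        order = np.argsort([f(xi) for xi in x])  # ascending fitness = best first
        zs = z[order]
        mean += lr_mean * sigma * (u @ zs)       # rank-weighted mean update
        sigma *= np.exp(lr_sigma * (u @ (np.sum(zs**2, axis=1) - dim)) / 2)  # step-size update
    return mean, sigma

m, s = simple_nes(ellipsoid)
print("final f(mean):", ellipsoid(m), "sigma:", s)
```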

  20. Automated high-grade prostate cancer detection and ranking on whole slide images

    NASA Astrophysics Data System (ADS)

    Huang, Chao-Hui; Racoceanu, Daniel

    2017-03-01

    Recently, digital pathology (DP) has been greatly improved by developments in computer vision and machine learning. Automated detection of high-grade prostate carcinoma (HG-PCa) is an impactful medical use-case showing the paradigm of collaboration between DP and computer science: given a field of view (FOV) from a whole slide image (WSI), the computer-aided system is able to determine the grade by classifying the FOV. Various methods have been reported following this approach. However, two reasons motivate this work: first, there is still room for improvement in the detection accuracy of HG-PCa; second, clinical practice is more complex than simple image classification, and FOV ranking is also an essential step. For example, in clinical practice a pathologist usually evaluates a case based on a few FOVs from the given WSI and then makes a decision based on the most severe FOV. This important ranking scenario has not yet been well addressed. In this work, we introduce an automated detection and ranking system for PCa based on Gleason pattern discrimination. Our experiments suggest that the proposed system achieves high-accuracy detection (95.57% +/- 2.1%) and excellent ranking performance. Hence, the proposed system has great potential to support daily tasks in the clinical pathology routine.

  1. Phenotypic plasticity to light and nutrient availability alters functional trait ranking across eight perennial grassland species.

    PubMed

    Siebenkäs, Alrun; Schumacher, Jens; Roscher, Christiane

    2015-03-27

    Functional traits are often used as species-specific mean trait values in comparative plant ecology or trait-based predictions of ecosystem processes, assuming that interspecific differences are greater than intraspecific trait variation and that trait-based ranking of species is consistent across environments. Although this assumption is increasingly challenged, there is a lack of knowledge regarding to what degree the extent of intraspecific trait variation in response to varying environmental conditions depends on the considered traits and the characteristics of the studied species to evaluate the consequences for trait-based species ranking. We studied functional traits of eight perennial grassland species classified into different functional groups (forbs vs. grasses) and varying in their inherent growth stature (tall vs. small) in a common garden experiment with different environments crossing three levels of nutrient availability and three levels of light availability over 4 months of treatment applications. Grasses and forbs differed in almost all above- and belowground traits, while trait differences related to growth stature were generally small. The traits showing the strongest responses to resource availability were similarly for grasses and forbs those associated with allocation and resource uptake. The strength of trait variation in response to varying resource availability differed among functional groups (grasses > forbs) and species of varying growth stature (small-statured > tall-statured species) in many aboveground traits, but only to a lower extent in belowground traits. These differential responses altered trait-based species ranking in many aboveground traits, such as specific leaf area, tissue nitrogen and carbon concentrations and above-belowground allocation (leaf area ratio and root : shoot ratio) at varying resource supply, while trait-based species ranking was more consistent in belowground traits. Our study shows that species grouping according to functional traits is valid, but trait-based species ranking depends on environmental conditions, thus limiting the applicability of species-specific mean trait values in ecological studies. Published by Oxford University Press on behalf of the Annals of Botany Company.
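
    One way to make the ranking-consistency question concrete is to rank species by a trait within each environment and compare the rankings with a rank correlation. The sketch below does this for a single hypothetical trait across invented treatments; the numbers are random stand-ins, not the study's measurements.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
species = [f"sp{i}" for i in range(8)]
environments = ["low_light", "high_light", "low_nutrient", "high_nutrient"]

# Hypothetical species-mean values of one trait (e.g. specific leaf area)
# measured under each environment.
sla = pd.DataFrame(rng.normal(loc=20, scale=4, size=(8, 4)),
                   index=species, columns=environments)

# Rank species within each environment and check how consistent the ranking is
ranks = sla.rank(axis=0)
rho, p = spearmanr(ranks["low_light"], ranks["high_light"])
print(ranks)
print(f"rank consistency between light treatments: rho={rho:.2f} (p={p:.2f})")
```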

  2. A hybrid variational ensemble data assimilation for the HIgh Resolution Limited Area Model (HIRLAM)

    NASA Astrophysics Data System (ADS)

    Gustafsson, N.; Bojarova, J.; Vignes, O.

    2014-02-01

    A hybrid variational ensemble data assimilation has been developed on top of the HIRLAM variational data assimilation. It provides the possibility of applying a flow-dependent background error covariance model during the data assimilation at the same time as full rank characteristics of the variational data assimilation are preserved. The hybrid formulation is based on an augmentation of the assimilation control variable with localised weights to be assigned to a set of ensemble member perturbations (deviations from the ensemble mean). The flow-dependency of the hybrid assimilation is demonstrated in single simulated observation impact studies and the improved performance of the hybrid assimilation in comparison with pure 3-dimensional variational as well as pure ensemble assimilation is also proven in real observation assimilation experiments. The performance of the hybrid assimilation is comparable to the performance of the 4-dimensional variational data assimilation. The sensitivity to various parameters of the hybrid assimilation scheme and the sensitivity to the applied ensemble generation techniques are also examined. In particular, the inclusion of ensemble perturbations with a lagged validity time has been examined with encouraging results.
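
    The core of a hybrid scheme of this kind is blending a static background-error covariance with a localized ensemble covariance. The numpy sketch below shows that blend on a 1-D grid; the weights, the simple taper function, and the synthetic perturbations are illustrative assumptions, not the HIRLAM formulation.

```python
import numpy as np

def taper(dist, length):
    """Simple compactly supported localization taper (not the exact
    Gaspari-Cohn polynomial): 1 at zero distance, decaying to 0 at `length`."""
    return np.clip(1.0 - dist / length, 0.0, 1.0) ** 2

rng = np.random.default_rng(7)
n, n_ens = 50, 20
grid = np.arange(n)

# Static (full-rank) background error covariance: exponential correlations
B_static = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 5.0)

# Ensemble perturbations (deviations from the ensemble mean) -> sample covariance
perts = rng.normal(size=(n, n_ens))
perts -= perts.mean(axis=1, keepdims=True)
B_ens = perts @ perts.T / (n_ens - 1)

# Localize the ensemble covariance (Schur product) and blend with the static B
L = taper(np.abs(grid[:, None] - grid[None, :]), length=15.0)
beta_static, beta_ens = 0.5, 0.5
B_hybrid = beta_static * B_static + beta_ens * (L * B_ens)
print("hybrid covariance shape:", B_hybrid.shape)
```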

  3. Quantitative Assessment of Foot Blood Flow by Using Dynamic Volume Perfusion CT Technique: A Feasibility Study.

    PubMed

    Hur, Saebeom; Jae, Hwan Jun; Jang, Yeonggul; Min, Seung-Kee; Min, Sang-Il; Lee, Dong Yeon; Seo, Sang Gyo; Kim, Hyo-Cheol; Chung, Jin Wook; Kim, Kwang Gi; Park, Eun-Ah; Lee, Whal

    2016-04-01

    To demonstrate the feasibility of foot blood flow measurement by using dynamic volume perfusion computed tomographic (CT) technique with the upslope method in an animal experiment and a human study. The human study was approved by the institutional review board, and written informed consent was obtained from all patients. The animal study was approved by the research animal care and use committee. A perfusion CT experiment was first performed by using rabbits. A color-coded perfusion map was reconstructed by using in-house perfusion analysis software based on the upslope method, and the measured blood flow on the map was compared with the reference standard microsphere method by using correlation analysis. A total of 17 perfusion CT sessions were then performed (a) once in five human patients and (b) twice (before and after endovascular revascularization) in six human patients. Perfusion maps of blood flow were reconstructed and analyzed. The Wilcoxon signed rank test was used to test for significant differences in blood flow before and after treatment. The animal experiment demonstrated a strong correlation (R² = 0.965) in blood flow between perfusion CT and the microsphere method. Perfusion maps were obtained successfully in 16 human clinical sessions (94%) with the use of 32 mL of contrast medium and an effective radiation dose of 0.31 mSv (k factor for the ankle, 0.0002). The plantar dermis showed the highest blood flow among all anatomic structures of the foot, including muscle, subcutaneous tissue, tendon, and bone. After a successful revascularization procedure, the blood flow of the plantar dermis increased by 153% (P = .031). The interpretations of the color-coded perfusion map correlated well with the clinical and angiographic findings. Perfusion CT could be used to measure foot blood flow in both animals and humans. It can be a useful modality for the diagnosis of peripheral arterial disease by providing quantitative information on foot perfusion status.
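
    For orientation, the upslope (maximum-slope) model estimates tissue blood flow as the peak rate of tissue enhancement divided by the peak arterial enhancement. The sketch below applies that textbook formula to idealized, made-up enhancement curves; it is not the authors' in-house software and ignores acquisition details such as baseline correction and partial-volume effects.

```python
import numpy as np

def upslope_perfusion(t, tissue_hu, artery_hu):
    """Maximum-slope (upslope) perfusion estimate:
    flow ~ max d(tissue)/dt divided by the arterial peak enhancement.
    With baseline-subtracted enhancement curves and t in seconds, the result
    is an approximate per-minute flow per unit tissue volume."""
    max_slope = np.max(np.gradient(tissue_hu, t))   # HU per second
    peak_aif = np.max(artery_hu)                    # HU
    return 60.0 * max_slope / peak_aif

# Hypothetical, idealized enhancement curves (baseline subtracted)
t = np.linspace(0, 40, 81)                                   # seconds
artery = 300 * np.exp(-0.5 * ((t - 12) / 4) ** 2)            # arterial input function
tissue = 25 * (1 - np.exp(-np.clip(t - 10, 0, None) / 8))    # slower tissue uptake
print(f"estimated flow: {upslope_perfusion(t, tissue, artery):.3f} mL/min per mL")
```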

  4. Non-normality and classification of amplification mechanisms in stability and resolvent analysis

    NASA Astrophysics Data System (ADS)

    Symon, Sean; Rosenberg, Kevin; Dawson, Scott T. M.; McKeon, Beverley J.

    2018-05-01

    Eigenspectra and pseudospectra of the mean-linearized Navier-Stokes operator are used to characterize amplification mechanisms in laminar and turbulent flows in which linear mechanisms are important. Success of mean flow (linear) stability analysis for a particular frequency is shown to depend on whether two scalar measures of non-normality agree: (1) the product between the resolvent norm and the distance from the imaginary axis to the closest eigenvalue and (2) the inverse of the inner product between the most amplified resolvent forcing and response modes. If they agree, the resolvent operator can be rewritten in its dyadic representation to reveal that the adjoint and forward stability modes are proportional to the forcing and response resolvent modes at that frequency. Hence the real parts of the eigenvalues are important since they are responsible for resonant amplification and the resolvent operator is low rank when the eigenvalues are sufficiently separated in the spectrum. If the amplification is pseudoresonant, then resolvent analysis is more suitable to understand the origin of observed flow structures. Two test cases are studied: low Reynolds number cylinder flow and turbulent channel flow. The first deals mainly with resonant mechanisms, hence the success of both classical and mean stability analysis with respect to predicting the critical Reynolds number and global frequency of the saturated flow. Both scalar measures of non-normality agree for the base and mean flows, and the region where the forcing and response modes overlap scales with the length of the recirculation bubble. In the case of turbulent channel flow, structures result from both resonant and pseudoresonant mechanisms, suggesting that both are necessary elements to sustain turbulence. Mean shear is exploited most efficiently by stationary disturbances while bounds on the pseudospectra illustrate how pseudoresonance is responsible for the most amplified disturbances at spatial wavenumbers and temporal frequencies corresponding to well-known turbulent structures. Some implications for flow control are discussed.
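
    The two scalar diagnostics can be computed directly for any small linear operator. The sketch below uses one reading of the abstract's definitions: measure (1) multiplies the resolvent norm by the distance from the forcing frequency on the imaginary axis to the nearest eigenvalue, and measure (2) inverts the inner product between the leading forcing and response singular vectors of the resolvent. The 2x2 operator is a toy example chosen to be strongly non-normal, not a flow operator from the paper.

```python
import numpy as np

def non_normality_measures(A, omega):
    """Two scalar diagnostics for amplification at frequency omega:
    (1) resolvent norm times the distance from i*omega to the nearest eigenvalue,
    (2) inverse of |<forcing mode, response mode>| from the leading SVD pair
        of the resolvent. Values near 1 suggest resonant (modal) amplification."""
    n = A.shape[0]
    R = np.linalg.inv(1j * omega * np.eye(n) - A)       # resolvent operator
    U, s, Vh = np.linalg.svd(R)
    response, forcing = U[:, 0], Vh[0, :].conj()        # most amplified output/input directions
    eigvals = np.linalg.eigvals(A)
    dist = np.min(np.abs(1j * omega - eigvals))
    m1 = s[0] * dist
    m2 = 1.0 / abs(np.vdot(forcing, response))
    return m1, m2

# Toy non-normal operator: stable eigenvalues plus strong off-diagonal coupling
A = np.array([[-0.1 + 1.0j, 50.0],
              [0.0, -0.2 + 1.1j]])
print(non_normality_measures(A, omega=1.0))
```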

  5. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Ammo Fire, San Diego County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Ammo Fire in San Diego County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 1.75 inches (44.45 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.
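
    The volume model is a multiple regression in burned area, steep-slope area, and storm rainfall. The sketch below shows only the shape of such a calculation for ranking basins; the log-linear form and every coefficient are placeholders, and the calibrated model is the one documented by Cannon and others (2007).

```python
import math

def median_debris_flow_volume(area_high_mod_km2, area_steep_km2, storm_rain_mm,
                              coeffs=(4.0, 0.5, 0.6, 0.3)):
    """Illustrative log-linear regression: ln(V) modelled from burn-severity
    area, steep-slope area, and triggering rainfall. The coefficients and the
    exact functional form are PLACEHOLDERS; see Cannon and others (2007) for
    the calibrated model."""
    b0, b1, b2, b3 = coeffs
    ln_v = (b0 + b1 * math.sqrt(area_high_mod_km2)
            + b2 * math.log(max(area_steep_km2, 1e-6))
            + b3 * math.sqrt(storm_rain_mm))
    return math.exp(ln_v)   # median volume in cubic metres (illustrative)

# Rank hypothetical basins by predicted volume for the 44.45 mm, 3-hour design storm
basins = {"basin_A": (1.2, 0.8), "basin_B": (0.3, 1.5), "basin_C": (2.0, 2.2)}
ranked = sorted(basins, key=lambda b: median_debris_flow_volume(*basins[b], 44.45), reverse=True)
print(ranked)
```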

  6. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Ranch Fire, Ventura and Los Angeles Counties, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Ranch Fire in Ventura and Los Angeles Counties, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 2.25 inches (57.15 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  7. Emergency assessment of debris-flow hazards from basins burned by the 2007 Harris Fire, San Diego County, southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Harris Fire in San Diego County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 1.75 inches (44.45 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  8. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Rice Fire, San Diego County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Rice Fire in San Diego County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 1.75 inches (44.45 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  9. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Poomacha Fire, San Diego County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Poomacha Fire in San Diego County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 2.25 inches (57.15 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  10. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Witch Fire, San Diego County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Witch Fire in San Diego County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 2.25 inches (57.15 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  11. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Slide and Grass Valley Fires, San Bernardino County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Slide and Grass Valley Fires in San Bernardino County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 3.50 inches (88.90 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  12. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Buckweed Fire, Los Angeles County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Buckweed Fire in Los Angeles County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 2.25 inches (57.15 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  13. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Canyon Fire, Los Angeles County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Canyon Fire in Los Angeles County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 2.25 inches (57.15 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  14. Emergency Assessment of Debris-Flow Hazards from Basins Burned by the 2007 Santiago Fire, Orange County, Southern California

    USGS Publications Warehouse

    Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Bauer, Mark A.; Stitt, Susan C.; Knifong, Donna L.; McNamara, Bernard J.; Roque, Yvonne M.

    2007-01-01

    INTRODUCTION The objective of this report is to present a preliminary emergency assessment of the potential for debris-flow generation from basins burned by the Santiago Fire in Orange County, southern California in 2007. Debris flows are among the most hazardous geologic phenomena; debris flows that followed wildfires in southern California in 2003 killed 16 people and caused tens of millions of dollars of property damage. A short period of even moderate rainfall on a burned watershed can lead to debris flows. Rainfall that is normally absorbed into hillslope soils can run off almost instantly after vegetation has been removed by wildfire. This causes much greater and more rapid runoff than is normal from creeks and drainage areas. Highly erodible soils in a burn scar allow flood waters to entrain large amounts of ash, mud, boulders, and unburned vegetation. Within the burned area and downstream, the force of rushing water, soil, and rock can destroy culverts, bridges, roadways, and buildings, potentially causing injury or death. This emergency debris-flow hazard assessment is presented as relative ranking of the predicted median volume of debris flows that can issue from basin outlets in response to 1.75 inches (44.45 mm) of rainfall over a 3-hour period. Such a storm has a 10-year return period. The calculation of debris flow volume is based on a multiple-regression statistical model that describes the median volume of material that can be expected from a recently burned basin as a function of the area burned at high and moderate severity, the basin area with slopes greater than or equal to 30 percent, and triggering storm rainfall. Cannon and others (2007) describe the methods used to generate the hazard maps. Identification of potential debris-flow hazards from burned drainage basins is necessary to issue warnings for specific basins, to make effective mitigation decisions, and to help plan evacuation timing and routes.

  15. Unmatched U.S. Allopathic Seniors in the 2015 Main Residency Match: A Study of Applicant Behavior, Interview Selection, and Match Outcome.

    PubMed

    Liang, Mei; Curtin, Laurie S; Signer, Mona M; Savoia, Maria C

    2017-07-01

    The application and interview behaviors of unmatched U.S. allopathic medical school senior students (U.S. seniors) participating in the 2015 National Resident Matching Program (NRMP) Main Residency Match were studied in conjunction with their United States Medical Licensing Examination (USMLE) Step 1 scores and ranking preferences to understand their effects on Match outcome. USMLE Step 1 score and preferred specialty information were reviewed for U.S. seniors who responded to the 2015 NRMP Applicant Survey. Unmatched U.S. seniors were categorized as "strong," "solid," "marginal," or "weak" based on the perceived competitiveness of their Step 1 scores compared with U.S. seniors who matched in the same preferred specialty. The numbers of applications sent, interviews obtained, and programs ranked also were examined by Match outcome. Strong unmatched U.S. seniors submitted significantly more applications to achieve and attend approximately the same number of interviews as strong matched U.S. seniors. Strong unmatched seniors ranked fewer programs than their matched counterparts. As a group, unmatched U.S. seniors were less likely than their matched counterparts to rank a mix of competitive and less competitive programs and more likely to rank programs based on their perceived likelihood of matching. A small number of unmatched U.S. seniors would have matched if they had ranked programs that ranked them. U.S. seniors' Match outcomes may be affected by applicant characteristics that negatively influence their selection for interviews, and their difficulties may be exacerbated by disadvantageous ranking behaviors.

  16. A New Direction of Cancer Classification: Positive Effect of Low-Ranking MicroRNAs.

    PubMed

    Li, Feifei; Piao, Minghao; Piao, Yongjun; Li, Meijing; Ryu, Keun Ho

    2014-10-01

    Many studies based on microRNA (miRNA) expression profiles have shown a new aspect of cancer classification. Because one characteristic of miRNA expression data is its high dimensionality, feature selection methods have been used to facilitate dimensionality reduction. These feature selection methods have had one shortcoming thus far: they only consider cases where the feature-to-class relationship is 1:1 or n:1. However, because one miRNA may influence more than one type of cancer, such miRNAs tend to be ranked low by traditional feature selection methods and are removed most of the time. Given the limited number of miRNAs, low-ranking miRNAs are also important to cancer classification. We considered both high- and low-ranking features to cover all problems (1:1, n:1, 1:n, and m:n) in cancer classification. First, we used the correlation-based feature selection method to select the high-ranking miRNAs, and chose the support vector machine, Bayes network, decision tree, k-nearest-neighbor, and logistic classifiers to construct the cancer classification. Then, we chose the Chi-square test, information gain, gain ratio, and Pearson's correlation feature selection methods to build the m:n feature subset, and used the selected miRNAs to perform cancer classification. The low-ranking miRNA expression profiles achieved higher classification accuracy compared with using only the high-ranking miRNAs from traditional feature selection methods. Our results demonstrate that the m:n feature subset reveals the positive effect of low-ranking miRNAs in cancer classification.
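
    A rough analogue of building an m:n feature subset is to combine several feature rankings and keep anything that scores well under at least one criterion, so that features a single criterion would rank low can still enter the classifier. The sketch below does this with scikit-learn on synthetic data standing in for a miRNA expression matrix; it mirrors the idea, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for a miRNA expression matrix (samples x miRNAs), shifted
# to be non-negative because chi2 requires non-negative features
X, y = make_classification(n_samples=120, n_features=200, n_informative=15, random_state=0)
X = X - X.min()

# Build a feature subset from two rankings (chi-square and mutual information),
# keeping features that score well under either criterion.
k = 30
chi_idx = set(SelectKBest(chi2, k=k).fit(X, y).get_support(indices=True))
mi_idx = set(SelectKBest(mutual_info_classif, k=k).fit(X, y).get_support(indices=True))
subset = sorted(chi_idx | mi_idx)

acc = cross_val_score(SVC(kernel="linear"), X[:, subset], y, cv=5).mean()
print(f"{len(subset)} selected features, CV accuracy = {acc:.3f}")
```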

  17. The Grassmannian Atlas: A General Framework for Exploring Linear Projections of High-Dimensional Data

    DOE PAGES

    Liu, S.; Bremer, P.-T.; Jayaraman, J. J.; ...

    2016-06-04

    Linear projections are one of the most common approaches to visualize high-dimensional data. Since the space of possible projections is large, existing systems usually select a small set of interesting projections by ranking a large set of candidate projections based on a chosen quality measure. However, while highly ranked projections can be informative, some lower ranked ones could offer important complementary information. Therefore, selection based on ranking may miss projections that are important to provide a global picture of the data. Here, the proposed work fills this gap by presenting the Grassmannian Atlas, a framework that captures the global structures of quality measures in the space of all projections, which enables a systematic exploration of many complementary projections and provides new insights into the properties of existing quality measures.

  18. Low-rank matrix decomposition and spatio-temporal sparse recovery for STAP radar

    DOE PAGES

    Sen, Satyabrata

    2015-08-04

    We develop space-time adaptive processing (STAP) methods by leveraging the advantages of sparse signal processing techniques in order to detect a slowly-moving target. We observe that the inherent sparse characteristics of a STAP problem can be formulated as the low-rankness of the clutter covariance matrix when compared to the total adaptive degrees-of-freedom, and also as the sparse interference spectrum on the spatio-temporal domain. By exploiting these sparse properties, we propose two approaches for estimating the interference covariance matrix. In the first approach, we consider a constrained matrix rank minimization problem (RMP) to decompose the sample covariance matrix into a low-rank positive semidefinite matrix and a diagonal matrix. The solution of the RMP is obtained by applying the trace minimization technique and the singular value decomposition with a matrix shrinkage operator. Our second approach deals with the atomic norm minimization problem to recover the clutter response-vector that has a sparse support on the spatio-temporal plane. We use standard convex-relaxation-based sparse-recovery techniques to find the solutions. With extensive numerical examples, we demonstrate the performance of the proposed STAP approaches in both ideal and practical scenarios, involving Doppler-ambiguous clutter ridges and spatial and temporal decorrelation effects. As a result, the low-rank matrix decomposition based solution requires a number of secondary measurements equal to about twice the clutter rank to attain near-ideal STAP performance, whereas the spatio-temporal sparsity based approach needs a considerably smaller number of secondary data.
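
    The first approach's goal, splitting a sample covariance into a low-rank positive semidefinite part plus a diagonal matrix, can be illustrated with a plain eigenvalue-shrinkage stand-in. The sketch below is that stand-in on simulated rank-deficient clutter; it is not the trace-minimization solver described in the abstract.

```python
import numpy as np

def lowrank_plus_diag(sample_cov, rank):
    """Split a Hermitian sample covariance into a rank-`rank` positive
    semidefinite part (dominant eigenpairs) plus a diagonal noise floor,
    using simple eigenvalue shrinkage."""
    w, V = np.linalg.eigh(sample_cov)              # ascending eigenvalues
    w_top, V_top = w[-rank:], V[:, -rank:]
    noise = np.mean(w[:-rank]) if rank < len(w) else 0.0
    low_rank = (V_top * np.maximum(w_top - noise, 0.0)) @ V_top.conj().T
    return low_rank, noise * np.eye(len(w))

rng = np.random.default_rng(0)
n, r, snapshots = 32, 4, 200
# Simulated clutter: rank-r structure buried in white noise
clutter_basis = rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))
data = clutter_basis @ (rng.standard_normal((r, snapshots)) + 1j * rng.standard_normal((r, snapshots)))
data += 0.1 * (rng.standard_normal((n, snapshots)) + 1j * rng.standard_normal((n, snapshots)))

R_hat = data @ data.conj().T / snapshots
L, D = lowrank_plus_diag(R_hat, rank=r)
print("relative approximation error:", np.linalg.norm(R_hat - (L + D)) / np.linalg.norm(R_hat))
```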

  19. Kriging for Simulation Metamodeling: Experimental Design, Reduced Rank Kriging, and Omni-Rank Kriging

    NASA Astrophysics Data System (ADS)

    Hosking, Michael Robert

    This dissertation improves an analyst's use of simulation by offering improvements in the utilization of kriging metamodels. There are three main contributions. First an analysis is performed of what comprises good experimental designs for practical (non-toy) problems when using a kriging metamodel. Second is an explanation and demonstration of how reduced rank decompositions can improve the performance of kriging, now referred to as reduced rank kriging. Third is the development of an extension of reduced rank kriging which solves an open question regarding the usage of reduced rank kriging in practice. This extension is called omni-rank kriging. Finally these results are demonstrated on two case studies. The first contribution focuses on experimental design. Sequential designs are generally known to be more efficient than "one shot" designs. However, sequential designs require some sort of pilot design from which the sequential stage can be based. We seek to find good initial designs for these pilot studies, as well as designs which will be effective if there is no following sequential stage. We test a wide variety of designs over a small set of test-bed problems. Our findings indicate that analysts should take advantage of any prior information they have about their problem's shape and/or their goals in metamodeling. In the event of a total lack of information we find that Latin hypercube designs are robust default choices. Our work is most distinguished by its attention to the higher levels of dimensionality. The second contribution introduces and explains an alternative method for kriging when there is noise in the data, which we call reduced rank kriging. Reduced rank kriging is based on using a reduced rank decomposition which artificially smoothes the kriging weights similar to a nugget effect. Our primary focus will be showing how the reduced rank decomposition propagates through kriging empirically. In addition, we show further evidence for our explanation through tests of reduced rank kriging's performance over different situations. In total, reduced rank kriging is a useful tool for simulation metamodeling. For the third contribution we will answer the question of how to find the best rank for reduced rank kriging. We do this by creating an alternative method which does not need to search for a particular rank. Instead it uses all potential ranks; we call this approach omnirank kriging. This modification realizes the potential gains from reduced rank kriging and provides a workable methodology for simulation metamodeling. Finally, we will demonstrate the use and value of these developments on two case studies, a clinic operation problem and a location problem. These cases will validate the value of this research. Simulation metamodeling always attempts to extract maximum information from limited data. Each one of these contributions will allow analysts to make better use of their constrained computational budgets.
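
    The reduced-rank idea can be illustrated with a kernel-based predictor in which the full covariance matrix is replaced by its leading eigenpairs plus a nugget. The sketch below is that minimal illustration on synthetic 1-D data, under an assumed squared-exponential covariance; it is not the dissertation's reduced rank or omni-rank formulation.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length**2))

def reduced_rank_kriging(x_train, y_train, x_test, rank=10, nugget=1e-2):
    """Simple (zero-mean) kriging predictor using a truncated eigendecomposition
    of the kernel matrix, plus a nugget, in place of the full-rank matrix."""
    K = rbf_kernel(x_train, x_train)
    w, V = np.linalg.eigh(K)
    w_r, V_r = w[-rank:], V[:, -rank:]                  # leading eigenpairs only
    K_approx = (V_r * w_r) @ V_r.T + nugget * np.eye(len(x_train))
    weights = np.linalg.solve(K_approx, y_train)
    return rbf_kernel(x_test, x_train) @ weights

rng = np.random.default_rng(5)
x_train = rng.uniform(0, 1, 40)
y_train = np.sin(6 * x_train) + 0.1 * rng.normal(size=40)   # noisy simulation output
x_test = np.linspace(0, 1, 5)
print(reduced_rank_kriging(x_train, y_train, x_test, rank=8))
```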

  20. CSmetaPred: a consensus method for prediction of catalytic residues.

    PubMed

    Choudhary, Preeti; Kumar, Shailesh; Bachhawat, Anand Kumar; Pandit, Shashi Bhushan

    2017-12-22

    Knowledge of catalytic residues can play an essential role in elucidating mechanistic details of an enzyme. However, experimental identification of catalytic residues is a tedious and time-consuming task, which can be expedited by computational predictions. Despite significant development in active-site prediction methods, one of the remaining issues is the ranked position of putative catalytic residues among all ranked residues. In order to improve the ranking of catalytic residues and their prediction accuracy, we have developed a meta-approach based method, CSmetaPred. In this approach, residues are ranked based on the mean of normalized residue scores derived from four well-known catalytic residue predictors. The mean residue score of CSmetaPred is combined with predicted pocket information to improve prediction performance in the meta-predictor CSmetaPred_poc. Both meta-predictors are evaluated on two comprehensive benchmark datasets and three legacy datasets using Receiver Operating Characteristic (ROC) and Precision Recall (PR) curves. The visual and quantitative analysis of ROC and PR curves shows that the meta-predictors outperform their constituent methods and that CSmetaPred_poc is the best of the evaluated methods. For instance, on the CSAMAC dataset CSmetaPred_poc (CSmetaPred) achieves the highest Mean Average Specificity (MAS), a scalar measure of the ROC curve, of 0.97 (0.96). Importantly, the median predicted rank of catalytic residues is the lowest (best) for CSmetaPred_poc. Considering residues ranked ≤20 classified as true positives in binary classification, CSmetaPred_poc achieves a prediction accuracy of 0.94 on the CSAMAC dataset. Moreover, on the same dataset CSmetaPred_poc predicts all catalytic residues within the top 20 ranks for ~73% of enzymes. Furthermore, benchmarking on comparatively modelled structures showed that using models yields better predictions than sequence-only predictions. These analyses suggest that CSmetaPred_poc is able to rank putative catalytic residues at lower (better) ranked positions, which can facilitate and expedite their experimental characterization. The benchmarking studies showed that employing a meta-approach to combine residue-level scores derived from well-known catalytic residue predictors can improve prediction accuracy as well as provide improved ranked positions of known catalytic residues. Hence, such predictions can assist experimentalists in prioritizing residues for mutational studies in their efforts to characterize catalytic residues. Both meta-predictors are available as a webserver at: http://14.139.227.206/csmetapred/ .
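
    The core scoring step is simple enough to sketch: normalize each constituent predictor's per-residue scores, average them, and rank residues by the mean. The example below uses three hypothetical predictors and six residues purely to show the mechanics; CSmetaPred itself combines four specific predictors and adds pocket information.

```python
import numpy as np

def consensus_ranks(score_lists):
    """Meta-ranking in the spirit of a consensus predictor: min-max normalize
    each predictor's residue scores, average them, and rank residues so that
    rank 1 is the most likely catalytic residue."""
    normalized = []
    for s in score_lists:
        s = np.asarray(s, dtype=float)
        span = s.max() - s.min()
        normalized.append((s - s.min()) / span if span > 0 else np.zeros_like(s))
    mean_score = np.mean(normalized, axis=0)
    order = np.argsort(mean_score)[::-1]               # best first
    ranks = np.empty_like(order)
    ranks[order] = np.arange(1, len(order) + 1)
    return mean_score, ranks

# Hypothetical scores for 6 residues from three different catalytic-residue predictors
scores = [[0.9, 0.2, 0.4, 0.1, 0.8, 0.3],
          [5.0, 1.0, 4.5, 0.5, 4.8, 2.0],
          [0.7, 0.1, 0.6, 0.2, 0.9, 0.4]]
mean_score, ranks = consensus_ranks(scores)
print("mean normalized score:", np.round(mean_score, 2))
print("residue ranks (1 = top candidate):", ranks)
```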

  1. Comparison of different eigensolvers for calculating vibrational spectra using low-rank, sum-of-product basis functions

    NASA Astrophysics Data System (ADS)

    Leclerc, Arnaud; Thomas, Phillip S.; Carrington, Tucker

    2017-08-01

    Vibrational spectra and wavefunctions of polyatomic molecules can be calculated at low memory cost using low-rank sum-of-product (SOP) decompositions to represent basis functions generated using an iterative eigensolver. Using an SOP tensor format does not determine the iterative eigensolver. The choice of the iterative eigensolver is limited by the need to restrict the rank of the SOP basis functions at every stage of the calculation. We have adapted, implemented and compared different reduced-rank algorithms based on standard iterative methods (block-Davidson algorithm, Chebyshev iteration) to calculate vibrational energy levels and wavefunctions of the 12-dimensional acetonitrile molecule. The effect of using low-rank SOP basis functions on the different methods is analysed and the numerical results are compared with those obtained with the reduced rank block power method. Relative merits of the different algorithms are presented, showing that the advantage of using a more sophisticated method, although mitigated by the use of reduced-rank SOP functions, is noticeable in terms of CPU time.
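
    All of the eigensolvers compared above share one step: after each operator application, the sum-of-products iterate must be recompressed so its rank stays bounded. The sketch below shows that recompression for a two-factor (matrix) case only; the function name, the random factors, and the target rank are illustrative assumptions, whereas the paper works with 12-dimensional tensors.

    ```python
    import numpy as np

    def recompress(U, V, max_rank):
        """Truncate a two-factor sum-of-products representation f = U @ V.T
        back to `max_rank` terms via an SVD of the small core matrix."""
        Qu, Ru = np.linalg.qr(U)             # U = Qu @ Ru
        Qv, Rv = np.linalg.qr(V)             # V = Qv @ Rv
        W, s, Zt = np.linalg.svd(Ru @ Rv.T)  # core SVD, cost independent of grid size
        r = min(max_rank, s.size)
        return Qu @ (W[:, :r] * s[:r]), Qv @ Zt[:r].T

    # Toy usage: compress a rank-8 representation down to rank 3.
    rng = np.random.default_rng(0)
    U, V = rng.standard_normal((200, 8)), rng.standard_normal((150, 8))
    U3, V3 = recompress(U, V, 3)
    print(U3.shape, V3.shape)   # (200, 3) (150, 3)
    ```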

  2. The spatial frequencies influence the aesthetic judgment of buildings transculturally.

    PubMed

    Vannucci, Manila; Gori, Simone; Kojima, Haruyuki

    2014-01-01

    Recent evidence has shown that buildings designed to be high-ranking, according to the Western architectural decorum, have more impact on the minds of their beholders than low-ranking buildings. Here we investigated whether and how the aesthetic judgment for high- and low-ranking buildings was affected by differences in cultural expertise and by power spectrum differences. A group of Italian and Japanese participants performed aesthetic judgment tasks, with line drawings of high- and low-ranking buildings and with their random-phase versions (an image with the exact power spectrum of the original but no longer recognizable). Irrespective of cultural expertise, high-ranking buildings and their respective random-phase versions received higher aesthetic judgments than low-ranking buildings and their random-phase versions. These findings indicate that high- and low-ranking buildings are differentiated by their aesthetic value, and they show that low-level visual processes influence the aesthetic judgment based on differences in the stimulus power spectrum, irrespective of the influence of cultural expertise.

  3. Analyzing panel acoustic contributions toward the sound field inside the passenger compartment of a full-size automobile.

    PubMed

    Wu, Sean F; Moondra, Manmohan; Beniwal, Ravi

    2015-04-01

    The Helmholtz equation least squares (HELS)-based nearfield acoustical holography (NAH) is utilized to analyze panel acoustic contributions toward the acoustic field inside the interior region of an automobile. Specifically, the acoustic power flows from individual panels are reconstructed, and relative contributions to sound pressure level and spectrum at any point of interest are calculated. Results demonstrate that by correlating the acoustic power flows from individual panels to the field acoustic pressure, one can correctly locate the panel allowing the most acoustic energy transmission into the vehicle interior. The panel on which the surface acoustic pressure amplitude is the highest should not be used as indicative of the panel responsible for the sound field in the vehicle passenger compartment. Another significant advantage of this HELS-based NAH is that measurements of the input data only need to be taken once by using a conformal array of microphones in the near field, and ranking of panel acoustic contributions to any field point can be readily performed. The transfer functions from individual panels of any vibrating structure to the acoustic pressure anywhere in space are calculated, not measured, thus significantly reducing the time and effort involved in panel acoustic contribution analyses.

  4. Network-based ranking methods for prediction of novel disease associated microRNAs.

    PubMed

    Le, Duc-Hau

    2015-10-01

    Many studies have shown roles of microRNAs in human disease, and a number of computational methods have been proposed to predict such associations by ranking candidate microRNAs according to their relevance to a disease. Among them, machine learning-based methods usually have a limitation in specifying non-disease microRNAs as negative training samples. Meanwhile, network-based methods are becoming dominant since they well exploit a "disease module" principle in microRNA functional similarity networks. Among these, the random walk with restart (RWR) algorithm-based method is currently the state of the art. The use of this algorithm was inspired by its success in predicting disease genes, because the "disease module" principle also exists in protein interaction networks. Besides, many algorithms designed for webpage ranking have been successfully applied in ranking disease candidate genes because web networks share topological properties with protein interaction networks. However, these algorithms have not yet been utilized for disease microRNA prediction. We constructed microRNA functional similarity networks based on shared targets of microRNAs, and then we integrated them with a microRNA functional synergistic network, which was recently identified. After analyzing topological properties of these networks, in addition to RWR, we assessed the performance of (i) PRINCE (PRIoritizatioN and Complex Elucidation), which was proposed for disease gene prediction; (ii) PageRank with Priors (PRP) and K-Step Markov (KSM), which were used for studying web networks; and (iii) a neighborhood-based algorithm. Analyses of topological properties showed that all microRNA functional similarity networks are small-world and scale-free. The performance of each algorithm was assessed based on average AUC values on 35 disease phenotypes and average rankings of newly discovered disease microRNAs. As a result, the performance on the integrated network was better than that on individual ones. In addition, the performance of PRINCE, PRP and KSM was comparable with that of RWR, whereas it was worst for the neighborhood-based algorithm. Moreover, all the algorithms were stable with respect to changes in their parameters. Finally, using the integrated network, we predicted six novel miRNAs (i.e., hsa-miR-101, hsa-miR-181d, hsa-miR-192, hsa-miR-423-3p, hsa-miR-484 and hsa-miR-98) associated with breast cancer. Network-based ranking algorithms, which were successfully applied for either disease gene prediction or for studying social/web networks, can also be used effectively for disease microRNA prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
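
    For readers unfamiliar with the baseline algorithm named above, here is a bare-bones random walk with restart on a similarity network. The column normalization, restart probability, tolerance, and toy network are generic assumptions, not the exact settings used in the study.

    ```python
    import numpy as np

    def random_walk_with_restart(W, seeds, restart=0.5, tol=1e-8, max_iter=1000):
        """Score nodes of a similarity network by a random walk with restart.

        W: (n, n) non-negative adjacency/similarity matrix.
        seeds: indices of known disease-associated nodes.
        Returns the stationary probability vector (higher = more relevant).
        """
        n = W.shape[0]
        col_sums = W.sum(axis=0)
        P = W / np.where(col_sums > 0, col_sums, 1.0)   # column-stochastic transitions
        p0 = np.zeros(n)
        p0[list(seeds)] = 1.0 / len(seeds)              # restart distribution
        p = p0.copy()
        for _ in range(max_iter):
            p_new = (1 - restart) * (P @ p) + restart * p0
            if np.abs(p_new - p).sum() < tol:
                break
            p = p_new
        return p_new

    # Toy usage: a 4-node network with node 0 as the only known disease microRNA.
    W = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(random_walk_with_restart(W, seeds=[0]))
    ```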

  5. A Framework to Assess the Cumulative Hydrological Impacts of Dams on flow Regime

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Wang, D.

    2016-12-01

    In this study we proposed a framework to assess the cumulative impact of dams on the hydrological regime, and the impacts of the Three Gorges Dam on the flow regime of the Yangtze River were investigated with the framework. We reconstructed the unregulated flow series to compare with the regulated flow series in the same period. Eco-surplus and eco-deficit and the Indicators of Hydrologic Alteration (IHA) parameters were used to examine the hydrological regime change. Among the IHA parameters, the Wilcoxon signed-rank test and principal component analysis identified the representative indicators of hydrological alterations. Eco-surplus and eco-deficit showed that the reservoir also changed the seasonal regime of the flows in autumn and winter. Changes in annual extreme flows and October flows have negative ecological implications downstream of the Three Gorges Dam. Ecological operation of the Three Gorges Dam is necessary to mitigate the negative effects on the river ecosystem in the middle reach of the Yangtze River. The framework proposed here could be a robust method to assess the cumulative impacts of reservoir operation.
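
    As a minimal illustration of the signed-rank step mentioned above, the snippet below compares a paired series of reconstructed unregulated and regulated monthly flows; all numbers are purely illustrative and not taken from the study.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical paired monthly flows (m^3/s): reconstructed unregulated vs. regulated.
    unregulated = np.array([21000, 18500, 16000, 14000, 12500, 11000,
                            15000, 19000, 23000, 26000, 24000, 22000])
    regulated   = np.array([20500, 18000, 15800, 14200, 13500, 12000,
                            14500, 18000, 21000, 22500, 23500, 21800])

    stat, p_value = wilcoxon(unregulated, regulated)
    print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p = {p_value:.3f}")
    # A small p-value would indicate a significant shift between the paired series.
    ```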

  6. Structural MRI-based detection of Alzheimer's disease using feature ranking and classification error.

    PubMed

    Beheshti, Iman; Demirel, Hasan; Farokhian, Farnaz; Yang, Chunlan; Matsuda, Hiroshi

    2016-12-01

    This paper presents an automatic computer-aided diagnosis (CAD) system based on feature ranking for detection of Alzheimer's disease (AD) using structural magnetic resonance imaging (sMRI) data. The proposed CAD system is composed of four systematic stages. First, global and local differences in the gray matter (GM) of AD patients compared to the GM of healthy controls (HCs) are analyzed using a voxel-based morphometry technique. The aim is to identify significant local differences in the volume of GM as volumes of interest (VOIs). Second, the voxel intensity values of the VOIs are extracted as raw features. Third, the raw features are ranked using seven feature-ranking methods, namely statistical dependency (SD), mutual information (MI), information gain (IG), Pearson's correlation coefficient (PCC), t-test score (TS), Fisher's criterion (FC), and the Gini index (GI). The features with higher scores are more discriminative. To determine the number of top features, the estimated classification error based on a training set made up of the AD and HC groups is calculated, with the vector size that minimizes this error selected as the number of top discriminative features. Fourth, the classification is performed using a support vector machine (SVM). In addition, a data fusion approach among the feature-ranking methods is introduced to improve the classification performance. The proposed method is evaluated using a dataset from ADNI (130 AD and 130 HC) with 10-fold cross-validation. The classification accuracy of the proposed automatic system for the diagnosis of AD is up to 92.48% using the sMRI data. An automatic CAD system for the classification of AD based on feature-ranking methods and classification error is proposed. In this regard, seven feature-ranking methods (i.e., SD, MI, IG, PCC, TS, FC, and GI) are evaluated. The optimal size of the top discriminative features is determined by the classification error estimation in the training phase. The experimental results indicate that the performance of the proposed system is comparable to that of state-of-the-art classification models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
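
    A compressed sketch of the rank-then-estimate-error idea is shown below, using scikit-learn's univariate F-score as a stand-in for the paper's seven ranking criteria; the candidate sizes, linear SVM, 5-fold cross-validation, and synthetic data are assumptions.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def rank_and_select(X, y, candidate_sizes=(10, 25, 50, 100)):
        """Rank features by a univariate F-score, then keep the top-k size whose
        cross-validated classification error on the training data is smallest."""
        scores, _ = f_classif(X, y)
        order = np.argsort(-scores)              # most discriminative first
        best_k, best_err = None, np.inf
        for k in candidate_sizes:
            acc = cross_val_score(SVC(kernel="linear"), X[:, order[:k]], y, cv=5).mean()
            if 1.0 - acc < best_err:
                best_k, best_err = k, 1.0 - acc
        return order[:best_k], best_err

    # Toy usage with synthetic "voxel" features standing in for VOI intensities.
    X, y = make_classification(n_samples=120, n_features=200, n_informative=15,
                               random_state=0)
    selected, err = rank_and_select(X, y)
    print(len(selected), round(err, 3))
    ```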

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slater, Paul B.

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing, to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric, and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be, independently of the metric (each of the seven inducing Haar measure) employed, twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate, 33.9982, of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.

  8. A method for integrating and ranking the evidence for biochemical pathways by mining reactions from text

    PubMed Central

    Miwa, Makoto; Ohta, Tomoko; Rak, Rafal; Rowley, Andrew; Kell, Douglas B.; Pyysalo, Sampo; Ananiadou, Sophia

    2013-01-01

    Motivation: To create, verify and maintain pathway models, curators must discover and assess knowledge distributed over the vast body of biological literature. Methods supporting these tasks must understand both the pathway model representations and the natural language in the literature. These methods should identify and order documents by relevance to any given pathway reaction. No existing system has addressed all aspects of this challenge. Method: We present novel methods for associating pathway model reactions with relevant publications. Our approach extracts the reactions directly from the models and then turns them into queries for three text mining-based MEDLINE literature search systems. These queries are executed, and the resulting documents are combined and ranked according to their relevance to the reactions of interest. We manually annotate document-reaction pairs with the relevance of the document to the reaction and use this annotation to study several ranking methods, using various heuristic and machine-learning approaches. Results: Our evaluation shows that the annotated document-reaction pairs can be used to create a rule-based document ranking system, and that machine learning can be used to rank documents by their relevance to pathway reactions. We find that a Support Vector Machine-based system outperforms several baselines and matches the performance of the rule-based system. The successful query extraction and ranking methods have been used to update our existing pathway search system, PathText. Availability: An online demonstration of PathText 2 and the annotated corpus are available for research purposes at http://www.nactem.ac.uk/pathtext2/. Contact: makoto.miwa@manchester.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23813008

  9. Stratified Failure: Educational Stratification and Students' Attributions of Their Mathematics Performance in 24 Countries

    ERIC Educational Resources Information Center

    Mijs, Jonathan J. B.

    2016-01-01

    Country rankings based on the Programme for International Student Assessment (PISA) invite politicians and specialists to speculate about the reasons their countries did well or failed to do well. Rarely, however, do we hear from the students on whose performance these rankings are based. This omission is unfortunate for two reasons. First,…

  10. Standard Errors of Equating for the Percentile Rank-Based Equipercentile Equating with Log-Linear Presmoothing

    ERIC Educational Resources Information Center

    Wang, Tianyou

    2009-01-01

    Holland and colleagues derived a formula for analytical standard error of equating using the delta-method for the kernel equating method. Extending their derivation, this article derives an analytical standard error of equating procedure for the conventional percentile rank-based equipercentile equating with log-linear smoothing. This procedure is…

  11. Does the uncertainty in the representation of terrestrial water flows affect precipitation predictability? A WRF-Hydro ensemble analysis for Central Europe

    NASA Astrophysics Data System (ADS)

    Arnault, Joel; Rummler, Thomas; Baur, Florian; Lerch, Sebastian; Wagner, Sven; Fersch, Benjamin; Zhang, Zhenyu; Kerandi, Noah; Keil, Christian; Kunstmann, Harald

    2017-04-01

    Precipitation predictability can be assessed by the spread within an ensemble of atmospheric simulations being perturbed in the initial, lateral boundary conditions and/or modeled processes within a range of uncertainty. Surface-related processes are more likely to change precipitation when synoptic forcing is weak. This study investigates the effect of uncertainty in the representation of terrestrial water flows on precipitation predictability. The tools used for this investigation are the Weather Research and Forecasting (WRF) model and its hydrologically-enhanced version WRF-Hydro, applied over Central Europe during April-October 2008. The WRF grid is that of COSMO-DE, with a resolution of 2.8 km. In WRF-Hydro, the WRF grid is coupled with a sub-grid at 280 m resolution to resolve lateral terrestrial water flows. Vertical flow uncertainty is considered by modifying the parameter controlling the partitioning between surface runoff and infiltration in WRF, and horizontal flow uncertainty is considered by comparing WRF with WRF-Hydro. Precipitation predictability is deduced from the spread of an ensemble based on three turbulence parameterizations. Model results are validated with E-OBS precipitation and surface temperature, ESA-CCI soil moisture, FLUXNET-MTE surface evaporation and GRDC discharge. It is found that the uncertainty in the representation of terrestrial water flows is more likely to significantly affect precipitation predictability when surface flux spatial variability is high. In comparison to the WRF ensemble, WRF-Hydro slightly improves the adjusted continuous ranked probability score of daily precipitation. The reproduction of observed daily discharge with Nash-Sutcliffe model efficiency coefficients up to 0.91 demonstrates the potential of WRF-Hydro for flood forecasting.
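
    The verification score mentioned above can be illustrated with the standard ensemble form of the continuous ranked probability score; the study reports an adjusted variant, which this sketch does not reproduce, and the numbers are invented.

    ```python
    import numpy as np

    def crps_ensemble(members, obs):
        """Continuous ranked probability score of one ensemble forecast against
        a single observation, via CRPS = E|X - y| - 0.5 * E|X - X'|."""
        members = np.asarray(members, dtype=float)
        term_obs = np.abs(members - obs).mean()
        term_spread = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
        return term_obs - term_spread

    # Toy usage: a 3-member daily precipitation forecast (mm) versus an observation.
    print(round(crps_ensemble([1.2, 3.4, 2.8], obs=2.5), 3))
    ```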

  12. Reforesting severely degraded grassland in the Lesser Himalaya of Nepal: Effects on soil hydraulic conductivity and overland flow production

    NASA Astrophysics Data System (ADS)

    Ghimire, Chandra Prasad; Bonell, Mike; Bruijnzeel, L. Adrian; Coles, Neil A.; Lubczynski, Maciek W.

    2013-12-01

    Degraded hillslopes in the Lesser Himalaya challenge local communities as a result of the frequent occurrence of overland flow and erosion during the rainy season and water shortages during the dry season. Reforestation is often perceived as an effective way of restoring predisturbance hydrological conditions, but heavy usage of reforested land in the region has been shown to hamper full recovery of soil hydraulic properties. This paper investigates the effect of reforestation and forest usage on field-saturated soil hydraulic conductivities (Kfs) near Dhulikhel, Central Nepal, by comparing degraded pasture, a footpath within the pasture, a 25-year-old pine reforestation, and little-disturbed natural forest. The hillslope hydrological implications of changes in Kfs with land-cover change were assessed via comparisons with measured rainfall intensities over different durations. High surface and near-surface Kfs in natural forest (82-232 mm h-1) rule out overland flow occurrence and favor vertical percolation. Conversely, corresponding Kfs for degraded pasture (18-39 mm h-1) and footpath (12-26 mm h-1) were conducive to overland flow generation during medium- to high-intensity storms and thus to local flash flooding. Pertinently, surface and near-surface Kfs in the heavily used pine forest remained similar to those for degraded pasture. Estimated monsoonal overland flow totals for degraded pasture, pine forest, and natural forest were 21.3%, 15.5%, and 2.5% of incident rainfall, respectively, reflecting the relative ranking of surface Kfs. Along with high water use by the pines, this lack of recovery of soil hydraulic properties under pine reforestation is shown to be a critical factor in the regionally observed decline in base flows following large-scale planting of pines and has important implications for regional forest management.

  13. Prestige versus citation volume as journal indices in cognitive neuroscience.

    PubMed

    Ward, Jamie

    2014-01-01

    In recent years, alternative measures of a journal's influence have been developed alongside those based on citation metrics (such as Impact Factor). These include the SCImago Journal Rank (SJR), which is adapted from algorithms used to prioritize webpages in search engines. It is considered a measure of "prestige" insofar as it takes into account the importance of links/citations and not just their total number. Taking a sample of 38 journals from within the field of cognitive neuroscience, it is shown that SJR and Impact Factor correlate highly (r = .83) but with a few large discrepancies in rankings. This journal, Cognitive Neuroscience, fares better on the prestige-based measure than might otherwise be expected from its citation-based rank.

  14. Health Information on Internet: Quality, Importance, and Popularity of Persian Health Websites

    PubMed Central

    Samadbeik, Mahnaz; Ahmadi, Maryam; Mohammadi, Ali; Mohseni Saravi, Beniamin

    2014-01-01

    Background: The Internet has provided great opportunities for disseminating both accurate and inaccurate health information. Therefore, the quality of information is a widespread concern affecting human life. Despite the substantial growth in the number of users, Persian health websites, and the proportion of internet-using patients, little is known about the quality of Persian medical and health websites. Objectives: The current study aimed to first assess the quality, popularity and importance of websites providing Persian health-related information, and second to evaluate the correlation of the popularity and importance ranking with quality score on the Internet. Materials and Methods: The sample websites were identified by entering the health-related keywords into the four most popular search engines of Iranian users based on the Alexa ranking at the time of the study. Each selected website was assessed using three qualified tools including the Bomba and Land Index, Google PageRank and the Alexa ranking. Results: The evaluated sites' characteristics (ownership structure, database, scope and objective) had no real effect on the Alexa traffic global rank, Alexa traffic rank in Iran, Google PageRank and Bomba total score. Most websites (78.9 percent, n = 56) were in the moderate category (8 ≤ x ≤ 11.99) based on their quality levels. There was no statistically significant association between Google PageRank and the Bomba index variables or the Alexa traffic global rank (P > 0.05). Conclusions: The Persian health websites had better Bomba quality scores in availability and usability guidelines as compared to other guidelines. The Google PageRank did not properly reflect the real quality of the evaluated websites, and Internet users seeking online health information should not merely rely on it for any kind of prejudgment regarding Persian health websites. However, they can use the Alexa rank for Iran as a primary filtering tool for these websites. Therefore, designing search engines dedicated to exploring accredited Persian health-related websites can be an effective method to access high-quality Persian health websites. PMID:24910795

  15. Free-breathing pediatric chest MRI: Performance of self-navigated golden-angle ordered conical ultrashort echo time acquisition.

    PubMed

    Zucker, Evan J; Cheng, Joseph Y; Haldipur, Anshul; Carl, Michael; Vasanawala, Shreyas S

    2018-01-01

    To assess the feasibility and performance of conical k-space trajectory free-breathing ultrashort echo time (UTE) chest magnetic resonance imaging (MRI) versus four-dimensional (4D) flow and the effects of 50% data subsampling and soft-gated motion correction. Thirty-two consecutive children who underwent both 4D flow and UTE ferumoxytol-enhanced chest MR (mean age: 5.4 years, range: 6 days to 15.7 years) in one 3T exam were recruited. From UTE k-space data, three image sets were reconstructed: 1) one with all data, 2) one using the first 50% of data, and 3) a final set with soft-gating motion correction, leveraging the signal magnitude immediately after each excitation. Two radiologists in blinded fashion independently scored image quality of anatomical landmarks on a 5-point scale. Ratings were compared using Wilcoxon rank-sum, Wilcoxon signed-ranks, and Kruskal-Wallis tests. Interobserver agreement was assessed with the intraclass correlation coefficient (ICC). For fully sampled UTE, mean scores for all structures were ≥4 (good-excellent). Full UTE surpassed 4D flow for lungs and airways (P < 0.001), with similar pulmonary artery (PA) quality (P = 0.62). 50% subsampling only slightly degraded all landmarks (P < 0.001), as did motion correction. Subsegmental PA visualization was possible in >93% of scans for all techniques (P = 0.27). Interobserver agreement was excellent for combined scores (ICC = 0.83). High-quality free-breathing conical UTE chest MR is feasible, surpassing 4D flow for lungs and airways, with equivalent PA visualization. Data subsampling only mildly degraded images, favoring shorter scan times. Soft-gating motion correction overall did not improve image quality. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:200-209. © 2017 International Society for Magnetic Resonance in Medicine.

  16. KENNEDY SPACE CENTER, FLA. - Louis MacDowell, Testbed manager, explains atmospheric calibration specimens to Center Director Jim Kennedy at the KSC Beach Corrosion Test Site

    NASA Image and Video Library

    2003-08-21

    KENNEDY SPACE CENTER, FLA. - Louis MacDowell (right), Testbed manager, explains to Center Director Jim Kennedy the use of atmospheric calibration specimens. Placed at various locations, they can rank the corrosivity of the given environment. The KSC Beach Corrosion Test Site was established in the 1960s and has provided more than 30 years of historical information on the long-term performance of many materials in use at KSC and other locations around the world. Located 100 feet from the Atlantic Ocean approximately 1 mile south of the Space Shuttle launch sites, the test facility includes an atmospheric exposure site, a flowing seawater exposure site, and an on-site electrochemistry laboratory and monitoring station. The beach laboratory is used to conduct real-time corrosion experiments and provides for the remote monitoring of surrounding weather conditions. The newly added flowing seawater immersion facility provides for the immersion testing of materials and devices under controlled conditions.

  17. Mapping the ecosystem service delivery chain: Capacity, flow, and demand pertaining to aesthetic experiences in mountain landscapes.

    PubMed

    Egarter Vigl, Lukas; Depellegrin, Daniel; Pereira, Paulo; de Groot, Rudolf; Tappeiner, Ulrike

    2017-01-01

    Accounting for the spatial connectivity between the provision of ecosystem services (ES) and their beneficiaries (supply-benefit chain) is fundamental to understanding ecosystem functioning and its management. However, the interrelationships of the specific chain links within ecosystems and the actual benefits that flow from natural landscapes to surrounding land have rarely been analyzed. We present a spatially explicit model for the analysis of one cultural ecosystem service (aesthetic experience), which integrates the complete ecosystem service delivery chain for Puez-Geisler Nature Park (Italy): (1) The potential service stock (ES capacity) relies on an expert-based land use ranking matrix, (2) the actual supply (ES flow) is based on visibility properties of observation points along recreational routes, (3) the beneficiaries of the service (ES demand) are derived from socioeconomic data as a measure of the visitation rate to the recreation location, and (4) the supply-demand relationship (ES budget) addresses the spatially explicit oversupply and undersupply of ES. The results indicate that potential ES stocks are substantially higher in core and buffer zones of protected areas than in surrounding land owing to the specific landscape composition. ES flow maps reveal service delivery to 80% of the total area studied, with the highest actual service supply to locations with long and open vistas. ES beneficiary analyses show the highest demand for aesthetic experiences in all-season tourist destinations like Val Badia and Val Gardena, where both recreational amenity and overnight stays are equally high. ES budget maps identify ES hot and cold spots in terms of ES delivery, and they highlight ES undersupply in nature protection buffer zones although they are characterized by highest ES capacity. We show how decision/policy makers can use the presented methodology to plan landscape protection measures and develop specific regulation strategies for visitors based on the ES delivery chain concept. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Determinants of immigration strategies in male crested macaques (Macaca nigra)

    PubMed Central

    Marty, Pascal R.; Hodges, Keith; Agil, Muhammad; Engelhardt, Antje

    2016-01-01

    Immigration into a new group can produce substantial costs due to resistance from residents, but also reproductive benefits. Whether or not individuals base their immigration strategy on prospective cost-benefit ratios remains unknown. We investigated individual immigration decisions in crested macaques, a primate species with a high reproductive skew in favour of high-ranking males. We found two different strategies. Males who achieved low rank in the new group usually immigrated after another male had immigrated within the previous 25 days and achieved high rank. They never got injured but also had low prospective reproductive success. We assume that these males benefitted from immigrating into a destabilized male hierarchy. Males who achieved high rank in the new group usually immigrated independently of previous immigrations. They received injuries more frequently and therefore bore immigration costs. They, however, also had higher reproductive success prospects. We conclude that male crested macaques base their immigration strategy on relative fighting ability and thus potential rank in the new group, i.e., potential reproductive benefits, as well as potential costs of injury. PMID:27535622

  19. Sparsity-Based Representation for Classification Algorithms and Comparison Results for Transient Acoustic Signals

    DTIC Science & Technology

    2016-05-01

    This report addresses the classification of transient acoustic signals using sparsity-based representations designed to cope with large but correlated noise and signal interference (i.e., low-rank interference); another contribution is the implementation of deep learning approaches. Report sections include Classification of Acoustic Transients, Joint Sparse Representation with Low-Rank Interference, and Simultaneous Group-and-Joint Sparse Representation. Keywords: sparse representation, low rank, deep learning.

  20. Compound Antidunes: a Key to Detect Catastrophic Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Yoshida, S.; Nemoto, Y.

    2008-12-01

    Antidunes are common in pyroclastic flow and surge deposits. However, the compound or nested occurrence of antidunes, where smaller antidunes reside within a larger-scale antidune, has seldom been documented or discussed in both pyroclastic and siliciclastic depositional settings. If this complexity is not recognized, the frequency and magnitude of volcanic eruptions estimated from pyroclastic deposits are severely unrealistic. We have documented the Holocene outcrops of the antidune-bearing pyroclastites in Niijima Island, 100 miles SSW of Tokyo, Japan. The pyroclastites were formed by the eruptions in 886 AD. Along the Habushiura coast in the southeastern part of the island, these outcrops form up to 50 m high cliffs, and are laterally traceable over 5 km from the volcano crater that shed the pyroclastites in the northward (downcurrent) direction. These pyroclastites were previously interpreted as recording about 30 small eruptions, each forming a 0.5-2 meter thick subhorizontal couplet of pumice (inversely graded) and lithic (normally graded) debris, with cm-m thick antidunes. However, we postulate that each of these couplets does not record a single volcanic eruption, but a much shorter time. These couplets occur between concave-up vertical accretion surfaces, which have both upstream- and downstream-migration components, within a 5-15 meter thick compound antidune (our "rank-1" antidune). Three erosively stacked compound antidunes form the coastal cliffs along the Habushiura coast, and each compound antidune is about ten times thicker than antidunes reported by earlier workers (corresponding to our "rank-2 antidunes" that nest within a rank-1 antidune, and "rank-3 antidunes" that nest within a rank-2 antidune). Hence, the Habushiura cliffs represent only three eruption events (instead of 30 events), but each represents an eruption of much larger magnitude. The geometry of these antidunes is comparable to "sediment waves" or "cyclic steps" of siliciclastic deposits recently reported from the modern deep sea (continental slope) and jökulhlaup (glacial outburst flood on land), and from flume studies. The erosional surfaces that separate rank-1 antidunes and hence individual eruption events are subhorizontal to slightly inclined to the upstream direction, and appear to onlap to the volcano's slope. Similar compound antidunes and erosion surfaces, both in size and geometry, occur within the older (c. 10-20 ka) pyroclastic deposits in Niijima and nearby volcanic islands, even though the chemical, mineral and lithologic compositions of pyroclastites associated with each volcano and eruption are highly variable. The geometry and size of these compound antidunes are remarkably similar to large "dunes" within the subaqueous pyroclastic-flow deposits within the Bay of Naples, associated with the AD 79 Mt. Vesuvius eruptions, recently reported by Italian researchers.

  1. Correlation of intra-tumor 18F-FDG uptake heterogeneity indices with perfusion CT derived parameters in colorectal cancer.

    PubMed

    Tixier, Florent; Groves, Ashley M; Goh, Vicky; Hatt, Mathieu; Ingrand, Pierre; Le Rest, Catherine Cheze; Visvikis, Dimitris

    2014-01-01

    Thirty patients with proven colorectal cancer prospectively underwent integrated 18F-FDG PET/DCE-CT to assess the metabolic-flow phenotype. Both CT blood flow parametric maps and PET images were analyzed. Correlations between PET heterogeneity and perfusion CT were assessed by Spearman's rank correlation analysis. Blood flow from the DCE-CT parametric maps was significantly correlated with the 18F-FDG PET metabolically active tumor volume as well as with uptake heterogeneity for patients with stage III/IV tumors (|ρ|:0.66 to 0.78; p-value<0.02). The positive correlation found with tumor blood flow indicates that intra-tumor heterogeneity of 18F-FDG PET accumulation reflects tracer distribution to some extent and consequently indicates that 18F-FDG PET intra-tumor heterogeneity may be associated with physiological processes such as tumor vascularization.
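
    The correlation analysis named above reduces to a one-line call once paired per-patient values are available; the numbers below are invented purely to show the mechanics.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical paired values per patient: DCE-CT blood flow versus an
    # 18F-FDG PET intra-tumor heterogeneity index (illustrative only).
    blood_flow    = np.array([35.2, 48.1, 22.7, 61.3, 40.8, 55.0, 30.4, 44.9])
    heterogeneity = np.array([0.31, 0.41, 0.18, 0.59, 0.37, 0.52, 0.25, 0.44])

    rho, p_value = spearmanr(blood_flow, heterogeneity)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
    ```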

  2. The Increasing Trend in Global Ranking of Websites of Iranian Medical Universities during January 2012-2015.

    PubMed

    Ramezan Ghorbani, Nahid; Fakour, Yousef; Nojoumi, Seyed Ali

    2017-08-01

    Researchers and academic institutions need assessment and rating to measure their performance. The criteria are designed to evaluate the quality and adequacy of research and are welcomed by most universities as an international process to increase the monitoring of academic achievements. The study aimed to evaluate the increasing trend in the global ranking of Iranian medical universities' websites, emphasizing a comparative approach. This is a cross-sectional study involving websites of Iranian medical universities. Sampling was conducted by census, selecting universities affiliated with the Ministry of Health in the webometrics rating system. Websites of Iranian medical universities were investigated based on the webometrics indicators and global ranking, as well as the process of changing their rating. Universities of medical sciences were associated with improved ratings in seven periods from Jan 2012 until Jan 2015. The highest rank was in Jan 2014. Tehran University of Medical Sciences ranked first in all periods. The highest ratings concerned the impact factor in universities of medical sciences, reflecting the low level of this index in university websites. The lowest ranking was observed in type 1 universities. Despite the criticisms and weaknesses of these webometrics criteria, they are critical to this equation and should be checked for authenticity and suitability of goals. Therefore, localizing these criteria by the advantages model, ranking systems features, continuous development and medical universities evaluation based on these indicators provide new opportunities for the development of the country, especially through online media.

  3. The Increasing Trend in Global Ranking of Websites of Iranian Medical Universities during January 2012–2015

    PubMed Central

    RAMEZAN GHORBANI, Nahid; FAKOUR, Yousef; NOJOUMI, Seyed Ali

    2017-01-01

    Background: Researchers and academic institutions need assessment and rating to measure their performance. The criteria are designed to evaluate the quality and adequacy of research and are welcomed by most universities as an international process to increase the monitoring of academic achievements. The study aimed to evaluate the increasing trend in the global ranking of Iranian medical universities' websites, emphasizing a comparative approach. Methods: This is a cross-sectional study involving websites of Iranian medical universities. Sampling was conducted by census, selecting universities affiliated with the Ministry of Health in the webometrics rating system. Websites of Iranian medical universities were investigated based on the webometrics indicators and global ranking, as well as the process of changing their rating. Universities of medical sciences were associated with improved ratings in seven periods from Jan 2012 until Jan 2015. Results: The highest rank was in Jan 2014. Tehran University of Medical Sciences ranked first in all periods. The highest ratings concerned the impact factor in universities of medical sciences, reflecting the low level of this index in university websites. The lowest ranking was observed in type 1 universities. Conclusion: Despite the criticisms and weaknesses of these webometrics criteria, they are critical to this equation and should be checked for authenticity and suitability of goals. Therefore, localizing these criteria by the advantages model, ranking systems features, continuous development and medical universities evaluation based on these indicators provide new opportunities for the development of the country, especially through online media. PMID:28894711

  4. Adaptive low-rank subspace learning with online optimization for robust visual tracking.

    PubMed

    Liu, Risheng; Wang, Di; Han, Yuzhuo; Fan, Xin; Luo, Zhongxuan

    2017-04-01

    In recent years, sparse and low-rank models have been widely used to formulate the appearance subspace for visual tracking. However, most existing methods only consider the sparsity or low-rankness of the coefficients, which is not sufficient for appearance subspace learning on complex video sequences. Moreover, as both the low-rank and the column sparse measures are tightly related to all the samples in the sequences, it is challenging to incrementally solve optimization problems with both nuclear norm and column sparse norm on sequentially obtained video data. To address the above limitations, this paper develops a novel low-rank subspace learning with adaptive penalization (LSAP) framework for subspace-based robust visual tracking. Different from previous work, which often simply decomposes observations as low-rank features and sparse errors, LSAP simultaneously learns the subspace basis, low-rank coefficients and column sparse errors to formulate the appearance subspace. Within the LSAP framework, we introduce a Hadamard product-based regularization to incorporate rich generative/discriminative structure constraints to adaptively penalize the coefficients for subspace learning. It is shown that such adaptive penalization can significantly improve the robustness of LSAP on severely corrupted datasets. To utilize LSAP for online visual tracking, we also develop an efficient incremental optimization scheme for nuclear norm and column sparse norm minimizations. Experiments on 50 challenging video sequences demonstrate that our tracker outperforms other state-of-the-art methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
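
    A building block shared by low-rank subspace methods of this kind is singular value thresholding, the proximal operator of the nuclear norm; the sketch below shows only that elementary step and is not the paper's LSAP update or its incremental solver.

    ```python
    import numpy as np

    def singular_value_threshold(M, tau):
        """Proximal operator of tau * nuclear norm: soft-threshold the singular
        values of M, the elementary step inside many low-rank solvers."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    # Toy usage: a noisy rank-1 matrix collapses back to (numerical) rank 1.
    rng = np.random.default_rng(0)
    M = np.outer(np.arange(1.0, 6.0), np.ones(4)) + 0.01 * rng.standard_normal((5, 4))
    print(np.linalg.matrix_rank(singular_value_threshold(M, tau=0.5)))
    ```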

  5. Econophysics of a ranked demand and supply resource allocation problem

    NASA Astrophysics Data System (ADS)

    Priel, Avner; Tamir, Boaz

    2018-01-01

    We present a two-sided resource allocation problem, between demands and supplies, where both parties are ranked. For example, in Big Data problems a set of different computational tasks is divided between a set of computers, each with its own resources; or consider employees and employers, where both parties are ranked, the employees by their fitness and the employers by their package benefits. The allocation process can be viewed as a repeated game where in each iteration the strategy is decided by a meta-rule, based on the ranks of both parties and the results of the previous games. We show the existence of a phase transition between an absorbing state, where all demands are satisfied, and an active one where part of the demands are always left unsatisfied. The phase transition is governed by the ratio between supplies and demand. In a job allocation problem we find a positive correlation between the rank of the workers and the rank of the factories; higher-ranked workers are usually allocated to higher-ranked factories. These all suggest global emergent properties stemming from local variables. To demonstrate the global versus local relations, we introduce a local inertial force that increases the rank of employees in proportion to their persistence time in the same factory. We show that such a local force induces nontrivial global effects, mostly to the benefit of the lower-ranked employees.

  6. Validation of SmartRank: A likelihood ratio software for searching national DNA databases with complex DNA profiles.

    PubMed

    Benschop, Corina C G; van de Merwe, Linda; de Jong, Jeroen; Vanvooren, Vanessa; Kempenaers, Morgane; Kees van der Beek, C P; Barni, Filippo; Reyes, Eusebio López; Moulin, Léa; Pene, Laurent; Haned, Hinda; Sijen, Titia

    2017-07-01

    Searching a national DNA database with complex and incomplete profiles usually yields very large numbers of possible matches that can present many candidate suspects to be further investigated by the forensic scientist and/or police. Current practice in most forensic laboratories consists of ordering these 'hits' based on the number of matching alleles with the searched profile. Thus, candidate profiles that share the same number of matching alleles are not differentiated, and due to the lack of other ranking criteria for the candidate list, it may be difficult to discern a true match from the false positives or notice that all candidates are in fact false positives. SmartRank was developed to put forward only relevant candidates and rank them accordingly. The SmartRank software computes a likelihood ratio (LR) for the searched profile and each profile in the DNA database and ranks database entries above a defined LR threshold according to the calculated LR. In this study, we examined for mixed DNA profiles of variable complexity whether the true donors are retrieved, what the number of false positives above an LR threshold is and the ranking position of the true donors. Using 343 mixed DNA profiles, over 750 SmartRank searches were performed. In addition, the performance of SmartRank and CODIS was compared for DNA database searches, and SmartRank was found to be complementary to CODIS. We also describe the applicable domain of SmartRank and provide guidelines. The SmartRank software is open-source and freely available. Using the best practice guidelines, SmartRank enables obtaining investigative leads in criminal cases lacking a suspect. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. RELATIVE POTENCY RANKING FOR CHLOROPHENOLS

    EPA Science Inventory

    Recently the National Center for Environmental Assessment-Cincinnati completed a feasibility study for developing a toxicity related relative potency ranking scheme for chlorophenols. In this study it was concluded that a large data base exists pertaining to the relative toxicity...

  8. Quantile rank maps: a new tool for understanding individual brain development.

    PubMed

    Chen, Huaihou; Kelly, Clare; Castellanos, F Xavier; He, Ye; Zuo, Xi-Nian; Reiss, Philip T

    2015-05-01

    We propose a novel method for neurodevelopmental brain mapping that displays how an individual's values for a quantity of interest compare with age-specific norms. By estimating smoothly age-varying distributions at a set of brain regions of interest, we derive age-dependent region-wise quantile ranks for a given individual, which can be presented in the form of a brain map. Such quantile rank maps could potentially be used for clinical screening. Bootstrap-based confidence intervals are proposed for the quantile rank estimates. We also propose a recalibrated Kolmogorov-Smirnov test for detecting group differences in the age-varying distribution. This test is shown to be more robust to model misspecification than a linear regression-based test. The proposed methods are applied to brain imaging data from the Nathan Kline Institute Rockland Sample and from the Autism Brain Imaging Data Exchange (ABIDE) sample. Copyright © 2015 Elsevier Inc. All rights reserved.
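
    The quantity being mapped, an individual's quantile rank relative to age-matched norms, can be computed empirically as below; the paper estimates smoothly age-varying distributions instead, so this crude stand-in with simulated norms is for intuition only.

    ```python
    import numpy as np

    def quantile_rank(value, norm_values):
        """Empirical quantile rank of one measurement within an age-matched
        normative sample (0 = lowest, 1 = highest)."""
        return float((np.asarray(norm_values) < value).mean())

    # Toy usage: where does a value of 2.61 sit among simulated age-matched norms?
    norms = np.random.default_rng(1).normal(loc=2.5, scale=0.15, size=500)
    print(round(quantile_rank(2.61, norms), 2))   # a rank in the upper quartile of these norms
    ```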

  9. Target Fishing for Chemical Compounds using Target-Ligand Activity data and Ranking based Methods

    PubMed Central

    Wale, Nikil; Karypis, George

    2009-01-01

    In recent years, the development of computational techniques that identify all the likely targets for a given chemical compound, also termed the problem of Target Fishing, has been an active area of research. Identification of the likely targets of a chemical compound helps to understand problems such as toxicity, lack of efficacy in humans, and poor physical properties associated with that compound in the early stages of drug discovery. In this paper we present a set of techniques whose goal is to rank or prioritize targets in the context of a given chemical compound such that most targets that this compound may show activity against appear higher in the ranked list. These methods are based on our extensions to the SVM and Ranking Perceptron algorithms for this problem. Our extensive experimental study shows that the methods developed in this work outperform previous approaches by 2% to 60% under different evaluation criteria. PMID:19764745
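
    In its simplest textbook form, the Ranking Perceptron mentioned above learns a weight vector from pairwise preferences, as sketched below; this is not the authors' extended variant, and the toy features and preference pairs are assumptions.

    ```python
    import numpy as np

    def ranking_perceptron(pairs, X, epochs=20, lr=0.1):
        """Pairwise ranking perceptron: learn w so that score(X[pos]) > score(X[neg])
        for every (pos, neg) preference pair."""
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for pos, neg in pairs:
                if w @ X[pos] <= w @ X[neg]:       # preference violated
                    w += lr * (X[pos] - X[neg])    # perceptron-style correction
        return w

    # Toy usage: three candidate targets described by two features each;
    # targets 0 and 2 should outrank target 1 for some query compound.
    X = np.array([[1.0, 0.2], [0.3, 0.9], [0.8, 0.1]])
    w = ranking_perceptron(pairs=[(0, 1), (2, 1)], X=X)
    print(np.argsort(-(X @ w)))    # targets ordered by predicted relevance
    ```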

  10. Chimpanzee females queue but males compete for social status

    PubMed Central

    Foerster, Steffen; Franz, Mathias; Murray, Carson M.; Gilby, Ian C.; Feldblum, Joseph T.; Walker, Kara K.; Pusey, Anne E.

    2016-01-01

    Dominance hierarchies are widespread in animal social groups and often have measurable effects on individual health and reproductive success. Dominance ranks are not static individual attributes, however, but instead are influenced by two independent processes: 1) changes in hierarchy membership and 2) successful challenges of higher-ranking individuals. Understanding which of these processes dominates the dynamics of rank trajectories can provide insights into fitness benefits of within-sex competition. This question has yet to be examined systematically in a wide range of taxa due to the scarcity of long-term data and a lack of appropriate methodologies for distinguishing between alternative causes of rank changes over time. Here, we expand on recent work and develop a new likelihood-based Elo rating method that facilitates the systematic assessment of rank dynamics in animal social groups, even when interaction data are sparse. We apply this method to characterize long-term rank trajectories in wild eastern chimpanzees (Pan troglodytes schweinfurthii) and find remarkable sex differences in rank dynamics, indicating that females queue for social status while males actively challenge each other to rise in rank. Further, our results suggest that natal females obtain a head start in the rank queue if they avoid dispersal, with potential fitness benefits. PMID:27739527
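
    For intuition about the rating dynamics behind the analysis above, here is the classic Elo update after a single decided interaction; the paper develops a likelihood-based Elo variant, which this sketch does not reproduce.

    ```python
    def elo_update(r_winner, r_loser, k=16.0, scale=400.0):
        """Classic Elo update after one dominance interaction with a clear winner."""
        expected_win = 1.0 / (1.0 + 10.0 ** ((r_loser - r_winner) / scale))
        delta = k * (1.0 - expected_win)      # larger gain for an upset victory
        return r_winner + delta, r_loser - delta

    # Toy usage: a lower-rated male defeats a higher-rated male and gains more points.
    print(elo_update(1000.0, 1100.0))
    ```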

  11. A field study of the confluence between Negro and Solimões Rivers. Part 1: Hydrodynamics and sediment transport

    NASA Astrophysics Data System (ADS)

    Gualtieri, Carlo; Filizola, Naziano; de Oliveira, Marco; Santos, Andrè Martinelli; Ianniruberto, Marco

    2018-01-01

    Confluences are a common feature of riverine systems, where flow streamlines converge and separate flows can potentially mix. The confluence of the Negro and Solimões Rivers ranks among the largest on Earth and its study may provide some general insights into large confluence dynamics and processes. An investigation of that confluence was recently conducted in both low- and high-flow conditions using acoustic Doppler velocity profiling (ADCP), water quality sampling and high-resolution seismic data. First, the study characterized the basic hydrodynamic parameters of the confluence as well as those affecting sediment transport. Second, the analysis of the results showed that common hydrodynamic features noted in previous confluence studies were herein observed. Finally, some differences between low-flow and relatively high-flow conditions in the transfer of momentum from the Solimões to the Negro side of the Amazon Channel were identified.

  12. Social ranking effects on tooth-brushing behaviour.

    PubMed

    Maltby, John; Paterson, Kevin; Day, Liz; Jones, Ceri; Kinnear, Hayley; Buchanan, Heather

    2016-05-01

    A tooth-brushing social rank hypothesis is tested, suggesting that tooth-brushing duration is influenced when individuals rank their behaviour by comparing it with that of other individuals. Study 1 used a correlation design, Study 2 used a semi-experimental design, and Study 3 used a randomized intervention design to examine the tooth-brushing social rank hypothesis in terms of self-reported attitudes, cognitions, and behaviour towards tooth-brushing duration. Study 1 surveyed participants to examine whether the perceived health benefits of tooth-brushing duration could be predicted from the ranking of each person's tooth-brushing duration. Study 2 tested whether manipulating the rank position of the tooth-brushing duration influenced participant-perceived health benefits of tooth-brushing duration. Study 3 used a longitudinal intervention method to examine whether messages relating to the rank positions of tooth-brushing durations causally influenced self-reported tooth-brushing duration. Study 1 demonstrates that perceptions of the health benefits from tooth-brushing duration are predicted by the perceptions of how that behaviour ranks in comparison to other people's behaviour. Study 2 demonstrates that the perceptions of the health benefits of tooth-brushing duration can be manipulated experimentally by changing the ranked position of a person's tooth-brushing duration. Study 3 experimentally demonstrates the possibility of increasing the length of time for which individuals clean their teeth by focusing on how they rank among their peers in terms of tooth-brushing duration. The effectiveness of interventions using social-ranking methods relative to those that emphasize comparisons made against group averages or normative guidelines is discussed. What is already known on this subject? Individuals make judgements based on social rank information. Social rank information has been shown to influence positive health behaviours such as exercise. What does this study add? The health benefits of tooth-brushing are predicted by how tooth-brushing duration ranks within a distribution. Focussing on how teeth-cleaning duration ranks among others produces longer teeth-cleaning durations. © 2015 The British Psychological Society.

  13. Effects of Abandoned Coal-Mine Drainage on Streamflow and Water Quality in the Shamokin Creek Basin, Northumberland and Columbia Counties, Pennsylvania, 1999-2001

    USGS Publications Warehouse

    Cravotta,, Charles A.; Kirby, Carl S.

    2003-01-01

    This report assesses the contaminant loading, effects to receiving streams, and possible remedial alternatives for abandoned mine drainage (AMD) within the upper Shamokin Creek Basin in east-central Pennsylvania. The upper Shamokin Creek Basin encompasses an area of 54 square miles (140 square kilometers) within the Western Middle Anthracite Field, including and upstream of the city of Shamokin. Elevated concentrations of acidity, metals, and sulfate in the AMD from flooded underground anthracite coal mines and (or) unreclaimed culm (waste rock) piles degrade the aquatic ecosystem and water quality of Shamokin Creek to its mouth and along many of its tributaries within the upper basin. Despite dilution by unpolluted streams that more than doubles the streamflow of Shamokin Creek in the lower basin, AMD contamination and ecological impairment persist to its mouth on the Susquehanna River at Sunbury, 20 miles (32 kilometers) downstream from the mined area. Aquatic ecological surveys were conducted by the U.S. Geological Survey (USGS) in cooperation with Bucknell University (BU) and the Northumberland County Conservation District (NCCD) at six stream sites in October 1999 and repeated in 2000 and 2001 on Shamokin Creek below Shamokin and at Sunbury. In 1999, fish were absent from Quaker Run and Shamokin Creek upstream of its confluence with Carbon Run; however, creek chub (Semotilus atromaculatus) were present within three sampled reaches of Carbon Run. During 1999, 2000, and 2001, six or more species of fish were identified in Shamokin Creek below Shamokin and at Sunbury despite elevated concentrations of dissolved iron and iron-encrusted streambeds at these sites. Data on the flow rate and chemistry for 46 AMD sources and 22 stream sites throughout the upper basin plus 1 stream site at Sunbury were collected by the USGS with assistance from BU and the Shamokin Creek Restoration Alliance (SCRA) during low base-flow conditions in August 1999 and high base-flow conditions in March 2000. The water-quality data were used to determine priority ranks of the AMD sources on the basis of loadings of iron, manganese, and aluminum and to identify possible remedial alternatives, including passive-treatment options, for consideration by water-resource managers. The ranking sequence for the top AMD sources based on the high base-flow data generally matched that based on the low base-flow data. The contaminant loadings generally increased with flow, and 10 previously identified intermittent AMD sources were not discharging during the low base-flow sampling period. The top 3 AMD sources (SR19, SR12, and SR49) on the basis of dissolved metals loading in March 2000 accounted for more than 50 percent of the metals loading to Shamokin Creek, whereas the top 15 AMD sources accounted for more than 98 percent of the metals loading. When sampled in March 2000, these AMD sources had flow rates ranging from 0.7 to 19 cubic feet per second (1,138 to 32,285 liters per minute) and pH from 3.5 to 6.1 standard units. Only 1 of the top 15 AMD sources (SR21) was net alkaline (alkalinity > acidity); the others were net acidic and will require additional alkalinity to facilitate metals removal and maintain near-neutral pH. For the top 15 AMD sources, dissolved iron was the principal source of acidity and metals loading; concentrations of iron ranged from 10 to 57 milligrams per liter. Dissolved manganese ranged from 1.9 to 7.4 milligrams per liter. Dissolved aluminum exceeded 3.9 milligrams per liter at seven of the sites but was less than 0.2 milligram per liter at seven others. Alkalinity can be acquired by the dissolution of limestone and (or) bacterial sulfate reduction within various passive-treatment systems including anoxic or oxic limestone drains, limestone-lined channels, or compost wetlands. Subsequently, the gradual oxidation and consequent precipitation of iron and manganese can be accommodated within settling ponds or aerobic wetlands.

  14. Speaker-sensitive emotion recognition via ranking: Studies on acted and spontaneous speech

    PubMed Central

    Cao, Houwei; Verma, Ragini; Nenkova, Ani

    2014-01-01

    We introduce a ranking approach for emotion recognition which naturally incorporates information about the general expressivity of speakers. We demonstrate that our approach leads to substantial gains in accuracy compared to conventional approaches. We train ranking SVMs for individual emotions, treating the data from each speaker as a separate query, and combine the predictions from all rankers to perform multi-class prediction. The ranking method provides two natural benefits. It captures speaker specific information even in speaker-independent training/testing conditions. It also incorporates the intuition that each utterance can express a mix of possible emotion and that considering the degree to which each emotion is expressed can be productively exploited to identify the dominant emotion. We compare the performance of the rankers and their combination to standard SVM classification approaches on two publicly available datasets of acted emotional speech, Berlin and LDC, as well as on spontaneous emotional data from the FAU Aibo dataset. On acted data, ranking approaches exhibit significantly better performance compared to SVM classification both in distinguishing a specific emotion from all others and in multi-class prediction. On the spontaneous data, which contains mostly neutral utterances with a relatively small portion of less intense emotional utterances, ranking-based classifiers again achieve much higher precision in identifying emotional utterances than conventional SVM classifiers. In addition, we discuss the complementarity of conventional SVM and ranking-based classifiers. On all three datasets we find dramatically higher accuracy for the test items on whose prediction the two methods agree compared to the accuracy of individual methods. Furthermore on the spontaneous data the ranking and standard classification are complementary and we obtain marked improvement when we combine the two classifiers by late-stage fusion. PMID:25422534

  15. Comparison and ranking of superelasticity of different austenite active nickel-titanium orthodontic archwires using mechanical tensile testing and correlating with its electrical resistivity

    PubMed Central

    Nagarajan, D.; Baskaranarayanan, Balashanmugam; Usha, K.; Jayanthi, M. S.; Vijjaykanth, M.

    2016-01-01

    Introduction: The application of light and continuous forces for optimum physiological response and the least damage to the tooth supporting structures should be the primary aim of an orthodontist. Nickel-titanium (NiTi) alloys with their desirable properties are one of the natural choices of the clinicians. Aim: This study aimed to compare and rank these archwires based on their tensile strength and electrical resistivity. Materials and Methods: The sample consisted of eight groups of 0.017 inch × 0.025 inch rectangular archwires from eight different manufacturers; five samples from each group were used for tensile testing and nine samples for electrical resistivity tests. Data for stress at 10% strain and the initial slope were statistically analyzed with an analysis of variance and Scheffe tests with P < 0.05. The stress/strain plots of each product were ranked for superelastic behavior. The rankings of the wires tested were based primarily on the unloading curve's slope, which is indicative of the magnitude of the deactivation force, and secondarily on the length of the horizontal segment, which is indicative of continuous forces during deactivation. For the electrical resistivity ranking, the change in resistance after inducing strain in the wires was used to calculate the degree of martensite transformation. Results: In tensile testing, Ortho Organizers wires ranked first and GAC Lowland NiTi wires ranked last. In the resistivity tests, Ormco A wires were found superior and Morelli wires remained last. Conclusion: These rankings should be correlated clinically and require further study. PMID:27829751

  16. Speaker-sensitive emotion recognition via ranking: Studies on acted and spontaneous speech

    PubMed

    Cao, Houwei; Verma, Ragini; Nenkova, Ani

    2015-01-01

    We introduce a ranking approach for emotion recognition which naturally incorporates information about the general expressivity of speakers. We demonstrate that our approach leads to substantial gains in accuracy compared to conventional approaches. We train ranking SVMs for individual emotions, treating the data from each speaker as a separate query, and combine the predictions from all rankers to perform multi-class prediction. The ranking method provides two natural benefits. It captures speaker specific information even in speaker-independent training/testing conditions. It also incorporates the intuition that each utterance can express a mix of possible emotion and that considering the degree to which each emotion is expressed can be productively exploited to identify the dominant emotion. We compare the performance of the rankers and their combination to standard SVM classification approaches on two publicly available datasets of acted emotional speech, Berlin and LDC, as well as on spontaneous emotional data from the FAU Aibo dataset. On acted data, ranking approaches exhibit significantly better performance compared to SVM classification both in distinguishing a specific emotion from all others and in multi-class prediction. On the spontaneous data, which contains mostly neutral utterances with a relatively small portion of less intense emotional utterances, ranking-based classifiers again achieve much higher precision in identifying emotional utterances than conventional SVM classifiers. In addition, we discuss the complementarity of conventional SVM and ranking-based classifiers. On all three datasets we find dramatically higher accuracy for the test items on whose prediction the two methods agree compared to the accuracy of individual methods. Furthermore on the spontaneous data the ranking and standard classification are complementary and we obtain marked improvement when we combine the two classifiers by late-stage fusion.

  17. A Corpus-Based Approach for Automatic Thai Unknown Word Recognition Using Boosting Techniques

    NASA Astrophysics Data System (ADS)

    Techo, Jakkrit; Nattee, Cholwich; Theeramunkong, Thanaruk

    While classification techniques can be applied for automatic unknown word recognition in a language without word boundaries, they face the problem of unbalanced datasets, where the number of positive unknown word candidates is dominantly smaller than that of negative candidates. To solve this problem, this paper presents a corpus-based approach that introduces a so-called group-based ranking evaluation technique into ensemble learning in order to generate a sequence of classification models that later collaborate to select the most probable unknown word from multiple candidates. Given a classification model, the group-based ranking evaluation (GRE) is applied to construct a training dataset for learning the succeeding model, by weighing each of its candidates according to their ranks and correctness when the candidates of an unknown word are considered as one group. A number of experiments have been conducted on a large Thai medical text to evaluate the performance of the proposed group-based ranking evaluation approach, namely V-GRE, compared to the conventional naïve Bayes classifier and our vanilla version without ensemble learning. As a result, the proposed method achieves an accuracy of 90.93±0.50% when the first rank is selected and 97.26±0.26% when the top-ten candidates are considered, an improvement of 8.45% and 6.79%, respectively, over the conventional record-based naïve Bayes classifier and the vanilla version. Another experiment applying only the best features shows 93.93±0.22% and up to 98.85±0.15% accuracy for top-1 and top-10, respectively, improvements of 3.97% and 9.78% over the naïve Bayes classifier and the vanilla version. Finally, an error analysis is given.
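
    A minimal Python sketch of the kind of group-based re-weighting described above is given below. The abstract does not specify the exact weighting rule of V-GRE, so the linear rank-based penalty and the names Candidate and gre_weights are illustrative assumptions only.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Candidate:
            score: float       # score assigned by the current classification model
            is_correct: bool   # whether this candidate is the true unknown word

        def gre_weights(group: List[Candidate]) -> List[float]:
            """Weigh the candidates of one unknown word (one group) for the next model.

            A correct candidate that is ranked low, or an incorrect candidate that is
            ranked high, receives a larger weight so the succeeding model focuses on it.
            """
            order = sorted(range(len(group)), key=lambda i: -group[i].score)
            rank_of = {idx: r for r, idx in enumerate(order)}   # 0 = top rank
            n = len(group)
            weights = []
            for i, cand in enumerate(group):
                if cand.is_correct:
                    weights.append(rank_of[i] / max(n - 1, 1))            # worse rank -> larger weight
                else:
                    weights.append((n - 1 - rank_of[i]) / max(n - 1, 1))  # better rank -> larger weight
            return weights

        group = [Candidate(0.9, False), Candidate(0.4, True), Candidate(0.1, False)]
        print(gre_weights(group))   # [1.0, 0.5, 0.0]: the wrongly top-ranked candidate is weighted most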

  18. 76 FR 22122 - Section 8 Housing Choice Voucher Program-Demonstration Project of Small Area Fair Market Rents in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-20

    ... rent and the core-based statistical area (CBSA) rent as applied to the 40th percentile FMR for that..., calculated on the basis of the core-based statistical area (CBSA) or the metropolitan Statistical Area (MSA... will be ranked according to each of the statistics specified above, and then a weighted average ranking...

  19. Reduced-rank technique for joint channel estimation in TD-SCDMA systems

    NASA Astrophysics Data System (ADS)

    Kamil Marzook, Ali; Ismail, Alyani; Mohd Ali, Borhanuddin; Sali, Adawati; Khatun, Sabira

    2013-02-01

    In time division-synchronous code division multiple access systems, increasing the system capacity by inserting the largest number of users in one time slot (TS) requires adding more estimation processes to estimate the joint channel matrix for the whole system. The increase in the number of channel parameters due to the increase in the number of users in one TS directly affects the precision of the estimator's performance. This article presents a novel channel estimation method with low complexity, which relies on reducing the rank order of the total channel matrix H. The proposed method exploits the rank deficiency of H to reduce the number of parameters that characterise this matrix. The adopted reduced-rank technique is based on the truncated singular value decomposition algorithm. The algorithms for reduced-rank joint channel estimation (JCE) are derived and compared against traditional full-rank JCEs: least squares (LS, or Steiner) and enhanced (LS or MMSE) algorithms. Simulation results for the normalised mean square error showed the superiority of reduced-rank estimators. In addition, the channel impulse responses found by the reduced-rank estimator for all active users offer considerable performance improvement over the conventional estimator along the channel window length.
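
    A minimal sketch of the truncated-SVD idea mentioned above, assuming a full-rank least-squares channel estimate is already available; the matrix sizes, the chosen rank r and all variable names are illustrative, not the paper's implementation.

        import numpy as np

        def reduced_rank_estimate(H_ls: np.ndarray, r: int) -> np.ndarray:
            """Keep only the r dominant singular components of the LS channel estimate."""
            U, s, Vh = np.linalg.svd(H_ls, full_matrices=False)
            return (U[:, :r] * s[:r]) @ Vh[:r, :]

        # Toy example: a rank-deficient 16x8 channel matrix observed in noise.
        rng = np.random.default_rng(0)
        H_true = rng.standard_normal((16, 3)) @ rng.standard_normal((3, 8))
        H_ls = H_true + 0.1 * rng.standard_normal((16, 8))
        H_rr = reduced_rank_estimate(H_ls, r=3)
        # The reduced-rank error is typically smaller than the full-rank LS error.
        print(np.linalg.norm(H_ls - H_true), np.linalg.norm(H_rr - H_true))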

  20. SRS: Site ranking system for hazardous chemical and radioactive waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rechard, R.P.; Chu, M.S.Y.; Brown, S.L.

    1988-05-01

    This report describes the rationale and presents instructions for a site ranking system (SRS). SRS ranks hazardous chemical and radioactive waste sites by scoring important and readily available factors that influence risk to human health. Using SRS, sites can be ranked for purposes of detailed site investigations. SRS evaluates the relative risk as a combination of potentially exposed population, chemical toxicity, and potential exposure of release from a waste site; hence, SRS uses the same concepts found in a detailed assessment of health risk. Basing SRS on the concepts of risk assessment tends to reduce the distortion of results found in other ranking schemes. More importantly, a clear logic helps ensure the successful application of the ranking procedure and increases its versatility when modifications are necessary for unique situations. Although one can rank sites using a detailed risk assessment, it is potentially costly because of data and resources required. SRS is an efficient approach to provide an order-of-magnitude ranking, requiring only readily available data (often only descriptive) and hand calculations. Worksheets are included to make the system easier to understand and use. 88 refs., 19 figs., 58 tabs.

  1. Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.

    PubMed

    Somasundaram, K; Rajendran, P Alli

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. The existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudates detection, respectively. These methods, however, did not incorporate a more detailed feature selection technique into the system for the detection of diabetic retinopathy. To categorize the exudates, the Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practical automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method examines factors such as sensitivity, ranking efficiency, and feature selection time.

  2. Diagnosing and Ranking Retinopathy Disease Level Using Diabetic Fundus Image Recuperation Approach

    PubMed Central

    Somasundaram, K.; Alli Rajendran, P.

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. The existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudates detection, respectively. These methods, however, did not incorporate a more detailed feature selection technique into the system for the detection of diabetic retinopathy. To categorize the exudates, the Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practical automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method examines factors such as sensitivity, ranking efficiency, and feature selection time. PMID:25945362

  3. SpikeTemp: An Enhanced Rank-Order-Based Learning Approach for Spiking Neural Networks With Adaptive Structure.

    PubMed

    Wang, Jinling; Belatreche, Ammar; Maguire, Liam P; McGinnity, Thomas Martin

    2017-01-01

    This paper presents an enhanced rank-order-based learning algorithm, called SpikeTemp, for spiking neural networks (SNNs) with a dynamically adaptive structure. The trained feed-forward SNN consists of two layers of spiking neurons: 1) an encoding layer which temporally encodes real-valued features into spatio-temporal spike patterns and 2) an output layer of dynamically grown neurons which perform spatio-temporal classification. Both Gaussian receptive fields and square cosine population encoding schemes are employed to encode real-valued features into spatio-temporal spike patterns. Unlike the rank-order-based learning approach, SpikeTemp uses the precise times of the incoming spikes for adjusting the synaptic weights such that early spikes result in a large weight change and late spikes lead to a smaller weight change. This removes the need to rank all the incoming spikes and, thus, reduces the computational cost of SpikeTemp. The proposed SpikeTemp algorithm is demonstrated on several benchmark data sets and on an image recognition task. The results show that SpikeTemp can achieve better classification performance and is much faster than the existing rank-order-based learning approach. In addition, the number of output neurons is much smaller when the square cosine encoding scheme is employed. Furthermore, SpikeTemp is benchmarked against a selection of existing machine learning algorithms, and the results demonstrate the ability of SpikeTemp to classify different data sets after just one presentation of the training samples with comparable classification performance.
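
    A rough Python sketch of the precise-spike-time weight update described above, where early spikes produce large weight changes and late spikes small ones. The exponential kernel, the learning rate eta and the time constant tau are assumptions for illustration; SpikeTemp's exact update rule may differ.

        import numpy as np

        def spike_time_weight_update(weights, spike_times, eta=0.05, tau=10.0):
            """weights: current synaptic weights (one per input neuron);
            spike_times: spike arrival time of each input in ms (np.inf if it stayed silent)."""
            times = np.asarray(spike_times, dtype=float)
            delta = eta * np.exp(-times / tau)   # early spike -> large change; exp(-inf) = 0 for silent inputs
            return np.asarray(weights, dtype=float) + delta

        w = np.zeros(4)
        t = [1.0, 5.0, 20.0, np.inf]
        print(spike_time_weight_update(w, t))    # decreasing updates; the silent synapse is unchanged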

  4. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort.

    PubMed

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A; Fells, James I; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

    The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article is focused exclusively on affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Using all the complex structures prepared in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking still remains very challenging; that the knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention are considerably important for ranking. Knowledge of the mode of action and protein flexibility along with visualization tools that depict polar and hydrophobic maps are very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high level blinded predictions.

  5. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort

    NASA Astrophysics Data System (ADS)

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A.; Fells, James I.; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

    The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article is focused exclusively on affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Using all the complex structures prepared in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking still remains very challenging; that the knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention are considerably important for ranking. Knowledge of the mode of action and protein flexibility along with visualization tools that depict polar and hydrophobic maps are very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high level blinded predictions.

  6. Methodological reporting of randomized clinical trials in respiratory research in 2010.

    PubMed

    Lu, Yi; Yao, Qiuju; Gu, Jie; Shen, Ce

    2013-09-01

    Although randomized controlled trials (RCTs) are considered the highest level of evidence, they are also subject to bias, due to a lack of adequately reported randomization, and therefore the reporting should be as explicit as possible for readers to determine the significance of the contents. We evaluated the methodological quality of RCTs in respiratory research in high-ranking clinical journals published in 2010. We assessed the methodological quality, including generation of the allocation sequence, allocation concealment, double-blinding, sample-size calculation, intention-to-treat analysis, flow diagrams, number of medical centers involved, diseases, funding sources, types of interventions, trial registration, number of times the papers have been cited, journal impact factor, journal type, and journal endorsement of the CONSORT (Consolidated Standards of Reporting Trials) rules, in RCTs published in 12 top-ranking clinical respiratory journals and 5 top-ranking general medical journals. We included 176 trials, of which 93 (53%) reported adequate generation of the allocation sequence, 66 (38%) reported adequate allocation concealment, 79 (45%) were double-blind, 123 (70%) reported adequate sample-size calculation, 88 (50%) reported intention-to-treat analysis, and 122 (69%) included a flow diagram. Multivariate logistic regression analysis revealed that journal impact factor ≥ 5 was the only variable that significantly influenced adequate allocation sequence generation. Trial registration and journal impact factor ≥ 5 significantly influenced adequate allocation concealment. Medical interventions, trial registration, and journal endorsement of the CONSORT statement influenced adequate double-blinding. Publication in one of the general medical journals influenced adequate sample-size calculation. The methodological quality of RCTs in respiratory research needs improvement. Stricter enforcement of the CONSORT statement should enhance the quality of RCTs.

  7. Academic Media Ranking and the Configurations of Values in Higher Education: A Sociotechnical History of a Co-Production in France between the Media, State and Higher Education (1976-1989)

    ERIC Educational Resources Information Center

    Bouchard, Julie

    2017-01-01

    Before the 2000s and the buzz surrounding global rankings, many countries witnessed the emergence and development, starting in the 1970s, of academic media rankings produced primarily by press organisations. This domestic, media-based production, despite the relative lack of attention paid by the social sciences, has been progressively integrated…

  8. Evaluating nodes importance in complex network based on PageRank algorithm

    NASA Astrophysics Data System (ADS)

    Li, Kai; He, Yongfeng

    2018-04-01

    To evaluate the important nodes in a complex network, and to address the problems of the traditional PageRank algorithm, we propose a modified PageRank algorithm. The modified algorithm converges when evaluating the importance of dangling (suspended) nodes and of nodes in networks containing directed loops. A simulation example shows that the modified algorithm is effective for evaluating the importance of complex network nodes.
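
    For reference, a baseline Python sketch of PageRank by power iteration with explicit handling of dangling (out-degree zero) nodes, the situation the abstract's modification targets. This is the standard formulation, not the authors' modified algorithm, and the toy adjacency matrix is invented.

        import numpy as np

        def pagerank(adj: np.ndarray, d: float = 0.85, tol: float = 1e-10) -> np.ndarray:
            """adj[i, j] = 1 if there is a directed edge from node i to node j."""
            n = adj.shape[0]
            out_deg = adj.sum(axis=1)
            r = np.full(n, 1.0 / n)
            while True:
                dangling = r[out_deg == 0].sum() / n        # rank of dangling nodes, spread uniformly
                senders = out_deg > 0
                contrib = (r[senders, None] * adj[senders] / out_deg[senders, None]).sum(axis=0)
                r_new = (1 - d) / n + d * (contrib + dangling)
                if np.abs(r_new - r).sum() < tol:
                    return r_new
                r = r_new

        A = np.array([[0, 1, 1],
                      [0, 0, 1],
                      [0, 0, 0]])                           # node 2 has no out-links (dangling)
        print(pagerank(A))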

  9. Predicting intensity ranks of peptide fragment ions.

    PubMed

    Frank, Ari M

    2009-05-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal multiple reaction monitoring (MRM) transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html.

  10. Predicting Intensity Ranks of Peptide Fragment Ions

    PubMed Central

    Frank, Ari M.

    2009-01-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal MRM transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html. PMID:19256476

  11. Active subspace: toward scalable low-rank learning.

    PubMed

    Liu, Guangcan; Yan, Shuicheng

    2012-12-01

    We address the scalability issues in low-rank matrix learning problems. Usually these problems resort to solving nuclear norm regularized optimization problems (NNROPs), which often suffer from high computational complexities if based on existing solvers, especially in large-scale settings. Based on the fact that the optimal solution matrix to an NNROP is often low rank, we revisit the classic mechanism of low-rank matrix factorization, based on which we present an active subspace algorithm for efficiently solving NNROPs by transforming large-scale NNROPs into small-scale problems. The transformation is achieved by factorizing the large solution matrix into the product of a small orthonormal matrix (active subspace) and another small matrix. Although such a transformation generally leads to nonconvex problems, we show that a suboptimal solution can be found by the augmented Lagrange alternating direction method. For the robust PCA (RPCA) (Candès, Li, Ma, & Wright, 2009 ) problem, a typical example of NNROPs, theoretical results verify the suboptimality of the solution produced by our algorithm. For the general NNROPs, we empirically show that our algorithm significantly reduces the computational complexity without loss of optimality.

  12. Tag-Based Social Image Search: Toward Relevant and Diverse Results

    NASA Astrophysics Data System (ADS)

    Yang, Kuiyuan; Wang, Meng; Hua, Xian-Sheng; Zhang, Hong-Jiang

    Recent years have witnessed the great success of social media websites. Tag-based image search is an important approach to access the image content of interest on these websites. However, the existing ranking methods for tag-based image search frequently return results that are irrelevant or lack diversity. This chapter presents a diverse relevance ranking scheme which simultaneously takes relevance and diversity into account by exploring the content of images and their associated tags. First, it estimates the relevance scores of images with respect to the query term based on both visual information of images and semantic information of associated tags. Then semantic similarities of social images are estimated based on their tags. Based on the relevance scores and the similarities, the ranking list is generated by a greedy ordering algorithm which optimizes Average Diverse Precision (ADP), a novel measure that is extended from the conventional Average Precision (AP). Comprehensive experiments and user studies demonstrate the effectiveness of the approach.
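
    A greedy diversified re-ranking sketch in Python in the spirit of the scheme above: at each step the image whose relevance, discounted by its maximum tag similarity to the already selected images, is largest gets appended. The MMR-style trade-off parameter lam and the toy scores are assumptions; the chapter's algorithm optimizes Average Diverse Precision directly.

        def diverse_rank(relevance, similarity, lam=0.5):
            """relevance: dict image_id -> relevance score w.r.t. the query;
            similarity: dict (id_a, id_b) -> tag-based semantic similarity in [0, 1]."""
            remaining, ranked = set(relevance), []
            while remaining:
                def gain(i):
                    penalty = max((similarity.get((i, j), similarity.get((j, i), 0.0))
                                   for j in ranked), default=0.0)
                    return lam * relevance[i] - (1 - lam) * penalty
                best = max(remaining, key=gain)
                ranked.append(best)
                remaining.remove(best)
            return ranked

        rel = {"a": 0.9, "b": 0.85, "c": 0.4}
        sim = {("a", "b"): 0.95, ("a", "c"): 0.1, ("b", "c"): 0.1}
        print(diverse_rank(rel, sim))   # ['a', 'c', 'b']: the near-duplicate "b" is pushed below "c"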

  13. Identifying a set of influential spreaders in complex networks

    NASA Astrophysics Data System (ADS)

    Zhang, Jian-Xiong; Chen, Duan-Bing; Dong, Qiang; Zhao, Zhi-Dan

    2016-06-01

    Identifying a set of influential spreaders in complex networks plays a crucial role in effective information spreading. A simple strategy is to choose the top-r ranked nodes as spreaders according to an influence ranking method such as PageRank, ClusterRank or k-shell decomposition. Besides, some heuristic methods such as hill-climbing, SPIN, degree discount and independent-set-based approaches have also been proposed. However, these approaches suffer either from the possibility that some spreaders are so close together that their spheres of influence overlap, or from being time consuming. In this report, we present a simple yet effective iterative method named VoteRank to identify a set of decentralized spreaders with the best spreading ability. In this approach, all nodes vote for a spreader in each turn, and the voting ability of the neighbors of the elected spreader is decreased in subsequent turns. Experimental results on four real networks show that under the Susceptible-Infected-Recovered (SIR) and Susceptible-Infected (SI) models, VoteRank outperforms the traditional benchmark methods on both spreading rate and final affected scale. What's more, VoteRank has superior computational efficiency.
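
    A short Python sketch of the voting procedure described above, using networkx. The decrement of the neighbors' voting ability by 1/<k> (the inverse average degree) follows the published description; treat details such as tie-breaking and the example graph as illustrative assumptions.

        import networkx as nx

        def vote_rank(G: nx.Graph, r: int):
            k_avg = sum(dict(G.degree()).values()) / G.number_of_nodes()
            ability = {v: 1.0 for v in G}        # voting ability of every node
            spreaders = []
            for _ in range(r):
                votes = {v: sum(ability[u] for u in G.neighbors(v))
                         for v in G if v not in spreaders}
                best = max(votes, key=votes.get)
                spreaders.append(best)
                ability[best] = 0.0              # an elected spreader no longer votes
                for u in G.neighbors(best):      # weaken its neighbours' future votes
                    ability[u] = max(ability[u] - 1.0 / k_avg, 0.0)
            return spreaders

        print(vote_rank(nx.karate_club_graph(), r=3))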

  14. F-theory models on K3 surfaces with various Mordell-Weil ranks — constructions that use quadratic base change of rational elliptic surfaces

    NASA Astrophysics Data System (ADS)

    Kimura, Yusuke

    2018-05-01

    We constructed several families of elliptic K3 surfaces with Mordell-Weil groups of ranks from 1 to 4. We studied F-theory compactifications on these elliptic K3 surfaces times a K3 surface. Gluing pairs of identical rational elliptic surfaces with nonzero Mordell-Weil ranks yields elliptic K3 surfaces, the Mordell-Weil groups of which have nonzero ranks. The sum of the ranks of the singularity type and the Mordell-Weil group of any rational elliptic surface with a global section is 8. By utilizing this property, families of rational elliptic surfaces with various nonzero Mordell-Weil ranks can be obtained by choosing appropriate singularity types. Gluing pairs of these rational elliptic surfaces yields families of elliptic K3 surfaces with various nonzero Mordell-Weil ranks. We also determined the global structures of the gauge groups that arise in F-theory compactifications on the resulting K3 surfaces times a K3 surface. U(1) gauge fields arise in these compactifications.

  15. Identifying the greatest team and captain—A complex network approach to cricket matches

    NASA Astrophysics Data System (ADS)

    Mukherjee, Satyam

    2012-12-01

    We consider all Test matches played between 1877 and 2010 and One Day International (ODI) matches played between 1971 and 2010. We form directed and weighted networks of teams and also of their captains. The success of a team (or captain) is determined by the ‘quality’ of the wins, not simply by the number of wins. We apply the diffusion-based PageRank algorithm to the networks to assess the importance of the wins, and rank the respective teams and captains. Our analysis identifies Australia as the best team in both forms of cricket, Test and ODI. Steve Waugh is identified as the best captain in Test cricket and Ricky Ponting is the best captain in the ODI format. We also compare our ranking scheme with an existing ranking scheme, the Reliance ICC ranking. Our method does not depend on ‘external’ criteria in the ranking of teams (captains). The purpose of this paper is to introduce a revised ranking of cricket teams and to quantify the success of the captains.

  16. RankProd 2.0: a refactored bioconductor package for detecting differentially expressed features in molecular profiling datasets.

    PubMed

    Del Carratore, Francesco; Jankevics, Andris; Eisinga, Rob; Heskes, Tom; Hong, Fangxin; Breitling, Rainer

    2017-09-01

    The Rank Product (RP) is a statistical technique widely used to detect differentially expressed features in molecular profiling experiments such as transcriptomics, metabolomics and proteomics studies. An implementation of the RP and the closely related Rank Sum (RS) statistics has been available in the RankProd Bioconductor package for several years. However, several recent advances in the understanding of the statistical foundations of the method have made a complete refactoring of the existing package desirable. We implemented a completely refactored version of the RankProd package, which provides a more principled implementation of the statistics for unpaired datasets. Moreover, the permutation-based P-value estimation methods have been replaced by exact methods, providing faster and more accurate results. RankProd 2.0 is available at Bioconductor (https://www.bioconductor.org/packages/devel/bioc/html/RankProd.html) and as part of the mzMatch pipeline (http://www.mzmatch.sourceforge.net). rainer.breitling@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
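
    For orientation, a Python sketch of the rank product statistic itself: the geometric mean of a feature's ranks across k replicate comparisons (rank 1 = most strongly up-regulated). It illustrates the statistic, not the refactored package's exact P-value computation, and ties are ignored for brevity.

        import numpy as np

        def rank_product(fold_changes: np.ndarray) -> np.ndarray:
            """fold_changes: features x replicates matrix of (log) fold changes."""
            # Rank within each replicate, descending: the largest fold change gets rank 1.
            ranks = fold_changes.shape[0] - np.argsort(np.argsort(fold_changes, axis=0), axis=0)
            return np.exp(np.log(ranks).mean(axis=1))   # geometric mean of the ranks

        fc = np.array([[2.1, 1.8, 2.4],     # consistently up-regulated -> small rank product
                       [0.2, 0.1, 0.3],
                       [1.0, 0.9, 1.2]])
        print(rank_product(fc))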

  17. Evaluation of Reference Genes for Quantitative Real-Time PCR Analysis of the Gene Expression in Laticifers on the Basis of Latex Flow in Rubber Tree (Hevea brasiliensis Muell. Arg.)

    PubMed Central

    Chao, Jinquan; Yang, Shuguang; Chen, Yueyi; Tian, Wei-Min

    2016-01-01

    Latex flow caused by latex exploitation is effective in enhancing latex regeneration in the laticifer cells of rubber tree. It should therefore be suitable for screening appropriate reference genes for analyzing the expression of latex regeneration-related genes by quantitative real-time PCR (qRT-PCR). In the present study, the expression stability of 23 candidate reference genes was evaluated on the basis of latex flow by using the geNorm and NormFinder algorithms. Ubiquitin-protein ligase 2a (UBC2a) and ubiquitin-protein ligase 2b (UBC2b) were the two most stable genes among the selected candidate references in rubber tree clones with differential duration of latex flow. The two genes were also ranked highly in previous reference gene screenings across different tissues and experimental conditions. By contrast, the transcripts of latex regeneration-related genes fluctuated significantly during latex flow. The results suggest that screening reference genes during latex flow is an efficient and effective approach for selecting reference genes for qRT-PCR. PMID:27524995

  18. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  19. Exploring Several Methods of Groundwater Model Selection

    NASA Astrophysics Data System (ADS)

    Samani, Saeideh; Ye, Ming; Asghari Moghaddam, Asghar

    2017-04-01

    Selecting reliable models for simulating groundwater flow and solute transport is essential to groundwater resources management and protection. This work explores several model selection methods for avoiding over-complex and/or over-parameterized groundwater models. We consider six groundwater flow models with different numbers (6, 10, 10, 13, 13 and 15) of model parameters. These models represent alternative geological interpretations, recharge estimates, and boundary conditions at a study site in Iran. The models were developed with Model Muse, and calibrated against observations of hydraulic head using UCODE. Model selection was conducted by using the following four approaches: (1) Rank the models using their root mean square error (RMSE) obtained after UCODE-based model calibration, (2) Calculate model probability using the GLUE method, (3) Evaluate model probability using model selection criteria (AIC, AICc, BIC, and KIC), and (4) Evaluate model weights using the Fuzzy Multi-Criteria-Decision-Making (MCDM) approach. MCDM is based on the fuzzy analytical hierarchy process (AHP) and the fuzzy technique for order performance, which identifies the ideal solution by a gradual expansion from the local to the global scale of model parameters. The KIC and MCDM methods are superior to other methods, as they consider not only the fit between observed and simulated data and the number of parameters, but also uncertainty in model parameters. Considering these factors can prevent over-complexity and over-parameterization when selecting appropriate groundwater flow models. These methods selected, as the best model, one with average complexity (10 parameters) and the best parameter estimation (model 3).
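
    A minimal Python sketch of ranking calibrated models with information criteria, assuming independent Gaussian head residuals so the maximized log-likelihood can be written in terms of the residual sum of squares. KIC and the fuzzy MCDM weighting used in the study are not reproduced, and the residual values below are hypothetical.

        import numpy as np

        def aic_bic(residuals: np.ndarray, n_params: int):
            n = residuals.size
            rss = np.sum(residuals ** 2)
            # Gaussian maximum log-likelihood with sigma^2 estimated as RSS/n.
            log_lik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1.0)
            aic = 2 * n_params - 2 * log_lik
            bic = n_params * np.log(n) - 2 * log_lik
            return aic, bic

        # Hypothetical head residuals (metres) of two alternative flow models.
        res_simple  = np.array([0.4, -0.3, 0.5, -0.2, 0.3, -0.4])
        res_complex = np.array([0.35, -0.28, 0.45, -0.2, 0.3, -0.38])
        print(aic_bic(res_simple, n_params=6), aic_bic(res_complex, n_params=13))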

  20. Cityscape genetics: structural vs. functional connectivity of an urban lizard population.

    PubMed

    Beninde, Joscha; Feldmeier, Stephan; Werner, Maike; Peroverde, Daniel; Schulte, Ulrich; Hochkirch, Axel; Veith, Michael

    2016-10-01

    Functional connectivity is essential for the long-term persistence of populations. However, many studies assess connectivity with a focus on structural connectivity only. Cityscapes, namely urban landscapes, are particularly dynamic and include numerous potential anthropogenic barriers to animal movements, such as roads, traffic or buildings. To assess and compare structural connectivity of habitats and functional connectivity of gene flow of an urban lizard, we here combined species distribution models (SDMs) with an individual-based landscape genetic optimization procedure. The most important environmental factors of the SDMs are structural diversity and substrate type, with high and medium levels of structural diversity as well as open and rocky/gravel substrates contributing most to structural connectivity. By contrast, water cover was the best model of all environmental factors following landscape genetic optimization. The river is thus a major barrier to gene flow, while of the typical anthropogenic factors only buildings showed an effect. Nonetheless, using SDMs as a basis for landscape genetic optimization provided the highest ranked model for functional connectivity. Optimizing SDMs in this way can provide a sound basis for models of gene flow of the cityscape, and elsewhere, while presence-only and presence-absence modelling approaches showed differences in performance. Additionally, interpretation of results based on SDM factor importance can be misleading, dictating more thorough analyses following optimization of SDMs. Such approaches can be adopted for management strategies, for example aiming to connect native common wall lizard populations or disconnect them from non-native introduced populations, which are currently spreading in many cities in Central Europe. © 2016 John Wiley & Sons Ltd.

  1. Selection for family medicine residency training in Canada: How consistently are the same students ranked by different programs?

    PubMed

    Wycliffe-Jones, Keith; Hecker, Kent G; Schipper, Shirley; Topps, Maureen; Robinson, Jeanine; Abedin, Tasnima

    2018-02-01

    To examine the consistency of the ranking of Canadian and US medical graduates who applied to Canadian family medicine (FM) residency programs between 2007 and 2013. Descriptive cross-sectional study. Family medicine residency programs in Canada. All 17 Canadian medical schools allowed access to their anonymized program rank-order lists of students applying to FM residency programs submitted to the first iteration of the Canadian Resident Matching Service match from 2007 to 2013. The rank position of medical students who applied to more than 1 FM residency program on the rank-order lists submitted by the programs. Anonymized ranking data submitted to the Canadian Resident Matching Service from 2007 to 2013 by all 17 FM residency programs were used. Ranking data of eligible Canadian and US medical graduates were analyzed to assess the within-student and between-student variability in rank score. These covariance parameters were then used to calculate the intraclass correlation coefficient (ICC) for all programs. Program descriptions and selection criteria were also reviewed to identify sites with similar profiles for subset ICC analysis. Between 2007 and 2013, the consistency of ranking by all programs was fair at best (ICC = 0.34 to 0.39). The consistency of ranking by larger urban-based sites was weak to fair (ICC = 0.23 to 0.36), and the consistency of ranking by sites focusing on training for rural practice was weak to moderate (ICC = 0.16 to 0.55). In most cases, there is a low level of consistency of ranking of students applying for FM training in Canada. This raises concerns regarding fairness, particularly in relation to expectations around equity and distributive justice in selection processes. Copyright© the College of Family Physicians of Canada.
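
    A small Python sketch of the one-way random-effects intraclass correlation, ICC(1), computed from between-applicant and within-applicant mean squares of the rank scores. A balanced design (every applicant ranked by the same number of programs) and the toy score matrix are simplifying assumptions, stricter than the study's actual data.

        import numpy as np

        def icc1(scores: np.ndarray) -> float:
            """scores: applicants x programs matrix of (standardized) rank scores."""
            n, k = scores.shape
            grand = scores.mean()
            msb = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)          # between applicants
            msw = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        ranks = np.array([[0.9, 0.8, 0.7],
                          [0.2, 0.4, 0.1],
                          [0.6, 0.5, 0.9]])
        print(round(icc1(ranks), 2))   # about 0.76 for this toy matrix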

  2. Health systems around the world - a comparison of existing health system rankings.

    PubMed

    Schütte, Stefanie; Acevedo, Paula N Marin; Flahault, Antoine

    2018-06-01

    Existing health systems all over the world are different due to the different combinations of components that can be considered for their establishment. The ranking of health systems has been a focal point for many years, especially the issue of performance. In 2000 the World Health Organization (WHO) performed a ranking to compare the performance of the health systems of its member countries. Since then other health system rankings have been performed and the topic became an issue of public discussion. A point of contention regarding these rankings is the methodology employed by each of them, since no gold standard exists. Therefore, this review focuses on evaluating the methodologies of each existing health system performance ranking to assess their reproducibility and transparency. A search was conducted to identify existing health system rankings, and a questionnaire was developed for the comparison of the methodologies based on the following indicators: (1) General information, (2) Statistical methods, (3) Data, (4) Indicators. Overall, nine rankings were identified; six of them focused on the measurement of population health without any financial component and were therefore excluded. Finally, three health system rankings were selected for this review: "Health Systems: Improving Performance" by the WHO, "Mirror, Mirror on the wall: How the Performance of the US Health Care System Compares Internationally" by the Commonwealth Fund and "the Most efficient Health Care" by Bloomberg. After the rankings were compared and scored according to these indicators, the ranking performed by the WHO was considered the most complete with regard to reproducibility and transparency of the methodology. This review and comparison could help in establishing consensus in the field of health system research. It may also help give recommendations for future health rankings and evaluate the current gap in the literature.

  3. Improved prediction of peptide detectability for targeted proteomics using a rank-based algorithm and organism-specific data.

    PubMed

    Qeli, Ermir; Omasits, Ulrich; Goetze, Sandra; Stekhoven, Daniel J; Frey, Juerg E; Basler, Konrad; Wollscheid, Bernd; Brunner, Erich; Ahrens, Christian H

    2014-08-28

    The in silico prediction of the best-observable "proteotypic" peptides in mass spectrometry-based workflows is a challenging problem. Being able to accurately predict such peptides would enable the informed selection of proteotypic peptides for targeted quantification of previously observed and non-observed proteins for any organism, with a significant impact for clinical proteomics and systems biology studies. Current prediction algorithms rely on physicochemical parameters in combination with positive and negative training sets to identify those peptide properties that most profoundly affect their general detectability. Here we present PeptideRank, an approach that uses a learning-to-rank algorithm for peptide detectability prediction from shotgun proteomics data, and that eliminates the need to select a negative dataset for the training step. A large number of different peptide properties are used to train ranking models in order to predict a ranking of the best-observable peptides within a protein. Empirical evaluation with rank accuracy metrics showed that PeptideRank complements existing prediction algorithms. Our results indicate that the best performance is achieved when it is trained on organism-specific shotgun proteomics data, and that PeptideRank is most accurate for short to medium-sized and abundant proteins, without any loss in prediction accuracy for the important class of membrane proteins. Targeted proteomics approaches have been gaining a lot of momentum and hold immense potential for systems biology studies and clinical proteomics. However, since only very few complete proteomes have been reported to date, for a considerable fraction of a proteome there is no experimental proteomics evidence that would guide the selection of the best-suited proteotypic peptides (PTPs), i.e. peptides that are specific to a given proteoform and that are repeatedly observed in a mass spectrometer. We describe a novel, rank-based approach for the prediction of the best-suited PTPs for targeted proteomics applications. By building on methods developed in the field of information retrieval (e.g. web search engines like Google's PageRank), we circumvent the delicate step of selecting positive and negative training sets and at the same time also more closely reflect the experimentalist's need for selecting e.g. the 5 most promising peptides for targeting a protein of interest. This approach makes it possible to predict PTPs for not yet observed proteins or for organisms without prior experimental proteomics data, such as many non-model organisms. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Combining evidence and values in priority setting: testing the balance sheet method in a low-income country.

    PubMed

    Makundi, Emmanuel; Kapiriri, Lydia; Norheim, Ole Frithjof

    2007-09-24

    Procedures for priority setting need to incorporate both scientific evidence and public values. The aim of this study was to test out a model for priority setting which incorporates both scientific evidence and public values, and to explore use of evidence by a selection of stakeholders and to study reasons for the relative ranking of health care interventions in a setting of extreme resource scarcity. Systematic search for and assessment of relevant evidence for priority setting in a low-income country. Development of a balance sheet according to Eddy's explicit method. Eight group interviews (n = 85), using a modified nominal group technique for eliciting individual and group rankings of a given set of health interventions. The study procedure made it possible to compare the groups' ranking before and after all the evidence was provided to participants. A rank deviation is significant if the rank order of the same intervention differed by two or more points on the ordinal scale. A comparison between the initial rank and the final rank (before deliberation) showed a rank deviation of 67%. The difference between the initial rank and the final rank after discussion and voting gave a rank deviation of 78%. Evidence-based and deliberative decision-making does change priorities significantly in an experimental setting. Our use of the balance sheet method was meant as a demonstration project, but could if properly developed be feasible for health planners, experts and health workers, although more work is needed before it can be used for laypersons.

  5. Physiology of Pseudomonas aeruginosa in biofilms as revealed by transcriptome analysis

    PubMed Central

    2010-01-01

    Background Transcriptome analysis was applied to characterize the physiological activities of Pseudomonas aeruginosa grown for three days in drip-flow biofilm reactors. Conventional applications of transcriptional profiling often compare two paired data sets that differ in a single experimentally controlled variable. In contrast this study obtained the transcriptome of a single biofilm state, ranked transcript signals to make the priorities of the population manifest, and compared rankings for a priori identified physiological marker genes between the biofilm and published data sets. Results Biofilms tolerated exposure to antibiotics, harbored steep oxygen concentration gradients, and exhibited stratified and heterogeneous spatial patterns of protein synthetic activity. Transcriptional profiling was performed and the signal intensity of each transcript was ranked to gain insight into the physiological state of the biofilm population. Similar rankings were obtained from data sets published in the GEO database http://www.ncbi.nlm.nih.gov/geo. By comparing the rank of genes selected as markers for particular physiological activities between the biofilm and comparator data sets, it was possible to infer qualitative features of the physiological state of the biofilm bacteria. These biofilms appeared, from their transcriptome, to be glucose nourished, iron replete, oxygen limited, and growing slowly or exhibiting stationary phase character. Genes associated with elaboration of type IV pili were strongly expressed in the biofilm. The biofilm population did not indicate oxidative stress, homoserine lactone mediated quorum sensing, or activation of efflux pumps. Using correlations with transcript ranks, the average specific growth rate of biofilm cells was estimated to be 0.08 h-1. Conclusions Collectively these data underscore the oxygen-limited, slow-growing nature of the biofilm population and are consistent with antimicrobial tolerance due to low metabolic activity. PMID:21083928

  6. MetabolitePredict: A de novo human metabolomics prediction system and its applications in rheumatoid arthritis.

    PubMed

    Wang, QuanQiu; Xu, Rong

    2017-07-01

    Human metabolomics has great potential in disease mechanism understanding, early diagnosis, and therapy. Existing metabolomics studies are often based on profiling patient biofluids and tissue samples and are difficult owing to the challenges of sample collection and data processing. Here, we report an alternative approach: we developed a computation-based prediction system, MetabolitePredict, for disease metabolomics biomarker prediction. We applied MetabolitePredict to identify metabolite biomarkers and metabolite targeting therapies for rheumatoid arthritis (RA), a long-lasting complex disease with multiple genetic and environmental factors involved. MetabolitePredict is a de novo prediction system. It first constructs a disease-specific genetic profile using gene and pathway data associated with an input disease. It then constructs genetic profiles for a total of 259,170 chemicals/metabolites using known chemical genetics and human metabolomic data. MetabolitePredict prioritizes metabolites for a given disease based on the genetic profile similarities between disease and metabolites. We evaluated MetabolitePredict using 63 known RA-associated metabolites. MetabolitePredict found 24 of the 63 metabolites (recall: 0.38) and ranked them highly (mean ranking: top 4.13%, median ranking: top 1.10%, P-value: 5.08E-19). MetabolitePredict performed better than an existing metabolite prediction system, PROFANCY, in predicting RA-associated metabolites (PROFANCY: recall: 0.31, mean ranking: 20.91%, median ranking: 16.47%, P-value: 3.78E-7). Short-chain fatty acids (SCFAs), the abundant metabolites of gut microbiota in the fermentation of fiber, ranked highly (butyrate, 0.03%; acetate, 0.05%; propionate, 0.38%). Finally, we established MetabolitePredict's potential in novel metabolite targeting for disease treatment: MetabolitePredict ranked highly three known metabolite inhibitors for RA treatments (methotrexate: 0.25%; leflunomide: 0.56%; sulfasalazine: 0.92%). MetabolitePredict is a generalizable disease metabolite prediction system. The only required input to the system is a disease name or a set of disease-associated genes. The web-based MetabolitePredict is available at:http://xulab. edu/MetabolitePredict. Copyright © 2017 Elsevier Inc. All rights reserved.
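
    A toy Python sketch of profile-similarity ranking in the spirit of the description above: a disease and each metabolite are represented as vectors over a shared gene vocabulary and metabolites are ranked by cosine similarity to the disease profile. The vocabulary, the profile values and the choice of cosine similarity are invented for illustration; MetabolitePredict's actual profile construction is more involved.

        import numpy as np

        genes = ["TNF", "IL6", "JAK2", "PTPN22", "HDAC1"]      # hypothetical gene vocabulary
        disease = np.array([1.0, 1.0, 0.8, 0.6, 0.2])          # hypothetical disease genetic profile
        metabolites = {
            "butyrate": np.array([0.9, 0.8, 0.1, 0.2, 0.9]),   # hypothetical metabolite profiles
            "acetate":  np.array([0.7, 0.9, 0.3, 0.1, 0.4]),
            "glucose":  np.array([0.1, 0.0, 0.9, 0.0, 0.1]),
        }

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        ranking = sorted(metabolites, key=lambda m: cosine(disease, metabolites[m]), reverse=True)
        print(ranking)   # most disease-like profiles first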

  7. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as the aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method is feasible and valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
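
    A Python sketch of the two quantitative steps named above, using scikit-learn: K-means clustering to derive class boundaries for each indicator, and first-principal-component loadings to derive indicator weights. The synthetic indicator matrix and the final aggregation into a composite risk score are simplified assumptions, not the published scheme.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        X = rng.random((37, 14))                         # 37 landfill sites x 14 normalized indicators

        def kmeans_boundaries(values, k=3):
            """Class boundaries of one indicator: midpoints between sorted K-means centres."""
            centres = np.sort(KMeans(n_clusters=k, n_init=10, random_state=0)
                              .fit(values.reshape(-1, 1)).cluster_centers_.ravel())
            return (centres[:-1] + centres[1:]) / 2

        boundaries = np.array([kmeans_boundaries(X[:, j]) for j in range(X.shape[1])])

        # Indicator weights from the loadings of the first principal component.
        weights = np.abs(PCA(n_components=1).fit(X).components_[0])
        weights /= weights.sum()

        # Simplified composite risk score and ranking of the sites.
        risk = X @ weights
        print(boundaries[0], np.argsort(-risk)[:5])      # boundaries of indicator 0; five highest-risk sites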

  8. Learning to rank diversified results for biomedical information retrieval from multiple features.

    PubMed

    Wu, Jiajin; Huang, Jimmy; Ye, Zheng

    2014-01-01

    Different from traditional information retrieval (IR), promoting diversity in IR takes consideration of relationship between documents in order to promote novelty and reduce redundancy thus to provide diversified results to satisfy various user intents. Diversity IR in biomedical domain is especially important as biologists sometimes want diversified results pertinent to their query. A combined learning-to-rank (LTR) framework is learned through a general ranking model (gLTR) and a diversity-biased model. The former is learned from general ranking features by a conventional learning-to-rank approach; the latter is constructed with diversity-indicating features added, which are extracted based on the retrieved passages' topics detected using Wikipedia and ranking order produced by the general learning-to-rank model; final ranking results are given by combination of both models. Compared with baselines BM25 and DirKL on 2006 and 2007 collections, the gLTR has 0.2292 (+16.23% and +44.1% improvement over BM25 and DirKL respectively) and 0.1873 (+15.78% and +39.0% improvement over BM25 and DirKL respectively) in terms of aspect level of mean average precision (Aspect MAP). The LTR method outperforms gLTR on 2006 and 2007 collections with 4.7% and 2.4% improvement in terms of Aspect MAP. The learning-to-rank method is an efficient way for biomedical information retrieval and the diversity-biased features are beneficial for promoting diversity in ranking results.

  9. Effective structural descriptors for natural and engineered radioactive waste confinement barriers

    NASA Astrophysics Data System (ADS)

    Lemmens, Laurent; Rogiers, Bart; De Craen, Mieke; Laloy, Eric; Jacques, Diederik; Huysmans, Marijke; Swennen, Rudy; Urai, Janos L.; Desbois, Guillaume

    2017-04-01

    The microstructure of a radioactive waste confinement barrier strongly influences its flow and transport properties. Numerical flow and transport simulations for these porous media at the pore scale therefore require input data that describe the microstructure as accurately as possible. To date, no imaging method can resolve all heterogeneities within important radioactive waste confinement barrier materials such as hardened cement paste and natural clays at the micro scale (nm-cm). Therefore, it is necessary to merge information from different 2D and 3D imaging methods using porous media reconstruction techniques. To qualitatively compare the results of different reconstruction techniques, visual inspection might suffice. To quantitatively compare training-image based algorithms, Tan et al. (2014) proposed an algorithm using an analysis of distance. However, the resulting ranking depends on the choice of the structural descriptor, in their case multiple-point or cluster-based histograms. We present here preliminary work in which we review different structural descriptors and test their effectiveness in capturing the main structural characteristics of radioactive waste confinement barrier materials, in order to determine which descriptors to use in the analysis of distance. The investigated descriptors are particle size distributions, surface area distributions, two-point probability functions, multiple-point histograms, linear functions and two-point cluster functions. The descriptor testing consists of stochastically generating realizations from a reference image using the simulated annealing optimization procedure introduced by Karsanina et al. (2015). This procedure basically minimizes the differences between pre-specified descriptor values associated with the training image and the image being produced. The most efficient descriptor set can therefore be identified by comparing the image generation quality among the tested descriptor combinations. The assessment of the quality of the simulations will be made by combining all considered descriptors. Once the set of the most efficient descriptors is determined, they can be used in the analysis of distance to rank different reconstruction algorithms in a more objective way in future work. Karsanina MV, Gerke KM, Skvortsova EB, Mallants D (2015) Universal Spatial Correlation Functions for Describing and Reconstructing Soil Microstructure. PLoS ONE 10(5): e0126515. doi:10.1371/journal.pone.0126515 Tan, Xiaojin, Pejman Tahmasebi, and Jef Caers. "Comparing training-image based algorithms using an analysis of distance." Mathematical Geosciences 46.2 (2014): 149-169.
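
    Of the descriptors listed above, the two-point probability function is one of the simplest to compute. The sketch below evaluates it for a toy binary microstructure image via FFT autocorrelation under an assumed periodic boundary; the image and parameters are invented and the code is not taken from the cited work.

```python
# Sketch: two-point probability function S2 of a binary 2D microstructure,
# computed with FFT autocorrelation under periodic boundary assumptions.
import numpy as np

rng = np.random.default_rng(1)
img = (rng.random((128, 128)) < 0.3).astype(float)   # toy pore/solid image

F = np.fft.fftn(img)
s2_map = np.fft.ifftn(F * np.conj(F)).real / img.size  # P(both ends in phase 1)

# Radially average to get S2(r); r = 0 recovers the phase-1 volume fraction.
yy, xx = np.indices(img.shape)
r = np.hypot(np.minimum(yy, img.shape[0] - yy),
             np.minimum(xx, img.shape[1] - xx)).astype(int)
s2 = np.bincount(r.ravel(), weights=s2_map.ravel()) / np.bincount(r.ravel())
print("volume fraction:", s2[0], "  S2 at r=1..5:", s2[1:6])
```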

  10. Reconstruction of dynamic image series from undersampled MRI data using data-driven model consistency condition (MOCCO).

    PubMed

    Velikina, Julia V; Samsonov, Alexey A

    2015-11-01

    To accelerate dynamic MR imaging through development of a novel image reconstruction technique using low-rank temporal signal models preestimated from training data. We introduce the model consistency condition (MOCCO) technique, which utilizes temporal models to regularize reconstruction without constraining the solution to be low-rank, as is performed in related techniques. This is achieved by using a data-driven model to design a transform for compressed sensing-type regularization. The enforcement of general compliance with the model without excessively penalizing deviating signal allows recovery of a full-rank solution. Our method was compared with a standard low-rank approach utilizing model-based dimensionality reduction in phantoms and patient examinations for time-resolved contrast-enhanced angiography (CE-MRA) and cardiac CINE imaging. We studied the sensitivity of all methods to rank reduction and temporal subspace modeling errors. MOCCO demonstrated reduced sensitivity to modeling errors compared with the standard approach. Full-rank MOCCO solutions showed significantly improved preservation of temporal fidelity and aliasing/noise suppression in highly accelerated CE-MRA (acceleration up to 27) and cardiac CINE (acceleration up to 15) data. MOCCO overcomes several important deficiencies of previously proposed methods based on pre-estimated temporal models and allows high quality image restoration from highly undersampled CE-MRA and cardiac CINE data. © 2014 Wiley Periodicals, Inc.
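
    As a hedged sketch of the type of objective described (the symbols and the quadratic penalty are assumptions, not the paper's exact formulation), the reconstruction can be posed as

```latex
\hat{\mathbf{x}} \;=\; \arg\min_{\mathbf{x}}\;
\underbrace{\lVert E\,\mathbf{x} - \mathbf{d} \rVert_2^2}_{\text{data consistency}}
\;+\;
\lambda\,\underbrace{\lVert (I - \Phi_r \Phi_r^{H})\,\mathbf{x} \rVert_2^2}_{\text{model consistency}}
```

    where E is the undersampled encoding operator, d the acquired k-space data, and Φ_r the rank-r temporal basis pre-estimated from training data; because the model term is a soft penalty rather than a hard subspace constraint, the minimizer is not forced to be low-rank.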

  11. RECONSTRUCTION OF DYNAMIC IMAGE SERIES FROM UNDERSAMPLED MRI DATA USING DATA-DRIVEN MODEL CONSISTENCY CONDITION (MOCCO)

    PubMed Central

    Velikina, Julia V.; Samsonov, Alexey A.

    2014-01-01

    Purpose To accelerate dynamic MR imaging through development of a novel image reconstruction technique using low-rank temporal signal models pre-estimated from training data. Theory We introduce the MOdel Consistency COndition (MOCCO) technique that utilizes temporal models to regularize the reconstruction without constraining the solution to be low-rank as performed in related techniques. This is achieved by using a data-driven model to design a transform for compressed sensing-type regularization. The enforcement of general compliance with the model without excessively penalizing deviating signal allows recovery of a full-rank solution. Methods Our method was compared to standard low-rank approach utilizing model-based dimensionality reduction in phantoms and patient examinations for time-resolved contrast-enhanced angiography (CE MRA) and cardiac CINE imaging. We studied sensitivity of all methods to rank-reduction and temporal subspace modeling errors. Results MOCCO demonstrated reduced sensitivity to modeling errors compared to the standard approach. Full-rank MOCCO solutions showed significantly improved preservation of temporal fidelity and aliasing/noise suppression in highly accelerated CE MRA (acceleration up to 27) and cardiac CINE (acceleration up to 15) data. Conclusions MOCCO overcomes several important deficiencies of previously proposed methods based on pre-estimated temporal models and allows high quality image restoration from highly undersampled CE-MRA and cardiac CINE data. PMID:25399724

  12. PharmDock: a pharmacophore-based docking program

    PubMed Central

    2014-01-01

    Background Protein-based pharmacophore models are enriched with the information of potential interactions between ligands and the protein target. We have shown in a previous study that protein-based pharmacophore models can be applied for ligand pose prediction and pose ranking. In this publication, we present a new pharmacophore-based docking program PharmDock that combines pose sampling and ranking based on optimized protein-based pharmacophore models with local optimization using an empirical scoring function. Results Tests of PharmDock on ligand pose prediction, binding affinity estimation, compound ranking and virtual screening yielded comparable or better performance to existing and widely used docking programs. The docking program comes with an easy-to-use GUI within PyMOL. Two features have been incorporated in the program suite that allow for user-defined guidance of the docking process based on previous experimental data. Docking with those features demonstrated superior performance compared to unbiased docking. Conclusion A protein pharmacophore-based docking program, PharmDock, has been made available with a PyMOL plugin. PharmDock and the PyMOL plugin are freely available from http://people.pharmacy.purdue.edu/~mlill/software/pharmdock. PMID:24739488

  13. Discrepancies between multicriteria decision analysis-based ranking and intuitive ranking for pharmaceutical benefit-risk profiles in a hypothetical setting.

    PubMed

    Hoshikawa, K; Ono, S

    2017-02-01

    Multicriteria decision analysis (MCDA) has been generally considered a promising decision-making methodology for the assessment of drug benefit-risk profiles. There have been many discussions in both public and private sectors on its feasibility and applicability, but it has not been employed in official decision-making. For the purpose of examining to what extent MCDA would reflect the first-hand, intuitive preference of evaluators in practical pharmaceutical assessments, we conducted a questionnaire survey involving the participation of employees of pharmaceutical companies. Each respondent was shown efficacy and safety profiles of four hypothetical drugs and asked to rank them following the standard MCDA process and then to rank them intuitively (i.e. without applying any analytical framework). These two approaches resulted in substantially different ranking patterns from the same individuals, and the concordance rate was surprisingly low (17%). Although many respondents intuitively showed a preference for mild, balanced risk-benefit profiles over profiles with a conspicuous advantage in either risk or benefit, the ranking orders based on MCDA scores did not reflect the intuitive preference. Observed discrepancies between the rankings appeared to be primarily attributable to the structural characteristics of MCDA, which assumes that the evaluation of each benefit and risk component has a monotonic impact on the final scores. It would be difficult for MCDA to reflect commonly observed non-monotonic preferences for risk and benefit profiles. Possible drawbacks of MCDA should be further investigated prior to the real-world application of its benefit-risk assessment. © 2016 John Wiley & Sons Ltd.
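
    For readers unfamiliar with the additive structure being criticized, the toy sketch below shows a weighted-sum MCDA score in which every criterion contributes monotonically to the total; the weights, criteria and value scores are hypothetical and only loosely mirror the survey's four-drug setup.

```python
# Toy additive MCDA scoring (hypothetical weights and 0-1 value scores) to show
# why each criterion contributes monotonically to the final score.
weights = {"efficacy": 0.5, "serious_AE": 0.3, "mild_AE": 0.2}   # weights sum to 1
# Value scores after converting raw benefit/risk measures to a 0-1 scale (1 = best).
drugs = {
    "A": {"efficacy": 0.9, "serious_AE": 0.2, "mild_AE": 0.6},
    "B": {"efficacy": 0.6, "serious_AE": 0.7, "mild_AE": 0.7},
    "C": {"efficacy": 0.5, "serious_AE": 0.9, "mild_AE": 0.5},
    "D": {"efficacy": 0.8, "serious_AE": 0.4, "mild_AE": 0.4},
}
scores = {d: sum(weights[c] * v[c] for c in weights) for d, v in drugs.items()}
for d, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(d, round(s, 3))
```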

  14. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    PubMed

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as paired t-test, Wilcoxon signed rank test, and significance analysis of microarray (SAM) under certain non-normal distributions. The asymptotic distribution of the test statistic, and the p-value function are discussed. The application of proposed method is shown using a real-life data set. © The Author(s) 2011.

  15. A denoising algorithm for CT image using low-rank sparse coding

    NASA Astrophysics Data System (ADS)

    Lei, Yang; Xu, Dong; Zhou, Zhengyang; Wang, Tonghe; Dong, Xue; Liu, Tian; Dhabaan, Anees; Curran, Walter J.; Yang, Xiaofeng

    2018-03-01

    We propose a CT image denoising method based on low-rank sparse coding. The proposed method constructs an adaptive dictionary of image patches and estimates the sparse coding regularization parameters using the Bayesian interpretation. A low-rank approximation approach is used to simultaneously construct the dictionary and achieve sparse representation through clustering similar image patches. A variable-splitting scheme and a quadratic optimization are used to reconstruct the CT image from the obtained sparse coefficients. We tested this denoising technology using phantom, brain and abdominal CT images. The experimental results showed that the proposed method delivers state-of-the-art denoising performance, both in terms of objective criteria and visual quality.

  16. Testing the robustness of management decisions to uncertainty: Everglades restoration scenarios.

    PubMed

    Fuller, Michael M; Gross, Louis J; Duke-Sylvester, Scott M; Palmer, Mark

    2008-04-01

    To effectively manage large natural reserves, resource managers must prepare for future contingencies while balancing the often conflicting priorities of different stakeholders. To deal with these issues, managers routinely employ models to project the response of ecosystems to different scenarios that represent alternative management plans or environmental forecasts. Scenario analysis is often used to rank such alternatives to aid the decision making process. However, model projections are subject to uncertainty in assumptions about model structure, parameter values, environmental inputs, and subcomponent interactions. We introduce an approach for testing the robustness of model-based management decisions to the uncertainty inherent in complex ecological models and their inputs. We use relative assessment to quantify the relative impacts of uncertainty on scenario ranking. To illustrate our approach we consider uncertainty in parameter values and uncertainty in input data, with specific examples drawn from the Florida Everglades restoration project. Our examples focus on two alternative 30-year hydrologic management plans that were ranked according to their overall impacts on wildlife habitat potential. We tested the assumption that varying the parameter settings and inputs of habitat index models does not change the rank order of the hydrologic plans. We compared the average projected index of habitat potential for four endemic species and two wading-bird guilds to rank the plans, accounting for variations in parameter settings and water level inputs associated with hypothetical future climates. Indices of habitat potential were based on projections from spatially explicit models that are closely tied to hydrology. For the American alligator, the rank order of the hydrologic plans was unaffected by substantial variation in model parameters. By contrast, simulated major shifts in water levels led to reversals in the ranks of the hydrologic plans in 24.1-30.6% of the projections for the wading bird guilds and several individual species. By exposing the differential effects of uncertainty, relative assessment can help resource managers assess the robustness of scenario choice in model-based policy decisions.

  17. Diversity rankings among bacterial lineages in soil.

    PubMed

    Youssef, Noha H; Elshahed, Mostafa S

    2009-03-01

    We used rarefaction curve analysis and diversity ordering-based approaches to rank the 11 most frequently encountered bacterial lineages in soil according to diversity in 5 previously reported 16S rRNA gene clone libraries derived from agricultural, undisturbed tall grass prairie and forest soils (n = 26,140; 28,328; 31,818; 13,001 and 53,533). The Planctomycetes, Firmicutes and the delta-Proteobacteria were consistently ranked among the most diverse lineages in all data sets, whereas the Verrucomicrobia, Gemmatimonadetes and beta-Proteobacteria were consistently ranked among the least diverse. On the other hand, the rankings of alpha-Proteobacteria, Acidobacteria, Actinobacteria, Bacteroidetes and Chloroflexi varied widely in different soil clone libraries. In general, lineages exhibiting largest differences in diversity rankings also exhibited the largest difference in relative abundance in the data sets examined. Within these lineages, a positive correlation between relative abundance and diversity was observed within the Acidobacteria, Actinobacteria and Chloroflexi, and a negative diversity-abundance correlation was observed within the Bacteroidetes. The ecological and evolutionary implications of these results are discussed.

  18. The origins of deference: when do people prefer lower status?

    PubMed

    Anderson, Cameron; Willer, Robb; Kilduff, Gavin J; Brown, Courtney E

    2012-05-01

    Although the desire for high status is considered universal, prior research suggests individuals often opt for lower status positions. Why would anyone favor a position of apparent disadvantage? In 5 studies, we found that the broad construct of status striving can be broken up into two conceptions: one based on rank, the other on respect. While individuals might universally desire high levels of respect, we find that they vary widely in the extent to which they strive for high-status rank, with many individuals opting for middle- or low-status rank. The status rank that individuals preferred depended on their self-perceived value to the group: when they believed they provided less value, they preferred lower status rank. Mediation and moderation analyses suggest that beliefs about others' expectations were the primary driver of these effects. Individuals who believed they provided little value to their group inferred that others expected them to occupy a lower status position. Individuals in turn conformed to these perceived expectations, accepting lower status rank in such settings.

  19. A new method for comparing rankings through complex networks: Model and analysis of competitiveness of major European soccer leagues

    NASA Astrophysics Data System (ADS)

    Criado, Regino; García, Esther; Pedroche, Francisco; Romance, Miguel

    2013-12-01

    In this paper, we show a new technique to analyze families of rankings. In particular, we focus on sports rankings and, more precisely, on soccer leagues. We consider that two teams compete when they change their relative positions in consecutive rankings. This allows us to define a graph by linking teams that compete. We show how to use some structural properties of this competitivity graph to measure to what extent the teams in a league compete. These structural properties are the mean degree, the mean strength, and the clustering coefficient. We give a generalization of Kendall's correlation coefficient to more than two rankings. We also show how to make a dynamic analysis of a league and how to compare different leagues. We apply this technique to analyze the four major European soccer leagues: Bundesliga, Italian Lega, Spanish Liga, and Premier League. We compare our results with the classical analysis of sport ranking based on measures of competitive balance.
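
    A minimal sketch of the competitivity-graph construction follows: two teams are linked whenever their relative order is inverted between consecutive rankings, and the mean degree and clustering coefficient are then read off the graph. The team names and rankings are made up, and networkx is assumed available.

```python
# Sketch: link two teams whenever they swap relative order between consecutive
# rankings, then compute simple structural measures (hypothetical data).
from itertools import combinations
import networkx as nx

rankings = [            # position lists for three consecutive matchdays
    ["A", "B", "C", "D", "E"],
    ["B", "A", "C", "E", "D"],
    ["B", "C", "A", "E", "D"],
]

G = nx.Graph()
G.add_nodes_from(rankings[0])
for prev, curr in zip(rankings, rankings[1:]):
    pos_prev = {t: i for i, t in enumerate(prev)}
    pos_curr = {t: i for i, t in enumerate(curr)}
    for t1, t2 in combinations(prev, 2):
        # relative order inverted => the two teams "competed"
        if (pos_prev[t1] - pos_prev[t2]) * (pos_curr[t1] - pos_curr[t2]) < 0:
            G.add_edge(t1, t2)

degrees = dict(G.degree())
print("mean degree:", sum(degrees.values()) / G.number_of_nodes())
print("clustering coefficient:", nx.average_clustering(G))
```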

  20. Thalamo-Sensorimotor Functional Connectivity Correlates with World Ranking of Olympic, Elite, and High Performance Athletes.

    PubMed

    Huang, Zirui; Davis, Henry Hap; Wolff, Annemarie; Northoff, Georg

    2017-01-01

    Brain plasticity studies have shown functional reorganization in participants with outstanding motor expertise. Little is known about neural plasticity associated with exceptionally long motor training or of its predictive value for motor performance excellence. The present study utilised resting-state functional magnetic resonance imaging (rs-fMRI) in a unique sample of world-class athletes: Olympic, elite, and internationally ranked swimmers ( n = 30). Their world ranking ranged from 1st to 250th: each had prepared for participation in the Olympic Games. Combining rs-fMRI graph-theoretical and seed-based functional connectivity analyses, it was discovered that the thalamus has its strongest connections with the sensorimotor network in elite swimmers with the highest world rankings (career best rank: 1-35). Strikingly, thalamo-sensorimotor functional connections were highly correlated with the swimmers' motor performance excellence, that is, accounting for 41% of the individual variance in best world ranking. Our findings shed light on neural correlates of long-term athletic performance involving thalamo-sensorimotor functional circuits.

  1. A multimedia retrieval framework based on semi-supervised ranking and relevance feedback.

    PubMed

    Yang, Yi; Nie, Feiping; Xu, Dong; Luo, Jiebo; Zhuang, Yueting; Pan, Yunhe

    2012-04-01

    We present a new framework for multimedia content analysis and retrieval which consists of two independent algorithms. First, we propose a new semi-supervised algorithm called ranking with Local Regression and Global Alignment (LRGA) to learn a robust Laplacian matrix for data ranking. In LRGA, for each data point, a local linear regression model is used to predict the ranking scores of its neighboring points. A unified objective function is then proposed to globally align the local models from all the data points so that an optimal ranking score can be assigned to each data point. Second, we propose a semi-supervised long-term Relevance Feedback (RF) algorithm to refine the multimedia data representation. The proposed long-term RF algorithm utilizes both the multimedia data distribution in multimedia feature space and the history RF information provided by users. A trace ratio optimization problem is then formulated and solved by an efficient algorithm. The algorithms have been applied to several content-based multimedia retrieval applications, including cross-media retrieval, image retrieval, and 3D motion/pose data retrieval. Comprehensive experiments on four data sets have demonstrated its advantages in precision, robustness, scalability, and computational efficiency.

  2. Scoping Studies to Evaluate the Benefits of an Advanced Dry Feed System on the Use of Low-Rank Coal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rader, Jeff; Aguilar, Kelly; Aldred, Derek

    2012-03-30

    The purpose of this project was to evaluate the ability of advanced low rank coal gasification technology to cause a significant reduction in the COE for IGCC power plants with 90% carbon capture and sequestration compared with the COE for similarly configured IGCC plants using conventional low rank coal gasification technology. GE’s advanced low rank coal gasification technology uses the Posimetric Feed System, a new dry coal feed system based on GE’s proprietary Posimetric Feeder. In order to demonstrate the performance and economic benefits of the Posimetric Feeder in lowering the cost of low rank coal-fired IGCC power with carbon capture, two case studies were completed. In the Base Case, the gasifier was fed a dilute slurry of Montana Rosebud PRB coal using GE’s conventional slurry feed system. In the Advanced Technology Case, the slurry feed system was replaced with the Posimetric Feed system. The process configurations of both cases were kept the same, to the extent possible, in order to highlight the benefit of substituting the Posimetric Feed System for the slurry feed system.

  3. The application of fuzzy Delphi and fuzzy inference system in supplier ranking and selection

    NASA Astrophysics Data System (ADS)

    Tahriri, Farzad; Mousavi, Maryam; Hozhabri Haghighi, Siamak; Zawiah Md Dawal, Siti

    2014-06-01

    In today's highly competitive market, an effective supplier selection process is vital to the success of any manufacturing system. Selecting the appropriate supplier is always a difficult task because suppliers possess varied strengths and weaknesses that necessitate careful evaluations prior to suppliers' ranking. This is a complex process with many subjective and objective factors to consider before the benefits of supplier selection are achieved. This paper identifies six extremely critical criteria and thirteen sub-criteria based on the literature. A new methodology employing those criteria and sub-criteria is proposed for the assessment and ranking of a given set of suppliers. To handle the subjectivity of the decision maker's assessment, an integration of fuzzy Delphi with a fuzzy inference system has been applied, and a new ranking method is proposed for the supplier selection problem. This supplier selection model enables decision makers to rank the suppliers based on three classifications including "extremely preferred", "moderately preferred", and "weakly preferred". In addition, in each classification, suppliers are put in order from the highest final score to the lowest. Finally, the methodology is verified and validated through an example of a numerical test bed.

  4. Exchange-Hole Dipole Dispersion Model for Accurate Energy Ranking in Molecular Crystal Structure Prediction.

    PubMed

    Whittleton, Sarah R; Otero-de-la-Roza, A; Johnson, Erin R

    2017-02-14

    Accurate energy ranking is a key facet to the problem of first-principles crystal-structure prediction (CSP) of molecular crystals. This work presents a systematic assessment of B86bPBE-XDM, a semilocal density functional combined with the exchange-hole dipole moment (XDM) dispersion model, for energy ranking using 14 compounds from the first five CSP blind tests. Specifically, the set of crystals studied comprises 11 rigid, planar compounds and 3 co-crystals. The experimental structure was correctly identified as the lowest in lattice energy for 12 of the 14 total crystals. One of the exceptions is 4-hydroxythiophene-2-carbonitrile, for which the experimental structure was correctly identified once a quasi-harmonic estimate of the vibrational free-energy contribution was included, evidencing the occasional importance of thermal corrections for accurate energy ranking. The other exception is an organic salt, where charge-transfer error (also called delocalization error) is expected to cause the base density functional to be unreliable. Provided the choice of base density functional is appropriate and an estimate of temperature effects is used, XDM-corrected density-functional theory is highly reliable for the energetic ranking of competing crystal structures.

  5. Fast Low-Rank Bayesian Matrix Completion With Hierarchical Gaussian Prior Models

    NASA Astrophysics Data System (ADS)

    Yang, Linxiao; Fang, Jun; Duan, Huiping; Li, Hongbin; Zeng, Bing

    2018-06-01

    The problem of low-rank matrix completion is considered in this paper. To exploit the underlying low-rank structure of the data matrix, we propose a hierarchical Gaussian prior model, where columns of the low-rank matrix are assumed to follow a Gaussian distribution with zero mean and a common precision matrix, and a Wishart distribution is specified as a hyperprior over the precision matrix. We show that such a hierarchical Gaussian prior has the potential to encourage a low-rank solution. Based on the proposed hierarchical prior model, a variational Bayesian method is developed for matrix completion, where the generalized approximate message passing (GAMP) technique is embedded into the variational Bayesian inference in order to circumvent cumbersome matrix inverse operations. Simulation results show that our proposed method demonstrates superiority over existing state-of-the-art matrix completion methods.
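
    The sketch below is not the variational Bayesian/GAMP method of the paper; it is a generic iterative singular-value soft-thresholding baseline, included only to make the low-rank matrix completion problem itself concrete (the matrix sizes, sampling rate and threshold are arbitrary).

```python
# Generic low-rank matrix completion by iterative singular-value soft-thresholding
# (a simple baseline to make the problem concrete; NOT the VB-GAMP method above).
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 60, 50, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # true low-rank matrix
mask = rng.random((m, n)) < 0.4                                  # 40% observed entries

X, tau = np.zeros((m, n)), 5.0
for _ in range(200):
    X[mask] = M[mask]                              # enforce observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = U @ np.diag(np.maximum(s - tau, 0)) @ Vt   # shrink singular values

err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
print("relative error on unobserved entries:", round(err, 3))
```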

  6. 10 CFR 455.131 - State ranking of grant applications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) For technical assistance programs, buildings shall be ranked in descending priority based upon the... all buildings covered by eligible applications for: (1) Technical assistance programs for units of local government and public care institutions and (2) Technical assistance programs for schools and...

  7. Integrated Low-Rank-Based Discriminative Feature Learning for Recognition.

    PubMed

    Zhou, Pan; Lin, Zhouchen; Zhang, Chao

    2016-05-01

    Feature learning plays a central role in pattern recognition. In recent years, many representation-based feature learning methods have been proposed and have achieved great success in many applications. However, these methods perform feature learning and subsequent classification in two separate steps, which may not be optimal for recognition tasks. In this paper, we present a supervised low-rank-based approach for learning discriminative features. By integrating latent low-rank representation (LatLRR) with a ridge regression-based classifier, our approach combines feature learning with classification, so that the regulated classification error is minimized. In this way, the extracted features are more discriminative for the recognition tasks. Our approach benefits from a recent discovery on the closed-form solutions to noiseless LatLRR. When there is noise, a robust Principal Component Analysis (PCA)-based denoising step can be added as preprocessing. When the scale of a problem is large, we utilize a fast randomized algorithm to speed up the computation of robust PCA. Extensive experimental results demonstrate the effectiveness and robustness of our method.

  8. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    NASA Astrophysics Data System (ADS)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
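
    The sketch below covers only the HMM half of the approach, using the hmmlearn package (assumed available) on synthetic monthly flows; the kernelized K-medoids initialization and the Gaussian Mixture Regression step of HMM-GMR are not reproduced.

```python
# Sketch of the HMM half only (hmmlearn assumed available); synthetic data.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# Toy feature matrix: antecedent flow at the local station and two neighbours.
X = np.column_stack([rng.gamma(shape=2.0, scale=50.0, size=240) for _ in range(3)])

model = GaussianHMM(n_components=3, covariance_type="full", n_iter=200,
                    random_state=0)
model.fit(X)                       # Baum-Welch parameter estimation
states = model.predict(X)          # most likely hidden "climate condition" per month
print("state counts:", np.bincount(states))
```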

  9. Quantum Max-flow/Min-cut

    NASA Astrophysics Data System (ADS)

    Cui, Shawn X.; Freedman, Michael H.; Sattath, Or; Stong, Richard; Minton, Greg

    2016-06-01

    The classical max-flow min-cut theorem describes transport through certain idealized classical networks. We consider the quantum analog for tensor networks. By associating an integral capacity to each edge and a tensor to each vertex in a flow network, we can also interpret it as a tensor network and, more specifically, as a linear map from the input space to the output space. The quantum max-flow is defined to be the maximal rank of this linear map over all choices of tensors. The quantum min-cut is defined to be the minimum product of the capacities of edges over all cuts of the tensor network. We show that unlike the classical case, the quantum max-flow=min-cut conjecture is not true in general. Under certain conditions, e.g., when the capacity on each edge is some power of a fixed integer, the quantum max-flow is proved to equal the quantum min-cut. However, concrete examples are also provided where the equality does not hold. We also found connections of quantum max-flow/min-cut with entropy of entanglement and the quantum satisfiability problem. We speculate that the phenomena revealed may be of interest both in spin systems in condensed matter and in quantum gravity.
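
    The definitions can be illustrated on a tiny two-vertex "series" network: contracting random vertex tensors yields a linear map from the input edge to the output edge, whose rank can be compared with the minimum cut capacity product. The dimensions below are arbitrary, and for this simple topology random tensors generically attain the min-cut; the paper's counterexamples involve more intricate networks.

```python
# Tiny illustration of the definitions on a two-vertex "series" network:
# input edge (dim 4) -> A -> two internal edges (dims 2, 3) -> B -> output edge (dim 5).
import numpy as np

rng = np.random.default_rng(0)
d_in, c1, c2, d_out = 4, 2, 3, 5
A = rng.standard_normal((d_in, c1, c2))     # tensor at vertex A
B = rng.standard_normal((c1, c2, d_out))    # tensor at vertex B

# Contract the internal edges: the network becomes a linear map input -> output.
M = np.einsum("iab,abo->io", A, B)

quantum_min_cut = min(d_in, c1 * c2, d_out)          # smallest cut capacity product
rank_M = np.linalg.matrix_rank(M)                    # lower-bounds the quantum max-flow
print("rank of induced map:", rank_M)
print("quantum min-cut    :", quantum_min_cut)
```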

  10. Application of the PROMETHEE technique to determine depression outlet location and flow direction in DEM

    NASA Astrophysics Data System (ADS)

    Chou, Tien-Yin; Lin, Wen-Tzu; Lin, Chao-Yuan; Chou, Wen-Chieh; Huang, Pi-Hui

    2004-02-01

    With the rapid progress of computer technology, spatial information on watersheds such as flow direction, watershed boundaries and the drainage network can be automatically calculated or extracted from a digital elevation model (DEM). The stubborn problem of depressions in DEMs is frequently encountered when extracting spatial information about terrain. Several filling methods have been proposed for resolving depressions; however, they are poorly suited to large-scale flat areas. This study proposes a depression watershed method coupled with the Preference Ranking Organization METHod for Enrichment Evaluations (PROMETHEE) theory to determine the optimal outlet and calculate the flow direction in depressions. Three processing procedures are used to derive the depressionless flow direction: (1) calculating the incipient flow direction; (2) establishing the depression watershed by tracing the upstream drainage area and determining the depression outlet using PROMETHEE theory; (3) calculating the depressionless flow direction. The developed method was used to delineate the Shihmen Reservoir watershed located in Northern Taiwan. The results show that the depression watershed method can effectively overcome shortcomings such as ambiguous depression outlet determination and looped flow directions between depressions. The suitability of the proposed approach was verified.

  11. Signal detection on spontaneous reports of adverse events following immunisation: a comparison of the performance of a disproportionality-based algorithm and a time-to-onset-based algorithm

    PubMed Central

    van Holle, Lionel; Bauchau, Vincent

    2014-01-01

    Purpose Disproportionality methods measure how unexpected the observed number of adverse events is. Time-to-onset (TTO) methods measure how unexpected the TTO distribution of a vaccine-event pair is compared with what is expected from other vaccines and events. Our purpose is to compare the performance associated with each method. Methods For the disproportionality algorithms, we defined 336 combinations of stratification factors (sex, age, region and year) and threshold values of the multi-item gamma Poisson shrinker (MGPS). For the TTO algorithms, we defined 18 combinations of significance level and time windows. We used spontaneous reports of adverse events recorded for eight vaccines. The vaccine product labels were used as proxies for true safety signals. Algorithms were ranked according to their positive predictive value (PPV) for each vaccine separately; a median rank was attributed to each algorithm across vaccines. Results The algorithm with the highest median rank was based on TTO with a significance level of 0.01 and a time window of 60 days after immunisation. It had an overall PPV 2.5 times higher than that of the highest-ranked MGPS algorithm (16th rank overall), which was fully stratified and had a threshold value of 0.8. A TTO algorithm with roughly the same sensitivity as the highest-ranked MGPS had better specificity but longer time-to-detection. Conclusions Within the scope of this study, the majority of the TTO algorithms presented a higher PPV than any MGPS algorithm. Considering the complementarity of TTO and disproportionality methods, a signal detection strategy combining them merits further investigation. PMID:24038719

  12. Healthcare information systems: data mining methods in the creation of a clinical recommender system

    NASA Astrophysics Data System (ADS)

    Duan, L.; Street, W. N.; Xu, E.

    2011-05-01

    Recommender systems have been extensively studied to present items, such as movies, music and books, that are likely of interest to the user. Researchers have indicated that integrated medical information systems are becoming an essential part of modern healthcare systems. Such systems have evolved into integrated enterprise-wide systems. In particular, such systems are considered a type of enterprise information system, or ERP system, addressing the needs of the healthcare industry sector. As part of these efforts, nursing care plan recommender systems can provide clinical decision support, nursing education, clinical quality control, and serve as a complement to existing practice guidelines. We propose to use correlations among nursing diagnoses, outcomes and interventions to create a recommender system for constructing nursing care plans. In the current study, we used nursing diagnosis data to develop the methodology. Our system utilises a prefix-tree structure common in itemset mining to construct a ranked list of suggested care plan items based on previously-entered items. Unlike common commercial systems, our system makes sequential recommendations based on user interaction, modifying a ranked list of suggested items at each step in care plan construction. We rank items based on traditional association-rule measures such as support and confidence, as well as a novel measure that anticipates which selections might improve the quality of future rankings. Since the multi-step nature of our recommendations presents problems for traditional evaluation measures, we also present a new evaluation method based on average ranking position and use it to test the effectiveness of different recommendation strategies.
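
    A minimal sketch of confidence-based ranking of candidate care-plan items given the items already entered is shown below; the toy plans and item names are invented, and the prefix-tree data structure and the novel forward-looking measure from the paper are not reproduced.

```python
# Sketch: rank candidate care-plan items by confidence P(candidate | entered items),
# estimated from past plans (toy data; not the paper's prefix-tree implementation).
from collections import Counter

past_plans = [
    {"acute pain", "impaired mobility", "pain control"},
    {"acute pain", "pain control", "anxiety"},
    {"impaired mobility", "fall prevention"},
    {"acute pain", "anxiety", "pain control", "fall prevention"},
]
entered = {"acute pain"}

support_entered = sum(1 for p in past_plans if entered <= p)
candidates = Counter()
for p in past_plans:
    if entered <= p:
        candidates.update(p - entered)

ranking = sorted(((c, n / support_entered) for c, n in candidates.items()),
                 key=lambda kv: -kv[1])
for item, conf in ranking:
    print(f"{item}\tconfidence={conf:.2f}")
```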

  13. Inland waterway ports nodal attraction indices relevant in development strategies on regional level

    NASA Astrophysics Data System (ADS)

    Dinu, O.; Burciu, Ş.; Oprea, C.; Ilie, A.; Rosca, M.

    2016-08-01

    The present paper proposes a set of ranking indices and related criteria, concerned mainly with spatial analysis, for inland waterway ports, with a special view on the inland ports of the Danube. Commonly, the attraction potential of a certain transport node is assessed by its spatial accessibility indices, considering both the spatial features of the location provided by the networks that connect into that node and its economic potential, which defines the level of traffic flows depending on the economic centers of its hinterland. The paper starts with an overview of the critical requirements for potential sites to become inland waterway ports and presents nodal functions that coexist at different levels, leading to a port hierarchy from the points of view of capacity, connection to the hinterland, and traffic structure and volume. After a brief review of the key inland waterway port ranking criteria, a selection of nodal attraction measures is made. Particular considerations for the Danube inland port case follow, applying the proposed methodology concerning indices of performance for network scale and centrality. As expected, the shorter the distance from an inland port to the nearest access point, the greater the accessibility. Major differences in ranking, depending on the selected criterion, were registered.

  14. Natural and microcosm phytoneuston communities of Sequim Bay, Washington

    NASA Astrophysics Data System (ADS)

    Hardy, John T.; Valett, Melee

    1981-01-01

    The sea surface microlayer (upper 56 μm) of Sequim Bay, Washington was sampled using a glass plate collector. At the same stations, water was collected from 10 cm deep through a glass tube. The surface microlayer algal community differed significantly in several respects from the subsurface community. The mean ratio of microlayer phytoneuston to subsurface phytoplankton density (M/S ratio) was 7.8 ± 10.9, and at one station with a visible slick reached 32.0. The rank order of individual species abundances differed significantly between the phytoneuston and phytoplankton. Outdoor tanks with a slow flow-through of seawater produced dense microlayer communities. Within 7 days, the tanks had an M/S ratio of 500. In terms of species rank order, the tank neuston are similar to field neuston, but differ significantly from tank plankton communities. Tank and field plankton communities are not significantly different either in total abundance or species rank order. These results indicate that natural phytoneuston communities can be duplicated with only minor species differences in the laboratory and used for controlled studies of the fate of pollutants passing into or through the air-sea interface.

  15. Methodology to identify risk-significant components for inservice inspection and testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues.

  16. A Hybrid Method for Opinion Finding Task (KUNLP at TREC 2008 Blog Track)

    DTIC Science & Technology

    2008-11-01

    retrieve relevant documents. For the Opinion Retrieval subtask, we propose a hybrid model of lexicon-based approach and machine learning approach for...estimating and ranking the opinionated documents. For the Polarized Opinion Retrieval subtask, we employ machine learning for predicting the polarity...and linear combination technique for ranking polar documents. The hybrid model which utilizes both lexicon-based approach and machine learning approach

  17. Notes from 1999 on computational algorithm of the Local Wave-Vector (LWV) model for the dynamical evolution of the second-rank velocity correlation tensor starting from the mean-flow-coupled Navier-Stokes equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemach, Charles; Kurien, Susan

    These notes present an account of the Local Wave Vector (LWV) model of a turbulent flow defined throughout physical space. The previously-developed Local Wave Number (LWN) model is taken as a point of departure. Some general properties of turbulent fields and appropriate notation are given first. The LWV model is presently restricted to incompressible flows and the incompressibility assumption is introduced at an early point in the discussion. The assumption that the turbulence is homogeneous is also introduced early on. This assumption can be relaxed by generalizing the space diffusion terms of LWN, but the present discussion is focused on a modeling of homogeneous turbulence.

  18. The Knowledge Economy and Higher Education: Rankings and Classifications, Research Metrics and Learning Outcomes Measures as a System for Regulating the Value of Knowledge

    ERIC Educational Resources Information Center

    Marginson, Simon

    2009-01-01

    This paper describes the global knowledge economy (the k-economy), comprised by (1) open source knowledge flows and (2) commercial markets in intellectual property and knowledge-intensive goods. Like all economy the global knowledge economy is a site of production. It is also social and cultural, taking the form of a one-world community mediated…

  19. American Prisoners of Japan: Did Rank have Its Privilege?

    DTIC Science & Technology

    transportation, leadership problems, and overall death rates. The study concludes that there were significant differences in treatment based on rank...These differences caused extremely high enlisted death rates during the first year of captivity. The officers fared worse as a group, however, because the

  20. PILOT-SCALE EVALUATION OF AN INCINERABILITY RANKING SYSTEM FOR HAZARDOUS ORGANIC COMPOUNDS

    EPA Science Inventory

    The subject study was conducted to evaluate an incinerability ranking system developed by the University of Dayton Research Institute under contract to the EPA Risk Reduction Engineering Laboratory. Mixtures of organic compounds were prepared and combined with a clay-based sorben...

  1. On Row Rank Equal Column Rank

    ERIC Educational Resources Information Center

    Khalili, Parviz

    2009-01-01

    We will prove a well-known theorem in Linear Algebra, that is, for any "m x n" matrix the dimension of row space and column space are the same. The proof is based on the subject of "elementary matrices" and "reduced row-echelon" form of a matrix.
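
    A quick numerical illustration of the theorem (not the elementary-matrix proof itself): numpy reports the same rank for a matrix and its transpose.

```python
# Numerical illustration: the rank of A (column rank) equals the rank of A^T (row rank).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 7))  # a 5x7 matrix of rank 3
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))    # both print 3
```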

  2. A study on the impact of parameter uncertainty on the emission-based ranking of transportation projects.

    DOT National Transportation Integrated Search

    2014-01-01

    With the growing concern with air quality levels and, hence, the livability of urban regions in the nation, it has become increasingly common to incorporate vehicular emission considerations in the ranking of transportation projects. Network assignme...

  3. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
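
    One widely used rank-based transformation of the kind evaluated here is the rank-based inverse normal transform; the sketch below applies it (with the Blom offset, an assumption about the exact variant) to a skewed synthetic trait using scipy.

```python
# One common rank-based transformation: rank-based inverse normal (Blom offset).
import numpy as np
from scipy.stats import norm, rankdata

def rank_inverse_normal(y, c=3.0 / 8.0):
    r = rankdata(y)                       # average ranks for ties
    return norm.ppf((r - c) / (len(y) - 2 * c + 1))

rng = np.random.default_rng(0)
trait = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # skewed quantitative trait
z = rank_inverse_normal(trait)
skew = lambda v: float(((v - v.mean()) ** 3).mean() / v.std() ** 3)
print("skewness before/after:", round(skew(trait), 2), round(skew(z), 2))
```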

  4. A Novel Riemannian Metric Based on Riemannian Structure and Scaling Information for Fixed Low-Rank Matrix Completion.

    PubMed

    Mao, Shasha; Xiong, Lin; Jiao, Licheng; Feng, Tian; Yeung, Sai-Kit

    2017-05-01

    Riemannian optimization has been widely used to deal with the fixed low-rank matrix completion problem, and the Riemannian metric is a crucial factor in obtaining the search direction in Riemannian optimization. This paper proposes a new Riemannian metric via simultaneously considering the Riemannian geometry structure and the scaling information, which is smoothly varying and invariant along the equivalence class. The proposed metric can make a tradeoff between the Riemannian geometry structure and the scaling information effectively. Essentially, it can be viewed as a generalization of some existing metrics. Based on the proposed Riemannian metric, we also design a Riemannian nonlinear conjugate gradient algorithm, which can efficiently solve the fixed low-rank matrix completion problem. Experiments on fixed low-rank matrix completion, collaborative filtering, and image and video recovery illustrate that the proposed method is superior to state-of-the-art methods in convergence efficiency and numerical performance.

  5. Efficient Multiple Kernel Learning Algorithms Using Low-Rank Representation.

    PubMed

    Niu, Wenjia; Xia, Kewen; Zu, Baokai; Bai, Jianchuan

    2017-01-01

    Unlike Support Vector Machine (SVM), Multiple Kernel Learning (MKL) allows a dataset to freely choose useful kernels based on its distribution characteristics rather than a single precise one. It has been shown in the literature that MKL achieves superior recognition accuracy compared with SVM, though at the expense of time-consuming computations. This creates analytical and computational difficulties in solving MKL algorithms. To overcome this issue, we first develop a novel kernel approximation approach for MKL and then propose an efficient Low-Rank MKL (LR-MKL) algorithm by using the Low-Rank Representation (LRR). It is well-acknowledged that LRR can reduce dimensionality while retaining the data features under a global low-rank constraint. Furthermore, we redesign the binary-class MKL as the multiclass MKL based on a pairwise strategy. Finally, the recognition effect and efficiency of LR-MKL are verified on the datasets Yale, ORL, LSVT, and Digit. Experimental results show that the proposed LR-MKL algorithm is an efficient kernel weight allocation method for MKL and greatly boosts the performance of MKL.

  6. Automated Assessment of the Quality of Depression Websites

    PubMed Central

    Tang, Thanh Tin; Hawking, David; Christensen, Helen

    2005-01-01

    Background Since health information on the World Wide Web is of variable quality, methods are needed to assist consumers to identify health websites containing evidence-based information. Manual assessment tools may assist consumers to evaluate the quality of sites. However, these tools are poorly validated and often impractical. There is a need to develop better consumer tools, and in particular to explore the potential of automated procedures for evaluating the quality of health information on the web. Objective This study (1) describes the development of an automated quality assessment procedure (AQA) designed to automatically rank depression websites according to their evidence-based quality; (2) evaluates the validity of the AQA relative to human rated evidence-based quality scores; and (3) compares the validity of Google PageRank and the AQA as indicators of evidence-based quality. Method The AQA was developed using a quality feedback technique and a set of training websites previously rated manually according to their concordance with statements in the Oxford University Centre for Evidence-Based Mental Health’s guidelines for treating depression. The validation phase involved 30 websites compiled from the DMOZ, Yahoo! and LookSmart Depression Directories by randomly selecting six sites from each of the Google PageRank bands of 0, 1-2, 3-4, 5-6 and 7-8. Evidence-based ratings from two independent raters (based on concordance with the Oxford guidelines) were then compared with scores derived from the automated AQA and Google algorithms. There was no overlap in the websites used in the training and validation phases of the study. Results The correlation between the AQA score and the evidence-based ratings was high and significant (r=0.85, P<.001). Addition of a quadratic component improved the fit, the combined linear and quadratic model explaining 82 percent of the variance. The correlation between Google PageRank and the evidence-based score was lower than that for the AQA. When sites with zero PageRanks were included the association was weak and non-significant (r=0.23, P=.22). When sites with zero PageRanks were excluded, the correlation was moderate (r=.61, P=.002). Conclusions Depression websites of different evidence-based quality can be differentiated using an automated system. If replicable, generalizable to other health conditions and deployed in a consumer-friendly form, the automated procedure described here could represent an important advance for consumers of Internet medical information. PMID:16403723

  7. RANKL/RANK: from bone loss to the prevention of breast cancer.

    PubMed

    Sigl, Verena; Jones, Laundette P; Penninger, Josef M

    2016-11-01

    RANK and RANKL, a receptor ligand pair belonging to the tumour necrosis factor family, are the critical regulators of osteoclast development and bone metabolism. Besides their essential function in bone, RANK and RANKL have also been identified as the key factors for the formation of a lactating mammary gland in pregnancy. Mechanistically, RANK and RANKL link the sex hormone progesterone with stem cell expansion and proliferation of mammary epithelial cells. Based on their normal physiology, RANKL/RANK control the onset of hormone-induced breast cancer through the expansion of mammary progenitor cells. Recently, we and others were able to show that RANK and RANKL are also critical regulators of BRCA1-mutation-driven breast cancer. Currently, the preventive strategy for BRCA1-mutation carriers includes preventive mastectomy, associated with wide-ranging risks and psychosocial effects. The search for an alternative non-invasive prevention strategy is therefore of paramount importance. As our work strongly implicates RANK and RANKL as key molecules involved in the initiation of BRCA1-associated breast cancer, we propose that anti-RANKL therapy could be a feasible preventive strategy for women carrying BRCA1 mutations, and by extension to other women with high risk of breast cancer. © 2016 The Authors.

  8. Low-rank structure learning via nonconvex heuristic recovery.

    PubMed

    Deng, Yue; Dai, Qionghai; Liu, Risheng; Zhang, Zengke; Hu, Sanqing

    2013-03-01

    In this paper, we propose a nonconvex framework to learn the essential low-rank structure from corrupted data. Different from traditional approaches, which directly utilize convex norms to measure sparseness, our method introduces more reasonable nonconvex measurements to enhance the sparsity in both the intrinsic low-rank structure and the sparse corruptions. We introduce, respectively, how to combine the widely used ℓp norm (0 < p < 1) and the log-sum term into the framework of low-rank structure learning. Although the proposed optimization is no longer convex, it can still be effectively solved by a majorization-minimization (MM)-type algorithm, with which the nonconvex objective function is iteratively replaced by its convex surrogate and the nonconvex problem finally falls into the general framework of reweighted approaches. We prove that the MM-type algorithm can converge to a stationary point after successive iterations. The proposed model is applied to solve two typical problems: robust principal component analysis and low-rank representation. Experimental results on low-rank structure learning demonstrate that our nonconvex heuristic methods, especially the log-sum heuristic recovery algorithm, generally perform much better than the convex-norm-based method (0 < p < 1) for both data with higher rank and with denser corruptions.

  9. Numerical and experimental approaches to simulate soil clogging in porous media

    NASA Astrophysics Data System (ADS)

    Kanarska, Yuliya; LLNL Team

    2012-11-01

    Failure of a dam by erosion ranks among the most serious accidents in civil engineering. The best way to prevent internal erosion is using adequate granular filters in the transition areas where important hydraulic gradients can appear. In case of cracking and erosion, if the filter is capable of retaining the eroded particles, the crack will seal and the dam's safety will be ensured. A finite element numerical solution of the Navier-Stokes equations for fluid flow together with a Lagrange multiplier technique for solid particles was applied to the simulation of soil filtration. The numerical approach was validated through comparison of numerical simulations with the experimental results of base soil particle clogging in the filter layers performed at ERDC. The numerical simulation correctly predicted flow and pressure decay due to particle clogging. The base soil particle distribution was almost identical to that measured in the laboratory experiment. To get a more precise understanding of soil transport in granular filters, we investigated the sensitivity of particle clogging mechanisms to various factors such as particle size ratio, the amplitude of the hydraulic gradient, particle concentration and contact properties. By averaging the results derived from the grain-scale simulations, we investigated how those factors affect the semi-empirical multiphase model parameters in the large-scale simulation tool. The Department of Homeland Security Science and Technology Directorate provided funding for this research.

  10. Processes in ranking nutrients of foods in a food data base.

    PubMed

    Khan, A S

    1996-01-01

    Foods may be retrieved from a computerised nutrient data base for many purposes, depending on the type of user. On one occasion a dietitian may need a quick, qualified assessment of foods during diet construction so that balancing the nutrients of the diet takes less time. On another occasion the dietitian may want to recommend a food to a client, which requires knowledge of the standing of that food with respect to one or more of its nutrient contents. A dietitian cannot memorise all foods and their nutrient content, and when the number of foods is large, referring to foods according to their standing becomes impractical. Ranking foods with respect to their nutrient contents, using a small number of categories, could therefore be very useful for dietetic purposes. This paper discusses the process of ranking foods as high, medium and low only, and proposes guidelines that can be used to reject inappropriate food-ranking schemes. The proposed guidelines are based on the results of experiments included in this paper.
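
    Purely to make the idea of a three-level ranking concrete (this is not the scheme proposed or evaluated in the paper), the sketch below labels foods high, medium or low for a single nutrient using tertiles of the per-100 g content across a small, made-up food table.

```python
import numpy as np

# hypothetical per-100 g vitamin C content (mg); food list and values are
# illustrative only, not taken from any particular nutrient data base
foods = {"orange": 53.0, "potato": 19.7, "spinach": 28.1,
         "apple": 4.6, "capsicum": 127.7, "rice": 0.0}

values = np.array(list(foods.values()))
lo_cut, hi_cut = np.quantile(values, [1 / 3, 2 / 3])   # tertile boundaries

def rank_food(mg):
    """Classify a nutrient amount as high, medium or low relative to the table."""
    if mg >= hi_cut:
        return "high"
    return "medium" if mg >= lo_cut else "low"

for name, mg in foods.items():
    print(f"{name:10s} {mg:6.1f} mg -> {rank_food(mg)}")
```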

  11. Optimization of the two-sample rank Neyman-Pearson detector

    NASA Astrophysics Data System (ADS)

    Akimov, P. S.; Barashkov, V. M.

    1984-10-01

    The development of optimal algorithms concerned with rank considerations in the case of finite sample sizes involves considerable mathematical difficulties. The present investigation provides results related to the design and the analysis of an optimal rank detector based on the Neyman-Pearson criterion. The detection of a signal in the presence of background noise is considered, taking into account n observations (readings) x1, x2, ... xn in the experimental communications channel. The rank of an observation is computed on the basis of relations between x and the variable y, which represents the interference. Attention is given to conditions in the absence of a signal, the probability of detection of an arriving signal, details regarding the utilization of the Neyman-Pearson criterion, the scheme of an optimal rank, multichannel, incoherent detector, and an analysis of the detector.
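
    The original detector is not reproduced here; as a hedged sketch of the same family of ideas, the code below implements a generic two-sample rank-sum detector whose threshold is set by Monte Carlo under the noise-only hypothesis to meet a target false-alarm probability, which is the Neyman-Pearson style of design the abstract refers to. Sample sizes, noise model and signal level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def rank_sum(x, y):
    """Sum of the ranks of the observations x within the pooled sample (x, y)."""
    pooled = np.concatenate([x, y])
    ranks = pooled.argsort().argsort() + 1        # ranks 1..n+m (no tie handling)
    return ranks[: len(x)].sum()

def detection_threshold(n, m, pfa=0.01, trials=20000):
    """Monte Carlo threshold keeping the false-alarm probability near pfa
    when x and y are identically distributed noise (H0)."""
    stats = [rank_sum(rng.normal(size=n), rng.normal(size=m)) for _ in range(trials)]
    return np.quantile(stats, 1.0 - pfa)

n, m = 16, 64                                     # observations vs. reference noise
thr = detection_threshold(n, m)

x = 0.7 + rng.normal(size=n)                      # weak constant signal plus noise
y = rng.normal(size=m)                            # noise-only reference channel
stat = rank_sum(x, y)
print("statistic:", stat, "threshold:", thr, "detect:", stat > thr)
```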

  12. Optimizing Search and Ranking in Folksonomy Systems by Exploiting Context Information

    NASA Astrophysics Data System (ADS)

    Abel, Fabian; Henze, Nicola; Krause, Daniel

    Tagging systems enable users to annotate resources with freely chosen keywords. The evolving set of tag assignments is called a folksonomy, and several approaches already exploit folksonomies to improve resource retrieval. In this paper, we analyze and compare graph-based ranking algorithms: FolkRank and SocialPageRank. We enhance these algorithms by exploiting the context of tags, and evaluate the results on the GroupMe! dataset. In GroupMe!, users can organize and maintain arbitrary Web resources in self-defined groups. When users annotate resources in GroupMe!, this can be interpreted in the context of a certain group. The grouping activity itself is easy for users to perform. However, it delivers valuable semantic information about resources and their context. We present GRank, which uses the context information to improve and optimize the detection of relevant search results, and we compare different strategies for ranking result lists in folksonomy systems.
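
    FolkRank, SocialPageRank and GRank themselves are not reproduced here; the sketch below only shows the common core such algorithms build on: a PageRank-style power iteration with a preference (teleport) vector that biases scores toward a tag of interest, run on a tiny invented user-tag-resource co-occurrence graph.

```python
import numpy as np

# tiny invented folksonomy graph: users, tags and resources as nodes,
# with undirected co-occurrence edges derived from tag assignments
nodes = ["u1", "u2", "tag:python", "tag:ranking", "r1", "r2", "r3"]
edges = [("u1", "tag:python"), ("u1", "r1"), ("tag:python", "r1"),
         ("u2", "tag:ranking"), ("u2", "r2"), ("tag:ranking", "r2"),
         ("u1", "tag:ranking"), ("tag:ranking", "r3")]

idx = {n: i for i, n in enumerate(nodes)}
A = np.zeros((len(nodes), len(nodes)))
for a, b in edges:
    A[idx[a], idx[b]] = A[idx[b], idx[a]] = 1.0
M = A / A.sum(axis=0, keepdims=True)              # column-stochastic transitions

def biased_pagerank(M, pref, d=0.85, iters=100):
    """Power iteration with a preference vector (the bias toward a tag/context)."""
    r = np.full(M.shape[0], 1.0 / M.shape[0])
    for _ in range(iters):
        r = d * M @ r + (1 - d) * pref
    return r

pref = np.zeros(len(nodes))
pref[idx["tag:ranking"]] = 1.0                    # emphasize one tag's context
for n, s in sorted(zip(nodes, biased_pagerank(M, pref)), key=lambda t: -t[1]):
    print(f"{n:12s} {s:.3f}")
```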

  13. Transformational leadership in the local police in Spain: a leader-follower distance approach.

    PubMed

    Álvarez, Octavio; Lila, Marisol; Tomás, Inés; Castillo, Isabel

    2014-01-01

    Based on the transformational leadership theory (Bass, 1985), the aim of the present study was to analyze the differences in leadership styles according to the various leading ranks and the organizational follower-leader distance reported by a representative sample of 975 local police members (828 male and 147 female) from the Valencian Community (Spain). Results showed differences by rank (p < .01), and by rank distance (p < .05). The general intendents showed the most optimal profile of leadership in all the variables examined (transformational-leadership behaviors, transactional-leadership behaviors, laissez-faire behaviors, satisfaction with the leader, extra effort by follower, and perceived leadership effectiveness). By contrast, the least optimal profiles were presented by intendents. Finally, the maximum distance (five ranks) generally yielded the most optimal profiles, whereas the 3-rank distance generally produced the least optimal profiles for all variables examined. Outcomes and practical implications for workforce dimensioning are also discussed.

  14. Solving the interval type-2 fuzzy polynomial equation using the ranking method

    NASA Astrophysics Data System (ADS)

    Rahman, Nurhakimah Ab.; Abdullah, Lazim

    2014-07-01

    Polynomial equations with trapezoidal and triangular fuzzy numbers have attracted some interest among researchers in mathematics, engineering and social sciences. Several methods have been developed to solve these equations. In this study we introduce the interval type-2 fuzzy polynomial equation and solve it using the ranking method of fuzzy numbers. The ranking method was first proposed to find the real roots of fuzzy polynomial equations; here it is applied to find the real roots of the interval type-2 fuzzy polynomial equation. We transform the interval type-2 fuzzy polynomial equation into a system of crisp polynomial equations. This transformation is performed using the ranking method of fuzzy numbers based on three parameters, namely value, ambiguity and fuzziness. Finally, we illustrate our approach with a numerical example.
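
    For orientation only, one common convention for two of the three ranking parameters (the paper's exact definitions, and their interval type-2 extension, may differ): with r-cuts [L(r), U(r)] and the linear reducing function s(r) = r,

    \[
    \mathrm{Val}(\tilde{A}) = \int_0^1 r\,[L(r) + U(r)]\,dr,
    \qquad
    \mathrm{Amb}(\tilde{A}) = \int_0^1 r\,[U(r) - L(r)]\,dr,
    \]

    so that for a triangular fuzzy number \(\tilde{A} = (a, b, c)\), with \(L(r) = a + (b-a)r\) and \(U(r) = c - (c-b)r\),

    \[
    \mathrm{Val}(\tilde{A}) = \frac{a + 4b + c}{6},
    \qquad
    \mathrm{Amb}(\tilde{A}) = \frac{c - a}{6}.
    \]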

  15. Scalable Faceted Ranking in Tagging Systems

    NASA Astrophysics Data System (ADS)

    Orlicki, José I.; Alvarez-Hamelin, J. Ignacio; Fierens, Pablo I.

    Web collaborative tagging systems, which allow users to upload, comment on and recommend content, are growing. Such systems can be represented as graphs where nodes correspond to users and tagged links to recommendations. In this paper we analyze the problem of computing a ranking of users with respect to a facet described as a set of tags. A straightforward solution is to run a PageRank-like algorithm on a facet-related graph, but this is not feasible for online computation. We propose an alternative: (i) a ranking for each tag is computed offline on the basis of tag-related subgraphs; (ii) a faceted order is generated online by merging the rankings corresponding to all the tags in the facet. Based on the graph analysis of YouTube and Flickr, we show that step (i) is scalable. We also present efficient algorithms for step (ii), which are evaluated by comparing their results with two gold standards.
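
    The authors' merging algorithms and gold-standard evaluation are not reproduced here; the sketch below only illustrates the offline/online split the abstract describes, with invented per-tag user scores merged online for a facet by simple score averaging.

```python
# hypothetical offline output: per-tag user scores (e.g., produced by a
# PageRank-like pass over each tag-related subgraph); names/numbers invented
offline_rankings = {
    "python":  {"alice": 0.42, "bob": 0.31, "carol": 0.27},
    "ranking": {"alice": 0.18, "bob": 0.55, "dave": 0.27},
    "video":   {"carol": 0.60, "dave": 0.40},
}

def faceted_ranking(facet_tags, rankings):
    """Online step: merge the offline per-tag rankings by averaging scores."""
    totals, counts = {}, {}
    for tag in facet_tags:
        for user, score in rankings.get(tag, {}).items():
            totals[user] = totals.get(user, 0.0) + score
            counts[user] = counts.get(user, 0) + 1
    merged = {user: totals[user] / counts[user] for user in totals}
    return sorted(merged.items(), key=lambda kv: -kv[1])

print(faceted_ranking({"python", "ranking"}, offline_rankings))
```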

  16. Existence and stability, and discrete BB and rank conditions, for general mixed-hybrid finite elements in elasticity

    NASA Technical Reports Server (NTRS)

    Xue, W.-M.; Atluri, S. N.

    1985-01-01

    In this paper, all possible forms of mixed-hybrid finite element methods that are based on multi-field variational principles are examined as to the conditions for existence, stability, and uniqueness of their solutions. The reasons as to why certain 'simplified hybrid-mixed methods' in general, and the so-called 'simplified hybrid-displacement method' in particular (based on the so-called simplified variational principles), become unstable, are discussed. A comprehensive discussion of the 'discrete' BB-conditions, and the rank conditions, of the matrices arising in mixed-hybrid methods, is given. Some recent studies aimed at the assurance of such rank conditions, and the related problem of the avoidance of spurious kinematic modes, are presented.
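
    For reference, the discrete BB (inf-sup) condition in its usual two-field mixed form reads

    \[
    \inf_{0 \neq q_h \in Q_h}\;\sup_{0 \neq v_h \in V_h}\;
    \frac{b(v_h, q_h)}{\|v_h\|_{V}\,\|q_h\|_{Q}} \;\ge\; \beta > 0,
    \]

    with \(\beta\) independent of the mesh size \(h\); at the matrix level this requires the discrete coupling matrix to have full row rank, which is the kind of rank condition discussed above. The multi-field principles examined in the paper generalize this two-field statement.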

  17. Lateral Flow Rapid Test for Accurate and Early Diagnosis of Scrub Typhus: A Febrile Illness of Historically Military Importance in the Pacific Rim.

    PubMed

    Chao, Chien-Chung; Zhang, Zhiwen; Weissenberger, Giulia; Chen, Hua-Wei; Ching, Wei-Mei

    2017-03-01

    Scrub typhus (ST) is an infection caused by Orientia tsutsugamushi. Historically, during World War II and the Vietnam War, ST ranked as the second most important arthropod-borne medical problem, behind only malaria. The disease occurs mainly in Southeast Asia and has been shown to emerge and reemerge in new areas, implying increased risk for U.S. military and civilian personnel deployed to these regions. ST can be treated effectively with doxycycline provided the diagnosis is made early, before the development of severe complications. Scrub Typhus Detect is a lateral flow rapid test based on a mixture of recombinant 56-kDa antigens with broad reactivity. The performance of this prototype product was evaluated against the indirect immunofluorescence assay, the serological gold standard. Using 249 prospectively collected samples from Thailand, the sensitivity and specificity for IgM were found to be 100% and 92%, respectively, suggesting a high potential of this product for clinical use. This product will provide a user-friendly, rapid, and accurate diagnosis of ST, enabling clinicians to provide timely and accurate treatment of deployed personnel. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.

  18. Pharmacological profiles of acute myeloid leukemia treatments in patient samples by automated flow cytometry: a bridge to individualized medicine.

    PubMed

    Bennett, Teresa A; Montesinos, Pau; Moscardo, Federico; Martinez-Cuadron, David; Martinez, Joaquin; Sierra, Jorge; García, Raimundo; de Oteyza, Jaime Perez; Fernandez, Pascual; Serrano, Josefina; Fernandez, Angeles; Herrera, Pilar; Gonzalez, Ataulfo; Bethancourt, Concepcion; Rodriguez-Macias, Gabriela; Alonso, Arancha; Vera, Juan A; Navas, Begoña; Lavilla, Esperanza; Lopez, Juan A; Jimenez, Santiago; Simiele, Adriana; Vidriales, Belen; Gonzalez, Bernardo J; Burgaleta, Carmen; Hernandez Rivas, Jose A; Mascuñano, Raul Cordoba; Bautista, Guiomar; Perez Simon, Jose A; Fuente, Adolfo de la; Rayón, Consolación; Troconiz, Iñaki F; Janda, Alvaro; Bosanquet, Andrew G; Hernandez-Campo, Pilar; Primo, Daniel; Lopez, Rocio; Liebana, Belen; Rojas, Jose L; Gorrochategui, Julian; Sanz, Miguel A; Ballesteros, Joan

    2014-08-01

    We have evaluated the ex vivo pharmacology of single drugs and drug combinations in malignant cells of bone marrow samples from 125 patients with acute myeloid leukemia using a novel automated flow cytometry-based platform (ExviTech). We have improved previous ex vivo drug testing with 4 innovations: identifying individual leukemic cells, using intact whole blood during the incubation, using an automated platform that scales reliably, and performing analyses with pharmacodynamic population models. Samples were sent from 24 hospitals to a central laboratory and incubated for 48 hours in whole blood, after which drug activity was measured in terms of depletion of leukemic cells. The sensitivity of single drugs is assessed for standard efficacy (EMAX) and potency (EC50) variables, ranked as percentiles within the population. The sensitivity of drug-combination treatments is assessed for the synergism achieved in each patient sample. We found a large variability among patient samples in the dose-response curves to a single drug or combination treatment. We hypothesize that the use of individual patients' ex vivo pharmacological profiles may help guide personalized treatment selection. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
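
    For readers unfamiliar with the two summary variables, the standard sigmoidal Emax (Hill) dose-response model from which EMAX and EC50 are usually read is

    \[
    E(C) \;=\; E_0 \;+\; \frac{E_{\max}\, C^{\gamma}}{EC_{50}^{\gamma} + C^{\gamma}},
    \]

    where \(E_0\) is the baseline response, \(E_{\max}\) the maximal drug effect (efficacy), \(EC_{50}\) the concentration producing half of \(E_{\max}\) (potency), and \(\gamma\) the Hill slope; whether the platform's population models use exactly this parameterization is an assumption, not something stated in the abstract.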

  19. Jet Surface Interaction Scrubbing Noise from High Aspect-Ratio Rectangular Jets

    NASA Technical Reports Server (NTRS)

    Khavaran, Abbas; Bozak, Richard F.

    2015-01-01

    Concepts envisioned for the future of civil air transport consist of unconventional propulsion systems in close proximity to the airframe. Distributed propulsion systems with exhaust configurations that resemble a high aspect-ratio rectangular jet are among the geometries of interest. Nearby solid surfaces could provide noise shielding for the purpose of reduced community noise. Interaction of the high-speed jet exhaust with the structure could also generate new sources of sound as a result of flow scrubbing past the structure and/or noise scattered from sharp edges. The present study provides a theoretical framework to predict the scrubbing noise component from a high aspect-ratio rectangular exhaust in proximity to a solid surface. The analysis uses the Green's function (GF) of the variable-density Pridmore-Brown equation in a transversely sheared mean flow. Sources of sound are defined as the auto-covariance function of second-rank velocity fluctuations in the jet plume, and are modeled using a RANS-based acoustic analogy approach. Acoustic predictions are presented for an 8:1 aspect-ratio rectangular exhaust at three subsonic Mach numbers. The effect of the nearby surface on the scrubbing noise component is shown for both the reflected and shielded sides of the plate.

  20. A gender-based comparison of academic rank and scholarly productivity in academic neurological surgery.

    PubMed

    Tomei, Krystal L; Nahass, Meghan M; Husain, Qasim; Agarwal, Nitin; Patel, Smruti K; Svider, Peter F; Eloy, Jean Anderson; Liu, James K

    2014-07-01

    The number of women pursuing training opportunities in neurological surgery has increased, although they are still underrepresented in senior positions relative to junior academic ranks. Research productivity is an important component of the academic advancement process. We sought to use the h-index, a bibliometric previously analyzed among neurological surgeons, to evaluate whether there are gender differences in academic rank and research productivity among academic neurological surgeons. The h-index was calculated for 1052 academic neurological surgeons from 84 institutions, and organized by gender and academic rank. Overall, men had significantly higher research productivity (mean 13.3) than their female colleagues (mean 9.5), as measured by the h-index (p<0.0007). When separating by academic rank, there were no statistically significant differences (p>0.05) in h-index at the assistant professor (mean 7.2 male, 6.3 female), associate professor (11.2 male, 10.8 female), and professor (20.0 male, 18.0 female) levels based on gender. There were insufficient data to determine significance at the chairperson rank, as there was only one female chairperson. Although overall gender differences in scholarly productivity were detected, these differences did not reach statistical significance upon controlling for academic rank. Women were grossly underrepresented at the level of chairpersons in this sample of 1052 academic neurological surgeons, likely a result of the low proportion of females in this specialty. Future studies may be needed to investigate gender-specific research trends for neurosurgical residents, a cohort that in recent years has seen increased representation by women. Copyright © 2013 Elsevier Ltd. All rights reserved.
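
    For readers unfamiliar with the bibliometric, a minimal computation of the h-index from a list of per-paper citation counts (the numbers are illustrative):

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers have >= h citations."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))   # -> 3 (three papers with >= 3 citations)
```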

  1. Relationship between Small Animal Intern Rank and Performance at a University Teaching Hospital.

    PubMed

    Hofmeister, Erik H; Saba, Corey; Kent, Marc; Creevy, Kate E

    2015-01-01

    The purpose of this study was to determine if there is a relationship between selection committee rankings of internship applicants and the performance of small animal interns. The hypothesis was that there would be a relationship between selection committee rank order and intern performance; the more highly an application was ranked, the better the intern's performance scores would be. In 2007, the Department of Small Animal Medicine and Surgery instituted a standardized approach to its intern selection process both to streamline the process and to track its effectiveness. At the end of intern years 2010-2014, every faculty member in the department was provided an intern assessment form for that year's class. There was no relationship between an individual intern's final rank by the selection committee and his/her performance either as a percentile score or a Likert-type score (p=.25, R2=.04; p=.31, R2=.03, respectively). Likewise, when interns were divided into the top and bottom quartile based on their final rank by the selection committee, there was no relationship between their rank and their performance as a percentile score (median rank 15 vs. 20; p=.14) or Likert-type score (median rank 14 vs. 19; p=.27). Institutions that use a similar intern selection method may need to reconsider the time and effort being expended for an outcome that does not predict performance. Alternatively, specific criteria more predictive of performance outcomes should be identified and employed in the internship selection process.

  2. Wisconsin Versus Minnesota: A Border Battle for the Healthiest State.

    PubMed

    Pollock, Elizabeth; Norrbom, Corina; Ehlinger, Edward; Remington, Patrick

    2016-08-01

    Measuring and ranking the health of counties helps raise awareness of health disparities based on where people live. Recently, there has been increasing interest in comparing the health of counties across state lines, to potentially measure the impact of local and state-level policies. The counties in Minnesota (n = 87) and Wisconsin (n = 72) were combined into a single 2-state region, and all 159 counties were ranked according to the County Health Rankings methods, with summary ranks for health outcomes and health factors. Multivariable regression analysis was then used to examine the potential impact of state-based programs and policies on health outcomes. Minnesota was healthier overall than Wisconsin, with lower rates of premature death and better quality of life. Minnesota also performed better than Wisconsin for all 9 health behavior measures, 4 of 7 clinical care measures, 7 of 8 social and economic factors, and 3 of 5 physical environment measures. Furthermore, counties in Wisconsin were more likely to have lower (worse) ranks than counties in Minnesota for both health outcomes and health factors, as well as for the subcategories that make up these summary ranks. Regression analysis showed that Minnesota’s better health status was explained primarily by healthier behaviors and more desirable social and economic factors. Minnesota’s better health outcomes are largely explained by better social, economic, and behavioral factors. These findings suggest a need for examination of policies and strategies that may be influencing the observed differences across these 2 states.

  3. Unreported links between trial registrations and published articles were identified using document similarity measures in a cross-sectional analysis of ClinicalTrials.gov.

    PubMed

    Dunn, Adam G; Coiera, Enrico; Bourgeois, Florence T

    2018-03-01

    Trial registries can be used to measure reporting biases and support systematic reviews, but 45% of registrations do not provide a link to the article reporting on the trial. We evaluated the use of document similarity methods to identify unreported links between ClinicalTrials.gov and PubMed. We extracted terms and concepts from a data set of 72,469 ClinicalTrials.gov registrations and 276,307 PubMed articles and tested methods for ranking articles across 16,005 reported links and 90 manually identified unreported links. Performance was measured by the median rank of matching articles and the proportion of unreported links that could be found by screening ranked candidate articles in order. The best-performing concept-based representation produced a median rank of 3 (interquartile range [IQR] 1-21) for reported links and 3 (IQR 1-19) for the manually identified unreported links, and term-based representations produced a median rank of 2 (1-20) for reported links and 2 (IQR 1-12) in unreported links. The matching article was ranked first for 40% of registrations, and screening 50 candidate articles per registration identified 86% of the unreported links. Leveraging the growth in the corpus of reported links between ClinicalTrials.gov and PubMed, we found that document similarity methods can assist in the identification of unreported links between trial registrations and corresponding articles. Copyright © 2017 Elsevier Inc. All rights reserved.
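
    The study's actual term- and concept-based representations are not reproduced here; as a sketch of the ranking step only, the code below scores made-up candidate abstracts against a registration summary by cosine similarity of TF-IDF vectors (using scikit-learn) and sorts them.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# invented registration summary and candidate article abstracts
registration = "randomized trial of doxycycline for scrub typhus in adults"
articles = [
    "observational study of influenza vaccination in children",
    "doxycycline versus azithromycin for scrub typhus: a randomized trial",
    "cohort study of hypertension management in primary care",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform([registration] + articles)
sims = cosine_similarity(X[0], X[1:]).ravel()       # similarity to each candidate

ranked = sorted(enumerate(sims), key=lambda t: -t[1])
for rank, (i, s) in enumerate(ranked, start=1):
    print(f"rank {rank}: article {i} (similarity {s:.2f})")
```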

  4. Chromatographic and computational assessment of lipophilicity using sum of ranking differences and generalized pair-correlation.

    PubMed

    Andrić, Filip; Héberger, Károly

    2015-02-06

    Lipophilicity (logP) is one of the most studied and most frequently used fundamental physicochemical properties. At present there are several possibilities for its quantitative expression, and many of them stem from chromatographic experiments. Numerous attempts have been made to compare different computational methods, chromatographic methods versus computational approaches, and chromatographic methods against the direct shake-flask procedure, without definitive results or with findings that are not generally accepted. In the present work, numerous chromatographically derived lipophilicity measures, in combination with diverse computational methods, were ranked and clustered using novel variable discrimination and ranking approaches based on the sum of ranking differences and the generalized pair correlation method. Literature logP data measured under HILIC and classical reversed-phase conditions, covering different classes of compounds, were compared using the most frequently used multivariate data analysis techniques (principal component and hierarchical cluster analysis) as well as against the conclusions in the original sources. Chromatographic lipophilicity measures obtained under typical reversed-phase conditions outperform the majority of computationally estimated logPs. In contrast, in the case of HILIC none of the many proposed chromatographic indices outperforms any of the computationally assessed logPs; only two of them (logkmin and kmin) may be selected as recommended chromatographic lipophilicity measures. Both ranking approaches, the sum of ranking differences and the generalized pair correlation method, although based on different backgrounds, provide highly similar variable orderings and groupings, leading to the same conclusions. Copyright © 2015. Published by Elsevier B.V.
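
    The full SRD procedure includes a validation step against random rankings that is omitted here; the sketch below only shows the basic sum-of-ranking-differences computation, with the row-average used as the consensus reference (a common default) and invented numbers.

```python
import numpy as np

# rows = compounds, columns = candidate lipophilicity measures (invented values)
X = np.array([[1.2, 1.0, 1.4],
              [2.5, 2.7, 2.2],
              [0.3, 0.4, 0.6],
              [3.1, 2.9, 3.3]])

reference = X.mean(axis=1)            # row-average as the consensus reference

def ranks(v):
    """0-based ranks of a vector (no tie handling)."""
    return v.argsort().argsort()

srd = [int(np.abs(ranks(X[:, j]) - ranks(reference)).sum()) for j in range(X.shape[1])]
print(srd)                            # smaller SRD = closer to the consensus ranking
```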

  5. Estimation of Flux Between Interacting Nodes on Huge Inter-Firm Networks

    NASA Astrophysics Data System (ADS)

    Tamura, Koutarou; Miura, Wataru; Takayasu, Misako; Takayasu, Hideki; Kitajima, Satoshi; Goto, Hayato

    We analyze Japanese inter-firm network data showing scale-free properties as an example of a real complex network. The data contains information on money flow (annual transaction volume) between about 7000 pairs of firms. We focus on this money-flow data and investigate the correlation between various quantities such as sales or link numbers. We find that the flux from a buyer to a supplier is given by the product of the fractional powers of both sales with different exponents. This result indicates that the principle of detailed balance does not hold in the real transport of money; therefore, random walk type transport models such as PageRank are not suitable.
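
    In symbolic form (the fitted exponent values are not quoted in the abstract and are not reproduced here), the reported relationship is

    \[
    F_{ij} \;\propto\; S_i^{\,\alpha}\, S_j^{\,\beta}, \qquad \alpha \neq \beta,
    \]

    where \(F_{ij}\) is the annual money flow from buyer \(i\) to supplier \(j\) and \(S_i\), \(S_j\) are their sales; the asymmetry between the two exponents is the feature the authors connect to the failure of detailed balance.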

  6. An Automated Energy Detection Algorithm Based on Consecutive Mean Excision

    DTIC Science & Technology

    2018-01-01

    Only indexing fragments are available for this record. Subject terms: RF spectrum, detection threshold algorithm, consecutive mean excision, rank order filter, statistical… The table of contents lists sections on the median, the rank order filter (ROF), the crest factor (CF), a statistical summary, the algorithm, conclusions, and references. A cited related report describes an automated energy detection algorithm based on morphological filter processing with a semi-disk structure (Army Research Laboratory (US), Adelphi, MD, 2018 Jan).

  7. Comparison of the effects of pilocarpine and cevimeline on salivary flow.

    PubMed

    Braga, M A; Tarzia, O; Bergamaschi, C C; Santos, F A; Andrade, E D; Groppo, F C

    2009-05-01

    The aim of the present study was to compare the effect of low-dose pilocarpine and cevimeline as stimulants of salivary flow in healthy subjects. In this cross-over clinical trial with a 1-week washout period, 40 male volunteers received an oral dose of 1% pilocarpine (Salagen), 60 microg/kg body weight (Group 1), or cevimeline (Evoxac), 30 mg (Group 2). Saliva samples were collected and the salivary flow rate was measured (ml/min) at baseline and 20, 40, 60, 80, 140 and 200 min after administration of the drugs. In addition, salivary secretion was also measured under mechanical stimulation to assess salivary gland function. The data were analyzed by Friedman and Wilcoxon signed-rank tests (significance level = 5%). Pilocarpine and cevimeline significantly increased salivary flow 140 min after intake. Secretion was significantly higher with cevimeline 140 and 200 min after administration. No differences were seen among subjects in salivary gland function under mechanical stimulation. Both drugs were effective in increasing salivary flow in healthy volunteers, but cevimeline was more effective than pilocarpine.

  8. A Systems Study to Determine the Attractiveness of Solar System Bodies and Sites for Eventual Human Exploration

    NASA Technical Reports Server (NTRS)

    Andringa, Jason M.; Gray, Andrew A.

    2005-01-01

    A pre-phase A idea-generation team at the Jet Propulsion Laboratory (JPL) has conducted a study to rank all locations in the solar system based on their attractiveness for human exploration. The process used to perform the study was composed of the following primary steps: determination of criteria (including value, cost, and risk criteria) upon which to rate sites in the solar system; weighting of the criteria based upon their importance to eventual human exploration; selection of sites to consider and assignment of team members to the task of advocating the benefits of particular sites; rating of the sites in both the short and long term based on team member presentations and team discussions; and compilation of a score based on criteria weights and individual ratings. Finally, a comparison of the total scores of the different sites was completed to determine a ranking of all the bodies and sites in the solar system. Sensitivity analysis was also performed to determine how the weightings affect the rankings.
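
    The study's actual criteria, weights and site list are not reproduced here; the sketch below only illustrates the generic weighted-sum scoring and ranking step the abstract describes, with invented criteria, weights and ratings on a 1-10 "higher is better" scale (so a high cost or risk rating means low expected cost or risk).

```python
# invented criteria weights and per-site ratings; illustrative numbers only
weights = {"science_value": 0.4, "resource_value": 0.2, "cost": 0.25, "risk": 0.15}
ratings = {
    "Mars":                {"science_value": 9, "resource_value": 6, "cost": 3, "risk": 5},
    "Moon (south pole)":   {"science_value": 6, "resource_value": 7, "cost": 7, "risk": 7},
    "Near-Earth asteroid": {"science_value": 7, "resource_value": 8, "cost": 5, "risk": 6},
}

def total_score(site):
    """Weighted sum of the site's criterion ratings."""
    return sum(weights[c] * ratings[site][c] for c in weights)

for site in sorted(ratings, key=total_score, reverse=True):
    print(f"{site:22s} {total_score(site):.2f}")
```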

  9. EXAMINING SOCIOECONOMIC HEALTH DISPARITIES USING A RANK-DEPENDENT RÉNYI INDEX.

    PubMed

    Talih, Makram

    2015-06-01

    The Rényi index (RI) is a one-parameter class of indices that summarize health disparities among population groups by measuring divergence between the distributions of disease burden and population shares of these groups. The rank-dependent RI introduced in this paper is a two-parameter class of health disparity indices that also accounts for the association between socioeconomic rank and health; it may be derived from a rank-dependent social welfare function. Two competing classes are discussed and the rank-dependent RI is shown to be more robust to changes in the distribution of either socioeconomic rank or health. The standard error and sampling distribution of the rank-dependent RI are evaluated using linearization and re-sampling techniques, and the methodology is illustrated using health survey data from the U.S. National Health and Nutrition Examination Survey and registry data from the U.S. Surveillance, Epidemiology and End Results Program. Such data underlie many population-based objectives within the U.S. Healthy People 2020 initiative. The rank-dependent RI provides a unified mathematical framework for eliciting various societal positions with regards to the policies that are tied to such wide-reaching public health initiatives. For example, if population groups with lower socioeconomic position were ascertained to be more likely to utilize costly public programs, then the parameters of the RI could be selected to reflect prioritizing those population groups for intervention or treatment.
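
    Up to the normalization chosen in the paper, the divergence underlying such an index can be written as the Rényi divergence of order \(\alpha\) between the disease-burden shares \(\pi_j\) and the population shares \(p_j\) of the groups,

    \[
    D_\alpha(\pi \,\|\, p) \;=\; \frac{1}{\alpha - 1}\,
    \log \sum_{j} p_j \left(\frac{\pi_j}{p_j}\right)^{\alpha},
    \]

    where \(\alpha\) tunes how heavily the most divergent groups are weighted; the rank-dependent version described above additionally weights groups according to their socioeconomic rank.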

  11. SortNet: learning to rank by a neural preference function.

    PubMed

    Rigutini, Leonardo; Papini, Tiziano; Maggini, Marco; Scarselli, Franco

    2011-09-01

    Relevance ranking consists of sorting a set of objects with respect to a given criterion. However, in personalized retrieval systems, the relevance criteria may vary among users and may not be predefined. In this case, ranking algorithms that adapt their behavior from users' feedback must be devised. Two main approaches are proposed in the literature for learning to rank: the use of a scoring function, learned from examples, that evaluates a feature-based representation of each object and yields an absolute relevance score; and a pairwise approach, where a preference function is learned to determine which object in a given pair should be ranked first. In this paper, we present a preference learning method for learning to rank. A neural network, the comparative neural network (CmpNN), is trained from examples to approximate the comparison function for a pair of objects. The CmpNN adopts a particular architecture designed to implement the symmetries naturally present in a preference function. The learned preference function can be embedded as the comparator into a classical sorting algorithm to provide a global ranking of a set of objects. To improve the ranking performance, an active-learning procedure is devised that aims at selecting the most informative patterns in the training set. The proposed algorithm is evaluated on the LETOR dataset, showing promising performance in comparison with other state-of-the-art algorithms.
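
    The CmpNN architecture and its training are not reproduced here; the sketch below only shows the surrounding mechanics the abstract describes, namely plugging a learned pairwise preference function into a classical sort as a comparator. A trivial hand-written preference stands in for the trained network.

```python
from functools import cmp_to_key

def preference(a, b):
    """Stand-in for a trained preference function P(a, b): positive if a should
    be ranked before b, negative otherwise (a real CmpNN would be learned)."""
    return a["score"] - b["score"]              # toy criterion: higher score first

def comparator(a, b):
    p = preference(a, b)
    return -1 if p > 0 else (1 if p < 0 else 0)

objects = [{"id": "d1", "score": 0.4}, {"id": "d2", "score": 0.9},
           {"id": "d3", "score": 0.7}]

# the learned preference function is embedded into a standard sorting algorithm
ranking = sorted(objects, key=cmp_to_key(comparator))
print([o["id"] for o in ranking])               # -> ['d2', 'd3', 'd1']
```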

  12. The Gatekeepers of Business Education Research: An Institutional Analysis

    ERIC Educational Resources Information Center

    Urbancic, Frank R.

    2011-01-01

    The author ranked the academic standing of universities based on faculty representation on the editorial boards of business education journals. Previous studies that ranked institutions by editorial board representation focused on journals that primarily favor publication of basic and applied research contributions. As a result, prior research…

  13. University Mission and Identity for a Post Post-Public Era

    ERIC Educational Resources Information Center

    Marginson, Simon

    2007-01-01

    The paper reflects on the implications of two influential albeit contrary movements affecting research universities in Australia (and many other nations): global rankings, which normalize the comprehensive science-based research university; and the policy emphasis on diversification. It critiques the global rankings developed by the "Times…

  14. 38 CFR 36.4318 - Servicer tier ranking-temporary procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 38, Pensions, Bonuses, and Veterans' Relief. § 36.4318 Servicer tier ranking—temporary procedures. (a) The Secretary shall assign to each servicer a “Tier Ranking” based upon the servicer's performance in servicing guaranteed loans. There shall be four...

  15. Term Dependence: A Basis for Luhn and Zipf Models.

    ERIC Educational Resources Information Center

    Losee, Robert M.

    2001-01-01

    Discusses relationships between the frequency-based characteristics of neighboring terms in natural language and the rank or frequency of the terms. Topics include information theory measures, including expected mutual information measure (EMIM); entropy and rank; Luhn's model of term aboutness; Zipf's law; and implications for indexing and…
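
    For reference, the two formal ingredients named in the abstract, in their usual forms (the truncated record does not show which variants the paper uses): Zipf's law, \(f(r) \propto r^{-s}\), relating a term's frequency \(f\) to its frequency rank \(r\) (classically with \(s \approx 1\)); and the expected mutual information measure between two terms \(t_1\) and \(t_2\),

    \[
    \mathrm{EMIM}(t_1, t_2) \;=\; \sum_{a \in \{t_1, \bar{t}_1\}} \sum_{b \in \{t_2, \bar{t}_2\}}
    P(a, b)\,\log \frac{P(a, b)}{P(a)\,P(b)},
    \]

    where the sums run over the presence and absence of each term in a document.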

  16. Cross-Cultural Faculty Values.

    ERIC Educational Resources Information Center

    Keim, Marybelle C.

    1992-01-01

    Compares the terminal values of 24 visiting scholars from the People's Republic of China based at a midwestern community college with resident faculty values. The Chinese scholars ranked freedom, equality, and self-respect highest, whereas U.S. schools gave highest rankings to salvation, family security, and self-respect. Contrasts findings with a…

  17. Efficiently Ranking Hypotheses in Machine Learning

    NASA Technical Reports Server (NTRS)

    Chien, Steve

    1997-01-01

    This paper considers the problem of learning the ranking of a set of alternatives based upon incomplete information (e.g. a limited number of observations). At each decision cycle, the system can output a complete ordering on the hypotheses or decide to gather additional information (e.g. observation) at some cost.

  18. 12 CFR 1806.203 - Selection Process, actual award amounts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Community Financing Activities, ranked in the order set forth in the applicable NOFA. (3) Third Priority. If... amounts based on the process described in this section. (c) Priority of Awards. The Fund will rank Applicants in each category of Qualified Activity according to the priorities described in this paragraph (c...

  19. Combining stakeholder analysis and spatial multicriteria evaluation to select and rank inert landfill sites.

    PubMed

    Geneletti, Davide

    2010-02-01

    This paper presents a method based on the combination of stakeholder analysis and spatial multicriteria evaluation (SMCE) to first design possible sites for an inert landfill, and then rank them according to their suitability. The method was tested for the siting of an inert landfill in the Sarca's Plain, located in south-western Trentino, an alpine region in northern Italy. Firstly, stakeholder analysis was conducted to identify a set of criteria to be satisfied by new inert landfill sites. SMCE techniques were then applied to combine the criteria and obtain a suitability map of the study region. Subsequently, the most suitable sites were extracted by also taking into account thresholds based on size and shape. These sites were then compared and ranked according to their visibility, accessibility and dust pollution. All these criteria were assessed through GIS modelling. Sensitivity analyses were performed on the results to assess the stability of the ranking with respect to variations in the input (criterion scores and weights). The study concluded that the three top-ranking sites are located close to each other, in the northernmost sector of the study area. A more general finding was that the use of different criteria in the different stages of the analysis made it possible to better differentiate the suitability of the potential landfill sites.

  20. Patients' self-interested preferences: empirical evidence from a priority setting experiment.

    PubMed

    Alvarez, Begoña; Rodríguez-Míguez, Eva

    2011-04-01

    This paper explores whether patients act according to self-interest in priority setting experiments. The analysis is based on a ranking experiment, conducted in Galicia (Spain), to elicit preferences regarding the prioritization of patients on a waiting list for an elective surgical intervention (prostatectomy for benign prostatic hyperplasia). Participants were patients awaiting a similar intervention and members of the general population. All of them were asked to rank hypothetical patients on a waiting list. A rank-ordered logit was then applied to their responses in order to obtain a prioritization scoring system. Using these estimations, we first test for differences in preferences between patients and the general population. Second, we implement a procedure based on the similarity between respondents (true patients) and the hypothetical scenarios they evaluate (hypothetical patients) to analyze whether patients provide self-interested rankings. Our results show that patient preferences differ significantly from general population preferences. The findings also indicate that, when patients rank the hypothetical scenarios on the waiting list, they consider not only the explicit attributes but also the similarity of each scenario to their own state. In particular, they assign a higher priority to scenarios that more closely match their own state. We also find that such a preference structure increases their likelihood of reporting "irrational" answers. Copyright © 2011 Elsevier Ltd. All rights reserved.
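
    For reference, the rank-ordered (exploded) logit treats an observed ranking as a sequence of best choices from progressively smaller sets, with likelihood

    \[
    P(r_1 \succ r_2 \succ \cdots \succ r_J)
    \;=\; \prod_{j=1}^{J-1} \frac{\exp(x_{r_j}'\beta)}{\sum_{k=j}^{J} \exp(x_{r_k}'\beta)},
    \]

    where \(x_{r_j}\) collects the attributes of the hypothetical patient ranked in position \(j\); in the setting above these covariates would also include the respondent-scenario similarity terms.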
