Sample records for selected papers number

  1. The Changing Selectivity of American Colleges. NBER Working Paper No. 15446

    ERIC Educational Resources Information Center

    Hoxby, Caroline M.

    2009-01-01

    This paper shows that although the top ten percent of colleges are substantially more selective now than they were 5 decades ago, most colleges are not more selective. Moreover, at least 50 percent of colleges are substantially less selective now than they were then. This paper demonstrates that competition for space--the number of students who…

  2. Selecting a Good Conference Location Based on Participants' Interests

    ERIC Educational Resources Information Center

    Miah, Muhammed

    2011-01-01

    Selecting a good conference location within budget constraints to attract paper authors and participants is a very difficult job for the conference organizers. A conference location is also very important along with other issues such as ranking of the conference. Selecting a bad conference location may reduce the number of paper submissions and…

  3. Estimating research productivity and quality in assistive technology: a bibliometric analysis spanning four decades.

    PubMed

    Ryan, Cindy; Tewey, Betsy; Newman, Shelia; Turner, Tracy; Jaeger, Robert J

    2004-12-01

    The objective was to conduct a quantitative assessment of the number of papers contained in MEDLINE related to selected types of assistive technology (AT), to identify journals publishing significant numbers of papers related to AT, and to evaluate them with quantitative productivity and quality measures. Consecutive sample of all papers in MEDLINE identified by standard medical subject headings for selected types of AT from 1963-2003. Number of journals carrying AT papers, papers per journal (both total number and those specific to AT), journal impact factor, circulation, and number of AT citations per year over time for each area of AT. We present search terms, estimates of the numbers of AT citations in MEDLINE, the journals most likely to contain articles related to AT, journal impact factors, and journal circulations (when available). We also present the number of citations in various areas of AT over time from 1963-2003. Suggestions are presented for possible future modifications of the MEDLINE controlled vocabulary, based on terminology used in existing AT classification schemes, such as ISO 9999. Research papers in the areas of AT examined showed publication across a wide variety of journals. There are a number of journals publishing articles in AT that have impact factors above the median. Some areas of AT have shown an increase in publications per year over time, while others have shown a more constant level of productivity.

  4. Highlighting impact: Do editors' selections identify influential papers?

    NASA Astrophysics Data System (ADS)

    Antonoyiannakis, Manolis

    A recent trend in scientific publishing is that journal editors highlight each week a select set among the papers published (usually) in their respective journals. The highlighted papers are deemed of higher quality, importance, or interest than the 'average' paper and feature prominently in the publishers' websites. We perform a citation analysis of the highlighted papers for a number of journals from various publishers in physics. By comparing the performance of highlighted papers relative to (a) typical papers and (b) highly cited papers in their source journals and in other journals in the field, we explore whether, and to what extent, the selection process at the time of publication identifies papers that will turn out to be influential. We discuss the broader implications for research assessment.

  5. The use of grey literature in health sciences: a preliminary survey.

    PubMed Central

    Alberani, V; De Castro Pietrangeli, P; Mazza, A M

    1990-01-01

    The paper describes some initiatives in the field of grey literature (GL) and the activities, from 1985, of the Italian Library Association Study Group. The major categories of GL are defined; a survey that evaluates the use of GL by end users in the health sciences is described. References in selected periodicals and databases have been analyzed for the years 1987-1988 to determine the number of articles citing GL, the number of GL citations found in selected periodicals, the various types of GL found, and the number of technical reports cited and their country of origin and intergovernmental issuing organization. Selected databases were also searched to determine the presence of GL during those same years. The paper presents the first results obtained. PMID:2224298

  6. Selection of School Counsellors in New Zealand.

    ERIC Educational Resources Information Center

    Manthei, R. J.

    This paper presents the views of the New Zealand Counselling and Guidance Association regarding the need for changes in the system of selecting individuals for training as school counselors in New Zealand. A number of options are offered for improving the mechanics of selection, recommending selection criteria, and suggesting procedures for…

  7. A Selection of Papers from NWAVE [New Ways of Analyzing Variation] (25th, Las Vegas, Nevada, October 1996). University of Pennsylvania Working Papers in Linguistics, Volume 4, Number 1.

    ERIC Educational Resources Information Center

    Boberg, Charles, Ed.; Meyerhoff, Miriam, Ed.; Strassel, Stephanie, Ed.

    This issue includes the following articles: "Towards a Sociolinguistics of Style" (Alan Bell, Gary Johnson); "Engendering Identities: Pronoun Selection as an Indicator of Salient Intergroup Identities" (Miriam Meyerhoff); "A Majority Sound Change in a Minority Community" (Carmen Fought); "Addressing the Actuation…

  8. Hypervelocity impact

    NASA Astrophysics Data System (ADS)

    Fair, Harry D.; Kiehne, Thomas M.; Anderson, Charles E., Jr.

    1993-10-01

    The 1992 Hypervelocity Impact Symposium was held in Austin, Texas on November 17-20, 1992. The proceedings are published in three volumes. Seventy-six papers were unclassified, and published together as Volume 14, Numbers 1-4 of the International Journal of Impact Engineering, which can be obtained from Pergamon Press, Maxwell House, Fairview Park, Elmsford, NY 10523. Nine papers were selected for presentation and publication in the Classified Proceedings, and eight papers were selected for presentation and publication in the NOFORN Proceedings.

  9. [Why evidence-based medicine? 20 years of meta-analysis].

    PubMed

    Ceballos, C; Valdizán, J R; Artal, A; Almárcegui, C; Allepuz, C; García Campayo, J; Fernández Liesa, R; Giraldo, P; Puértolas, T

    2000-10-01

    Meta-analysis, described within evidence-based medicine, has become a frequent issue in recent medical literature. An exhaustive search of reported meta-analyses from any medical specialty is described: papers included in Medline or Embase between 1973-1998. A study of intra- and inter-reviewer reliability in selection and classification was performed. A descriptive analysis of the reported papers (frequency tables and graphics) is described, including differences in the mean number of reported meta-analysis papers by medical specialty and year. 1,518 papers were selected and classified. The most frequent areas (45.91% overall) were methodology (15.7%), psychiatry (11.79%), cardiology (10.01%) and oncology (8.36%). Inter-reviewer agreement was 0.93 in selecting papers and 0.72 in classifying them. Between 1977-1987 the overall mean of reported meta-analysis studies (1.67 ± 4.10) was significantly lower than in 1988-1998 (49.54 ± 56.55) (p < 0.001). The global number of meta-analyses was positively correlated (p < 0.05) with the number of studies on fundamentals and methodology during the study period. The method used to identify meta-analysis reports can be considered adequate; however, agreement in classifying them by medical specialty was lower. A progressive increase in the number of reported meta-analyses since 1977 can be demonstrated. The specialties with the greatest number of meta-analyses published in the literature were psychiatry, oncology and cardiology. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.

  10. Health Information Technology Coordination to Support Patient-centered Care Coordination.

    PubMed

    Steichen, O; Gregg, W

    2015-08-13

    To select papers published in 2014, illustrating how information technology can contribute to and improve patient-centered care coordination. The two section editors performed a literature review from Medline and Web of Science to select a list of candidate best papers on the use of information technology for patient-centered care coordination. These papers were peer-reviewed by external reviewers and three of them were selected as "best papers". The first selected paper reports a qualitative study exploring the gap between current practices of care coordination in various settings and idealized longitudinal care plans. The second selected paper illustrates several unintended consequences of HIT designed to improve care coordination. The third selected paper shows that advanced analytic techniques in medical informatics can be instrumental in studying patient-centered care coordination. The realization of true patient-centered care coordination is dependent upon a number of factors. Standardization of clinical documentation and HIT interoperability across organizations and settings is a critical prerequisite for HIT to support patient-centered care coordination. Enabling patient involvement is an efficient means of goal setting and health information sharing. Additionally, unintended consequences of HIT tools (both positive and negative) must be measured and taken into account for quality improvement.

  11. Optimal PMU placement using topology transformation method in power systems.

    PubMed

    Rahman, Nadia H A; Zobaa, Ahmed F

    2016-09-01

    Optimal phasor measurement unit (PMU) placement involves minimizing the number of PMUs needed while ensuring that the entire power system remains completely observable. A power system is identified as observable when the voltages of all buses in the power system are known. This paper proposes selection rules for a topology transformation method that involves merging a zero-injection bus with one of its neighbors. The outcome of the merging process is influenced by which bus is chosen to merge with the zero-injection bus. The proposed method determines the best candidate bus to merge with the zero-injection bus according to three rules, in order to determine the minimum number of PMUs required for full observability of the power system. In addition, this paper also considers the case of power flow measurements. The problem is formulated as integer linear programming (ILP). The simulation for the proposed method is tested using MATLAB for different IEEE bus systems. The proposed method is explained using the IEEE 14-bus system. The results obtained in this paper prove the effectiveness of the proposed method, since the number of PMUs obtained is comparable with other available techniques.
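
    For orientation, a minimal sketch of the standard ILP formulation used throughout the PMU-placement literature is given below; the exact constraints of the topology-transformation variant above may differ, so treat this as background rather than the paper's model.

    \min_{x \in \{0,1\}^n} \; \sum_{i=1}^{n} x_i
    \qquad \text{subject to} \qquad \sum_{j=1}^{n} a_{ij}\, x_j \;\ge\; 1, \quad i = 1, \dots, n,

    where a_{ij} = 1 if bus j is bus i itself or is directly connected to bus i (and 0 otherwise), and x_j = 1 means a PMU is placed at bus j; each constraint states that bus i is observed by at least one PMU.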

  12. Closing the loops in biomedical informatics from theory to daily practice.

    PubMed

    Gaudinat, A

    2009-01-01

    This article presents the 2009 selection of the best papers in the special section dedicated to biomedical informatics and cybernetics. Synopsis of the articles selected for the IMIA Yearbook 2009: five papers from international peer-reviewed journals were selected for this section. Most of the papers have a strong practical orientation in clinical care, and this selection gives a good overview of what has been done with the "closed-loop" approach, particularly during the year 2008. While quite mature for some clinical applications such as mechanical ventilation, closing the loop remains a challenge where the rules for the decision system are difficult to identify because of the number of variables. More complex systems with stronger artificial intelligence approaches will certainly be the next trend for closed-loop applications.

  13. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review

    PubMed Central

    Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat

    2017-01-01

    Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their roots. Failure Mode and Effects Analysis (FMEA) is a prospective risk analysis technique. This study is a review of risk analyses using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers which did not meet the inclusion criteria, 22 papers were finally selected and their texts were thoroughly examined. At the end, the results were obtained. Results: The examined papers had focused mostly on processes, had been conducted mostly in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to express almost all the steps of model implementation; after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique's effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The study revealed that a small number of studies failed to show the effects of the FMEA technique. In general, however, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688

  14. Analysis of the relationship between the number of citations and the quality evaluated by experts in psychology journals.

    PubMed

    Buela-Casal, Gualberto; Zych, Izabela

    2010-05-01

    The study analyzes the relationship between the number of citations as calculated by the IN-RECS database and the quality evaluated by experts. The articles published in journals of the Spanish Psychological Association between 1996 and 2008 and selected by the Editorial Board of Psychology in Spain were the subject of the study. Psychology in Spain is a journal that includes the best papers published throughout the previous year, chosen by an Editorial Board made up of fifty specialists of acknowledged prestige within Spanish psychology, and translated into English. The number of citations of the 140 original articles republished in Psychology in Spain was compared to the number of citations of 140 randomly selected articles. Additionally, the study searched for a relationship between the number of articles selected from each journal and their mean number of citations. The number of citations received by the best articles as evaluated by experts is significantly higher than the number of citations of the randomly selected articles. Also, the number of citations is higher in articles from the most frequently selected journals. A statistically significant relation between the quality evaluated by experts and the number of citations was found.

  15. HALT Selected Papers, 1993 with Language Teaching Ideas from Paradise.

    ERIC Educational Resources Information Center

    Chandler, Paul, Ed.; Hodnett, Edda, Ed.

    In section I, papers presented at the Hawaii Association of Language Teachers (HALT) in 1993 are presented. Section II includes a number of projects received from a call for papers simultaneous to the call for the HALT papers. Section 1 contains: "This is Like a Foreign Language to Me: Keynote Address" (Bill VanPatten); "From Discussion Questions…

  16. Selecting the most appropriate inferential statistical test for your quantitative research study.

    PubMed

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
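
    As a toy illustration of the kind of flow-chart logic the paper describes (not the authors' actual charts), the sketch below maps a few common study designs to candidate tests; the rules are deliberately simplified and assume the usual textbook choices.

    def suggest_test(outcome, groups, paired=False, normal=True):
        """Very simplified mapping from study design to a candidate inferential test."""
        if outcome == "continuous":
            if groups == 2:
                if paired:
                    return "paired t-test" if normal else "Wilcoxon signed-rank test"
                return "independent t-test" if normal else "Mann-Whitney U test"
            if groups > 2:
                return "one-way ANOVA" if normal else "Kruskal-Wallis test"
        if outcome == "categorical":
            return "McNemar's test" if paired else "chi-square test"
        return "consult a statistician"

    # example: two independent groups, non-normal continuous outcome
    print(suggest_test("continuous", 2, paired=False, normal=False))  # Mann-Whitney U test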

  17. The Effects of Television on Children. Foundation for Child and Youth Studies Selected Papers Number 47.

    ERIC Educational Resources Information Center

    Tregoning, Julia

    This paper begins with an introduction which covers viewing time; television as a significant developmental experience; Piagetian stages of cognitive development; and changing developmental tastes in television viewing. The paper then focuses on television in relation to learning; television and violence; advertising; and parents' ability to…

  18. ATLAS Series of Shuttle Missions. Volume 23

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This technical paper contains selected papers from Geophysical Research Letters (Volume 23, Number 17) on the ATLAS series of shuttle missions. The ATLAS space shuttle missions were conducted in March 1992, April 1993, and November 1994. This paper discusses solar irradiance, middle atmospheric temperature, and trace gas concentration measurements made by the ATLAS payload and companion instruments.

  19. Linear reduction method for predictive and informative tag SNP selection.

    PubMed

    He, Jingwu; Westbrooks, Kelly; Zelikovsky, Alexander

    2005-01-01

    Constructing a complete human haplotype map is helpful when associating complex diseases with their related SNPs. Unfortunately, the number of SNPs is very large and it is costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a small number of informative representatives called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs predicted from selected linearly independent tag SNPs. Our experiments show that for sufficiently long haplotypes, knowing only 0.4% of all SNPs, the proposed linear reduction method predicts an unknown haplotype with an error rate below 2% based on 10% of the population.
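
    To make the linear-algebra idea concrete, the sketch below picks (approximately) linearly independent tag-SNP columns via QR with column pivoting and reconstructs the remaining SNPs with a least-squares linear map; the function names, the rounding step and the reconstruction rule are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np
    from scipy.linalg import qr, lstsq

    def select_tag_snps(H, num_tags):
        """H: haplotype matrix (individuals x SNPs) with 0/1 entries.
        Return the indices of num_tags (approximately) linearly independent columns."""
        # QR with column pivoting orders columns by how much new information each adds
        _, _, piv = qr(H, pivoting=True)
        return np.sort(piv[:num_tags])

    def predict_remaining(H_train, tags, H_new_tags):
        """Reconstruct all SNPs of new haplotypes from their tag SNPs using a
        least-squares linear map fitted on the training haplotypes."""
        W, *_ = lstsq(H_train[:, tags], H_train)      # map: tag SNPs -> all SNPs
        return np.rint(H_new_tags @ W).clip(0, 1)     # round back to 0/1 alleles

    # toy usage on random data
    rng = np.random.default_rng(0)
    H = rng.integers(0, 2, size=(60, 200))
    tags = select_tag_snps(H, num_tags=20)
    H_hat = predict_remaining(H, tags, H[:, tags])
    print("reconstruction error rate:", np.mean(H_hat != H))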

  20. Advanced aerodynamics. Selected NASA research

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This Conference Publication contains selected NASA papers that were presented at the Fifth Annual Status Review of the NASA Aircraft Energy Efficiency (ACEE) Energy Efficient Transport (EET) Program held at Dryden Flight Research Center in Edwards, California on September 14 to 15, 1981. These papers describe the status of several NASA in-house research activities in the areas of advanced turboprops, natural laminar flow, oscillating control surfaces, high-Reynolds-number airfoil tests, high-lift technology, and theoretical design techniques.

  1. Integrating Multiple Criteria in Selection Procedures for Improving Student Quality and Reducing Cost Per Graduate. AIR Forum 1979 Paper.

    ERIC Educational Resources Information Center

    Jones, Gerald L.; Westen, Risdon J.

    The multivariate approach of canonical correlation was used to assess selection procedures of the Air Force Academy. It was felt that improved student selection methods might reduce the number of dropouts while maintaining or improving the quality of graduates. The method of canonical correlation was designed to maximize prediction of academic…

  2. Feature selection method based on multi-fractal dimension and harmony search algorithm and its application

    NASA Astrophysics Data System (ADS)

    Zhang, Chen; Ni, Zhiwei; Ni, Liping; Tang, Na

    2016-10-01

    Feature selection is an important method of data preprocessing in data mining. In this paper, a novel feature selection method based on multi-fractal dimension and harmony search algorithm is proposed. Multi-fractal dimension is adopted as the evaluation criterion of feature subset, which can determine the number of selected features. An improved harmony search algorithm is used as the search strategy to improve the efficiency of feature selection. The performance of the proposed method is compared with that of other feature selection algorithms on UCI data-sets. Besides, the proposed method is also used to predict the daily average concentration of PM2.5 in China. Experimental results show that the proposed method can obtain competitive results in terms of both prediction accuracy and the number of selected features.

  3. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  4. Characterization of active paper packaging incorporated with ginger pulp oleoresin

    NASA Astrophysics Data System (ADS)

    Wiastuti, T.; Khasanah, L. U.; Atmaka Kawiji, W.; Manuhara, G. J.; Utami, R.

    2016-02-01

    Ginger pulp waste from the herbal medicine and instant drink industry in Indonesia is currently used for fertilizer and fuel, even though the pulp still contains a high level of oleoresin. Active paper packaging was developed incorporating ginger pulp oleoresin (0%, 2%, 4%, and 6% w/w). Physical characteristics (thickness, tensile strength, folding endurance, moisture content), sensory characteristics and antimicrobial activity of the active paper were evaluated. The selected active paper was then characterized chemically (functional groups). Increasing the level of ginger pulp oleoresin reduced the tensile strength, folding endurance and sensory characteristics (color, texture and overall acceptance) and increased the antimicrobial activity. Based on its physical and sensory characteristics and antimicrobial activity, the active paper with 2% ginger pulp oleoresin was selected. The selected paper had a water content of 9.93%, a thickness of 0.81 mm, a tensile strength of 0.54 N/mm, a folding endurance of 0.30, inhibition zones of 8.43 mm against Pseudomonas fluorescens and 27.86 mm against Aspergillus niger (antimicrobial activity), and a neutral preference response for sensory properties. Chemically, the selected paper showed the OH functional group of ginger at a wave number of 3422.83 cm-1, indicating that it contains red ginger active compounds.

  5. Statistical auditing and randomness test of lotto k/N-type games

    NASA Astrophysics Data System (ADS)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
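
    A minimal sketch of the kind of statistical audit described: compare how often each of the N numbers appears in a sample of D historical draws with the expected count kD/N via a chi-square goodness-of-fit test (scipy assumed). Numbers within a draw are not independent, so this is only an approximate check; the covariance-based tests mentioned in the abstract are not reproduced here.

    import numpy as np
    from scipy.stats import chisquare

    def audit_lotto(draws, N):
        """draws: list of draws, each a sequence of k distinct numbers in 1..N.
        Returns the chi-square statistic and p-value for uniform draw frequencies."""
        draws = np.asarray(draws)
        D, k = draws.shape
        counts = np.bincount(draws.ravel(), minlength=N + 1)[1:]  # occurrences of 1..N
        expected = np.full(N, D * k / N)                          # each number expected k*D/N times
        return chisquare(counts, expected)

    # toy usage: simulate a fair 6/49 lotto history and audit it
    rng = np.random.default_rng(1)
    history = [rng.choice(49, size=6, replace=False) + 1 for _ in range(1000)]
    print(audit_lotto(history, N=49))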

  6. [School Organization: Theory and Practice; Selected Readings on Grading, Nongrading, Multigrading, Self-Contained Classrooms, Departmentalization, Team Heterogeneous Grouping. Selected Bibliographies.] Rand McNally Education Series.

    ERIC Educational Resources Information Center

    Franklin, Marian Pope, Comp.

    Over 400 journal articles, case studies, research reports, dissertations, and position papers are briefly described in a series of eight selected bibliographies related to school organization. The eight specific areas treated in the volume and the number of items listed for each include: nongraded elementary school organization, 96; nongraded…

  7. Mother Daughter Relationships: From Infancy to Adulthood. Unit for Child Studies Selected Papers Number 15.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    Topics related to characteristics of mother/daughter relationships in contemporary patriarchal societies are discussed in this seminar paper. The first section describes cases intended to illustrate ways patriarchal social structures limit contemporary mother/daughter relationships, provides a brief historical contrast, and suggests possible…

  8. Hybrid Binary Imperialist Competition Algorithm and Tabu Search Approach for Feature Selection Using Gene Expression Data.

    PubMed

    Wang, Shuaiqun; Aorigele; Kong, Wei; Zeng, Weiming; Hong, Xiaomin

    2016-01-01

    Gene expression data composed of thousands of genes play an important role in classification platforms and disease diagnosis. Hence, it is vital to select a small subset of salient features over a large number of gene expression data. Lately, many researchers devote themselves to feature selection using diverse computational intelligence methods. However, in the process of selecting informative genes, many computational methods face difficulties in selecting small subsets for cancer classification due to the huge number of genes (high dimension) compared to the small number of samples, noisy genes, and irrelevant genes. In this paper, we propose a new hybrid algorithm HICATS incorporating imperialist competition algorithm (ICA), which performs global search, and tabu search (TS), which conducts fine-tuned search. In order to verify the performance of the proposed algorithm HICATS, we have tested it on 10 well-known benchmark gene expression classification datasets with dimensions varying from 2308 to 12600. The performance of our proposed method proved to be superior to other related works including the conventional version of binary optimization algorithm in terms of classification accuracy and the number of selected genes.

  9. Hybrid Binary Imperialist Competition Algorithm and Tabu Search Approach for Feature Selection Using Gene Expression Data

    PubMed Central

    Aorigele; Zeng, Weiming; Hong, Xiaomin

    2016-01-01

    Gene expression data composed of thousands of genes play an important role in classification platforms and disease diagnosis. Hence, it is vital to select a small subset of salient features over a large number of gene expression data. Lately, many researchers devote themselves to feature selection using diverse computational intelligence methods. However, in the process of selecting informative genes, many computational methods face difficulties in selecting small subsets for cancer classification due to the huge number of genes (high dimension) compared to the small number of samples, noisy genes, and irrelevant genes. In this paper, we propose a new hybrid algorithm HICATS incorporating imperialist competition algorithm (ICA), which performs global search, and tabu search (TS), which conducts fine-tuned search. In order to verify the performance of the proposed algorithm HICATS, we have tested it on 10 well-known benchmark gene expression classification datasets with dimensions varying from 2308 to 12600. The performance of our proposed method proved to be superior to other related works including the conventional version of binary optimization algorithm in terms of classification accuracy and the number of selected genes. PMID:27579323

  10. Technology Requirements and Selection for Securely Partitioning OBSW

    NASA Astrophysics Data System (ADS)

    Mendham, Peter; Windsor, James; Eckstein, Knut

    2010-08-01

    The Securely Partitioning Spacecraft Computing Resources project is a current ESA TRP activity investigating the application of secure time and space partitioning (TSP) technologies to enable multi-use missions from a single platform. Secure TSP technologies are used in a number of application areas outside the space domain and an opportunity exists to 'spin-in' a suitable solution. The selection of a technology for use within the European space industry relies on an understanding of the requirements for the application of secure TSP, of which this paper presents a summary. Further, the paper outlines the selection process taken by the project and highlights promising solutions for use today.

  11. Health and Wages: Panel Data Estimates Considering Selection and Endogeneity

    ERIC Educational Resources Information Center

    Jackle, Robert; Himmler, Oliver

    2010-01-01

    This paper complements previous studies on the effects of health on wages by addressing the problems of unobserved heterogeneity, sample selection, and endogeneity in one comprehensive framework. Using data from the German Socio-Economic Panel (GSOEP), we find the health variable to suffer from measurement error and a number of tests provide…

  12. Qualitative Indices for Selected Educational Innovations. Teacher Education Forum; Volume 3, Number 8.

    ERIC Educational Resources Information Center

    Mahan, James M.

    Presented in this paper is a selection of qualitative indices for four educational innovations (cultural pluralism, individualized instruction, open classroom, and team teaching) prepared by participants of a continuing action lab of the Association for Supervision and Curriculum Development. Participants estimated that over 65 percent of the…

  13. Two new discipline-independent indices to quantify individual's scientific research output

    NASA Astrophysics Data System (ADS)

    Valentinuzzi, M. E.; Laciar, E.; Atrio, J. L.

    2007-11-01

    Interest in quantitative measurement of scientific output has been steadily growing because of increasing needs in the evaluation of candidates for new positions and promotions in academic careers. Recently, a new index H was proposed; it is based on a hyperbolic relationship between the number of citations and the number of papers of a given investigator, which intersects with the equality straight line. The crossing point gives the number of papers that received at least H citations in a predetermined period of time. Such an index neglects the contribution of the less cited papers and depends strongly on the discipline. Herein, using Hirsch's crossing-point idea, we propose two new normalized indices, selectivity S and amplitude A, which are independent of the discipline and take into account the whole spectrum of published and cited papers. The proposed method was applied to 100 scientists using information obtained from SCOPUS. A potential function appeared to be the best fit to the data. Correlation coefficients were always high (r = 0.79 ± 0.11). Most of the authors displayed a marked selectivity, because a typical researcher concentrates on only a single subject or perhaps a few, while a wide reach did not predominate. In conclusion, these parameters are proposed as a way to complement the scientific evaluation process of a candidate.
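
    To make the crossing-point idea referenced above concrete, here is a minimal sketch of computing Hirsch's H from a citation list; the S and A indices themselves are not reproduced, since their exact definitions are not given in the abstract.

    def h_index(citations):
        """Hirsch's H: the largest h such that h papers have at least h citations each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:
                h = rank        # this paper still lies above the equality line
            else:
                break           # crossing point reached
        return h

    print(h_index([10, 8, 5, 4, 3, 0]))  # -> 4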

  14. Evaluation of new collision-pair selection models in DSMC

    NASA Astrophysics Data System (ADS)

    Akhlaghi, Hassan; Roohi, Ehsan

    2017-10-01

    The current paper investigates new collision-pair selection procedures in a direct simulation Monte Carlo (DSMC) method. Collision partner selection based on the random procedure from nearest neighbor particles and deterministic selection of nearest neighbor particles have already been introduced as schemes that provide accurate results in a wide range of problems. In the current research, new collision-pair selections based on the time spacing and direction of the relative movement of particles are introduced and evaluated. Comparisons between the new and existing algorithms are made considering appropriate test cases including fluctuations in homogeneous gas, 2D equilibrium flow, and Fourier flow problem. Distribution functions for number of particles and collisions in cell, velocity components, and collisional parameters (collision separation, time spacing, relative velocity, and the angle between relative movements of particles) are investigated and compared with existing analytical relations for each model. The capability of each model in the prediction of the heat flux in the Fourier problem at different cell numbers, numbers of particles, and time steps is examined. For new and existing collision-pair selection schemes, the effect of an alternative formula for the number of collision-pair selections and avoiding repetitive collisions are investigated via the prediction of the Fourier heat flux. The simulation results demonstrate the advantages and weaknesses of each model in different test cases.

  15. Racial Intolerance: A Child's Perspective. Foundation for Child and Youth Studies Selected Papers Number 48.

    ERIC Educational Resources Information Center

    Black, Hedda Dasia

    After a brief introduction to prejudice and stereotyping, this paper explores the feelings of stigmatized children; identifies components of stereotypes and prejudice; and discusses strategies for combating prejudice. The exploration of stigmatized children's feelings focuses on the time at which children become aware of their stigma and their…

  16. Selected Bibliography on Lead Poisoning in Children.

    ERIC Educational Resources Information Center

    Lin-Fu, Jane S., Comp.

    This comprehensive bibliography was prepared in response to the growing interest in the problem of childhood lead poisoning. Most of the papers noted are from the pediatric literature and include only those published in English. A limited number of papers on experiments in laboratory animals are cited. Documents are grouped under several general…

  17. Improving hot region prediction by parameter optimization of density clustering in PPI.

    PubMed

    Hu, Jing; Zhang, Xiaolong

    2016-11-01

    This paper proposes an optimized algorithm which combines density clustering with parameter selection and feature-based classification for hot region prediction. First, all the residues are classified by SVM to remove non-hot-spot residues; then density clustering with parameter selection is used to find hot regions. For the density clustering, this paper studies how to select the input parameters. There are two parameters, radius and density, in density-based incremental clustering. We first fix the density and enumerate the radius to find a pair of parameters which leads to the maximum number of clusters, and then we fix the radius and enumerate the density to find another pair of parameters which leads to the maximum number of clusters. Experimental results show that the proposed method using both pairs of parameters provides better prediction performance than the other method; comparing the two predictive results, fixing the radius and enumerating the density gives slightly higher prediction accuracy than fixing the density and enumerating the radius. Copyright © 2016. Published by Elsevier Inc.
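
    A minimal sketch of the parameter-enumeration idea, using scikit-learn's DBSCAN as a stand-in for the density-based incremental clustering in the paper: fix the density parameter (min_samples) and enumerate the radius (eps), keeping the value that maximizes the number of clusters. This illustrates the search strategy only, not the authors' algorithm.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def best_radius(X, density, radii):
        """Fix the density parameter (min_samples) and enumerate the radius (eps),
        returning the radius that yields the maximum number of clusters."""
        best_eps, best_n = None, -1
        for eps in radii:
            labels = DBSCAN(eps=eps, min_samples=density).fit_predict(X)
            n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # ignore noise label
            if n_clusters > best_n:
                best_eps, best_n = eps, n_clusters
        return best_eps, best_n

    # toy usage on random residue coordinates
    rng = np.random.default_rng(2)
    X = rng.random((100, 3))
    print(best_radius(X, density=4, radii=np.linspace(0.05, 0.5, 10)))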

  18. Paper-based microfluidic devices by asymmetric calendaring

    PubMed Central

    Oyola-Reynoso, S.; Frankiewicz, C.; Chang, B.; Chen, J.; Bloch, J.-F.

    2017-01-01

    We report a simple, efficient, one-step, affordable method to produce open-channel paper-based microfluidic channels. One surface of a sheet of paper is selectively calendared, with concomitant hydrophobization, to create the microfluidic channel. Our method involves asymmetric mechanical modification of a paper surface using a rolling ball (ball-point pen) under a controlled amount of applied stress (σz) to ascertain that only one side is modified. A lubricating solvent (hexane) aids in the selective deformation. The lubricant also serves as a carrier for a perfluoroalkyl trichlorosilane allowing the channel to be made hydrophobic as it is formed. For brevity and clarity, we abbreviated this method as TACH (Targeted Asymmetric Calendaring and Hydrophobization). We demonstrate that TACH can be used to reliably produce channels of variable widths (size of the ball) and depths (number of passes), without affecting the nonworking surface of the paper. Using tomography, we demonstrate that these channels can vary from 10s to 100s of microns in diameter. The created hydrophobic barrier extends around the channel through wicking to ensure no leakages. We demonstrate, through modeling and fabrication, that flow properties of the resulting channels are analogous to conventional devices and are tunable based on associated dimensionless numbers. PMID:28798839

  19. Peer-selected "best papers"-are they really that "good"?

    PubMed

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random, when considering future citations received by the papers. Considering 12 conferences (for several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases whereby the best paper has more citations. We also compare this proportion for years before 2010 and after to evaluate if there is a propaganda effect. Finally, we count the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. The probability that a best paper will receive more citations than a non best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference.
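
    A minimal sketch of the pairwise comparison reported above: estimate the probability that a randomly chosen best paper out-cites a randomly chosen non-best paper, with a bootstrap confidence interval. Counting ties as one half and the bootstrap details are illustrative assumptions, not the authors' exact procedure.

    import numpy as np

    def prob_best_beats_nonbest(best, nonbest):
        """Probability that a best paper has more citations than a non-best paper (ties count 1/2)."""
        best, nonbest = np.asarray(best), np.asarray(nonbest)
        wins = (best[:, None] > nonbest[None, :]).mean()
        ties = (best[:, None] == nonbest[None, :]).mean()
        return wins + 0.5 * ties

    def bootstrap_ci(best, nonbest, n_boot=2000, seed=0):
        """95% bootstrap confidence interval for the probability above."""
        rng = np.random.default_rng(seed)
        stats = [prob_best_beats_nonbest(rng.choice(best, len(best)),
                                         rng.choice(nonbest, len(nonbest)))
                 for _ in range(n_boot)]
        return np.percentile(stats, [2.5, 97.5])

    # toy usage with made-up citation counts
    best = [30, 12, 45, 8, 22]
    nonbest = [5, 9, 14, 3, 7, 11, 2, 20]
    print(prob_best_beats_nonbest(best, nonbest), bootstrap_ci(best, nonbest))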

  20. Changing and Changed Stance toward Norm Selection in Philippine Universities: Its Pedagogical Implications

    ERIC Educational Resources Information Center

    Bernardo, Alejandro S.

    2014-01-01

    This paper reports the results of a survey which involved College English teachers from three leading universities in the Philippines. The results point to one conclusion--College English teachers now have a changing and changed stance toward norm selection in Philippine Universities. The results give the impression that a good number of College…

  1. Criteria for Selection and Rejection of Social Relationships among Children in Urban and Rural Kindergartens in Greece

    ERIC Educational Resources Information Center

    Rekalidou, Galini; Petrogiannis, Konstantinos

    2012-01-01

    This paper reports on preschool children's social relationships developed in urban and rural kindergarten classes in Greece. We investigated the selection and rejection criteria children use and examined potential criteria differences as a function of a number of socio-demographic variables (children's age group, gender, parental job status,…

  2. [Neurological sciences based on evidence].

    PubMed

    Ceballos, C; Almárcegui, C; Artal, A; García-Campayo, J; Valdizán, J R

    An exhaustive search of reported meta-analyses from any medical specialty is described: papers included in MEDLINE or EMBASE between 1973-1998. A descriptive analysis of the reported papers (frequency tables and graphics) is described, including differences in the mean number of reported meta-analysis papers by medical specialty and year. 1,514 papers were selected and classified. Between 1977-1987 the overall mean of reported neurologic meta-analysis studies (1.20 +/- 1.10) was significantly lower than in 1988-1998 (11.20 +/- 7.85) (p < 0.001). The global number of neurologic meta-analyses was positively correlated (p < 0.05) with the number of studies on fundamentals and methodology during the study period. A progressive increase in the number of reported neurologic meta-analyses since 1977 can be demonstrated. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.

  3. On the number of Bose-selected modes in driven-dissipative ideal Bose gases

    NASA Astrophysics Data System (ADS)

    Schnell, Alexander; Ketzmerick, Roland; Eckardt, André

    2018-03-01

    In an ideal Bose gas that is driven into a steady state far from thermal equilibrium, a generalized form of Bose condensation can occur. Namely, the single-particle states unambiguously separate into two groups: the group of Bose-selected states, whose occupations increase linearly with the total particle number, and the group of all other states whose occupations saturate [Phys. Rev. Lett. 111, 240405 (2013), 10.1103/PhysRevLett.111.240405]. However, so far very little is known about how the number of Bose-selected states depends on the properties of the system and its coupling to the environment. The answer to this question is crucial since systems hosting a single, a few, or an extensive number of Bose-selected states will show rather different behavior. While in the former two scenarios each selected mode acquires a macroscopic occupation, corresponding to (fragmented) Bose condensation, the latter case rather bears resemblance to a high-temperature state of matter. In this paper, we systematically investigate the number of Bose-selected states, considering different classes of the rate matrices that characterize the driven-dissipative ideal Bose gases in the limit of weak system-bath coupling. These include rate matrices with continuum limit, rate matrices of chaotic driven systems, random rate matrices, and rate matrices resulting from thermal baths that couple to a few observables only.

  4. On the number of Bose-selected modes in driven-dissipative ideal Bose gases.

    PubMed

    Schnell, Alexander; Ketzmerick, Roland; Eckardt, André

    2018-03-01

    In an ideal Bose gas that is driven into a steady state far from thermal equilibrium, a generalized form of Bose condensation can occur. Namely, the single-particle states unambiguously separate into two groups: the group of Bose-selected states, whose occupations increase linearly with the total particle number, and the group of all other states whose occupations saturate [Phys. Rev. Lett. 111, 240405 (2013)PRLTAO0031-900710.1103/PhysRevLett.111.240405]. However, so far very little is known about how the number of Bose-selected states depends on the properties of the system and its coupling to the environment. The answer to this question is crucial since systems hosting a single, a few, or an extensive number of Bose-selected states will show rather different behavior. While in the former two scenarios each selected mode acquires a macroscopic occupation, corresponding to (fragmented) Bose condensation, the latter case rather bears resemblance to a high-temperature state of matter. In this paper, we systematically investigate the number of Bose-selected states, considering different classes of the rate matrices that characterize the driven-dissipative ideal Bose gases in the limit of weak system-bath coupling. These include rate matrices with continuum limit, rate matrices of chaotic driven systems, random rate matrices, and rate matrices resulting from thermal baths that couple to a few observables only.
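
    For context, the rate matrices discussed in these two records typically enter a mean-field kinetic equation of roughly the following form (taken from the general literature on driven-dissipative ideal Bose gases, e.g. the cited PRL, rather than spelled out in this abstract):

    \dot{\bar n}_i \;=\; \sum_{j \neq i} \Big[ R_{ij}\, \bar n_j \,(1 + \bar n_i) \;-\; R_{ji}\, \bar n_i \,(1 + \bar n_j) \Big],

    where \bar n_i is the mean occupation of single-particle state i and R_{ij} is the environment-induced rate for a jump from state j to state i; the Bose-selected states are those whose steady-state occupations (\dot{\bar n}_i = 0) grow linearly with the total particle number.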

  5. Peer-Selected “Best Papers”—Are They Really That “Good”?

    PubMed Central

    Wainer, Jacques; Eckmann, Michael; Rocha, Anderson

    2015-01-01

    Background Peer evaluation is the cornerstone of science evaluation. In this paper, we analyze whether or not a form of peer evaluation, the pre-publication selection of the best papers in Computer Science (CS) conferences, is better than random, when considering future citations received by the papers. Methods Considering 12 conferences (for several years), we collected the citation counts from Scopus for both the best papers and the non-best papers. For a different set of 17 conferences, we collected the data from Google Scholar. For each data set, we computed the proportion of cases whereby the best paper has more citations. We also compare this proportion for years before 2010 and after to evaluate if there is a propaganda effect. Finally, we count the proportion of best papers that are in the top 10% and 20% most cited for each conference instance. Results The probability that a best paper will receive more citations than a non best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited. Discussion There is strong evidence that the selection of best papers in Computer Science conferences is better than a random selection, and that a significant number of the best papers are among the top cited papers in the conference. PMID:25789480

  6. Powerful Voter Selection for Making Multistep Delegate Ballot Fair

    NASA Astrophysics Data System (ADS)

    Yamakawa, Hiroshi

    For decision by majority, each voter often exercises his right by delegating it to other, trusted voters. A multi-step delegate rule allows indirect delegation through more than one voter, and this helps each voter find delegate voters. In this paper, we propose a powerful-voter selection method based on the multi-step delegate rule. This method sequentially selects the voters who receive the most (indirect) delegations. Multi-agent simulations demonstrate that we can achieve highly fair poll results from a small number of votes by using the proposed method. Here, fairness is the accuracy with which the sum of all voters' preferences over the choices is predicted. In the simulations, each voter selects choices arranged on a one-dimensional preference axis. Acquaintance relationships among voters were generated as a random network, and each voter delegates to some of his acquaintances who have similar preferences. We obtained simulation results from various acquaintance networks and then averaged these results. Firstly, if each voter has enough acquaintances on average, the proposed method helps predict the sum of all voters' preferences over the choices from a small number of votes. Secondly, if the number of each voter's acquaintances increases with an increase in the number of voters, the prediction accuracy (fairness) from a small number of votes can be kept at an appropriate level.

  7. Publication Trends Over 55 Years of Behavioral Genetic Research.

    PubMed

    Ayorech, Ziada; Selzam, Saskia; Smith-Woolley, Emily; Knopik, Valerie S; Neiderhiser, Jenae M; DeFries, John C; Plomin, Robert

    2016-09-01

    We document the growth in published papers on behavioral genetics for 5-year intervals from 1960 through 2014. We used 1861 papers published in Behavior Genetics to train our search strategy which, when applied to Ovid PsychINFO, selected more than 45,000 publications. Five trends stand out: (1) the number of behavioral genetic publications has grown enormously; nearly 20,000 papers were published in 2010-2014. (2) The number of human quantitative genetic (QG) publications (e.g., twin and adoption studies) has steadily increased with more than 3000 papers published in 2010-2014. (3) The number of human molecular genetic (MG) publications increased substantially from about 2000 in 2000-2004 to 5000 in 2005-2009 to 9000 in 2010-2014. (4) Nonhuman publications yielded similar trends. (5) Although there has been exponential growth in MG publications, both human and nonhuman QG publications continue to grow. A searchable resource of this corpus of behavioral genetic papers is freely available online at http://www.teds.ac.uk/public_datasets.html and will be updated annually.

  8. Improved targeted immunization strategies based on two rounds of selection

    NASA Astrophysics Data System (ADS)

    Xia, Ling-Ling; Song, Yu-Rong; Li, Chan-Chan; Jiang, Guo-Ping

    2018-04-01

    In the case of high-degree targeted immunization where the number of vaccines is limited, when more than one node with the same degree meets the requirement of high degree centrality, how can we choose a certain number of nodes from those nodes so that the number of immunized nodes does not exceed the limit? In this paper, we introduce a new idea derived from the selection process of a second-round exam to solve this problem and then propose three improved targeted immunization strategies. In these proposed strategies, the immunized nodes are selected through two rounds of selection, where we increase the quotas of the first-round selection according to the evaluation criterion of degree centrality and then consider another characteristic parameter of the node, such as the node's clustering coefficient, betweenness or closeness, to help choose the targeted nodes in the second-round selection. To validate the effectiveness of the proposed strategies, we compare them with the degree immunizations, including the high degree targeted and the high degree adaptive immunizations, using two metrics: the size of the largest connected component of the immunized network and the number of infected nodes. Simulation results demonstrate that the proposed strategies based on two rounds of sorting are effective for heterogeneous networks and their immunization effects are better than those of the degree immunizations.
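
    A minimal sketch of the two-round selection idea using networkx: the first round keeps an enlarged quota of the highest-degree nodes, and the second round breaks ties among them with another node characteristic (clustering coefficient here; betweenness or closeness would work the same way). The quota factor and the preference for low clustering are illustrative assumptions, not taken from the paper.

    import networkx as nx

    def two_round_immunization(G, budget, first_round_factor=2):
        """Pick `budget` nodes to immunize: round 1 keeps the top candidates by degree,
        round 2 ranks those candidates by a second characteristic."""
        quota = min(len(G), first_round_factor * budget)
        by_degree = sorted(G.nodes, key=lambda v: G.degree(v), reverse=True)[:quota]
        clustering = nx.clustering(G)
        # among the high-degree candidates, keep degree as the primary key and
        # prefer low clustering as the tie-breaker (an illustrative choice)
        ranked = sorted(by_degree, key=lambda v: (-G.degree(v), clustering[v]))
        return ranked[:budget]

    # toy usage on a scale-free network
    G = nx.barabasi_albert_graph(500, 3, seed=0)
    print(two_round_immunization(G, budget=10))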

  9. Scientific Productivity and Academic Promotion: A Study on French and Italian Physicists. NBER Working Paper No. 16341

    ERIC Educational Resources Information Center

    Lissoni, Francesco; Mairesse, Jacques; Montobbio, Fabio; Pezzoni, Michele

    2010-01-01

    The paper examines the determinants of scientific productivity (number of articles and journals' impact factor) for a panel of about 3600 French and Italian academic physicists active in 2004-05. Endogeneity problems concerning promotion and productivity are addressed by specifying a generalized Tobit model, in which a selection probit equation…

  10. Multi-task feature selection in microarray data by binary integer programming.

    PubMed

    Lan, Liang; Vucetic, Slobodan

    2013-12-20

    A major challenge in microarray classification is that the number of features is typically orders of magnitude larger than the number of examples. In this paper, we propose a novel feature filter algorithm to select the feature subset with maximal discriminative power and minimal redundancy by solving a quadratic objective function with binary integer constraints. To improve the computational efficiency, the binary integer constraints are relaxed and a low-rank approximation to the quadratic term is applied. The proposed feature selection algorithm was extended to solve multi-task microarray classification problems. We compared the single-task version of the proposed feature selection algorithm with 9 existing feature selection methods on 4 benchmark microarray data sets. The empirical results show that the proposed method achieved the most accurate predictions overall. We also evaluated the multi-task version of the proposed algorithm on 8 multi-task microarray datasets. The multi-task feature selection algorithm resulted in significantly higher accuracy than when using the single-task feature selection methods.

  11. Parameters selection in gene selection using Gaussian kernel support vector machines by genetic algorithm.

    PubMed

    Mao, Yong; Zhou, Xiao-Bo; Pi, Dao-Ying; Sun, You-Xian; Wong, Stephen T C

    2005-10-01

    In microarray-based cancer classification, gene selection is an important issue owing to the large number of variables, the small number of samples, and the non-linearity of the problem. It is difficult to get satisfying results using conventional linear statistical methods. Recursive feature elimination based on support vector machines (SVM RFE) is an effective algorithm for gene selection and cancer classification, which are integrated into a consistent framework. In this paper, we propose a new method to select the parameters of the aforementioned algorithm implemented with Gaussian kernel SVMs, as a better alternative to the common practice of selecting the apparently best parameters, by using a genetic algorithm to search for a pair of optimal parameters. Fast implementation issues for this method are also discussed for pragmatic reasons. The proposed method was tested on two representative hereditary breast cancer and acute leukaemia datasets. The experimental results indicate that the proposed method performs well in selecting genes and achieves high classification accuracies with these genes.
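
    As a deliberately simplified stand-in for the genetic-algorithm search described above (and not the authors' method), the sketch below selects the Gaussian-kernel SVM parameters C and gamma by cross-validated grid search with scikit-learn; replacing the grid enumeration with a GA would recover the spirit of the paper.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV

    # toy gene-expression-like data: few samples, many features
    rng = np.random.default_rng(3)
    X = rng.normal(size=(40, 500))
    y = rng.integers(0, 2, size=40)

    # enumerate candidate (C, gamma) pairs and keep the cross-validated best
    param_grid = {"C": np.logspace(-2, 3, 6), "gamma": np.logspace(-4, 1, 6)}
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=3)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)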

  12. Suggestions and Procedures for Choosing a Chinese Institution of Higher Education as a Partner

    ERIC Educational Resources Information Center

    Willis, Mike

    2007-01-01

    China has one of the largest and most complex higher education systems in the world, and a key challenge facing any foreign education institution is how to select an appropriate potential partner. This paper considers how a number of foreign university managers selected a university in China. Issues of location, ranking, status, programs, mutual…

  13. On fast carry select adders

    NASA Technical Reports Server (NTRS)

    Shamanna, M.; Whitaker, S.

    1992-01-01

    This paper presents an architecture for a high-speed carry select adder with very long bit lengths utilizing a conflict-free bypass scheme. The proposed scheme has almost half the number of transistors and is faster than a conventional carry select adder. A comparative study is also made between the proposed adder and a Manchester carry chain adder which shows that the proposed scheme has the same transistor count, without suffering any performance degradation, compared to the Manchester carry chain adder.

  14. Diversity Order Analysis of Dual-Hop Relaying with Partial Relay Selection

    NASA Astrophysics Data System (ADS)

    Bao, Vo Nguyen Quoc; Kong, Hyung Yun

    In this paper, we study the performance of dual hop relaying in which the best relay selected by partial relay selection will help the source-destination link to overcome the channel impairment. Specifically, closed-form expressions for outage probability, symbol error probability and achievable diversity gain are derived using the statistical characteristic of the signal-to-noise ratio. Numerical investigation shows that the system achieves diversity of two regardless of relay number and also confirms the correctness of the analytical results. Furthermore, the performance loss due to partial relay selection is investigated.

  15. Comparative study of minutiae selection algorithms for ISO fingerprint templates

    NASA Astrophysics Data System (ADS)

    Vibert, B.; Charrier, C.; Le Bars, J.-M.; Rosenberger, C.

    2015-03-01

    We address the selection of fingerprint minutiae given a fingerprint ISO template. Minutiae selection plays a very important role when a secure element (e.g. a smart card) is used. Because of the limited computation and memory capability, the number of minutiae of a stored reference in the secure element is limited. We propose in this paper a comparative study of 6 minutiae selection methods, including 2 methods from the literature and 1 reference baseline (no selection). Experimental results on 3 fingerprint databases from the Fingerprint Verification Competition show their relative efficiency in terms of performance and computation time.

  16. How directional mobility affects coexistence in rock-paper-scissors models

    NASA Astrophysics Data System (ADS)

    Avelino, P. P.; Bazeia, D.; Losano, L.; Menezes, J.; de Oliveira, B. F.; Santos, M. A.

    2018-03-01

    This work deals with a system of three distinct species that changes in time under the presence of mobility, selection, and reproduction, as in the popular rock-paper-scissors game. The novelty of the current study is the modification of the mobility rule to the case of directional mobility, in which the species move following the direction associated to a larger (averaged) number density of selection targets in the surrounding neighborhood. Directional mobility can be used to simulate eyes that see or a nose that smells, and we show how it may contribute to reduce the probability of coexistence.

  17. How directional mobility affects coexistence in rock-paper-scissors models.

    PubMed

    Avelino, P P; Bazeia, D; Losano, L; Menezes, J; de Oliveira, B F; Santos, M A

    2018-03-01

    This work deals with a system of three distinct species that changes in time under the presence of mobility, selection, and reproduction, as in the popular rock-paper-scissors game. The novelty of the current study is the modification of the mobility rule to the case of directional mobility, in which the species move following the direction associated to a larger (averaged) number density of selection targets in the surrounding neighborhood. Directional mobility can be used to simulate eyes that see or a nose that smells, and we show how it may contribute to reduce the probability of coexistence.

  18. Fuzzy Multi-Objective Vendor Selection Problem with Modified S-CURVE Membership Function

    NASA Astrophysics Data System (ADS)

    Díaz-Madroñero, Manuel; Peidro, David; Vasant, Pandian

    2010-06-01

    In this paper, the S-curve membership function methodology is used in a vendor selection (VS) problem. An interactive method for solving multi-objective VS problems with fuzzy goals is developed. The proposed method attempts to simultaneously minimize the total order costs, the number of rejected items and the number of late delivered items, with reference to several constraints such as buyers' demand, vendors' capacity, vendors' quota flexibility and vendors' allocated budget. In an industrial case, we compare the performance of S-curve membership functions, which represent uncertain goals and constraints in VS problems, with that of linear membership functions.
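
    For illustration, a hedged sketch of one common parameterization of an S-shaped membership function is given below; it maps a goal value between its best and worst levels to a satisfaction degree in (0, 1). The exact functional form and constants of the modified S-curve used in the paper may differ, so everything below is an assumption for demonstration only.

      # Hedged sketch of a monotone S-curve membership function.
      import numpy as np

      def s_curve_membership(x, x_best, x_worst, B=1.0, C=0.001, alpha=13.8):
          """~1 near x_best, ~0 near x_worst, S-shaped in between."""
          t = (np.asarray(x, dtype=float) - x_best) / (x_worst - x_best)   # 0..1
          mu = B / (1.0 + C * np.exp(alpha * t))
          return np.clip(mu, 0.0, 1.0)

      # e.g. total order cost: 100 is fully acceptable, 500 is unacceptable
      print(s_curve_membership([100, 200, 300, 400, 500], x_best=100, x_worst=500))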

  19. An opinion formation based binary optimization approach for feature selection

    NASA Astrophysics Data System (ADS)

    Hamedmoghadam, Homayoun; Jalili, Mahdi; Yu, Xinghuo

    2018-02-01

    This paper proposes a novel optimization method based on opinion formation in complex network systems. The proposed optimization technique mimics the human-human interaction mechanism, based on a mathematical model derived from the social sciences. Our method encodes a subset of selected features as the opinion of an artificial agent and simulates the opinion formation process among a population of agents to solve the feature selection problem. The agents interact over an underlying interaction network structure and reach consensus in their opinions while finding better solutions to the problem. A number of mechanisms are employed to avoid getting trapped in local minima. We compare the performance of the proposed method with a number of classical population-based optimization methods and a state-of-the-art opinion formation based method. Our experiments on a number of high-dimensional datasets show that the proposed algorithm outperforms the others.

  20. Children in Blended and Step Families. Foundation for Child and Youth Studies Selected Papers Number 44.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    Fairy tales dramatize a widely held presumption that children are at high risk of abuse in stepfamilies. The idea seems almost accepted as fact, yet it has never been adequately tested. This paper provides detailed information about life in blended and stepfamilies. Sections focus on (1) stepmothers; (2) children's views of their stepmothers; (3)…

  1. Equity Issues in Public Examinations in Developing Countries. World Bank Technical Paper Number 272. Asia Technical Series.

    ERIC Educational Resources Information Center

    Greaney, Vincent; Kellaghan, Thomas

    Public examinations in developing countries play a critical role in the selection of students for participation in the educational system. The examinations tend to be highly academic, bearing little reference to the everyday lives of students, limited to paper-and-pencil tests, and geared toward discriminating among high achieving students.…

  2. How to Succeed in Team Teaching--By Really Trying. Occasional Paper No. 13.

    ERIC Educational Resources Information Center

    Nolan, Robert R.; Roper, Susan Stavert

    This paper provides guidelines in three areas for organizing and working in teaching teams: (1) getting started; (2) making the most of team meetings; and (3) minimizing team problems. Getting started requires a number of key decisions: (1) on what basis should team members be selected? (2) what should be the size of the team? (3) what areas…

  3. Steps to consider for effective decision making when selecting and prioritizing eHealth services.

    PubMed

    Vimarlund, Vivian; Davoody, Nadia; Koch, Sabine

    2013-01-01

    Making the best choice for an organization when selecting IT applications or eHealth services is not always easy, as there are many parameters to take into account. The aim of this paper is to explore some steps to support effective decision making when selecting and prioritizing eHealth services prior to implementation and/or procurement. The steps presented in this paper were identified by interviewing nine key stakeholders at Stockholm County Council. They are intended to serve as a guide for decision making and aim to identify objectives and expected effects; technical, organizational, and economic requirements; and opportunities that are important to consider before decisions are taken. The steps and their respective issues and variables are concretized in a number of templates to be filled in by decision makers when selecting and prioritizing eHealth services.

  4. An Assessment of Number Sense among Secondary School Students

    ERIC Educational Resources Information Center

    Singh, Parmjit

    2009-01-01

    This paper reports selected findings from a study of number sense proficiency of students aged 13 to 16 years in a state in Malaysia. A total of 1756 students, from thirteen schools in a state in Malaysia participated in this study. A majority (74.9%) of these students obtained an A grade for their respective year-end school examinations. The…

  5. Evolution of Gender Differences in Post-Secondary Human Capital Investments: College Majors. Working Paper #03-11

    ERIC Educational Resources Information Center

    Gemici, Ahu; Wiswall, Matthew

    2011-01-01

    Over the past 40 years, the level of human capital investments has changed substantially for men and women. Changes in the intensive margin of college major selection have also been substantial, as the number of graduates in humanities, social science, and teaching has declined, and the number in science, engineering, and business has…

  6. Expanding the Private School Sector: Government Policy and Private Secondary Schools in Hong Kong, 1988-2001

    ERIC Educational Resources Information Center

    Cheung, Alan C. K.; Randall, E. Vance; Tam, Man-Kwan

    2005-01-01

    This paper addresses the extent to which government policy has helped increase the number and diversity of private secondary schools in Hong Kong, which, in turn, has expanded the options for parental choice. Five indicators were selected to measure this objective. They are as follows: (1) Number of private schools and students enrolled; (2) Types…

  7. Cluster randomised trials in the medical literature: two bibliometric surveys

    PubMed Central

    Bland, J Martin

    2004-01-01

    Background Several reviews of published cluster randomised trials have reported that about half did not take clustering into account in the analysis, which was thus incorrect and potentially misleading. In this paper I ask whether cluster randomised trials are increasing in both number and quality of reporting. Methods Computer search for papers on cluster randomised trials since 1980, hand search of trial reports published in selected volumes of the British Medical Journal over 20 years. Results There has been a large increase in the numbers of methodological papers and of trial reports using the term 'cluster random' in recent years, with about equal numbers of each type of paper. The British Medical Journal contained more such reports than any other journal. In this journal there was a corresponding increase over time in the number of trials where subjects were randomised in clusters. In 2003 all reports showed awareness of the need to allow for clustering in the analysis. In 1993 and before clustering was ignored in most such trials. Conclusion Cluster trials are becoming more frequent and reporting is of higher quality. Perhaps statistician pressure works. PMID:15310402

  8. Bericht über den 2. Internationalen Kongress für Angewandte Linguistik, Cambridge, 8.-12. IX. 1969 [Report on the Second International Congress of Applied Linguistics, Cambridge, September 8-12, 1969].

    ERIC Educational Resources Information Center

    Mohr, Peter

    This paper is a summary report on the Second International Congress of Applied Linguistics held in Cambridge, England, in September 1969. Because of the large number of papers delivered, only a selection of the papers from any one section of the Congress is considered, and the author attempts to identify current interests and trends in…

  9. Quality Programming in H.P.E.R. Volume II. Selected Papers Presented at the Convention of the Canadian Association for Health, Physical Education and Recreation (British Columbia, Canada, June 10-13, 1981). Physical Education Series Number 3.

    ERIC Educational Resources Information Center

    Jackson, John J., Ed.; Turkington, H. David, Ed.

    This volume contains 27 edited papers, and abstracts of 14 papers, presented during the 1981 convention of the Canadian Association for Health, Physical Education and Recreation. Subjects discussed are listed in 10 categories: (1) working together for quality programs; (2) challenges facing the physical education teacher; (3) skill development and…

  10. [Indicators of the persistent pro-inflammatory activation of the immune system in depression].

    PubMed

    Cubała, Wiesław Jerzy; Godlewska, Beata; Trzonkowski, Piotr; Landowski, Jerzy

    2006-01-01

    The aetiology of depression remains uncertain. Current hypotheses on the aetiology of the depressive disorder tend to integrate monoaminergic, neuroendocrine and immunological concepts of depression. A number of research papers emphasise the altered hormonal and immune status of patients with depression, with pronounced variations in cytokine levels. Those studies tend to link the variable course of depression to altered proinflammatory activity of the immune system. The results of studies on the activity of selected elements of the immune system are ambiguous, indicating both increased and decreased activity. However, a number of basic and psychopharmacological studies support the hypothesis of increased proinflammatory activity of the immune system in the course of depression, which is the foundation of the immunological hypothesis of depression. The aim of this paper is to review the functional abnormalities observed in depression, focusing on the monoaminergic deficiency, increased immune activation and endocrine dysregulation. This paper brings together and discusses current studies related to this subject, with a detailed insight into the interactions involving the nervous, endocrine and immune systems.

  11. Configurable Cellular Automata for Pseudorandom Number Generation

    NASA Astrophysics Data System (ADS)

    Quieta, Marie Therese; Guan, Sheng-Uei

    This paper proposes a generalized structure of cellular automata (CA) — the configurable cellular automata (CoCA). With selected properties from programmable CA (PCA) and controllable CA (CCA), a new approach to cellular automata is developed. In CoCA, the cells are dynamically reconfigured at run-time via a control CA. Reconfiguration of a cell simply means varying the properties of that cell with time; examples of properties to be reconfigured are rule selection, boundary condition, and radius. While the objective of this paper is to propose CoCA as a new CA method, the main focus is to design a CoCA that can function as a good pseudorandom number generator (PRNG). CoCA is a suitable candidate PRNG, as it passes 17 out of 18 Diehard tests with 31 cells; based on the Diehard tests, its performance is superior to that of other CA-based PRNGs. Moreover, CoCA opens new avenues for research not only in the field of random number generation, but in modeling complex systems as well.
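
    A much simpler relative of the idea, shown here as a hedged sketch, is a one-dimensional programmable CA pseudorandom bit generator with a fixed per-cell choice between rule 90 and rule 150 and null boundaries; the CoCA's run-time reconfiguration of rules, boundary conditions and radius is not modelled, and the seed, tap position and rule assignment are arbitrary.

      # Hedged sketch: rule-90/150 hybrid CA as a pseudorandom bit generator.
      def ca_prng_bits(n_bits, n_cells=31, seed=0b1010110011100011110000111110101):
          state = [(seed >> i) & 1 for i in range(n_cells)]
          use_rule150 = [i % 2 == 1 for i in range(n_cells)]   # per-cell rule choice
          out = []
          for _ in range(n_bits):
              out.append(state[n_cells // 2])                  # tap the middle cell
              nxt = []
              for i in range(n_cells):
                  left = state[i - 1] if i > 0 else 0          # null boundary
                  right = state[i + 1] if i < n_cells - 1 else 0
                  if use_rule150[i]:
                      nxt.append(left ^ state[i] ^ right)      # rule 150
                  else:
                      nxt.append(left ^ right)                 # rule 90
              state = nxt
          return out

      print("".join(map(str, ca_prng_bits(64))))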

  12. Concentration of the Most-Cited Papers in the Scientific Literature: Analysis of Journal Ecosystems

    PubMed Central

    Ioannidis, John P. A.

    2006-01-01

    Background A minority of scientific journals publishes the majority of scientific papers and receives the majority of citations. The extent of concentration of the most influential articles is less well known. Methods/Principal Findings The 100 most-cited papers in the last decade in each of 21 scientific fields were analyzed; fields were considered as ecosystems and their “species” (journal) diversity was evaluated. Only 9% of journals in Journal Citation Reports had published at least one such paper. Among this 9%, half of them had published only one such paper. The number of journals that had published a larger number of most-cited papers decreased exponentially according to a Lotka law. Except for three scientific fields, six journals accounted for 53 to 94 of the 100 most-cited papers in their field. With increasing average number of citations per paper (citation density) in a scientific field, concentration of the most-cited papers in a few journals became even more prominent (p<0.001). Concentration was unrelated to the number of papers published or number of journals available in a scientific field. Multidisciplinary journals accounted for 24% of all most-cited papers, with large variability across fields. The concentration of most-cited papers in multidisciplinary journals was most prominent in fields with high citation density (correlation coefficient 0.70, p<0.001). Multidisciplinary journals had published fewer than eight of the 100 most-cited papers in eight scientific fields (none in two fields). Journals concentrating most-cited original articles often differed from those concentrating most-cited reviews. The concentration of the most-influential papers was stronger than the already prominent concentration of papers published and citations received. Conclusions Despite a plethora of available journals, the most influential papers are extremely concentrated in few journals, especially in fields with high citation density. Existing multidisciplinary journals publish selectively most-cited papers from fields with high citation density. PMID:17183679

  13. Comparison of Different EHG Feature Selection Methods for the Detection of Preterm Labor

    PubMed Central

    Alamedine, D.; Khalil, M.; Marque, C.

    2013-01-01

    Numerous types of linear and nonlinear features have been extracted from the electrohysterogram (EHG) in order to classify labor and pregnancy contractions. As a result, the number of available features is now very large. The goal of this study is to reduce the number of features by selecting only the relevant ones which are useful for solving the classification problem. This paper presents three methods for feature subset selection that can be applied to choose the best subsets for classifying labor and pregnancy contractions: an algorithm using the Jeffrey divergence (JD) distance, a sequential forward selection (SFS) algorithm, and a binary particle swarm optimization (BPSO) algorithm. The last two methods are based on a classifier and were tested with three types of classifiers. These methods have allowed us to identify common features which are relevant for contraction classification. PMID:24454536

  14. Parameter selection in limited data cone-beam CT reconstruction using edge-preserving total variation algorithms

    NASA Astrophysics Data System (ADS)

    Lohvithee, Manasavee; Biguri, Ander; Soleimani, Manuchehr

    2017-12-01

    There are a number of powerful total variation (TV) regularization methods that have great promise in limited data cone-beam CT reconstruction with an enhancement of image quality. These promising TV methods require careful selection of the image reconstruction parameters, for which there are no well-established criteria. This paper presents a comprehensive evaluation of parameter selection in a number of major TV-based reconstruction algorithms. An appropriate way of selecting the values for each individual parameter has been suggested. Finally, a new adaptive-weighted projection-controlled steepest descent (AwPCSD) algorithm is presented, which implements the edge-preserving function for CBCT reconstruction with limited data. The proposed algorithm shows significant robustness compared to three other existing algorithms: ASD-POCS, AwASD-POCS and PCSD. The proposed AwPCSD algorithm is able to preserve the edges of the reconstructed images better with fewer sensitive parameters to tune.

  15. Combining Mixture Components for Clustering*

    PubMed Central

    Baudry, Jean-Patrick; Raftery, Adrian E.; Celeux, Gilles; Lo, Kenneth; Gottardo, Raphaël

    2010-01-01

    Model-based clustering consists of fitting a mixture model to data and identifying each cluster with one of its components. Multivariate normal distributions are typically used. The number of clusters is usually determined from the data, often using BIC. In practice, however, individual clusters can be poorly fitted by Gaussian distributions, and in that case model-based clustering tends to represent one non-Gaussian cluster by a mixture of two or more Gaussian distributions. If the number of mixture components is interpreted as the number of clusters, this can lead to overestimation of the number of clusters. This is because BIC selects the number of mixture components needed to provide a good approximation to the density, rather than the number of clusters as such. We propose first selecting the total number of Gaussian mixture components, K, using BIC and then combining them hierarchically according to an entropy criterion. This yields a unique soft clustering for each number of clusters less than or equal to K. These clusterings can be compared on substantive grounds, and we also describe an automatic way of selecting the number of clusters via a piecewise linear regression fit to the rescaled entropy plot. We illustrate the method with simulated data and a flow cytometry dataset. Supplemental Materials are available on the journal Web site and described at the end of the paper. PMID:20953302
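
    A hedged sketch of the two-step idea follows: the number of Gaussian components is first chosen by BIC, and components are then merged hierarchically so that each merge gives the largest drop in the entropy of the soft assignment. The exact entropy criterion and the piecewise-linear selection of the final number of clusters from the paper are not reproduced; the data and the candidate range of K are illustrative.

      # Hedged sketch: BIC for K, then entropy-guided hierarchical merging of components.
      import numpy as np
      from sklearn.datasets import make_blobs
      from sklearn.mixture import GaussianMixture

      X, _ = make_blobs(n_samples=500, centers=3, random_state=0)

      # 1) choose the number of mixture components by BIC
      models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 8)]
      best = models[int(np.argmin([m.bic(X) for m in models]))]
      tau = best.predict_proba(X)                       # n x K posterior matrix

      def entropy(t):
          return -np.sum(t * np.log(np.clip(t, 1e-12, None)))

      # 2) repeatedly merge the pair of components whose union lowers entropy most
      labels_per_level = {tau.shape[1]: tau.argmax(axis=1)}
      while tau.shape[1] > 1:
          k = tau.shape[1]
          best_ent, best_tau = np.inf, None
          for i in range(k):
              for j in range(i + 1, k):
                  merged = np.delete(tau, j, axis=1)
                  merged[:, i] = tau[:, i] + tau[:, j]
                  if entropy(merged) < best_ent:
                      best_ent, best_tau = entropy(merged), merged
          tau = best_tau
          labels_per_level[tau.shape[1]] = tau.argmax(axis=1)   # one soft clustering per level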

  16. Policy and Practice of Tertiary Literacy. Selected Proceedings of the First National Conference on Tertiary Literacy: Research and Practice, Volume 1 (1st, Melbourne, Australia, March 14-16, 1996).

    ERIC Educational Resources Information Center

    Golebiowski, Zofia, Ed.

    This selection of papers from the First Conference on Tertiary Literacy, which examined the role of literacy as a foundation for knowledge acquisition and dissemination that influences the academic success of tertiary students, presents a number of case studies of policy and practice in Australian universities. Keynote addresses included:…

  17. Multi-scale textural feature extraction and particle swarm optimization based model selection for false positive reduction in mammography.

    PubMed

    Zyout, Imad; Czajkowska, Joanna; Grzegorzek, Marcin

    2015-12-01

    The high number of false positives and the resulting number of avoidable breast biopsies are the major problems faced by current mammography Computer Aided Detection (CAD) systems. False positive reduction is a requirement not only for mass but also for calcification CAD systems which are currently deployed for clinical use. This paper tackles two problems related to reducing the number of false positives in the detection of all lesions and masses, respectively. Firstly, textural patterns of breast tissue have been analyzed using several multi-scale textural descriptors based on wavelet and gray level co-occurrence matrix. The second problem addressed in this paper is parameter selection and performance optimization. For this, we adopt a model selection procedure based on Particle Swarm Optimization (PSO) for selecting the most discriminative textural features and for strengthening the generalization capacity of the supervised learning stage based on a Support Vector Machine (SVM) classifier. For evaluating the proposed methods, two sets of suspicious mammogram regions have been used. The first one, obtained from the Digital Database for Screening Mammography (DDSM), contains 1494 regions (1000 normal and 494 abnormal samples). The second set of suspicious regions was obtained from the database of the Mammographic Image Analysis Society (mini-MIAS) and contains 315 (207 normal and 108 abnormal) samples. Results from both datasets demonstrate the efficiency of using PSO-based model selection for optimizing both the classifier hyper-parameters and the selected textural features. Furthermore, the obtained results indicate the promising performance of the proposed textural features and, more specifically, of those based on the co-occurrence matrix of the wavelet image representation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Biometric templates selection and update using quality measures

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Jassim, Sabah A.

    2012-06-01

    To deal with severe variation in recording conditions, most biometric systems acquire multiple biometric samples, at the enrolment stage, for the same person and then extract their individual biometric feature vectors and store them in the gallery in the form of biometric template(s) labelled with the person's identity. The number of samples/templates and the choice of the most appropriate templates influence the performance of the system. The desired biometric template selection technique must aim to control the run time and storage requirements while improving the recognition accuracy of the biometric system. This paper is devoted to elaborating on and discussing a new two-stage approach for biometric template selection and update. This approach uses quality-based clustering, followed by a special criterion for the selection of an ultimate set of biometric templates from the various clusters. The approach is developed to adaptively select a specific number of templates for each individual; the number of biometric templates depends mainly on the performance of each individual (i.e. the gallery size should be optimised to meet the needs of each target individual). Experiments have been conducted on two face image databases, and their results demonstrate the effectiveness of the proposed quality-guided approach.

  19. Wind Energy Developments: Incentives In Selected Countries

    EIA Publications

    1999-01-01

    This paper discusses developments in wind energy for the countries with significant wind capacity. After a brief overview of world capacity, it examines development trends, beginning with the United States - the number one country in wind electric generation capacity until 1997.

  20. An affine projection algorithm using grouping selection of input vectors

    NASA Astrophysics Data System (ADS)

    Shin, JaeWook; Kong, NamWoong; Park, PooGyeon

    2011-10-01

    This paper presents an affine projection algorithm (APA) using grouping-based selection of input vectors. To improve the performance of the conventional APA, the proposed algorithm adjusts the number of input vectors using two procedures: a grouping procedure and a selection procedure. In the grouping procedure, input vectors that carry overlapping information for the update are grouped using the normalized inner product. Then, in the selection procedure, the few input vectors that carry enough information for the coefficient update are selected using the steady-state mean square error (MSE). Finally, the filter coefficients are updated using the selected input vectors. The experimental results show that the proposed algorithm achieves smaller steady-state estimation errors than the existing algorithms.
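
    For reference, a hedged sketch of a plain affine projection algorithm with a fixed projection order is given below; the grouping of overlapping input vectors and the MSE-based selection proposed in the paper are not implemented, and the system-identification setup, step size and regularization are illustrative.

      # Hedged sketch: standard APA update for FIR system identification.
      import numpy as np

      def apa_identify(x, d, order=4, taps=8, mu=0.5, delta=1e-3):
          """Estimate FIR taps from input x and desired output d."""
          w = np.zeros(taps)
          for n in range(taps + order, len(x)):
              # rows of X are the last `order` regression vectors [x[n-k], ..., x[n-k-taps+1]]
              X = np.array([x[n - k - taps + 1:n - k + 1][::-1] for k in range(order)])
              dn = d[n - order + 1:n + 1][::-1]
              e = dn - X @ w
              w += mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(order), e)
          return w

      rng = np.random.default_rng(0)
      true_w = rng.normal(size=8)
      x = rng.normal(size=5000)
      d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.normal(size=len(x))
      print(np.round(apa_identify(x, d) - true_w, 3))    # should be close to zero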

  1. Selection of monitoring locations for storm water quality assessment.

    PubMed

    Langeveld, J G; Boogaard, F; Liefting, H J; Schilperoort, R P S; Hof, A; Nijhof, H; de Ridder, A C; Kuiper, M W

    2014-01-01

    Storm water runoff is a major contributor to the pollution of receiving waters. Storm water characteristics may vary significantly between locations and events. Hence, for each given location, this necessitates a well-designed monitoring campaign prior to selection of an appropriate storm water management strategy. The challenge for the design of a monitoring campaign with a given budget is to balance detailed monitoring at a limited number of locations versus less detailed monitoring at a large number of locations. This paper proposes a methodology for the selection of monitoring locations for storm water quality monitoring, based on (pre-)screening, a quick scan monitoring campaign, and final selection of locations and design of the monitoring setup. The main advantage of the method is the ability to prevent the selection of monitoring locations that turn out to be inappropriate. In addition, in this study, the quick scan resulted in a first useful dataset on storm water quality and a strong indication of illicit connections at one of the monitoring locations.

  2. The Irish contribution to the plastic surgery literature: 21 years of publications.

    PubMed

    Rahmani, G; Joyce, C W; Jones, D M; Kelly, J L; Hussey, A J; Regan, P J

    2015-09-01

    The Republic of Ireland has always had an influence on medicine and has produced many renowned doctors who have helped shape its history. Furthermore, many clinical articles that have originated from Ireland have changed clinical practice throughout the world. The Irish have also had an impact on the plastic surgery literature yet it has never specifically been analyzed before. The purpose of this study was to identify and analyze all papers that have originated from the plastic surgery units in the Republic of Ireland in the medical literature over the past 21 years. Twenty-four well-known plastic surgery, hand surgery and burns journals were selected for this study. By utilizing Scopus, the largest abstract and citation database of peer-reviewed literature, we analyzed each of our chosen 24 journals looking for Irish publications. Each paper was examined for article type, authorship, year of publication, institution of origin and level of evidence. Papers from the Republic of Ireland were published in 20 of the 24 journals over the past 21 years. A total of 245 articles from Ireland were published in the plastic surgery, hand surgery and burns literature over the 21-year period. Of these, 111 were original articles and 73 were case reports. The institution that published the most papers over the past 21 years was University Hospital Galway (66 publications) followed by Cork University Hospital with 54 papers. The journal with the most Irish articles was the Journal of Plastic, Reconstructive and Aesthetic Surgery with 56 papers. 2014 was the year with the most publications (28 papers). Authorship numbers also increased over time as the average number of authors in 1994 was 3.5, whereas it was 5.54 in 2014. The number of publications per year continues to increase along with authorship numbers. This mirrors the trend in other specialties. Publications are now no longer required for selection on to a higher surgical training scheme. There is now a fear that the academic output of trainees will decrease as a consequence. To prevent this, each unit must actively support and encourage research activity with their trainees.

  3. Self Concept and Self Esteem: Infancy to Adolescence. A Cognitive Developmental Outline with Some Reference to Behaviour and Health Effects. Unit for Child Studies. Selected Papers Number 27.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    Initially differentiating ideas of self-concept, personality, and self-esteem, this paper discusses the development of the self from infancy through adolescence. The discussion of infancy focuses on learning about bodily self and related disruptions and describes the emergence of the social and independent self. The discussion of toddlers and…

  4. Health Education Needs: A Survey of Rural Adults in Armstrong County, Pennsylvania, 1975. An Interim Report. Rural Health Staff Papers - Paper Number 8.

    ERIC Educational Resources Information Center

    Leadley, Samuel M.; Taranto, Angelo A.

    In July and August 1975, 138 rural residents of Armstrong County, Pennsylvania were interviewed as to their behaviors, beliefs, and attitudes regarding the prevention of cancer and coronary heart disease. Respondents were selected by interviewing an adult living on a commercial farm (a farm that either sold $10,000 or more produce per year or the…

  5. Children and Loss. Part 1: A Teacher's View: The Child in the Single Parent and Blended Family. Part 2: Helping Children Cope with Loss. Unit for Child Studies. Selected Papers Number 25.

    ERIC Educational Resources Information Center

    Rowling, Louise

    In two parts, this paper discusses areas in family disruption, describes children's experiences of loss, and suggests strategies for helping children cope with such experiences. Specifically, part one provides: (1) global, normative summary descriptions of adjustment patterns of children from preschool age through adolescence and descriptions of…

  6. Radioactive cobalt removal from Salem liquid radwaste with cobalt selective media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maza, R.; Wilson, J.A.; Hetherington, R.

    This paper reports results of benchtop tests using ion exchange material to selectively remove radioactive cobalt from high conductivity liquid radwaste at the Salem Nuclear Generating Station. The purpose of this test program is to reduce the number of curies in liquid releases without increasing the solid waste volume. These tests have identified two cobalt selective materials that together remove radioactive cobalt more effectively than the single component currently used. All test materials were preconditioned by conversion to the divalent calcium or sulfate form to simulate chemically exhausted media.

  7. TECHNOLOGICAL OPTIONS FOR ACID RAIN CONTROL

    EPA Science Inventory

    The paper discusses technological options for acid rain control. Compliance with Title IV of the Clean Air Act Amendments of 1990 will require careful scrutiny of a number of issues before selecting control options to reduce sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions...

  8. Chatter detection in milling process based on VMD and energy entropy

    NASA Astrophysics Data System (ADS)

    Liu, Changfu; Zhu, Lida; Ni, Chenbing

    2018-05-01

    This paper presents a novel approach to detect milling chatter based on Variational Mode Decomposition (VMD) and energy entropy. VMD has already been employed for feature extraction from non-stationary signals. Parameters such as the number of modes (K) and the quadratic penalty (α) need to be selected empirically when a raw signal is decomposed by VMD. To address the problem of how to select K and α, an automatic, kurtosis-based selection method for the VMD parameters is proposed in this paper. When chatter occurs in the milling process, energy is concentrated in the chatter frequency bands. To detect the chatter frequency bands automatically, a chatter detection method based on energy entropy is presented. A vibration signal containing chatter frequencies is simulated, and three groups of experiments representing three cutting conditions are conducted. To verify the effectiveness of the method presented in this paper, chatter feature extraction has been successfully applied to the simulated and experimental signals. The simulation and experimental results show that the proposed method can effectively detect chatter.
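
    The energy-entropy indicator itself is simple to state; a hedged sketch is given below, computed over the modes produced by a decomposition such as VMD. The VMD step and the kurtosis-based choice of K and α are not reproduced; the synthetic signals merely illustrate that entropy falls when energy concentrates in one band, as happens when chatter develops.

      # Hedged sketch: energy entropy over decomposed modes (VMD itself omitted).
      import numpy as np

      def energy_entropy(modes):
          """modes: array-like of shape (K, N), one decomposed mode per row."""
          energies = np.sum(np.asarray(modes, dtype=float) ** 2, axis=1)
          p = energies / energies.sum()
          return -np.sum(p * np.log(np.clip(p, 1e-12, None)))

      t = np.linspace(0.0, 1.0, 2000)
      stable = [np.sin(2 * np.pi * f * t) for f in (50, 120, 300)]
      chatter = [0.05 * np.sin(2 * np.pi * 50 * t),
                 3.00 * np.sin(2 * np.pi * 120 * t),       # energy concentrates here
                 0.05 * np.sin(2 * np.pi * 300 * t)]
      print(energy_entropy(stable), ">", energy_entropy(chatter))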

  9. Bibliometric trend and patent analysis in nano-alloys research for period 2000-2013.

    PubMed

    Živković, Dragana; Niculović, Milica; Manasijević, Dragan; Minić, Duško; Ćosović, Vladan; Sibinović, Maja

    2015-05-04

    This paper presents an overview of the current situation in nano-alloys research based on bibliometric and patent analysis. Bibliometric data for the period from 2000 to September 2013 were obtained using the Scopus database as the selected index database, and the analyzed parameters were: number of scientific papers per year, authors, countries, affiliations, subject areas and document types. The analysis of nano-alloy patents was done with a dedicated database, using the International Patent Classification and Patent Scope for the period from 2003 to 2013. The information found in this database comprised the number of patents, patent classification by country, patent applicants, main inventors and publication date.

  10. Bibliometric trend and patent analysis in nano-alloys research for period 2000-2013.

    PubMed

    Živković, Dragana; Niculović, Milica; Manasijević, Dragan; Minić, Duško; Ćosović, Vladan; Sibinović, Maja

    2015-01-01

    This paper presents an overview of the current situation in nano-alloys research based on bibliometric and patent analysis. Bibliometric analysis data, for the period 2000 to 2013, were obtained using the Scopus database as the selected index database, and the analyzed parameters were: number of scientific papers per year, authors, countries, affiliations, subject areas and document types. The analysis of nano-alloy patents was done with a dedicated database, using the International Patent Classification and Patent Scope for the period 2003 to 2013. The information found in this database comprised the number of patents, patent classification by country, patent applicants, main inventors and publication date.

  11. Linear reduction methods for tag SNP selection.

    PubMed

    He, Jingwu; Zelikovsky, Alex

    2004-01-01

    It is widely hoped that constructing a complete human haplotype map will help to associate complex diseases with certain SNPs. Unfortunately, the number of SNPs is huge and it is very costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that need to be sequenced to a considerably smaller number of informative representatives, so-called tag SNPs. In this paper, we propose a new linear-algebra-based method for selecting and using tag SNPs. Our method is purely combinatorial and can be combined with linkage disequilibrium (LD) and block-based methods. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs linearly predicted from linearly chosen tag SNPs. We obtain extremely good compression and prediction rates. For example, for long haplotypes (>25,000 SNPs), knowing only 0.4% of all SNPs we predict the entire unknown haplotype with 2% error, while the prediction method is based on a 10% sample of the population.
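
    A hedged sketch of the linear-algebra flavour of the idea: pick tag columns that (approximately) span the SNP matrix with a pivoted QR factorization, then reconstruct the remaining SNPs by least squares and rounding. This is not the authors' exact algorithm or evaluation protocol, and the synthetic haplotype matrix, tag budget and noise level are assumptions.

      # Hedged sketch: tag SNP selection via pivoted QR + linear prediction.
      import numpy as np
      from scipy.linalg import qr

      rng = np.random.default_rng(0)
      n_people, n_latent, n_snps = 60, 12, 200
      basis = rng.integers(0, 2, size=(n_people, n_latent)).astype(float)
      snps = basis[:, rng.integers(0, n_latent, size=n_snps)]      # correlated columns
      flips = rng.random(snps.shape) < 0.02                        # small perturbation
      snps = np.where(flips, 1.0 - snps, snps)

      k = 20                                        # tag SNP budget
      _, _, piv = qr(snps, pivoting=True)
      tags = piv[:k]                                # most linearly informative columns
      coef, *_ = np.linalg.lstsq(snps[:, tags], snps, rcond=None)
      predicted = np.rint(snps[:, tags] @ coef).clip(0, 1)
      print("reconstruction accuracy:", (predicted == snps).mean())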

  12. The synthesis of tritium, carbon-14 and stable isotope labelled selective estrogen receptor degraders.

    PubMed

    Bragg, Ryan A; Bushby, Nick; Ericsson, Cecilia; Kingston, Lee P; Ji, Hailong; Elmore, Charles S

    2016-09-01

    As part of a Medicinal Chemistry program aimed at developing an orally bioavailable selective estrogen receptor degrader, a number of tritium, carbon-14, and stable isotope labelled (E)-3-[4-(2,3,4,9-tetrahydro-1H-pyrido[3,4-b]indol-1-yl)phenyl]prop-2-enoic acids were required. This paper discusses 5 synthetic approaches to this compound class. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Variable Selection through Correlation Sifting

    NASA Astrophysics Data System (ADS)

    Huang, Jim C.; Jojic, Nebojsa

    Many applications of computational biology require a variable selection procedure to sift through a large number of input variables and select some smaller number that influence a target variable of interest. For example, in virology, only some small number of viral protein fragments influence the nature of the immune response during viral infection. Due to the large number of variables to be considered, a brute-force search for the subset of variables is in general intractable. To approximate this, methods based on ℓ1-regularized linear regression have been proposed and have been found to be particularly successful. It is well understood however that such methods fail to choose the correct subset of variables if these are highly correlated with other "decoy" variables. We present a method for sifting through sets of highly correlated variables which leads to higher accuracy in selecting the correct variables. The main innovation is a filtering step that reduces correlations among variables to be selected, making the ℓ1-regularization effective for datasets on which many methods for variable selection fail. The filtering step changes both the values of the predictor variables and output values by projections onto components obtained through a computationally-inexpensive principal components analysis. In this paper we demonstrate the usefulness of our method on synthetic datasets and on novel applications in virology. These include HIV viral load analysis based on patients' HIV sequences and immune types, as well as the analysis of seasonal variation in influenza death rates based on the regions of the influenza genome that undergo diversifying selection in the previous season.
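
    A hedged, loose sketch of the two-step structure described above: predictors are first "sifted" by projecting off leading shared principal components to reduce correlations among them, and an ℓ1-regularized regression is then fitted to the filtered data. The authors' specific projection and tuning are not reproduced; the synthetic data, number of removed components and penalty are illustrative.

      # Hedged sketch: PCA-based decorrelation followed by L1-regularized selection.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      n, p = 200, 100
      latent = rng.normal(size=(n, 5))
      X = latent @ rng.normal(size=(5, p)) + 0.5 * rng.normal(size=(n, p))  # correlated
      beta = np.zeros(p); beta[:4] = 2.0
      y = X @ beta + rng.normal(size=n)

      # filter step: remove the contribution of the leading shared components
      pca = PCA(n_components=3).fit(X)
      scores = pca.transform(X)
      X_f = X - pca.inverse_transform(scores)
      y_f = y - scores @ np.linalg.lstsq(scores, y, rcond=None)[0]

      selected = np.flatnonzero(Lasso(alpha=0.1).fit(X_f, y_f).coef_)
      print("selected variable indices:", selected)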

  14. Feature selection in feature network models: finding predictive subsets of features with the Positive Lasso.

    PubMed

    Frank, Laurence E; Heiser, Willem J

    2008-05-01

    A set of features is the basis for the network representation of proximity data achieved by feature network models (FNMs). Features are binary variables that characterize the objects in an experiment, with some measure of proximity as the response variable. Sometimes features are provided by theory and play an important role in the construction of the experimental conditions. In some research settings, however, the features are not known a priori. This paper shows how to generate features in this situation and how to select an adequate subset of features that achieves a good compromise between model fit and model complexity, using a new version of least angle regression that restricts coefficients to be non-negative, called the Positive Lasso. It is shown that features can be generated efficiently with Gray codes that are naturally linked to the FNMs. The model selection strategy makes use of the fact that the FNM can be considered as a univariate multiple regression model. A simulation study shows that the proposed strategy leads to satisfactory results if the number of objects is less than or equal to 22. If the number of objects is larger than 22, the number of features selected by our method exceeds the true number of features in some conditions.
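
    A hedged stand-in for the Positive Lasso step is shown below using scikit-learn's LassoLars with non-negative coefficients; the feature-generation step with Gray codes and the paper's exact model-selection criterion are omitted, and the binary feature matrix, true weights and penalty are illustrative.

      # Hedged sketch: non-negative LARS/lasso fit for a feature network model.
      import numpy as np
      from sklearn.linear_model import LassoLars

      rng = np.random.default_rng(0)
      features = rng.integers(0, 2, size=(40, 15)).astype(float)   # binary features
      weights = np.zeros(15); weights[[1, 4, 7]] = [0.8, 0.5, 0.3]
      proximity = features @ weights + 0.05 * rng.normal(size=40)

      fit = LassoLars(alpha=0.01, positive=True).fit(features, proximity)
      print("non-negative coefficients:", np.round(fit.coef_, 2))
      print("selected features:", np.flatnonzero(fit.coef_))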

  15. Dynamic selection mechanism for quality of service aware web services

    NASA Astrophysics Data System (ADS)

    D'Mello, Demian Antony; Ananthanarayana, V. S.

    2010-02-01

    A web service is an interface to a software component that can be accessed through standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers around the globe has produced numerous web services with the same or similar functionality. This necessitates tools and techniques to search for suitable services available on the Web. UDDI (universal description, discovery and integration) was the first initiative for finding suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects such as quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format that allows requesters to specify complex demands on QoS for web service selection. The authors define a tree structure called a quality constraint tree (QCT) to represent the requester's varied requirements on QoS properties with different preferences. The paper proposes a QoS-broker-based architecture for web service selection, which enables requesters to specify their QoS requirements and select a qualitatively optimal web service. A web service selection algorithm is presented that ranks functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS-broker-based system to demonstrate the correctness of the proposed web service selection mechanism.
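
    A hedged toy sketch of the final ranking step: functionally similar services are ordered by a weighted degree of satisfaction of the requester's QoS requirements. The broker architecture, the quality constraint tree and the handling of alternative constraints are not modelled; service names, QoS properties, thresholds and weights are invented for illustration.

      # Hedged sketch: rank candidate services by weighted QoS satisfaction.
      services = {
          "svcA": {"latency_ms": 120, "availability": 0.999, "cost": 0.02},
          "svcB": {"latency_ms": 60,  "availability": 0.990, "cost": 0.05},
          "svcC": {"latency_ms": 200, "availability": 0.995, "cost": 0.01},
      }
      # requester demands: property -> (threshold, weight, higher_is_better)
      requirements = {
          "latency_ms":   (150, 0.5, False),
          "availability": (0.995, 0.3, True),
          "cost":         (0.04, 0.2, False),
      }

      def satisfaction(qos):
          score = 0.0
          for prop, (threshold, weight, higher_better) in requirements.items():
              met = qos[prop] >= threshold if higher_better else qos[prop] <= threshold
              score += weight * (1.0 if met else 0.0)
          return score

      ranked = sorted(services, key=lambda s: satisfaction(services[s]), reverse=True)
      print(ranked)    # services ordered by degree of QoS satisfaction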

  16. The Adopted Adolescent. Selected Papers Number 55.

    ERIC Educational Resources Information Center

    Banning, Anne

    This review of studies on clinical and nonclinical populations explores outcomes of adoption and developmental issues for adolescents, and in particular, developmental problems for adopted adolescents. Studies on nonclinical populations demonstrate that adoption is a highly successful form of substitute care. Prospective longitudinal studies show…

  17. Adaptive Elastic Net for Generalized Methods of Moments.

    PubMed

    Caner, Mehmet; Zhang, Hao Helen

    2014-01-30

    Model selection and estimation are crucial parts of econometrics. This paper introduces a new technique that can simultaneously estimate and select the model in the generalized method of moments (GMM) context. The GMM is particularly powerful for analyzing complex data sets such as longitudinal and panel data, and it has wide applications in econometrics. This paper extends the least-squares-based adaptive elastic net estimator of Zou and Zhang (2009) to nonlinear equation systems with endogenous variables. The extension is not trivial and involves a new proof technique because the estimators lack closed-form solutions. Compared to the Bridge-GMM of Caner (2009), we allow the number of parameters to diverge to infinity as well as collinearity among a large number of variables, with the redundant parameters set to zero via a data-dependent technique. This method has the oracle property, meaning that we can estimate the nonzero parameters with their standard limit distribution while the redundant parameters are dropped from the equations simultaneously. Numerical examples are used to illustrate the performance of the new method.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz

    This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-emf and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.

  19. Development of Probiotic Formulation for the Treatment of Iron Deficiency Anemia.

    PubMed

    Korčok, Davor Jovan; Tršić-Milanović, Nada Aleksandar; Ivanović, Nevena Djuro; Đorđević, Brižita Ivan

    2018-04-01

    Probiotics are increasingly present both in functional foods and in pharmaceutical preparations, with multiple levels of action that contribute to human health. Probiotics exert their positive effects at a proper dose and by maintaining the declared number of probiotic cells until the expiration date. An important precondition for developing a probiotic product is the right choice of a clinically proven probiotic strain, the choice of the other active components, and the optimization of the quantity of the probiotic active component per product dose. This paper describes the optimization of the number of probiotic cells in the formulation of a dietary supplement that contains the probiotic culture Lactobacillus plantarum 299v, iron and vitamin C. Variations of the quantity of the active component were analyzed in development batches of the encapsulated probiotic product, categorized as a dietary supplement, with the following ingredients: probiotic culture, a sucrosomal form of iron, and vitamin C. An optimal quantity of 50 mg of the active component L. plantarum was selected. The purpose of this paper is to select the optimal formulation of the probiotic culture in a dietary supplement that contains iron and vitamin C, and to determine its expiration date by analysis of the number of viable probiotic cells.

  20. Quality Programming in H.P.E.R. Selected Papers Presented at the Convention of the Canadian Association for Health, Physical Education and Recreation (British Columbia, Canada, June 10-13, 1981). Physical Education Series Number 2.

    ERIC Educational Resources Information Center

    Jackson, John J., Ed.; Turkington, H. David, Ed.

    These papers, presented during the 1981 convention of the Canadian Association for Health, Physical Education and Recreation, addressed eight major topics: (1) the physical education and sport profession in Canada; (2) physical fitness (community agencies, radiology, aging and physical activity, the effective physical education program, aerobic…

  1. Best one hundred papers of International Orthopaedics: a bibliometric analysis.

    PubMed

    Mavrogenis, Andreas F; Megaloikonomos, Panayiotis D; Panagopoulos, Georgios N; Mauffrey, Cyril; Quaile, Andrew; Scarlat, Marius M

    2017-04-01

    International Orthopaedics was founded in 1977. Within the 40 volumes and 247 issues since its launch, 5462 scientific articles have been published. This article identifies, analyses and categorises the best cited articles published by the journal to date. We searched Elsevier Scopus database for citations of all papers published in International Orthopaedics since its foundation. Source title was selected, and the journal's title was introduced in the search engine. The identified articles were sorted based on their total number of received citations, forming a descending list from 1 to 100. Total citations and self-citations of all co-authors were recorded. Year of publication, number of co-authors, number of pages, country and institution of origin and study type were identified. The best 100 papers and their citations correspond approximately to 2% of all the journal's publications. Total citations ranged from 62 to 272; 26 papers had >100 citations, of which self-citations accounted for <4%. Mean authorship number per paper was four and mean page number 6.5. United States, Japan and Germany ranked the top three countries of origin. The most common study type was case series, and most common topics were adult reconstruction, sports medicine and trauma. This article identifies topics, authors and institutions that contributed with their high-quality work in the journal's development over time. International Orthopaedics remains faithful to its authors and readers by publishing topical, well-written articles in excellent English.

  2. Toddlers. Selected Papers Number 58.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    Discussed are adults' egocentric attitudes about children, particularly toddlers, and ways in which such attitudes can create unnecessary stress in the caregiver and toddler. Emphasis is given to: (1) hostile myths about toddlers that obscure reality and muddy relationships; (2) misunderstandings about ways in which toddlers think; (3) young…

  3. Influence of single particle orbital sets and configuration selection on multideterminant wavefunctions in quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Raymond C.; Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, California 94550; Morales, Miguel A., E-mail: moralessilva2@llnl.gov

    2015-06-21

    Multideterminant wavefunctions, while having a long history in quantum chemistry, are increasingly being used in highly accurate quantum Monte Carlo calculations. Since the accuracy of QMC is ultimately limited by the quality of the trial wavefunction, multi-Slater-determinant wavefunctions offer an attractive alternative to Slater-Jastrow and more sophisticated wavefunction ansätze, for several reasons. They can be efficiently calculated, straightforwardly optimized, and systematically improved by increasing the number of included determinants. In spite of their potential, however, the convergence properties of multi-Slater-determinant wavefunctions with respect to orbital set choice and excited determinant selection are poorly understood, which hinders the application of these wavefunctions to large systems and solids. In this paper, by performing QMC calculations on the equilibrium and stretched carbon dimer, we find that the convergence of the recovered correlation energy with respect to the number of determinants can depend quite strongly on the basis set and determinant selection methods, especially where there is strong correlation. We demonstrate that properly chosen orbital sets and determinant selection techniques from quantum chemistry methods can dramatically reduce the required number of determinants (and thus the computational cost) to reach a given accuracy, which we argue shows a clear need for an automatic QMC-only method for selecting determinants and generating optimal orbital sets.

  4. Measuring science: An exploration

    PubMed Central

    Adams, James; Griliches, Zvi

    1996-01-01

    This paper examines the available United States data on academic research and development (R&D) expenditures and the number of papers published and the number of citations to these papers as possible measures of “output” of this enterprise. We look at these numbers for science and engineering as a whole, for five selected major fields, and at the individual university field level. The published data in Science and Engineering Indicators imply sharply diminishing returns to academic R&D using published papers as an “output” measure. These data are quite problematic. Using a newer set of data on papers and citations, based on an “expanding” set of journals and the newly released Bureau of Economic Analysis R&D deflators, changes the picture drastically, eliminating the appearance of diminishing returns but raising the question of why the input prices of academic R&D are rising so much faster than either the gross domestic product deflator or the implicit R&D deflator in industry. A production function analysis of such data at the individual field level follows. It indicates significant diminishing returns to “own” R&D, with the R&D coefficients hovering around 0.5 for estimates with paper numbers as the dependent variable and around 0.6 if total citations are used as the dependent variable. When we substitute scientists and engineers in place of R&D as the right-hand side variables, the coefficient on papers rises from 0.5 to 0.8, and the coefficient on citations rises from 0.6 to 0.9, indicating systematic measurement problems with R&D as the sole input into the production of scientific output. But allowing for individual university field effects drives these numbers down significantly below unity. Because in the aggregate both paper numbers and citations are growing as fast or faster than R&D, this finding can be interpreted as leaving a major, yet unmeasured, role for the contribution of spillovers from other fields, other universities, and other countries. PMID:8917477

  5. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
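
    For concreteness, a hedged sketch of the classical DerSimonian and Laird estimate that the paper cautions against with few studies is given below; the likelihood and non-likelihood alternatives it reviews are not implemented, and the five-study toy data are invented.

      # Hedged sketch: DerSimonian-Laird random-effects pooled estimate.
      import numpy as np

      def dersimonian_laird(effects, variances):
          effects = np.asarray(effects, dtype=float)
          variances = np.asarray(variances, dtype=float)
          w = 1.0 / variances                              # fixed-effect weights
          mu_fe = np.sum(w * effects) / np.sum(w)
          q = np.sum(w * (effects - mu_fe) ** 2)           # Cochran's Q
          k = len(effects)
          tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
          w_re = 1.0 / (variances + tau2)                  # random-effects weights
          mu_re = np.sum(w_re * effects) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return mu_re, se, tau2

      # toy example with only five studies (the small-k regime discussed above)
      print(dersimonian_laird([0.10, 0.30, 0.25, -0.05, 0.40],
                              [0.02, 0.03, 0.01, 0.04, 0.05]))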

  6. Statistical molecular design of building blocks for combinatorial chemistry.

    PubMed

    Linusson, A; Gottfries, J; Lindgren, F; Wold, S

    2000-04-06

    The size of a combinatorial library can be reduced in two ways: by basing the selection on the building blocks (BBs) or by basing it on the full set of virtually constructed products. In this paper we have investigated the effects of applying statistical designs to BB sets compared to selections based on the final products. The two sets of BBs and the virtually constructed library were described by structural parameters, and the correlation between the two characterizations was investigated. Three different selection approaches were used both for the BB sets and for the products. In the first two, the selection algorithms were applied directly to the data sets (D-optimal design and space-filling design), while for the third a cluster analysis preceded the selection (cluster-based design). The selections were compared using visual inspection, the Tanimoto coefficient, the Euclidean distance, the condition number, and the determinant of the resulting data matrix. No difference in efficiency was found between selections made in the BB space and in the product space. However, it is of critical importance to investigate the BB space carefully and to select an appropriate number of BBs to obtain adequate diversity. An example from the pharmaceutical industry is then presented, where selection via BBs was made using a cluster-based design.

  7. Improved training for target detection using Fukunaga-Koontz transform and distance classifier correlation filter

    NASA Astrophysics Data System (ADS)

    Elbakary, M. I.; Alam, M. S.; Aslan, M. S.

    2008-03-01

    In a FLIR image sequence, a target may disappear permanently or may reappear after some frames, and crucial information related to the target, such as direction, position, and size, is lost. If the target reappears in a later frame, it may not be tracked again because the 3D orientation, size, and location of the target might have changed. To obtain information about the target before it disappears and to detect the target after it reappears, the distance classifier correlation filter (DCCF) is traditionally trained manually by selecting a number of chips randomly. This paper introduces a novel idea that eliminates the manual intervention in the training phase of the DCCF. Instead of selecting the training chips manually and choosing the number of training chips randomly, we adopt the K-means algorithm to cluster the training frames and, based on the number of clusters, select the training chips such that there is one training chip for each cluster. To detect and track the target after it reappears in the field of view, TBF and DCCF are employed. The conducted experiments using real FLIR sequences show results similar to the traditional algorithm, but eliminating the manual intervention is the advantage of the proposed algorithm.
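
    The cluster-based chip selection described above can be illustrated with a small sketch; this is not the authors' FLIR pipeline, and the frame descriptors, cluster count, and helper name below are hypothetical stand-ins.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_training_chips(frame_features, n_clusters):
    """Cluster per-frame descriptors and pick one representative frame per cluster.

    frame_features : (n_frames, n_features) array of per-frame descriptors
                     (illustrative; the paper works on FLIR frames).
    Returns indices of the frames closest to each cluster centroid, which would
    serve as the training chips for the DCCF.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(frame_features)
    chips = []
    for k in range(n_clusters):
        members = np.where(km.labels_ == k)[0]
        d = np.linalg.norm(frame_features[members] - km.cluster_centers_[k], axis=1)
        chips.append(int(members[np.argmin(d)]))   # frame nearest the centroid
    return sorted(chips)

# Hypothetical descriptors for 200 frames, 3 clusters -> 3 training chips.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 16))
print(select_training_chips(features, n_clusters=3))
```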

  8. An Intelligent Ensemble Neural Network Model for Wind Speed Prediction in Renewable Energy Systems.

    PubMed

    Ranganayaki, V; Deepa, S N

    2016-01-01

    Various criteria are proposed to select the number of hidden neurons in artificial neural network (ANN) models, and based on the evolved criteria an intelligent ensemble neural network model is proposed to predict wind speed in renewable energy applications. The intelligent ensemble neural model for wind speed forecasting is designed by averaging the forecasted values from multiple neural network models, which include the multilayer perceptron (MLP), multilayer adaptive linear neuron (Madaline), back propagation neural network (BPN), and probabilistic neural network (PNN), so as to obtain better accuracy in wind speed prediction with minimum error. Random selection of the number of hidden neurons in an artificial neural network results in overfitting or underfitting problems. This paper aims to avoid the occurrence of overfitting and underfitting problems. The selection of the number of hidden neurons is done in this paper employing 102 criteria; these evolved criteria are verified by various computed error values. The proposed criteria for fixing the number of hidden neurons are validated employing the convergence theorem. The proposed intelligent ensemble neural model is applied to wind speed prediction considering real-time wind data collected from nearby locations. The obtained simulation results substantiate that the proposed ensemble model reduces the error value to a minimum and enhances the accuracy. The computed results prove the effectiveness of the proposed ensemble neural network (ENN) model with respect to the considered error factors in comparison with earlier models available in the literature.
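
    The ensemble step, averaging the forecasts of several trained networks, can be sketched as follows; this is an illustration only, using two MLP regressors on synthetic wind-speed data in place of the paper's MLP/Madaline/BPN/PNN members.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def ensemble_wind_forecast(models, X):
    """Average the forecasts of several trained models (the ensemble step)."""
    return np.mean([m.predict(X) for m in models], axis=0)

# Toy wind-speed series: the previous two lags predict the next value (illustrative only).
rng = np.random.default_rng(1)
t = np.arange(500)
speed = 8 + 2 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 0.3, t.size)
X = np.column_stack([speed[:-2], speed[1:-1]])
y = speed[2:]

# Two member networks with different hidden-layer sizes stand in for the ensemble members.
members = [
    MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000, random_state=0).fit(X, y)
    for h in (5, 15)
]
print(ensemble_wind_forecast(members, X[:3]))
```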

  9. An Intelligent Ensemble Neural Network Model for Wind Speed Prediction in Renewable Energy Systems

    PubMed Central

    Ranganayaki, V.; Deepa, S. N.

    2016-01-01

    Various criteria are proposed to select the number of hidden neurons in artificial neural network (ANN) models, and based on the evolved criteria an intelligent ensemble neural network model is proposed to predict wind speed in renewable energy applications. The intelligent ensemble neural model for wind speed forecasting is designed by averaging the forecasted values from multiple neural network models, which include the multilayer perceptron (MLP), multilayer adaptive linear neuron (Madaline), back propagation neural network (BPN), and probabilistic neural network (PNN), so as to obtain better accuracy in wind speed prediction with minimum error. Random selection of the number of hidden neurons in an artificial neural network results in overfitting or underfitting problems. This paper aims to avoid the occurrence of overfitting and underfitting problems. The selection of the number of hidden neurons is done in this paper employing 102 criteria; these evolved criteria are verified by various computed error values. The proposed criteria for fixing the number of hidden neurons are validated employing the convergence theorem. The proposed intelligent ensemble neural model is applied to wind speed prediction considering real-time wind data collected from nearby locations. The obtained simulation results substantiate that the proposed ensemble model reduces the error value to a minimum and enhances the accuracy. The computed results prove the effectiveness of the proposed ensemble neural network (ENN) model with respect to the considered error factors in comparison with earlier models available in the literature. PMID:27034973

  10. Integrated manufacturing flow for selective-etching SADP/SAQP

    NASA Astrophysics Data System (ADS)

    Ali, Rehab Kotb; Fatehy, Ahmed Hamed; Word, James

    2018-03-01

    Printing the cut mask in SAMP (Self-Aligned Multi Patterning) is very challenging at advanced nodes. One of the proposed solutions is to print the cut shapes selectively, which means the design is decomposed into mandrel tracks, mandrel cuts, and non-mandrel cuts. The mandrel and non-mandrel cuts are mutually independent, which relaxes the spacing constraints and, as a consequence, allows denser metal lines. In this paper, we propose the manufacturing flow of a selective-etching process. The results are quantified by measuring the PV band, EPE, and the number of hard bridging and pinching defects across the layout.

  11. Electrode channel selection based on backtracking search optimization in motor imagery brain-computer interfaces.

    PubMed

    Dai, Shengfa; Wei, Qingguo

    2017-01-01

    The common spatial pattern algorithm is widely used to estimate spatial filters in motor imagery based brain-computer interfaces. However, using a large number of channels makes common spatial pattern prone to over-fitting and the classification of electroencephalographic signals time-consuming. To overcome these problems, it is necessary to choose an optimal subset of all channels to save computational time and improve classification accuracy. In this paper, a novel method based on the backtracking search optimization algorithm is proposed to automatically select the optimal channel set for common spatial pattern. Each individual in the population is an N-dimensional vector, with each component representing one channel. A population of binary codes is generated randomly at the beginning, and channels are then selected according to the evolution of these codes. The number and positions of 1's in a code denote the number and positions of the chosen channels. The objective function of the backtracking search optimization algorithm is defined as a combination of the classification error rate and the relative number of channels. Experimental results suggest that higher classification accuracy can be achieved with far fewer channels compared to standard common spatial pattern with all channels.
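
    A minimal sketch of the objective described above (classification error plus a penalty on the relative number of channels) is given below; the weight, channel count, and stand-in error function are hypothetical, and the real objective would wrap the full CSP-plus-classifier pipeline.

```python
import numpy as np

def channel_fitness(mask, error_rate_fn, total_channels, weight=0.1):
    """Fitness of one binary channel code: classification error rate plus a
    penalty proportional to the relative number of selected channels.

    mask          : binary vector, 1 = channel selected
    error_rate_fn : callable returning the classifier error for that subset
                    (stands in for the full CSP + classification step)
    """
    n_sel = int(np.sum(mask))
    if n_sel == 0:
        return np.inf                           # an empty subset is invalid
    return error_rate_fn(mask) + weight * n_sel / total_channels

# Hypothetical error function: pretend 10 of 59 channels are informative.
rng = np.random.default_rng(2)
informative = rng.choice(59, size=10, replace=False)

def fake_error(mask):
    hits = np.intersect1d(np.flatnonzero(mask), informative).size
    return 0.5 - 0.04 * hits                    # more informative channels -> lower error

mask = np.zeros(59, dtype=int)
mask[informative[:6]] = 1
print(channel_fitness(mask, fake_error, total_channels=59))
```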

  12. Structural Design of a 4-Meter Off-Axis Space Telescope for the Habitable-Zone Exoplanet Direct Imaging Mission

    NASA Technical Reports Server (NTRS)

    Arnold, William, Sr.; Stahl, H Philip

    2017-01-01

    This design study was conducted to support the HABEX project. A number of companion papers at this conference go into detail on the HABEX goals. The objective of this paper is to establish a baseline primary mirror design that satisfies the structure-related requirements: the designs in this study have a high TRL (Technology Readiness Level), realistic manufacturing limits, and performance in line with the HABEX mission. A secondary goal of the study was to evaluate a number of competing criteria for the selection, such as the differences between the on-axis and off-axis static and dynamic responses to disturbances. This study concentrates on the structural behavior; companion papers cover the thermal and long-term stability aspects of the problem.

  13. Paper- or Web-Based Questionnaire Invitations as a Method for Data Collection: Cross-Sectional Comparative Study of Differences in Response Rate, Completeness of Data, and Financial Cost

    PubMed Central

    Huibers, Linda; Christensen, Bo; Christensen, Morten Bondo

    2018-01-01

    Background: Paper questionnaires have traditionally been the first choice for data collection in research. However, declining response rates over the past decade have increased the risk of selection bias in cross-sectional studies. The growing use of the Internet offers new ways of collecting data, but trials using Web-based questionnaires have so far seen mixed results. A secure, online digital mailbox (e-Boks) linked to a civil registration number became mandatory for all Danish citizens in 2014 (exemption granted only in extraordinary cases). Approximately 89% of the Danish population have a digital mailbox, which is used for correspondence with public authorities. Objective: We aimed to compare response rates, completeness of data, and financial costs for different invitation methods: traditional surface mail and digital mail. Methods: We designed a cross-sectional comparative study. An invitation to participate in a survey on help-seeking behavior in out-of-hours care was sent to two groups of randomly selected citizens from age groups 30-39 and 50-59 years and parents of children aged 0-4 years using either traditional surface mail (paper group) or digital mail sent to a secure online mailbox (digital group). Costs per respondent were measured by adding up all costs for handling, dispatch, printing, and work salary and then dividing the total figure by the number of respondents. Data completeness was assessed by comparing the number of missing values between the two methods. Socioeconomic variables (age, gender, family income, education duration, immigrant status, and job status) were compared both between respondents and nonrespondents and within these groups to evaluate the degree of selection bias. Results: A total of 3600 citizens were invited in each group; 1303 (36.29%) responded to the digital invitation and 1653 (45.99%) to the paper invitation (difference 9.66%, 95% CI 7.40-11.92). The costs were €1.51 per respondent for the digital group and €15.67 for paper group respondents. Paper questionnaires generally had more missing values; this was significant in five of 17 variables (P<.05). Substantial differences were found in the socioeconomic variables between respondents and nonrespondents, whereas only minor differences were seen within the groups of respondents and nonrespondents. Conclusions: Although we found lower response rates for Web-based invitations, this solution was more cost-effective (by a factor of 10) and had slightly lower numbers of missing values than questionnaires sent with paper invitations. Analyses of socioeconomic variables showed almost no difference between nonrespondents in both groups, which could imply that the lower response rate in the digital group does not necessarily increase the level of selection bias. Invitations to questionnaire studies via digital mail may be an excellent option for collecting research data in the future. This study may serve as the foundational pillar of digital data collection in health care research in Scandinavia and other countries considering implementing similar systems. PMID:29362206

  14. Paper- or Web-Based Questionnaire Invitations as a Method for Data Collection: Cross-Sectional Comparative Study of Differences in Response Rate, Completeness of Data, and Financial Cost.

    PubMed

    Ebert, Jonas Fynboe; Huibers, Linda; Christensen, Bo; Christensen, Morten Bondo

    2018-01-23

    Paper questionnaires have traditionally been the first choice for data collection in research. However, declining response rates over the past decade have increased the risk of selection bias in cross-sectional studies. The growing use of the Internet offers new ways of collecting data, but trials using Web-based questionnaires have so far seen mixed results. A secure, online digital mailbox (e-Boks) linked to a civil registration number became mandatory for all Danish citizens in 2014 (exemption granted only in extraordinary cases). Approximately 89% of the Danish population have a digital mailbox, which is used for correspondence with public authorities. We aimed to compare response rates, completeness of data, and financial costs for different invitation methods: traditional surface mail and digital mail. We designed a cross-sectional comparative study. An invitation to participate in a survey on help-seeking behavior in out-of-hours care was sent to two groups of randomly selected citizens from age groups 30-39 and 50-59 years and parents of children aged 0-4 years using either traditional surface mail (paper group) or digital mail sent to a secure online mailbox (digital group). Costs per respondent were measured by adding up all costs for handling, dispatch, printing, and work salary and then dividing the total figure by the number of respondents. Data completeness was assessed by comparing the number of missing values between the two methods. Socioeconomic variables (age, gender, family income, education duration, immigrant status, and job status) were compared both between respondents and nonrespondents and within these groups to evaluate the degree of selection bias. A total of 3600 citizens were invited in each group; 1303 (36.29%) responded to the digital invitation and 1653 (45.99%) to the paper invitation (difference 9.66%, 95% CI 7.40-11.92). The costs were €1.51 per respondent for the digital group and €15.67 for paper group respondents. Paper questionnaires generally had more missing values; this was significant in five of 17 variables (P<.05). Substantial differences were found in the socioeconomic variables between respondents and nonrespondents, whereas only minor differences were seen within the groups of respondents and nonrespondents. Although we found lower response rates for Web-based invitations, this solution was more cost-effective (by a factor of 10) and had slightly lower numbers of missing values than questionnaires sent with paper invitations. Analyses of socioeconomic variables showed almost no difference between nonrespondents in both groups, which could imply that the lower response rate in the digital group does not necessarily increase the level of selection bias. Invitations to questionnaire studies via digital mail may be an excellent option for collecting research data in the future. This study may serve as the foundational pillar of digital data collection in health care research in Scandinavia and other countries considering implementing similar systems. ©Jonas Fynboe Ebert, Linda Huibers, Bo Christensen, Morten Bondo Christensen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 23.01.2018.

  15. View from OSERS. [Question-and-Answer Session.

    ERIC Educational Resources Information Center

    Kaufman, Martin J.

    The paper describes the perspective of the Office of Special Education and Rehabilitative Services on special education research. The process for reviewing research proposals is explained, along with procedures for selecting experts in the field for creating registers of reviewers. Also discussed are the number of points allocated to different…

  16. Investigating the Advantages of Constructing Multidigit Numeration Understanding through Oneida and Lakota Native Languages.

    ERIC Educational Resources Information Center

    Hankes, Judith Elaine

    This paper documents a culturally specific language strength for developing number sense among Oneida- and Lakota-speaking primary students. Qualitative research methods scaffolded this research study: culture informants were interviewed and interviews were transcribed and coded for analysis; culture documents were selected for analysis; and…

  17. Understanding Crying in Infancy. Selected Papers Number 43.

    ERIC Educational Resources Information Center

    Hope, Margaret

    Presented are practical ideas parents can use to prevent and manage excessive crying during their infant's first year. Designed to foster the development of children's physical and emotional independence, the ideas discussed concern (1) the changing functions of crying throughout the infant's first year, (2) causes of crying, (3) the intellectual…

  18. Selected Bibliography of Polish Educational Materials. Volume 11, Number 4, 1972.

    ERIC Educational Resources Information Center

    Wieczorek, Barbara, Ed.; Krajewska, Korolina, Ed.

    The publication is an annotated bibliography of books, papers, and articles on Polish education which were published from September to November, 1972. Entries are arranged alphabetically by author under the following subjects: History of Education, Laws and Legislation, General Information on Education, Social and Educational Sciences, Teacher's…

  19. The Sibling Bond. Sibling Conflict and Sexuality. Selected Papers, Number 57.

    ERIC Educational Resources Information Center

    Martin, June

    Perhaps the most obvious feature of the relationship between siblings is its diversity. Sibling relationships differ from peer relationships because of their frequency and amount or interaction; the durability of the relationships; the existence of prescribed roles; accessibility; and the degree of common experience. Conflict between siblings in…

  20. How Children Think. Unit for Child Studies. Selected Papers Number 30.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    In four parts, this discussion describes characteristics of the thought of infants, preschool children, primary school students, and adolescents. Topics briefly addressed in part I, on the thought processes/capabilities of babies, concern sensorimotor thought without abstraction, the importance of physical exploration, the development of…

  1. Collection Evaluation Techniques: A Short, Selective, Practical, Current, Annotated Bibliography, 1990-1998. RUSA Occasional Papers Number 24.

    ERIC Educational Resources Information Center

    Strohl, Bonnie, Comp.

    This bibliography contains annotations of 110 journal articles on topics related to library collection evaluation techniques, including academic library collections, access-vs-ownership, "Books for College Libraries," business collections, the OCLC/AMIGOS Collection Analysis CD, circulation data, citation-checking, collection bias,…

  2. Program Evaluation in Alternative Education: An Annotated Bibliography. Teacher Education Forum; Volume 4, Number 17.

    ERIC Educational Resources Information Center

    Esp, Barbara

    The Forum Series is a collection of papers dealing with all phases of teacher education including inservice training and graduate study. This selection is an annotated bibliography in two parts: (1) Evaluation Issues and Methods; and (2) Studies of Alternative Environments. (DMT)

  3. Republication of: Large number coincidences and the anthropic principle in cosmology

    NASA Astrophysics Data System (ADS)

    Carter, Brandon

    2011-11-01

    This is a reprinting of the paper by Brandon Carter, first published in a little-known volume of conference proceedings in 1974, that moved the anthropic principle from the realm of philosophical speculations to the subject of theoretical physics. The paper has been selected by the Editors of General Relativity and Gravitation for re-publication in the Golden Oldies series of the journal. This republication is accompanied by an editorial note written by George Ellis.

  4. Working Together to Educate about the Environment. Selected Papers from the Joint Conference of the North American Association for Environmental Education and the Conservation Education Association (Estes Park, Colorado, August 18-23, 1989).

    ERIC Educational Resources Information Center

    Gross, Michael P., Ed.; And Others

    Proceedings of a conference on environmental education are presented in this document. Featured at the conference were four general sessions, a number of additional invited presentations, three symposia, four workshops, and over 170 contributed presentations. The purpose of this volume is to provide a record of the papers presented at the…

  5. Fifty years of hemodialysis access literature: The fifty most cited publications in the medical literature.

    PubMed

    Skripochnik, Edvard; O'Connor, David J; Trestman, Eric B; Lipsitz, Evan C; Scher, Larry A

    2018-02-01

    Objectives: The modern era of hemodialysis access surgery began with the publication in 1966 by Brescia et al. describing the use of a surgically created arteriovenous fistula. Since then, the number of patients on chronic hemodialysis and the number of publications dealing with hemodialysis access have steadily increased. We have chronicled the increase in publications in the medical literature dealing with hemodialysis access by evaluating the characteristics of the 50 most cited articles. Methods: We queried the Science Citation Index from the years 1960-2014. Articles were selected based on a subject search and were ranked according to the number of times they were cited in the medical literature. Results: The 50 most frequently cited articles were selected for further analysis and the number of annual publications was tracked. The landmark publication by Dr Brescia et al. was unequivocally the most cited article dealing with hemodialysis access (1109 citations). The subject matter of the papers included AV fistula and graft (9), hemodialysis catheter (9), complications and outcomes (24), and other topics (8). Most articles were published in nephrology journals (33), with fewer in surgery (7), medicine (7), and radiology (3) journals. Of the 17 journals represented, Kidney International was the clear leader, publishing 18 articles. There has been an exponential rise in the frequency of publications regarding dialysis access, with 42 of the 50 analyzed papers authored after 1990. Conclusion: As the number of patients on hemodialysis has increased dramatically over the past five decades, there has been a commensurate increase in the overall number of publications related to hemodialysis access.

  6. Collected software engineering papers, volume 12

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  7. Collected software engineering papers, volume 11

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  8. Research on the selection of innovation compound using Possibility Construction Space Theory and fuzzy pattern recognition

    NASA Astrophysics Data System (ADS)

    Xie, Songhua; Li, Dehua; Nie, Hui

    2009-10-01

    There are a large number of fuzzy concepts and fuzzy phenomena in traditional Chinese medicine, which have led to great difficulties for the study of traditional Chinese medicine. In this paper, mathematical methods are used to quantify fuzzy concepts of drugs and prescriptions. We put forward a process for generating and selecting innovative formulations in Chinese medicine based on the Possibility Construction Space Theory (PCST) and fuzzy pattern recognition. Experimental results show that the method of selecting medicines from a number of characteristics of traditional Chinese medicine is consistent with the basic theory of traditional Chinese medicine. The results also reflect the integrated effects of the innovation compound. Through the use of the innovative formulation system, we expect to provide software tools for developing new traditional Chinese medicines and to inspire traditional Chinese medicine researchers to develop novel drugs.

  9. Case Mix Management Systems: An Opportunity to Integrate Medical Records and Financial Management System Data Bases

    PubMed Central

    Rusnak, James E.

    1987-01-01

    Due to previous system selections, many hospitals (health care facilities) are faced with the problem of fragmented databases containing clinical, demographic, and financial information. Projects to select and implement a Case Mix Management System (CMMS) provide an opportunity to reduce the number of separate physical files and to migrate towards systems with an integrated database. The number of CMMS candidate systems is often restricted due to database and system interface issues. The hospital must ensure the CMMS project provides a means to implement an integrated on-line hospital information database for use by departments operating under a DRG-based Prospective Payment System. This paper presents guidelines for selecting a Case Mix Management System to meet the hospital's financial and operations planning, budgeting, marketing, and other management needs, while considering the database implications of the implementation.

  10. Selection of test paths for solder joint intermittent connection faults under DC stimulus

    NASA Astrophysics Data System (ADS)

    Huakang, Li; Kehong, Lv; Jing, Qiu; Guanjun, Liu; Bailiang, Chen

    2018-06-01

    The test path of solder joint intermittent connection faults under direct-current stimulus is examined in this paper. According to the physical structure of the circuit, a network model is established first. A network node is utilised to represent the test node. The path edge refers to the number of intermittent connection faults in the path. Then, the selection criteria of the test path based on the node degree index are proposed and the solder joint intermittent connection faults are covered using fewer test paths. Finally, three circuits are selected to verify the method. To test if the intermittent fault is covered by the test paths, the intermittent fault is simulated by a switch. The results show that the proposed method can detect the solder joint intermittent connection fault using fewer test paths. Additionally, the number of detection steps is greatly reduced without compromising fault coverage.

  11. An Approximate Approach to Automatic Kernel Selection.

    PubMed

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.

  12. The influence of Reynolds numbers on resistance properties of jet pumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geng, Q.; Graduate University of Chinese Academy of Sciences, Beijing 100049; Zhou, G.

    2014-01-29

    Jet pumps are widely used in thermoacoustic Stirling heat engines and pulse tube cryocoolers to eliminate the effect of Gedeon streaming. The resistance properties of jet pumps are principally influenced by their structures and flow regimes, which are always characterized by Reynolds numbers. In this paper, a jet pump whose cross section contracts abruptly is selected as our research subject. Based on linear thermoacoustic theory, a CFD model is built, and the oscillating flow of the working gas is simulated and analyzed at different Reynolds numbers in the jet pump. According to the calculations, the influence of different structures and Reynolds numbers on the resistance properties of the jet pump is analyzed and presented. The results show that Reynolds numbers have a great influence on the resistance properties of jet pumps and that some widely used empirical formulas are unsuitable for oscillating flow at small Reynolds numbers. This paper provides a more comprehensive understanding of the resistance properties of jet pumps with oscillating flow and is significant for the design of jet pumps in practical thermoacoustic engines and refrigerators.

  13. The influence of Reynolds numbers on resistance properties of jet pumps

    NASA Astrophysics Data System (ADS)

    Geng, Q.; Zhou, G.; Li, Q.

    2014-01-01

    Jet pumps are widely used in thermoacoustic Stirling heat engines and pulse tube cryocoolers to eliminate the effect of Gedeon streaming. The resistance properties of jet pumps are principally influenced by their structures and flow regimes, which are always characterized by Reynolds numbers. In this paper, a jet pump whose cross section contracts abruptly is selected as our research subject. Based on linear thermoacoustic theory, a CFD model is built, and the oscillating flow of the working gas is simulated and analyzed at different Reynolds numbers in the jet pump. According to the calculations, the influence of different structures and Reynolds numbers on the resistance properties of the jet pump is analyzed and presented. The results show that Reynolds numbers have a great influence on the resistance properties of jet pumps and that some widely used empirical formulas are unsuitable for oscillating flow at small Reynolds numbers. This paper provides a more comprehensive understanding of the resistance properties of jet pumps with oscillating flow and is significant for the design of jet pumps in practical thermoacoustic engines and refrigerators.

  14. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of utilizing the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the computational resources of the cluster when using a greater number of computational cores. Simulation results indicate that if the number of cores used is not a multiple of the total number of cores per cluster node, there are allocation strategies that provide more efficient calculations.

  15. Prevalence and Diagnosis of Molar-Incisor- Hypomineralisation (MIH): A systematic review.

    PubMed

    Jälevik, B

    2010-04-01

    The aim was to review the published literature, to point out shortcomings, and to suggest areas in need of improvement concerning the diagnosis and prevalence of MIH. A broad search of the PubMed database was conducted. Relevant papers published in English were identified after a review of their titles and abstracts or full reading of the papers. Papers were selected if the number of children with at least one first permanent molar affected by demarcated opacities could be deciphered. Targeted publications were critically assessed by the author concerning examination criteria, selection and character of the study groups, examiners' calibration, and result presentation. The initial search revealed 414 papers, of which 24 met the inclusion criteria. A wide variation in defect prevalence (2.4-40.2%) was reported. Cross-comparison of the results of the various studies was difficult because of the use of different indices and criteria, examination variability, methods of recording, and different age groups. Further standardization of study design and methods is needed to make the results comparable.

  16. Current Scientific Impact of Ss Cyril and Methodius University of Skopje, Republic of Macedonia in the Scopus Database (1960-2014).

    PubMed

    Spiroski, Mirko

    2015-03-15

    The aim of this study was to analyze the current scientific impact of Ss Cyril and Methodius University of Skopje, Republic of Macedonia in the Scopus Database (1960-2014). An affiliation search of the Scopus database was performed on November 23, 2014 in order to identify published papers from the Ss Cyril and Methodius University of Skopje (UC&M), Republic of Macedonia. A total number of 3960 articles (3055 articles from UC&M, 861 articles from the Faculty of Medicine, UC&M, and 144 articles from the Faculty of Pharmacy, UC&M) were selected for analysis (1960-2014). SCImago Journal Rank (SJR), Source Normalized Impact per Paper (SNIP) and h-index were calculated from the Scopus database. The number of published papers increased sharply, with a maximum of 379 papers in 2012. The largest number of papers was published in Macedonian Journal of Medical Sciences, Journal of Molecular Structure, Lecture Notes in Computer Science, Acta Pharmaceutica, and Macedonian Journal of Chemistry and Chemical Engineering. The journal with the highest SJR and SNIP was Nephrology Dialysis Transplantation. The first three places among the top ten authors belong to Dimirovski GM, Gavrilovska L, and Gusev M. The top three places based on Scopus h-index (total number of published papers) belong to Kocarev L, Stafilov T, and Polenakovic M. The majority of papers originate from UC&M, but significant numbers of papers are affiliated with the Faculty of Medicine, the Faculty of Pharmacy, and the Institute of Chemistry as members of UC&M, as well as the Macedonian Academy of Sciences and Arts. Articles are the dominant document type, followed by conference papers and review articles. Medicine is the most represented subject. Officials of the Ss Cyril and Methodius University of Skopje should undertake more effective and proactive policies toward journal publishers and their Editorial Boards in order to include more journals from UC&M in the Scopus database.

  17. Current Scientific Impact of Ss Cyril and Methodius University of Skopje, Republic of Macedonia in the Scopus Database (1960-2014)

    PubMed Central

    Spiroski, Mirko

    2015-01-01

    AIM: The aim of this study was to analyze the current scientific impact of Ss Cyril and Methodius University of Skopje, Republic of Macedonia in the Scopus Database (1960-2014). MATERIAL AND METHODS: An affiliation search of the Scopus database was performed on November 23, 2014 in order to identify published papers from the Ss Cyril and Methodius University of Skopje (UC&M), Republic of Macedonia. A total number of 3960 articles (3055 articles from UC&M, 861 articles from the Faculty of Medicine, UC&M, and 144 articles from the Faculty of Pharmacy, UC&M) were selected for analysis (1960-2014). SCImago Journal Rank (SJR), Source Normalized Impact per Paper (SNIP) and h-index were calculated from the Scopus database. RESULTS: The number of published papers increased sharply, with a maximum of 379 papers in 2012. The largest number of papers was published in Macedonian Journal of Medical Sciences, Journal of Molecular Structure, Lecture Notes in Computer Science, Acta Pharmaceutica, and Macedonian Journal of Chemistry and Chemical Engineering. The journal with the highest SJR and SNIP was Nephrology Dialysis Transplantation. The first three places among the top ten authors belong to Dimirovski GM, Gavrilovska L, and Gusev M. The top three places based on Scopus h-index (total number of published papers) belong to Kocarev L, Stafilov T, and Polenakovic M. The majority of papers originate from UC&M, but significant numbers of papers are affiliated with the Faculty of Medicine, the Faculty of Pharmacy, and the Institute of Chemistry as members of UC&M, as well as the Macedonian Academy of Sciences and Arts. Articles are the dominant document type, followed by conference papers and review articles. Medicine is the most represented subject. CONCLUSION: Officials of the Ss Cyril and Methodius University of Skopje should undertake more effective and proactive policies toward journal publishers and their Editorial Boards in order to include more journals from UC&M in the Scopus database. PMID:27275188

  18. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshift. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.

  19. French Immersion Programs Across Canada: The Influence of Cumulative Amounts of Time, Starting Age and Yearly Time Allotment on the Learning of French. A Review of Evaluations of French Immersion Programs. Research Report 81-12.

    ERIC Educational Resources Information Center

    MacNab, G. L.

    This paper reviews the results of research on various programs for learning French as a second language from kindergarten to grade 11 or 12 in selected Canadian school systems. Generally, it examines the effects of a number of factors on French learning, such as student selection procedures and student ability, starting age, total amount of time…

  20. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  1. Markowitz portfolio optimization model employing fuzzy measure

    NASA Astrophysics Data System (ADS)

    Ramli, Suhailywati; Jaaman, Saiful Hafizah

    2017-04-01

    Markowitz introduced the mean-variance methodology for portfolio selection problems in 1952. His pioneering research has shaped the portfolio risk-return model and become one of the most important research fields in modern finance. This paper extends the classical Markowitz mean-variance portfolio selection model by applying a fuzzy measure to determine the risk and return. We apply the original mean-variance model as a benchmark and, for comparison, a fuzzy mean-variance model in which the returns are modeled by specific types of fuzzy numbers. The fuzzy approach gives better performance compared to the mean-variance approach. Numerical examples employing Malaysian share market data are included to illustrate these models.
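
    To make the benchmark concrete, the following sketch computes the crisp global minimum-variance portfolio and defuzzifies triangular fuzzy returns by their centroid; the covariance matrix and fuzzy returns are hypothetical, and the centroid rule is only one common choice, not necessarily the fuzzy measure used in the paper.

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form global minimum-variance portfolio (the crisp Markowitz benchmark)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / (ones @ w)

def defuzzify_triangular(a, b, c):
    """Centroid of a triangular fuzzy return (a, b, c): one common way a
    fuzzy-return model collapses to a crisp expected return."""
    return (a + b + c) / 3.0

# Hypothetical covariance of three shares and fuzzy (pessimistic, likely, optimistic) returns.
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.025, 0.004],
                [0.010, 0.004, 0.030]])
fuzzy_returns = [(0.02, 0.05, 0.09), (0.01, 0.03, 0.06), (0.03, 0.07, 0.10)]

w = min_variance_weights(cov)
mu = np.array([defuzzify_triangular(*r) for r in fuzzy_returns])
print("weights:", np.round(w, 3), "expected return:", round(float(w @ mu), 4))
```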

  2. Heuristic Bayesian segmentation for discovery of coexpressed genes within genomic regions.

    PubMed

    Pehkonen, Petri; Wong, Garry; Törönen, Petri

    2010-01-01

    Segmentation aims to separate homogeneous areas from the sequential data, and plays a central role in data mining. It has applications ranging from finance to molecular biology, where bioinformatics tasks such as genome data analysis are active application fields. In this paper, we present a novel application of segmentation in locating genomic regions with coexpressed genes. We aim at automated discovery of such regions without requirement for user-given parameters. In order to perform the segmentation within a reasonable time, we use heuristics. Most of the heuristic segmentation algorithms require some decision on the number of segments. This is usually accomplished by using asymptotic model selection methods like the Bayesian information criterion. Such methods are based on some simplification, which can limit their usage. In this paper, we propose a Bayesian model selection to choose the most proper result from heuristic segmentation. Our Bayesian model presents a simple prior for the segmentation solutions with various segment numbers and a modified Dirichlet prior for modeling multinomial data. We show with various artificial data sets in our benchmark system that our model selection criterion has the best overall performance. The application of our method in yeast cell-cycle gene expression data reveals potential active and passive regions of the genome.
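
    For contrast with the Bayesian criterion proposed in the paper, the following sketch shows the standard asymptotic route it aims to improve on: an optimal piecewise-constant segmentation scored with a BIC-style penalty to pick the number of segments. The toy data, Gaussian cost, and parameter count are illustrative assumptions.

```python
import numpy as np

def best_segmentation(x, k):
    """Least-squares cost of the optimal piecewise-constant split of x into k segments
    (classic dynamic programming over segment boundaries)."""
    n = len(x)
    csum, csum2 = np.cumsum(np.r_[0.0, x]), np.cumsum(np.r_[0.0, x ** 2])
    def sse(i, j):                               # cost of the segment x[i:j]
        s, s2, m = csum[j] - csum[i], csum2[j] - csum2[i], j - i
        return s2 - s * s / m
    cost = np.full((k + 1, n + 1), np.inf)
    cost[0, 0] = 0.0
    for seg in range(1, k + 1):
        for j in range(seg, n + 1):
            cost[seg, j] = min(cost[seg - 1, i] + sse(i, j) for i in range(seg - 1, j))
    return cost[k, n]

def bic_for_k(x, k):
    """BIC-style score: Gaussian log-likelihood of the best k-segment fit plus a
    penalty for k means, k-1 boundaries, and one shared variance (about 2k parameters)."""
    n = len(x)
    sigma2 = best_segmentation(x, k) / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + 2 * k * np.log(n)

# Toy "expression along the genome": three homogeneous regions plus noise.
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(m, 0.5, 60) for m in (0.0, 2.0, 0.5)])
scores = {k: bic_for_k(x, k) for k in range(1, 6)}
print("selected number of segments:", min(scores, key=scores.get))
```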

  3. Version control system of CAD documents and PLC projects

    NASA Astrophysics Data System (ADS)

    Khudyakov, P. Yu; Kisel’nikov, A. Yu; Startcev, I. M.; Kovalev, A. A.

    2018-05-01

    The paper presents the process of developing a version control system for CAD documents and PLC projects. The software was tested and the optimal composition of the modules was selected. The introduction of the system has made it possible to increase the safety and stability of the process control systems, as well as to reduce the number of conflicts for versions of CAD files. The number of incidents at the enterprise related to the use of incorrect versions of PLC projects is reduced to 0.

  4. Visualization as an Aid to Problem-Solving: Examples from History.

    ERIC Educational Resources Information Center

    Rieber, Lloyd P.

    This paper presents a historical overview of visualization as a human problem-solving tool. Visualization strategies, such as mental imagery, pervade historical accounts of scientific discovery and invention. A selected number of historical examples are presented and discussed on a wide range of topics such as physics, aviation, and the science of…

  5. Family Bonding with Universities. NBER Working Paper No. 15493

    ERIC Educational Resources Information Center

    Meer, Jonathan; Rosen, Harvey S.

    2009-01-01

    One justification offered for legacy admissions policies at universities is that they bind entire families to the university. Proponents maintain that these policies have a number of benefits, including increased donations from members of these families. We use a rich set of data from an anonymous selective research institution to investigate…

  6. Facilitating Innovations in Higher Education in Transition Economies

    ERIC Educational Resources Information Center

    Saginova, Olga; Belyansky, Vladimir

    2008-01-01

    Purpose: The purpose of this paper is to analyse innovations in education from the point of view of product content and markets selected. Emerging market economies face a number of problems many of which are closely linked to and dependent upon the effectiveness of higher professional education. External environment changes, such as the formation…

  7. The Child in the Divorcing Family. Unit for Child Studies. Selected Papers Number 22.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    A sequence of stages in children's emotional response to parental separation and divorce is described, some effects of continued parent hostility are pointed out, and aspects of children's adjustment to changed family circumstances are briefly discussed. Developmental differences in children's responses to divorce are considered on the basis of…

  8. 75 FR 80804 - Combined Notice of Filings No. 1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ...: Compliance Filing--Missing Data Element to be effective 8/25/2010. Filed Date: 12/09/2010. Accession Number... of paper, using the FERC Online links at http://www.ferc.gov . To facilitate electronic service... must create and validate an eRegistration account using the eRegistration link. Select the eFiling link...

  9. The Toddler and the Pre-Schooler. Unit for Child Studies. Selected Papers Number 29.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    This description of toddlers and preschool children emphasizes how young children think and describes the development of self-concept. Language development and antisocial and prosocial behavior are also discussed. The exploration of children's thought processes begins with two principles: (1) Concepts originate in activity; and (2) Motor…

  10. Leaving Teaching: Lack of Resilience or Sign of Agency?

    ERIC Educational Resources Information Center

    Smith, Kari; Ulvik, Marit

    2017-01-01

    Alarming numbers of teacher attrition are reported in many countries, including in Norway. Whereas most of the research tells about personal and professional negative experiences which have a harmful impact on teachers' resilience, a different approach is taken in the current paper. Four cases of leavers are purposefully selected because they…

  11. Parameter estimation and order selection for an empirical model of VO2 on-kinetics.

    PubMed

    Alata, O; Bernard, O

    2007-04-27

    In humans, VO2 on-kinetics are noisy numerical signals that reflect the pulmonary oxygen exchange kinetics at the onset of exercise. They are empirically modelled as a sum of an offset and delayed exponentials. The number of delayed exponentials, i.e. the order of the model, is commonly supposed to be 1 for low-intensity exercises and 2 for high-intensity exercises. As no ground truth has ever been provided to validate these postulates, physiologists still need statistical methods to verify their hypotheses about the number of exponentials of the VO2 on-kinetics, especially in the case of high-intensity exercises. Our objectives are first to develop accurate methods for estimating the parameters of the model at a fixed order, and then to propose statistical tests for selecting the appropriate order. In this paper, we report, on simulated data, the performance of simulated annealing for estimating model parameters and the performance of information criteria for selecting the order. These simulated data are generated with both single-exponential and double-exponential models and corrupted by white Gaussian noise. The performance is given at various signal-to-noise ratios (SNR). Considering parameter estimation, results show that the confidence of the estimated parameters improves as the SNR of the fitted response increases. Considering model selection, results show that information criteria are suitable statistical criteria for selecting the number of exponentials.
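
    A minimal sketch of the model and of information-criterion order selection is shown below; it uses ordinary nonlinear least squares (scipy's curve_fit) rather than the simulated annealing used in the paper, and the simulated response values are purely illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono(t, v0, a1, td1, tau1):
    """Offset plus one delayed exponential (the low-intensity hypothesis)."""
    return v0 + a1 * np.where(t > td1, 1 - np.exp(-(t - td1) / tau1), 0.0)

def bi(t, v0, a1, td1, tau1, a2, td2, tau2):
    """Offset plus two delayed exponentials (the high-intensity hypothesis)."""
    return mono(t, v0, a1, td1, tau1) + a2 * np.where(t > td2, 1 - np.exp(-(t - td2) / tau2), 0.0)

def aic(y, yhat, n_params):
    """Akaike information criterion for a least-squares fit (lower is better)."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

# Illustrative noisy response generated from a bi-exponential truth.
rng = np.random.default_rng(4)
t = np.arange(0.0, 360.0, 3.0)
y = bi(t, 900, 1400, 15, 25, 300, 110, 70) + rng.normal(0, 40, t.size)

p1, _ = curve_fit(mono, t, y, p0=[900, 1500, 10, 30], bounds=(0, np.inf))
p2, _ = curve_fit(bi, t, y, p0=[900, 1400, 10, 30, 200, 100, 60], bounds=(0, np.inf))
print("AIC order 1:", round(aic(y, mono(t, *p1), 4), 1),
      " AIC order 2:", round(aic(y, bi(t, *p2), 7), 1))
```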

  12. Design Considerations of a Transverse Flux Machine for Direct-Drive Wind Turbine Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz

    This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-emf and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.

  13. Information Theory for Gabor Feature Selection for Face Recognition

    NASA Astrophysics Data System (ADS)

    Shen, Linlin; Bai, Li

    2006-12-01

    A discriminative and robust feature, the kernel-enhanced informative Gabor feature, is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our method demonstrates a clear advantage over existing methods in accuracy, computational efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unifies different Gabor filter definitions and proposes a training sample generation algorithm to reduce the effects caused by the unbalanced number of samples available in different classes.
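
    The mutual-information selection step can be illustrated with a small sketch; this is not the authors' Gabor pipeline, and the toy features, redundancy threshold, and greedy correlation filter below are assumptions used only to show the idea of picking informative, non-redundant features.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_informative_features(X, y, n_select, redundancy_thresh=0.9):
    """Greedy selection: rank features by mutual information with the class label,
    then skip candidates that are highly correlated with an already selected feature
    (a crude stand-in for the paper's non-redundancy requirement)."""
    mi = mutual_info_classif(X, y, random_state=0)
    order = np.argsort(mi)[::-1]
    corr = np.abs(np.corrcoef(X, rowvar=False))
    chosen = []
    for f in order:
        if all(corr[f, g] < redundancy_thresh for g in chosen):
            chosen.append(int(f))
        if len(chosen) == n_select:
            break
    return chosen

# Toy stand-in for Gabor responses: 2 informative dimensions out of 40, plus one redundant copy.
rng = np.random.default_rng(5)
y = rng.integers(0, 2, 300)
X = rng.normal(size=(300, 40))
X[:, 3] += 1.5 * y                              # informative
X[:, 17] -= 1.2 * y                             # informative
X[:, 5] = X[:, 3] + rng.normal(0, 0.05, 300)    # redundant copy of feature 3
print(select_informative_features(X, y, n_select=3))
```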

  14. Design Considerations of a Transverse Flux Machine for Direct-Drive Wind Turbine Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz

    This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-emf and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.

  15. Complexity of bioindicator selection for ecological, human, and cultural health: Chinook salmon and red knot as case studies

    PubMed Central

    Burger, Joanna; Gochfeld, Michael; Niles, Lawrence; Powers, Charles; Brown, Kevin; Clarke, James; Dey, Amanda; Kosson, David

    2015-01-01

    There is considerable interest in developing bioindicators of ecological health that are also useful indicators for human health. Yet, human health assessment usually encompasses physical/chemical exposures and not cultural well-being. In this paper, we propose that bioindicators can be selected for all three purposes. We use Chinook or king salmon (Oncorhynchus tshawytscha) and red knot (Calidris canutus rufa, a sandpiper) as examples of indicators that can be used to assess human, ecological, and cultural health. Even so, selecting endpoints or metrics for each indicator species is complex and is explored in this paper. We suggest that there are several endpoint types to examine for a given species, including physical environment, environmental stressors, habitat, life history, demography, population counts, and cultural/societal aspects. Usually cultural endpoints are economic indicators (e.g., number of days fished, number of hunting licenses), rather than the importance of a fishing culture. Development of cultural/societal endpoints must include the perceptions of local communities, cultural groups, and tribal nations, as well as governmental and regulatory communities (although not usually so defined, the latter have cultures as well). Endpoint selection in this category is difficult because the underlying issues need to be identified and used to develop endpoints that tribes and stakeholders themselves see as reasonable surrogates of the qualities they value. We describe several endpoints for salmon and knots that can be used for ecological, human, and cultural/societal health. PMID:25666646

  16. Band selection method based on spectrum difference in targets of interest in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaohan; Yang, Guang; Yang, Yongbo; Huang, Junhua

    2016-10-01

    While hyperspectral data provide rich spectral information, they contain a large number of bands with high correlation coefficients, causing great data redundancy. A reasonable band selection is important for subsequent processing. Bands with a large amount of information and low correlation should be selected. On this basis, and according to the needs of target detection applications, the spectral characteristics of the objects of interest are taken into consideration in this paper, and a new method based on spectrum difference is proposed. Firstly, according to the spectrum differences of the targets of interest, a difference matrix that represents the different spectral reflectance of different targets in different bands is constructed. By setting a threshold, the bands satisfying the condition are retained, constituting a subset of bands. Then, the correlation coefficients between bands are calculated and the correlation matrix is given. According to the size of the correlation coefficients, the bands are divided into several groups. At last, the normalized variance is used to represent the information content of each band, and the bands are sorted by the value of their normalized variance. Given the required number of bands, the optimal band combination can be obtained by these three steps. This method retains the greatest degree of difference between the targets of interest and is easy to implement automatically by computer. Besides, a false color image synthesis experiment is carried out using the bands selected by this method as well as by three other methods to show the performance of the method in this paper.
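
    The three steps described above can be sketched as follows; the thresholds, toy data cube, and target spectra are hypothetical, and the grouping and scoring rules are a simplified reading of the abstract rather than the authors' exact procedure.

```python
import numpy as np

def select_bands(cube, target_spectra, n_bands, diff_thresh=0.05, corr_thresh=0.9):
    """Three-step band selection sketch:
    1) keep bands whose pairwise target-spectrum differences exceed a threshold,
    2) group the surviving bands by inter-band correlation,
    3) rank each group's bands by normalized variance and keep the best ones.

    cube           : (rows, cols, n_all_bands) hyperspectral image (illustrative)
    target_spectra : (n_targets, n_all_bands) mean spectra of the targets of interest
    """
    # Step 1: spectral-difference screening.
    diffs = np.abs(target_spectra[:, None, :] - target_spectra[None, :, :]).max(axis=(0, 1))
    candidates = np.flatnonzero(diffs > diff_thresh)

    # Step 2: correlation grouping (greedy: a band joins the first group it matches).
    flat = cube.reshape(-1, cube.shape[-1])[:, candidates]
    corr = np.abs(np.corrcoef(flat, rowvar=False))
    groups = []
    for i in range(len(candidates)):
        for g in groups:
            if corr[i, g[0]] > corr_thresh:
                g.append(i)
                break
        else:
            groups.append([i])

    # Step 3: normalized variance as the information measure, one best band per group.
    nv = flat.var(axis=0) / (flat.mean(axis=0) ** 2 + 1e-12)
    best_local = [max(g, key=lambda i: nv[i]) for g in groups]
    best_local.sort(key=lambda i: -nv[i])
    return [int(candidates[i]) for i in best_local[:n_bands]]

# Hypothetical 20x20 scene with 50 bands and two target spectra.
rng = np.random.default_rng(6)
cube = rng.random((20, 20, 50))
targets = rng.random((2, 50))
print(select_bands(cube, targets, n_bands=5))
```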

  17. Complexity of bioindicator selection for ecological, human, and cultural health: Chinook salmon and red knot as case studies

    DOE PAGES

    Burger, Joanna; Gochfeld, Michael; Niles, Lawrence; ...

    2015-02-10

    There is considerable interest in developing bioindicators of ecological health that are also useful indicators for human health. Yet, human health assessment usually encompasses physical/chemical exposures and not cultural well-being. In this paper, we propose that bioindicators can be selected for all three purposes. We use Chinook or king salmon (Oncorhynchus tshawytscha) and red knot (Calidris canutus rufa, a sandpiper) as examples of indicators that can be used to assess human, ecological, and cultural health. Even so, selecting endpoints or metrics for each indicator species is complex and is explored in this paper. Here, we suggest that there are several endpoint types to examine for a given species, including physical environment, environmental stressors, habitat, life history, demography, population counts, and cultural/societal aspects. Usually cultural endpoints are economic indicators (e.g., number of days fished, number of hunting licenses), rather than the importance of a fishing culture. Development of cultural/societal endpoints must include the perceptions of local communities, cultural groups, and tribal nations, as well as governmental and regulatory communities (although not usually so defined, the latter have cultures as well). Endpoint selection in this category is difficult because the underlying issues need to be identified and used to develop endpoints that tribes and stakeholders themselves see as reasonable surrogates of the qualities they value. We describe several endpoints for salmon and knots that can be used for ecological, human, and cultural/societal health.

  18. Complexity of bioindicator selection for ecological, human, and cultural health: Chinook salmon and red knot as case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, Joanna; Gochfeld, Michael; Niles, Lawrence

    There is considerable interest in developing bioindicators of ecological health that are also useful indicators for human health. Yet, human health assessment usually encompasses physical/chemical exposures and not cultural well-being. In this paper, we propose that bioindicators can be selected for all three purposes. We use Chinook or king salmon (Oncorhynchus tshawytscha) and red knot (Calidris canutus rufa, a sandpiper) as examples of indicators that can be used to assess human, ecological, and cultural health. Even so, selecting endpoints or metrics for each indicator species is complex and is explored in this paper. Here, we suggest that there are several endpoint types to examine for a given species, including physical environment, environmental stressors, habitat, life history, demography, population counts, and cultural/societal aspects. Usually cultural endpoints are economic indicators (e.g., number of days fished, number of hunting licenses), rather than the importance of a fishing culture. Development of cultural/societal endpoints must include the perceptions of local communities, cultural groups, and tribal nations, as well as governmental and regulatory communities (although not usually so defined, the latter have cultures as well). Endpoint selection in this category is difficult because the underlying issues need to be identified and used to develop endpoints that tribes and stakeholders themselves see as reasonable surrogates of the qualities they value. We describe several endpoints for salmon and knots that can be used for ecological, human, and cultural/societal health.

  19. An automatic optimum number of well-distributed ground control lines selection procedure based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Yavari, Somayeh; Valadan Zoej, Mohammad Javad; Salehi, Bahram

    2018-05-01

    The procedure of selecting an optimum number and best distribution of ground control information is important in order to reach accurate and robust registration results. This paper proposes a new general procedure based on the Genetic Algorithm (GA) which is applicable to all kinds of features (point, line, and areal features); however, linear features, due to their unique characteristics, are of interest in this investigation. This method is called the Optimum number of Well-Distributed ground control Information Selection (OWDIS) procedure. Using this method, a population of binary chromosomes is randomly initialized. The ones indicate the presence of a pair of conjugate lines as a GCL and the zeros specify the absence. The chromosome length is considered equal to the number of all conjugate lines. For each chromosome, the unknown parameters of a proper mathematical model can be calculated using the selected GCLs (ones in each chromosome). Then, a limited number of Check Points (CPs) are used to evaluate the Root Mean Square Error (RMSE) of each chromosome as its fitness value. The procedure continues until a stopping criterion is reached. The number and position of ones in the best chromosome indicate the selected GCLs among all conjugate lines. To evaluate the proposed method, a GeoEye image and an Ikonos image over different areas of Iran are used. Comparing the results obtained by the proposed method in a traditional RFM with conventional methods that use all conjugate lines as GCLs shows a fivefold accuracy improvement (pixel-level accuracy) as well as the strength of the proposed method. To prevent an over-parametrization error in a traditional RFM due to the selection of a high number of improperly correlated terms, an optimized line-based RFM is also proposed. The results show the superiority of the combination of the proposed OWDIS method with an optimized line-based RFM in terms of increasing the accuracy to better than 0.7 pixel, improving reliability, and reducing systematic errors. These results also demonstrate the high potential of linear features as reliable control features for reaching sub-pixel accuracy in registration applications.
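
    A toy version of the chromosome-and-fitness idea can be sketched as follows: binary chromosomes mark which control features are used, a simple least-squares affine model (standing in for the paper's RFM) is fitted on the selected features, and the RMSE on independent check points is the fitness. The data, GA settings, and the affine stand-in model are assumptions made for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic registration problem: an affine model stands in for the paper's RFM.
    n_lines, n_checks = 40, 10
    src = rng.uniform(0, 1000, (n_lines + n_checks, 2))
    true_A = np.array([[1.01, 0.02, 5.0], [-0.01, 0.99, -3.0]])
    dst = src @ true_A[:, :2].T + true_A[:, 2] + rng.normal(0, 0.3, (n_lines + n_checks, 2))
    ctrl_src, ctrl_dst = src[:n_lines], dst[:n_lines]      # candidate control features
    chk_src, chk_dst = src[n_lines:], dst[n_lines:]        # independent check points

    def rmse(chromosome):
        """Fit the affine model on the selected control features, score RMSE on check points."""
        idx = np.flatnonzero(chromosome)
        if idx.size < 3:                  # need at least 3 correspondences for an affine fit
            return 1e9
        X = np.c_[ctrl_src[idx], np.ones(idx.size)]
        A, *_ = np.linalg.lstsq(X, ctrl_dst[idx], rcond=None)
        pred = np.c_[chk_src, np.ones(n_checks)] @ A
        return float(np.sqrt(np.mean((pred - chk_dst) ** 2)))

    # Plain generational GA over binary chromosomes (1 = feature used as control information).
    pop = rng.integers(0, 2, (30, n_lines))
    for _ in range(60):
        fit = np.array([rmse(c) for c in pop])
        parents = pop[np.argsort(fit)[:10]]                # truncation selection
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = parents[rng.integers(10)], parents[rng.integers(10)]
            cut = rng.integers(1, n_lines)
            child = np.r_[a[:cut], b[cut:]]
            child[rng.random(n_lines) < 0.02] ^= 1         # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, children])

    best = pop[np.argmin([rmse(c) for c in pop])]
    print("selected features:", np.flatnonzero(best), "check-point RMSE:", rmse(best))
    ```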

  20. Sparsity-aware multiple relay selection in large multi-hop decode-and-forward relay networks

    NASA Astrophysics Data System (ADS)

    Gouissem, A.; Hamila, R.; Al-Dhahir, N.; Foufou, S.

    2016-12-01

    In this paper, we propose and investigate two novel techniques to perform multiple relay selection in large multi-hop decode-and-forward relay networks. The two proposed techniques exploit sparse signal recovery theory to select multiple relays using the orthogonal matching pursuit algorithm and outperform state-of-the-art techniques in terms of outage probability and computational complexity. To reduce the amount of collected channel state information (CSI), we propose a limited-feedback scheme where only a limited number of relays feed back their CSI. Furthermore, a detailed performance-complexity tradeoff investigation is conducted for the different studied techniques and verified by Monte Carlo simulations.
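
    For readers unfamiliar with orthogonal matching pursuit (OMP), the greedy core of such a selection step can be sketched in a few lines. The relay-network framing below, with a dictionary whose columns play the role of candidate-relay gain signatures, is our illustrative assumption; only the generic OMP routine itself is standard.

    ```python
    import numpy as np

    def omp(D, y, k):
        """Orthogonal Matching Pursuit: greedily pick k columns of dictionary D
        that best explain y, re-fitting on the selected columns at each step."""
        residual, selected = y.copy(), []
        for _ in range(k):
            scores = np.abs(D.T @ residual)
            if selected:
                scores[selected] = -np.inf          # do not reselect a relay
            selected.append(int(np.argmax(scores)))
            coeffs, *_ = np.linalg.lstsq(D[:, selected], y, rcond=None)
            residual = y - D[:, selected] @ coeffs
        return selected, coeffs

    # Toy setup: each column models the end-to-end gain signature of one candidate relay,
    # and y is the response we would like a small subset of relays to reproduce.
    rng = np.random.default_rng(2)
    n_obs, n_relays, n_active = 32, 100, 4
    D = rng.normal(size=(n_obs, n_relays))
    D /= np.linalg.norm(D, axis=0)
    truth = rng.choice(n_relays, n_active, replace=False)
    y = D[:, truth] @ rng.uniform(0.5, 1.5, n_active)

    chosen, weights = omp(D, y, n_active)
    print("true relays   :", sorted(truth.tolist()))
    print("chosen relays :", sorted(chosen))
    ```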

  1. Fuzzy feature selection based on interval type-2 fuzzy sets

    NASA Astrophysics Data System (ADS)

    Cherif, Sahar; Baklouti, Nesrine; Alimi, Adel; Snasel, Vaclav

    2017-03-01

    When dealing with real-world data, noise, complexity, dimensionality, uncertainty, and irrelevance can lead to low performance and insignificant judgment. Fuzzy logic is a powerful tool for handling conflicting attributes which can have similar effects and close meanings. In this paper, an interval type-2 fuzzy feature selection is presented as a new approach for removing irrelevant features and reducing complexity. We demonstrate how feature selection can be combined with interval type-2 fuzzy logic to keep significant features and hence reduce time complexity. The proposed method is compared with some other approaches. The results show that the number of selected attributes is comparatively small.

  2. Evaluation of Scientific Journal Validity, It's Articles and Their Authors.

    PubMed

    Masic, Izet; Begic, Edin

    2016-01-01

    The science that deals with the evaluation of scientific articles by finding quantitative indicators (indexes) of scientific research success is called scientometrics. Scientometrics is part of scientology (the science of science), which analyzes scientific papers and their citations in a selected sample of scientific journals. There are four indexes by which it is possible to measure the validity of scientific research: the number of articles, the impact factor of the journal, the number and order of authors, and the number of citations. Every scientific article is a record of data written according to the rules recommended by several scientific associations and committees. The growing number of authors, and the many authors sharing the same name and surname, led to the introduction of a necessary identifier: the ORCID number.

  3. A selective-update affine projection algorithm with selective input vectors

    NASA Astrophysics Data System (ADS)

    Kong, NamWoong; Shin, JaeWook; Park, PooGyeon

    2011-10-01

    This paper proposes an affine projection algorithm (APA) with selective input vectors, based on the concept of selective update, in order to reduce estimation errors and computation. The algorithm consists of two procedures: input-vector selection and state decision. The input-vector-selection procedure determines the number of input vectors by checking against the mean square error (MSE) whether the input vectors carry enough information for an update. The state-decision procedure determines the current state of the adaptive filter using the state-decision criterion. While the adaptive filter is in the transient state, the algorithm updates the filter coefficients with the selected input vectors; as soon as the adaptive filter reaches the steady state, the update procedure is not performed. Through these two procedures, the proposed algorithm achieves small steady-state estimation errors, low computational complexity, and low update complexity for colored input signals.
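
    A heavily simplified sketch of an affine projection filter with a selective-update flavour is given below: the last few regressor vectors are stacked for each update, and updating is skipped once a running MSE estimate falls below a threshold. This single threshold rule is our stand-in for the paper's two-procedure scheme; all parameter values and the system-identification setup are illustrative assumptions.

    ```python
    import numpy as np

    def selective_apa(x, d, order=8, max_p=4, mu=0.5, eps=1e-3, err_thresh=1e-4):
        """Simplified affine projection adaptive filter: up to max_p recent regressors
        are stacked per update, and updates stop once the running MSE is small."""
        n = len(x)
        w = np.zeros(order)
        mse, out_err = 1.0, np.zeros(n)
        for i in range(order, n):
            p = min(max_p, i - order + 1)
            # Rows are the p most recent regressor vectors [x(i-j), ..., x(i-j-order+1)].
            X = np.array([x[i - j - order + 1:i - j + 1][::-1] for j in range(p)])
            e = d[i - np.arange(p)] - X @ w                  # a-priori errors
            out_err[i] = e[0]
            mse = 0.99 * mse + 0.01 * e[0] ** 2              # running MSE estimate
            if mse > err_thresh:                             # transient state: update
                w += mu * X.T @ np.linalg.solve(X @ X.T + eps * np.eye(p), e)
        return w, out_err

    # Identify a small FIR system from noisy observations.
    rng = np.random.default_rng(3)
    h = np.array([0.8, -0.4, 0.2, 0.1, 0.0, 0.05, -0.02, 0.01])
    x = rng.normal(size=4000)
    d = np.convolve(x, h)[:4000] + 0.01 * rng.normal(size=4000)
    w, _ = selective_apa(x, d)
    print(np.round(w, 3))
    ```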

  4. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.

  5. Detector location selection based on VIP analysis in near-infrared detection of dural hematoma.

    PubMed

    Sun, Qiuming; Zhang, Yanjun; Ma, Jun; Tian, Feng; Wang, Huiquan; Liu, Dongyuan

    2018-03-01

    Detection of dural hematoma based on multi-channel near-infrared differential absorbance has the advantages of being rapid and non-invasive. The location and number of detectors around the light source are critical for the model that predicts the degree of dural hematoma. Therefore, rational selection of detector numbers and their distances from the light source is very important. In this paper, a detector position screening method based on Variable Importance in the Projection (VIP) analysis is proposed. A preliminary model based on the Partial Least Squares (PLS) method for predicting the dural-position μa was established using light absorbance information from 30 detectors located 2.0-5.0 cm from the light source at 0.1 cm intervals. The mean relative error (MRE) of the dural-position μa prediction model was 4.08%. After VIP analysis, the number of detectors was reduced from 30 to 4, and the MRE of the dural-position μa prediction dropped from 4.08% to 2.06%. The prediction model after VIP detector screening still showed good prediction of the epidural-position μa. This study provides a new approach and an important reference for the selection of detector locations in near-infrared dural hematoma detection.
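
    The VIP screening step can be illustrated with a short sketch: fit a PLS model, compute the standard VIP score of each predictor, and keep only predictors whose VIP exceeds 1. The synthetic "detector" data, the VIP > 1 rule, and the reliance on scikit-learn's PLSRegression attributes are assumptions for illustration, not details taken from the paper.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def vip_scores(pls, X):
        """Standard VIP formula: weight each predictor by the share of y-variance
        explained by the PLS components in which it participates."""
        T = pls.x_scores_          # (n_samples, n_components)
        W = pls.x_weights_         # (n_features, n_components)
        Q = pls.y_loadings_        # (n_targets, n_components)
        p = X.shape[1]
        ss = np.sum(T ** 2, axis=0) * Q[0] ** 2        # y-variance explained per component
        w_norm = (W / np.linalg.norm(W, axis=0)) ** 2
        return np.sqrt(p * (w_norm @ ss) / ss.sum())

    # Synthetic "detectors": 30 absorbance channels, only a handful actually carry signal.
    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 30))
    y = 2.0 * X[:, 3] - 1.5 * X[:, 12] + 0.8 * X[:, 25] + 0.1 * rng.normal(size=200)

    pls = PLSRegression(n_components=4).fit(X, y)
    vip = vip_scores(pls, X)
    keep = np.where(vip > 1.0)[0]      # common screening rule: retain detectors with VIP > 1
    print("detectors kept:", keep)
    ```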

  6. Selection of experimental modal data sets for damage detection via model update

    NASA Technical Reports Server (NTRS)

    Doebling, S. W.; Hemez, F. M.; Barlow, M. S.; Peterson, L. D.; Farhat, C.

    1993-01-01

    When using a finite element model update algorithm for detecting damage in structures, it is important that the experimental modal data sets used in the update be selected in a coherent manner. In the case of a structure with extremely localized modal behavior, it is necessary to use both low and high frequency modes, but many of the modes in between may be excluded. In this paper, we examine two different mode selection strategies based on modal strain energy, and compare their success to the choice of an equal number of modes based merely on lowest frequency. Additionally, some parameters are introduced to enable a quantitative assessment of the success of our damage detection algorithm when using the various set selection criteria.

  7. Balancing selection on immunity genes: review of the current literature and new analysis in Drosophila melanogaster.

    PubMed

    Croze, Myriam; Živković, Daniel; Stephan, Wolfgang; Hutter, Stephan

    2016-08-01

    Balancing selection has been widely assumed to be an important evolutionary force, yet even today little is known about its abundance and its impact on the patterns of genetic diversity. Several studies have shown examples of balancing selection in humans, plants or parasites, and many genes under balancing selection are involved in immunity. It has been proposed that host-parasite coevolution is one of the main forces driving immune genes to evolve under balancing selection. In this paper, we review the literature on balancing selection on immunity genes in several organisms, including Drosophila. Furthermore, we performed a genome scan for balancing selection in an African population of Drosophila melanogaster using coalescent simulations of a demographic model with and without selection. We find very few genes under balancing selection and only one novel candidate gene related to immunity. Finally, we discuss the possible causes of the low number of genes under balancing selection. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  8. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selecting a modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  9. Comparison of Genetic Algorithm, Particle Swarm Optimization and Biogeography-based Optimization for Feature Selection to Classify Clusters of Microcalcifications

    NASA Astrophysics Data System (ADS)

    Khehra, Baljit Singh; Pharwaha, Amar Partap Singh

    2017-04-01

    Ductal carcinoma in situ (DCIS) is one type of breast cancer. Clusters of microcalcifications (MCCs) are symptoms of DCIS that are recognized by mammography. Selection of a robust feature vector is the process of selecting an optimal subset of features from a large number of available features in a given problem domain, after feature extraction and before any classification scheme. Feature selection reduces the feature space, which improves the performance of the classifier and decreases the computational burden imposed on the classifier by using many features. Selection of an optimal subset of features from a large number of available features in a given problem domain is a difficult search problem: for n features, the total number of possible feature subsets is 2^n. Thus, the problem of selecting an optimal subset of features belongs to the category of NP-hard problems. In this paper, an attempt is made to find the optimal subset of MCC features from all possible subsets of features using the genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO). For simulation, a total of 380 benign and malignant MCC samples have been selected from mammogram images of the DDSM database. A total of 50 features extracted from the benign and malignant MCC samples are used in this study. In these algorithms, the fitness function is the correct classification rate of the classifier. A support vector machine is used as the classifier. Experimental results show that the PSO-based and BBO-based algorithms select an optimal subset of features for classifying MCCs as benign or malignant better than the GA-based algorithm.
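
    A generic wrapper of this kind, a binary GA whose fitness is the cross-validated correct classification rate of an SVM on the selected features, can be sketched briefly. The synthetic data, GA settings, and use of scikit-learn below are illustrative assumptions; they stand in for the paper's DDSM-derived MCC features and its PSO/BBO variants.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    X, y = make_classification(n_samples=300, n_features=50, n_informative=8,
                               n_redundant=10, random_state=0)

    def fitness(mask):
        """Fitness = cross-validated correct classification rate of an SVM
        trained on the selected feature subset (empty subsets score zero)."""
        if mask.sum() == 0:
            return 0.0
        return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=3).mean()

    pop = rng.integers(0, 2, (20, X.shape[1]))
    for _ in range(15):                                  # a few cheap generations
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-8:]]           # keep the 8 best subsets
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = parents[rng.integers(8)], parents[rng.integers(8)]
            cut = rng.integers(1, X.shape[1])
            child = np.r_[a[:cut], b[cut:]]
            child[rng.random(X.shape[1]) < 0.02] ^= 1    # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("features kept:", int(best.sum()), "CV accuracy:", round(fitness(best), 3))
    ```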

  10. Social power and opinion formation in complex networks

    NASA Astrophysics Data System (ADS)

    Jalili, Mahdi

    2013-02-01

    In this paper we investigate the effects of social power on the evolution of opinions in model networks as well as in a number of real social networks. A continuous opinion formation model is considered and the analysis is performed through numerical simulation. Social power is given to a proportion of agents selected either randomly or based on their degrees. As artificial network structures, we consider scale-free networks constructed through preferential attachment and Watts-Strogatz networks. Numerical simulations show that scale-free networks with degree-based social power on the hub nodes have an optimal case in which the largest number of nodes reaches a consensus. However, giving power to a random selection of nodes does not improve the consensus properties. Introducing social power in Watts-Strogatz networks does not significantly change the consensus profile.

  11. Use of a fluoride channel as a new selection marker for fission yeast plasmids and application to fast genome editing with CRISPR/Cas9

    PubMed Central

    Fernandez, Ronan; Berro, Julien

    2017-01-01

    Fission yeast is a powerful model organism that has provided insights into important cellular processes thanks to the ease of its genome editing by homologous recombination. However, creation of strains with a large number of targeted mutations or containing plasmids has been challenging because only a very small number of selection markers is available in Schizosaccharomyces pombe. In this paper, we identify two fission yeast fluoride exporter channels (Fex1p and Fex2p) and describe the development of a new strategy using Fex1p as a selection marker for transformants in rich media supplemented with fluoride. To our knowledge this is the first positive selection marker identified in S. pombe that does not use auxotrophy or drug resistance and that can be used for plasmid transformation or genomic integration in rich media. We illustrate the application of our new marker by significantly accelerating the protocol for genome editing using CRISPR/Cas9 in S. pombe. PMID:27327046

  12. Randomness and diversity matter in the maintenance of the public resources

    NASA Astrophysics Data System (ADS)

    Liu, Aizhi; Zhang, Yanling; Chen, Xiaojie; Sun, Changyin

    2017-03-01

    Most previous models of the public goods game assume two possible strategies, i.e., investing all or nothing. The real-life situation is rarely all or nothing. In this paper, we consider that multiple strategies are adopted in a well-mixed population, and each strategy represents an investment to produce the public goods. Past efforts have found that randomness matters in the evolution of fairness in the ultimatum game. In a framework involving no other mechanisms, we study how diversity and randomness influence the average investment of the population, defined by the mean value of all individuals' strategies. The level of diversity is increased by increasing the strategy number, and the level of randomness is increased by increasing the mutation probability, or decreasing the population size or the selection intensity. We find that a higher level of diversity and a higher level of randomness lead to larger average investment and favor the evolution of cooperation more strongly. Under weak selection, the average investment changes very little with the strategy number, the population size, and the mutation probability. Under strong selection, the average investment changes very little with the strategy number and the population size, but changes a lot with the mutation probability. Under intermediate selection, the average investment increases significantly with the strategy number and the mutation probability, and decreases significantly with the population size. These findings are meaningful for studying how to maintain public resources.

  13. Social Origins, Academic Strength of School Curriculum and Access to Selective Higher Education Institutions: Evidence from Scotland and the USA

    ERIC Educational Resources Information Center

    Duta, Adriana; An, Brian; Iannelli, Cristina

    2018-01-01

    This paper analyses the role that different components of the academic strength of the secondary-school curriculum (i.e. "number," "subjects" and "grades" of advanced academic courses) play in explaining social origin differences in access to prestigious universities (but also to other higher education institutions)…

  14. Using Mobile Apps to Entice General Education Students into Technology Fields

    ERIC Educational Resources Information Center

    Liu, Michelle; Murphy, Diane

    2013-01-01

    It is of national importance to increase the number of college students pursuing degrees in information systems/information technology (IT/IS) subjects. The primary focus at many institutions is renovating or enhancing existing IT/IS programs and the target audience is the students who have selected to major in IT/IS subjects. This paper looks at…

  15. Illness in the Family: Old Myths and New Truths! Unit for Child Studies. Selected Papers Number 19.

    ERIC Educational Resources Information Center

    Perkins, Richard; Oldenburg, Brian

    A multifactorial model of phases in the development and progress of physical illness is described, and the model's utility is illustrated. The model consists of antecedent and concurrent conditions and consequences related to physical, psychological, and social factors and their interaction. The application of the model is illustrated by a…

  16. Recognition of Depression in the Primary School Child. Selected Papers, Number 56.

    ERIC Educational Resources Information Center

    Schick, Tom

    Real depression in children can manifest itself in many ways. Every human being is somewhat different and consequently distress signals and breaking points do not always imitate each other. What remains constant is the internal emotional desert and emptiness. Depression is not just feeling sad or lethargic. Depression is a state in which…

  17. Changing Patterns in Internal Communication in Large Academic Libraries. Occasional Paper Number 6.

    ERIC Educational Resources Information Center

    Euster, Joanne R.

    Based on data from a 1979 survey of ARL member libraries, this study by the Office of Management Studies analyzes the responses of selected libraries which had provided internal studies or planning documents on the subject of internal communication and notes the extent of resulting changes in procedures. The studies yielded information on staff…

  18. Alcoholism and the Family. Unit for Child Studies Selected Papers Number 34.

    ERIC Educational Resources Information Center

    Wilson, G. C.

    Alcoholism, and particularly alcoholism in the family, is an unsolved medical and social problem. Addictive drinking results in several social and psychological problems, most of which are caused by a change in brain function. Excessive drinking of alcoholic beverages operates as a stressor and produces alkaloids at the base of the brain that are…

  19. The Need for Participation in Open and Distance Education: The Open University Malaysia Experience

    ERIC Educational Resources Information Center

    Raghavan, Santhi; Kumar, P. Rajesh

    2007-01-01

    This paper provides an overview of adult learner participation in open and distance education by focusing participation needs based on selected socio-demographic variables such as age, years of working experience and monthly income. The related study involved a sample of 454 Open University Malaysia students from a number of learning centres…

  20. Fantasy, Lies and Imaginary Companions. Foundation for Child and Youth Studies Selected Papers Number 45.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    This description of the development of imagination and fantasy in children outlines how children view their fantasies, imaginings, imaginary companions, and lies at different stages of development. Main topics include (1) the purposes of fantasy; (2) fantasy in preschool children; (3) imaginative games and dramas; (4) promotion or inhibition of…

  1. Loans for Vocational Education and Training in Europe. Research Paper. Number 20

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2012

    2012-01-01

    This report reviews the use of loans for learning in 33 European countries and analyses the schemes in eight selected Member States: France, Hungary, the Netherlands, Austria, Poland, Finland, Sweden and the UK. The analysis shows that loan schemes vary considerably across Europe in terms of types and levels of learning covered, conditions of…

  2. "Heroes" and "Villains" in the Lives of Children and Young People

    ERIC Educational Resources Information Center

    Power, Sally; Smith, Kevin

    2017-01-01

    This paper explores the responses of nearly 1,200 children and young people in Wales who were asked to identify which three famous people they most admired and which three they most disliked. Analysis of these young people's responses reveals a number of sociological and educational issues. Their selections confirm other research which has…

  3. Students, History Textbooks, and the Hidden Dimension. Occasional Paper Number 77-1.

    ERIC Educational Resources Information Center

    Kingman, Barry

    Since history textbooks omit and/or emphasize certain data, students are left with a false sense of history. Although the "hard data" presented in history texts is generally regarded as reliable, the selection and organization of that data is inherently manipulative because other data has been excluded. Because authors do not begin with a…

  4. Classification of motor imagery tasks for BCI with multiresolution analysis and multiobjective feature selection.

    PubMed

    Ortega, Julio; Asensio-Cubero, Javier; Gan, John Q; Ortiz, Andrés

    2016-07-15

    Brain-computer interfacing (BCI) applications based on the classification of electroencephalographic (EEG) signals require solving high-dimensional pattern classification problems with such a relatively small number of training patterns that curse-of-dimensionality problems usually arise. Multiresolution analysis (MRA) has useful properties for signal analysis in both the temporal and spectral domains, and has been broadly used in the BCI field. However, MRA usually increases the dimensionality of the input data. Therefore, some approaches to feature selection or feature dimensionality reduction should be considered for improving the performance of MRA-based BCI. This paper investigates feature selection in MRA-based frameworks for BCI. Several wrapper approaches to evolutionary multiobjective feature selection are proposed with different structures of classifiers. They are evaluated by comparison with baseline methods using sparse representation of features or without feature selection. The statistical analysis, by applying the Kolmogorov-Smirnov and Kruskal-Wallis tests to the means of the Kappa values evaluated by using the test patterns in each approach, has demonstrated some advantages of the proposed approaches. In comparison with the baseline MRA approach used in previous studies, the proposed evolutionary multiobjective feature selection approaches provide similar or even better classification performances, with a significant reduction in the number of features that need to be computed.

  5. Wall Interference Study of the NTF Slotted Tunnel Using Bodies of Revolution Wall Signature Data

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit; Kuhl, David D.; Walker, Eric L.

    2004-01-01

    This paper is a description of the analysis of blockage corrections for bodies of revolution for the slotted-wall configuration of the National Transonic Facility (NTF) at the NASA Langley Research Center (LaRC). A wall correction method based on the measured wall signature is used. Test data from three different-sized blockage bodies and four wall ventilation settings were analyzed at various Mach numbers and unit Reynolds numbers. The results indicate that with the proper selection of the boundary condition parameters, the wall correction method can predict blockage corrections consistent with the wall measurements for Mach numbers as high as 0.95.

  6. Three Dimensional Sector Design with Optimal Number of Sectors

    NASA Technical Reports Server (NTRS)

    Xue, Min

    2010-01-01

    In the national airspace system, sectors get overloaded due to high traffic demand and inefficient airspace designs. Overloads can be eliminated in some cases by redesigning sector boundaries. This paper extends the Voronoi-based sector design method by automatically selecting the number of sectors, allowing three-dimensional partitions, and enforcing traffic pattern conformance. The method was used to design sectors at Fort-Worth and Indianapolis centers for current traffic scenarios. Results show that new designs can eliminate overloaded sectors, although not in all cases, reduce the number of necessary sectors, and conform to major traffic patterns. Overall, the new methodology produces enhanced and efficient sector designs.

  7. Self-adaptive MOEA feature selection for classification of bankruptcy prediction data.

    PubMed

    Gaspar-Cunha, A; Recio, G; Costa, L; Estébanez, C

    2014-01-01

    Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood of bankruptcy. As companies become complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risks associated with counterparties or predicting bankruptcy becomes harder. Evolutionary algorithms have been shown to be an excellent tool to deal with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The obtained results showed the utility of using the self-adaptation of the classifier.

  8. Angular Spacing Control for Segmented Data Pages in Angle-Multiplexed Holographic Memory

    NASA Astrophysics Data System (ADS)

    Kinoshita, Nobuhiro; Muroi, Tetsuhiko; Ishii, Norihiko; Kamijo, Koji; Kikuchi, Hiroshi; Shimidzu, Naoki; Ando, Toshio; Masaki, Kazuyoshi; Shimizu, Takehiro

    2011-09-01

    To improve the recording density of angle-multiplexed holographic memory, it is effective to increase the numerical aperture of the lens and to shorten the wavelength of the laser source as well as to increase the multiplexing number. The angular selectivity of a hologram, which determines the multiplexing number, is dependent on the incident angle of not only the reference beam but also the signal beam to the holographic recording medium. The actual signal beam, which is a convergent or divergent beam, is regarded as the sum of plane waves that have different propagation directions, angular selectivities, and optimal angular spacings. In this paper, focusing on the differences in the optimal angular spacing, we proposed a method to control the angular spacing for each segmented data page. We investigated the angular selectivity of a hologram and crosstalk for segmented data pages using numerical simulation. The experimental results showed a practical bit-error rate on the order of 10-3.

  9. Using a two-phase evolutionary framework to select multiple network spreaders based on community structure

    NASA Astrophysics Data System (ADS)

    Fu, Yu-Hsiang; Huang, Chung-Yuan; Sun, Chuen-Tsai

    2016-11-01

    Using network community structures to identify multiple influential spreaders is an appropriate method for analyzing the dissemination of information, ideas and infectious diseases. For example, data on spreaders selected from groups of customers who make similar purchases may be used to advertise products and to optimize limited resource allocation. Other examples include community detection approaches aimed at identifying structures and groups in social or complex networks. However, determining the number of communities in a network remains a challenge. In this paper we describe our proposal for a two-phase evolutionary framework (TPEF) for determining community numbers and maximizing community modularity. Lancichinetti-Fortunato-Radicchi benchmark networks were used to test our proposed method and to analyze execution time, community structure quality, convergence, and the network spreading effect. Results indicate that our proposed TPEF generates satisfactory levels of community quality and convergence. They also suggest a need for an index, mechanism or sampling technique to determine whether a community detection approach should be used for selecting multiple network spreaders.
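
    Although the paper's two-phase evolutionary framework also tunes the number of communities, the basic idea of picking spreaders from community structure can be illustrated with a much simpler stand-in: detect communities by greedy modularity maximization and take the highest-degree node of each community as a spreader. The graph, the networkx calls, and the highest-degree rule below are illustrative assumptions, not the TPEF itself.

    ```python
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Simplified community-based spreader selection: find communities, then take the
    # highest-degree node of each community as one spreader.
    G = nx.connected_caveman_graph(6, 10)      # 6 dense cliques loosely chained together
    communities = greedy_modularity_communities(G)

    spreaders = [max(c, key=G.degree) for c in communities]
    print("communities found:", len(communities))
    print("selected spreaders:", sorted(spreaders))
    ```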

  10. Self-Adaptive MOEA Feature Selection for Classification of Bankruptcy Prediction Data

    PubMed Central

    Gaspar-Cunha, A.; Recio, G.; Costa, L.; Estébanez, C.

    2014-01-01

    Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood of bankruptcy. As companies become complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risks associated with counterparties or predicting bankruptcy becomes harder. Evolutionary algorithms have been shown to be an excellent tool to deal with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The obtained results showed the utility of using the self-adaptation of the classifier. PMID:24707201

  11. Knee Society Award Papers Are Highly Cited Works.

    PubMed

    Mroz, Tommy P; Clarke, Henry D; Chang, Yu-Hui H; Scuderi, Giles R

    2016-01-01

    Since 1993, The Knee Society has presented three annual awards recognizing the best research papers presented at the annual meetings. To date, no quantitative evaluation has determined whether the selection process identifies the most meritorious papers based on subsequent citations. In the absence of validation of this process, it is unclear whether the journal readership should view the award-winning papers as those with potentially greater impact for the specialty. (1) Are award papers cited both more than nonaward papers published in the same Knee Society proceedings issue of CORR(®) and more than all other knee research papers published in all issues of CORR(®) during any given year? (2) Does the award selection process identify potentially highly influential knee research? Subsequent citations for each award and nonaward paper published in The Knee Society proceedings issue for 2002 to 2008 were determined using the SCOPUS citation index. The citations for all papers on knee surgery published in CORR(®) during the same years were also determined. Mean citations for an award paper were statistically greater than for a nonaward paper: 86 (SD 95; median 55; 95% confidence interval [CI] of the mean, 44-128) versus 33 (SD 30; median 24; 95% CI of the mean, 28-37; p < 0.001). Mean number of citations for award papers was also higher than for all other knee research papers published in nonproceedings issues of CORR(®): 86 (SD 95; median 55; 95% CI of the mean, 44-128) versus 30 (SD 31; median 20; 95% CI for the mean, 25-35; p < 0.001). Twelve of the 22 (54.6%) award papers were in the top five cited papers from the proceedings issue for the respective year versus 24 of the 190 (12.6%) of the nonaward papers (difference in the percentages is 41.9% and the 95% CI for the risk difference is 20.6%-63.3%; p < 0.001). In 3 of 7 years, an award paper was the most cited knee paper published in CORR(®). The selection process for The Knee Society scientific awards identifies potentially influential papers that are likely to be highly cited in future research articles about the knee. The selection process for Knee Society Award Papers appears to identify papers that are potentially influential in the field of knee surgery and are likely to be highly cited in future published articles. As such, these award papers deserve special attention from the readership.

  12. Novel harmonic regularization approach for variable selection in Cox's proportional hazards model.

    PubMed

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso-series methods.

  13. A GPU-Based Implementation of the Firefly Algorithm for Variable Selection in Multivariate Calibration Problems

    PubMed Central

    de Paula, Lauro C. M.; Soares, Anderson S.; de Lima, Telma W.; Delbem, Alexandre C. B.; Coelho, Clarimar J.; Filho, Arlindo R. G.

    2014-01-01

    Several variable selection algorithms in multivariate calibration can be accelerated using Graphics Processing Units (GPU). Among these algorithms, the Firefly Algorithm (FA) is a recently proposed metaheuristic that may be used for variable selection. This paper presents a GPU-based FA (FA-MLR) with a multiobjective formulation for variable selection in multivariate calibration problems and compares it with some traditional sequential algorithms in the literature. The advantage of the proposed implementation is demonstrated in an example involving a relatively large number of variables. The results showed that the FA-MLR, in comparison with the traditional algorithms, is a more suitable choice and a relevant contribution to the variable selection problem. Additionally, the results also demonstrated that the FA-MLR performed on a GPU can be five times faster than its sequential implementation. PMID:25493625

  14. A GPU-Based Implementation of the Firefly Algorithm for Variable Selection in Multivariate Calibration Problems.

    PubMed

    de Paula, Lauro C M; Soares, Anderson S; de Lima, Telma W; Delbem, Alexandre C B; Coelho, Clarimar J; Filho, Arlindo R G

    2014-01-01

    Several variable selection algorithms in multivariate calibration can be accelerated using Graphics Processing Units (GPU). Among these algorithms, the Firefly Algorithm (FA) is a recently proposed metaheuristic that may be used for variable selection. This paper presents a GPU-based FA (FA-MLR) with a multiobjective formulation for variable selection in multivariate calibration problems and compares it with some traditional sequential algorithms in the literature. The advantage of the proposed implementation is demonstrated in an example involving a relatively large number of variables. The results showed that the FA-MLR, in comparison with the traditional algorithms, is a more suitable choice and a relevant contribution to the variable selection problem. Additionally, the results also demonstrated that the FA-MLR performed on a GPU can be five times faster than its sequential implementation.
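
    The firefly search itself can be sketched compactly. The following CPU-only toy (the paper's version is GPU-accelerated and explicitly multiobjective) moves "dimmer" candidate solutions toward "brighter" ones, binarizes positions into variable subsets via a sigmoid, and scores each subset by ordinary-least-squares RMSE plus a small size penalty. The data and all parameter choices are assumptions made for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Synthetic multivariate calibration data: 40 candidate variables, 5 truly useful.
    n, p = 120, 40
    X = rng.normal(size=(n, p))
    beta_true = np.zeros(p); beta_true[[2, 7, 19, 22, 33]] = [1.5, -2.0, 1.0, 0.7, -1.2]
    y = X @ beta_true + 0.05 * rng.normal(size=n)

    def objective(mask):
        """OLS RMSE on the chosen variables plus a small penalty on subset size
        (a scalarized stand-in for the paper's multiobjective formulation)."""
        idx = np.flatnonzero(mask)
        if idx.size == 0:
            return 1e9
        coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
        resid = y - X[:, idx] @ coef
        return np.sqrt(np.mean(resid ** 2)) + 0.01 * idx.size

    def binarize(pos):
        return (1 / (1 + np.exp(-pos)) > 0.5).astype(int)   # sigmoid transfer to a 0/1 mask

    # Basic firefly dynamics: dimmer fireflies move toward brighter (lower-objective) ones.
    n_ff, gamma, alpha = 15, 1.0, 0.3
    pos = rng.normal(size=(n_ff, p))
    for _ in range(50):
        light = np.array([objective(binarize(q)) for q in pos])
        for i in range(n_ff):
            for j in range(n_ff):
                if light[j] < light[i]:
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    attract = np.exp(-gamma * r2 / p)
                    pos[i] += attract * (pos[j] - pos[i]) + alpha * (rng.random(p) - 0.5)

    best = binarize(pos[np.argmin([objective(binarize(q)) for q in pos])])
    print("variables selected:", np.flatnonzero(best))
    ```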

  15. Publication in a Brazilian journal by Brazilian scientists whose papers have international impact.

    PubMed

    Meneghini, R

    2010-09-01

    Nine Brazilian scientists with an outstanding profile of international publications were invited to publish an original article in the same issue of a Brazilian Journal (Anais da Academia Brasileira de Ciências). The objective was to measure the impact of the papers on the number of citations to the articles, the assumption being that these authors would carry their international prestige to the Brazilian periodical. In a 2-year period there was a larger number of citations of these articles compared to others published in the same journal. Nevertheless, the number of citations in Brazilian journals did not equal the number of citations obtained by the other papers by the same authors in their international publications within the same 2-year period. The reasons for this difference in the number of citations could be either that less significant invited articles were submitted or that it was due to the intrinsic lack of visibility of the Brazilian journals, but this could not be fully determined with the present data. Also relevant was a comparison between the citations of Brazilian journals and the publication in Brazilian journals by these selected authors. A clear imbalance due to a remarkable under-citation of Brazilian authors by authors publishing in Brazilian journals raises the possibility that psychological factors may affect the decision of citing Brazilian journals.

  16. VizieR Online Data Catalog: The CLASS blazar survey. I. (Marcha+, 2001)

    NASA Astrophysics Data System (ADS)

    Marcha, M. J.; Caccianiga, A.; Browne, I. W. A.; Jackson, N.

    2002-04-01

    This paper presents a new complete and well-defined sample of flat-spectrum radio sources (FSRS) selected from the Cosmic Lens All-Sky Survey (CLASS), with the further constraint of a bright (mag<=17.5) optical counterpart. The sample has been designed to produce a large number of low-luminosity blazars in order to test the current unifying models in the low-luminosity regime. In this first paper the new sample is presented and the radio properties of the 325 sources contained therein are discussed. (1 data file).

  17. On the use of ANN interconnection weights in optimal structural design

    NASA Technical Reports Server (NTRS)

    Hajela, P.; Szewczyk, Z.

    1992-01-01

    The present paper describes the use of the interconnection weights of a multilayer feedforward network to extract information pertinent to the mapping space that the network is assumed to represent. In particular, these weights can be used to determine an appropriate network architecture and to assess whether an adequate number of training patterns (input-output pairs) has been used for network training. The weight analysis also provides an approach to assess the influence of each input parameter on a selected output component. The paper shows the significance of this information in decomposition-driven optimal design.
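
    One common way to turn interconnection weights into input-importance estimates is Garson's algorithm; the sketch below uses it as an illustrative stand-in (the paper does not necessarily use this exact procedure) for a single-hidden-layer, single-output network with hypothetical weights.

    ```python
    import numpy as np

    def garson_importance(W_in, W_out):
        """Garson-style importance: share of hidden-to-output weight magnitude that each
        input contributes through its input-to-hidden connections.
        W_in: (n_inputs, n_hidden); W_out: (n_hidden,) for a single output."""
        contrib = np.abs(W_in) * np.abs(W_out)           # (n_inputs, n_hidden)
        contrib /= contrib.sum(axis=0, keepdims=True)    # each input's share of each hidden unit
        importance = contrib.sum(axis=1)
        return importance / importance.sum()

    # Illustrative weights of a trained 4-input, 3-hidden-unit, 1-output network.
    rng = np.random.default_rng(8)
    W_in = rng.normal(size=(4, 3))
    W_out = rng.normal(size=3)
    print(np.round(garson_importance(W_in, W_out), 3))
    ```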

  18. Do highly cited clinicians get more citations when being present at social networking sites?

    PubMed

    Ramezani-Pakpour-Langeroudi, Fatemeh; Okhovati, Maryam; Talebian, Ali

    2018-01-01

    The advent of social networking sites has facilitated the dissemination of scientific research. This article aims to investigate the presence of Iranian highly cited clinicians in social networking sites. This is a scientometrics study. Essential Science Indicator (ESI) was searched for Iranian highly cited papers in clinical medicine during November-December 2015. Then, the authors of the papers were checked and a list of authors was obtained. In the second phase, the authors' names were searched in the selected social networking sites (ResearchGate [RG], Academia, Mendeley, LinkedIn). The total citations and h-index in Scopus were also gathered. Fifty-five highly cited papers were retrieved. A total of 107 authors participated in writing these papers. RG was the most popular (64.5%) and LinkedIn and Academia were in 2nd and 3rd places. None of the authors of highly cited papers were subscribed to Mendeley. A positive direct relationship was observed between visibility on social networking sites and citation and h-index rates. A significant relationship was observed between the RG score, citations, and reads indicators in RG and citation numbers, and there was a significant relationship between the number-of-documents indicator in Academia and citation numbers. It seems that putting papers on social networking sites can influence the citation rate. We recommend that all scientists be present on social networking sites to have a better chance of visibility and citation.

  19. Do highly cited clinicians get more citations when being present at social networking sites?

    PubMed Central

    Ramezani-Pakpour-Langeroudi, Fatemeh; Okhovati, Maryam; Talebian, Ali

    2018-01-01

    BACKGROUND AND AIMS: The advent of social networking sites has facilitated the dissemination of scientific research. This article aims to investigate the presence of Iranian highly cited clinicians in social networking sites. MATERIALS AND METHODS: This is a scientometrics study. Essential Science Indicator (ESI) was searched for Iranian highly cited papers in clinical medicine during November–December 2015. Then, the authors of the papers were checked and a list of authors was obtained. In the second phase, the authors’ names were searched in the selected social networking sites (ResearchGate [RG], Academia, Mendeley, LinkedIn). The total citations and h-index in Scopus were also gathered. RESULTS: Fifty-five highly cited papers were retrieved. A total of 107 authors participated in writing these papers. RG was the most popular (64.5%) and LinkedIn and Academia were in 2nd and 3rd places. None of the authors of highly cited papers were subscribed to Mendeley. A positive direct relationship was observed between visibility on social networking sites and citation and h-index rates. A significant relationship was observed between the RG score, citations, and reads indicators in RG and citation numbers, and there was a significant relationship between the number-of-documents indicator in Academia and citation numbers. CONCLUSION: It seems that putting papers on social networking sites can influence the citation rate. We recommend that all scientists be present on social networking sites to have a better chance of visibility and citation. PMID:29629379

  20. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the stiffness matrix trace (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.

  1. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    NASA Astrophysics Data System (ADS)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities from different portfolios containing a large number of securities. The past records of each security alone do not guarantee future returns. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are a set of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the rate of return of a portfolio instead of the variance as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing metaheuristic algorithms for combinatorial optimization problems. Data from the BSE are used for illustration.
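
    Why the semi-absolute-deviation model stays linear is easiest to see in the crisp (non-fuzzy) case. The sketch below builds a mean semi-absolute-deviation LP directly and solves it with scipy's linprog instead of ACO; the scenario returns, the return target, and the omission of the fuzzy-random λ machinery are all simplifying assumptions made for illustration.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Scenario returns for 5 securities over 60 periods (crisp stand-ins for the paper's
    # fuzzy random returns); mu holds the per-security mean return.
    rng = np.random.default_rng(7)
    T, N = 60, 5
    R = rng.normal(loc=[0.010, 0.008, 0.012, 0.006, 0.009], scale=0.03, size=(T, N))
    mu = R.mean(axis=0)
    target = float(mu.mean())          # required mean portfolio return (always attainable)

    # Decision vector z = [x_1..x_N, d_1..d_T]; minimize mean downside deviation (1/T) sum d_t.
    c = np.r_[np.zeros(N), np.full(T, 1.0 / T)]

    # d_t >= (mu - r_t) . x  (downside deviation), written as (mu - r_t).x - d_t <= 0
    A_dev = np.hstack([mu - R, -np.eye(T)])
    b_dev = np.zeros(T)
    # mean return constraint: -mu.x <= -target
    A_ret = np.r_[-mu, np.zeros(T)][None, :]
    b_ret = np.array([-target])

    A_eq = np.r_[np.ones(N), np.zeros(T)][None, :]       # fully invested: sum x = 1
    b_eq = np.array([1.0])

    res = linprog(c, A_ub=np.vstack([A_dev, A_ret]), b_ub=np.r_[b_dev, b_ret],
                  A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (N + T))
    print("weights:", np.round(res.x[:N], 3), "mean semi-abs deviation:", round(res.fun, 5))
    ```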

  2. Modified Dempster-Shafer approach using an expected utility interval decision rule

    NASA Astrophysics Data System (ADS)

    Cheaito, Ali; Lecours, Michael; Bosse, Eloi

    1999-03-01

    The combination operation of the conventional Dempster-Shafer algorithm has a tendency to increase exponentially the number of propositions involved in bodies of evidence by creating new ones. The aim of this paper is to explore a 'modified Dempster-Shafer' approach to fusing identity declarations emanating from different sources, which include a number of radar, IFF and ESM systems, in order to limit the explosion of the number of propositions. We use a non-ad hoc decision rule based on the expected utility interval to select the most probable object in a comprehensive Platform Data Base containing all the possible identity values that a potential target may take. We study the effect of the redistribution of the confidence levels of the eliminated propositions which would otherwise overload the real-time data fusion system; these eliminated confidence levels can in particular be assigned to ignorance, or uniformly added to the remaining propositions and to ignorance. A scenario has been selected to demonstrate the performance of our modified Dempster-Shafer method of evidential reasoning.
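
    For context, the standard (unmodified) Dempster combination rule can be written in a few lines, and it also makes the proposition-growth problem visible, since intersecting focal elements creates new, smaller propositions. The frame of discernment, the mass assignments, and the dictionary representation below are illustrative assumptions; the paper's modification and its expected-utility-interval decision rule are not reproduced here.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Standard Dempster's rule: masses are dicts mapping frozenset propositions to
        belief mass. Intersections accumulate mass; conflicting (empty) intersections
        are renormalized away. Note how combining sources creates new, smaller
        propositions, which is the growth the paper tries to limit."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Frame of discernment: possible platform identities (hypothetical example).
    F = frozenset
    radar = {F({"fighter", "bomber"}): 0.7, F({"airliner"}): 0.2,
             F({"fighter", "bomber", "airliner"}): 0.1}
    esm   = {F({"fighter"}): 0.6, F({"fighter", "airliner"}): 0.3,
             F({"fighter", "bomber", "airliner"}): 0.1}

    fused = dempster_combine(radar, esm)
    for prop, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
        print(set(prop), round(mass, 3))
    ```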

  3. Laparoscopic repair of perforated peptic duodenal ulcer.

    PubMed

    Busić, Zeljko; Servis, Draien; Slisurić, Ferdinand; Kristek, Jozo; Kolovrat, Marijan; Cavka, Vlatka; Cavka, Mislav; Cupurdija, Kristijan; Patrlj, Leonardo; Kvesić, Ante

    2010-03-01

    Although the prevalence of peptic ulcer is decreasing, the number of peptic ulcer perforations appears to be unchanged. This complication of peptic ulcer is traditionally treated surgically. In recent years, a number of papers have been published in which the authors managed perforated duodenal peptic ulcer in selected patients using a laparoscopic approach. Laparoscopic treatment of perforated duodenal ulcer has been described as safe and advantageous compared to the open technique, but the advantages are still not clear due to the small number of cases in published studies. Based on these recommendations we decided to establish our own protocol for laparoscopic treatment of perforated peptic duodenal ulcer. In this prospective study we evaluated the first 10 patients in whom we performed laparoscopic repair of perforated duodenal ulcer. There were no conversions to open procedure and no early postoperative complications. The patients were contacted by phone a year after the operation, and all were satisfied with the operation and the appearance of postoperative scars. We regard laparoscopic repair of selected patients with perforated duodenal ulcer as a safe and preferable treatment.

  4. A study of metaheuristic algorithms for high dimensional feature selection on microarray data

    NASA Astrophysics Data System (ADS)

    Dankolo, Muhammad Nasiru; Radzi, Nor Haizan Mohamed; Sallehuddin, Roselina; Mustaffa, Noorfa Haszlinna

    2017-11-01

    Microarray systems enable experts to examine gene profiles at the molecular level using machine learning algorithms, increasing the potential for classification and diagnosis of many diseases at the gene expression level. However, numerous difficulties may affect the efficiency of machine learning algorithms, including the vast number of gene features contained in the original data. Many of these features may be unrelated to the intended analysis. Therefore, feature selection is necessary in data pre-processing. Many feature selection algorithms have been developed and applied to microarray data, including metaheuristic optimization algorithms. This paper discusses the application of metaheuristic algorithms for feature selection in microarray datasets. This study reveals that the algorithms have yielded interesting results with limited resources, thereby saving the computational expense of machine learning algorithms.

  5. Distributed estimation for adaptive sensor selection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Hassan Hamid, Matasm M.

    2014-05-01

    Wireless sensor networks (WSNs) are usually deployed for monitoring systems that rely on the distributed detection and estimation capabilities of the sensors. Sensor selection in WSNs is considered for target tracking. A distributed estimation scenario is considered based on the extended information filter. A cost function using the geometrical dilution of precision measure is derived for active sensor selection. A consensus-based estimation method is proposed in this paper for heterogeneous WSNs with two types of sensors. The convergence properties of the proposed estimators are analyzed under time-varying inputs. Accordingly, a new adaptive sensor selection (ASS) algorithm is presented in which the number of active sensors is adaptively determined based on the absolute local innovations vector. Simulation results show that the tracking accuracy of the ASS is comparable to that of the other algorithms.
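    A much-simplified stand-in for innovation-driven sensor activation (not the paper's ASS algorithm; thresholds, noise values and names are assumptions) might look like the following, where sensors whose normalized local innovation exceeds a threshold are kept active:

```python
import numpy as np

rng = np.random.default_rng(1)

def select_active_sensors(z, z_pred, R_diag, gamma=1.0, min_active=2):
    """Keep sensors whose normalized absolute innovation exceeds gamma;
    always retain at least `min_active` sensors (those with the largest innovations)."""
    innovation = np.abs(z - z_pred) / np.sqrt(R_diag)   # normalized local innovations
    active = innovation > gamma
    if active.sum() < min_active:
        active = np.zeros_like(active)
        active[np.argsort(innovation)[-min_active:]] = True
    return active

# Toy example: 8 range sensors observing a scalar target state.
true_state = 5.0
z_pred = np.full(8, 4.7)                        # predicted measurements from the local filters
R_diag = np.full(8, 0.09)                       # measurement-noise variances
z = true_state + rng.normal(0.0, 0.3, size=8)   # noisy measurements
print(select_active_sensors(z, z_pred, R_diag))
```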

  6. A performance analysis in AF full duplex relay selection network

    NASA Astrophysics Data System (ADS)

    Ngoc, Long Nguyen; Hong, Nhu Nguyen; Loan, Nguyen Thi Phuong; Kieu, Tam Nguyen; Voznak, Miroslav; Zdralek, Jaroslav

    2018-04-01

    This paper studies relay selection in amplify-and-forward (AF) cooperative communication with full-duplex (FD) operation. Several relay selection schemes, each assuming the availability of different instantaneous channel information, are investigated. We examine an optimal relay selection scheme that maximizes the instantaneous FD channel capacity and requires global channel state information (CSI), as well as schemes relying only on partial CSI. To enable comparison, exact outage probability expressions and their asymptotic forms, which reveal the diversity order of these strategies, are derived. From these results it can be seen that the number of relays, the noise factor, the transmission coefficient and the transmit power all affect performance. Moreover, the optimal relay selection (ORS) scheme outperforms the partial relay selection (PRS) scheme.
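    The difference between optimal and partial relay selection can be illustrated with a small Monte-Carlo sketch under Rayleigh fading (a simplification that ignores full-duplex self-interference; the outage threshold and channel model are assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(2)
n_relays, trials, threshold = 4, 200_000, 1.0     # outage threshold on end-to-end gain

# Rayleigh fading: exponentially distributed channel power gains per hop.
h_sr = rng.exponential(1.0, (trials, n_relays))   # source -> relay
h_rd = rng.exponential(1.0, (trials, n_relays))   # relay  -> destination
end_to_end = np.minimum(h_sr, h_rd)               # bottleneck gain of each relay path

ors = end_to_end.max(axis=1)                              # ORS: best end-to-end relay (global CSI)
prs = end_to_end[np.arange(trials), h_sr.argmax(axis=1)]  # PRS: relay with the best first hop only

print("ORS outage:", np.mean(ors < threshold))
print("PRS outage:", np.mean(prs < threshold))
```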

  7. Dynamic map labeling.

    PubMed

    Been, Ken; Daiches, Eli; Yap, Chee

    2006-01-01

    We address the problem of filtering, selecting and placing labels on a dynamic map, which is characterized by continuous zooming and panning capabilities. This consists of two interrelated issues. The first is to avoid label popping and other artifacts that cause confusion and interrupt navigation, and the second is to label at interactive speed. In most formulations the static map labeling problem is NP-hard, and a fast approximation might have O(n log n) complexity. Even this is too slow during interaction, when the number of labels shown can be several orders of magnitude less than the number in the map. In this paper we introduce a set of desiderata for "consistent" dynamic map labeling, which has qualities desirable for navigation. We develop a new framework for dynamic labeling that achieves the desiderata and allows for fast interactive display by moving all of the selection and placement decisions into the preprocessing phase. This framework is general enough to accommodate a variety of selection and placement algorithms. It does not appear possible to achieve our desiderata using previous frameworks. Prior to this paper, there were no formal models of dynamic maps or of dynamic labels; our paper introduces both. We formulate a general optimization problem for dynamic map labeling and give a solution to a simple version of the problem. The simple version is based on label priorities and a versatile and intuitive class of dynamic label placements we call "invariant point placements". Despite these restrictions, our approach gives a useful and practical solution. Our implementation is incorporated into the G-Vis system which is a full-detail dynamic map of the continental USA. This demo is available through any browser.
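    A greatly simplified sketch of priority-driven label selection over zoom levels (an illustration only, not the authors' framework) is shown below; labels keep a fixed screen size, so their world-space footprint shrinks as the zoom factor grows, and each label records the first zoom at which it fits without overlapping higher-priority labels. All parameters and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def overlaps(a, b, half_w, half_h):
    """Axis-aligned overlap test for two label boxes centred at world points a and b."""
    return abs(a[0] - b[0]) < 2 * half_w and abs(a[1] - b[1]) < 2 * half_h

def min_active_zoom(anchors, priorities, zooms, screen_w=0.2, screen_h=0.08):
    """For each label, find the smallest zoom at which it can be shown without
    overlapping any higher-priority label placed at that zoom."""
    order = np.argsort(-priorities)               # place important labels first
    first_zoom = np.full(len(anchors), np.inf)
    for z in zooms:
        half_w, half_h = screen_w / (2 * z), screen_h / (2 * z)   # world footprint at zoom z
        placed = []
        for i in order:
            if all(not overlaps(anchors[i], anchors[j], half_w, half_h) for j in placed):
                placed.append(i)
                first_zoom[i] = min(first_zoom[i], z)
    return first_zoom

anchors = rng.random((30, 2))                     # label anchor points in world coordinates
priorities = rng.random(30)
zooms = np.linspace(1, 20, 40)
print(min_active_zoom(anchors, priorities, zooms)[:10])
```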

  8. A Systematic Approach for Model-Based Aircraft Engine Performance Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Garg, Sanjay

    2010-01-01

    A requirement for effective aircraft engine performance estimation is the ability to account for engine degradation, generally described in terms of unmeasurable health parameters such as efficiencies and flow capacities related to each major engine module. This paper presents a linear point design methodology for minimizing the degradation-induced error in model-based aircraft engine performance estimation applications. The technique specifically focuses on the underdetermined estimation problem, where there are more unknown health parameters than available sensor measurements. A condition for Kalman filter-based estimation is that the number of health parameters estimated cannot exceed the number of sensed measurements. In this paper, the estimated health parameter vector will be replaced by a reduced order tuner vector whose dimension is equivalent to the sensed measurement vector. The reduced order tuner vector is systematically selected to minimize the theoretical mean squared estimation error of a maximum a posteriori estimator formulation. This paper derives theoretical estimation errors at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the estimation accuracy achieved through conventional maximum a posteriori and Kalman filter estimation approaches. Maximum a posteriori estimation results demonstrate that reduced order tuning parameter vectors can be found that approximate the accuracy of estimating all health parameters directly. Kalman filter estimation results based on the same reduced order tuning parameter vectors demonstrate that significantly improved estimation accuracy can be achieved over the conventional approach of selecting a subset of health parameters to serve as the tuner vector. However, additional development is necessary to fully extend the methodology to Kalman filter-based estimation applications.

  9. Microbiologic endodontic status of young traumatized tooth.

    PubMed

    Baumotte, Karla; Bombana, Antonio C; Cai, Silvana

    2011-12-01

    Traumatic dental injuries can expose the dentin, and even the pulp, to the oral environment, making their contamination possible. The presence of microorganisms causes pulpal disease and, further, tissue damage in the periradicular region. The therapy of periradicular pathosis follows from a correct diagnosis, which depends on knowledge of the nature and complexity of endodontic infections. As there is no information on the microbiology of primary endodontic infection in young teeth, the aim of the current study was to investigate the microbiologic status of root canals from young permanent teeth with primary endodontic infection. Twelve patients in need of endodontic treatment participated in the study. The selected teeth were uniradicular and had incomplete root formation. They had untreated necrotic pulp. After the access preparation, nineteen microbiologic samples were obtained from the root canals with sterile paper points. Afterwards, the paper points were pooled in a sterile tube containing 2 ml of prereduced transport fluid. The samples were diluted and spread onto plates with selective media for Enterococcus spp. and for yeast species and onto plates with non-selective medium. A quantitative analysis was performed. The mean number of cultivable bacterial cells in the root canals was 5.7 × 10^6. In four samples (21.05%) black-pigmented species were recovered and the mean number of cells was 6.5 × 10^5. One specimen (5.25%) showed growth of Enterococcus species and the mean number of cells in this case was 1.5 × 10^4. The results showed a root canal microbiota of similar design to that seen in completely formed teeth. © 2011 John Wiley & Sons A/S.

  10. Error Analysis of Deep Sequencing of Phage Libraries: Peptides Censored in Sequencing

    PubMed Central

    Matochko, Wadim L.; Derda, Ratmir

    2013-01-01

    Next-generation sequencing techniques empower selection of ligands from phage-display libraries because they can detect low-abundance clones and quantify changes in the copy numbers of clones without excessive selection rounds. Identification of errors in deep sequencing data is the most critical step in this process because these techniques have error rates >1%. Mechanisms that yield errors in Illumina and other techniques have been proposed, but no reports to date describe error analysis in phage libraries. Our paper focuses on error analysis of 7-mer peptide libraries sequenced by the Illumina method. The low theoretical complexity of this phage library, as compared to the complexity of long genetic reads and genomes, allowed us to describe this library using a convenient linear vector and operator framework. We describe a phage library as an N × 1 frequency vector n = ||n_i||, where n_i is the copy number of the i-th sequence and N is the theoretical diversity, that is, the total number of all possible sequences. Any manipulation of the library is an operator acting on n. Selection, amplification, or sequencing can be described as a product of an N × N matrix and a stochastic sampling operator (S_a). The latter is a random diagonal matrix that describes sampling of a library. In this paper, we focus on the properties of S_a and use them to define the sequencing operator (Seq). Sequencing without any bias and errors is Seq = S_a I_N, where I_N is the N × N identity matrix. Any bias in sequencing changes I_N to a non-identity matrix. We identified a diagonal censorship matrix (CEN), which describes elimination, or statistically significant downsampling, of specific reads during the sequencing process. PMID:24416071
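    Since the operators above are diagonal, they can be stored as vectors; the sketch below (with an arbitrary toy diversity N and hypothetical names, not the paper's data) illustrates a stochastic sampling operator and a censorship mask acting on a copy-number vector:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 10_000                                     # toy "theoretical diversity" (a real 7-mer space is 20**7)
n = rng.integers(0, 50, size=N).astype(float)  # copy-number vector of the library

def sampling_operator(n_vec, depth):
    """Stochastic sampling S_a: draw `depth` reads with probability proportional to
    copy number, and return the diagonal factors that map n to the sampled counts."""
    p = n_vec / n_vec.sum()
    counts = rng.multinomial(depth, p).astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        diag = np.where(n_vec > 0, counts / n_vec, 0.0)
    return diag                                  # sampled counts = diag * n_vec (elementwise)

def censorship_operator(censored_idx, size):
    """CEN: a diagonal 0/1 mask that eliminates specific sequences from the reads."""
    diag = np.ones(size)
    diag[censored_idx] = 0.0
    return diag

s_a = sampling_operator(n, depth=200_000)
cen = censorship_operator(censored_idx=rng.choice(N, 100, replace=False), size=N)
reads = cen * s_a * n                            # observed read counts after sampling and censorship
print(int(reads.sum()), "reads survive;", int((reads == 0).sum()), "sequences unobserved")
```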

  11. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were considered to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks. As a result, high-density PE gives the lowest environmental impact while retaining a high maximum stress value.

  12. The Continuum of Health Professions

    PubMed Central

    Jensen, Clyde B.

    2015-01-01

    The large number of health care professions with overlapping scopes of practice is intimidating to students, confusing to patients, and frustrating to policymakers. As abundant and diverse as the hundreds of health care professions are, they possess sufficient numbers of common characteristics to warrant their placement on a common continuum of health professions that permits methodical comparisons. From 2009–2012, the author developed and delivered experimental courses at 2 community colleges for the purposes of creating and validating a novel method for comparing health care professions. This paper describes the bidirectional health professions continuum that emerged from these courses and its potential value in helping students select a health care career, motivating health care providers to seek interprofessional collaboration, assisting patients with the selection of health care providers, and helping policymakers to better understand the health care professions they regulate. PMID:26770147

  13. A probabilistic union model with automatic order selection for noisy speech recognition.

    PubMed

    Jancovic, P; Ming, J

    2001-09-01

    A critical issue in exploiting the potential of the sub-band-based approach to robust speech recognition is the method of combining the sub-band observations, for selecting the bands unaffected by noise. A new method for this purpose, i.e., the probabilistic union model, was recently introduced. This model has been shown to be capable of dealing with band-limited corruption, requiring no knowledge about the band position and statistical distribution of the noise. A parameter within the model, which we call its order, gives the best results when it equals the number of noisy bands. Since this information may not be available in practice, in this paper we introduce an automatic algorithm for selecting the order, based on the state duration pattern generated by the hidden Markov model (HMM). The algorithm has been tested on the TIDIGITS database corrupted by various types of additive band-limited noise with unknown noisy bands. The results have shown that the union model equipped with the new algorithm can achieve a recognition performance similar to that achieved when the number of noisy bands is known. The results show a very significant improvement over the traditional full-band model, without requiring prior information on either the position or the number of noisy bands. The principle of the algorithm for selecting the order based on state duration may also be applied to other sub-band combination methods.

  14. Comparison of Sensor Selection Mechanisms for an ERP-Based Brain-Computer Interface

    PubMed Central

    Metzen, Jan H.

    2013-01-01

    A major barrier for a broad applicability of brain-computer interfaces (BCIs) based on electroencephalography (EEG) is the large number of EEG sensor electrodes typically used. The necessity for this results from the fact that the relevant information for the BCI is often spread over the scalp in complex patterns that differ depending on subjects and application scenarios. Recently, a number of methods have been proposed to determine an individual optimal sensor selection. These methods have, however, rarely been compared against each other or against any type of baseline. In this paper, we review several selection approaches and propose one additional selection criterion based on the evaluation of the performance of a BCI system using a reduced set of sensors. We evaluate the methods in the context of a passive BCI system that is designed to detect a P300 event-related potential and compare the performance of the methods against randomly generated sensor constellations. For a realistic estimation of the reduced system's performance we transfer sensor constellations found on one experimental session to a different session for evaluation. We identified notable (and unanticipated) differences among the methods and could demonstrate that the best method in our setup is able to reduce the required number of sensors considerably. Though our application focuses on EEG data, all presented algorithms and evaluation schemes can be transferred to any binary classification task on sensor arrays. PMID:23844021
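    One of the simplest performance-based selection criteria, greedy forward selection scored by cross-validated classification accuracy, can be sketched as follows (synthetic data, one feature per "sensor", and an LDA classifier are assumptions for illustration, not the paper's pipeline):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# Stand-in for epoch features: one feature per "sensor" (real ERP features are richer).
X, y = make_classification(n_samples=400, n_features=32, n_informative=6, random_state=0)

def score(sensors):
    """Cross-validated accuracy using only the given sensor columns."""
    return cross_val_score(LinearDiscriminantAnalysis(), X[:, sensors], y, cv=5).mean()

def greedy_forward(k):
    """Add, one at a time, the sensor that most improves the reduced system's accuracy."""
    chosen = []
    while len(chosen) < k:
        remaining = [s for s in range(X.shape[1]) if s not in chosen]
        best = max(remaining, key=lambda s: score(chosen + [s]))
        chosen.append(best)
    return chosen

k = 8
selected = greedy_forward(k)
random_baseline = np.mean([score(list(rng.choice(32, k, replace=False))) for _ in range(20)])
print("greedy :", round(score(selected), 3), selected)
print("random :", round(random_baseline, 3))
```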

  15. Current Issues in Maternal and Paternal Deprivation. Unit for Child Studies Selected Papers Number 6.

    ERIC Educational Resources Information Center

    Phillips, Shelley

    An overview of some major current issues in maternal and paternal deprivation is presented. Parts I and II focus on (1) single parents and issues in paternal deprivation and (2) sex stereotyping and issues in maternal deprivation, respectively. More particularly, Part I discusses the effects of divorce and death on children and the problem of…

  16. Bridging Cultures and Traditions for Educational and International Development: Comparative Research, Dialogue and Difference

    ERIC Educational Resources Information Center

    Crossley, Michael

    2008-01-01

    Addressing the central theme of the XIII World Congress, the paper explores a number of contemporary theoretical, methodological and organisational developments in the field of comparative education. In doing so it draws upon the author's recent work and a selection of studies carried out in the South Pacific, the Caribbean and Africa. It is…

  17. Your Child's Self Esteem, a Family Affair. Unit for Child Studies Selected Papers Number 7.

    ERIC Educational Resources Information Center

    Fahey, Mary

    Two lectures on the topic of self-esteem are provided. The purposes of the first lecture are to help parents and professionals better appreciate themselves as support persons and to help them develop and practice skills which will lead to building positive self-esteem both within themselves and others. Five activities are provided to further…

  18. Peace and Violence in the School: A Constructive Curriculum. Selected Papers Number 59.

    ERIC Educational Resources Information Center

    Larsson, Yvonne

    In order to promote peace in our personal lives and in our world it is necessary for teachers to espouse peace education throughout the process and content of their educational systems. Peace education refers to a non-authoritarian educational process that is compatible with peace and avoids all structural violence, not just education about peace.…

  19. Challenging Gifted Learners: General Principles for Science Educators; and Exemplification in the Context of Teaching Chemistry

    ERIC Educational Resources Information Center

    Taber, Keith S.

    2010-01-01

    There is concern in some countries about the number of able young people entering degree-level study and careers in physical science, including chemistry. Too few of the most talented young people are selecting "STEM" subjects to ensure the future supply of scientists, engineers and related professionals. The present paper sets out general…

  20. Education and Training of Adults in the Context of Scientific and Technological Development. A Summary.

    ERIC Educational Resources Information Center

    Ohayon-Kaczmarek, Marit

    This paper summarizes 14 case studies from 13 countries commissioned by Unesco to describe the educational provisions for adults in the special context of scientific and technological development and to study a selected number of programs at greater depth. The countries chosen by Unesco are all countries with developed or developing industries,…

  1. A Tuning-AHELO Conceptual Framework of Expected Desired/Learning Outcomes in Engineering. OECD Education Working Papers, Number 60

    ERIC Educational Resources Information Center

    OECD Publishing (NJ1), 2011

    2011-01-01

    The OECD Secretariat, at the invitation of the AHELO Group of National Experts, contracted the Tuning Association to undertake initial development work on learning outcomes to be used for valid and reliable assessments of students from diverse institutions and countries. The two disciplines selected for the AHELO Feasibility Study are engineering…

  2. Understanding Why Students Participate in Multiple Surveys: Who are the Hard-Core Responders?

    ERIC Educational Resources Information Center

    Porter, Stephen R.; Whitcomb, Michael E.

    2004-01-01

    What causes a student to participate in a survey? This paper looks at survey response across multiple surveys to understand who the hard-core survey responders and non-responders are. Students at a selective liberal arts college were administered four different surveys throughout the 2002-2003 academic year, and we use the number of surveys…

  3. Parental Attitudes to Open and Traditional Education. Unit for Child Studies Selected Papers Number 5.

    ERIC Educational Resources Information Center

    Waterhouse, Marie

    The major focus of interest in the present research is the question of congruence between parental and school attitudes toward issues of authority and freedom. It was hypothesized that the child's adjustment to his/her particular type of classroom (either open or traditional) would be affected by whether he/she came from a family which shared…

  4. Auditing Subject English: A Review of Text Selection Practices Inspired by the National Year of Reading

    ERIC Educational Resources Information Center

    Davies, Larissa McLean

    2012-01-01

    The year 2012 is significant for English teachers in Australia, not only is it the National Year of Reading, but it is also the year when an increasing number of English teachers across the country are implementing the "Australian Curriculum: English," the first national curriculum in the history of the nation. This paper addresses the…

  5. The Student Monograph. Original Articles by Student Gammans, 1995 Edition. The Eta Sigma Gamma Monograph Series, Volume 13, Number 1.

    ERIC Educational Resources Information Center

    1995

    This publication contains a selection of undergraduate and graduate student research papers offered to the Eta Sigma Gamma Society in the health science disciplines. Articles include: (1) "The Development of Public/Private Partnerships and Their Impact on the Future of Public Health" (James Broadbear); (2) "Cancer Knowledge,…

  6. Recruiting, Educating, and Training Librarians for Collection Development. New Directions in Information Management, Number 33.

    ERIC Educational Resources Information Center

    Johnson, Peggy, Ed.; Intner, Sheila S., Ed.

    Collection development as it is practiced now, and as it will continue to be practiced, is presented in its varying aspects. This book is a collection of 15 papers related to recruiting, educating, and training librarians for collection development as well as implications for the future. The titles include: "Book Selection and Collection…

  7. Volunteer Rehabilitation Technology: International Perspectives and Possibilities. Report of a Symposium Sponsored by the RESNA (ICAART) Conference (Montreal, Canada, June 27, 1988). Monograph Number Forty-Two.

    ERIC Educational Resources Information Center

    Tobias, Jim, Ed.; Woods, Diane E., Ed.

    Symposium papers describe programs which use volunteers to provide rehabilitation technology services. George Winston describes Australia's Technical Aid to the Disabled (TAD), focusing on volunteer recruitment and selection, legal liability, volunteer insurance, advantages and limitations of the volunteer approach, and the TAD organization,…

  8. Sleep Problems, Overtiredness and Overanxiety and Frustrating Children: From Birth to Preschool. Unit for Child Studies. Selected Papers Number 24.

    ERIC Educational Resources Information Center

    Harris, Michael J.

    The first of the two discussions presented here, "Sleep Problems, Overtiredness and Overanxiety," describes sleeping behavior of children from birth to 3 years of age and considers situations that affect children's sleep. Topics briefly addressed include the physiology of sleep; developmental aspects of sleep patterns; the effect of lack…

  9. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
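    The general procedure, reducing dimensionality and tracking classification accuracy as the number of retained dimensions grows, can be sketched for the PCA case as follows (using a stand-in dataset and a naive Bayes classifier rather than object-code features and the paper's classifier):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# Stand-in dataset; the paper classifies features extracted from compiled object code.
X, y = load_digits(return_X_y=True)

for n_dims in (2, 5, 10, 20, 40):
    # Project onto the first n_dims principal components, then fit a simple classifier.
    pipe = make_pipeline(PCA(n_components=n_dims), GaussianNB())
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{n_dims:>3} dimensions -> accuracy {acc:.3f}")
```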

  10. PSO algorithm enhanced with Lozi Chaotic Map - Tuning experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan

    2015-03-10

    This paper investigates the effect of tuning the control parameters of the Lozi chaotic map when it is employed as a chaotic pseudo-random number generator for the particle swarm optimization (PSO) algorithm. Three different benchmark functions are selected from the IEEE CEC 2013 competition benchmark set. The Lozi map is extensively tuned and the performance of PSO is evaluated.
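    A minimal sketch of the idea, assuming the commonly used Lozi-map parameters a = 1.7 and b = 0.5 and a crude rescaling of the attractor to [0, 1] (these choices and the benchmark are assumptions, not the paper's tuned settings), replaces the uniform random draws in a basic PSO velocity update with chaotic numbers:

```python
import numpy as np

class LoziGenerator:
    """Chaotic pseudo-random numbers from the Lozi map, rescaled to roughly [0, 1]."""
    def __init__(self, a=1.7, b=0.5, x=0.1, y=0.1):
        self.a, self.b, self.x, self.y = a, b, x, y

    def rand(self, size):
        out = np.empty(size)
        for i in range(size):
            self.x, self.y = 1.0 - self.a * abs(self.x) + self.b * self.y, self.x
            out[i] = (self.x + 2.0) / 4.0      # crude rescaling of the attractor range
        return out

def sphere(x):                                  # simple benchmark function
    return np.sum(x ** 2, axis=-1)

def pso(f, dim=10, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    gen = LoziGenerator()
    pos = gen.rand(swarm * dim).reshape(swarm, dim) * 10 - 5
    vel = np.zeros((swarm, dim))
    pbest, pbest_val = pos.copy(), f(pos)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1 = gen.rand(swarm * dim).reshape(swarm, dim)   # chaotic numbers replace uniform draws
        r2 = gen.rand(swarm * dim).reshape(swarm, dim)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = f(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best, best_val = pso(sphere)
print("best value:", round(best_val, 6))
```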

  11. Size Is Big or Little: An Approach to the Dimensionality of Children's Concepts.

    ERIC Educational Resources Information Center

    Webb, Roger A.

    This paper reports a study carried out with 14 children (ranging in age from 2.8 to 3.5 years) which investigated children's concepts of difference. Pairs of small objects differing on a number of dimensions were presented to the children. As each pair of objects was presented, children were asked to select the object that was…

  12. Laser modification of macroscopic properties of metal surface layer

    NASA Astrophysics Data System (ADS)

    Kostrubiec, Franciszek

    1995-03-01

    Surface laser treatment of metals comprises a number of diverse technological operations, of which the following can be considered the most common: oxidation and surface amorphization, surface hardening of steel, and modification of selected physical properties of metal surface layers. In this paper basic results of laser treatment of a group of metals used as base materials for electric contacts are presented. The aim of the study was to test the usability of laser treatment from the viewpoint of the requirements imposed on materials for electric contacts. The results presented in the paper refer to two different surface treatment technologies: (1) modification of the surface layer of infusible metals (tungsten and molybdenum) through laser fusing of the surface layer and its crystallization, and (2) modification of the surface layer properties of other metals through laser doping of the surface layer with foreign elements. A number of experimental results obtained by the team under the author's supervision are presented.

  13. Analytical network process based optimum cluster head selection in wireless sensor network.

    PubMed

    Farman, Haleem; Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable health-monitoring sensors and a plethora of other areas. A WSN comprises hundreds to thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time and limited memory become even more pronounced. In such a case, network lifetime mainly depends on efficient use of the available resources. Organizing nearby nodes into clusters makes it convenient to efficiently manage each cluster as well as the overall network. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge-and-split technique was proposed to construct the network topology. Having constructed the topology with this technique, we use an analytical network process (ANP) model for cluster head (CH) selection in the WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH) and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision system, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSNs. In addition, sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and minimizing the CH reselection process, which results in extending overall network lifetime. The paper also analyzes how the ANP method used for CH selection provides a better understanding of the dependencies among the different components involved in the evaluation process.
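    A full ANP model builds pairwise-comparison matrices over a network of criteria; the sketch below instead substitutes fixed criterion weights and a simple weighted-sum score over the five parameters named above, so it should be read as an illustration of multi-criteria CH scoring rather than the authors' method (all values and weights are synthetic):

```python
import numpy as np

rng = np.random.default_rng(6)
n_nodes = 12

# Five criteria from the abstract (synthetic values): distance to other nodes, residual
# energy, distance to the cluster centroid, times already chosen as CH, merged-node flag.
dist_node = rng.uniform(5, 50, n_nodes)
energy    = rng.uniform(0.2, 1.0, n_nodes)
dist_cent = rng.uniform(1, 30, n_nodes)
times_ch  = rng.integers(0, 5, n_nodes)
merged    = rng.integers(0, 2, n_nodes)

def normalize(x, benefit=True):
    """Min-max normalize; invert cost criteria so that larger is always better."""
    x = (x - x.min()) / (x.max() - x.min() + 1e-12)
    return x if benefit else 1.0 - x

# Simplified stand-in for ANP priorities: fixed criterion weights.
weights = np.array([0.20, 0.35, 0.20, 0.15, 0.10])
criteria = np.stack([
    normalize(dist_node, benefit=False),                 # closer to the other nodes is better
    normalize(energy, benefit=True),                     # more residual energy is better
    normalize(dist_cent, benefit=False),                 # closer to the centroid is better
    normalize(times_ch.astype(float), benefit=False),    # rotate the CH role
    normalize(merged.astype(float), benefit=True),
], axis=1)

scores = criteria @ weights
print("cluster head -> node", scores.argmax(), "score", round(scores.max(), 3))
```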

  14. Analytical network process based optimum cluster head selection in wireless sensor network

    PubMed Central

    Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable health-monitoring sensors and a plethora of other areas. A WSN comprises hundreds to thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time and limited memory become even more pronounced. In such a case, network lifetime mainly depends on efficient use of the available resources. Organizing nearby nodes into clusters makes it convenient to efficiently manage each cluster as well as the overall network. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge-and-split technique was proposed to construct the network topology. Having constructed the topology with this technique, we use an analytical network process (ANP) model for cluster head (CH) selection in the WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH) and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision system, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSNs. In addition, sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and minimizing the CH reselection process, which results in extending overall network lifetime. The paper also analyzes how the ANP method used for CH selection provides a better understanding of the dependencies among the different components involved in the evaluation process. PMID:28719616

  15. Carbon Nanotubes Filled with Ferromagnetic Materials

    PubMed Central

    Weissker, Uhland; Hampel, Silke; Leonhardt, Albrecht; Büchner, Bernd

    2010-01-01

    Carbon nanotubes (CNT) filled with ferromagnetic metals like iron, cobalt or nickel are new and very interesting nanostructured materials with a number of unique properties. In this paper we give an overview about different chemical vapor deposition (CVD) methods for their synthesis and discuss the influence of selected growth parameters. In addition we evaluate possible growth mechanisms involved in their formation. Moreover we show their identified structural and magnetic properties. On the basis of these properties we present different application possibilities. Some selected examples reveal the high potential of these materials in the field of medicine and nanotechnology. PMID:28883334

  16. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper proposes a portfolio selection problem that takes an investor's subjectivity into account, together with a sensitivity analysis for changes in that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well defined. Therefore, by introducing the Sharpe ratio, one of the important performance measures of portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using sensitivity analysis for the fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.

  17. An acoustic sensitivity study of general aviation propellers

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.; Keiter, I.

    1980-01-01

    This paper describes the results of a study in which a systematic approach has been taken in studying the effect of selected propeller parameters on the character and magnitude of propeller noise. Four general aviation aircraft were chosen, i.e., a Cessna 172, Cessna 210, Cessna 441, and a 19 passenger commuter concept, to provide a range in flight velocity, engine horsepower, and gross weight. The propeller parameters selected for examination consisted of number of blades, rpm reduction, thickness/chord reduction, activity factor reduction, proplets, airfoil improvement, sweep, position of maximum blade loading, and diameter reduction.

  18. The evaluation of a number of prototypes for the free-tip rotor constant-moment controller

    NASA Technical Reports Server (NTRS)

    Young, L. A.

    1986-01-01

    The development of several prototypes of a constant moment controller, a critical component of the free-tip rotor (FTR) concept, is described. Also presented are the experimental results of a whirl test designed to select a final controller configuration to be included in a future wind-tunnel test of this innovative rotor system. A brief explanation of the FTR concept and its history are included. The paper documents the controller design constraints, each prototype's operating principle, the evaluation test, and the individual prototype test results. A recommended design is identified, along with the selection rationale.

  19. Sonic-boom research: Selected bibliography with annotation

    NASA Technical Reports Server (NTRS)

    Hubbard, H. H.; Maglieri, D. J.; Stephens, D. G.

    1986-01-01

    Citations of selected documents are included which represent the state of the art of technology in each of the following subject areas: prediction, measurement, and minimization of steady-flight sonic booms; prediction and measurement of accelerating-flight sonic booms; sonic-boom propagation; the effects of sonic booms on people, communities, structures, animals, birds, and terrain; and sonic-boom simulator technology. Documents are listed in chronological order in each section of the paper, with key documents and associated annotation listed first. The sources are given along with acquisition numbers, when available, to expedite the acquisition of copies of the documents.

  20. Collected software engineering papers, volume 7

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  1. A model of two-way selection system for human behavior.

    PubMed

    Zhou, Bin; Qin, Shujia; Han, Xiao-Pu; He, Zhe; Xie, Jia-Rong; Wang, Bing-Hong

    2014-01-01

    Two-way selection is a common phenomenon in nature and society. It appears in processes such as choosing a mate between men and women, making contracts between job hunters and recruiters, and trading between buyers and sellers. In this paper, we propose a model of a two-way selection system and present its analytical solution for the expected total number of successful matches, together with the regular pattern that the matching rate tends toward an inverse proportion to either the ratio between the two sides or the ratio of the total number of states to the size of the smaller group. The proposed model is verified by empirical data from matchmaking fairs. Results indicate that the model predicts this typical real-world two-way selection behavior well, within a bounded error, and it is thus helpful for understanding the dynamical mechanism of real-world two-way selection systems.

  2. Novel Harmonic Regularization Approach for Variable Selection in Cox's Proportional Hazards Model

    PubMed Central

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso-type methods. PMID:25506389

  3. A Self-Adaptive Fuzzy c-Means Algorithm for Determining the Optimal Number of Clusters

    PubMed Central

    Wang, Zhihao; Yi, Jing

    2016-01-01

    To address the shortcoming of the fuzzy c-means (FCM) algorithm that the number of clusters must be known in advance, this paper proposed a new self-adaptive method to determine the optimal number of clusters. Firstly, a density-based algorithm was put forward. The algorithm, according to the characteristics of the dataset, automatically determined the possible maximum number of clusters instead of using the empirical rule √n, and obtained the optimal initial cluster centroids, mitigating the limitation of FCM that randomly selected cluster centroids can lead the convergence result to a local minimum. Secondly, by introducing a penalty function, this paper proposed a new fuzzy clustering validity index based on fuzzy compactness and separation, which ensured that, as the number of clusters approached the number of objects in the dataset, the value of the clustering validity index did not monotonically decrease toward zero, so that the optimal number of clusters did not lose robustness and decision function. Then, based on these studies, a self-adaptive FCM algorithm was put forward to estimate the optimal number of clusters by an iterative trial-and-error process. Finally, experiments were done on the UCI, KDD Cup 1999, and synthetic datasets, which showed that the method not only effectively determined the optimal number of clusters, but also reduced the number of FCM iterations while giving a stable clustering result. PMID:28042291
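    A minimal sketch of selecting the cluster number by a validity index is given below; it uses a plain FCM implementation and the standard Xie-Beni index rather than the paper's proposed index, and the data are synthetic, so it illustrates the trial-and-error loop only:

```python
import numpy as np
from sklearn.datasets import make_blobs

rng = np.random.default_rng(7)
X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.8, random_state=0)

def fcm(X, c, m=2.0, iters=100, tol=1e-5):
    """Plain fuzzy c-means: returns centroids and the membership matrix U (n x c)."""
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)               # random fuzzy partition
    for _ in range(iters):
        um = U ** m
        centroids = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centroids, U

def xie_beni(X, centroids, U, m=2.0):
    """Xie-Beni validity index: compact and well-separated partitions score low."""
    d2 = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) ** 2
    compact = np.sum((U ** m) * d2)
    sep = np.min([np.sum((a - b) ** 2) for i, a in enumerate(centroids) for b in centroids[i + 1:]])
    return compact / (len(X) * sep)

# Trial-and-error over candidate cluster counts, keeping the best validity value.
best_c = min(range(2, 8), key=lambda c: xie_beni(X, *fcm(X, c)))
print("estimated number of clusters:", best_c)
```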

  4. Recognition physical activities with optimal number of wearable sensors using data mining algorithms and deep belief network.

    PubMed

    Al-Fatlawi, Ali H; Fatlawi, Hayder K; Sai Ho Ling

    2017-07-01

    Monitoring daily physical activities benefits the health care field in several ways, particularly with the development of wearable sensors. This paper adopts effective ways to calculate the optimal number of necessary sensors and to build a reliable, high-accuracy monitoring system. Three data mining algorithms, namely Decision Tree, Random Forest and the PART algorithm, have been applied for the sensor selection process. Furthermore, the deep belief network (DBN) has been investigated to recognise 33 physical activities effectively. The results indicate that the proposed method is reliable, with an overall accuracy of 96.52%, and that the number of sensors is minimised from nine to six.
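    One of several ways to rank sensors with a data mining algorithm is to aggregate tree-based feature importances per sensor; the sketch below uses a Random Forest for both ranking and classification on synthetic data (the DBN classifier and the 33-activity dataset are not reproduced here, and the sensor/feature layout is an assumption):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: 9 "sensors" contributing 8 features each.
n_sensors, feats_per_sensor = 9, 8
X, y = make_classification(n_samples=800, n_features=n_sensors * feats_per_sensor,
                           n_informative=20, random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Aggregate per-feature importances into one importance score per sensor.
sensor_importance = forest.feature_importances_.reshape(n_sensors, feats_per_sensor).sum(axis=1)
keep = np.argsort(sensor_importance)[-6:]        # keep the 6 most informative sensors

cols = np.concatenate([np.arange(s * feats_per_sensor, (s + 1) * feats_per_sensor) for s in keep])
acc_all  = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
acc_kept = cross_val_score(RandomForestClassifier(random_state=0), X[:, cols], y, cv=5).mean()
print(f"all 9 sensors: {acc_all:.3f}   best 6 sensors: {acc_kept:.3f}")
```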

  5. Analysis of Content Shared in Online Cancer Communities: Systematic Review

    PubMed Central

    van de Poll-Franse, Lonneke V; Krahmer, Emiel; Verberne, Suzan; Mols, Floortje

    2018-01-01

    Background The content that cancer patients and their relatives (ie, posters) share in online cancer communities has been researched in various ways. In the past decade, researchers have used automated analysis methods in addition to manual coding methods. Patients, providers, researchers, and health care professionals can learn from experienced patients, provided that their experience is findable. Objective The aim of this study was to systematically review all relevant literature that analyzes user-generated content shared within online cancer communities. We reviewed the quality of available research and the kind of content that posters share with each other on the internet. Methods A computerized literature search was performed via PubMed (MEDLINE), PsycINFO (5 and 4 stars), Cochrane Central Register of Controlled Trials, and ScienceDirect. The last search was conducted in July 2017. Papers were selected if they included the following terms: (cancer patient) and (support group or health communities) and (online or internet). We selected 27 papers and then subjected them to a 14-item quality checklist independently scored by 2 investigators. Results The methodological quality of the selected studies varied: 16 were of high quality and 11 were of adequate quality. Of those 27 studies, 15 were manually coded, 7 automated, and 5 used a combination of methods. The best results can be seen in the papers that combined both analytical methods. The number of analyzed posts ranged from 200 to 1,500,000; the number of analyzed posters ranged from 75 to 90,000. The studies analyzing large numbers of posts mainly related to breast cancer, whereas those analyzing small numbers were related to other types of cancers. A total of 12 studies involved some or entirely automatic analysis of the user-generated content. All the authors referred to two main content categories: informational support and emotional support. In all, 15 studies reported only on the content, 6 studies explicitly reported on content and social aspects, and 6 studies focused on emotional changes. Conclusions In the future, increasing amounts of user-generated content will become available on the internet. The results of content analysis, especially of the larger studies, give detailed insights into patients’ concerns and worries, which can then be used to improve cancer care. To make the results of such analyses as usable as possible, automatic content analysis methods will need to be improved through interdisciplinary collaboration. PMID:29615384

  6. Selective progesterone receptor modulators 3: use in oncology, endocrinology and psychiatry.

    PubMed

    Benagiano, Giuseppe; Bastianelli, Carlo; Farris, Manuela

    2008-10-01

    A number of synthetic steroids are capable of modulating progesterone receptors with a spectrum of activities ranging from pure antagonism to a mixture of agonism and antagonism. The best known of these are mifepristone (RU 486), asoprisnil (J 867), onapristone (ZK 98299), ulipristal (CDB 2914), Proellex (CDB 4124), ORG 33628 and ORG 31710. Outside reproduction, selective modulators of progesterone receptors have been under investigation for a large variety of indications, for example in oncology as adjuvants in breast, cervical, endometrial, ovarian and prostate cancer, as well as in inoperable meningioma and leiomyosarcoma. In addition, they have been used as antiglucocorticoids. It is therefore useful to review the results obtained in these conditions. A careful evaluation of existing major review papers and of recently published articles was carried out for the indications under review, focusing not only on mifepristone but also on those other selective modulators of progesterone receptors for which data are available. In preliminary studies, selective modulators of progesterone receptors had some activity against a number of neoplasias. Their antiglucocorticoid activity has been tested with some success in Cushing's syndrome, several psychiatric conditions (e.g., mood disorders and Alzheimer's disease) and acute renal failure. Finally, they are being used in a gene regulator system.

  7. Gene selection for tumor classification using neighborhood rough sets and entropy measures.

    PubMed

    Chen, Yumin; Zhang, Zunjun; Zheng, Jianzhong; Ma, Ying; Xue, Yu

    2017-03-01

    With the development of bioinformatics, tumor classification from gene expression data has become an important and useful technology for cancer diagnosis. Since gene expression data often contain thousands of genes and only a small number of samples, gene selection from gene expression data becomes a key step for tumor classification. Attribute reduction based on rough sets has been successfully applied to the gene selection field, as it is data driven and requires no additional information. However, traditional rough set methods deal with discrete data only. Gene expression data containing real-valued or noisy measurements are usually subjected to a discretization preprocessing step, which may result in poor classification accuracy. In this paper, we propose a novel gene selection method based on the neighborhood rough set model, which can deal with real-valued data while maintaining the original gene classification information. Moreover, this paper introduces an entropy measure within the framework of neighborhood rough sets for tackling the uncertainty and noise of gene expression data. Using this measure leads to the discovery of compact gene subsets. Finally, a gene selection algorithm is designed based on neighborhood granules and the entropy measure. Experiments on two gene expression datasets show that the proposed gene selection is an effective method for improving the accuracy of tumor classification. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Collected software engineering papers, volume 8

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.

  9. Collected Software Engineering Papers, Volume 10

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  10. Outage Analysis of Dual-hop Cognitive Networks with Relay Selection over Nakagami-m Fading Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Zongsheng; Pi, Xurong

    2014-09-01

    In this paper, we investigate the outage performance of decode-and-forward cognitive relay networks over Nakagami-m fading channels, considering both best-relay selection and interference constraints. Focusing on relay selection and making use of the underlay cognitive approach, an exact closed-form outage probability expression is derived in an independent, non-identically distributed Nakagami-m environment. The closed-form outage probability provides an efficient means to evaluate the effects of the maximum allowable interference power, the number of cognitive relays, and the channel conditions between the primary user and cognitive users. Finally, we present numerical results to validate the theoretical analysis. Moreover, the simulation results show that the system achieves full diversity.
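    A small Monte-Carlo sketch of best-relay selection over Nakagami-m fading (ignoring the underlay interference constraint, with assumed rate, SNR and fading parameters) illustrates how the outage probability falls as the number of relays grows:

```python
import numpy as np

rng = np.random.default_rng(8)

def outage_best_relay(n_relays, m, omega, snr_db, rate=1.0, trials=200_000):
    """Monte-Carlo outage of DF best-relay selection over Nakagami-m fading;
    channel power gains are Gamma(m, omega/m) distributed."""
    snr = 10 ** (snr_db / 10)
    g_sr = rng.gamma(m, omega / m, (trials, n_relays))   # source -> relay power gains
    g_rd = rng.gamma(m, omega / m, (trials, n_relays))   # relay -> destination power gains
    # DF end-to-end capacity of each relay path is limited by its weaker hop.
    cap = 0.5 * np.log2(1 + snr * np.minimum(g_sr, g_rd))
    best = cap.max(axis=1)                               # best-relay selection
    return np.mean(best < rate)

for k in (1, 2, 4):
    print(f"{k} relay(s): outage = {outage_best_relay(k, m=2, omega=1.0, snr_db=10):.4f}")
```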

  11. Analytics for vaccine economics and pricing: insights and observations.

    PubMed

    Robbins, Matthew J; Jacobson, Sheldon H

    2015-04-01

    Pediatric immunization programs in the USA are a successful and cost-effective public health endeavor, profoundly reducing mortalities caused by infectious diseases. Two important issues relate to the success of the immunization programs, the selection of cost-effective vaccines and the appropriate pricing of vaccines. The recommended childhood immunization schedule, published annually by the CDC, continues to expand with respect to the number of injections required and the number of vaccines available for selection. The advent of new vaccines to meet the growing requirements of the schedule results in a large, combinatorial number of possible vaccine formularies. The expansion of the schedule and the increase in the number of available vaccines constitute a challenge for state health departments, large city immunization programs, private practices and other vaccine purchasers, as a cost-effective vaccine formulary must be selected from an increasingly large set of possible vaccine combinations to satisfy the schedule. The pediatric vaccine industry consists of a relatively small number of pharmaceutical firms engaged in the research, development, manufacture and distribution of pediatric vaccines. The number of vaccine manufacturers has dramatically decreased in the past few decades for a myriad of reasons, most notably due to low profitability. The contraction of the industry negatively impacts the reliable provision of pediatric vaccines. The determination of appropriate vaccine prices is an important issue and influences a vaccine manufacturer's decision to remain in the market. Operations research is a discipline that applies advanced analytical methods to improve decision making; analytics is the application of operations research to a particular problem using pertinent data to provide a practical result. Analytics provides a mechanism to resolve the challenges facing stakeholders in the vaccine development and delivery system, in particular, the selection of cost-effective vaccines and the appropriate pricing of vaccines. A review of applicable analytics papers is provided.

  12. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    NASA Technical Reports Server (NTRS)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

    In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on subset selection techniques developed in the linear algebra and statistics communities to study linear dependencies in high-dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications of band selection in sensor design and of the number and location of levels in the analysis process.
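    One standard subset selection technique from numerical linear algebra is rank-revealing QR with column pivoting; the sketch below applies it to a synthetic weighting-function matrix to pick the most linearly independent channels (the matrix, its rank and the number of retained channels are assumptions, not the AIRS/IASI data):

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(9)

# Synthetic "weighting function" matrix: rows are vertical levels, columns are channels.
levels, channels = 60, 200
basis = rng.normal(size=(levels, 12))
W = basis @ rng.normal(size=(12, channels)) + 0.01 * rng.normal(size=(levels, channels))

# QR with column pivoting: the first k pivot columns are the channels whose
# weighting functions are the most linearly independent.
k = 12
_, R, piv = qr(W, pivoting=True)
selected_channels = piv[:k]

# How well do the selected channels reproduce the full matrix (least-squares fit)?
coeffs, *_ = np.linalg.lstsq(W[:, selected_channels], W, rcond=None)
residual = np.linalg.norm(W - W[:, selected_channels] @ coeffs) / np.linalg.norm(W)
print("selected channels:", np.sort(selected_channels))
print("relative residual:", round(residual, 4))
```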

  13. [Suicide and suicide prevention in Vienna from 1938 to 1945].

    PubMed

    Sonneck, Gernot; Hirnsperger, Hans; Mundschütz, Reinhard

    2012-01-01

    Beginning with the inception of suicide prevention in interwar Vienna, the paper illustrates how the high number of counselling centres contrasted with a discourse of selection. Despite the fact that suicide rates proved extremely high, suicide prevention declined in importance between 1934 and 1945. Suicide was increasingly attributed to the weak and the inferior. The massive threat to Vienna's Jewish population and the high suicide rates among Viennese Jews are also outlined. The paper concludes with a synopsis of V. E. Frankl's activities in the field of suicide prevention at the Rothschild Hospital as well as the concentration camp in Theresienstadt.

  14. An Analysis of Current Graduation Thesis Writing by English Majors in Independent Institute

    ERIC Educational Resources Information Center

    Han, Ying

    2014-01-01

    The paper takes as a case 414 graduates of 2011 and 2012 from ZJU, NIT, analyzing the status of their graduation thesis writing. It is found that a considerable number of students have problems in the selection and reporting of topics, in the writing of each part, and in the defense, throughout the whole process of the graduation thesis. In view of the situation, based on the…

  15. A Dying Child in the Family: The Child's and Sibling's Perspective. Selected Papers, Number 60.

    ERIC Educational Resources Information Center

    Said, John

    Children and adolescents have different understandings of death. For the baby, death is equated with separation. For toddlers, grief occurs when they realize the person is not returning. The preschool child who tends to live in the present with no clear concept of past or future will not understand the finality. Around ages 4 and 5, death is often…

  16. Nanoscale Measurements of Magnetism & Spin Coherence in Semiconductors

    DTIC Science & Technology

    2015-12-17

    [Record text garbled in extraction; only fragments are recoverable.] The fragments describe energy-resolved studies of states created by magnetic defects on a superconductor, which are distinct from typical spin-selective measurements performed previously. Cited publications include: …MacDonald, B. A. Bernevig, A. Yazdani, "Observation of Majorana fermions in ferromagnetic atomic chains on a superconductor," Science (Oct. 2014): 602; and Physical Review B (Jul. 2013), doi: 10.1103/PhysRevB.88.020407. TOTAL: 4 papers published in peer-reviewed journals.

  17. A Father's Role in His Child's Development. Unit for Child Studies. Selected Papers Number 20.

    ERIC Educational Resources Information Center

    Patterson, Ross

    By considering three historical stages of the father's role in the development of his child (father of the past, present, and future), one may perceive a pattern. The father of the past had a clearly defined role; however, this role did not take into account the emotional well-being of family members. Consequently, all in the family lost out,…

  18. Bonding: Mothering Magic or Pseudo Science: A Critical Review of Some of the Research in the Area. Selected Papers Number 40.

    ERIC Educational Resources Information Center

    James, Deidre

    This article commences with a review of the issues in the empirical literature surrounding the concept of 'bonding,' and notes some particular parallels with the concept of 'attachment,' demonstrating links between the two. The comparison is followed by a review of empirical findings of studies involving animals and humans, including those dealing…

  19. A Profile of Minority Graduate Students at the University of California, Berkeley: Recruitment, Selection, Fields of Study and Financial Support.

    ERIC Educational Resources Information Center

    Collins, O. R.

    This paper presents a profile of minority graduate students at the University of California, Berkeley. Following a brief overview of Berkeley's Graduate Minority Program (GMP), data is presented concerning the number of GMP students supported; available funds and average grants for students from 1968-69 to 1973-74; distribution of GMP students…

  20. Repeated Judgements of Interest in Vocational Education: A Lens Model Analysis. Occasional Paper Number 6.

    ERIC Educational Resources Information Center

    Athanasou, James A.

    The topic of repeated judgments of interest in vocational education was examined in a study in which 10 female full-time technical and further education (TAFE) students (aged 15-60 years) were handed 120 randomly selected real profiles of TAFE students who had completed subject interest surveys in a previous study. The 10 TAFE students judged how…

  1. Liquid Metal Fast Breeder Reactors: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raleigh, H.D.

    1980-11-01

    This bibliography includes 5465 selected citations on LMFBR development. The citations were compiled from the DOE Energy Data Base covering the period January 1978 (EDB File No. 78R1087) through August 1980 (EDB File No. 80C79142). The references are to reports from the Department of Energy and its contractors, reports from other government or private organizations, and journal articles, books, conference papers, and monographs from US originators. Report citations are arranged alphanumerically by report number; nonreport literature citations are arranged chronologically. Corporate, Personal Author, Subject, and Report Number Indexes are provided in Volume 2.

  2. Liquid Metal Fast Breeder Reactors: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raleigh, H.D.

    1980-11-01

    This bibliography includes 5465 selected citations on LMFBR development. The citations were compiled from the DOE Energy Data Base covering the period January 1978 (EDB File No. 78R1087) through August 1980 (EDB File No. 80C79142). The references are to reports from the Department of Energy and its contractors, reports from other government or private organizations, and journal articles, books, conference papers, and monographs from US originators. Report citations are arranged alphanumerically by report number; nonreport literature citations are arranged chronologically. Corporate, Personal Author, Subject, and Report Number Indexes are provided in Volume 2.

  3. Limits to the Stability of Pulsar Time

    NASA Technical Reports Server (NTRS)

    Petit, Gerard

    1996-01-01

    The regularity of the rotation rate of millisecond pulsars is the underlying hypothesis for using these neutron stars as 'celestial clocks'. Owing to their remote location in our galaxy and to our lack of precise knowledge of the galactic environment, a number of phenomena affect the apparent rotation rate observed on Earth. This paper reviews these phenomena and estimates the order of magnitude of their effect. It concludes that an ensemble pulsar time based on a number of selected millisecond pulsars should have a fractional frequency stability close to 2 × 10^-15 for an averaging time of a few years.

  4. A standardized way to select, evaluate, and test an analog-to-digital converter for ultrawide bandwidth radiofrequency signals based on user's needs, ideal, published,and actual specifications

    NASA Astrophysics Data System (ADS)

    Chang, Daniel Y.; Rowe, Neil C.

    2012-06-01

    The most important adverse impact on Electronic Warfare (EW) simulation is that the number of signal sources that can be tested simultaneously is relatively small. When the number of signal sources increases, the analog hardware, complexity, and costs grow on the order of N², since the number of connections among N components is O(N²) and the signal communication is bi-directional. To solve this problem, digitization of the signal is suggested. In digitizing a radiofrequency signal, an Analog-to-Digital Converter (ADC) is widely used. Most research studies on ADCs are conducted from the designer's or test engineer's perspective; some are conducted from a market perspective. This paper presents a generic way to select, evaluate, and test ultra-high-bandwidth COTS ADCs and to generate requirements for digitizing continuous-time signals from the perspective of the user's needs. Based on the user's needs, as well as the vendor's published, ideal, and actual specifications, a decision can be made in selecting a proper ADC for an application. To support our arguments and illustrate the methodology, we evaluate a Tektronix TADC-1000, an 8-bit, 12-gigasample-per-second ADC. This project is funded by the JEWEL lab, NAWCWD at Point Mugu, CA.

  5. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    NASA Astrophysics Data System (ADS)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

    This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, it first describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows selecting a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept-of-operations techniques allowed generating a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at establishing the feasibility of these concepts, more detailed analyses have been carried out. Moving through the procedure, the designer should pass from qualitative to quantitative evaluations; for this reason an ad hoc mission simulation software tool has been exploited to support the trade-off analysis. This support tool aims at estimating major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Eventually, one single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, highlighting also logistic, safety, and maintainability aspects.

  6. Tomlinson-Harashima Precoding for Multiuser MIMO Systems With Quantized CSI Feedback and User Scheduling

    NASA Astrophysics Data System (ADS)

    Sun, Liang; McKay, Matthew R.

    2014-08-01

    This paper studies the sum rate performance of a low-complexity quantized CSI-based Tomlinson-Harashima (TH) precoding scheme for downlink multiuser MIMO transmission, employing greedy user selection. The asymptotic distribution of the output signal to interference plus noise ratio of each selected user and the asymptotic sum rate as the number of users K grows large are derived by using extreme value theory. For fixed finite signal to noise ratios and a finite number of transmit antennas n_T, we prove that as K grows large, the proposed approach can achieve the optimal sum rate scaling of the MIMO broadcast channel. We also prove that, if we ignore the precoding loss, the average sum rate of this approach converges to the average sum capacity of the MIMO broadcast channel. Our results provide insights into the effect of multiuser interference caused by quantized CSI on the multiuser diversity gain.

  7. Alternating Magnetic Field Forces for Satellite Formation Flying

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert C.; Nurge, Mark A.; Starr, Stanley O.

    2012-01-01

    Selected future space missions, such as large aperture telescopes and multi-component interferometers, will require the precise positioning of a number of isolated satellites, yet many of the suggested approaches for providing satellite positioning forces have serious limitations. In this paper we propose a new approach, capable of providing both position and orientation forces, that resolves or alleviates many of these problems. We show that by using alternating fields and currents, finely-controlled forces can be induced on the satellites, which can be individually selected through frequency allocation. We also show, through analysis and experiment, that near field operation is feasible and can provide sufficient force and the necessary degrees of freedom to accurately position and orient small satellites relative to one another. In particular, the case of a telescope with a large number of free mirrors is developed to provide an example of the concept. We also discuss the far field extension of this concept.

  8. Optical Sensor/Actuator Locations for Active Structural Acoustic Control

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Palumbo, Daniel L.; Kincaid, Rex K.

    1998-01-01

    Researchers at NASA Langley Research Center have extensive experience using active structural acoustic control (ASAC) for aircraft interior noise reduction. One aspect of ASAC involves the selection of optimum locations for microphone sensors and force actuators. This paper explains the importance of sensor/actuator selection, reviews optimization techniques, and summarizes experimental and numerical results. Three combinatorial optimization problems are described. Two involve the determination of the number and position of piezoelectric actuators, and the other involves the determination of the number and location of the sensors. For each case, a solution method is suggested, and typical results are examined. The first case, a simplified problem with simulated data, is used to illustrate the method. The second and third cases are more representative of the potential of the method and use measured data. The three case studies and laboratory test results establish the usefulness of the numerical methods.

  9. GPS baseline configuration design based on robustness analysis

    NASA Astrophysics Data System (ADS)

    Yetkin, M.; Berber, M.

    2012-11-01

    The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration of the observed baselines. The selection of optimal GPS baselines may allow for a cost effective survey campaign and a sufficiently robust network. Furthermore, using the approach described in this paper, the required number of sessions, the baselines to be observed, and the significance levels for statistical testing and robustness analysis can be determined even before the GPS campaign starts. In this study, we propose a robustness criterion for the optimal design of geodetic networks, and present a very simple and efficient algorithm based on this criterion for the selection of optimal GPS baselines. We also show the relationship between the number of sessions and the non-centrality parameter. Finally, a numerical example is given to verify the efficacy of the proposed approach.

  10. A Comparison Between Jerusalem Cross and Square Patch Frequency Selective Surfaces for Low Profile Antenna Applications

    NASA Technical Reports Server (NTRS)

    Cure, David; Weller, Thomas; Miranda, Felix A.

    2011-01-01

    In this paper, a comparison between Jerusalem Cross (JC) and Square Patch (SP) based Frequency Selective Surfaces (FSS) for low profile antenna applications is presented. The comparison is aimed at understanding the performance of low profile antennas backed by high impedance surfaces. In particular, an end loaded planar open sleeve dipole (ELPOSD) antenna is examined due to the various parameters within its configuration, offering significant design flexibility and a wide operating bandwidth. Measured data of the antennas demonstrate that increasing the number of unit cells improves the fractional bandwidth. The antenna bandwidth increased from 0.8% to 1.8% and from 0.8% to 2.7% for the JC and SP structures, respectively. The number of unit cells was increased from 48 to 80 for the JC-FSS and from 24 to 48 for the SP-FSS.

  11. Sustainable Supplier Performance Evaluation and Selection with Neofuzzy TOPSIS Method

    PubMed Central

    Chaharsooghi, S. K.; Ashrafi, Mehdi

    2014-01-01

    Supplier selection plays an important role in supply chain management, and traditional criteria such as price, quality, and flexibility are considered for supplier performance evaluation in the research literature. In recent years sustainability has received more attention in the supply chain management literature, with the triple bottom line (TBL) describing sustainability in supply chain management through social, environmental, and economic initiatives. This paper explores sustainability in supply chain management and examines the problem of identifying a new model for supplier selection based on an extended TBL approach in the supply chain by presenting a fuzzy multicriteria method. Linguistic values of experts' subjective preferences are expressed with fuzzy numbers, and Neofuzzy TOPSIS is proposed for finding the best solution to the supplier selection problem. Numerical results show that the proposed model is efficient for integrating sustainability into the supplier selection problem. The importance of using complementary aspects of sustainability and the Neofuzzy TOPSIS concept in the sustainable supplier selection process is shown with sensitivity analysis. PMID:27379267

  12. Sustainable Supplier Performance Evaluation and Selection with Neofuzzy TOPSIS Method.

    PubMed

    Chaharsooghi, S K; Ashrafi, Mehdi

    2014-01-01

    Supplier selection plays an important role in supply chain management, and traditional criteria such as price, quality, and flexibility are considered for supplier performance evaluation in the research literature. In recent years sustainability has received more attention in the supply chain management literature, with the triple bottom line (TBL) describing sustainability in supply chain management through social, environmental, and economic initiatives. This paper explores sustainability in supply chain management and examines the problem of identifying a new model for supplier selection based on an extended TBL approach in the supply chain by presenting a fuzzy multicriteria method. Linguistic values of experts' subjective preferences are expressed with fuzzy numbers, and Neofuzzy TOPSIS is proposed for finding the best solution to the supplier selection problem. Numerical results show that the proposed model is efficient for integrating sustainability into the supplier selection problem. The importance of using complementary aspects of sustainability and the Neofuzzy TOPSIS concept in the sustainable supplier selection process is shown with sensitivity analysis.
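
    For readers unfamiliar with the TOPSIS step used above, the sketch below shows a plain (crisp) TOPSIS ranking: alternatives are scored by their relative closeness to an ideal solution. It is a simplification of the paper's Neofuzzy variant, and the supplier scores, weights, and criteria are hypothetical.

    ```python
    import numpy as np

    def topsis_rank(scores, weights, benefit):
        """Rank alternatives with classical TOPSIS.
        scores  : (n_alternatives, n_criteria) decision matrix
        weights : criteria weights summing to 1
        benefit : True for benefit criteria, False for cost criteria"""
        norm = scores / np.linalg.norm(scores, axis=0)      # vector normalisation
        v = norm * weights                                   # weighted normalised matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)            # distance to ideal solution
        d_neg = np.linalg.norm(v - anti, axis=1)             # distance to anti-ideal
        closeness = d_neg / (d_pos + d_neg)                  # relative closeness in [0, 1]
        return np.argsort(-closeness), closeness

    # Hypothetical suppliers scored on price (cost), quality, and a social criterion
    scores = np.array([[3.0, 7.0, 5.0],
                       [2.5, 6.0, 8.0],
                       [4.0, 9.0, 4.0]])
    order, c = topsis_rank(scores,
                           weights=np.array([0.4, 0.35, 0.25]),
                           benefit=np.array([False, True, True]))
    print("ranking (best first):", order, "closeness:", c.round(3))
    ```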

  13. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The serious information redundancy in hyperspectral images (HIs) does not contribute to data analysis accuracy; instead it requires expensive computational resources. Consequently, to identify the most useful and valuable information from the HIs and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of a support vector machine (SVM) classifier and the number of selected spectral bands are utilized to measure the discriminative capability of the band subset. Finally, the band subset with the smallest number of spectral bands that still covers the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results revealed the superiority of the proposed method and indicated that it can considerably reduce data storage costs and efficiently identify a band subset with stable and high classification precision.
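
    The selection criterion described above couples classifier accuracy with the size of the band subset. A minimal sketch of such a fitness function is given below; it is not the authors' GA-GSA itself, and the penalty weight alpha and the toy data are hypothetical.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def band_subset_fitness(mask, X, y, alpha=0.02):
        """Score a binary band mask: reward SVM accuracy, penalise many bands.
        alpha is a hypothetical weight on the fraction of bands kept."""
        selected = np.flatnonzero(mask)
        if selected.size == 0:
            return 0.0
        acc = cross_val_score(SVC(kernel="rbf"), X[:, selected], y, cv=3).mean()
        return acc - alpha * selected.size / X.shape[1]

    # Hypothetical toy "hyperspectral" data: 60 pixels, 30 bands, 3 classes
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 30))
    y = rng.integers(0, 3, size=60)
    mask = rng.integers(0, 2, size=30)        # one candidate band subset
    print("fitness:", band_subset_fitness(mask, X, y))
    ```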

  14. Working definitions, subjective and objective assessments and experimental paradigms in a study exploring social withdrawal in schizophrenia and Alzheimer's disease.

    PubMed

    van der Wee, Nic J A; Bilderbeck, Amy C; Cabello, Maria; Ayuso-Mateos, Jose L; Saris, Ilya M J; Giltay, Erik J; Penninx, Brenda Wjh; Arango, Celso; Post, Anke; Porcelli, Stefano

    2018-06-24

    Social withdrawal is one of the first and most common signs of early social dysfunction in a number of important neuropsychiatric disorders, likely because of the enormous amount and complexity of brain processes required to initiate and maintain social relationships (Adolphs, 2009). The Psychiatric Ratings using Intermediate Stratified Markers (PRISM) project focusses on the shared and unique neurobiological basis of social withdrawal in schizophrenia, Alzheimer's disease, and depression. In this paper, we discuss the working definition of social withdrawal for this study and the selection of objective and subjective rating scales, chosen or adapted for this project, to assess social withdrawal. We also discuss the MRI and EEG paradigms selected to study the systems and neural circuitry thought to underlie social functioning, and more particularly to be involved in social withdrawal in humans, such as the social perception and social affiliation networks. A number of behavioral paradigms were selected to assess complementary aspects of social cognition. Also, a digital phenotyping method (a smartphone application) was chosen to obtain real-life data. Copyright © 2018. Published by Elsevier Ltd.

  15. Magnetic field feature extraction and selection for indoor location estimation.

    PubMed

    Galván-Tejada, Carlos E; García-Vázquez, Juan Pablo; Brena, Ramon F

    2014-06-20

    User indoor positioning has been under constant improvement, especially with the availability of new sensors integrated into modern mobile devices, which allow us to exploit not only infrastructure made for everyday use, such as WiFi, but also natural infrastructure, as is the case of the natural magnetic field. In this paper we present an extension and improvement of our current indoor localization model based on the extraction of 46 magnetic field signal features. The extension adds a feature selection phase to our methodology, performed through a Genetic Algorithm (GA) with the aim of optimizing the fitness of our current model. In addition, we present an evaluation of the final model in two different scenarios: a home and an office building. The results indicate that performing a feature selection process allows us to reduce the number of signal features of the model from 46 to 5 regardless of the scenario and room location distribution. Further, we verified that reducing the number of features increases the probability of our estimator correctly detecting the user's location (sensitivity) and its capacity to reject false positives (specificity) in both scenarios.

  16. Semisupervised Clustering by Iterative Partition and Regression with Neuroscience Applications

    PubMed Central

    Qian, Guoqi; Wu, Yuehua; Ferrari, Davide; Qiao, Puxue; Hollande, Frédéric

    2016-01-01

    Regression clustering is a combination of unsupervised and supervised statistical learning and data mining that is found in a wide range of applications, including artificial intelligence and neuroscience. It performs unsupervised learning when it clusters the data according to their respective unobserved regression hyperplanes. The method also performs supervised learning when it fits regression hyperplanes to the corresponding data clusters. Applying regression clustering in practice requires a means of determining the underlying number of clusters in the data, finding the cluster label of each data point, and estimating the regression coefficients of the model. In this paper, we review the estimation and selection issues in regression clustering with regard to least squares and robust statistical methods. We also provide a model-selection-based technique to determine the number of regression clusters underlying the data. We further develop a computing procedure for regression clustering estimation and selection. Finally, simulation studies are presented for assessing the procedure, together with the analysis of a real data set on RGB cell marking in neuroscience to illustrate and interpret the method. PMID:27212939
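
    The alternation described in the abstract (assign points to the regression hyperplane that fits them best, then refit each hyperplane) can be sketched in a few lines. The example below uses a fixed number of clusters and synthetic data; the data and iteration count are hypothetical, and the paper's robust estimators and model-selection step are not included.

    ```python
    import numpy as np

    def regression_clustering(X, y, k=2, n_iter=20, seed=0):
        """Alternate (1) assigning each point to the line with smallest residual,
        and (2) refitting each line by ordinary least squares on its own cluster."""
        rng = np.random.default_rng(seed)
        Xb = np.column_stack([X, np.ones(len(X))])        # add intercept column
        labels = rng.integers(0, k, size=len(X))
        coefs = np.zeros((k, Xb.shape[1]))
        for _ in range(n_iter):
            for j in range(k):                             # supervised step: fit lines
                idx = labels == j
                if idx.sum() >= Xb.shape[1]:
                    coefs[j], *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
            resid = np.abs(y[:, None] - Xb @ coefs.T)      # unsupervised step: reassign
            labels = resid.argmin(axis=1)
        return labels, coefs

    # Hypothetical mixture of two noisy lines
    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, 200)
    y = np.where(rng.random(200) < 0.5, 3 * x + 1, -2 * x) + rng.normal(0, 0.1, 200)
    labels, coefs = regression_clustering(x[:, None], y, k=2)
    print("fitted slopes and intercepts:", coefs.round(2))
    ```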

  17. Randomly auditing research labs could be an affordable way to improve research quality: A simulation study

    PubMed Central

    Zardo, Pauline; Graves, Nicholas

    2018-01-01

    The “publish or perish” incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors which is a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world where labs that produce many papers are more likely to have “child” labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives, and also improved the behaviour of “child” and “parent” labs who increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations did not experience the competitive spiral, defined by a convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers in 95% of simulations. Audits worked best when they were only applied to established labs with 50 or more papers compared with labs with 25 or more papers. Adding a ±20% random error to the number of false positives to simulate peer reviewer error did not reduce the audits’ efficacy. The main benefit of the audits was via the increase in effort in “child” and “parent” labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world and there are many unanswered questions about if and how audits would work that can only be addressed by a trial of an audit. PMID:29649314

  18. Randomly auditing research labs could be an affordable way to improve research quality: A simulation study.

    PubMed

    Barnett, Adrian G; Zardo, Pauline; Graves, Nicholas

    2018-01-01

    The "publish or perish" incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors which is a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world where labs that produce many papers are more likely to have "child" labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives, and also improved the behaviour of "child" and "parent" labs who increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations did not experience the competitive spiral, defined by a convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers in 95% of simulations. Audits worked best when they were only applied to established labs with 50 or more papers compared with labs with 25 or more papers. Adding a ±20% random error to the number of false positives to simulate peer reviewer error did not reduce the audits' efficacy. The main benefit of the audits was via the increase in effort in "child" and "parent" labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world and there are many unanswered questions about if and how audits would work that can only be addressed by a trial of an audit.

  19. Iranian Nephrology and Urology Research Output in the Past Two Decades: A Bibliographic Analysis of Medline Database.

    PubMed

    Einollahi, Behzad; Motalebi, Mohsen; Taghipour, Mehrdad; Ebrahimi, Mehrdad

    2015-09-01

    We performed a bibliometric search to evaluate the number of papers published in the field of nephrology and urology by Iranian researchers in the past two decades. We did an online search of the abstract/title fields of articles using 129 keywords such as kidney, renal, hemodialysis, transplant, nephrology, glomerulonephritis, ureteral, and nephrolithiasis. Endnote software version 7 was used to search articles published in the PubMed database from November 1993 to November 2013. Those articles in which Iran was the affiliation of at least one of the authors were selected. These articles in the field of nephrology and urology were analyzed regarding the originating institution, field of study, total number of publications, type of study, annual collaboration rate of Iranian nephrologists and urologists, annual share of Iranian articles in the five journals with the highest impact factor (IF), and journal IF. The total number of publications in the field of nephrology and urology was 3,771 (an average of 189 papers per year). Most of the Iranian nephrology and urology papers were from the capital city, Tehran (50.03%). There was an increasing trend in the number of publications over the years. Most papers were about transplantation (44.6%), nephrology (20.9%), and hemodialysis (16.4%). Of all, 53.7% were retrospective articles, whereas the proportion of clinical trials was relatively small (10.8%). Although Iranian publications in the field of nephrology and urology have increased considerably in recent years among Middle Eastern countries, there is still a long way to go before Iran becomes a science-exporting country.

  20. Clustering analysis of moving target signatures

    NASA Astrophysics Data System (ADS)

    Martone, Anthony; Ranney, Kenneth; Innocenti, Roberto

    2010-04-01

    Previously, we developed a moving target indication (MTI) processing approach to detect and track slow-moving targets inside buildings, which successfully detected moving targets (MTs) from data collected by a low-frequency, ultra-wideband radar. Our MTI algorithms include change detection, automatic target detection (ATD), clustering, and tracking. The MTI algorithms can be implemented in a real-time or near-real-time system; however, a person-in-the-loop is needed to select input parameters for the clustering algorithm. Specifically, the number of clusters to input into the cluster algorithm is unknown and requires manual selection. A critical need exists to automate all aspects of the MTI processing formulation. In this paper, we investigate two techniques that automatically determine the number of clusters: the adaptive knee-point (KP) algorithm and the recursive pixel finding (RPF) algorithm. The KP algorithm is based on a well-known heuristic approach for determining the number of clusters. The RPF algorithm is analogous to the image processing, pixel labeling procedure. Both algorithms are used to analyze the false alarm and detection rates of three operational scenarios of personnel walking inside wood and cinderblock buildings.
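
    The knee-point idea mentioned above can be illustrated with a simple distance-to-chord heuristic applied to a within-cluster sum-of-squares curve. This is only a generic elbow finder on hypothetical 2-D detections, not the adaptive KP algorithm of the paper.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def knee_point(values):
        """Index of the 'knee' of a decreasing curve: the point farthest from the
        straight line joining its (normalised) endpoints."""
        values = np.asarray(values, dtype=float)
        x = np.linspace(0.0, 1.0, len(values))
        y = (values - values.min()) / (np.ptp(values) + 1e-12)
        line = np.array([x[-1] - x[0], y[-1] - y[0]])
        line /= np.linalg.norm(line)
        rel = np.column_stack([x - x[0], y - y[0]])
        dist = np.abs(rel[:, 0] * line[1] - rel[:, 1] * line[0])  # distance to the chord
        return int(dist.argmax())

    # Hypothetical 2-D detections forming three clusters
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(c, 0.3, size=(40, 2)) for c in [(0, 0), (3, 0), (0, 3)]])
    ks = list(range(1, 9))
    wcss = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_ for k in ks]
    print("estimated number of clusters:", ks[knee_point(wcss)])
    ```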

  1. EPG therapy for children with long-standing speech disorders: predictions and outcomes.

    PubMed

    Carter, Penny; Edwards, Susan

    2004-01-01

    This paper reports on a project using a series of single subjects to investigate the effectiveness of using electropalatography (EPG) in treating ten children with persisting speech difficulties of no known organic aetiology. The aims of the project were two-fold: firstly, to assess whether the subjects selected benefited from this treatment, and secondly, to investigate whether it was possible to predict which children would make maximum improvement. A number of factors were identified as possible predictors of successful EPG therapy, and subjects were then ranked according to these predictions. Baseline measures of each subject's speech were taken using word lists. Scores reflected the number of correct realizations of consonants produced by each subject. Subjects received the same number of therapy sessions and were then re-tested. Scores before and after therapy were compared and found to be significantly different, although the initial predictions as to the magnitude of improvement for each subject were not verified. The selection of appropriate candidates for therapy and the need for objective means of establishing effectiveness are discussed.

  2. Robotic Vision, Tray-Picking System Design Using Multiple, Optical Matched Filters

    NASA Astrophysics Data System (ADS)

    Leib, Kenneth G.; Mendelsohn, Jay C.; Grieve, Philip G.

    1986-10-01

    The optical correlator is applied to a robotic vision, tray-picking problem. Complex matched filters (MFs) are designed to provide sufficient optical memory for accepting any orientation of the desired part, and a multiple holographic lens (MHL) is used to increase the memory for continuous coverage. It is shown that with appropriate thresholding a small part can be selected using optical matched filters. A number of criteria are presented for optimizing the vision system. Two of the part-filled trays that Mendelsohn used are considered in this paper, which is the analog (optical) extension of his work. Our view in this paper is that of the optical correlator as a cueing device for subsequent, finer vision techniques.

  3. Gene selection for cancer classification with the help of bees.

    PubMed

    Moosa, Johra Muhammad; Shakur, Rameen; Kaykobad, Mohammad; Rahman, Mohammad Sohel

    2016-08-10

    The development of biologically relevant models from gene expression data, notably microarray data, has become a topic of great interest in the fields of bioinformatics, clinical genetics, and oncology. Only a small number of genes, compared to the total number explored, possess a significant correlation with a certain phenotype. Gene selection enables researchers to obtain substantial insight into the genetic nature of the disease and the mechanisms responsible for it. Besides improving the performance of cancer classification, it can also cut down the time and cost of medical diagnoses. This study presents a modified Artificial Bee Colony (ABC) algorithm to select a minimal number of genes that are deemed to be significant for cancer, along with an improvement in predictive accuracy. The search equation of ABC is believed to be good at exploration but poor at exploitation. To overcome this limitation we have modified the ABC algorithm by incorporating the concept of pheromones, one of the major components of the Ant Colony Optimization (ACO) algorithm, and a new operation in which successive bees communicate to share their findings. The proposed algorithm is evaluated using a suite of ten publicly available datasets after the parameters are tuned scientifically with one of the datasets. Obtained results are compared to other works that used the same datasets. The performance of the proposed method is shown to be superior. The method presented in this paper can provide a subset of genes leading to more accurate classification results while the number of selected genes is smaller. Additionally, the proposed modified Artificial Bee Colony Algorithm could conceivably be applied to problems in other areas as well.

  4. 3-D surface reconstruction of patient specific anatomic data using a pre-specified number of polygons.

    PubMed

    Aharon, S; Robb, R A

    1997-01-01

    Virtual reality environments provide highly interactive, natural control of the visualization process, significantly enhancing the scientific value of the data produced by medical imaging systems. Due to the computational and real time display update requirements of virtual reality interfaces, however, the complexity of organ and tissue surfaces which can be displayed is limited. In this paper, we present a new algorithm for the production of a polygonal surface containing a pre-specified number of polygons from patient or subject specific volumetric image data. The advantage of this new algorithm is that it effectively tiles complex structures with a specified number of polygons selected to optimize the trade-off between surface detail and real-time display rates.

  5. Powered Upper Limb Orthosis Actuation System Based on Pneumatic Artificial Muscles

    NASA Astrophysics Data System (ADS)

    Chakarov, Dimitar; Veneva, Ivanka; Tsveov, Mihail; Venev, Pavel

    2018-03-01

    The actuation system of a powered upper limb orthosis is studied in this work. To create natural safety in the mutual "man-robot" interaction, an actuation system based on pneumatic artificial muscles (PAM) is selected. Experimentally obtained force/contraction diagrams for bundles consisting of different numbers of muscles are shown in the paper. The pulling force and the stiffness of the pneumatic actuators are assessed as a function of the number of muscles in the bundle and the supply pressure. Joint motion and torque are achieved by antagonistic actions through pulleys, driven by bundles of pneumatic muscles. Joint stiffness and joint torques are determined on condition of a power balance, as a function of the joint position, pressure, number of muscles and muscles

  6. Comparative analysis on the selection of number of clusters in community detection

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro; Kabashima, Yoshiyuki

    2018-02-01

    We conduct a comparative analysis of various estimates of the number of clusters in community detection. An exhaustive comparison requires testing of all possible combinations of frameworks, algorithms, and assessment criteria. In this paper we focus on the framework based on a stochastic block model, and investigate the performance of greedy algorithms, statistical inference, and spectral methods. For the assessment criteria, we consider modularity, the map equation, the Bethe free energy, prediction errors, and isolated eigenvalues. From the analysis, the tendencies toward overfitting and underfitting of the assessment criteria and algorithms become apparent. In addition, we propose that the alluvial diagram is a suitable tool to visualize statistical inference results and can be useful for determining the number of clusters.

  7. Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng

    2016-05-01

    Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.

  8. Portfolios with fuzzy returns: Selection strategies based on semi-infinite programming

    NASA Astrophysics Data System (ADS)

    Vercher, Enriqueta

    2008-08-01

    This paper provides new models for portfolio selection in which the returns on securities are considered fuzzy numbers rather than random variables. The investor's problem is to find the portfolio that minimizes the risk of achieving a return that is not less than the return of a riskless asset. The corresponding optimal portfolio is derived using semi-infinite programming in a soft framework. The return on each asset and their membership functions are described using historical data. The investment risk is approximated by mean intervals which evaluate the downside risk for a given fuzzy portfolio. This approach is illustrated with a numerical example.

  9. A probabilistic and multi-objective analysis of lexicase selection and ε-lexicase selection.

    PubMed

    Cava, William La; Helmuth, Thomas; Spector, Lee; Moore, Jason H

    2018-05-10

    Lexicase selection is a parent selection method that considers training cases individually, rather than in aggregate, when performing parent selection. Whereas previous work has demonstrated the ability of lexicase selection to solve difficult problems in program synthesis and symbolic regression, the central goal of this paper is to develop the theoretical underpinnings that explain its performance. To this end, we derive an analytical formula that gives the expected probabilities of selection under lexicase selection, given a population and its behavior. In addition, we expand upon the relation of lexicase selection to many-objective optimization methods to describe the behavior of lexicase selection, which is to select individuals on the boundaries of Pareto fronts in high-dimensional space. We show analytically why lexicase selection performs more poorly for certain sizes of population and training cases, and show why it has been shown to perform more poorly in continuous error spaces. To address this last concern, we propose new variants of ε-lexicase selection, a method that modifies the pass condition in lexicase selection to allow near-elite individuals to pass cases, thereby improving selection performance with continuous errors. We show that ε-lexicase outperforms several diversity-maintenance strategies on a number of real-world and synthetic regression problems.
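
    A minimal sketch of ε-lexicase parent selection for regression is given below, with ε set per training case from the median absolute deviation of that case's errors (one common choice). The population size, case count, and error values are hypothetical.

    ```python
    import random
    import numpy as np

    def epsilon_lexicase(errors, seed=0):
        """Select one parent index. errors[i][t] = error of individual i on case t.
        Cases are visited in random order; survivors must be within epsilon of the
        best error on the current case."""
        rng = random.Random(seed)
        errors = np.asarray(errors, dtype=float)
        n_ind, n_cases = errors.shape
        med = np.median(errors, axis=0)
        eps = np.median(np.abs(errors - med), axis=0)   # per-case epsilon (MAD)
        candidates = list(range(n_ind))
        cases = list(range(n_cases))
        rng.shuffle(cases)
        for t in cases:
            best = min(errors[i, t] for i in candidates)
            candidates = [i for i in candidates if errors[i, t] <= best + eps[t]]
            if len(candidates) == 1:
                break
        return rng.choice(candidates)

    # Hypothetical population of 6 individuals evaluated on 8 training cases
    errs = np.random.default_rng(3).exponential(1.0, size=(6, 8))
    print("selected parent index:", epsilon_lexicase(errs))
    ```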

  10. A Paediatrician Looks at Traditional Approaches to Emotional Development in Preschool and Primary Years. Foundation for Child and Youth Studies Selected Papers Number 46.

    ERIC Educational Resources Information Center

    Phillips, Susi Erika

    This discussion of the emotional development of young children is structured upon Erik Erikson's schemata of psycho-social development. Stage 1, which involves trust versus mistrust, includes references to Erikson's theory and the work of Melanie Klein, Berry Brazelton, Stella Chess and Alexander Thomas, John Bowlby, Anthony Stevens, and D. W.…

  11. Women and Literacy in India: A Study in a Re-settlement Colony in Delhi. Education for Development Occasional Papers Series 1, Number 2.

    ERIC Educational Resources Information Center

    Dighe, Anita

    A group of 100 randomly selected women living in the resettlement colony of Ambedkernager in South Delhi, India, who had participated in colony's Total Literacy Campaign (TLC) were interviewed regarding their participation in the TLC. Of the 100 women, 34 had attended school earlier. Four of the 34 women were still attending school. Most…

  12. Select Papers. Volume 1

    DTIC Science & Technology

    2011-08-01

    the Texture Evolution During Cold Rolling of Al–Mg Alloys. s.l.: Journal of Alloys and Compounds 2011, 508, 922–928. 11. Suhuddin, U.F.H.R.; Mironov... graphene onto a substrate with insulator properties. The current transfer process is still preliminary and presents a number of challenges. Since the... dimensions. The fabrication process flow for the stators uses chemical solution deposited PZT, metal sputtering and evaporation, and reactive ion etching

  13. The Application of Augmented Reality in Online Education: A Review of Studies Published in Selected Journals from 2003 to 2012

    ERIC Educational Resources Information Center

    Tsai, Chia-Wen; Shen, Pei-Di; Fan, Ya-Ting

    2014-01-01

    In this paper, the authors reviewed empirical augmented reality (AR) and online education studies, including those focused on the design or development of AR to help students learn, published in SSCI, SCI-EXPANDED, and A&HCI journals from 2003 to 2012. The authors found that the number of AR and online education studies has…

  14. Measuring Academic Progress of Students with Learning Difficulties: A Comparison of the Semi-Logarithmic Chart and Equal Interval Graph Paper.

    ERIC Educational Resources Information Center

    Marston, Doug; Deno, Stanley L.

    The accuracy of predictions of future student performance on the basis of graphing data on semi-logarithmic charts and equal interval graphs was examined. All 83 low-achieving students in grades 3 to 6 read randomly-selected lists of words from the Harris-Jacobson Word List for 1 minute. The number of words read correctly and words read…

  15. Color image encryption based on hybrid hyper-chaotic system and cellular automata

    NASA Astrophysics Data System (ADS)

    Yaghouti Niyat, Abolfazl; Moattar, Mohammad Hossein; Niazi Torshiz, Masood

    2017-03-01

    This paper proposes an image encryption scheme based on Cellular Automata (CA). A CA is a self-organizing structure with a set of cells in which each cell is updated by rules that depend on a limited number of neighboring cells. The major disadvantages of cellular automata in cryptography are the limited number of reversible rules and the inability to produce long sequences of states with these rules. In this paper, a non-uniform cellular automata framework is proposed to solve this problem. The proposed scheme consists of confusion and diffusion steps. In the confusion step, the positions of the original image pixels are permuted by a chaotic map. A key image is created using non-uniform cellular automata, and a hyper-chaotic map is then used to select random numbers from the key image for encryption. The main contribution of the paper is the application of hyper-chaotic functions and non-uniform CA for robust key image generation. Security analysis and experimental results show that the proposed method has a very large key space and is resistant to noise and attacks. The correlation between adjacent pixels in the encrypted image is reduced, and the entropy is 7.9991, very close to the ideal value of 8.
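
    The confusion step described above (permuting pixel positions with a chaotic map) can be sketched on its own. The example below derives a permutation from a logistic-map orbit; the key parameters are hypothetical, and the paper's hyper-chaotic maps and non-uniform CA key image are not reproduced here.

    ```python
    import numpy as np

    def logistic_permutation(n, x0=0.387, r=3.99):
        """Generate a pixel permutation from a logistic-map orbit.
        x0 and r act as (hypothetical) secret key parameters."""
        x = x0
        orbit = np.empty(n)
        for i in range(n):
            x = r * x * (1.0 - x)        # logistic map iteration
            orbit[i] = x
        return np.argsort(orbit)          # ranking the orbit gives a permutation

    def confuse(image, key=(0.387, 3.99)):
        flat = image.reshape(-1)
        perm = logistic_permutation(flat.size, *key)
        return flat[perm].reshape(image.shape), perm

    def deconfuse(scrambled, perm):
        flat = np.empty_like(scrambled.reshape(-1))
        flat[perm] = scrambled.reshape(-1)    # invert the permutation
        return flat.reshape(scrambled.shape)

    img = np.arange(16, dtype=np.uint8).reshape(4, 4)   # toy 4x4 "image"
    scrambled, perm = confuse(img)
    assert np.array_equal(deconfuse(scrambled, perm), img)
    print(scrambled)
    ```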

  16. Performance analysis of microcomputer based differential protection of UHV lines under selective phase switching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatti, A.A.

    1990-04-01

    This paper examines the effects of primary and secondary fault quantities, as well as of mutual couplings of neighboring circuits, on the sensitivity of operation and threshold settings of a microcomputer based differential protection of UHV lines under selective phase switching. Microcomputer based selective phase switching allows the disconnection of the minimum number of phases involved in a fault and requires the autoreclosing of these phases immediately after the extinction of the secondary arc. During a primary fault, a heavy current contribution to the healthy phases tends to cause an unwanted tripping. Faulty phases that are physically disconnected constitute an isolated fault which, being coupled to the system, affects the current and voltage levels of the healthy phases still retained in the system and may cause an unwanted tripping. The microcomputer based differential protection appears to have poor performance when applied to uncompensated lines employing selective pole switching.

  17. On the development of an expert system for wheelchair selection

    NASA Technical Reports Server (NTRS)

    Madey, Gregory R.; Bhansin, Charlotte A.; Alaraini, Sulaiman A.; Nour, Mohamed A.

    1994-01-01

    The selection of wheelchairs for Multiple Sclerosis (MS) patients involves the examination of a number of complicated factors including ambulation status, length of diagnosis, and funding sources, to name a few. Consequently, only a few experts exist in this area. To aid medical therapists with the wheelchair selection decision, a prototype medical expert system (ES) was developed. This paper describes and discusses the steps of designing and developing the system, the experiences of the authors, and the lessons learned from working on this project. Wheelchair Advisor, programmed in CLIPS, serves as a diagnosis, classification, prescription, and training tool in the MS field. Interviews, insurance letters, forms, and prototyping were used to gain knowledge regarding the wheelchair selection problem. Among the lessons learned are that evolutionary prototyping is superior to the conventional system development life-cycle (SDLC), that wheelchair selection is a good candidate for ES applications, and that ES can be applied to other similar medical subdomains.

  18. Existence of Lipschitz selections of the Steiner map

    NASA Astrophysics Data System (ADS)

    Bednov, B. B.; Borodin, P. A.; Chesnokova, K. V.

    2018-02-01

    This paper is concerned with the problem of the existence of Lipschitz selections of the Steiner map St_n, which associates with n points of a Banach space X the set of their Steiner points. The answer to this problem depends on the geometric properties of the unit sphere S(X) of X, its dimension, and the number n. For n ≥ 4, general conditions are obtained on the space X under which St_n admits no Lipschitz selection. When X is finite-dimensional it is shown that, if n ≥ 4 is even, the map St_n has a Lipschitz selection if and only if S(X) is a finite polytope; this is not true if n ≥ 3 is odd. For n = 3 the (single-valued) map St_3 is shown to be Lipschitz continuous in any smooth strictly convex two-dimensional space; this ceases to be true in three-dimensional spaces. Bibliography: 21 titles.

  19. The relative age effect in sport: a developmental systems model.

    PubMed

    Wattie, Nick; Schorer, Jörg; Baker, Joseph

    2015-01-01

    The policies that dictate the participation structure of many youth sport systems involve the use of a set selection date (e.g. 31 December), which invariably produces relative age differences between those within the selection year (e.g. 1 January to 31 December). Those born early in the selection year (e.g. January) are relatively older—by as much as 12 months minus 1 day—than those born later in the selection year (e.g. December). Research in the area of sport has identified a number of significant developmental effects associated with such relative age differences. However, a theoretical framework that describes the breadth and complexity of relative age effects (RAEs) in sport does not exist in the literature. This paper reviews and summarizes the existing literature on relative age in sport, and proposes a constraints-based developmental systems model for RAEs in sport.

  20. Rough sets and Laplacian score based cost-sensitive feature selection

    PubMed Central

    Yu, Shenglong

    2018-01-01

    Cost-sensitive feature selection learning is an important preprocessing step in machine learning and data mining. Recently, most existing cost-sensitive feature selection algorithms are heuristic algorithms, which evaluate the importance of each feature individually and select features one by one. Obviously, these algorithms do not consider the relationship among features. In this paper, we propose a new algorithm for minimal cost feature selection called the rough sets and Laplacian score based cost-sensitive feature selection. The importance of each feature is evaluated by both rough sets and Laplacian score. Compared with heuristic algorithms, the proposed algorithm takes into consideration the relationship among features with locality preservation of Laplacian score. We select a feature subset with maximal feature importance and minimal cost when cost is undertaken in parallel, where the cost is given by three different distributions to simulate different applications. Different from existing cost-sensitive feature selection algorithms, our algorithm simultaneously selects out a predetermined number of “good” features. Extensive experimental results show that the approach is efficient and able to effectively obtain the minimum cost subset. In addition, the results of our method are more promising than the results of other cost-sensitive feature selection algorithms. PMID:29912884

  1. Rough sets and Laplacian score based cost-sensitive feature selection.

    PubMed

    Yu, Shenglong; Zhao, Hong

    2018-01-01

    Cost-sensitive feature selection learning is an important preprocessing step in machine learning and data mining. Recently, most existing cost-sensitive feature selection algorithms are heuristic algorithms, which evaluate the importance of each feature individually and select features one by one. Obviously, these algorithms do not consider the relationship among features. In this paper, we propose a new algorithm for minimal cost feature selection called the rough sets and Laplacian score based cost-sensitive feature selection. The importance of each feature is evaluated by both rough sets and Laplacian score. Compared with heuristic algorithms, the proposed algorithm takes into consideration the relationship among features with locality preservation of Laplacian score. We select a feature subset with maximal feature importance and minimal cost when cost is undertaken in parallel, where the cost is given by three different distributions to simulate different applications. Different from existing cost-sensitive feature selection algorithms, our algorithm simultaneously selects out a predetermined number of "good" features. Extensive experimental results show that the approach is efficient and able to effectively obtain the minimum cost subset. In addition, the results of our method are more promising than the results of other cost-sensitive feature selection algorithms.
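
    The Laplacian score term used above rewards features whose values vary smoothly over a nearest-neighbour graph of the samples (smaller scores are better). A minimal sketch of the score alone follows; the rough-set term and the cost model of the paper are not included, and the neighbourhood size, kernel width, and data are hypothetical.

    ```python
    import numpy as np

    def laplacian_scores(X, k=5, t=1.0):
        """Laplacian score of each column of X, using a k-nearest-neighbour
        heat-kernel affinity graph (smaller = better locality preservation)."""
        n = X.shape[0]
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
        W = np.exp(-d2 / t)
        np.fill_diagonal(W, 0.0)
        idx = np.argsort(d2, axis=1)[:, 1:k + 1]               # k nearest neighbours
        mask = np.zeros_like(W, dtype=bool)
        mask[np.repeat(np.arange(n), k), idx.ravel()] = True
        W = np.where(mask | mask.T, W, 0.0)                     # symmetrised kNN graph
        D = np.diag(W.sum(1))
        L = D - W
        ones = np.ones(n)
        scores = []
        for f in X.T:
            f_t = f - (f @ D @ ones) / (ones @ D @ ones) * ones  # remove weighted mean
            scores.append((f_t @ L @ f_t) / (f_t @ D @ f_t + 1e-12))
        return np.array(scores)

    # Hypothetical data: features 0 and 1 follow the cluster structure, feature 2 is noise
    rng = np.random.default_rng(4)
    X = np.column_stack([np.r_[rng.normal(0, .1, 30), rng.normal(3, .1, 30)],
                         np.r_[rng.normal(0, .1, 30), rng.normal(3, .1, 30)],
                         rng.normal(0, 1, 60)])
    print(laplacian_scores(X).round(3))
    ```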

  2. Hierarchical Gene Selection and Genetic Fuzzy System for Cancer Microarray Data Classification

    PubMed Central

    Nguyen, Thanh; Khosravi, Abbas; Creighton, Douglas; Nahavandi, Saeid

    2015-01-01

    This paper introduces a novel approach to gene selection based on a substantial modification of analytic hierarchy process (AHP). The modified AHP systematically integrates outcomes of individual filter methods to select the most informative genes for microarray classification. Five individual ranking methods including t-test, entropy, receiver operating characteristic (ROC) curve, Wilcoxon and signal to noise ratio are employed to rank genes. These ranked genes are then considered as inputs for the modified AHP. Additionally, a method that uses fuzzy standard additive model (FSAM) for cancer classification based on genes selected by AHP is also proposed in this paper. Traditional FSAM learning is a hybrid process comprising unsupervised structure learning and supervised parameter tuning. Genetic algorithm (GA) is incorporated in-between unsupervised and supervised training to optimize the number of fuzzy rules. The integration of GA enables FSAM to deal with the high-dimensional-low-sample nature of microarray data and thus enhance the efficiency of the classification. Experiments are carried out on numerous microarray datasets. Results demonstrate the performance dominance of the AHP-based gene selection against the single ranking methods. Furthermore, the combination of AHP-FSAM shows a great accuracy in microarray data classification compared to various competing classifiers. The proposed approach therefore is useful for medical practitioners and clinicians as a decision support system that can be implemented in the real medical practice. PMID:25823003

  3. Hierarchical gene selection and genetic fuzzy system for cancer microarray data classification.

    PubMed

    Nguyen, Thanh; Khosravi, Abbas; Creighton, Douglas; Nahavandi, Saeid

    2015-01-01

    This paper introduces a novel approach to gene selection based on a substantial modification of analytic hierarchy process (AHP). The modified AHP systematically integrates outcomes of individual filter methods to select the most informative genes for microarray classification. Five individual ranking methods including t-test, entropy, receiver operating characteristic (ROC) curve, Wilcoxon and signal to noise ratio are employed to rank genes. These ranked genes are then considered as inputs for the modified AHP. Additionally, a method that uses fuzzy standard additive model (FSAM) for cancer classification based on genes selected by AHP is also proposed in this paper. Traditional FSAM learning is a hybrid process comprising unsupervised structure learning and supervised parameter tuning. Genetic algorithm (GA) is incorporated in-between unsupervised and supervised training to optimize the number of fuzzy rules. The integration of GA enables FSAM to deal with the high-dimensional-low-sample nature of microarray data and thus enhance the efficiency of the classification. Experiments are carried out on numerous microarray datasets. Results demonstrate the performance dominance of the AHP-based gene selection against the single ranking methods. Furthermore, the combination of AHP-FSAM shows a great accuracy in microarray data classification compared to various competing classifiers. The proposed approach therefore is useful for medical practitioners and clinicians as a decision support system that can be implemented in the real medical practice.
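
    As a rough illustration of combining several filter rankings before classification, the sketch below aggregates two simple gene rankings (absolute t statistic and distance of the ROC AUC from 0.5) by summing their ranks. This Borda-style aggregation is only a stand-in for the paper's modified AHP, and the toy microarray data are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def roc_auc(score, y):
        """Probability that a random positive sample scores above a random negative."""
        pos, neg = score[y == 1], score[y == 0]
        return (pos[:, None] > neg[None, :]).mean()

    def aggregate_gene_ranks(X, y):
        """Rank genes by |t| and by |AUC - 0.5|, then sum the two ranks
        (a simple stand-in for the paper's modified AHP integration)."""
        t = np.abs(stats.ttest_ind(X[y == 0], X[y == 1], axis=0).statistic)
        auc = np.array([abs(roc_auc(X[:, j], y) - 0.5) for j in range(X.shape[1])])
        rank_t = (-t).argsort().argsort()           # rank 0 = most discriminative
        rank_auc = (-auc).argsort().argsort()
        return (rank_t + rank_auc).argsort()         # genes ordered by aggregated rank

    # Hypothetical microarray: 40 samples, 100 genes, genes 0-4 differentially expressed
    rng = np.random.default_rng(5)
    y = np.r_[np.zeros(20, int), np.ones(20, int)]
    X = rng.normal(size=(40, 100))
    X[y == 1, :5] += 1.5
    print("top 5 genes:", aggregate_gene_ranks(X, y)[:5])
    ```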

  4. Delamination Defect Detection Using Ultrasonic Guided Waves in Advanced Hybrid Structural Elements

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Qi, Kevin "Xue"; Rose, Joseph L.; Weiland, Hasso

    2010-02-01

    Nondestructive testing for multilayered structures is challenging because of increased numbers of layers and plate thicknesses. In this paper, ultrasonic guided waves are applied to detect delamination defects inside a 23-layer Alcoa Advanced Hybrid Structural plate. A semi-analytical finite element (SAFE) method generates dispersion curves and wave structures in order to select appropriate wave structures to detect certain defects. One guided wave mode and frequency is chosen to achieve large in-plane displacements at regions of interest. The interactions of the selected mode with defects are simulated using finite element models. Experiments are conducted and compared with bulk wave measurements. It is shown that guided waves can detect deeply embedded damages inside thick multilayer fiber-metal laminates with suitable mode and frequency selection.

  5. Surfer: An Extensible Pull-Based Framework for Resource Selection and Ranking

    NASA Technical Reports Server (NTRS)

    Zolano, Paul Z.

    2004-01-01

    Grid computing aims to connect large numbers of geographically and organizationally distributed resources to increase computational power, resource utilization, and resource accessibility. In order to effectively utilize grids, users need to be connected to the best available resources at any given time. As grids are in constant flux, users cannot be expected to keep up with the configuration and status of the grid; thus they must be provided with automatic resource brokering for selecting and ranking resources meeting constraints and preferences they specify. This paper presents a new OGSI-compliant resource selection and ranking framework called Surfer that has been implemented as part of NASA's Information Power Grid (IPG) project. Surfer is highly extensible and may be integrated into any grid environment by adding information providers knowledgeable about that environment.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hajian, Amir; Alvarez, Marcelo A.; Bond, J. Richard, E-mail: ahajian@cita.utoronto.ca, E-mail: malvarez@cita.utoronto.ca, E-mail: bond@cita.utoronto.ca

    Making mock simulated catalogs is an important component of astrophysical data analysis. Selection criteria for observed astronomical objects are often too complicated to be derived from first principles. However, the existence of an observed group of objects is a well-suited problem for machine learning classification. In this paper we use one-class classifiers to learn the properties of an observed catalog of clusters of galaxies from ROSAT and to pick clusters from mock simulations that resemble the observed ROSAT catalog. We show how this method can be used to study the cross-correlations of thermal Sunyaev-Zel'dovich signals with number density maps of X-ray selected cluster catalogs. The method reduces the bias due to hand-tuning the selection function and is readily scalable to large catalogs with a high-dimensional space of astrophysical features.
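
    A minimal sketch of the idea, using a one-class SVM as a stand-in for whichever one-class classifier is used: train on features of the observed clusters, then keep only the mock clusters the model accepts. The two features, the rejection fraction (nu) and the toy numbers are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# hypothetical per-cluster features, e.g. log-mass and redshift
observed = rng.normal(loc=[14.5, 0.2], scale=[0.3, 0.1], size=(300, 2))
mock     = rng.normal(loc=[14.0, 0.4], scale=[0.6, 0.3], size=(5000, 2))

scaler = StandardScaler().fit(observed)
clf = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(scaler.transform(observed))

# keep only mock clusters that the classifier labels as "like the observed sample"
selected = mock[clf.predict(scaler.transform(mock)) == 1]
print(f"kept {len(selected)} of {len(mock)} mock clusters")
```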

  7. Vis-NIR spectrometric determination of Brix and sucrose in sugar production samples using kernel partial least squares with interval selection based on the successive projections algorithm.

    PubMed

    de Almeida, Valber Elias; de Araújo Gomes, Adriano; de Sousa Fernandes, David Douglas; Goicoechea, Héctor Casimiro; Galvão, Roberto Kawakami Harrop; Araújo, Mario Cesar Ugulino

    2018-05-01

    This paper proposes a new variable selection method for nonlinear multivariate calibration, combining the Successive Projections Algorithm for interval selection (iSPA) with the Kernel Partial Least Squares (Kernel-PLS) modelling technique. The proposed iSPA-Kernel-PLS algorithm is employed in a case study involving a Vis-NIR spectrometric dataset with complex nonlinear features. The analytical problem consists of determining Brix and sucrose content in samples from a sugar production system, on the basis of transflectance spectra. As compared to full-spectrum Kernel-PLS, the iSPA-Kernel-PLS models involve a smaller number of variables and display statistically significant superiority in terms of accuracy and/or bias in the predictions. Published by Elsevier B.V.

  8. An evaluation of selected NASA scientific and technical information products: Results of a pilot study

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Myron

    1989-01-01

    A pilot study was conducted to evaluate selected NASA scientific and technical information (STI) products. The study, which utilized survey research in the form of a self-administered mail questionnaire, had a two-fold purpose -- to gather baseline data regarding the use and perceived usefulness of selected NASA STI products and to develop/validate questions that could be used in a future study concerned with the role of the U.S. government technical report in aeronautics. The sample frame consisted of 25,000 members of the American Institute of Aeronautics and Astronautics in the U.S. with academic, government or industrial affiliation. Simple random sampling was used to select 2000 individuals to participate in the study. Three hundred fifty-three usable questionnaires (17 percent response rate) were received by the established cutoff date. The findings indicate that: (1) NASA STI is used and is generally perceived as being important; (2) the use rate for NASA-authored conference/meeting papers, journal articles, and technical reports is fairly uniform; (3) a considerable number of respondents are unfamiliar with STAR (Scientific and Technical Aerospace Reports), IAA (International Aerospace Abstracts), SCAN (Selected Current Aerospace Notices), and the RECON on-line retrieval system; (4) a considerable number of respondents who are familiar with these media do not use them; and (5) the perceived quality of NASA-authored journal articles and technical reports is very good.

  9. Online feature selection with streaming features.

    PubMed

    Wu, Xindong; Yu, Kui; Ding, Wei; Wang, Hao; Zhu, Xingquan

    2013-05-01

    We propose a new online feature selection framework for applications with streaming features, where the full feature space is unknown in advance. We define streaming features as features that flow in one by one over time, whereas the number of training examples remains fixed. This is in contrast with traditional online learning methods that only deal with sequentially added observations, with little attention being paid to streaming features. The critical challenges for Online Streaming Feature Selection (OSFS) include 1) the continuous growth of feature volumes over time, 2) a large feature space, possibly of unknown or infinite size, and 3) the unavailability of the entire feature set before learning starts. In this paper, we present a novel Online Streaming Feature Selection method to select strongly relevant and nonredundant features on the fly. An efficient Fast-OSFS algorithm is proposed to improve feature selection performance. The proposed algorithms are evaluated extensively on high-dimensional datasets and also with a real-world case study on impact crater detection. Experimental results demonstrate that the algorithms achieve better compactness and higher prediction accuracy than existing streaming feature selection algorithms.
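
    A minimal sketch of the streaming-selection pattern under the assumptions stated in the comments: each arriving feature is accepted only if it is sufficiently relevant to the target and not redundant with an already accepted feature. The correlation thresholds are placeholders, and this is not the authors' OSFS or Fast-OSFS algorithm.

```python
import numpy as np

def stream_select(y, feature_stream, rel_thresh=0.1, red_thresh=0.9):
    """Accept a feature if it is correlated with the target and not nearly
    collinear with a feature that has already been selected."""
    selected = []
    for f in feature_stream:                       # features arrive one by one
        if abs(np.corrcoef(f, y)[0, 1]) < rel_thresh:
            continue                               # not relevant enough
        if any(abs(np.corrcoef(f, s)[0, 1]) > red_thresh for s in selected):
            continue                               # redundant with an accepted feature
        selected.append(f)
    return selected

rng = np.random.default_rng(0)
y = rng.normal(size=200)
stream = [y + rng.normal(scale=1.0, size=200) for _ in range(20)]  # noisy copies of the target
stream += [rng.normal(size=200) for _ in range(20)]                # pure noise features
print(len(stream_select(y, stream)), "features accepted")
```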

  10. New insights into time series analysis. II - Non-correlated observations

    NASA Astrophysics Data System (ADS)

    Ferreira Lopes, C. E.; Cross, N. J. G.

    2017-08-01

    Context. Statistical parameters are used to draw conclusions in a vast number of fields such as finance, weather, industry, and science. These parameters are also used to identify variability patterns in photometric data in order to select non-stochastic variations that are indicative of astrophysical effects. New, more efficient selection methods are mandatory to analyze the huge amount of astronomical data. Aims: We seek to improve the current methods used to select non-stochastic variations in non-correlated data. Methods: We used standard and new data-mining parameters to analyze non-correlated data to find the best way to discriminate between stochastic and non-stochastic variations. A new approach that includes a modified Strateva function was introduced to select non-stochastic variations. Monte Carlo simulations and public time-domain data were used to estimate its accuracy and performance. Results: We introduce 16 modified statistical parameters covering different features of statistical distributions such as average, dispersion, and shape parameters. Many dispersion and shape parameters are unbound parameters, i.e. equations that do not require the calculation of the average. Unbound parameters are computed with a single loop, hence decreasing running time. Moreover, the majority of these parameters have lower errors than previous parameters, which is mainly observed for distributions with few measurements. A set of non-correlated variability indices, sample size corrections, and a new noise model, along with tests of different apertures and cut-offs on the data (BAS approach), are introduced. The number of mis-selections is reduced by about 520% using a single waveband and 1200% combining all wavebands. On the other hand, the even-mean also improves the correlated indices introduced in Paper I. The mis-selection rate is reduced by about 18% if the even-mean is used instead of the mean to compute the correlated indices in the WFCAM database. Even-statistics allows us to improve the effectiveness of both correlated and non-correlated indices. Conclusions: The selection of non-stochastic variations is improved by non-correlated indices. The even-averages provide a better estimation of the mean and median for almost all statistical distributions analyzed. The correlated variability indices, which are proposed in the first paper of this series, are also improved if the even-mean is used. The even-parameters will also be useful for classifying light curves in the last step of this project. We consider that the first step of this project, where we set out new techniques and methods that provide a huge improvement in the efficiency of the selection of variable stars, is now complete. Many of these techniques may be useful for a large number of fields. Next, we will commence a new step of this project regarding the analysis of period search methods.

  11. Application of queuing model in Dubai's busiest megaplex

    NASA Astrophysics Data System (ADS)

    Bhagchandani, Maneesha; Bajpai, Priti

    2013-09-01

    This paper provides a study and analysis of the extremely busy booking counters at a megaplex in Dubai using the queuing model and simulation. Dubai is an emirate in the UAE with a multicultural population, the majority of which is foreign born. Cinema is one of the major forms of entertainment. There are more than 13 megaplexes, each with a number of screens ranging from 3 to 22, screening movies in English, Arabic, Hindi and other languages. It has been observed that during the weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study. Queuing theory satisfies the model when tested in a real-time situation. The concepts of arrival rate, service rate, utilization rate, waiting time in the system and average number of people in the queue, using Little's Theorem and the M/M/s queuing model along with simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the megaplex and to give recommendations to optimize the use of the booking counters.
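
    For reference, the standard M/M/s quantities mentioned above (utilization, mean queue length via the Erlang-C formula, and mean waiting time via Little's law) can be computed as in the sketch below; the arrival and service rates used are hypothetical, not the megaplex data.

```python
import math

def mms_metrics(lam, mu, s):
    """Basic M/M/s measures: lam = arrival rate, mu = service rate per counter, s = counters."""
    rho = lam / (s * mu)                       # utilization per counter (must be < 1)
    a = lam / mu                               # offered load
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(s))
                + a**s / (math.factorial(s) * (1 - rho)))
    lq = p0 * a**s * rho / (math.factorial(s) * (1 - rho) ** 2)   # mean queue length (Erlang C)
    wq = lq / lam                              # Little's law: mean wait in the queue
    return rho, lq, wq

# hypothetical numbers: 180 customers/hour, each counter serves 40/hour, 6 counters open
rho, lq, wq = mms_metrics(lam=180, mu=40, s=6)
print(f"utilization={rho:.2f}, queue length={lq:.1f}, wait={wq * 60:.1f} min")
```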

  12. Free Web-based personal health records: an analysis of functionality.

    PubMed

    Fernández-Alemán, José Luis; Seva-Llor, Carlos Luis; Toval, Ambrosio; Ouhbi, Sofia; Fernández-Luque, Luis

    2013-12-01

    This paper analyzes and assesses the functionality of free Web-based PHRs as regards health information, user actions and connection with other tools. A systematic literature review in Medline, ACM Digital Library, IEEE Digital Library and ScienceDirect was used to select 19 free Web-based PHRs from the 47 PHRs identified. The results show that none of the PHRs selected met 100% of the 28 functions presented in this paper. Two free Web-based PHRs target a particular public. Around 90% of the PHRs identified allow users throughout the world to create their own profiles without any geographical restrictions. Only half of the PHRs selected provide physicians with user actions. Few PHRs can connect with other tools. There was considerable variability in the types of data included in free Web-based PHRs. Functionality may have implications for PHR use and adoption, particularly as regards patients with chronic illnesses or disabilities. Support for standard medical document formats and protocols is required to enable data to be exchanged with other stakeholders in the health care domain. The results of our study may assist users in selecting the PHR that best fits their needs, since no significant connection exists between the number of functions of the PHRs identified and their popularity.

  13. Instabilities of convection patterns in a shear-thinning fluid between plates of finite conductivity

    NASA Astrophysics Data System (ADS)

    Varé, Thomas; Nouar, Chérif; Métivier, Christel

    2017-10-01

    Rayleigh-Bénard convection in a horizontal layer of a non-Newtonian fluid between slabs of arbitrary thickness and finite thermal conductivity is considered. The first part of the paper deals with the primary bifurcation and the relative stability of convective patterns at threshold. Weakly nonlinear analysis combined with the Stuart-Landau equation is used. The competition between squares and rolls, as a function of the shear-thinning degree of the fluid, the slabs' thickness, and the ratio of the thermal conductivity of the slabs to that of the fluid, is investigated. Computations of heat transfer coefficients are in agreement with the maximum heat transfer principle. The second part of the paper concerns the stability of the convective patterns toward spatial perturbations and the determination of the bandwidth of stable wave numbers in the neighborhood of the critical Rayleigh number. The approach used is based on the Ginzburg-Landau equations. The study of roll stability shows that: (i) for low shear-thinning effects, the band of stable wave numbers is bounded by zigzag instability and cross-roll instability. Furthermore, the marginal cross-roll stability boundary enlarges with increasing shear-thinning properties; (ii) for high shear-thinning effects, Eckhaus instability becomes more dangerous than cross-roll instability. For square patterns, the wave number selection is always restricted by zigzag instability and by "rectangular Eckhaus" instability. In addition, the width of the stable wave number band decreases with increasing shear-thinning effects. Numerical simulations of the planform evolution are also presented to illustrate the different instabilities considered in the paper.

  14. A machine learning heuristic to identify biologically relevant and minimal biomarker panels from omics data

    PubMed Central

    2015-01-01

    Background Investigations into novel biomarkers using omics techniques generate large amounts of data. Due to their size and numbers of attributes, these data are suitable for analysis with machine learning methods. A key component of typical machine learning pipelines for omics data is feature selection, which is used to reduce the raw high-dimensional data into a tractable number of features. Feature selection needs to balance the objective of using as few features as possible, while maintaining high predictive power. This balance is crucial when the goal of data analysis is the identification of highly accurate but small panels of biomarkers with potential clinical utility. In this paper we propose a heuristic for the selection of very small feature subsets, via an iterative feature elimination process that is guided by rule-based machine learning, called RGIFE (Rule-guided Iterative Feature Elimination). We use this heuristic to identify putative biomarkers of osteoarthritis (OA), articular cartilage degradation and synovial inflammation, using both proteomic and transcriptomic datasets. Results and discussion Our RGIFE heuristic increased the classification accuracies achieved on all datasets compared with using no feature selection, and performed well in a comparison with other feature selection methods. Using this method the datasets were reduced to a smaller number of genes or proteins, including those known to be relevant to OA, cartilage degradation and joint inflammation. The results show the RGIFE feature reduction method to be suitable for analysing both proteomic and transcriptomic data. Methods that generate large ‘omics’ datasets are increasingly being used in the area of rheumatology. Conclusions Feature reduction methods are advantageous for the analysis of omics data in the field of rheumatology, as the applications of such techniques are likely to result in improvements in diagnosis, treatment and drug discovery. PMID:25923811
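
    A rough sketch of the iterative-elimination idea follows: repeatedly drop the least important features as long as cross-validated accuracy does not degrade. It uses a random forest in place of the rule-based learner that guides RGIFE, and the block size and stopping tolerance are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def iterative_elimination(X, y, min_features=5, block=10):
    """Drop the least important features in blocks while CV accuracy does not fall."""
    keep = np.arange(X.shape[1])
    best = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                           X[:, keep], y, cv=5).mean()
    while len(keep) - block >= min_features:
        rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:, keep], y)
        trial = keep[np.argsort(rf.feature_importances_)[block:]]   # drop the weakest block
        acc = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                              X[:, trial], y, cv=5).mean()
        if acc + 1e-3 < best:
            break                                   # accuracy started to degrade; stop
        keep, best = trial, max(best, acc)
    return keep

X, y = make_classification(n_samples=100, n_features=200, n_informative=10, random_state=0)
print(len(iterative_elimination(X, y)), "features retained")
```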

  15. Link Correlation Based Transmit Sector Antenna Selection for Alamouti Coded OFDM

    NASA Astrophysics Data System (ADS)

    Ahn, Chang-Jun

    In MIMO systems, the deployment of a multiple antenna technique can enhance the system performance. However, since the cost of RF transmitters is much higher than that of antennas, there is growing interest in techniques that use a larger number of antennas than the number of RF transmitters. These methods rely on selecting the optimal transmitter antennas and connecting them to the respective RF transmitters. In this case, feedback information (FBI) is required to select the optimal transmitter antenna elements. Since FBI is control overhead, the rate of the feedback is limited. This motivates the study of limited feedback techniques, where only partial or quantized information from the receiver is conveyed back to the transmitter. However, in MIMO/OFDM systems, it is difficult to develop an effective FBI quantization method for choosing the space-time, space-frequency, or space-time-frequency processing due to the numerous subchannels. Moreover, MIMO/OFDM systems require antenna separation of 5 ∼ 10 wavelengths to keep the correlation coefficient below 0.7 to achieve a diversity gain. In this case, the base station requires a large space to set up multiple antennas. To reduce these problems, in this paper, we propose link correlation based transmit sector antenna selection for Alamouti coded OFDM without FBI.

  16. Analysis of the GRNs Inference by Using Tsallis Entropy and a Feature Selection Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Fabrício M.; de Oliveira, Evaldo A.; Cesar, Roberto M.

    An important problem in the bioinformatics field is to understand how genes are regulated and interact through gene networks. This knowledge can be helpful for many applications, such as disease treatment design and drug development. For this reason, it is very important to uncover the functional relationships among genes and then to construct the gene regulatory network (GRN) from temporal expression data. However, this task usually involves data with a large number of variables and a small number of observations. In this way, there is a strong motivation to use pattern recognition and dimensionality reduction approaches. In particular, feature selection is especially important in order to select the most important predictor genes that can explain some phenomena associated with the target genes. This work presents a first study of the sensitivity of entropy-based methods to the entropy functional form, applied to the problem of topology recovery of GRNs. The generalized entropy proposed by Tsallis is used to study this sensitivity. The inference process is based on a feature selection approach, which is applied to simulated temporal expression data generated by an artificial gene network (AGN) model. The inferred GRNs are validated in terms of global network measures. Some interesting conclusions can be drawn from the experimental results, as reported for the first time in the present paper.
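
    The generalized entropy referred to above is, for a probability vector p and entropic index q, S_q(p) = (1 - Σ_i p_i^q) / (q - 1), which recovers the Shannon entropy as q → 1. A small sketch:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); q -> 1 recovers Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))      # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
for q in (0.5, 1.0, 2.0):
    print(q, round(tsallis_entropy(p, q), 4))
```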

  17. 32 CFR 1615.6 - Selective service number.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    § 1615.6 Selective service number. Every registrant shall be given a selective service number. The Social Security Account Number will not be used for this purpose.

  18. The influence of construction measurement and structure storey on seismic performance of masonry structure

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhao, Hexian; Yan, Peilei

    2017-08-01

    The damage to masonry structures in earthquakes is generally more severe than that to other structures. Through analysis of two buildings at Xuankou middle school that suffered typical damage in the Wenchuan earthquake, we found that the number of storeys and the construction measures had a great influence on the seismic performance of masonry structures. This paper takes a teachers' dormitory in Xuankou middle school as an example and selects the structural arrangement and the number of storeys as two independent variables to design the working conditions. The difference in seismic performance of the masonry structure under the two variables is then studied by the finite element analysis method.

  19. Assessment of metal ion concentration in water with structured feature selection.

    PubMed

    Naula, Pekka; Airola, Antti; Pihlasalo, Sari; Montoya Perez, Ileana; Salakoski, Tapio; Pahikkala, Tapio

    2017-10-01

    We propose a cost-effective system for the determination of metal ion concentration in water, addressing a central issue in water resources management. The system combines novel luminometric label array technology with a machine learning algorithm that selects a minimal number of array reagents (modulators) and liquid sample dilutions that enable accurate quantification. The algorithm is able to identify the optimal modulators and sample dilutions, leading to cost reductions since less manual labour and fewer resources are needed. Inferring the ion detector involves a unique type of structured feature selection problem, which we formalize in this paper. We propose a novel Cartesian greedy forward feature selection algorithm for solving the problem. The novel algorithm was evaluated in the concentration assessment of five metal ions, and its performance was compared to two known feature selection approaches. The results demonstrate that the proposed system can assist in lowering the costs with minimal loss in accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
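
    The sketch below shows plain greedy forward selection with cross-validated ridge regression as the scoring model; it illustrates the general mechanism only and ignores the Cartesian (modulator x dilution) structure the authors exploit. The feature budget, the regressor and the toy data are assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

def greedy_forward(X, y, budget=5):
    """Add, one at a time, the feature whose inclusion most improves cross-validated R^2."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < budget:
        scores = [(cross_val_score(Ridge(alpha=1.0), X[:, selected + [j]], y, cv=5).mean(), j)
                  for j in remaining]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 30))                                  # e.g. 30 modulator/dilution readouts
y = 2 * X[:, 3] - X[:, 7] + rng.normal(scale=0.1, size=80)     # concentration depends on two of them
print(greedy_forward(X, y))
```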

  20. Improved Sparse Multi-Class SVM and Its Application for Gene Selection in Cancer Classification

    PubMed Central

    Huang, Lingkang; Zhang, Hao Helen; Zeng, Zhao-Bang; Bushel, Pierre R.

    2013-01-01

    Background Microarray techniques provide promising tools for cancer diagnosis using gene expression profiles. However, molecular diagnosis based on high-throughput platforms presents great challenges due to the overwhelming number of variables versus the small sample size and the complex nature of multi-type tumors. Support vector machines (SVMs) have shown superior performance in cancer classification due to their ability to handle high dimensional low sample size data. The multi-class SVM algorithm of Crammer and Singer provides a natural framework for multi-class learning. Despite its effective performance, the procedure utilizes all variables without selection. In this paper, we propose to improve the procedure by imposing shrinkage penalties in learning to enforce solution sparsity. Results The original multi-class SVM of Crammer and Singer is effective for multi-class classification but does not conduct variable selection. We improved the method by introducing soft-thresholding type penalties to incorporate variable selection into multi-class classification for high dimensional data. The new methods were applied to simulated data and two cancer gene expression data sets. The results demonstrate that the new methods can select a small number of genes for building accurate multi-class classification rules. Furthermore, the important genes selected by the methods overlap significantly, suggesting general agreement among different variable selection schemes. Conclusions High accuracy and sparsity make the new methods attractive for cancer diagnostics with gene expression data and defining targets of therapeutic intervention. Availability: The source MATLAB code is available from http://math.arizona.edu/~hzhang/software.html. PMID:23966761

  1. Differences between selection on sex versus recombination in red queen models with diploid hosts.

    PubMed

    Agrawal, Aneil F

    2009-08-01

    The Red Queen hypothesis argues that parasites generate selection for genetic mixing (sex and recombination) in their hosts. A number of recent papers have examined this hypothesis using models with haploid hosts. In these haploid models, sex and recombination are selectively equivalent. However, sex and recombination are not equivalent in diploids because selection on sex depends on the consequences of segregation as well as recombination. Here I compare how parasites select on modifiers of sexual reproduction and modifiers of recombination rate. Across a wide set of parameters, parasites tend to select against both sex and recombination, though recombination is favored more often than is sex. There is little correspondence between the conditions favoring sex and those favoring recombination, indicating that the direction of selection on sex is often determined by the effects of segregation, not recombination. Moreover, when sex is favored, it is usually due to a long-term advantage, whereas short-term effects are often responsible for selection favoring recombination. These results strongly indicate that Red Queen models focusing exclusively on the effects of recombination cannot be used to infer the type of selection on sex that is generated by parasites on diploid hosts.

  2. Upweighting rare favourable alleles increases long-term genetic gain in genomic selection programs.

    PubMed

    Liu, Huiming; Meuwissen, Theo H E; Sørensen, Anders C; Berg, Peer

    2015-03-21

    The short-term impact of using different genomic prediction (GP) models in genomic selection has been intensively studied, but their long-term impact is poorly understood. Furthermore, long-term genetic gain of genomic selection is expected to improve by using Jannink's weighting (JW) method, in which rare favourable marker alleles are upweighted in the selection criterion. In this paper, we extend the JW method by including an additional parameter to decrease the emphasis on rare favourable alleles over the time horizon, with the purpose of further improving the long-term genetic gain. We call this new method dynamic weighting (DW). The paper explores the long-term impact of different GP models with or without weighting methods. Different selection criteria were tested by simulating a population of 500 animals with truncation selection of five males and 50 females. Selection criteria included unweighted and weighted genomic estimated breeding values using the JW or DW methods, for which ridge regression (RR) and Bayesian lasso (BL) were used to estimate marker effects. The impacts of these selection criteria were compared under three genetic architectures, i.e. varying numbers of QTL for the trait and for two time horizons of 15 (TH15) or 40 (TH40) generations. For unweighted GP, BL resulted in up to 21.4% higher long-term genetic gain and 23.5% lower rate of inbreeding under TH40 than RR. For weighted GP, DW resulted in 1.3 to 5.5% higher long-term gain compared to unweighted GP. JW, however, showed a 6.8% lower long-term genetic gain relative to unweighted GP when BL was used to estimate the marker effects. Under TH40, both DW and JW obtained significantly higher genetic gain than unweighted GP. With DW, the long-term genetic gain was increased by up to 30.8% relative to unweighted GP, and also increased by 8% relative to JW, although at the expense of a lower short-term gain. Irrespective of the number of QTL simulated, BL is superior to RR in maintaining genetic variance and therefore results in higher long-term genetic gain. Moreover, DW is a promising method with which high long-term genetic gain can be expected within a fixed time frame.
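
    As a rough sketch of the weighting idea, the snippet below upweights marker effects for rare favourable alleles (using 1/sqrt(p), in the spirit of Jannink's scheme) and fades the upweighting over generations to mimic dynamic weighting. The exact weight function, the decay schedule and the toy genotypes are assumptions, not the authors' implementation.

```python
import numpy as np

def weighted_gebv(genotypes, effects, generation=0, horizon=40):
    """Weight each marker effect by how rare its favourable allele is, fading over time."""
    # favourable allele = the allele whose effect increases the trait
    p = np.where(effects > 0, genotypes.mean(axis=0) / 2, 1 - genotypes.mean(axis=0) / 2)
    p = np.clip(p, 0.01, 1.0)                              # avoid dividing by ~0 frequencies
    decay = max(0.0, 1.0 - generation / horizon)           # reduce the upweighting each generation
    w = 1.0 + decay * (1.0 / np.sqrt(p) - 1.0)
    return genotypes @ (w * effects)                       # weighted breeding values per animal

rng = np.random.default_rng(0)
geno = rng.binomial(2, 0.3, size=(500, 1000))              # 500 animals, 1000 markers coded 0/1/2
eff = rng.normal(scale=0.05, size=1000)                    # estimated marker effects
print(weighted_gebv(geno, eff, generation=5)[:5])
```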

  3. Front and Back Movement Analysis of a Triangle-Structured Three-Wheeled Omnidirectional Mobile Robot by Varying the Angles between Two Selected Wheels

    PubMed Central

    Mohanraj, A. P.; Elango, A.; Reddy, Mutra Chanakya

    2016-01-01

    Omnidirectional robots can move in all directions without steering their wheels and can rotate clockwise and counterclockwise about their axis. In this paper, we focus only on front and back movement, to analyse the movements of square- and triangle-structured omnidirectional robots. An omnidirectional mobile robot shows different performance depending on the number of wheels and the design of its chassis. Research is ongoing in this field to improve the accurate movement capability of omnidirectional mobile robots. This paper presents the design of a unique Angle Variable Chassis (AVC) device for linear movement analysis of a three-wheeled omnidirectional mobile robot (TWOMR) at various angles (θ) between the wheels. A basic mobility algorithm is developed by varying the angle between the two selected omnidirectional wheels of the TWOMR. The experiment is carried out by varying the angle (θ = 30°, 45°, 60°, 90°, and 120°) between the two selected omniwheels and analysing the movement of the TWOMR in the forward and reverse directions on a smooth cement surface. The TWOMR's performance at the various angles (θ) is then compared to identify its advantages and weaknesses. The conclusion of the paper identifies the angle (θ) at which the TWOMR moves most effectively and discusses applications of the TWOMR in different situations. PMID:26981585

  4. Front and Back Movement Analysis of a Triangle-Structured Three-Wheeled Omnidirectional Mobile Robot by Varying the Angles between Two Selected Wheels.

    PubMed

    Mohanraj, A P; Elango, A; Reddy, Mutra Chanakya

    2016-01-01

    Omnidirectional robots can move in all directions without steering their wheels and can rotate clockwise and counterclockwise about their axis. In this paper, we focus only on front and back movement, to analyse the movements of square- and triangle-structured omnidirectional robots. An omnidirectional mobile robot shows different performance depending on the number of wheels and the design of its chassis. Research is ongoing in this field to improve the accurate movement capability of omnidirectional mobile robots. This paper presents the design of a unique Angle Variable Chassis (AVC) device for linear movement analysis of a three-wheeled omnidirectional mobile robot (TWOMR) at various angles (θ) between the wheels. A basic mobility algorithm is developed by varying the angle between the two selected omnidirectional wheels of the TWOMR. The experiment is carried out by varying the angle (θ = 30°, 45°, 60°, 90°, and 120°) between the two selected omniwheels and analysing the movement of the TWOMR in the forward and reverse directions on a smooth cement surface. The TWOMR's performance at the various angles (θ) is then compared to identify its advantages and weaknesses. The conclusion of the paper identifies the angle (θ) at which the TWOMR moves most effectively and discusses applications of the TWOMR in different situations.

  5. [Investigation methodology and application on scientific and technological personnel of traditional Chinese medical resources based on data from Chinese scientific research paper].

    PubMed

    Li, Hai-yan; Li, Yuan-hai; Yang, Yang; Liu, Fang-zhou; Wang, Jing; Tian, Ye; Yang, Ce; Liu, Yang; Li, Meng; Sun, Li-ying

    2015-12-01

    The aim of this study is to identify the present status of the scientific and technological personnel in the field of traditional Chinese medicine (TCM) resource science. Based on data from Chinese scientific research papers, an investigation was conducted regarding the number of personnel, their distribution, their paper output, their scientific research teams, high-yield authors and highly cited authors. The study covers seven subfields (traditional Chinese medicine identification, quality standards, Chinese medicine cultivation, harvest processing of TCM, market development, resource protection and resource management), as well as 82 widely used Chinese medicine species, such as Ginseng and Radix Astragali. One hundred and fifteen domain authority experts were selected based on the data on high-yield and highly cited authors. The database system platform "Skilled Scientific and Technological Personnel in the field of Traditional Chinese Medicine Resource Science-Chinese papers" was established. This platform provides retrieval of the personnel, their paper output, and their core research teams by inputting the study field, year, and Chinese medicine species. The investigation provides basic data on scientific and technological personnel in the field of traditional Chinese medicine resource science for administrative agencies, as well as evidence for the selection of scientific and technological personnel and the construction of scientific research teams.

  6. A voluntary deductible in health insurance: the more years you opt for it, the lower your premium?

    PubMed

    van Winssen, K P M; van Kleef, R C; van de Ven, W P M M

    2017-03-01

    Adverse selection regarding a voluntary deductible (VD) in health insurance implies that insured only opt for a VD if they expect no (or few) healthcare expenses. This paper investigates two potential strategies to reduce adverse selection: (1) differentiating the premium to the duration of the contract for which the VD holds (ex-ante approach) and (2) differentiating the premium to the number of years for which insured have opted for a VD (ex-post approach). It can be hypothesized that premiums will decrease with the duration of the contract or the number of years for which insured have opted for a VD, providing an incentive to insured to opt for a deductible also in (incidental) years they expect relatively high expenses. To test this hypothesis, we examine which premium patterns would occur under these strategies using data on healthcare expenses and risk characteristics of over 750,000 insured from 6 years. Our results show that, under the assumptions made, only without risk equalization the premiums could decrease with the duration of the contract or the number of years for which insured have opted for a VD. With (sophisticated) risk equalization, decreasing premiums seem unfeasible, both under the ex-ante and ex-post approach. Given these findings, we are sceptical about the feasibility of these strategies to counteract adverse selection.

  7. Analysis of Content Shared in Online Cancer Communities: Systematic Review.

    PubMed

    van Eenbergen, Mies C; van de Poll-Franse, Lonneke V; Krahmer, Emiel; Verberne, Suzan; Mols, Floortje

    2018-04-03

    The content that cancer patients and their relatives (ie, posters) share in online cancer communities has been researched in various ways. In the past decade, researchers have used automated analysis methods in addition to manual coding methods. Patients, providers, researchers, and health care professionals can learn from experienced patients, provided that their experience is findable. The aim of this study was to systematically review all relevant literature that analyzes user-generated content shared within online cancer communities. We reviewed the quality of available research and the kind of content that posters share with each other on the internet. A computerized literature search was performed via PubMed (MEDLINE), PsycINFO (5 and 4 stars), Cochrane Central Register of Controlled Trials, and ScienceDirect. The last search was conducted in July 2017. Papers were selected if they included the following terms: (cancer patient) and (support group or health communities) and (online or internet). We selected 27 papers and then subjected them to a 14-item quality checklist independently scored by 2 investigators. The methodological quality of the selected studies varied: 16 were of high quality and 11 were of adequate quality. Of those 27 studies, 15 were manually coded, 7 automated, and 5 used a combination of methods. The best results can be seen in the papers that combined both analytical methods. The number of analyzed posts ranged from 200 to 1,500,000; the number of analyzed posters ranged from 75 to 90,000. The studies analyzing large numbers of posts mainly related to breast cancer, whereas those analyzing small numbers were related to other types of cancers. A total of 12 studies involved partly or entirely automatic analysis of the user-generated content. All the authors referred to two main content categories: informational support and emotional support. In all, 15 studies reported only on the content, 6 studies explicitly reported on content and social aspects, and 6 studies focused on emotional changes. In the future, increasing amounts of user-generated content will become available on the internet. The results of content analysis, especially of the larger studies, give detailed insights into patients' concerns and worries, which can then be used to improve cancer care. To make the results of such analyses as usable as possible, automatic content analysis methods will need to be improved through interdisciplinary collaboration. ©Mies C van Eenbergen, Lonneke V van de Poll-Franse, Emiel Krahmer, Suzan Verberne, Floortje Mols. Originally published in JMIR Cancer (http://cancer.jmir.org), 03.04.2018.

  8. University Examinations and Standardized Testing: Principles, Experience, and Policy Options. World Bank Technical Paper Number 78. Proceedings of a Seminar on the Uses of Standardized Tests and Selection Examinations (Beijing, China, April 1985).

    ERIC Educational Resources Information Center

    Heyneman, Stephen P., Ed.; Fagerlind, Ingemar, Ed.

    In September 1984, the Chinese government asked the Economic Development Institute of the World Bank to assist the officials of the Chinese Ministry of Education in thinking through some policy options for examinations and standardized testing. This document summarizes the descriptions of testing programs and advice provided to these Chinese…

  9. Imaging techniques in digital forensic investigation: a study using neural networks

    NASA Astrophysics Data System (ADS)

    Williams, Godfried

    2006-09-01

    Imaging techniques have been applied to a number of applications, such as translation and classification problems in medicine and defence. This paper examines the application of imaging techniques in digital forensics investigation using neural networks. A review of applications of digital image processing is presented, while a pedagogical analysis of computer forensics is also highlighted. A data set describing selected images in different forms is used in the simulation and experimentation.

  10. Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial

    PubMed Central

    Ibrahim, Ahmed; Alfa, Attahiru

    2017-01-01

    This paper is intended to serve as an overview of, and mostly a tutorial to illustrate, the optimization techniques used in several different key design aspects that have been considered in the literature on wireless sensor networks (WSNs). It targets researchers who are new to the mathematical optimization tool and wish to apply it to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and giving an overview of some of its techniques that could be helpful for design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and for each we explain the formulation techniques and the solution methods implemented. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques and experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes. PMID:28763039

  11. Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial.

    PubMed

    Ibrahim, Ahmed; Alfa, Attahiru

    2017-08-01

    This paper is intended to serve as an overview of, and mostly a tutorial to illustrate, the optimization techniques used in several different key design aspects that have been considered in the literature on wireless sensor networks (WSNs). It targets researchers who are new to the mathematical optimization tool and wish to apply it to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and giving an overview of some of its techniques that could be helpful for design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and for each we explain the formulation techniques and the solution methods implemented. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques and experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes.

  12. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper evaluates the frequency with which statistical methods were applied in papers published in six selected national medical journals in the years 1988-1992. The journals chosen were: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny and Zdrowie Publiczne. From the respective volumes of Pol. Tyg. Lek., a number of papers corresponding to the average in the remaining journals was selected at random. The evaluation did not include papers in which no statistical analysis was implemented, whether national or international publications; review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics were likewise excluded. The number of papers was determined for each volume. Next, the way a sample was obtained in each study was analysed, distinguishing two categories: random and purposive selection. Attention was also paid to the presence of a control sample in the individual papers and to the description of sample characteristics, classified in three categories: complete, partial and lacking. The results of the evaluation are presented in tables and figures (Tab. 1, 3). The rate at which statistical methods were employed in the analysed papers was determined for the relevant volumes of the six selected national medical journals for 1988-1992, together with the number of papers in which no statistical methods were used, and the frequency of the individual statistical methods was analysed. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) and to the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (one-way and two-way classifications), non-parametric tests of significance, and correlation and regression. Papers using multiple correlation, multiple regression or more complex methods of studying relationships between two or more variables were counted among those using correlation and regression, as were other methods, e.g. statistical methods used in epidemiology (incidence and morbidity coefficients, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods and others. On the basis of the performed evaluation, it was established that statistical methods were employed in 61.1-66.0% of the analysed papers in the six selected national medical journals in 1988-1992 (Tab. 3), which is generally similar to the frequencies reported for English-language medical journals. Overall, no significant differences were found in the frequency of the statistical methods applied (Tab. 4) or in the frequency of random sample selection (Tab. 3) across the analysed papers in the respective years 1988-1992. The statistical methods used most frequently in the analysed papers for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1% of the papers analysed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and scientific-didactic staff.

  13. DENTINE CARIES: ACID-TOLERANT MICROORGANISMS AND ASPECTS ON COLLAGEN DEGRADATION.

    PubMed

    Lager, Anders Hedenbjörk

    2014-01-01

    Dental caries is a common disease all over the world, despite the fact that it can be both effectively prevented and treated. It is driven by acids produced by oral microorganisms as a consequence of their metabolism of dietary carbohydrates. Given enough acid challenge, eventually the tooth enamel barrier will be broken down, and the carious lesion will extend into underlying hard tissue, forming a macroscopic cavity in the dentine. In comparison to biofilm on enamel, a dentine carious lesion provides a vastly different environment for the residing microorganisms. The environment influences the types and numbers of microorganisms that can colonize the dentine caries lesion. The overall aims of this thesis are to enumerate and further study microorganisms found in established dentine caries lesions and also to illuminate how host-derived proteolytic enzymes might contribute to the degradation of the dentine collagen, not only to better understand the caries process in dentine but also to provide impetus for new methods to influence the natural progression of caries lesions. In Paper I, the numbers of remaining viable microorganisms after completed excavation using two excavation methods were investigated. Samples of carious dentine tissue were collected before and after excavation and cultivated on different agar media in different atmospheres. Analysis was performed by counting the number of colony-forming units (CFUs). Key findings: The number of remaining microorganisms after excavation was low for both methods, but some microorganisms always remained in the cavity floors even when the cavities were judged as caries free using normal clinical criteria. In Paper II, the acid tolerant microbiota in established dentine caries lesions was investigated. Samples were taken as in Paper I, but on three levels (superficial, center of lesion, floor of lesion after completed excavation). The samples were cultivated in anaerobic conditions on solid pH-selective agar media of different acidity. Key findings: Each investigated lesion harbored a unique microbiota in terms of both species composition and numbers of microorganisms. This indicates that various combinations of aciduric microorganisms can colonize, survive in and probably also propagate dentine carious lesions. We also found that solid pH-selective agars can be used successfully to select acid-tolerant microorganisms in caries lesions. This would preserve their phenotypic traits for further study. In Paper III, the relation between salivary levels of matrix metalloproteinase-8 (MMP-8), salivary levels of tissue inhibitor of MMP (TIMP-1), and the presence of manifest caries lesions in a large number of subjects was investigated. Saliva samples were collected and analyzed for concentrations of MMP-8, TIMP-1 and total protein using immunofluorometric assays, enzyme linked immunosorbent assays and Bradford assays, respectively. Key findings: Subjects with manifest caries lesions had significantly elevated levels of salivary MMP-8 compared to subjects without caries lesions. TIMP-1 was not significant in any case. In Paper IV, a new method for generating bioactive demineralized dentine matrix substrate (DDM) was developed using a dialysis system and two different demineralization approaches (acetic acid or EDTA). The generated DDM was subsequently analyzed for the presence of type 1 collagen, active MMP-8 and hydroxyproline (HYP) levels using SDS-PAGE, ELISA or immunofluorescence assay.
Key findings: Both demineralization methods produced a substrate rich in collagen and with preserved MMP-8 activity. This report presents new knowledge on the composition of the acid tolerant dentine caries microbiota from three levels in dentine carious lesions and on the efficacy of operative caries removal on the numbers of viable microorganisms in the caries free cavity using two operative methods. Moreover, the basic mechanisms behind collagen degradation in the dentine caries process are studied from both a clinical and laboratory perspective. The report also provides a reference for further studies on dentine caries microbiology and dentine caries collagen degradation mechanisms, both of which are known only in part.

  14. Relevance popularity: A term event model based feature selection scheme for text classification.

    PubMed

    Feng, Guozhong; An, Baiguo; Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

    Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency with which a given term appears in each document has not been fully investigated, even though it is a promising feature for producing accurate classifications. In this paper, we propose a new feature selection scheme based on a term event multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing the inner parameters by their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments with two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods.
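
    A minimal sketch of scoring terms with a multinomial naive Bayes model: fit the model, then rank terms by the absolute log-ratio of their class-conditional probabilities. This is a simplified stand-in for the factorized matching score derived in the paper; the toy corpus is an assumption.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["the match ended in a draw", "the team won the cup final",
        "parliament passed the budget bill", "the senate debated the new law"]
labels = [0, 0, 1, 1]                          # 0 = sports, 1 = politics

vec = CountVectorizer().fit(docs)
counts = vec.transform(docs)                   # term-frequency (event) counts per document
nb = MultinomialNB().fit(counts, labels)

# score each term by the absolute log-ratio of its class-conditional probabilities
score = np.abs(nb.feature_log_prob_[0] - nb.feature_log_prob_[1])
terms = np.array(vec.get_feature_names_out())
print(terms[np.argsort(-score)[:5]])           # most class-discriminative terms
```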

  15. Self-organization in a distributed coordination game through heuristic rules

    NASA Astrophysics Data System (ADS)

    Agarwal, Shubham; Ghosh, Diptesh; Chakrabarti, Anindya S.

    2016-12-01

    In this paper, we consider a distributed coordination game played by a large number of agents with finite information sets, which characterizes emergence of a single dominant attribute out of a large number of competitors. Formally, N agents play a coordination game repeatedly, which has exactly N pure strategy Nash equilibria, and all of the equilibria are equally preferred by the agents. The problem is to select one equilibrium out of N possible equilibria in the least number of attempts. We propose a number of heuristic rules based on reinforcement learning to solve the coordination problem. We see that the agents self-organize into clusters with varying intensities depending on the heuristic rule applied, although all clusters but one are transitory in most cases. Finally, we characterize a trade-off in terms of the time requirement to achieve a degree of stability in strategies versus the efficiency of such a solution.
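
    One possible reinforcement-learning heuristic of the kind discussed above can be sketched as follows: each agent keeps a propensity over the N strategies, samples a strategy in proportion to it, and reinforces its choice by the number of agents that picked the same strategy. The update rule and parameters are assumptions, not the specific heuristic rules proposed in the paper.

```python
import numpy as np

def coordinate(n_agents=50, n_rounds=200, seed=0):
    """Simulate repeated play; return the fraction of agents in the largest cluster at the end."""
    rng = np.random.default_rng(seed)
    prop = np.ones((n_agents, n_agents))             # propensity of agent i for strategy k
    counts = np.zeros(n_agents, dtype=int)
    for _ in range(n_rounds):
        probs = prop / prop.sum(axis=1, keepdims=True)
        choices = np.array([rng.choice(n_agents, p=probs[i]) for i in range(n_agents)])
        counts = np.bincount(choices, minlength=n_agents)
        prop[np.arange(n_agents), choices] += counts[choices]   # reinforce popular choices
    return counts.max() / n_agents

print(f"largest cluster after learning: {coordinate():.2f}")
```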

  16. Bibliometrics: tracking research impact by selecting the appropriate metrics.

    PubMed

    Agarwal, Ashok; Durairajanayagam, Damayanthi; Tatagari, Sindhuja; Esteves, Sandro C; Harlev, Avi; Henkel, Ralf; Roychoudhury, Shubhadeep; Homa, Sheryl; Puchalt, Nicolás Garrido; Ramasamy, Ranjith; Majzoub, Ahmad; Ly, Kim Dao; Tvrda, Eva; Assidi, Mourad; Kesari, Kavindra; Sharma, Reecha; Banihani, Saleem; Ko, Edmund; Abu-Elmagd, Muhammad; Gosalvez, Jaime; Bashiri, Asher

    2016-01-01

    Traditionally, the success of a researcher is assessed by the number of publications he or she publishes in peer-reviewed, indexed, high impact journals. This essential yardstick, often referred to as the impact of a specific researcher, is assessed through the use of various metrics. While researchers may be acquainted with such metrics, many do not know how to use them to enhance their careers. In addition to these metrics, a number of other factors should be taken into consideration to objectively evaluate a scientist's profile as a researcher and academician. Moreover, each metric has its own limitations that need to be considered when selecting an appropriate metric for evaluation. This paper provides a broad overview of the wide array of metrics currently in use in academia and research. Popular metrics are discussed and defined, including traditional metrics and article-level metrics, some of which are applied to researchers for a greater understanding of a particular concept, including varicocele that is the thematic area of this Special Issue of Asian Journal of Andrology. We recommend the combined use of quantitative and qualitative evaluation using judiciously selected metrics for a more objective assessment of scholarly output and research impact.

  17. A SVM framework for fault detection of the braking system in a high speed train

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Li, Yan-Fu; Zio, Enrico

    2017-03-01

    In April 2015, the number of operating High Speed Trains (HSTs) in the world had reached 3603. An efficient, effective and very reliable braking system is evidently critical for trains running at speeds around 300 km/h. Failure of a highly reliable braking system is a rare event and, consequently, informative recorded data on fault conditions are scarce. This renders the fault detection problem a classification problem with highly unbalanced data. In this paper, a Support Vector Machine (SVM) framework, including feature selection, feature vector selection, model construction and decision boundary optimization, is proposed for tackling this problem. Feature vector selection can largely reduce the data size and, thus, the computational burden. The constructed model is a modified version of the least squares SVM, in which a higher cost is assigned to the error of classification of faulty conditions than to the error of classification of normal conditions. The proposed framework is successfully validated on a number of public unbalanced datasets. Then, it is applied to the fault detection of braking systems in HSTs: in comparison with several SVM approaches for unbalanced datasets, the proposed framework gives better results.
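
    The unbalanced-data aspect can be illustrated with a standard SVC and asymmetric class weights, which plays a role analogous to the cost-weighted least squares SVM described above (it is not the authors' model). The synthetic normal/fault data and the weight of 50 are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(2000, 10))      # abundant normal operating data
faulty = rng.normal(2.0, 1.0, size=(40, 10))        # rare fault signatures
X = np.vstack([normal, faulty])
y = np.array([0] * len(normal) + [1] * len(faulty))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
# penalize misclassified faults much more heavily than misclassified normal samples
clf = SVC(kernel="rbf", class_weight={0: 1, 1: 50}).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```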

  18. Bibliometrics: tracking research impact by selecting the appropriate metrics

    PubMed Central

    Agarwal, Ashok; Durairajanayagam, Damayanthi; Tatagari, Sindhuja; Esteves, Sandro C; Harlev, Avi; Henkel, Ralf; Roychoudhury, Shubhadeep; Homa, Sheryl; Puchalt, Nicolás Garrido; Ramasamy, Ranjith; Majzoub, Ahmad; Ly, Kim Dao; Tvrda, Eva; Assidi, Mourad; Kesari, Kavindra; Sharma, Reecha; Banihani, Saleem; Ko, Edmund; Abu-Elmagd, Muhammad; Gosalvez, Jaime; Bashiri, Asher

    2016-01-01

    Traditionally, the success of a researcher is assessed by the number of publications he or she publishes in peer-reviewed, indexed, high impact journals. This essential yardstick, often referred to as the impact of a specific researcher, is assessed through the use of various metrics. While researchers may be acquainted with such metrics, many do not know how to use them to enhance their careers. In addition to these metrics, a number of other factors should be taken into consideration to objectively evaluate a scientist's profile as a researcher and academician. Moreover, each metric has its own limitations that need to be considered when selecting an appropriate metric for evaluation. This paper provides a broad overview of the wide array of metrics currently in use in academia and research. Popular metrics are discussed and defined, including traditional metrics and article-level metrics, some of which are applied to researchers for a greater understanding of a particular concept, including varicocele that is the thematic area of this Special Issue of Asian Journal of Andrology. We recommend the combined use of quantitative and qualitative evaluation using judiciously selected metrics for a more objective assessment of scholarly output and research impact. PMID:26806079

  19. Feature Selection based on Machine Learning in MRIs for Hippocampal Segmentation

    NASA Astrophysics Data System (ADS)

    Tangaro, Sabina; Amoroso, Nicola; Brescia, Massimo; Cavuoti, Stefano; Chincarini, Andrea; Errico, Rosangela; Paolo, Inglese; Longo, Giuseppe; Maglietta, Rosalia; Tateo, Andrea; Riccio, Giuseppe; Bellotti, Roberto

    2015-01-01

    Neurodegenerative diseases are frequently associated with structural changes in the brain. Magnetic resonance imaging (MRI) scans can show these variations and can therefore be used as a supportive feature for a number of neurodegenerative diseases. The hippocampus is a known biomarker for Alzheimer's disease and other neurological and psychiatric diseases; using it as such, however, requires accurate, robust, and reproducible delineation of hippocampal structures. Fully automatic methods usually take a voxel-based approach, in which a number of local features are calculated for each voxel. In this paper, we compared four techniques for selecting from a set of 315 features extracted per voxel: (i) a filter method based on the Kolmogorov-Smirnov test; two wrapper methods, namely (ii) sequential forward selection and (iii) sequential backward elimination; and (iv) an embedded method based on a Random Forest classifier. The methods were trained on a set of 10 T1-weighted brain MRIs and tested on an independent set of 25 subjects, and the resulting segmentations were compared with manual reference labelling. Using only 23 features per voxel (selected by sequential backward elimination), we obtained performance comparable to the state-of-the-art standard tool FreeSurfer.
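
    A minimal sketch of sequential backward elimination with scikit-learn on synthetic data; the dataset, estimator and fold count are illustrative assumptions, and the paper's full voxel-wise segmentation pipeline is not reproduced:

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import SequentialFeatureSelector

      # Toy stand-in for the per-voxel feature matrix (315 features in the paper)
      X, y = make_classification(n_samples=300, n_features=50, n_informative=10,
                                 random_state=0)

      # Backward elimination down to a fixed number of retained features
      selector = SequentialFeatureSelector(
          RandomForestClassifier(n_estimators=25, random_state=0),
          n_features_to_select=23, direction="backward", cv=3)
      selector.fit(X, y)
      print(selector.get_support(indices=True))  # indices of the retained features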

  20. Analytical performance of the various acquisition modes in Orbitrap MS and MS/MS.

    PubMed

    Kaufmann, Anton

    2018-04-30

    Quadrupole Orbitrap instruments (Q Orbitrap) permit high-resolution mass spectrometry (HRMS)-based full scan acquisitions and have a number of acquisition modes where the quadrupole isolates a particular mass range prior to a possible fragmentation and HRMS-based acquisition. Selecting the proper acquisition mode(s) is essential if trace analytes are to be quantified in complex matrix extracts. Depending on the particular requirements, such as sensitivity, selectivity of detection, linear dynamic range, and speed of analysis, different acquisition modes may have to be chosen. This is particularly important in the field of multi-residue analysis (e.g., pesticides or veterinary drugs in food samples) where a large number of analytes within a complex matrix have to be detected and reliably quantified. Meeting the specific detection and quantification performance criteria for every targeted compound may be challenging. It is the aim of this paper to describe the strengths and the limitations of the currently available Q Orbitrap acquisition modes. In addition, the incorporation of targeted acquisitions between full scan experiments is discussed. This approach is intended to integrate compounds that require an additional degree of sensitivity or selectivity into multi-residue methods. This article is protected by copyright. All rights reserved.

  1. Psychosocial Interventions to Improve the School Performance of Students with Attention-Deficit/Hyperactivity Disorder

    PubMed Central

    Tresco, Katy E.; Lefler, Elizabeth K.; Power, Thomas J.

    2010-01-01

    Children with ADHD typically show impairments throughout the school day. A number of interventions have been demonstrated to address both the academic and behavioral impairments associated with this disorder. Although the focus of research has been on classroom-based strategies of intervention for children with ADHD, school-based interventions applicable for non-classroom environments such as lunchrooms and playgrounds are beginning to emerge. This paper provides a brief description of the guiding principles of behavioral intervention, identifies selected strategies to address behavioral and academic concerns, discusses how school contextual factors have an effect on intervention selection and implementation, and considers the effects of using psychosocial interventions in combination with medication. PMID:21152355

  2. Distribution of Causes in Selected US Aviation Accident Reports Between 1996 and 2003

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2004-01-01

    This paper describes the results of an independent analysis of the probable and contributory causes of selected aviation accidents in the United States between 1996 and 2003. The purpose of the study was to assess the comparative frequency of a variety of causal factors in the reporting of these adverse events. Although our results show that more of these high consequence accidents were attributed to human error than to any other single factor, a large number of reports also mentioned wider systemic issues, including the managerial and regulatory context of aviation operations. These wider issues are more likely to appear as contributory rather than primary causes in this set of accident reports.

  3. Solar Sail Loads, Dynamics, and Membrane Studies

    NASA Technical Reports Server (NTRS)

    Slade, K. N.; Belvin, W. K.; Behun, V.

    2002-01-01

    While a number of solar sail missions have been proposed recently, these missions have not been selected for flight validation. Although the reasons for non-selection are varied, principal among them is the lack of subsystem integration and ground testing. This paper presents some early results from a large-scale ground testing program for integrated solar sail systems. In this series of tests, a 10-meter solar sail is subjected to dynamic excitation in both ambient atmospheric and vacuum conditions. Laser vibrometry is used to determine resonant frequencies and deformation shapes. The results include some low-order sail modes that can only be seen in vacuum, pointing to the necessity of testing in that environment.

  4. Segmentation and selection of appropriate Chinese characters in writing place names in Japanese.

    PubMed

    Tokimoto, S; Flores d'Arcais, G B

    2001-03-01

    This paper explores the relation between an unknown place name written in hiragana (a Japanese syllabary) and its corresponding written representation in kanji (Chinese characters). We propose three principles as those operating in the selection of the appropriate Chinese characters in writing unknown place names. The three principles are concerned with the combination of on and kun readings (zyuubako-yomi), the number of segmentations, and the bimoraicity characteristics of kanji chosen. We performed two experiments to test the principles; the results supported our hypotheses. These results have some implications for the structure of the Japanese mental lexicon, for the processing load in the use of Chinese characters, and for Japanese prosody and morphology.

  5. Adaptable mission planning for kino-dynamic systems

    NASA Astrophysics Data System (ADS)

    Bush, Lawrence A. M.; Jimenez, Tony R.; Williams, Brian C.

    Autonomous systems can perform tasks that are dangerous, monotonous, or even impossible for humans. To approach the problem of planning for Unmanned Aerial Vehicles (UAVs), we present a hierarchical method that combines a high-level planner with a low-level planner. We pose the high-level planning problem as a Selective Traveling Salesman Problem (STSP) and select the order in which to visit the science sites. We then use a kino-dynamic path planner to create a large number of intermediate waypoints. The result is a complete system that combines high- and low-level planning to achieve a goal. This paper demonstrates the benefits gained by adaptable high-level plans versus static and greedy plans.
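
    A minimal sketch of a greedy ordering heuristic for a selective-TSP-style problem: sites are chosen by reward-per-distance until a travel budget is exhausted. The site coordinates, rewards and budget are hypothetical, and this is not the authors' STSP formulation or solver:

      import math

      sites = [(2, 1, 5.0), (5, 4, 8.0), (1, 6, 3.0), (7, 2, 6.0)]  # (x, y, reward)
      budget = 15.0                                                  # max travel distance

      def dist(p, s):
          return math.hypot(p[0] - s[0], p[1] - s[1])

      pos, remaining, order = (0.0, 0.0), budget, []
      candidates = list(sites)
      while candidates:
          feasible = [s for s in candidates if dist(pos, s) <= remaining]
          if not feasible:
              break
          # Greedily take the feasible site with the best reward-per-distance ratio
          best = max(feasible, key=lambda s: s[2] / max(dist(pos, s), 1e-9))
          order.append(best)
          remaining -= dist(pos, best)
          pos = (best[0], best[1])
          candidates.remove(best)

      print(order)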

  6. Temporal BYY encoding, Markovian state spaces, and space dimension determination.

    PubMed

    Xu, Lei

    2004-09-01

    As a complement to mainstream temporal coding approaches, this paper addresses Markovian state-space temporal models from the perspective of temporal Bayesian Ying-Yang (BYY) learning. It offers new insights and new results not only for the discrete-state Hidden Markov Model and its extensions but also for continuous-state linear state-space models and their extensions. In particular, it introduces a new learning mechanism that selects the number of states, or the dimension of the state space, either automatically during adaptive learning or afterwards via model selection criteria obtained from this mechanism. Experiments are presented to show how the proposed approach works.

  7. Material Selection for Cable Gland to Improved Reliability of the High-hazard Industries

    NASA Astrophysics Data System (ADS)

    Vashchuk, S. P.; Slobodyan, S. M.; Deeva, V. S.; Vashchuk, D. S.

    2018-01-01

    Sealed cable glands (SCGs) are used to provide safe connections for sheathed single-wire, pilot, control and radio-frequency cables at hazardous production facilities such as nuclear power plants. In this paper, we investigate material selection for SCGs intended specifically for hazardous man-made facilities, and we discuss safe working conditions for cable glands. The research indicates that cables made from sintered powdered metals improve reliability owing to their material properties, and a number of studies have verified the material selection. Our findings also indicate that double-glazed sealed units could enhance reliability. Sample reliability was evaluated under fire conditions, seismic load and pressure-containment failure, using mineral-insulated thermocouple cable samples.

  8. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.
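
    A minimal sketch of the cross-validation alternative for structure selection, choosing the number of hidden neurons by grid search; the data, architecture grid and scoring are illustrative assumptions rather than the paper's Bayesian evidence or automatic relevance determination framework:

      from sklearn.datasets import make_regression
      from sklearn.model_selection import GridSearchCV
      from sklearn.neural_network import MLPRegressor

      # Toy stand-in for daily load and weather data
      X, y = make_regression(n_samples=400, n_features=8, noise=5.0, random_state=0)

      # Cross-validate over the number of hidden neurons
      grid = {"hidden_layer_sizes": [(2,), (4,), (8,), (16,)]}
      search = GridSearchCV(MLPRegressor(max_iter=2000, random_state=0), grid, cv=5)
      search.fit(X, y)
      print(search.best_params_)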

  9. Model selection as a science driver for dark energy surveys

    NASA Astrophysics Data System (ADS)

    Mukherjee, Pia; Parkinson, David; Corasaniti, Pier Stefano; Liddle, Andrew R.; Kunz, Martin

    2006-07-01

    A key science goal of upcoming dark energy surveys is to seek time-evolution of the dark energy. This problem is one of model selection, where the aim is to differentiate between cosmological models with different numbers of parameters. However, the power of these surveys is traditionally assessed by estimating their ability to constrain parameters, which is a different statistical problem. In this paper, we use Bayesian model selection techniques, specifically forecasting of the Bayes factors, to compare the abilities of different proposed surveys in discovering dark energy evolution. We consider six experiments - supernova luminosity measurements by the Supernova Legacy Survey, SNAP, JEDI and ALPACA, and baryon acoustic oscillation measurements by WFMOS and JEDI - and use Bayes factor plots to compare their statistical constraining power. The concept of Bayes factor forecasting has much broader applicability than dark energy surveys.
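
    Model selection with Bayes factors reduces to comparing the models' marginal likelihoods (evidences); a minimal illustration with made-up log-evidence values, not results from the surveys discussed in the paper:

      import math

      log_evidence_lcdm = -45.2       # hypothetical evidence for a constant dark energy model
      log_evidence_evolving = -47.8   # hypothetical evidence for an evolving dark energy model

      # Bayes factor B = Z_lcdm / Z_evolving, usually quoted as ln B
      ln_B = log_evidence_lcdm - log_evidence_evolving
      print(f"ln B = {ln_B:.1f}, B = {math.exp(ln_B):.1f}")
      # On the commonly used Jeffreys scale, |ln B| > 5 is read as strong evidence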

  10. 'Everything is everywhere: but the environment selects': ubiquitous distribution and ecological determinism in microbial biogeography.

    PubMed

    O'Malley, Maureen A

    2008-09-01

    Recent discoveries of geographical patterns in microbial distribution are undermining microbiology's exclusively ecological explanations of biogeography and their fundamental assumption that 'everything is everywhere: but the environment selects'. This statement was generally promulgated by Dutch microbiologist Martinus Wilhelm Beijerinck early in the twentieth century and specifically articulated in 1934 by his compatriot, Lourens G. M. Baas Becking. The persistence of this precept throughout twentieth-century microbiology raises a number of issues in relation to its formulation and widespread acceptance. This paper will trace the conceptual history of Beijerinck's claim that 'everything is everywhere' in relation to a more general account of its theoretical, experimental and institutional context. His principle also needs to be situated in relationship to plant and animal biogeography, which, this paper will argue, forms a continuum of thought with microbial biogeography. Finally, a brief overview of the contemporary microbiological research challenging 'everything is everywhere' reveals that philosophical issues from Beijerinck's era of microbiology still provoke intense discussion in twenty-first century investigations of microbial biogeography.

  11. Application of derivative spectrophotometry under orthogonal polynomial at unequal intervals: determination of metronidazole and nystatin in their pharmaceutical mixture.

    PubMed

    Korany, Mohamed A; Abdine, Heba H; Ragab, Marwa A A; Aboras, Sara I

    2015-05-15

    This paper discusses a general method for the use of orthogonal polynomials for unequal intervals (OPUI) to eliminate interferences in two-component spectrophotometric analysis. A new approach was developed in which the first-derivative (D1) curve, rather than the absorbance curve, is convoluted using the OPUI method for the determination of metronidazole (MTR) and nystatin (NYS) in their mixture. After derivative treatment of the absorbance data, many maxima and minima appear, giving a characteristic shape for each drug and allowing a different number of points to be selected for the OPUI method for each drug. This allows the specific and selective determination of each drug in the presence of the other and of any matrix interference. The method is particularly useful when the two absorption spectra overlap considerably. The results obtained are encouraging and suggest that the method can be widely applied to similar problems. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Mode selective generation of guided waves by systematic optimization of the interfacial shear stress profile

    NASA Astrophysics Data System (ADS)

    Yazdanpanah Moghadam, Peyman; Quaegebeur, Nicolas; Masson, Patrice

    2015-01-01

    Piezoelectric transducers are commonly used in structural health monitoring systems to generate and measure ultrasonic guided waves (GWs) by applying interfacial shear and normal stresses to the host structure. In most cases, in order to perform damage detection, advanced signal processing techniques are required, since a minimum of two dispersive modes are propagating in the host structure. In this paper, a systematic approach for mode selection is proposed by optimizing the interfacial shear stress profile applied to the host structure, representing the first step of a global optimization of selective mode actuator design. This approach has the potential of reducing the complexity of signal processing tools as the number of propagating modes could be reduced. Using the superposition principle, an analytical method is first developed for GWs excitation by a finite number of uniform segments, each contributing with a given elementary shear stress profile. Based on this, cost functions are defined in order to minimize the undesired modes and amplify the selected mode and the optimization problem is solved with a parallel genetic algorithm optimization framework. Advantages of this method over more conventional transducers tuning approaches are that (1) the shear stress can be explicitly optimized to both excite one mode and suppress other undesired modes, (2) the size of the excitation area is not constrained and mode-selective excitation is still possible even if excitation width is smaller than all excited wavelengths, and (3) the selectivity is increased and the bandwidth extended. The complexity of the optimal shear stress profile obtained is shown considering two cost functions with various optimal excitation widths and number of segments. Results illustrate that the desired mode (A0 or S0) can be excited dominantly over other modes up to a wave power ratio of 10^10 using an optimal shear stress profile.

  13. Sensitivity of an underwater Čerenkov km³ telescope to TeV neutrinos from Galactic microquasars

    NASA Astrophysics Data System (ADS)

    Aiello, S.; Ambriola, M.; Ameli, F.; Amore, I.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Barbarito, E.; Battaglieri, M.; Bellotti, R.; Beverini, N.; Bonori, M.; Bouhadef, B.; Brescia, M.; Cacopardo, G.; Cafagna, F.; Capone, A.; Caponetto, L.; Castorina, E.; Ceres, A.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; Cuneo, S.; D'Amico, A.; De Bonis, G.; De Marzo, C.; De Rosa, G.; De Vita, R.; Distefano, C.; Falchini, E.; Fiorello, C.; Flaminio, V.; Fratini, K.; Gabrielli, A.; Galeotti, S.; Gandolfi, E.; Giacomelli, G.; Giorgi, F.; Grimaldi, A.; Habel, R.; Leonora, E.; Lonardo, A.; Longo, G.; Lo Presti, D.; Lucarelli, F.; Maccioni, E.; Margiotta, A.; Martini, A.; Masullo, R.; Megna, R.; Migneco, E.; Mongelli, M.; Montaruli, T.; Morganti, M.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Osipenko, M.; Osteria, G.; Papaleo, R.; Pappalardo, V.; Petta, C.; Piattelli, P.; Raia, G.; Randazzo, N.; Reito, S.; Ricco, G.; Riccobene, G.; Ripani, M.; Rovelli, A.; Ruppi, M.; Russo, G. V.; Russo, S.; Sapienza, P.; Sedita, M.; Shirokov, E.; Simeone, F.; Sipala, V.; Spurio, M.; Taiuti, M.; Terreni, G.; Trasatti, L.; Urso, S.; Valente, V.; Vicini, P.

    2007-09-01

    This paper presents the results of Monte Carlo simulations of the capability of the proposed NEMO-km³ telescope to detect TeV muon neutrinos from Galactic microquasars. For each known microquasar we compute the number of detectable events, together with the atmospheric neutrino and muon background events. We also discuss the detector sensitivity to neutrino fluxes expected from known microquasars, optimizing the event selection to reject the background; the numbers of events surviving the event selection are given. The best candidates are the steady microquasars SS433 and GX339-4, for which we estimate a sensitivity of about 5 × 10⁻¹¹ erg/cm²/s; the predicted fluxes are expected to be well above this sensitivity. For bursting microquasars the most interesting candidates are Cygnus X-3, GRO J1655-40 and XTE J1118+480: their analyses are more complicated because of the stochastic nature of the bursts.

  14. A dynamic replication management strategy in distributed GIS

    NASA Astrophysics Data System (ADS)

    Pan, Shaoming; Xiong, Lian; Xu, Zhengquan; Chong, Yanwen; Meng, Qingxiang

    2018-03-01

    Replication is an effective way to meet service response time requirements by preparing data in advance, avoiding the delay of reading data from disk. This paper presents a new method for creating copies that considers the selection of the replica set, the number of copies for each replica, and the placement of all copies. First, the popularity of each data item is computed from both the historical access records and the timeliness of those records. A replica set is then selected based on recent popularity, and an enhanced Q-value scheme is proposed to assign the number of copies for each replica. Finally, a copy placement strategy is designed to meet the requirement of load balance. In addition, we present several experiments that compare the proposed method with other replication management strategies. The results show that the proposed model performs better than the other algorithms in all respects. Experiments with different parameters also demonstrate the effectiveness and adaptability of the proposed algorithm.
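
    A minimal sketch of a popularity score that combines access history with the timeliness of the records, using an exponential decay; the log format, half-life and cut-off are assumptions and this is not the paper's exact popularity formula or Q-value scheme:

      import time

      # Hypothetical access log: (tile_id, access_timestamp)
      now = time.time()
      log = [("tile_a", now - 3600), ("tile_a", now - 60),
             ("tile_b", now - 86400), ("tile_a", now - 7200)]

      HALF_LIFE = 6 * 3600  # accesses lose half their weight every 6 hours (assumed)

      popularity = {}
      for tile, ts in log:
          # Recent accesses count more than old ones (timeliness of the records)
          weight = 0.5 ** ((now - ts) / HALF_LIFE)
          popularity[tile] = popularity.get(tile, 0.0) + weight

      # Replica set: the most popular tiles
      replicas = sorted(popularity, key=popularity.get, reverse=True)[:2]
      print(popularity, replicas)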

  15. Prediction of municipal solid waste generation using nonlinear autoregressive network.

    PubMed

    Younes, Mohammad K; Nopiah, Z M; Basri, N E Ahmad; Basri, H; Abushammala, Mohammed F M; Maulud, K N A

    2015-12-01

    Most developing countries have solid waste management problems. Strategic planning for solid waste requires accurate prediction of the quality and quantity of the generated waste. In developing countries such as Malaysia, the solid waste generation rate is increasing rapidly, owing to population growth and the new consumption trends that characterize society. This paper proposes an artificial neural network (ANN) approach using a feedforward nonlinear autoregressive network with exogenous inputs (NARX) to predict annual solid waste generation from demographic and economic variables such as population, gross domestic product, electricity demand per capita, and employment and unemployment figures. In addition, a variable selection procedure is developed to identify significant explanatory variables. Model evaluation was performed using the coefficient of determination (R²) and the mean squared error (MSE). The optimum model, which produced the lowest testing MSE (2.46) and the highest R² (0.97), had three inputs (gross domestic product, population and employment), eight neurons in the hidden layer and one lag, and used the Fletcher-Powell conjugate gradient as the training algorithm.
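
    A minimal NARX-style sketch: the target series is regressed on its own lagged value plus current exogenous inputs, here with a small scikit-learn network standing in for the paper's ANN; the synthetic series, lag order and train/test split are illustrative assumptions:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Hypothetical annual series: waste generated (target) and two exogenous inputs
      years = 30
      rng = np.random.default_rng(0)
      gdp = np.cumsum(rng.normal(2, 1, years))
      pop = np.cumsum(rng.normal(1, 0.3, years))
      waste = 0.5 * gdp + 0.3 * pop + rng.normal(0, 0.5, years)

      # NARX-style design matrix: one lag of the target plus current exogenous inputs
      X = np.column_stack([waste[:-1], gdp[1:], pop[1:]])
      y = waste[1:]

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X[:-5], y[:-5])           # train on all but the last 5 years
      print(model.score(X[-5:], y[-5:]))  # R^2 on the held-out years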

  16. Sniffer Channel Selection for Monitoring Wireless LANs

    NASA Astrophysics Data System (ADS)

    Song, Yuan; Chen, Xian; Kim, Yoo-Ah; Wang, Bing; Chen, Guanling

    Wireless sniffers are often used to monitor APs in wireless LANs (WLANs) for network management, fault detection, traffic characterization, and optimizing deployment. It is cost effective to deploy single-radio sniffers that can monitor multiple nearby APs. However, since nearby APs often operate on orthogonal channels, a sniffer needs to switch among multiple channels to monitor its nearby APs. In this paper, we formulate and solve two optimization problems on sniffer channel selection. Both problems require that each AP be monitored by at least one sniffer. In addition, one optimization problem requires minimizing the maximum number of channels that a sniffer listens to, and the other requires minimizing the total number of channels that the sniffers listen to. We propose a novel LP-relaxation based algorithm, and two simple greedy heuristics for the above two optimization problems. Through simulation, we demonstrate that all the algorithms are effective in achieving their optimization goals, and the LP-based algorithm outperforms the greedy heuristics.
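
    A tiny sketch in the spirit of the greedy heuristics mentioned in the abstract: repeatedly assign the (sniffer, channel) pair that covers the most still-unmonitored APs. The topology and channel assignments are made up, and the paper's LP-relaxation algorithm is not reproduced:

      # Hypothetical topology: which APs each sniffer can hear, and each AP's channel
      hears = {"s1": {"ap1", "ap2", "ap3"}, "s2": {"ap3", "ap4"}, "s3": {"ap4", "ap5"}}
      channel = {"ap1": 1, "ap2": 6, "ap3": 6, "ap4": 11, "ap5": 1}

      uncovered = set(channel)
      assignment = []          # list of (sniffer, channel) listening assignments
      while uncovered:
          # Pick the sniffer/channel pair that monitors the most uncovered APs
          best = max(((s, c) for s in hears for c in set(channel.values())),
                     key=lambda sc: len({a for a in hears[sc[0]]
                                         if channel[a] == sc[1]} & uncovered))
          s, c = best
          covered = {a for a in hears[s] if channel[a] == c} & uncovered
          if not covered:
              break            # remaining APs cannot be monitored by any sniffer
          assignment.append((s, c))
          uncovered -= covered

      print(assignment)        # total number of channels listened to = len(assignment)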

  17. FIFS: A data mining method for informative marker selection in high dimensional population genomic data.

    PubMed

    Kavakiotis, Ioannis; Samaras, Patroklos; Triantafyllidis, Alexandros; Vlahavas, Ioannis

    2017-11-01

    Single Nucleotide Polymorphisms (SNPs) are becoming the marker of choice for biological analyses in a wide range of applications of medical, biological, economic and environmental interest. Classification tasks, i.e. the assignment of individuals to groups of origin based on their (multi-locus) genotypes, are performed in many fields such as forensic investigations and discrimination between wild and/or farmed populations. These tasks should be performed with a small number of loci, for computational as well as biological reasons. Thus, feature selection should precede classification, especially for SNP datasets, where the number of features can amount to hundreds of thousands or millions. In this paper, we present a novel data mining approach, called FIFS (Frequent Item Feature Selection), based on the use of frequent items for selecting the most informative markers from population genomic data. It is a modular method consisting of two main components: the first identifies the most frequent and unique genotypes for each sampled population, and the second selects the most appropriate among them in order to create the informative SNP subsets to be returned. The proposed method was tested on a real dataset comprising a comprehensive coverage of pig breed types present in Britain: 446 individuals divided into 14 sub-populations and genotyped at 59,436 SNPs. Our method outperforms the state-of-the-art and baseline methods in every case. More specifically, it surpassed the assignment accuracy threshold of 95% needing only half the number of SNPs selected by other methods (FIFS: 28 SNPs; Delta: 70 SNPs; pairwise FST: 70 SNPs; In: 100 SNPs). In conclusion, our approach successfully deals with the problem of informative marker selection in high-dimensional genomic datasets. It offers better results than existing approaches and can aid biologists in selecting the most informative markers with maximum discrimination power for the optimization of cost-effective panels, with applications related to species identification, wildlife management and forensics. Copyright © 2017 Elsevier Ltd. All rights reserved.
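
    A highly simplified sketch of the first FIFS component, finding SNPs whose most frequent genotype differs between populations; the genotype table is hypothetical and the paper's full frequent-item mining and subset-construction steps are not reproduced:

      import pandas as pd

      # Hypothetical genotype table: rows = individuals, columns = SNPs, plus population
      df = pd.DataFrame({
          "pop":  ["A", "A", "A", "B", "B", "B"],
          "snp1": ["AA", "AA", "AG", "GG", "GG", "AG"],
          "snp2": ["CT", "CC", "CT", "CT", "CC", "CT"],
      })

      informative = {}
      for snp in ["snp1", "snp2"]:
          # Most frequent genotype at this SNP within each population
          per_pop = df.groupby("pop")[snp].agg(lambda s: s.value_counts().idxmax())
          if per_pop.nunique() > 1:   # the dominant genotype differs between populations
              informative[snp] = dict(per_pop)

      print(informative)   # snp1 separates population A (AA) from B (GG); snp2 does not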

  18. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    PubMed Central

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-01-01

    Background: Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods: We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application: We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. Conclusion: This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100

  19. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    PubMed

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

    Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy.

  20. Annals of Occupational Hygiene at volume 50: many achievements, a few mistakes, and an interesting future.

    PubMed

    Ogden, Trevor

    2006-11-01

    The past 10 years have seen a doubling of the number of papers submitted to the Annals, and a 5-fold increase in the number of institutions with access to the journal. Electronic access is now far more important than print access. Papers from British authors dominated the first 20 years of the journal, but the rest of Europe is now more important, with Scandinavia and The Netherlands being the major continental sources. North America is the other major source. For British papers, there has been a big growth in government authors, and a decline in papers from industry and armed forces. From many possible topics, trends are selectively reviewed in: standards and exposure limits; measurement methods and criteria; sampling strategy and statistics; fibres; control banding; dermal exposure; and evaluation of control. For the future, we will continue to have the same aims and standards, but the changes of the past few years, and the growth of new approaches such as open access, have emphasized the difficulty of forecasting. The growth in submissions from countries which we presently regard as 'developing', and especially the growth in higher education in China, and the amount of occupational disease there, are bound to have major impacts. Perhaps the English language will not continue to dominate scientific publishing, but in any case an eastward shift in the source of papers must lead to other changes.

  1. Scoping meta-review: introducing a new methodology.

    PubMed

    Sarrami-Foroushani, Pooria; Travaglia, Joanne; Debono, Deborah; Clay-Williams, Robyn; Braithwaite, Jeffrey

    2015-02-01

    For researchers, policymakers, and practitioners facing a new field, undertaking a systematic review can typically present a challenge due to the enormous number of relevant papers. A scoping review is a method suggested for addressing this dilemma; however, scoping reviews present their own challenges. This paper introduces the "scoping meta-review" (SMR) for expanding current methodologies and is based on our experiences in mapping the field of consumer engagement in healthcare. During this process, we developed the novel SMR method. An SMR combines aspects of a scoping review and a meta-review to establish an evidence-based map of a field. Similar to a scoping review, an SMR offers a practical and flexible methodology. However, unlike in a traditional scoping review, only systematic reviews are included. Stages of the SMR include: undertaking a preliminary nonsystematic review; building a search strategy; interrogating academic literature databases; classifying and excluding studies based on titles and abstracts; saving the refined database of references; revising the search strategy; selecting and reviewing the full text papers; and thematically analyzing the selected texts and writing the report. The main benefit of an SMR is to map a new field based on high-level evidence provided by systematic reviews. © 2014 Wiley Periodicals, Inc.

  2. Identifying Node Role in Social Network Based on Multiple Indicators

    PubMed Central

    Huang, Shaobin; Lv, Tianyang; Zhang, Xizhe; Yang, Yange; Zheng, Weimin; Wen, Chao

    2014-01-01

    It is a classic topic of social network analysis to evaluate the importance of nodes and to identify the nodes that take on the role of core or bridge in a network. Because a single indicator is not sufficient to analyze the multiple characteristics of a node, a natural solution is to apply multiple, carefully selected indicators. An intuitive idea is to select indicators with weak correlations so as to efficiently assess different characteristics of a node. However, this paper shows that it is much better to select indicators with strong correlations: because indicator correlation is based on the statistical analysis of a large number of nodes, the particularity of an important node stands out when its indicator relationship does not comply with the statistical correlation. The paper therefore selects multiple indicators, namely degree, ego-betweenness centrality and eigenvector centrality, to evaluate the importance and the role of a node. The importance of a node is the normalized sum of its three indicators. A candidate for core or bridge is selected from the high-degree nodes or the nodes with high ego-betweenness centrality, respectively, and the role of a candidate is then determined according to how its indicator relationship differs from the statistical correlation of the overall network. Based on 18 real networks and 3 kinds of model networks, the experimental results show that the proposed methods perform quite well in evaluating the importance of nodes and in identifying node roles. PMID:25089823
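
    A minimal sketch of computing the three indicators and the normalized-sum importance with networkx on a standard example graph; the paper's candidate selection and role-classification logic is not reproduced:

      import networkx as nx

      G = nx.karate_club_graph()

      # Three indicators per node: degree, ego-betweenness and eigenvector centrality
      degree = nx.degree_centrality(G)
      eigen = nx.eigenvector_centrality(G, max_iter=1000)
      ego_btw = {v: nx.betweenness_centrality(nx.ego_graph(G, v)).get(v, 0.0)
                 for v in G}

      def normalize(d):
          top = max(d.values()) or 1.0
          return {k: v / top for k, v in d.items()}

      degree, eigen, ego_btw = map(normalize, (degree, eigen, ego_btw))

      # Importance = normalized sum of the three indicators (as in the abstract)
      importance = {v: degree[v] + eigen[v] + ego_btw[v] for v in G}
      print(sorted(importance, key=importance.get, reverse=True)[:5])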

  3. Fault diagnosis of automobile hydraulic brake system using statistical features and support vector machines

    NASA Astrophysics Data System (ADS)

    Jegadeeshwaran, R.; Sugumaran, V.

    2015-02-01

    Hydraulic brakes in automobiles are important components for the safety of passengers; therefore, the brakes are a good subject for condition monitoring. The condition of the brake components can be monitored by using their vibration characteristics. On-line condition monitoring using a machine learning approach is proposed in this paper as a possible solution to such problems. The vibration signals for both good and faulty conditions of the brakes were acquired from a hydraulic brake test setup with the help of a piezoelectric transducer and a data acquisition system. Descriptive statistical features were extracted from the acquired vibration signals, and feature selection was carried out using the C4.5 decision tree algorithm. There is no specific method to find the right number of features required for classification for a given problem, so an extensive study is needed to find the optimum number of features. The effect of the number of features was therefore also studied, using the decision tree as well as Support Vector Machines (SVM). The selected features were classified using C-SVM and Nu-SVM with different kernel functions. The results are discussed and the conclusions of the study are presented.
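
    A minimal sketch of the pipeline shape described here, ranking features with a decision tree (standing in for C4.5) and then classifying with an SVM; the synthetic data and the number of retained features are illustrative assumptions:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier

      # Toy stand-in for statistical features extracted from brake vibration signals
      X, y = make_classification(n_samples=400, n_features=12, n_informative=5,
                                 random_state=0)

      # Rank features with a decision tree and keep the top 5
      tree = DecisionTreeClassifier(random_state=0).fit(X, y)
      top = np.argsort(tree.feature_importances_)[::-1][:5]

      # Classify the good/faulty conditions with an SVM on the selected features
      score = cross_val_score(SVC(kernel="rbf"), X[:, top], y, cv=5).mean()
      print(top, round(score, 3))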

  4. Stochastic subset selection for learning with kernel machines.

    PubMed

    Rhinelander, Jason; Liu, Xiaoping P

    2012-06-01

    Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales super linearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm works by using a stochastic indexing technique when selecting a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.

  5. Proceedings of the TOUGH Symposium 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, George J.; Doughty, Christine; Finsterle, Stefan

    2009-10-01

    Welcome to the TOUGH Symposium 2009. This volume contains the Symposium Program for the eighty-nine papers to be presented in both oral and poster formats. The full papers are available as PDFs linked from the Symposium Program posted on the TOUGH Symposium 2009 website, http://esd.lbl.gov/newsandevents/events/toughsymposium09/program.html. Additional updated information, including any changes to the Program, will also be available at the website. The papers cover a wide range of application areas and reflect the continuing trend toward increased sophistication of the TOUGH codes. A CD containing the proceedings papers will be published immediately following the Symposium and sent to all participants. As in the prior Symposium, selected papers will be invited for submission to a number of journals for inclusion in Special Issues focused on applications and developments of the TOUGH codes. These journals include Transport in Porous Media, Geothermics, Energy Conversion and Management, the Journal of Nuclear Science and Technology, and the Vadose Zone Journal.

  6. List of Films Recommended for Children and Adolescents up to 16 Years Following Selections Made in Twenty-Two Countries. Reports and Papers on Mass Communication, Number 19.

    ERIC Educational Resources Information Center

    Barrot, Jean-Pierre; Billard, Ginette

    This is a list of films suitable for children and adolescents up to the age of 16, compiled from 37 lists sent in by 22 countries and in accordance with the recommendations of the conference on the exhibition and distribution of films for children and adolescents, under the patronage of UNESCO. Approximately 1000 films which run for over 40…

  7. Ways to improve your correlation functions

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    1993-01-01

    This paper describes a number of ways to improve on the standard method for measuring the two-point correlation function of large scale structure in the Universe. Issues addressed are: (1) the problem of the mean density, and how to solve it; (2) how to estimate the uncertainty in a measured correlation function; (3) minimum variance pair weighting; (4) unbiased estimation of the selection function when magnitudes are discrete; and (5) analytic computation of angular integrals in background pair counts.
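
    For context, the baseline that such improvements build on can be written as a simple pair-count estimator of the two-point correlation function; the brute-force sketch below uses toy data in a unit box and is not the estimator, weighting or uncertainty treatment proposed in the paper:

      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.random((300, 3))      # toy "galaxy" positions in a unit box
      rand = rng.random((1000, 3))     # random catalogue with the same geometry

      def pair_count(a, b, r_lo, r_hi):
          # Brute-force count of pairs with separation in [r_lo, r_hi)
          d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
          return np.sum((d >= r_lo) & (d < r_hi))

      r_lo, r_hi = 0.05, 0.10
      DD = pair_count(data, data, r_lo, r_hi)
      RR = pair_count(rand, rand, r_lo, r_hi)

      # Simple DD/RR estimate of xi in one separation bin
      xi = (DD / RR) * (len(rand) / len(data)) ** 2 - 1
      print(xi)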

  8. Application of Weka environment to determine factors that stand behind non-alcoholic fatty liver disease (NAFLD)

    NASA Astrophysics Data System (ADS)

    Plutecki, Michal M.; Wierzbicka, Aldona; Socha, Piotr; Mulawka, Jan J.

    2009-06-01

    The paper describes an innovative approach to discover new knowledge in non-alcoholic fatty liver disease (NAFLD). In order to determine the factors that may cause the disease a number of classification and attribute selection algorithms have been applied. Only those with the best classification results were chosen. Several interesting facts associated with this unclear disease have been discovered. All data mining computations were made in Weka environment.

  9. Constructing an Employee Benefit Package for Part-Time Workers. A Rationale for Arriving at an Equitable Benefit Package at No Extra Cost to the Employer. A Catalyst Position Paper.

    ERIC Educational Resources Information Center

    Catalyst, New York, NY.

    Guidelines are presented for constructing an employee benefit package for part-time workers in which benefits are calculated on a pro-rated (and sometimes selective) basis, such that part-time employees receive a fair proportion of the total benefit package, based on the number of hours they actually work. Pro-rating is recommended as feasible for…

  10. Build your own low-cost seismic/bathymetric recorder annotator

    USGS Publications Warehouse

    Robinson, W.

    1994-01-01

    An inexpensive programmable annotator, completely compatible with at least three models of widely used graphic recorders (Raytheon LSR-1811, Raytheon LSR-1807 M, and EDO 550), has been developed to automatically write event marks and print up to sixteen numbers on the paper record. Event mark and character printout intervals, character height and character position are all selectable with front panel switches. Operation is completely compatible with recorders running in either continuous or start-stop mode. © 1994.

  11. The Use of Epistemic Markers as a Means of Hedging and Boosting in the Discourse of L1 and L2 Speakers of Modern Greek: A Corpus-Based Study in Informal Letter-Writing

    ERIC Educational Resources Information Center

    Efstathiadi, Lia

    2010-01-01

    The paper investigates the semantic area of Epistemic Modality in Modern Greek, by means of a corpus-based research. A comparative, quantitative study was performed between written corpora (informal letter-writing) of non-native informants with various language backgrounds and Greek native speakers. A number of epistemic markers were selected for…

  12. Some Impacts of Risk-Centric Certification Requirements for UAS

    NASA Technical Reports Server (NTRS)

    Neogi, Natasha A. (Inventor); Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Verstynen, Harry A.

    2016-01-01

    This paper discusses results from a recent study that investigates certification requirements for an unmanned rotorcraft performing agricultural application operations. The process of determining appropriate requirements using a risk-centric approach revealed a number of challenges that could impact larger UAS standardization efforts. Fundamental challenges include selecting the correct level of abstraction for requirements to permit design flexibility, transforming human-centric operational requirements to aircraft airworthiness requirements, and assessing all hazards associated with the operation.

  13. The fate of triaged and rejected manuscripts.

    PubMed

    Zoccali, Carmine; Amodeo, Daniela; Argiles, Angel; Arici, Mustafa; D'arrigo, Graziella; Evenepoel, Pieter; Fliser, Danilo; Fox, Jonathan; Gesualdo, Loreto; Jadoul, Michel; Ketteler, Markus; Malyszko, Jolanta; Massy, Ziad; Mayer, Gert; Ortiz, Alberto; Sever, Mehmet; Vanholder, Raymond; Vinck, Caroline; Wanner, Christopher; Więcek, Andrzej

    2015-12-01

    In 2011, Nephrology Dialysis and Transplantation (NDT) established a more restrictive selection process for manuscripts submitted to the journal, reducing the acceptance rate from 25% (2008-2009) to currently about 12-15%. To achieve this goal, we decided to score the priority of manuscripts submitted to NDT and to reject more papers at triage than in the past. This new scoring system allows a rapid decision for the authors without external review. However, the risk of such a restrictive policy may be that the journal might fail to capture important studies that are eventually published in higher-ranked journals. To look into this problem, we analysed random samples of papers (∼10%) rejected by NDT in 2012. Of the papers rejected at triage and those rejected after regular peer review, 59 and 61%, respectively, were accepted in other journals. A detailed analysis of these papers showed that only 4 out of 104 and 7 out of 93 of the triaged and rejected papers, respectively, were published in journals with an impact factor higher than that of NDT. Furthermore, for all these papers, independent assessors confirmed the evaluation made by the original reviewers. The number of citations of these papers was similar to that typically obtained by publications in the corresponding journals. Even though the analyses seem reassuring, previous observations made by leading journals warn that the risk of 'big misses', resulting from selective editorial policies, remains a real possibility. We will therefore continue to maintain a high degree of alertness and will periodically track the history of manuscripts rejected by NDT, particularly papers that are rejected at triage by our journal. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  14. Diffusion of knowledge in dentistry. The pit and fissure sealant conferences.

    PubMed

    Chapko, M K

    1988-01-01

    This paper presents data on the diffusion to dentists of information from the May 1981 American Dental Association conference and the December 1983 National Institutes of Health (NIH) conference on pit and fissure sealants. A mail survey was sent to 521 randomly selected dentists in general practice (72.2% returned) and to all 47 pedodontists (76.6% returned) in the State of Washington. Approximately 70% of the dentists were aware that the conferences had taken place, and a little over 50% were aware of the conference recommendations. Awareness was related to: being a pedodontist, the number of continuing education hours taken, the number of refereed journals read, and being an officer in a dental organization.

  15. Parenteral nutrition.

    PubMed

    Inayet, N; Neild, P

    2015-03-01

    Over the last 50 years, parenteral nutrition has been recognised as an invaluable and potentially lifesaving tool in the physician's arsenal in the management of patients with intestinal failure or inaccessibility; however, it may also be associated with a number of potentially life-threatening complications. A recent NCEPOD report (2010) identified a number of inadequacies in the overall provision and management of parenteral nutrition and recommendations were made with the aim of improving clinical practice in the future. This paper focuses on the practical aspects relating to parenteral nutrition for adults, including important concepts, such as patient selection, as well as general management. We also explore the various pitfalls and potential complications and how these may be minimised.

  16. Target selection and comparison of mission design for space debris removal by DLR's advanced study group

    NASA Astrophysics Data System (ADS)

    van der Pas, Niels; Lousada, Joao; Terhes, Claudia; Bernabeu, Marc; Bauer, Waldemar

    2014-09-01

    Space debris is a growing problem. Models show that the Kessler syndrome, the exponential growth of debris due to collisions, has become unavoidable unless an active debris removal program is initiated. The debris population in LEO with inclinations between 60° and 95° is considered the most critical zone. In order to stabilize the debris population in orbit, especially in LEO, 5 to 10 objects will need to be removed every year. The unique circumstances of such a mission could require that several objects are removed with a single launch. This requires a mission to rendezvous with a multitude of objects orbiting at different altitudes, inclinations and planes. Removal models have assumed that the top-priority targets will be removed first; however, this leads to a suboptimal mission design and increases the ΔV budget. Since there is a multitude of targets to choose from, the targets can be selected for an optimal mission design. In order to select a group of targets for a removal mission, the orbital parameters and political constraints should also be taken into account. This paper presents a number of the target selection criteria. The possible mission targets and their order of retrieval depend on the mission architecture, and a comparison between several global mission architectures is given. Three global missions are considered, for each of which a number of parameters are varied: the first launches multiple separate deorbit kits; the second launches a mother craft carrying deorbit kits; and the third launches an orbital tug that pulls the debris into a lower orbit, after which a deorbit kit performs the final deorbit burn. A rough-order-of-magnitude (RoM) mass and cost comparison is presented. The research described in this paper was conducted as part of an active debris removal study by the Advanced Study Group (ASG), an interdisciplinary student group working at DLR that analyzes existing technologies and develops new ideas into preliminary concepts.

  17. Computer language for identifying chemicals with comprehensive two-dimensional gas chromatography and mass spectrometry.

    PubMed

    Reichenbach, Stephen E; Kottapalli, Visweswara; Ni, Mingtian; Visvanathan, Arvind

    2005-04-15

    This paper describes a language for expressing criteria for chemical identification with comprehensive two-dimensional gas chromatography paired with mass spectrometry (GC x GC-MS) and presents computer-based tools implementing the language. The Computer Language for Identifying Chemicals (CLIC) allows expressions that describe rules (or constraints) for selecting chemical peaks or data points based on multi-dimensional chromatographic properties and mass spectral characteristics. CLIC offers chromatographic functions of retention times, functions of mass spectra, numbers for quantitative and relational evaluation, and logical and arithmetic operators. The language is demonstrated with the compound-class selection rules described by Welthagen et al. [W. Welthagen, J. Schnelle-Kreis, R. Zimmermann, J. Chromatogr. A 1019 (2003) 233-249]. A software implementation of CLIC provides a calculator-like graphical user interface (GUI) for building and applying selection expressions. From the selection calculator, expressions can be used to select chromatographic peaks that meet the criteria or to create selection chromatograms that mask data points inconsistent with the criteria. Selection expressions can be combined with graphical, geometric constraints in the retention-time plane as a powerful component of chemical identification with template matching, or used to speed and improve mass spectrum library searches.

  18. A procedure to select ground-motion time histories for deterministic seismic hazard analysis from the Next Generation Attenuation (NGA) database

    NASA Astrophysics Data System (ADS)

    Huang, Duruo; Du, Wenqi; Zhu, Hong

    2017-10-01

    In performance-based seismic design, ground-motion time histories are needed for analyzing dynamic responses of nonlinear structural systems. However, the number of ground-motion data at design level is often limited. In order to analyze seismic performance of structures, ground-motion time histories need to be either selected from recorded strong-motion database or numerically simulated using stochastic approaches. In this paper, a detailed procedure to select proper acceleration time histories from the Next Generation Attenuation (NGA) database for several cities in Taiwan is presented. Target response spectra are initially determined based on a local ground-motion prediction equation under representative deterministic seismic hazard analyses. Then several suites of ground motions are selected for these cities using the Design Ground Motion Library (DGML), a recently proposed interactive ground-motion selection tool. The selected time histories are representatives of the regional seismic hazard and should be beneficial to earthquake studies when comprehensive seismic hazard assessments and site investigations are unavailable. Note that this method is also applicable to site-specific motion selections with the target spectra near the ground surface considering the site effect.
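
    A minimal sketch of one common selection criterion, ranking candidate records by the mean squared error between their log response spectra and a target spectrum; the target shape, candidate spectra and suite size are hypothetical, and the DGML tool and the paper's deterministic hazard analysis are not reproduced:

      import numpy as np

      rng = np.random.default_rng(0)
      periods = np.linspace(0.1, 3.0, 30)        # spectral periods (s)
      target = 0.8 * np.exp(-periods)            # hypothetical target spectrum (g)

      # Hypothetical response spectra of 200 candidate records from a database
      candidates = target * rng.lognormal(0.0, 0.4, (200, periods.size))

      # Rank records by MSE between log spectra and the log target spectrum
      mse = np.mean((np.log(candidates) - np.log(target)) ** 2, axis=1)
      selected = np.argsort(mse)[:7]             # pick a suite of 7 records
      print(selected)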

  19. Aerodynamics of an Axisymmetric Missile Concept Having Cruciform Strakes and In-Line Tail Fins From Mach 0.60 to 4.63

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.

    2005-01-01

    An experimental study has been performed to develop a large force and moment aerodynamic data set on a slender axisymmetric missile configuration having cruciform strakes and in-line control tail fins. The data include six-component balance measurements of the configuration aerodynamics and three-component measurements on all four tail fins. The test variables include angle of attack, roll angle, Mach number, model buildup, strake length, nose size, and tail fin deflection angles to provide pitch, yaw, and roll control. Test Mach numbers ranged from 0.60 to 4.63. The entire data set is presented on a CD-ROM that is attached to this paper. The CD-ROM also includes extensive plots of both the six-component configuration data and the three-component tail fin data. Selected samples of these plots are presented in this paper to illustrate the features of the data and to investigate the effects of the test variables.

  20. Aerodynamics of an Axisymmetric Missile Concept Having Cruciform Strakes and In-Line Tail Fins From Mach 0.60 to 4.63, Supplement

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.

    2005-01-01

    An experimental study has been performed to develop a large force and moment aerodynamic data set on a slender axisymmetric missile configuration having cruciform strakes and in-line control tail fins. The data include six-component balance measurements of the configuration aerodynamics and three-component measurements on all four tail fins. The test variables include angle of attack, roll angle, Mach number, model buildup, strake length, nose size, and tail fin deflection angles to provide pitch, yaw, and roll control. Test Mach numbers ranged from 0.60 to 4.63. The entire data set is presented on a CD-ROM that is attached to this paper. The CD-ROM also includes extensive plots of both the six-component configuration data and the three-component tail fin data. Selected samples of these plots are presented in this paper to illustrate the features of the data and to investigate the effects of the test variables.

  1. Code-Time Diversity for Direct Sequence Spread Spectrum Systems

    PubMed Central

    Hassan, A. Y.

    2014-01-01

    Time diversity is achieved in direct-sequence spread spectrum by receiving different faded, delayed copies of the transmitted symbols from different uncorrelated channel paths when the transmission signal bandwidth is greater than the coherence bandwidth of the channel. In this paper, a new time diversity scheme, called code-time diversity, is proposed for spread spectrum systems. In this scheme, N spreading codes are used to transmit one data symbol over N successive symbol intervals. The diversity order of the proposed scheme equals the number of spreading codes N multiplied by the number of uncorrelated channel paths L. The paper presents the transmitted signal model. Two demodulator structures are proposed, based on the received signal models for Rayleigh flat and frequency-selective fading channels, and the probability of error of the proposed diversity scheme is calculated for the same two fading channels. Finally, simulation results are presented and compared with those of maximal ratio combining (MRC) and multiple-input multiple-output (MIMO) systems. PMID:24982925
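
    A minimal sketch of the transmit side as described in the abstract: the same data symbol is spread over N successive symbol intervals, each interval using a different spreading code, so that the diversity order becomes N times the number of uncorrelated paths L. The codes, chip length and channel order below are illustrative assumptions, not the paper's signal model:

      import numpy as np

      rng = np.random.default_rng(0)
      N, chips = 3, 8                               # N spreading codes of 8 chips each
      codes = rng.choice([-1, 1], size=(N, chips))  # illustrative binary spreading codes

      symbol = 1                                    # one BPSK data symbol
      # Code-time diversity: the symbol occupies N successive symbol intervals,
      # each interval spread with a different code
      tx = np.concatenate([symbol * codes[k] for k in range(N)])

      L = 2                                         # assumed uncorrelated channel paths
      print("diversity order =", N * L)
      print(tx)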

  2. A Gaussian Mixture Model-based continuous Boundary Detection for 3D sensor networks.

    PubMed

    Chen, Jiehui; Salim, Mariam B; Matsumoto, Mitsuji

    2010-01-01

    This paper proposes a novel, high-precision Gaussian Mixture Model-based Boundary Detection 3D (BD3D) scheme with reasonable implementation cost for 3D cases, which selects a minimum number of Boundary sensor Nodes (BNs) for continuously moving objects. It shows apparent advantages in that the two classes of boundary and non-boundary sensor nodes can be efficiently classified using model selection techniques for finite mixture models; furthermore, the set of sensor readings within each sensor node's spatial neighbors is formulated using a Gaussian Mixture Model. Different from DECOMO [1] and COBOM [2], we also form a BN Array with an additional own sensor reading to aid in selecting Event BNs (EBNs) and non-EBNs from the observations of BNs. In particular, we propose a Thick Section Model (TSM) to solve the problem of transition between 2D and 3D. It is verified by simulations that the BD3D 2D model outperforms DECOMO and COBOM in terms of average residual energy and the number of BNs selected, while the BD3D 3D model demonstrates sound performance even for sensor networks with low densities, especially when the value of the sensor transmission range (r) is larger than the value of the Section Thickness (d) in TSM. We have also rigorously proved its correctness for continuous geometric domains and full robustness for sensor networks over 3D terrains.
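
    The finite-mixture model-selection idea can be sketched with scikit-learn: fit Gaussian mixtures with increasing numbers of components to a node's neighbourhood readings and use BIC to pick the best one. A node whose neighbourhood is best explained by more than one component is flagged as a boundary node. This is a simplified stand-in for the BD3D criterion, and the readings below are synthetic.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)

    def n_components_by_bic(readings, max_k=3):
        readings = readings.reshape(-1, 1)
        bics = []
        for k in range(1, max_k + 1):
            gmm = GaussianMixture(n_components=k, random_state=0).fit(readings)
            bics.append(gmm.bic(readings))
        return int(np.argmin(bics)) + 1          # number of components preferred by BIC

    interior = rng.normal(20.0, 0.5, size=40)                 # one reading regime
    boundary = np.concatenate([rng.normal(20.0, 0.5, 30),     # mixture of two regimes
                               rng.normal(35.0, 0.5, 30)])

    print("interior node components:", n_components_by_bic(interior))   # expect 1
    print("boundary node components:", n_components_by_bic(boundary))   # expect 2
    ```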

  3. Case formulation and management using pattern-based formulation (PBF) methodology: clinical case 1.

    PubMed

    Fernando, Irosh; Cohen, Martin

    2014-02-01

    A tool for psychiatric case formulation known as pattern-based formulation (PBF) has recently been introduced. This paper presents an application of this methodology in formulating and managing complex clinical cases. The symptomatology of the clinical presentation has been parsed into individual clinical phenomena and interpreted by selecting explanatory models. The clinical presentation demonstrates how PBF has been used as a clinical tool to guide clinicians' thinking, taking a structured approach to managing multiple issues using a broad range of management strategies. In doing so, the paper also introduces a number of patterns related to the observed clinical phenomena that can be re-used as explanatory models when formulating other clinical cases. It is expected that this paper will assist clinicians, and particularly trainees, to better understand PBF methodology and apply it to improve their formulation skills.

  4. Data-adaptive test statistics for microarray data.

    PubMed

    Mukherjee, Sach; Roberts, Stephen J; van der Laan, Mark J

    2005-09-01

    An important task in microarray data analysis is the selection of genes that are differentially expressed between different tissue samples, such as healthy and diseased. However, microarray data contain an enormous number of dimensions (genes) and very few samples (arrays), a mismatch which poses fundamental statistical problems for the selection process that have defied easy resolution. In this paper, we present a novel approach to the selection of differentially expressed genes in which test statistics are learned from data using a simple notion of reproducibility in selection results as the learning criterion. Reproducibility, as we define it, can be computed without any knowledge of the 'ground-truth', but takes advantage of certain properties of microarray data to provide an asymptotically valid guide to expected loss under the true data-generating distribution. We are therefore able to indirectly minimize expected loss, and obtain results substantially more robust than conventional methods. We apply our method to simulated and oligonucleotide array data. By request to the corresponding author.
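
    The reproducibility criterion can be illustrated with a small sketch (not the authors' exact estimator): split the arrays into two halves, rank genes by a candidate statistic in each half, and score the statistic by the overlap of its top-k selections across the halves. The statistic, data, and k below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def top_k_overlap(stat_fn, group_a, group_b, k=50):
        half = group_a.shape[1] // 2
        def select(a, b):
            scores = stat_fn(a, b)
            return set(np.argsort(scores)[-k:])            # indices of the top-k genes
        sel1 = select(group_a[:, :half], group_b[:, :half])
        sel2 = select(group_a[:, half:], group_b[:, half:])
        return len(sel1 & sel2) / k                        # fraction of selections reproduced

    def t_like(a, b):
        # A simple t-like statistic; any candidate statistic could be plugged in here.
        return np.abs(a.mean(axis=1) - b.mean(axis=1)) / (a.std(axis=1) + b.std(axis=1) + 1e-8)

    # Synthetic data: 1000 genes x 20 arrays per group, first 50 genes differentially expressed
    genes, arrays = 1000, 20
    group_a = rng.normal(0, 1, size=(genes, arrays))
    group_b = rng.normal(0, 1, size=(genes, arrays))
    group_b[:50] += 1.5

    print("reproducibility of t-like statistic:", top_k_overlap(t_like, group_a, group_b))
    ```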

  5. Ad Hoc Access Gateway Selection Algorithm

    NASA Astrophysics Data System (ADS)

    Jie, Liu

    With the continuous development of mobile communication technology, Ad Hoc access networks have become an active research area. Ad Hoc access network nodes can be used to extend the capacity and multi-hop communication range of a mobile communication system, serve areas adjacent to the cell, and improve edge data rates. For mobile nodes in an Ad Hoc network to reach the Internet, communication with peer nodes on the Internet must be routed through a gateway. The key problems in Ad Hoc access networks are therefore gateway discovery, gateway selection when multiple gateways are available, and handover between gateways. Considering both the mobile node and the gateway, this paper proposes an improved gateway selection algorithm based on the average number of hops, the average access time, and the stability of routes. The improved algorithm shortens the access time of Ad Hoc nodes and preserves continuity of communication across gateway handovers, thereby improving the quality of communication across the network.
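
    A gateway-selection rule of this kind can be sketched as a weighted score over the three metrics named above. The metric definitions and weights in this sketch are assumptions for illustration, not the weighting used in the paper.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Gateway:
        name: str
        avg_hops: float        # average number of hops from the mobile node
        avg_access_ms: float   # average access time in milliseconds
        stability: float       # route stability in [0, 1], higher is better

    def score(gw, w_hops=0.4, w_time=0.4, w_stab=0.2):
        # Lower hop count and access time are better; stability is better when higher.
        return (w_hops / (1 + gw.avg_hops)
                + w_time / (1 + gw.avg_access_ms / 10)
                + w_stab * gw.stability)

    candidates = [
        Gateway("GW-A", avg_hops=3, avg_access_ms=40, stability=0.9),
        Gateway("GW-B", avg_hops=2, avg_access_ms=80, stability=0.6),
        Gateway("GW-C", avg_hops=5, avg_access_ms=25, stability=0.8),
    ]
    best = max(candidates, key=score)
    print("selected gateway:", best.name)
    ```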

  6. Ten Steps to Improve Quality of the Journal Materia Socio-Medica.

    PubMed

    Donev, Doncho M; Masic, Izet

    2017-03-01

    Materia Socio-Medica is one of the oldest public health journals in Europe, established in 1978, and among the most important journals for public health in South-Eastern Europe. The Journal covers all important public health professional, academic and research areas in this field. The aim of the paper is to analyze the journal articles and statistical facts in 2016 and to point out the directions for action and planned further activities for improving the quality of the published papers and the visibility of the journal. Review and analysis of documentation and production of the journal, evidence of submitted and rejected manuscripts and published papers in 2016. A total of 111 articles were published in Materia Socio-Medica during 2016. Most of them were original articles (64.5%). Articles from the fields of Health promotion and prevention were predominant (82.7%), which is within the primary scope of the journal. Authors of the articles published in 2016 were dispersed across three continents (Europe, Asia and North America) and 15 different countries. The largest number of articles was submitted by authors from the country of origin of the journal, Bosnia and Herzegovina. The acceptance rate of Materia Socio-Medica in 2016 was 35.7%. A total of 116 reviewers participated in the manuscript review process in 2016. Materia Socio-Medica will continue to improve the quality of the published papers in 2017 and beyond through education of potential authors, reviewers and Editorial Board members, quality selection of reviewers, supportive editing of articles, and clearly defined instructions and ethical standards of the journal.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feather, F.

    This volume was prepared in conjunction with the First Global Conference on the Future, held in Toronto, Canada, July 20-24, 1980. The conference combined the Third General Assembly of the World Future Society and the fifth annual conference of the Canadian Futures Society. The 59 papers presented here were selected from the very large number submitted to the conference committee; space limitations permitted only a small number of papers to be published in this volume. Included also are: the foreword, Mystery of the Future, by Edward R. Schreyer, Governor General of Canada; preface, A Time for Action, by Maurice F. Strong; introduction, Transition to Harmonic Globalism, by Frank Feather; conclusion, What We Must Do: An Agenda for Futurists; and postscript, The Challenge of the '80s, by Aurelio Peccei. The papers were presented under the following topics: The Trauma of Change (4); A Global Perspective (7); Inventorying Our Resources (7); The International Context (8); Economics: Getting Down to Business (9); Human Values: Personal, Social, Religious (6); Communications: Connecting Ourselves Together (4); Education: Learning to Meet Tomorrow (4); Health: New Approaches to Staying Fit (3); Futurism as a Way of Life (5); and Dreams into Action: Methods and Real-Life Experience (2).

  8. Computational modeling of high performance steel fiber reinforced concrete using a micromorphic approach

    NASA Astrophysics Data System (ADS)

    Huespe, A. E.; Oliver, J.; Mora, D. F.

    2013-12-01

    A finite element methodology for simulating the failure of high performance fiber reinforced concrete composites (HPFRC), with arbitrarily oriented short fibers, is presented. The composite material model is based on a micromorphic approach. Using the framework provided by this theory, the body configuration space is described through two kinematical descriptors. At the structural level, the displacement field represents the standard kinematical descriptor. Additionally, a morphological kinematical descriptor, the micromorphic field, is introduced. It describes the fiber-matrix relative displacement, or slipping mechanism of the bond, observed at the mesoscale level. In the first part of this paper, we summarize the model formulation of the micromorphic approach presented in a previous work by the authors. In the second part, and as the main contribution of the paper, we address specific issues related to the numerical aspects involved in the computational implementation of the model. The developed numerical procedure is based on a mixed finite element technique. The number of dofs per node changes according to the number of fiber bundles simulated in the composite. Then, a specific solution scheme is proposed to solve the variable number of unknowns in the discrete model. The HPFRC composite model takes into account the important effects produced by concrete fracture. A procedure for simulating quasi-brittle fracture is introduced into the model and is described in the paper. The present numerical methodology is assessed by simulating a selected set of experimental tests, which proves its viability and accuracy in capturing a number of mechanical phenomena interacting at the macro- and mesoscale and leading to failure of HPFRC composites.

  9. The large bright quasar survey. 6: Quasar catalog and survey parameters

    NASA Astrophysics Data System (ADS)

    Hewett, Paul C.; Foltz, Craig B.; Chaffee, Frederic H.

    1995-04-01

    Positions, redshifts, and magnitudes for the 1055 quasars in the Large Bright Quasar Survey (LBQS) are presented in a single catalog. Celestial positions have been derived using the PPM catalog to provide an improved reference frame. J2000.0 coordinates are given together with improved B1950.0 positions. Redshifts calculated via cross correlation with a high signal-to-noise ratio composite quasar spectrum are included, and the small number of typographic and redshift misidentifications in the discovery papers are corrected. Spectra of the 12 quasars added to the sample since the publication of the discovery papers are included. Descriptions of the plate material, magnitude calibration, quasar candidate selection procedures, and the identification spectroscopy are given. The calculation of the effective area of the survey for the 1055 quasars comprising the well-defined LBQS sample is specified in detail. Number-redshift and number-magnitude relations for the quasars are derived, and the strengths and limitations of the LBQS sample are summarized. Comparison with existing surveys is made and a qualitative assessment of the effectiveness of the LBQS is undertaken. Positions, magnitudes, and optical spectra of the eight objects (less than 1%) in the survey that remain unidentified are also presented.

  10. Improving receiver performance of diffusive molecular communication with enzymes.

    PubMed

    Noel, Adam; Cheung, Karen C; Schober, Robert

    2014-03-01

    This paper studies the mitigation of intersymbol interference in a diffusive molecular communication system using enzymes that freely diffuse in the propagation environment. The enzymes form reaction intermediates with information molecules and then degrade them so that they cannot interfere with future transmissions. A lower bound expression on the expected number of molecules measured at the receiver is derived. A simple binary receiver detection scheme is proposed where the number of observed molecules is sampled at the time when the maximum number of molecules is expected. Insight is also provided into the selection of an appropriate bit interval. The expected bit error probability is derived as a function of the current and all previously transmitted bits. Simulation results show the accuracy of the bit error probability expression and the improvement in communication performance by having active enzymes present.

  11. Choquet integral as an alternative aggregation method to measure the overall academic performance of primary school students: A case study

    NASA Astrophysics Data System (ADS)

    Kasim, Maznah Mat; Abdullah, Siti Rohana Goh

    2014-07-01

    Many averaging methods are available to aggregate a set of numbers into a single number. However, these methods do not consider the interdependencies between the criteria underlying the numbers. This paper highlights the Choquet integral method as an alternative aggregation method in which estimates of the interdependencies between criteria are incorporated into the aggregation process. The interdependency values can be estimated using the lambda fuzzy measure method. By considering the interdependencies, or interactions, between the criteria, the resulting aggregated values are more meaningful than those obtained by ordinary averaging methods. The application of the Choquet integral is illustrated in a case study of finding the overall academic achievement of year six pupils in a selected primary school in a northern state of Malaysia.
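
    A minimal sketch of the aggregation described above: build a lambda-fuzzy (Sugeno) measure from per-criterion densities, then evaluate the discrete Choquet integral of one pupil's marks. The density and mark values are illustrative placeholders, not the case-study data.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    densities = np.array([0.3, 0.4, 0.5])          # g_i for each criterion (illustrative)
    scores = np.array([70.0, 85.0, 60.0])          # one pupil's marks per criterion (illustrative)

    def solve_lambda(g, eps=1e-6):
        # lambda solves 1 + lam = prod(1 + lam*g_i), lam > -1, lam != 0
        s = g.sum()
        if abs(s - 1.0) < 1e-9:
            return 0.0                              # additive measure
        f = lambda lam: np.prod(1 + lam * g) - (1 + lam)
        if s > 1:
            return brentq(f, -1 + eps, -eps)        # root lies in (-1, 0) when densities sum > 1
        return brentq(f, eps, 1e6)                  # root is positive when densities sum < 1

    def subset_measure(idx, g, lam):
        # g(S) = (prod_{i in S}(1 + lam*g_i) - 1) / lam, with the additive limit at lam = 0
        if abs(lam) < 1e-9:
            return g[idx].sum()
        return (np.prod(1 + lam * g[idx]) - 1) / lam

    def choquet(x, g):
        lam = solve_lambda(g)
        order = np.argsort(x)                      # criteria sorted by ascending score
        total, prev = 0.0, 0.0
        for k, i in enumerate(order):
            coalition = order[k:]                  # criteria scoring at least x[i]
            total += (x[i] - prev) * subset_measure(coalition, g, lam)
            prev = x[i]
        return total

    print("Choquet aggregate:", round(choquet(scores, densities), 2))
    ```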

  12. Curvature Effect in Shear Flow: Slowdown of Turbulent Flame Speeds with Markstein Number

    NASA Astrophysics Data System (ADS)

    Lyu, Jiancheng; Xin, Jack; Yu, Yifeng

    2017-12-01

    It is well-known in the combustion community that curvature effect in general slows down flame propagation speeds because it smooths out wrinkled flames. However, such a folklore has never been justified rigorously. In this paper, as the first theoretical result in this direction, we prove that the turbulent flame speed (an effective burning velocity) is decreasing with respect to the curvature diffusivity (Markstein number) for shear flows in the well-known G-equation model. Our proof involves several novel and rather sophisticated inequalities arising from the nonlinear structure of the equation. On a related fundamental issue, we solve the selection problem of weak solutions or find the "physical fluctuations" when the Markstein number goes to zero and solutions approach those of the inviscid G-equation model. The limiting solution is given by a closed form analytical formula.

  13. Parameter reduction in nonlinear state-space identification of hysteresis

    NASA Astrophysics Data System (ADS)

    Fakhrizadeh Esfahani, Alireza; Dreesen, Philippe; Tiels, Koen; Noël, Jean-Philippe; Schoukens, Johan

    2018-05-01

    Recent work on black-box polynomial nonlinear state-space modeling for hysteresis identification has provided promising results, but struggles with a large number of parameters due to the use of multivariate polynomials. This drawback is tackled in the current paper by applying a decoupling approach that results in a more parsimonious representation involving univariate polynomials. This work is carried out numerically on input-output data generated by a Bouc-Wen hysteretic model and follows up on earlier work of the authors. The current article discusses the polynomial decoupling approach and explores the selection of the number of univariate polynomials together with the polynomial degree. We have found that the presented decoupling approach is able to reduce the number of parameters of the full nonlinear model by up to about 50%, while maintaining a comparable output error level.

  14. Cognitive access to numbers: the philosophical significance of empirical findings about basic number abilities.

    PubMed

    Giaquinto, Marcus

    2017-02-19

    How can we acquire a grasp of cardinal numbers, even the first very small positive cardinal numbers, given that they are abstract mathematical entities? That problem of cognitive access is the main focus of this paper. All the major rival views about the nature and existence of cardinal numbers face difficulties; and the view most consonant with our normal thought and talk about numbers, the view that cardinal numbers are sizes of sets, runs into the cognitive access problem. The source of the problem is the plausible assumption that cognitive access to something requires causal contact with it. It is argued that this assumption is in fact wrong, and that in this and similar cases, we should accept that a certain recognize-and-distinguish capacity is sufficient for cognitive access. We can then go on to solve the cognitive access problem, and thereby support the set-size view of cardinal numbers, by paying attention to empirical findings about basic number abilities. To this end, some selected studies of infants, pre-school children and a trained chimpanzee are briefly discussed.This article is part of a discussion meeting issue 'The origins of numerical abilities'. © 2017 The Author(s).

  15. Paper spray mass spectrometry and PLS-DA improved by variable selection for the forensic discrimination of beers.

    PubMed

    Pereira, Hebert Vinicius; Amador, Victória Silva; Sena, Marcelo Martins; Augusti, Rodinei; Piccin, Evandro

    2016-10-12

    Paper spray mass spectrometry (PS-MS) combined with partial least squares discriminant analysis (PLS-DA) was applied for the first time in a forensic context to the fast and effective differentiation of beers. Eight different brands of American standard lager beers produced by four different breweries (141 samples from 55 batches) were studied with the aim of differentiating them according to their market prices. The three leading brands in the Brazilian beer market, which have been subject to fraud, were modeled as the higher-price class, while the five brands most used for counterfeiting were modeled as the lower-price class. Parameters affecting the paper spray ionization were examined and optimized. The best MS signal stability and intensity were obtained using the positive ion mode, with PS(+) mass spectra characterized by intense pairs of signals corresponding to sodium and potassium adducts of malto-oligosaccharides. Discrimination was not apparent either by visual inspection or by principal component analysis (PCA). However, supervised classification models provided high rates of sensitivity and specificity. A PLS-DA model using full scan mass spectra was improved by variable selection with ordered predictors selection (OPS), providing a 100% reliability rate and reducing the number of variables from 1701 to 60. This model was interpreted by identifying the fifteen variables with the most significant VIP (variable importance in projection) scores, which were therefore considered diagnostic ions for this type of beer counterfeiting. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Developing a weighting strategy to include mobile phone numbers into an ongoing population health survey using an overlapping dual-frame design with limited benchmark information

    PubMed Central

    2014-01-01

    Background: In 2012 mobile phone numbers were included into the ongoing New South Wales Population Health Survey (NSWPHS) using an overlapping dual-frame design. Previously in the NSWPHS the sample was selected using random digit dialing (RDD) of landline phone numbers. The survey was undertaken using computer assisted telephone interviewing (CATI). The weighting strategy needed to be significantly expanded to manage the differing probabilities of selection by frame, including that of children of mobile-only phone users, and to adjust for the increased chance of selection of dual-phone users. This paper describes the development of the final weighting strategy to properly combine the data from two overlapping sample frames accounting for the fact that population benchmarks for the different sampling frames were not available at the state or regional level. Methods: Estimates of the number of phone numbers for the landline and mobile phone frames used to calculate the differing probabilities of selection by frame, for New South Wales (NSW) and by stratum, were obtained by apportioning Australian estimates as none were available for NSW. The weighting strategy was then developed by calculating person selection probabilities, selection weights, applying a constant composite factor to the dual-phone users sample weights, and benchmarking to the latest NSW population by age group, sex and stratum. Results: Data from the NSWPHS for the first quarter of 2012 was used to test the weighting strategy. This consisted of data on 3395 respondents with 2171 (64%) from the landline frame and 1224 (36%) from the mobile frame. However, in order to calculate the weights, data needed to be available for all core weighting variables and so 3378 respondents, 2933 adults and 445 children, had sufficient data to be included. Average person weights were 3.3 times higher for the mobile-only respondents, 1.3 times higher for the landline-only respondents and 1.7 times higher for dual-phone users in the mobile frame compared to the dual-phone users in the landline frame. The overall weight effect for the first quarter of 2012 was 1.93 and the coefficient of variation of the weights was 0.96. The weight effects for 2012 were similar to, and in many cases less than, the effects found in the corresponding quarter of the 2011 NSWPHS when only a landline based sample was used. Conclusions: The inclusion of mobile phone numbers, through an overlapping dual-frame design, improved the coverage of the survey and an appropriate weighting procedure is feasible, although it added substantially to the complexity of the weighting strategy. Access to accurate Australian, State and Territory estimates of the number of landline and mobile phone numbers and type of phone use by at least age group and sex would greatly assist in the weighting of dual-frame surveys in Australia. PMID:25189826

  17. Developing a weighting strategy to include mobile phone numbers into an ongoing population health survey using an overlapping dual-frame design with limited benchmark information.

    PubMed

    Barr, Margo L; Ferguson, Raymond A; Hughes, Phil J; Steel, David G

    2014-09-04

    In 2012 mobile phone numbers were included into the ongoing New South Wales Population Health Survey (NSWPHS) using an overlapping dual-frame design. Previously in the NSWPHS the sample was selected using random digit dialing (RDD) of landline phone numbers. The survey was undertaken using computer assisted telephone interviewing (CATI). The weighting strategy needed to be significantly expanded to manage the differing probabilities of selection by frame, including that of children of mobile-only phone users, and to adjust for the increased chance of selection of dual-phone users. This paper describes the development of the final weighting strategy to properly combine the data from two overlapping sample frames accounting for the fact that population benchmarks for the different sampling frames were not available at the state or regional level. Estimates of the number of phone numbers for the landline and mobile phone frames used to calculate the differing probabilities of selection by frame, for New South Wales (NSW) and by stratum, were obtained by apportioning Australian estimates as none were available for NSW. The weighting strategy was then developed by calculating person selection probabilities, selection weights, applying a constant composite factor to the dual-phone users sample weights, and benchmarking to the latest NSW population by age group, sex and stratum. Data from the NSWPHS for the first quarter of 2012 was used to test the weighting strategy. This consisted of data on 3395 respondents with 2171 (64%) from the landline frame and 1224 (36%) from the mobile frame. However, in order to calculate the weights, data needed to be available for all core weighting variables and so 3378 respondents, 2933 adults and 445 children, had sufficient data to be included. Average person weights were 3.3 times higher for the mobile-only respondents, 1.3 times higher for the landline-only respondents and 1.7 times higher for dual-phone users in the mobile frame compared to the dual-phone users in the landline frame. The overall weight effect for the first quarter of 2012 was 1.93 and the coefficient of variation of the weights was 0.96. The weight effects for 2012 were similar to, and in many cases less than, the effects found in the corresponding quarter of the 2011 NSWPHS when only a landline based sample was used. The inclusion of mobile phone numbers, through an overlapping dual-frame design, improved the coverage of the survey and an appropriate weighting procedure is feasible, although it added substantially to the complexity of the weighting strategy. Access to accurate Australian, State and Territory estimates of the number of landline and mobile phone numbers and type of phone use by at least age group and sex would greatly assist in the weighting of dual-frame surveys in Australia.
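
    The core of the dual-frame weighting can be sketched as follows: compute a base design weight from the selection probability in each frame, then down-scale dual-phone users by a constant composite factor so their chance of being reached through either frame is not double counted. The frame sizes and composite factor below are placeholders (only the sample sizes come from the abstract), and the survey's further steps (children of mobile-only users, benchmarking to population totals) are not shown.

    ```python
    # Illustrative dual-frame weighting sketch; figures are placeholders, not NSWPHS values.
    landline_frame_size = 3_000_000     # assumed number of in-scope landline numbers
    mobile_frame_size   = 5_000_000     # assumed number of in-scope mobile numbers
    landline_sample     = 2_171         # respondents from the landline frame (from the abstract)
    mobile_sample       = 1_224         # respondents from the mobile frame (from the abstract)
    composite_factor    = 0.5           # constant split of dual-users' selection chance between frames

    def selection_weight(frame_size, sample_size):
        prob = sample_size / frame_size             # probability a phone number is selected
        return 1.0 / prob                           # base design weight

    w_landline = selection_weight(landline_frame_size, landline_sample)
    w_mobile   = selection_weight(mobile_frame_size, mobile_sample)

    def person_weight(frame, dual_user):
        w = w_landline if frame == "landline" else w_mobile
        # Dual-phone users could be reached via either frame, so their weight is reduced
        # by the composite factor to avoid double counting across frames.
        return w * composite_factor if dual_user else w

    print("mobile-only respondent weight :", round(person_weight("mobile", False)))
    print("dual user via landline weight :", round(person_weight("landline", True)))
    ```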

  18. Joint Feature Selection and Classification for Multilabel Learning.

    PubMed

    Huang, Jun; Li, Guorong; Huang, Qingming; Wu, Xindong

    2018-03-01

    Multilabel learning deals with examples having multiple class labels simultaneously. It has been applied to a variety of applications, such as text categorization and image annotation. A large number of algorithms have been proposed for multilabel learning, most of which concentrate on multilabel classification problems and only a few of which are feature selection algorithms. Current multilabel classification models are mainly built on a single data representation composed of all the features, which are shared by all the class labels. Since each class label might be decided by some specific features of its own, and the problems of classification and feature selection are often addressed independently, in this paper, we propose a novel method which can perform joint feature selection and classification for multilabel learning, named JFSC. Different from many existing methods, JFSC learns both shared features and label-specific features by considering pairwise label correlations, and builds the multilabel classifier on the learned low-dimensional data representations simultaneously. A comparative study with state-of-the-art approaches demonstrates the competitive performance of the proposed method in both classification and feature selection for multilabel learning.

  19. Selective remediation of contaminated sites using a two-level multiphase strategy and geostatistics.

    PubMed

    Saito, Hirotaka; Goovaerts, Pierre

    2003-05-01

    Selective soil remediation aims to reduce costs by cleaning only the fraction of an exposure unit (EU) necessary to lower the average concentration below the regulatory threshold. This approach requires a prior stratification of each EU into smaller remediation units (RUs) which are then selected according to various criteria. This paper presents a geostatistical framework to account for uncertainties attached to both RU and EU average concentrations in selective remediation. The selection of RUs is based on their impact on the postremediation probability for the EU average concentration to exceed the regulatory threshold, which is assessed using geostatistical stochastic simulation. Application of the technique to a set of 600 dioxin concentrations collected at the Piazza Road EPA Superfund site in Missouri shows a substantial decrease in the number of RUs remediated compared with single-phase remediation. The lower remediation costs achieved by the new strategy come at the cost of a higher risk of false negatives, yet for this data set this risk remains below the 5% rate set by EPA region 7.

  20. Do health technology assessments comply with QUOROM diagram guidance? An empirical study.

    PubMed

    Hind, Daniel; Booth, Andrew

    2007-11-20

    The Quality of Reporting of Meta-analyses (QUOROM) statement provides guidance for improving the quality of reporting of systematic reviews and meta-analyses. To make the process of study selection transparent it recommends "a flow diagram providing information about the number of RCTs identified, included, and excluded and the reasons for excluding them". We undertook an empirical study to identify the extent of compliance in the UK Health Technology Assessment (HTA) programme. We searched Medline to retrieve all systematic reviews of therapeutic interventions in the HTA monograph series published from 2001 to 2005. Two researchers recorded whether each study contained a meta-analysis of controlled trials, whether a QUOROM flow diagram was presented and, if so, whether it expressed the relationship between the number of citations and the number of studies. We used Cohen's kappa to test inter-rater reliability. 87 systematic reviews were retrieved. There was good and excellent inter-rater reliability for, respectively, whether a review contained a meta-analysis and whether each diagram contained a citation-to-study relationship. 49% of systematic reviews used a study selection flow diagram. When only systematic reviews containing a meta-analysis were analysed, compliance was only 32%. Only 20 studies (23% of all systematic reviews; 43% of those having a study selection diagram) had a diagram which expressed the relationship between citations and studies. Compliance with the recommendations of the QUOROM statement is not universal in systematic reviews or meta-analyses. Flow diagrams make the conduct of study selection transparent only if the relationship between citations and studies is clearly expressed. Reviewers should understand what they are counting: citations, papers, studies and trials are fundamentally different concepts which should not be confused in a diagram.

  1. Modified complementary ensemble empirical mode decomposition and intrinsic mode functions evaluation index for high-speed train gearbox fault diagnosis

    NASA Astrophysics Data System (ADS)

    Chen, Dongyue; Lin, Jianhui; Li, Yanping

    2018-06-01

    Complementary ensemble empirical mode decomposition (CEEMD) has been developed to address the mode-mixing problem of the Empirical Mode Decomposition (EMD) method. Compared to ensemble empirical mode decomposition (EEMD), the CEEMD method reduces residue noise in the signal reconstruction. Both CEEMD and EEMD need a sufficiently large ensemble number to reduce the residue noise, and hence incur a high computational cost. Moreover, the selection of intrinsic mode functions (IMFs) for further analysis usually depends on experience. A modified CEEMD method and an IMF evaluation index are proposed with the aim of reducing the computational cost and selecting IMFs automatically. A simulated signal and in-service high-speed train gearbox vibration signals are employed to validate the proposed method in this paper. The results demonstrate that the modified CEEMD can decompose the signal efficiently at a lower computational cost, and the IMF evaluation index can select the meaningful IMFs automatically.
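
    The complementary-ensemble idea behind CEEMD (not the paper's modified variant) can be sketched as follows: add paired +noise / -noise realizations to the signal, decompose each realization with EMD, and average the resulting IMFs so that the added noise largely cancels. The sketch assumes the PyEMD package provides an EMD implementation callable as shown; that dependency and the parameter values are assumptions.

    ```python
    import numpy as np
    from PyEMD import EMD   # assumption: the PyEMD package is installed and exposes EMD

    rng = np.random.default_rng(3)

    def ceemd_like(signal, ensemble=20, noise_std=0.2, max_imfs=6):
        emd = EMD()
        acc = np.zeros((max_imfs, len(signal)))
        count = 0
        for _ in range(ensemble):
            noise = noise_std * rng.standard_normal(len(signal))
            for s in (signal + noise, signal - noise):      # complementary noise pair
                imfs = emd(s, max_imf=max_imfs)[:max_imfs]  # decompose one noisy realization
                acc[:imfs.shape[0]] += imfs                 # accumulate IMFs of matching order
                count += 1
        return acc / count                                  # ensemble-averaged IMFs

    t = np.linspace(0, 1, 1000)
    x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
    imfs = ceemd_like(x)
    print("averaged IMFs:", imfs.shape)
    ```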

  2. Selection of rolling-element bearing steels for long-life applications

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    1989-01-01

    Nearly four decades of research in bearing steel metallurgy and processing have resulted in improvements in bearing life by a factor of 100 over that obtained in the early 1940s. For critical applications such as aircraft, these improvements have resulted in longer lived, more reliable commercial aircraft engines. Material factors such as hardness, retained austenite, grain size and carbide size, number, and area can influence rolling-element fatigue life. Bearing steel processing such as double vacuum melting can have a greater effect on bearing life than material chemistry. The selection and specification of a bearing steel is dependent on the integration of all these considerations into the bearing design and application. The paper reviews rolling-element fatigue data and analysis which can enable the engineer or metallurgist to select a rolling-element bearing steel for critical applications where long life is required.

  3. Selection of rolling-element bearing steels for long-life application

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.

    1986-01-01

    Nearly four decades of research in bearing steel metallurgy and processing have resulted in improvements in bearing life by a factor of 100 over that obtained in the early 1940s. For critical applications such as aircraft, these improvements have resulted in longer lived, more reliable commercial aircraft engines. Material factors such as hardness, retained austenite, grain size and carbide size, number, and area can influence rolling-element fatigue life. Bearing steel processing such as double vacuum melting can have a greater effect on bearing life than material chemistry. The selection and specification of a bearing steel is dependent on the integration of all these considerations into the bearing design and application. The paper reviews rolling-element fatigue data and analysis which can enable the engineer or metallurgist to select a rolling-element bearing steel for critical applications where long life is required.

  4. Battery selection for space experiments

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1992-01-01

    This paper will delineate the criteria required for the selection of batteries as a power source for space experiments. Four basic types of batteries will be explored: lead acid, silver zinc, alkaline manganese, and nickel cadmium. A detailed description of the lead acid and silver zinc cells will be given, along with a brief exploration of the alkaline manganese and nickel cadmium types. The factors involved in battery selection such as packaging, energy density, discharge voltage regulation, and cost will be thoroughly examined. The pros and cons of each battery type will be explored. Actual laboratory test data acquired for the lead acid and silver zinc cell will be discussed. This data will include discharging under various temperature conditions, after three months of storage and with different types of loads. The maintenance required for each type of battery will also be described. The lifetime and number of charge/discharge cycles will be discussed.

  5. Battery selection for Space Shuttle experiments

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1993-01-01

    This paper will delineate the criteria required for the selection of batteries as a power source for space experiments. Four basic types of batteries will be explored, lead acid, silver zinc, alkaline manganese, and nickel cadmium. A detailed description of the lead acid and silver zinc cells and a brief exploration of the alkaline manganese and nickel cadmium will be given. The factors involved in battery selection such as packaging, energy density, discharge voltage regulation, and cost will be thoroughly examined. The pros and cons of each battery type will be explored. Actual laboratory test data acquired for the lead acid and silver zinc cell will be discussed. This data will include discharging under various temperature conditions, after three months of storage, and with different types of loads. The lifetime and number of charge/discharge cycles will also be discussed. The maintenance required for each type of battery will also be described.

  6. A new adaptive L1-norm for optimal descriptor selection of high-dimensional QSAR classification model for anti-hepatitis C virus activity of thiourea derivatives.

    PubMed

    Algamal, Z Y; Lee, M H

    2017-01-01

    A high-dimensional quantitative structure-activity relationship (QSAR) classification model typically contains a large number of irrelevant and redundant descriptors. In this paper, a new design of descriptor selection for the QSAR classification model estimation method is proposed by adding a new weight inside the L1-norm. The experimental results of classifying the anti-hepatitis C virus activity of thiourea derivatives demonstrate that the proposed descriptor selection method in the QSAR classification model performs effectively and competitively compared with other existing penalized methods in terms of classification performance on both the training and the testing datasets. Moreover, it is noteworthy that the results obtained in terms of the stability test and applicability domain provide a robust QSAR classification model. It is evident from the results that the developed QSAR classification model could conceivably be employed for further high-dimensional QSAR classification studies.
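
    A generic adaptive-L1 sketch illustrates the idea of putting data-driven weights inside the L1 penalty (this is the standard adaptive-lasso construction, not the paper's exact weighting): derive per-descriptor weights from a preliminary ridge fit, then realise the weighted penalty by rescaling the descriptor columns before fitting an L1-penalised logistic regression with scikit-learn. The data are synthetic.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression, RidgeClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)

    # Synthetic descriptor matrix: 60 compounds x 300 descriptors,
    # only the first 10 descriptors actually drive the activity class.
    X = rng.normal(size=(60, 300))
    y = (X[:, :10].sum(axis=1) + 0.5 * rng.normal(size=60) > 0).astype(int)

    X = StandardScaler().fit_transform(X)
    ridge = RidgeClassifier(alpha=1.0).fit(X, y)
    w = 1.0 / (np.abs(ridge.coef_.ravel()) + 1e-6)     # adaptive weights: weak descriptors get large penalties

    X_weighted = X / w                                  # column rescaling realises the weighted L1 penalty
    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_weighted, y)
    selected = np.flatnonzero(lasso.coef_.ravel() != 0)
    print("descriptors selected:", selected[:20], "count:", selected.size)
    ```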

  7. Paper Tape Prevents Foot Blisters: A Randomized Prevention Trial Assessing Paper Tape in Endurance Distances II (Pre-TAPED II).

    PubMed

    Lipman, Grant S; Sharp, Louis J; Christensen, Mark; Phillips, Caleb; DiTullio, Alexandra; Dalton, Andrew; Ng, Pearlly; Shangkuan, Jennifer; Shea, Katherine; Krabak, Brian J

    2016-09-01

    To determine whether paper tape prevents foot blisters in multistage ultramarathon runners. Multisite prospective randomized trial. The 2014 250-km (155-mile) 6-stage RacingThePlanet ultramarathons in the Jordan, Gobi, Madagascar, and Atacama Deserts. One hundred twenty-eight participants were enrolled: 19 (15%) from Jordan, 35 (27%) from Gobi, 21 (16%) from Madagascar, and 53 (41%) from the Atacama Desert. The mean age was 39.3 years (22-63) and body mass index was 24.2 kg/m² (17.4-35.1), with 31 (22.5%) females. Paper tape was applied to a randomly selected foot before the race, either to participants' blister-prone areas or to a randomly selected location if there was no blister history, with untaped areas of the same foot used as the control. Development of a blister anywhere on the study foot. One hundred six (83%) participants developed 117 blisters, with treatment success in 98 (77%) runners. Paper tape reduced blisters by 40% (P < 0.01, 95% confidence interval, 28-52) with a number needed to treat of 1.31. Most of the study participants had 1 blister (78%), with the most common locations on the toes (n = 58, 50%) and heel (n = 27, 23%), with 94 (80%) blisters occurring by the end of stage 2. Treatment success was associated with earlier stages [odds ratio (OR), 74.9, P < 0.01] and time spent running (OR, 0.66, P = 0.01). Paper tape was found to reduce both the incidence and frequency of foot blisters in runners.

  8. A hybrid learning method for constructing compact rule-based fuzzy models.

    PubMed

    Zhao, Wanqing; Niu, Qun; Li, Kang; Irwin, George W

    2013-12-01

    The Takagi–Sugeno–Kang-type rule-based fuzzy model has found many applications in different fields; a major challenge is, however, to build a compact model with optimized model parameters which leads to satisfactory model performance. To produce a compact model, most existing approaches mainly focus on selecting an appropriate number of fuzzy rules. In contrast, this paper considers not only the selection of fuzzy rules but also the structure of each rule premise and consequent, leading to the development of a novel compact rule-based fuzzy model. Here, each fuzzy rule is associated with two sets of input attributes, in which the first is used for constructing the rule premise and the other is employed in the rule consequent. A new hybrid learning method combining the modified harmony search method with a fast recursive algorithm is hereby proposed to determine the structure and the parameters for the rule premises and consequents. This is a hard mixed-integer nonlinear optimization problem, and the proposed hybrid method solves the problem by employing an embedded framework, leading to a significantly reduced number of model parameters and a small number of fuzzy rules with each being as simple as possible. Results from three examples are presented to demonstrate the compactness (in terms of the number of model parameters and the number of rules) and the performance of the fuzzy models obtained by the proposed hybrid learning method, in comparison with other techniques from the literature.

  9. Dataset of breath research manuscripts curated using PubMed search strings from 1995-2016.

    PubMed

    Geer Wallace, M Ariel; Pleil, Joachim D

    2018-06-01

    The data contained in this article are PubMed search strings and search string builders used to curate breath research manuscripts published from 1995-2016 and the respective number of articles found that satisfied the search requirements for selected categories. Breath sampling represents a non-invasive technique that has gained usefulness for public health, clinical, diagnostic, and environmental exposure assessment applications over the years. This data article includes search strings that were utilized to retrieve publications through the PubMed database for different breath research-related topics that were related to the analysis of exhaled breath, exhaled breath condensate (EBC), and exhaled breath aerosol (EBA) as well as the analysis of cellular headspace. Manuscripts were curated for topics including EBC, EBA, Direct MS, GC-MS, LC-MS, alcohol, and sensors. A summary of the number of papers published per year for the data retrieved using each of the search strings is also included. These data can be utilized to discern trends in the number of breath research publications in each of the different topics over time. A supplementary Appendix A containing the titles, author lists, journal names, publication dates, PMID numbers, and EntrezUID numbers for each of the journal articles curated using the finalized search strings for the seven breath research-related topics can also be found within this article. The selected manuscripts can be used to explore the impact that breath research has had on expanding the scientific knowledge in each of the investigated topics.
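
    Year-by-year counts of this kind can be reproduced with Biopython's Entrez interface, which wraps the PubMed E-utilities. The search string below is a simplified placeholder, not one of the article's curated strings, and the snippet assumes network access to NCBI.

    ```python
    from Bio import Entrez   # Biopython; requires network access to NCBI E-utilities

    Entrez.email = "your.name@example.org"          # NCBI asks for a contact address (placeholder)
    query = "exhaled breath condensate"             # placeholder query, not a curated search string

    for year in range(1995, 1998):                  # a few years only, to keep requests light
        handle = Entrez.esearch(db="pubmed", term=query,
                                mindate=str(year), maxdate=str(year), datetype="pdat")
        record = Entrez.read(handle)
        handle.close()
        print(year, record["Count"])                # number of PubMed records for that year
    ```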

  10. Reduction of false-positives in a CAD scheme for automated detection of architectural distortion in digital mammography

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Mencattini, Arianna; Casti, Paola; Martinelli, Eugenio; di Natale, Corrado; Catani, Juliana H.; de Barros, Nestor; Melo, Carlos F. E.; Gonzaga, Adilson; Vieira, Marcelo A. C.

    2018-02-01

    This paper proposes a method to reduce the number of false positives (FP) in a computer-aided detection (CAD) scheme for automated detection of architectural distortion (AD) in digital mammography. AD is a subtle contraction of breast parenchyma that may represent an early sign of breast cancer. Due to its subtlety and variability, AD is more difficult to detect compared to microcalcifications and masses, and is commonly found in retrospective evaluations of false-negative mammograms. Several computer-based systems have been proposed for automated detection of AD in breast images. The usual approach is to automatically detect possible sites of AD in a mammographic image (segmentation step) and then to use a classifier to eliminate the false positives and identify the suspicious regions (classification step). This paper focuses on the optimization of the segmentation step to reduce the number of FPs passed as input to the classifier. The proposal is to use statistical measurements to score the segmented regions and then apply a threshold to select a small quantity of regions that should be submitted to the classification step, improving the detection performance of a CAD scheme. We evaluated 12 image features to score and select suspicious regions of 74 clinical Full-Field Digital Mammography (FFDM) images. All images in this dataset contained at least one region with AD previously marked by an expert radiologist. The results showed that the proposed method can reduce the false positives of the segmentation step of the CAD scheme from 43.4 false positives (FP) per image to 34.5 FP per image, without increasing the number of false negatives.
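
    The thresholding step can be sketched as follows: score each segmented candidate region with a simple statistical measure and keep only the highest-scoring regions for the classification stage. The feature used here (local intensity variance), the synthetic regions, and the cut-off are placeholders, not the twelve features evaluated in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def region_score(pixels):
        return float(np.var(pixels))                # illustrative statistical measure only

    # Synthetic candidate regions standing in for the output of the segmentation step
    regions = [rng.normal(loc=0.5, scale=rng.uniform(0.01, 0.2), size=400) for _ in range(50)]
    scores = np.array([region_score(r) for r in regions])

    keep = np.argsort(scores)[-10:]                 # retain only the 10 most suspicious regions
    print("regions passed to the classifier:", sorted(keep.tolist()))
    ```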

  11. VizieR Online Data Catalog: SDSS DR7 voids and superclusters (Nadathur+, 2014)

    NASA Astrophysics Data System (ADS)

    Nadathur, S.; Hotchkiss, S.

    2016-02-01

    This is a public catalogue of voids and superclusters identified in the SDSS DR7 main galaxy and luminous red galaxy samples. This version is dated 04.11.2013. We make the catalogues available for general use. If you use them for your own work, we ask that you cite the original paper, Nadathur & Hotchkiss (2014MNRAS.440.1248N). The top-level directory cat_v11.11.13 contains an example python script called postproc.py, and two folders called comovcoords and redshiftcoords containing two versions of the catalogue in different coordinate systems. The comoving coordinate system is pretty self-explanatory, for a description of the other one please refer to the paper. Each of these directories is further divided into six folders containing the Type1 and Type2 void catalogues and the supercluster catalogue for each of the galaxy samples analysed here, and a folder called tools, which contains data useful for users wishing to apply their own selection criteria. The basic information provided includes the location of the barycentre of each structure, its volume, effective radius, average density and minimum or maximum density, its core galaxy and seed zone, the total number of galaxies in the seed zone, the number of zones merged to form the structure, the total number of particles in the structure, and its density ratio. These are split between two files for each structure type and each sample, named xxxinfo.txt and xxxlist.txt, where xxx refers to the structure type. It is also possible to extract lists of member galaxies of each structure and their magnitudes. An example python script, postproc.py, demonstrates how to access this information and how to build alternative catalogues using user-defined selection criteria. (27 data files).

  12. Bandwidth Study of the Microwave Reflectors with Rectangular Corrugations

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; He, Wenlong; Donaldson, Craig R.; Cross, Adrian W.

    2016-09-01

    The mode-selective microwave reflector with periodic rectangular corrugations in the inner surface of a circular metallic waveguide is studied in this paper. The relations between the bandwidth and reflection coefficient for different numbers of corrugation sections were studied through a global optimization method. Two types of reflectors were investigated. One does not consider the phase response and the other does. Both types of broadband reflectors operating at W-band were machined and measured to verify the numerical simulations.

  13. Work on the physics of ultracold atoms in Russia

    NASA Astrophysics Data System (ADS)

    Kolachevsky, N. N.; Taichenachev, A. V.

    2018-05-01

    In December 2017, the regular All-Russian Conference 'Physics of Ultracold Atoms' was held. Several tens of Russian scientists from major scientific centres of the country, as well as a number of leading foreign scientists took part in the Conference. The Conference topics covered a wide range of urgent problems: quantum metrology, quantum gases, waves of matter, spectroscopy, quantum computing, and laser cooling. This issue of Quantum Electronics publishes the papers reported at the conference and selected for the Journal by the Organising committee.

  14. AIC Computations Using Navier-Stokes Equations on Single Image Supercomputers For Design Optimization

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru

    2004-01-01

    A procedure to accurately generate AICs using a Navier-Stokes solver, including grid deformation, is presented. Preliminary results show good comparisons between experimental and computed flutter boundaries for a rectangular wing. A full wing body configuration of an orbital space plane is selected for demonstration on a large number of processors. In the final paper the AICs of the full wing body configuration will be computed. The scalability of the procedure on supercomputers will be demonstrated.

  15. A survey of recently published cardiovascular, hematological and pneumological original articles in the Brazilian scientific press

    PubMed Central

    Patel, Kavita Kirankumar; Caramelli, Bruno; Gomes, Ariane

    2011-01-01

    Recent original scientific contributions published in selected Brazilian periodicals and classifiable under cardiovascular and pulmonary subject categories cover a wide range of subspecialties, both clinical and experimental. Because they appear in journals with only recently enhanced visibility, we have decided to highlight a number of specific items that appeared in four Brazilian journals, because we understand that this is an important way to keep our readership adequately informed. These papers cover extensive sub-areas in both fields. PMID:22189744

  16. Materials Compatibility Testing in RSRM ODC: Free Cleaner Selection

    NASA Technical Reports Server (NTRS)

    Keen, Jill M.; Sagers, Neil W.; McCool, Alex (Technical Monitor)

    2001-01-01

    Government regulations have mandated production phase-outs of a number of solvents, including 1,1,1-trichloroethane, an ozone-depleting chemical (ODC). This solvent was used extensively in the production of the Reusable Solid Rocket Motors (RSRMs) for the Space Shuttle. Many tests have been performed to identify replacement cleaners. One major area of concern in the selection of a new cleaner has been compatibility. Some specific areas considered included cleaner compatibility with non-metallic surfaces, painted surfaces, support materials such as gloves and wipers as well as corrosive properties of the cleaners on the alloys used on these motors. The intent of this paper is to summarize the test logic, methodology, and results acquired from testing the many cleaner and material combinations.

  17. Statistical trends of episiotomy around the world: Comparative systematic review of changing practices.

    PubMed

    Clesse, Christophe; Lighezzolo-Alnot, Joëlle; De Lavergne, Sylvie; Hamlin, Sandrine; Scheffler, Michèle

    2018-06-01

    The authors' purpose for this article is to identify, review and interpret all publications about episiotomy rates worldwide. Based on the criteria from the PRISMA guidelines, twenty databases were scrutinized. All studies which include national statistics related to episiotomy were selected, as well as studies presenting estimated data. Sixty-one papers were selected with publication dates between 1995 and 2016. A static and dynamic analysis of all the results was carried out. The assumption of a decline in the number of episiotomies is discussed and confirmed, while noting that high episiotomy rates persist in less industrialized countries and East Asia. Finally, our analysis aims to investigate the potential determinants which influence the apparent statistical disparities.

  18. Cleaning up contaminated wood-treating sites. Background paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report identifies technologies available for organic hazardous waste cleanup at woodtreating sites throughout the country. OTA has identified a range of such technologies that have been selected in the past and could be applied to other sites in the future. The applicability of a technology to a particular Superfund site has to be based on many site-specific factors. Nevertheless, it is clear that a number of the approaches identified by OTA may be appropriate and could prove useful if more detailed site-specific studies and tests were done. Although this study focused on the Texarkana site, decisionmakers and the public could benefit from this analysis in selecting future cleanup strategies for other sites.

  19. [Errors in Peruvian medical journals references].

    PubMed

    Huamaní, Charles; Pacheco-Romero, José

    2009-01-01

    References are fundamental in our studies; an adequate selection is as important as an adequate description. To determine the number of errors in a sample of references found in Peruvian medical journals. We reviewed 515 references from scientific papers, selected by systematic randomized sampling, and corroborated the reference information with the original document or its citation in PubMed, LILACS or SciELO-Peru. We found errors in 47.6% (245) of the references, identifying 372 types of errors; the most frequent were errors in presentation style (120), authorship (100) and title (100), mainly due to spelling mistakes (91). The percentage of reference errors was high, and the errors were varied and multiple. We suggest systematic revision of references in the editorial process, as well as extending the discussion of this theme. Keywords: references, periodicals, research, bibliometrics.

  20. Horizontal spacing, depletion, and infill potential in the Austin Chalk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyte, D.G.; Meehan, D.N.

    1996-12-31

    There have been more than 4500 laterals drilled in the Austin Chalk. This paper looks at estimated ultimate recoveries (EUR) on a barrels/acre basis for these Austin Chalk wells. Barrels/acre recoveries were computed by estimating ultimate per-well recoveries, drilled density and the impact of vertical production. The data were then analyzed for depletion and infill potential. Certain areas were selected for further study using an artificial neural network. The network was built and used to study the effects of parameters such as lateral length, first production date, structure of the Austin Chalk, etc. on these recoverable barrel/acre numbers. The methodology and regional results of the study are reviewed, with detailed analyses shown in selected areas.

  1. Electronic health record systems in ophthalmology: impact on clinical documentation.

    PubMed

    Sanders, David S; Lattin, Daniel J; Read-Brown, Sarah; Tu, Daniel C; Wilson, David J; Hwang, Thomas S; Morrison, John C; Yackel, Thomas R; Chiang, Michael F

    2013-09-01

    To evaluate quantitative and qualitative differences in documentation of the ophthalmic examination between paper and electronic health record (EHR) systems. Comparative case series. One hundred fifty consecutive pairs of matched paper and EHR notes, documented by 3 attending ophthalmologist providers. An academic ophthalmology department implemented an EHR system in 2006. Database queries were performed to identify cases in which the same problems were documented by the same provider on different dates, using paper versus EHR methods. This was done for 50 consecutive pairs of examinations in 3 different diseases: age-related macular degeneration (AMD), glaucoma, and pigmented choroidal lesions (PCLs). Quantitative measures were used to compare completeness of documenting the complete ophthalmologic examination, as well as disease-specific critical findings using paper versus an EHR system. Qualitative differences in paper versus EHR documentation were illustrated by selecting representative paired examples. (1) Documentation score, defined as the number of examination elements recorded for the slit-lamp examination, fundus examination, and complete ophthalmologic examination and for critical clinical findings for each disease. (2) Paired comparison of qualitative differences in paper versus EHR documentation. For all 3 diseases (AMD, glaucoma, PCL), the number of complete examination findings recorded was significantly lower with paper than the EHR system (P ≤ 0.004). Among the 3 individual examination sections (general, slit lamp, fundus) for the 3 diseases, 5 of the 9 possible combinations had significantly lower mean documentation scores with paper than EHR notes. For 2 of the 3 diseases, the number of critical clinical findings recorded was significantly lower using paper versus EHR notes (P ≤ 0.022). All (150/150) paper notes relied on graphical representations using annotated hand-drawn sketches, whereas no (0/150) EHR notes contained drawings. Instead, the EHR systems documented clinical findings using textual descriptions and interpretations. There were quantitative and qualitative differences in the nature of paper versus EHR documentation of ophthalmic findings in this study. The EHR notes included more complete documentation of examination elements using structured textual descriptions and interpretations, whereas paper notes used graphical representations of findings. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  2. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    PubMed

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
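
    The transfer-learning step described above, pooling a small number of user-specific samples with auxiliary samples from similar subjects before training an ordinary classifier, can be sketched as follows. This is only a minimal Python illustration using scikit-learn; the synthetic data, the equal weighting of auxiliary samples and the function names are assumptions, not the authors' implementation (which also uses a mean-squared-difference similarity heuristic and active class selection).

      # Minimal sketch of the transfer-learning idea: pool a few user-specific
      # samples with auxiliary samples from similar subjects, then train a kNN
      # classifier. The simple pooling stands in for the paper's weighted scheme.
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      def train_with_transfer(X_user, y_user, X_aux, y_aux, n_neighbors=5):
          """Train a kNN classifier on user data augmented with auxiliary data."""
          X = np.vstack([X_user, X_aux])
          y = np.concatenate([y_user, y_aux])
          return KNeighborsClassifier(n_neighbors=n_neighbors).fit(X, y)

      # Hypothetical usage with random feature vectors (10 user samples, 200 auxiliary).
      rng = np.random.default_rng(0)
      X_user, y_user = rng.normal(size=(10, 8)), rng.integers(0, 3, 10)
      X_aux, y_aux = rng.normal(size=(200, 8)), rng.integers(0, 3, 200)
      clf = train_with_transfer(X_user, y_user, X_aux, y_aux)
      print(clf.predict(rng.normal(size=(2, 8))))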

  3. Collaborative Filtering for Brain-Computer Interaction Using Transfer Learning and Active Class Selection

    PubMed Central

    Wu, Dongrui; Lance, Brent J.; Parsons, Thomas D.

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing. PMID:23437188

  4. Genetic Particle Swarm Optimization-Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection.

    PubMed

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-07-30

    In the field of multiple features Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple features OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO, and apply the proposed algorithm to the object-based hybrid multivariate alternative detection model. Two experiment cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid the problem of premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO is superior to other algorithms in overall accuracy (84.17% and 83.59%) and Kappa coefficient (0.6771 and 0.6314). Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm can affect the algorithm. The comparison experiment results reveal that RMV is more suitable than other functions as the fitness function of the GPSO-based feature selection algorithm.
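
    A much simplified sketch of population-based feature selection in the spirit of the GPSO algorithm is given below: a binary particle swarm searches over feature subsets and scores each subset with a fitness function. The toy fitness is only a stand-in for the paper's Ratio of Mean to Variance, and all parameters and names are illustrative assumptions rather than the authors' algorithm (which also adds genetic operators to the swarm).

      # Simplified binary PSO over feature subsets; the fitness below is only a
      # stand-in for the Ratio of Mean to Variance (RMV) criterion.
      import numpy as np

      rng = np.random.default_rng(1)

      def fitness(mask, X):
          """Toy ratio-of-mean-to-variance score over the selected features."""
          if mask.sum() == 0:
              return -np.inf
          sel = X[:, mask.astype(bool)]
          return float(np.mean(sel) / (np.var(sel) + 1e-9))

      def binary_pso(X, n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5):
          n_feat = X.shape[1]
          pos = rng.integers(0, 2, (n_particles, n_feat)).astype(float)
          vel = rng.normal(0, 0.1, (n_particles, n_feat))
          pbest, pbest_val = pos.copy(), np.array([fitness(p, X) for p in pos])
          gbest = pbest[np.argmax(pbest_val)].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random((2, n_particles, n_feat))
              vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
              prob = 1.0 / (1.0 + np.exp(-vel))          # sigmoid transfer function
              pos = (rng.random((n_particles, n_feat)) < prob).astype(float)
              vals = np.array([fitness(p, X) for p in pos])
              improved = vals > pbest_val
              pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
              gbest = pbest[np.argmax(pbest_val)].copy()
          return gbest.astype(bool)

      X = rng.normal(size=(100, 12))                      # hypothetical object features
      print("selected features:", np.flatnonzero(binary_pso(X)))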

  5. Methodological development for selection of significant predictors explaining fatal road accidents.

    PubMed

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still an open research question. In this paper we propose a methodological development for model selection which addresses both explanatory variable selection and adequate model selection issues. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov Chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator has experienced the maximum reduction internationally during the indicated years, making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are the main subjects of road safety policy measures. Published by Elsevier Ltd.

  6. Measuring food intake in studies of obesity.

    PubMed

    Lissner, Lauren

    2002-12-01

    The problem of how to measure habitual food intake in studies of obesity remains an enigma in nutritional research. The existence of obesity-specific underreporting was rather controversial until the advent of the doubly labelled water technique gave credence to previously anecdotal evidence that such a bias does in fact exist. This paper reviews a number of issues relevant to interpreting dietary data in studies involving obesity. Topics covered include: participation biases, normative biases, importance of matching method to study, selective underreporting, and a brief discussion of the potential implications of generalised and selective underreporting in analytical epidemiology. It is concluded that selective underreporting of certain food types by obese individuals would produce consequences in analytical epidemiological studies that are both unpredictable and complex. Since it is becoming increasingly acknowledged that selective reporting error does occur, it is important to emphasise that correction for energy intake is not sufficient to eliminate the biases from this type of error. This is true both for obesity-related selective reporting errors and more universal types of selective underreporting, e.g. foods of low social desirability. Additional research is urgently required to examine the consequences of this type of error.

  7. Cognitive load reducing in destination decision system

    NASA Astrophysics Data System (ADS)

    Wu, Chunhua; Wang, Cong; Jiang, Qien; Wang, Jian; Chen, Hong

    2007-12-01

    With limited cognitive resources, the quantity of information that can be processed by a person is limited. If this limit is exceeded, the whole cognitive process is affected, and so is the final decision. Effective ways to reduce cognitive load are investigated from two aspects: cutting down the number of alternatives, and guiding the user to allocate limited attention resources based on selective visual attention theory. Decision-making is such a complex process that people usually have difficulty expressing their requirements completely. An effective method for capturing the user's hidden requirements is put forward in this paper. With more requirements captured, the destination decision system can filter out a larger number of inappropriate alternatives. Different pieces of information have different utility; if high-utility information attracts attention easily, the decision can be made more easily. After analyzing current selective visual attention theory, a new presentation style based on the user's visual attention is also put forward in this paper. This model arranges information presentation according to the movement of the sightline. Through visual attention, the user can focus limited attention resources on the important information. Capturing hidden requirements and presenting information based on selective visual attention are effective ways to reduce cognitive load.

  8. RANDOMIZED CONTROLLED TRIALS IN ORTHOPEDICS AND TRAUMATOLOGY: SYSTEMATIC ANALYSIS ON THE NATIONAL EVIDENCE

    PubMed Central

    de Moraes, Vinícius Ynoe; Moreira, Cesar Domingues; Tamaoki, Marcel Jun Sugawara; Faloppa, Flávio; Belloti, Joao Carlos

    2015-01-01

    Objective: To assess whether there has been any improvement in the quality and quantity of randomized controlled trials (RCTs) in nationally published journals through the application of standardized and validated scores. Methods: We electronically selected all RCTs published in the two indexed Brazilian journals that focus on orthopedics, over the period 2000-2009: Acta Ortopédica Brasileira (AOB) and Revista Brasileira de Ortopedia (RBO). These RCTs were identified and scored by two independent researchers in accordance with the Jadad scale and the Cochrane Bone, Joint and Muscle Trauma Group score. The studies selected were grouped as follows: 1) publication period (2000-2004 or 2004-2009); 2) journal of publication (AOB or RBO). Results: Twenty-two papers were selected: 10 from AOB and 12 from RBO. No statistically significant differences were found between the proportions (nRCT/nTotal of published papers) of RCTs published in the two journals (p = 0.458), or in the Jadad score (p = 0.722) and Cochrane score (p = 0.630). Conclusion: The relative quality and quantity of RCTs in the journals analyzed were similar. There was a trend towards improvement of quality, but there was no increase in the number of RCTs between the two periods analyzed. PMID:27026971

  9. On the Error State Selection for Stationary SINS Alignment and Calibration Kalman Filters—Part II: Observability/Estimability Analysis

    PubMed Central

    Silva, Felipe O.; Hemerly, Elder M.; Leite Filho, Waldemar C.

    2017-01-01

    This paper presents the second part of a study aiming at the error state selection in Kalman filters applied to the stationary self-alignment and calibration (SSAC) problem of strapdown inertial navigation systems (SINS). The observability properties of the system are systematically investigated, and the number of unobservable modes is established. Through the analytical manipulation of the full SINS error model, the unobservable modes of the system are determined, and the SSAC error states (except the velocity errors) are proven to be individually unobservable. The estimability of the system is determined through the examination of the major diagonal terms of the covariance matrix and their eigenvalues/eigenvectors. Filter order reduction based on observability analysis is shown to be inadequate, and several misconceptions regarding SSAC observability and estimability deficiencies are removed. As the main contributions of this paper, we demonstrate that, except for the position errors, all error states can be minimally estimated in the SSAC problem and, hence, should not be removed from the filter. Corroborating the conclusions of the first part of this study, a 12-state Kalman filter is found to be the optimal error state selection for SSAC purposes. Results from simulated and experimental tests support the outlined conclusions. PMID:28241494
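
    The observability analysis referred to above can be illustrated, for a generic linear time-invariant error model, by stacking the observability matrix and counting unobservable modes as the rank deficiency. The matrices in this sketch are arbitrary placeholders, not the SINS error model analysed in the paper.

      # Observability check for a linear model x_{k+1} = F x_k, z_k = H x_k:
      # stack H, HF, HF^2, ... and count unobservable modes as n - rank.
      import numpy as np

      def unobservable_modes(F, H):
          n = F.shape[0]
          blocks = [H]
          for _ in range(n - 1):
              blocks.append(blocks[-1] @ F)
          O = np.vstack(blocks)
          return n - np.linalg.matrix_rank(O)

      # Placeholder 4-state model with a single measurement (not the SINS model).
      F = np.array([[1.0, 0.1, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.1],
                    [0.0, 0.0, 0.0, 1.0]])
      H = np.array([[1.0, 0.0, 0.0, 0.0]])
      print("unobservable modes:", unobservable_modes(F, H))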

  10. Dynamic Portfolio Strategy Using Clustering Approach

    PubMed Central

    Lu, Ya-Nan; Li, Sai-Ping; Jiang, Xiong-Fei; Zhong, Li-Xin; Qiu, Tian

    2017-01-01

    The problem of portfolio optimization is one of the most important issues in asset management. We here propose a new dynamic portfolio strategy based on the time-varying structures of MST networks in Chinese stock markets, where the market condition is further considered when using the optimal portfolios for investment. A portfolio strategy comprises two stages: First, select the portfolios by choosing central and peripheral stocks in the selection horizon using five topological parameters, namely degree, betweenness centrality, distance on degree criterion, distance on correlation criterion and distance on distance criterion. Second, use the portfolios for investment in the investment horizon. The optimal portfolio is chosen by comparing central and peripheral portfolios under different combinations of market conditions in the selection and investment horizons. Market conditions in our paper are identified by the ratios of the number of trading days with rising index to the total number of trading days, or the sum of the amplitudes of the trading days with rising index to the sum of the amplitudes of the total trading days. We find that central portfolios outperform peripheral portfolios when the market is under a drawup condition, or when the market is stable or drawup in the selection horizon and is under a stable condition in the investment horizon. We also find that peripheral portfolios gain more than central portfolios when the market is stable in the selection horizon and is drawdown in the investment horizon. Empirical tests are carried out based on the optimal portfolio strategy. Among all possible optimal portfolio strategies based on different parameters to select portfolios and different criteria to identify market conditions, 65% of our optimal portfolio strategies outperform the random strategy for the Shanghai A-Share market while the proportion is 70% for the Shenzhen A-Share market. PMID:28129333

  11. Dynamic Portfolio Strategy Using Clustering Approach.

    PubMed

    Ren, Fei; Lu, Ya-Nan; Li, Sai-Ping; Jiang, Xiong-Fei; Zhong, Li-Xin; Qiu, Tian

    2017-01-01

    The problem of portfolio optimization is one of the most important issues in asset management. We here propose a new dynamic portfolio strategy based on the time-varying structures of MST networks in Chinese stock markets, where the market condition is further considered when using the optimal portfolios for investment. A portfolio strategy comprises two stages: First, select the portfolios by choosing central and peripheral stocks in the selection horizon using five topological parameters, namely degree, betweenness centrality, distance on degree criterion, distance on correlation criterion and distance on distance criterion. Second, use the portfolios for investment in the investment horizon. The optimal portfolio is chosen by comparing central and peripheral portfolios under different combinations of market conditions in the selection and investment horizons. Market conditions in our paper are identified by the ratios of the number of trading days with rising index to the total number of trading days, or the sum of the amplitudes of the trading days with rising index to the sum of the amplitudes of the total trading days. We find that central portfolios outperform peripheral portfolios when the market is under a drawup condition, or when the market is stable or drawup in the selection horizon and is under a stable condition in the investment horizon. We also find that peripheral portfolios gain more than central portfolios when the market is stable in the selection horizon and is drawdown in the investment horizon. Empirical tests are carried out based on the optimal portfolio strategy. Among all possible optimal portfolio strategies based on different parameters to select portfolios and different criteria to identify market conditions, 65% of our optimal portfolio strategies outperform the random strategy for the Shanghai A-Share market while the proportion is 70% for the Shenzhen A-Share market.
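
    The selection stage described above can be sketched as follows: build a minimum spanning tree (MST) from a correlation-based distance between stocks and rank nodes by topological centrality to separate central from peripheral portfolios. The synthetic returns, the distance metric and the use of only degree and betweenness (two of the paper's five parameters) are simplifying assumptions.

      # Sketch: build an MST from stock return correlations and rank stocks by
      # degree and betweenness centrality to pick "central" vs "peripheral"
      # portfolios. Data and portfolio sizes are illustrative only.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(2)
      returns = rng.normal(size=(250, 10))                 # hypothetical daily returns
      corr = np.corrcoef(returns, rowvar=False)
      dist = np.sqrt(2.0 * (1.0 - corr))                   # correlation-based distance

      G = nx.Graph()
      n = dist.shape[0]
      for i in range(n):
          for j in range(i + 1, n):
              G.add_edge(i, j, weight=dist[i, j])
      mst = nx.minimum_spanning_tree(G)

      degree = dict(mst.degree())
      betweenness = nx.betweenness_centrality(mst, weight="weight")
      central = sorted(degree, key=lambda k: (degree[k], betweenness[k]), reverse=True)[:3]
      peripheral = sorted(degree, key=lambda k: (degree[k], betweenness[k]))[:3]
      print("central stocks:", central, "peripheral stocks:", peripheral)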

  12. An energy efficient distance-aware routing algorithm with multiple mobile sinks for wireless sensor networks.

    PubMed

    Wang, Jin; Li, Bin; Xia, Feng; Kim, Chang-Seob; Kim, Jeong-Uk

    2014-08-18

    Traffic patterns in wireless sensor networks (WSNs) usually follow a many-to-one model. Sensor nodes close to static sinks will deplete their limited energy more rapidly than other sensors, since they will have more data to forward during multihop transmission. This will cause network partition, isolated nodes and much shortened network lifetime. Thus, how to balance energy consumption for sensor nodes is an important research issue. In recent years, exploiting sink mobility technology in WSNs has attracted much research attention because it can not only improve energy efficiency, but also prolong network lifetime. In this paper, we propose an energy efficient distance-aware routing algorithm with multiple mobile sinks for WSNs, where sink nodes move at a certain speed along the network boundary to collect monitored data. We study the influence of multiple mobile sink nodes on energy consumption and network lifetime, and we mainly focus on the selection of the number of mobile sink nodes and the selection of parking positions, as well as their impact on the performance metrics above. Both the number of mobile sink nodes and the selection of parking positions have an important influence on network performance. Simulation results show that our proposed routing algorithm has better performance than traditional routing algorithms in terms of energy consumption.

  13. Optimized Energy Harvesting, Cluster-Head Selection and Channel Allocation for IoTs in Smart Cities

    PubMed Central

    Aslam, Saleem; Hasan, Najam Ul; Jang, Ju Wook; Lee, Kyung-Geun

    2016-01-01

    This paper highlights three critical aspects of the internet of things (IoTs), namely (1) energy efficiency, (2) energy balancing and (3) quality of service (QoS) and presents three novel schemes for addressing these aspects. For energy efficiency, a novel radio frequency (RF) energy-harvesting scheme is presented in which each IoT device is associated with the best possible RF source in order to maximize the overall energy that the IoT devices harvest. For energy balancing, the IoT devices in close proximity are clustered together and then an IoT device with the highest residual energy is selected as a cluster head (CH) on a rotational basis. Once the CH is selected, it assigns channels to the IoT devices to report their data using a novel integer linear program (ILP)-based channel allocation scheme by satisfying their desired QoS. To evaluate the presented schemes, exhaustive simulations are carried out by varying different parameters, including the number of IoT devices, the number of harvesting sources, the distance between RF sources and IoT devices and the primary user (PU) activity of different channels. The simulation results demonstrate that our proposed schemes perform better than the existing ones. PMID:27918424
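
    The cluster-head rotation described above, picking the device with the highest residual energy, is simple to sketch in Python. The greedy channel assignment below only stands in for the paper's ILP-based allocation, and all device names and energy values are hypothetical.

      # Sketch of the cluster-head rotation idea: within each cluster, pick the
      # device with the highest residual energy as CH. The greedy channel
      # assignment stands in for the paper's ILP formulation.
      from dataclasses import dataclass

      @dataclass
      class Device:
          name: str
          residual_energy: float   # joules (hypothetical)

      def elect_cluster_head(cluster):
          return max(cluster, key=lambda d: d.residual_energy)

      def assign_channels(devices, channels):
          """Greedy one-channel-per-device assignment (ILP stand-in)."""
          return {d.name: channels[i % len(channels)] for i, d in enumerate(devices)}

      cluster = [Device("iot-1", 3.2), Device("iot-2", 4.7), Device("iot-3", 2.9)]
      ch = elect_cluster_head(cluster)
      print("cluster head:", ch.name)
      print(assign_channels([d for d in cluster if d is not ch], ["ch-1", "ch-2"]))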

  14. An Energy Efficient Distance-Aware Routing Algorithm with Multiple Mobile Sinks for Wireless Sensor Networks

    PubMed Central

    Wang, Jin; Li, Bin; Xia, Feng; Kim, Chang-Seob; Kim, Jeong-Uk

    2014-01-01

    Traffic patterns in wireless sensor networks (WSNs) usually follow a many-to-one model. Sensor nodes close to static sinks will deplete their limited energy more rapidly than other sensors, since they will have more data to forward during multihop transmission. This will cause network partition, isolated nodes and much shortened network lifetime. Thus, how to balance energy consumption for sensor nodes is an important research issue. In recent years, exploiting sink mobility technology in WSNs has attracted much research attention because it can not only improve energy efficiency, but also prolong network lifetime. In this paper, we propose an energy efficient distance-aware routing algorithm with multiple mobile sinks for WSNs, where sink nodes move at a certain speed along the network boundary to collect monitored data. We study the influence of multiple mobile sink nodes on energy consumption and network lifetime, and we mainly focus on the selection of the number of mobile sink nodes and the selection of parking positions, as well as their impact on the performance metrics above. Both the number of mobile sink nodes and the selection of parking positions have an important influence on network performance. Simulation results show that our proposed routing algorithm has better performance than traditional routing algorithms in terms of energy consumption. PMID:25196015

  15. Optimized Energy Harvesting, Cluster-Head Selection and Channel Allocation for IoTs in Smart Cities.

    PubMed

    Aslam, Saleem; Hasan, Najam Ul; Jang, Ju Wook; Lee, Kyung-Geun

    2016-12-02

    This paper highlights three critical aspects of the internet of things (IoTs), namely (1) energy efficiency, (2) energy balancing and (3) quality of service (QoS) and presents three novel schemes for addressing these aspects. For energy efficiency, a novel radio frequency (RF) energy-harvesting scheme is presented in which each IoT device is associated with the best possible RF source in order to maximize the overall energy that the IoT devices harvest. For energy balancing, the IoT devices in close proximity are clustered together and then an IoT device with the highest residual energy is selected as a cluster head (CH) on a rotational basis. Once the CH is selected, it assigns channels to the IoT devices to report their data using a novel integer linear program (ILP)-based channel allocation scheme by satisfying their desired QoS. To evaluate the presented schemes, exhaustive simulations are carried out by varying different parameters, including the number of IoT devices, the number of harvesting sources, the distance between RF sources and IoT devices and the primary user (PU) activity of different channels. The simulation results demonstrate that our proposed schemes perform better than the existing ones.

  16. Bayesian block-diagonal variable selection and model averaging

    PubMed Central

    Papaspiliopoulos, O.; Rossell, D.

    2018-01-01

    Summary We propose a scalable algorithmic framework for exact Bayesian variable selection and model averaging in linear models under the assumption that the Gram matrix is block-diagonal, and as a heuristic for exploring the model space for general designs. In block-diagonal designs our approach returns the most probable model of any given size without resorting to numerical integration. The algorithm also provides a novel and efficient solution to the frequentist best subset selection problem for block-diagonal designs. Posterior probabilities for any number of models are obtained by evaluating a single one-dimensional integral, and other quantities of interest such as variable inclusion probabilities and model-averaged regression estimates are obtained by an adaptive, deterministic one-dimensional numerical integration. The overall computational cost scales linearly with the number of blocks, which can be processed in parallel, and exponentially with the block size, rendering it most adequate in situations where predictors are organized in many moderately-sized blocks. For general designs, we approximate the Gram matrix by a block-diagonal matrix using spectral clustering and propose an iterative algorithm that capitalizes on the block-diagonal algorithms to explore efficiently the model space. All methods proposed in this paper are implemented in the R library mombf. PMID:29861501
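
    For general designs, the paper approximates the Gram matrix by a block-diagonal matrix obtained by spectral clustering of the predictors; a minimal sketch of that clustering step is shown below. The actual methods are implemented in the R library mombf; this Python fragment, with its synthetic design matrix and cluster count, is only an illustration of the idea.

      # Sketch of the block-diagonal approximation step: cluster predictors by the
      # absolute Gram matrix with spectral clustering, then zero out cross-block
      # entries. This only illustrates the idea; the paper's methods live in mombf.
      import numpy as np
      from sklearn.cluster import SpectralClustering

      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 12))
      G = X.T @ X                                          # Gram matrix
      affinity = np.abs(G)

      labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                                  random_state=0).fit_predict(affinity)
      block_mask = labels[:, None] == labels[None, :]
      G_block = np.where(block_mask, G, 0.0)               # block-diagonal approximation
      print("block sizes:", np.bincount(labels))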

  17. A bibliometric analysis on top-cited articles in pain research.

    PubMed

    Chuang, Kun-Yang; Ho, Yuh-Shan

    2014-05-01

    The field of pain-related research has gained more attention as the prevalence of chronic pain increased over the years. The objective of this research was to identify highly cited papers, as well as contributors, to pain-related research. Pain-related articles published from 1900 to 2011 were screened, and highly cited papers, with at least 100 citations since publication, were identified and selected for a bibliometric analysis. The total number of papers, authorship, and collaboration statistics are presented for countries, institutions, and authors. To assess contributions, a new indicator, the major contributor index (MCI), was used. Citation trends for all papers, as well as for top papers, are presented. A total of 7,327 articles, 2.4% of all pain related articles, had received at least 100 citations since publication. In recent decades, top-cited articles have reached a citation peak more quickly, and have shown a more-rapid decreasing trend, compared with top-cited articles from earlier decades. The leading countries were United States, U.K., Canada, and Germany. The leading institutions were Harvard University, University of California, San Francisco, University of Texas, and University of Washington. MCI varied among leading institutions, as well as among individual authors. An indicator like the MCI can provide a proxy for the contributions made by an individual or institution. It reflects the independent research ability and leadership. In future evaluations of institution or individual performances, the MCI should be included, together with the number of total papers, to provide a better profile of research performance.

  18. Building and testing of MIDAS instrument sub-assemblies

    NASA Astrophysics Data System (ADS)

    Lewis, S. D.

    2001-09-01

    The MIDAS instrument is an atomic force microscope developed by ESTEC to fly on Rosetta. The purpose of the instrument is to sample and characterise cometary dust, which impinges upon a facetted wheel contained within the instrument enclosure. Due to its relative complexity, the long cruise phase of the Rosetta mission and the relatively novel use of piezomotors for all drive requirements, the instrument poses a number of interesting mechanism engineering challenges. This paper describes the lubricant selection and the EM and FM subassembly build and test campaigns carried out by AEA Technology Space in close support of the instrument-level activities, which ran in parallel at ESTEC. The paper also identifies some lessons learned, which can be generally applied in other mechanism programmes.

  19. Robust High-Capacity Audio Watermarking Based on FFT Amplitude Modification

    NASA Astrophysics Data System (ADS)

    Fallahpour, Mehdi; Megías, David

    This paper proposes a novel robust audio watermarking algorithm to embed data and extract it in a bit-exact manner based on changing the magnitudes of the FFT spectrum. The key point is selecting a frequency band for embedding based on the comparison between the original and the MP3 compressed/decompressed signal, and on a suitable scaling factor. The experimental results show that the method has a very high capacity (about 5 kbps), without significant perceptual distortion (ODG about -0.25), and provides robustness against common audio signal processing such as added noise, filtering and MPEG compression (MP3). Furthermore, the proposed method has a larger capacity (ratio of the number of embedded bits to the number of host bits) than recent image data hiding methods.
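
    A toy illustration of amplitude-based embedding is sketched below: bits are encoded by scaling FFT magnitudes in a chosen frequency band and the signal is resynthesised with the inverse transform. The band, scaling factor and host signal are assumptions; the paper's band selection via MP3 comparison and its exact embedding/extraction rules are not reproduced.

      # Toy illustration of amplitude-based embedding: scale FFT magnitudes of a
      # chosen frequency band up or down to encode bits.
      import numpy as np

      def embed_bits(signal, bits, band=(2000, 2064), alpha=1.05):
          spec = np.fft.rfft(signal)
          lo, hi = band
          for k, bit in enumerate(bits):
              spec[lo + k] *= alpha if bit else 1.0 / alpha   # raise or lower magnitude
          return np.fft.irfft(spec, n=len(signal))

      fs = 44100
      t = np.arange(fs) / fs
      host = 0.5 * np.sin(2 * np.pi * 440 * t)             # 1 s of a 440 Hz tone
      watermarked = embed_bits(host, bits=[1, 0, 1, 1])
      print("max sample change:", np.max(np.abs(watermarked - host)))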

  20. Further considerations of engine emissions from subsonic aircraft at cruise altitude

    NASA Astrophysics Data System (ADS)

    Lee, S. H.; Le Dilosquer, M.; Singh, R.; Rycroft, M. J.

    The most significant man-made sources of pollution of the higher troposphere and lower stratosphere are exhaust emissions from civil subsonic aircraft at cruise altitude (8-12 km). This paper examines such issues by computational modelling of Boeing 747-400 flights during their cruise phase between selected city pairs, for example London to Tokyo. Engine performance, exhaust pollutant prediction and detailed flight history analysis are used to study the effects of different Mach numbers, and of increasing the cruise altitude from 9.8 to 12.1 km during the flight rather than staying at a constant cruise altitude of 10.5 km. To minimise the overall effects of atmospheric pollution, cruising at a Mach number of 0.85 with an increasing altitude is the favoured technique.

  1. Parametric Investigation of Liquid Jets in Low Gravity

    NASA Technical Reports Server (NTRS)

    Chato, David J.

    2005-01-01

    An axisymmetric phase field model is developed and used to model surface tension forces on liquid jets in microgravity. The previous work in this area is reviewed and a baseline drop tower experiment selected for model comparison. This paper uses the model to parametrically investigate the influence of key parameters on the geysers formed by jets in microgravity. Investigation of the contact angle showed the expected trend that increasing the contact angle increases geyser height. Investigation of the tank radius showed some interesting effects and demonstrated that the zone of free surface deformation is quite large. Variation of the surface tension with a laminar jet showed clearly the evolution of free surface shape with Weber number. It predicted a breakthrough Weber number of 1.
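
    For reference, the Weber number used above compares inertial to surface-tension forces. A common definition (a standard form; the exact length and velocity scales are not restated in the abstract, so the symbols below are conventional assumptions) is

        \mathrm{We} = \frac{\rho\, v^{2} L}{\sigma},

    where \rho is the liquid density, v the jet velocity, L a characteristic length such as the jet diameter, and \sigma the surface tension. A breakthrough Weber number near 1 marks the regime where jet inertia becomes comparable to the restoring surface tension at the free surface.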

  2. Integrating Kano’s Model into Quality Function Deployment for Product Design: A Comprehensive Review

    NASA Astrophysics Data System (ADS)

    Ginting, Rosnani; Hidayati, Juliza; Siregar, Ikhsan

    2018-03-01

    Many methods and techniques are adopted by companies to improve competitiveness by fulfilling customer satisfaction through enhancement and improvement of product design quality. Over the past few years, several researchers have studied extensively the combination of Quality Function Deployment and Kano's model as design techniques, focusing on translating consumer desires into a product design. This paper presents a review and analysis of several publications related to the integration of Kano's model into the QFD process. International journal articles were selected, collected and analyzed from a number of relevant scientific publications. An in-depth analysis was performed, focusing on the results, advantages and drawbacks of each methodology. In addition, this paper provides an analysis related to the development of the methodology. It is hoped that this paper can serve as a reference for other researchers and manufacturing companies implementing the integrated QFD-Kano method for product design.

  3. Access to scientific publications: the scientist's perspective.

    PubMed

    Voronin, Yegor; Myrzahmetov, Askar; Bernstein, Alan

    2011-01-01

    Scientific publishing is undergoing significant changes due to the growth of online publications, increases in the number of open access journals, and policies of funders and universities requiring authors to ensure that their publications become publicly accessible. Most studies of the impact of these changes have focused on the growth of articles available through open access or the number of open-access journals. Here, we investigated access to publications at a number of institutes and universities around the world, focusing on publications in HIV vaccine research--an area of biomedical research with special importance to the developing world. We selected research papers in the HIV vaccine research field, creating: 1) a first set of the 50 most recently published papers with the keywords "HIV vaccine" and 2) a second set of 200 articles randomly selected from those cited in the first set. Access to the majority (80%) of the recently published articles required a subscription, while cited literature was much more accessible (67% freely available online). Subscriptions at a number of institutions around the world were assessed for providing access to subscription-only articles from the two sets. The access levels varied widely, ranging among institutions from 20% to 90%. Through the WHO-supported HINARI program, institutes in low-income countries had access comparable to that of institutes in the North. Finally, we examined the response rates for reprint requests sent to corresponding authors, a method commonly used before internet access became widespread. Contacting corresponding authors with requests for electronic copies of articles by email resulted in a 55-60% success rate, although in some cases it took up to 1.5 months to get a response. While research articles are increasingly available on the internet in open access format, institutional subscriptions continue to play an important role. However, subscriptions do not provide access to the full range of HIV vaccine research literature. Access to papers through subscriptions is complemented by a variety of other means, including emailing corresponding authors, joint affiliations, use of someone else's login information and posting requests on message boards. This complex picture makes it difficult to assess the real ability of scientists to access literature, but the observed differences in access levels between institutions suggest an unlevel playing field, in which some researchers have to spend more effort than others to obtain the same information.

  4. Trends in animal experimentation.

    PubMed

    Monteiro, Rosangela; Brandau, Ricardo; Gomes, Walter J; Braile, Domingo M

    2009-01-01

    The search for an understanding of the etiological factors, mechanisms and treatment of diseases has led to the development of several animal models over the last decades. To discuss aspects related to animal models of experimentation, animal choice and current trends in this field in our country. In addition, this study evaluated the frequency of experimental articles in medical journals. Five Brazilian journals indexed by LILACS, SciELO, MEDLINE, and recently incorporated into the Institute for Scientific Information Journal Citation Reports were analyzed. All the papers published in those journals, between 2007 and 2008, that used animal models were selected based on the abstracts. Of the total of 832 articles published in the period, 92 (11.1%) experimentation papers were selected. The proportion of experimental articles ranged from 5.2% to 17.9% of the global content of the journals. In the instructions to the authors, four (80%) journals presented explicit reference to the ethical principles in the conduction of studies with animals. Induced animal models represented 100% of the articles analyzed in this study. The rat was the most employed animal in the analyzed articles (78.3%). The present study can contribute by supplying input for the adoption of future editorial policies regarding the publication of animal research papers in the Brazilian Journal of Cardiovascular Surgery.

  5. Grasp specific and user friendly interface design for myoelectric hand prostheses.

    PubMed

    Mohammadi, Alireza; Lavranos, Jim; Howe, Rob; Choong, Peter; Oetomo, Denny

    2017-07-01

    This paper presents the design and characterisation of a hand prosthesis and its user interface, focusing on performing the most commonly used grasps in activities of daily living (ADLs). Since the operation of a multi-articulated powered hand prosthesis is difficult to learn and master, there is a significant rate of abandonment by amputees in preference for simpler devices. In choosing so, amputees choose to live with fewer features in their prosthesis so that it will more reliably perform the basic operations. In this paper, we look simultaneously at a hand prosthesis design method that aims for a small number of grasps, a low-complexity user interface, and an alternative to the current use of EMG for preshape selection, namely a simple button, to enable amputees to reach and execute the intended hand movements intuitively, quickly and reliably. An experiment is reported at the end of the paper comparing the speed and accuracy with which able-bodied naive subjects are able to select the intended preshapes through a simplified EMG method and a simple button. It is shown that the button was significantly superior in the speed of successful task completion and marginally superior in accuracy (success on first attempt).

  6. Selected results of the F-15 propulsion interactions program

    NASA Technical Reports Server (NTRS)

    Webb, L. D.; Nugent, J.

    1982-01-01

    A better understanding of propulsion system/airframe flow interactions could aid in the reduction of aircraft drag. For this purpose, NASA and the United States Air Force have conducted a series of wind-tunnel and flight tests on the F-15 airplane. This paper presents a correlation of flight test data from tests conducted at the NASA Dryden Flight Research Facility of the Ames Research Center, with data obtained from wind-tunnel tests. Flights were made at stabilized Mach numbers around 0.6, 0.9, 1.2, and 1.5 with accelerations up to near Mach number 2. Wind-tunnel tests used a 7.5 percent-scale F-15 inlet/airframe model. Flight and wind-tunnel pressure coefficients showed good agreement in most cases. Correlations of interaction effects caused by changes in cowl angle, angle of attack, and Mach number are presented. For the afterbody region, the pressure coefficients on the nozzle surfaces were influenced by boattail angles and Mach number. Boundary-layer thickness decreased as angle of attack increased above 4 deg.

  7. Numerical and analytical approaches to an advection-diffusion problem at small Reynolds number and large Péclet number

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel J.; Licata, Nicholas A.

    2018-05-01

    Obtaining a detailed understanding of the physical interactions between a cell and its environment often requires information about the flow of fluid surrounding the cell. Cells must be able to effectively absorb and discard material in order to survive. Strategies for nutrient acquisition and toxin disposal, which have been evolutionarily selected for their efficacy, should reflect knowledge of the physics underlying this mass transport problem. Motivated by these considerations, in this paper we discuss the results from an undergraduate research project on the advection-diffusion equation at small Reynolds number and large Péclet number. In particular, we consider the problem of mass transport for a Stokesian spherical swimmer. We approach the problem numerically and analytically through a rescaling of the concentration boundary layer. A biophysically motivated first-passage problem for the absorption of material by the swimming cell demonstrates quantitative agreement between the numerical and analytical approaches. We conclude by discussing the connections between our results and the design of smart toxin disposal systems.
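
    The dimensionless groups named in the title, and the steady advection-diffusion problem they govern, can be written in standard notation (the symbols below are conventional choices, not necessarily the paper's):

        \mathrm{Re} = \frac{\rho U a}{\mu}, \qquad \mathrm{Pe} = \frac{U a}{D}, \qquad \mathbf{u}\cdot\nabla c = D\,\nabla^{2} c,

    where U is the swimming speed, a the cell radius, \rho the fluid density, \mu the dynamic viscosity, D the solute diffusivity, \mathbf{u} the Stokes flow field around the swimmer, and c the concentration. Small Reynolds number means viscous forces dominate the flow, while large Péclet number means advection still dominates mass transport, producing the thin concentration boundary layer rescaled in the paper.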

  8. An evaluation of the utility and limitations of counting motor unit action potentials in the surface electromyogram

    NASA Astrophysics Data System (ADS)

    Zhou, Ping; Zev Rymer, William

    2004-12-01

    The number of motor unit action potentials (MUAPs) appearing in the surface electromyogram (EMG) signal is directly related to motor unit recruitment and firing rates and therefore offers potentially valuable information about the level of activation of the motoneuron pool. In this paper, based on morphological features of the surface MUAPs, we try to estimate the number of MUAPs present in the surface EMG by counting the negative peaks in the signal. Several signal processing procedures are applied to the surface EMG to facilitate this peak counting process. The MUAP number estimation performance by this approach is first illustrated using the surface EMG simulations. Then, by evaluating the peak counting results from the EMG records detected by a very selective surface electrode, at different contraction levels of the first dorsal interosseous (FDI) muscles, the utility and limitations of such direct peak counts for MUAP number estimation in surface EMG are further explored.
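
    The peak-counting idea can be sketched with a standard peak detector applied to the inverted (negative-peak) signal, as below. The synthetic signal, threshold and minimum peak spacing are illustrative assumptions, not the paper's processing chain.

      # Sketch of the peak-counting idea: count negative peaks in the surface EMG
      # by finding peaks in the inverted signal above a simple amplitude threshold.
      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(4)
      fs = 2000                                            # Hz, hypothetical sampling rate
      emg = rng.normal(scale=0.05, size=fs)                # 1 s of noise as a stand-in signal

      peaks, _ = find_peaks(-emg, height=0.1, distance=int(0.005 * fs))
      print("estimated number of MUAP negative peaks:", len(peaks))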

  9. Of Papers and Pens: Polysemes and Homophones in Lexical (mis)Selection.

    PubMed

    Li, Leon; Slevc, L Robert

    2017-05-01

    Every word signifies multiple senses. Many studies using comprehension-based measures suggest that polysemes' senses (e.g., paper as in printer paper or term paper) share lexical representations, whereas homophones' meanings (e.g., pen as in ballpoint pen or pig pen) correspond to distinct lexical representations. Less is known about the lexical representations of polysemes compared to homophones in language production. In this study, speakers named pictures after reading sentence fragments that primed polysemes and homophones either as direct competitors to pictures (i.e., semantic-competitors), or as indirect-competitors to pictures (e.g., polysemous senses of semantic competitors, or homophonous meanings of semantic competitors). Polysemes (e.g., paper) elicited equal numbers of intrusions to picture names (e.g., cardboard) compared to in control conditions whether primed as direct competitors (printer paper) or as indirect-competitors (term paper). This contrasted with the finding that homophones (e.g., pen) elicited more intrusions to picture names (e.g., crayon) compared to in control conditions when primed as direct competitors (ballpoint pen) than when primed as indirect-competitors (pig pen). These results suggest that polysemes, unlike homophones, are stored and retrieved as unified lexical representations. Copyright © 2016 Cognitive Science Society, Inc.

  10. ID card number detection algorithm based on convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhu, Jian; Ma, Hanjie; Feng, Jie; Dai, Leiyan

    2018-04-01

    In this paper, a new detection algorithm based on a Convolutional Neural Network is presented in order to realize fast and convenient ID information extraction in multiple scenarios. The algorithm uses a mobile device running the Android operating system to locate and extract the ID number. The distinctive color distribution of the ID card is used to select an appropriate channel component; image thresholding, noise processing and morphological processing are applied to binarize the image; image rotation and projection are used for horizontal correction when the image is tilted; finally, single characters are extracted by the projection method and recognized using a Convolutional Neural Network. Tests show that processing a single ID number image, from extraction to identification, takes about 80 ms with an accuracy rate of about 99%, so the method can be applied in real production and everyday environments.
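
    The projection-based character extraction step can be sketched as follows: binarize the ID-number strip with Otsu thresholding, clean it with a morphological opening, and split characters where the vertical projection profile drops to zero. The synthetic strip and parameter values are assumptions; the recognition stage (the Convolutional Neural Network) is not shown.

      # Sketch of projection-based segmentation: binarize the ID-number strip and
      # split characters at columns whose vertical projection falls to zero.
      import numpy as np
      import cv2

      # Hypothetical grayscale strip containing dark digits on a light background.
      strip = np.full((40, 300), 255, dtype=np.uint8)
      cv2.putText(strip, "1234567890", (5, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, 0, 2)

      _, binary = cv2.threshold(strip, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
      binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((2, 2), np.uint8))

      projection = binary.sum(axis=0)                      # vertical projection profile
      in_char, chars, start = False, [], 0
      for x, value in enumerate(projection):
          if value > 0 and not in_char:
              in_char, start = True, x
          elif value == 0 and in_char:
              in_char = False
              chars.append(binary[:, start:x])             # one character crop
      print("characters found:", len(chars))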

  11. Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression

    PubMed Central

    Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander

    2016-01-01

    By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143

  12. Habitat Selection Response of Small Pelagic Fish in Different Environments. Two Examples from the Oligotrophic Mediterranean Sea

    PubMed Central

    Bonanno, Angelo; Giannoulaki, Marianna; Barra, Marco; Basilone, Gualtiero; Machias, Athanassios; Genovese, Simona; Goncharov, Sergey; Popov, Sergey; Rumolo, Paola; Di Bitetto, Massimiliano; Aronica, Salvatore; Patti, Bernardo; Fontana, Ignazio; Giacalone, Giovanni; Ferreri, Rosalia; Buscaino, Giuseppa; Somarakis, Stylianos; Pyrounaki, Maria-Myrto; Tsoukali, Stavroula; Mazzola, Salvatore

    2014-01-01

    A number of scientific papers in the last few years singled out the influence of environmental conditions on the spatial distribution of fish species, highlighting the need for the fisheries scientific community to investigate, besides biomass estimates, also the habitat selection of commercially important fish species. The Mediterranean Sea, although generally oligotrophic, is characterized by high habitat variability and represents an ideal study area to investigate the adaptive behavior of small pelagics under different environmental conditions. In this study the habitat selection of European anchovy Engraulis encrasicolus and European sardine Sardina pilchardus is analyzed in two areas of the Mediterranean Sea that largely differentiate in terms of environmental regimes: the Strait of Sicily and the North Aegean Sea. A number of environmental parameters were used to investigate factors influencing anchovy and sardine habitat selection. Acoustic surveys data, collected during the summer period 2002–2010, were used for this purpose. The quotient analysis was used to identify the association between high density values and environmental variables; it was applied to the entire dataset in each area in order to identify similarities or differences in the “mean” spatial behavioral pattern for each species. Principal component analysis was applied to selected environmental variables in order to identify those environmental regimes which drive each of the two ecosystems. The analysis revealed the effect of food availability along with bottom depth selection on the spatial distribution of both species. Furthermore PCA results highlighted that observed selectivity for shallower waters is mainly associated to specific environmental processes that locally increase productivity. The common trends in habitat selection of the two species, as observed in the two regions although they present marked differences in hydrodynamics, seem to be driven by the oligotrophic character of the study areas, highlighting the role of areas where the local environmental regimes meet ‘the ocean triad hypothesis’. PMID:24992576

  13. Sunspot activity and influenza pandemics: a statistical assessment of the purported association.

    PubMed

    Towers, S

    2017-10-01

    Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.

  14. Research to policy and practice change: is capacity building in operational research delivering the goods?

    PubMed

    Zachariah, Rony; Guillerm, Nathalie; Berger, Selma; Kumar, Ajay M V; Satyanarayana, Srinath; Bissell, Karen; Edginton, Mary; Hinderaker, Sven Gudmund; Tayler-Smith, Katie; Van den Bergh, Rafael; Khogali, Mohammed; Manzi, Marcel; Reid, Anthony J; Ramsay, Andrew; Reeder, John C; Harries, Anthony D

    2014-09-01

    Between 2009 and 2012, eight operational research capacity building courses were completed in Paris (3), Luxembourg (1), India (1), Nepal (1), Kenya (1) and Fiji (1). Courses had strict milestones that were subsequently adopted by the Structured Operational Research and Training InitiaTive (SORT IT) of the World Health Organization. We report on the numbers of enrolled participants who successfully completed courses, the number of papers published and their reported effect on policy and/or practice. Retrospective cohort study including a survey. Participant selection criteria ensured that only those proposing specific programme-related and relevant operational research questions were selected. Effects on policy and/or practice were assessed in a standardised manner by two independent reviewers. Of 93 enrolled participants from 31 countries (14 in Africa, 13 in Asia, two in Latin America and two in South Pacific), 83 (89%) completed their courses. A total of 96 papers were submitted to scientific journals of which 89 (93%) were published and 88 assessed for effect on policy and practice. There was a reported effect in 65 (74%) studies including changes to programme implementation (27), adaptation of monitoring tools (24) and changes to existing guidelines (20). Three quarters of published operational research studies from these structured courses had reported effects on policy and/or practice. It is important that this type of tracking becomes a standard component of operational research and research in general. © 2014 John Wiley & Sons Ltd.

  15. Biomimetic Flow Control

    NASA Technical Reports Server (NTRS)

    Anders, John B.

    2000-01-01

    Biologic flight has undoubtedly intrigued man for thousands of years, yet it has been only the last 100 years or so that any serious challenge has been mounted to the pre-eminence of birds. Although present-day large-scale aircraft are now clearly able to fly higher, faster and farther than any bird or insect, it is obvious that these biological creatures have a mastery of low Reynolds number, unsteady flows that is unrivaled by man-made systems. This paper suggests that biological flight should be examined for mechanisms that may apply to engineered flight systems, especially in the emerging field of small-scale, uninhabited aerial vehicles (UAV). This paper discusses the kinematics and aerodynamics of bird and insect flight, including some aspects of unsteady aerodynamics. The dynamics of flapping wing flight is briefly examined, including gait selection, flapping frequency and amplitude selection, as well as wing planform and angle-of-attack dynamics. Unsteady aerodynamic mechanisms as practiced by small birds and insects are reviewed. Drag reduction morphologies of birds and marine animals are discussed and fruitful areas of research are suggested.

  16. Assessing information technologies for health.

    PubMed

    Kulikowski, C; Haux, R

    2006-01-01

    To provide an editorial introduction to the 2006 IMIA Yearbook of Medical Informatics with an overview of its contents and contributors. A brief overview of the main theme of 'Assessing Information Technology for Health Care', and an outline of the purposes, readership, contents, new format, and acknowledgment of contributions for the 2006 IMIA Yearbook. Assessing information technology (IT) in biomedicine and health care is emphasized in a number of survey and review articles. Synopses of a selection of best papers for the past 12 months are included, as are original papers on the history of medical informatics by pioneers in the field, and selected research and education programs. Information about IMIA and its constituent societies is given, as well as the authors, reviewers, and advisors to the Yearbook. The 2006 IMIA Yearbook of Medical Informatics highlights as its theme one of the most significant yet difficult aspects of information technology in health: the assessment of IT as part of the complex enterprise of biomedical research and practice. It is being published in a new format with a wide range of original survey and review articles.

  17. Milagro Observations of Potential TeV Emitters

    NASA Technical Reports Server (NTRS)

    Abdo, A. A.; Abeysekara, A. U.; Allen, B. T.; Aune, T.; Barber, A. S.; Berley, D.; Braun, J.; Chen, C.; Christopher, G. E.; DeYoung, T.; hide

    2014-01-01

    This paper reports the results from three targeted searches of Milagro TeV sky maps: two extragalactic point source lists and one pulsar source list. The first extragalactic candidate list consists of 709 candidates selected from the Fermi-LAT 2FGL catalog. The second extragalactic candidate list contains 31 candidates selected from the TeVCat source catalog that have been detected by imaging atmospheric Cherenkov telescopes (IACTs). In both extragalactic candidate lists Mkn 421 was the only source detected by Milagro. This paper presents the Milagro TeV flux for Mkn 421 and flux limits for the brighter Fermi-LAT extragalactic sources and for all TeVCat candidates. The pulsar list extends a previously published Milagro targeted search for Galactic sources. With the 32 new gamma-ray pulsars identified in 2FGL, the number of pulsars that are studied by both Fermi-LAT and Milagro is increased to 52. In this sample, we find that the probability of Milagro detecting a TeV emission coincident with a pulsar increases with the GeV flux observed by the Fermi-LAT in the energy range from 0.1 GeV to 100 GeV.

  18. Longitudinal-control design approach for high-angle-of-attack aircraft

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.

    1993-01-01

    This paper describes a control synthesis methodology that emphasizes a variable-gain output feedback technique that is applied to the longitudinal channel of a high-angle-of-attack aircraft. The aircraft is a modified F/A-18 aircraft with thrust-vectored controls. The flight regime covers a range up to a Mach number of 0.7; an altitude range from 15,000 to 35,000 ft; and an angle-of-attack (alpha) range up to 70 deg, which is deep into the poststall region. A brief overview is given of the variable-gain mathematical formulation as well as a description of the discrete control structure used for the feedback controller. This paper also presents an approximate design procedure with relationships for the optimal weights for the selected feedback control structure. These weights are selected to meet control design guidelines for high-alpha flight controls. Those guidelines that apply to the longitudinal-control design are also summarized. A unique approach is presented for the feed-forward command generator to obtain smooth transitions between load factor and alpha commands. Finally, representative linear analysis results and nonlinear batch simulation results are provided.

  19. Improving Conceptual Understanding and Representation Skills Through Excel-Based Modeling

    NASA Astrophysics Data System (ADS)

    Malone, Kathy L.; Schunn, Christian D.; Schuchardt, Anita M.

    2018-02-01

    The National Research Council framework for science education and the Next Generation Science Standards have created a need for additional research and development of curricula that are both technologically model-based and include engineering practices. This is especially the case for biology education. This paper describes a quasi-experimental design study to test the effectiveness of a model-based curriculum focused on the concepts of natural selection and population ecology that makes use of Excel modeling tools (Modeling Instruction in Biology with Excel, MBI-E). The curriculum revolves around the bio-engineering practice of controlling an invasive species. The study took place in the Midwest within ten high schools teaching a regular-level introductory biology class. A post-test was designed that targeted a number of common misconceptions in both concept areas as well as representational usage. The post-test results demonstrate that the MBI-E students significantly outperformed the traditional classes in both natural selection and population ecology concepts, thus overcoming a number of misconceptions. In addition, the MBI-E students made greater use of multiple representations and demonstrated greater fascination with science.

  20. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    PubMed Central

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-01

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier. PMID:28124985
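
    As a rough illustration of the wrapper scheme the abstract describes (a genetic algorithm searching over binary feature masks, scored by a neural-network classifier), the following Python sketch uses synthetic data and scikit-learn; the population size, mutation rate and classifier settings are arbitrary assumptions, not the authors' configuration.

      # Genetic-algorithm feature selection wrapped around an MLP classifier (illustrative sketch).
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X, y = make_classification(n_samples=200, n_features=40, n_informative=6, random_state=0)

      def fitness(mask):
          """Cross-validated accuracy of the classifier on the selected features."""
          if mask.sum() == 0:
              return 0.0
          clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300, random_state=0)
          return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

      # GA over binary masks: tournament selection, uniform crossover, bit-flip mutation.
      pop_size, n_gen, p_mut = 12, 10, 0.02
      pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
      scores = np.array([fitness(ind) for ind in pop])

      for _ in range(n_gen):
          children = []
          for _ in range(pop_size):
              i, j = rng.choice(pop_size, 2, replace=False)   # tournament of size 2
              parent_a = pop[i] if scores[i] >= scores[j] else pop[j]
              i, j = rng.choice(pop_size, 2, replace=False)
              parent_b = pop[i] if scores[i] >= scores[j] else pop[j]
              cross = rng.random(X.shape[1]) < 0.5            # uniform crossover
              child = np.where(cross, parent_a, parent_b)
              flip = rng.random(X.shape[1]) < p_mut            # bit-flip mutation
              child = np.where(flip, 1 - child, child)
              children.append(child)
          pop = np.array(children)
          scores = np.array([fitness(ind) for ind in pop])

      best = pop[scores.argmax()]
      print("selected features:", np.flatnonzero(best), "accuracy:", scores.max())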

  1. Traffic engineering and regenerator placement in GMPLS networks with restoration

    NASA Astrophysics Data System (ADS)

    Yetginer, Emre; Karasan, Ezhan

    2002-07-01

    In this paper we study regenerator placement and traffic engineering of restorable paths in Generalized Multiprotocol Label Switching (GMPLS) networks. Regenerators are necessary in optical networks due to transmission impairments. We study a network architecture where there are regenerators at selected nodes and we propose two heuristic algorithms for the regenerator placement problem. Performances of these algorithms in terms of the required number of regenerators and computational complexity are evaluated. In this network architecture with sparse regeneration, offline computation of working and restoration paths is studied with bandwidth reservation and path rerouting as the restoration scheme. We study two approaches for selecting working and restoration paths from a set of candidate paths and formulate each method as an Integer Linear Programming (ILP) problem. A traffic uncertainty model is developed in order to compare these methods based on their robustness with respect to changing traffic patterns. Traffic engineering methods are compared based on the number of additional demands due to traffic uncertainty that can be carried. Regenerator placement algorithms are also evaluated from a traffic engineering point of view.

  2. All-optical OXC transition strategy from WDM optical network to elastic optical network.

    PubMed

    Chen, Xin; Li, Juhao; Guo, Bingli; Zhu, Paikun; Tang, Ruizhi; Chen, Zhangyuan; He, Yongqi

    2016-02-22

    Elastic optical network (EON) has been proposed recently as a spectrum-efficient optical layer to adapt to rapidly increasing traffic demands, in place of the currently deployed wavelength-division-multiplexing (WDM) optical network. In contrast with conventional WDM optical cross-connects (OXCs) based on wavelength selective switches (WSSs), EON OXCs are based on spectrum selective switches (SSSs), which are much more expensive than WSSs, especially for large-scale switching architectures. The transition cost from WDM OXCs to EON OXCs is therefore a major obstacle to realizing EON. In this paper, we propose and experimentally demonstrate a transition OXC (TOXC) structure based on 2-stage cascading switching architectures, which makes full use of the WSSs available in currently deployed WDM OXCs to reduce the number and port count of required SSSs. Moreover, we propose a contention-aware spectrum allocation (CASA) scheme for EON built with the proposed TOXCs. We show by simulation that the TOXCs reduce the network capital expenditure of transitioning from a WDM optical network to EON by about 50%, with a minor traffic blocking performance degradation and about a 10% reduction in accommodated traffic compared with all-SSS EON OXC architectures.

  3. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface.

    PubMed

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-23

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.

  4. Computer supported collaborative learning in a clerkship: an exploratory study on the relation of discussion activity and revision of critical appraisal papers.

    PubMed

    Koops, Willem J M; van der Vleuten, Cees P M; de Leng, Bas A; Snoeckx, Luc H E H

    2012-08-20

    Medical students in clerkship are continuously confronted with real and relevant patient problems. To support clinical problem-solving skills, students perform a Critical Appraisal of a Topic (CAT) task, often resulting in a paper. Because such a paper may contain errors, students could profit from discussion with peers, leading to paper revision. Earlier work on peer discussion in a Computer Supported Collaborative Learning (CSCL) environment has shown that medical students perceive a subjective improvement of knowledge, and that high student activity during CSCL discussions is associated with more task-focussed discussion, reflecting higher levels of knowledge construction. However, it remains unclear whether high discussion activity influences students' decisions to revise their CAT paper. The aim of this research is to examine whether students who revise their critical appraisal papers after discussion in a CSCL environment show more task-focussed activity and discuss critical appraisal topics more intensively than students who do not revise their papers. Forty-seven medical students, stratified in subgroups, participated in a structured asynchronous online discussion of individually written CAT papers on self-selected clinical problems. The discussion was structured by three critical appraisal topics. After the discussion, the students could revise their paper. For analysis purposes, all students' postings were blinded and analysed by the investigator, who was unaware of student characteristics and of whether or not the paper was revised. Postings were counted and analysed by an independent rater and assigned to outside activity, non-task-focussed activity or task-focussed activity. Additionally, postings were assigned to one of the three critical appraisal topics. Analysis results were compared between revised and unrevised papers. Twenty-four papers (51.6%) were revised after the online discussion. The discussions of the revised papers showed significantly higher numbers of postings, more task-focussed activities, and more postings about two of the three critical appraisal topics: "appraisal of the selected article(s)" and "relevant conclusion regarding the clinical problem". A CSCL environment can support medical students in the execution and critical appraisal of authentic tasks in the clinical workplace. Revision of CAT papers appears to be related to discussion activity, more specifically to high task-focussed activity on critical appraisal topics.

  5. Firefly as a novel swarm intelligence variable selection method in spectroscopy.

    PubMed

    Goodarzi, Mohammad; dos Santos Coelho, Leandro

    2014-12-10

    A critical step in multivariate calibration is wavelength selection, which is used to build models with better prediction performance when applied to spectral data. Many feature selection techniques have been developed to date. Among them, those based on swarm intelligence optimization are particularly interesting, since they simulate the collective behavior of animals and insects, for example in finding the shortest path between a food source and the nest. Decisions are made by the swarm as a whole, leading to a more robust search that is less prone to becoming trapped in local minima during the optimization cycle. This paper presents a novel feature selection approach for spectroscopic data, leading to more robust calibration models. The performance of the firefly algorithm, a swarm intelligence paradigm, was evaluated and compared with a genetic algorithm and particle swarm optimization. All three techniques were coupled with partial least squares (PLS) and applied to three spectroscopic data sets. They demonstrate improved prediction results compared with a PLS model built using all wavelengths. The results show that the firefly algorithm, as a novel swarm paradigm, leads to a smaller number of selected wavelengths while the prediction performance of the resulting PLS model remains the same. Copyright © 2014. Published by Elsevier B.V.
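
    A minimal sketch of how a firefly-style search can be wrapped around PLS for wavelength selection, assuming synthetic spectra and standard firefly parameters (alpha, beta0, gamma); it illustrates the general scheme only and is not the authors' implementation.

      # Firefly-style wavelength selection coupled with PLS cross-validation (illustrative sketch).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_samples, n_wavelengths = 60, 120
      X = rng.normal(size=(n_samples, n_wavelengths))                 # synthetic "spectra"
      y = X[:, 10] - 0.5 * X[:, 55] + 0.1 * rng.normal(size=n_samples)

      def brightness(position):
          """Negative cross-validated MSE of a PLS model on the selected wavelengths."""
          mask = position > 0.5
          if mask.sum() < 2:                       # PLS needs at least a couple of variables
              return -1e6
          pls = PLSRegression(n_components=2)
          return cross_val_score(pls, X[:, mask], y, cv=5,
                                 scoring="neg_mean_squared_error").mean()

      n_fireflies, n_iter = 10, 20
      alpha, beta0, gamma = 0.2, 1.0, 0.1
      pos = rng.random((n_fireflies, n_wavelengths))                  # continuous encoding in [0, 1]
      light = np.array([brightness(p) for p in pos])

      for _ in range(n_iter):
          for i in range(n_fireflies):
              for j in range(n_fireflies):
                  if light[j] > light[i]:                             # move i toward the brighter j
                      r2 = np.sum((pos[i] - pos[j]) ** 2)
                      beta = beta0 * np.exp(-gamma * r2)
                      pos[i] += beta * (pos[j] - pos[i]) + alpha * (rng.random(n_wavelengths) - 0.5)
                      pos[i] = np.clip(pos[i], 0.0, 1.0)
                      light[i] = brightness(pos[i])

      best = pos[light.argmax()] > 0.5
      print("selected wavelengths:", np.flatnonzero(best), "CV score:", light.max())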

  6. Ten Steps to Improve Quality of the Journal Materia Socio-Medica

    PubMed Central

    Donev, Doncho M.; Masic, Izet

    2017-01-01

    Introduction and aim: Materia Socio-Medica is one of the oldest public health journals in Europe, established in 1978, and among the most important journals for public health in South-Eastern Europe. The Journal covers all important public health professional, academic and research areas in this field. The aim of the paper is to analyze the journal's articles and statistical facts in 2016 and to point out the directions for action and planned further activities for improving the quality of the published papers and the visibility of the journal. Methods: Review and analysis of the documentation and production of the journal, records of submitted and rejected manuscripts, and papers published in 2016. Results: A total of 111 articles was published in Materia Socio-Medica during 2016. Most of them were original articles (64.5%). Articles from the fields of health promotion and prevention were predominant (82.7%), which is one of the primary scopes of the journal. The authors of the articles published in 2016 came from three continents (Europe, Asia and North America) and 15 different countries. The largest number of articles was submitted by authors from the country of origin of the journal, Bosnia and Herzegovina. The acceptance rate of Materia Socio-Medica in 2016 was 35.7%. A total of 116 reviewers participated in the manuscript review process in 2016. Conclusion: Materia Socio-Medica will continue to improve the quality of the published papers in 2017 and beyond through education of potential authors, reviewers and Editorial Board members, quality selection of reviewers, supportive editing of articles, and clearly defined instructions and ethical standards of the journal. PMID:28484345

  7. Improving data analysis in herpetology: Using Akaike's information criterion (AIC) to assess the strength of biological hypotheses

    USGS Publications Warehouse

    Mazerolle, M.J.

    2006-01-01

    In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Though better tools surfaced in the mid-1970s, they are still underutilized in certain fields, particularly in herpetology. This is the case of the Akaike information criterion (AIC), which is remarkably superior in model selection (i.e., variable selection) to hypothesis-based approaches. It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework, with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
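
    To make the procedure concrete, here is a small numpy sketch of AIC computation, Akaike weights, and a model-averaged coefficient over all subsets of two hypothetical predictors; the data and variable names are illustrative, not from the paper.

      # AIC-based model selection and model averaging for candidate linear regressions (illustrative sketch).
      import itertools
      import numpy as np

      rng = np.random.default_rng(0)
      n = 50
      habitat = rng.normal(size=n)
      rainfall = rng.normal(size=n)
      counts = 2.0 + 1.5 * habitat + 0.2 * rainfall + rng.normal(scale=0.5, size=n)

      predictors = {"habitat": habitat, "rainfall": rainfall}
      results = []
      for k in range(len(predictors) + 1):
          for subset in itertools.combinations(predictors, k):
              X = np.column_stack([np.ones(n)] + [predictors[p] for p in subset])
              beta, *_ = np.linalg.lstsq(X, counts, rcond=None)
              rss = np.sum((counts - X @ beta) ** 2)
              n_par = X.shape[1] + 1                       # coefficients plus error variance
              aic = n * np.log(rss / n) + 2 * n_par        # Gaussian AIC up to an additive constant
              results.append((subset, beta, aic))

      aics = np.array([r[2] for r in results])
      delta = aics - aics.min()
      weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()   # Akaike weights

      # Model-averaged estimate for 'habitat' (coefficient taken as 0 in models excluding it).
      avg = sum(w * (r[1][list(r[0]).index("habitat") + 1] if "habitat" in r[0] else 0.0)
                for w, r in zip(weights, results))
      print("Akaike weights:", dict(zip([r[0] for r in results], weights.round(3))))
      print("model-averaged habitat effect:", round(avg, 3))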

  8. Fermentation process tracking through enhanced spectral calibration modeling.

    PubMed

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will lead to a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths, and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that combines a wavelength selection procedure, spectral window selection (SWS), in which windows of wavelengths are automatically selected and subsequently used as the basis of the calibration model. Because the selected windows are not unique when the algorithm is executed repeatedly, multiple models are constructed and then combined using stacking, thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.

  9. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  10. Statistical properties of the Jukes-Holmquist method of estimating the number of nucleotide substitutions: reply to Holmquist and Conroy's criticism.

    PubMed

    Nei, M; Tateno, Y

    1981-01-01

    Conducting computer simulations, Nei and Tateno (1978) have shown that Jukes and Holmquist's (1972) method of estimating the number of nucleotide substitutions tends to give an overestimate and the estimate obtained has a large variance. Holmquist and Conroy (1980) repeated some parts of our simulation and claim that the overestimation of nucleotide substitutions in our paper occurred mainly because we used selected data. Examination of Holmquist and Conroy's simulation indicates that their results are essentially the same as ours when the Jukes-Holmquist method is used, but since they used a different method of computation their estimates of nucleotide substitutions differed substantially from ours. Another problem in Holmquist and Conroy's Letter is that they confused the expected number of nucleotide substitutions with the number in a sample. This confusion has resulted in a number of unnecessary arguments. They also criticized our chi-square measure, but this criticism is apparently due to a misunderstanding of the assumptions of our method and a failure to use our method in the way we described. We believe that our earlier conclusions remain unchanged.

  11. Sample selection via angular distance in the space of the arguments of an artificial neural network

    NASA Astrophysics Data System (ADS)

    Fernández Jaramillo, J. M.; Mayerle, R.

    2018-05-01

    In the construction of an artificial neural network (ANN), a proper splitting of the available samples plays a major role in the training process. This selection of subsets for training, testing and validation affects the generalization ability of the neural network. The number of samples also has an impact on the time required for the design and training of the ANN. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients, while keeping the diversity of the samples and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed between samples is smaller than a defined threshold, only one input is accepted for training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the quality of the outputs is poor and depends on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are greatly reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
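
    A minimal sketch of the angular screening rule described above: a candidate training sample is kept only if the angle between its input vector and every previously accepted input exceeds a threshold. The threshold value and the data below are illustrative assumptions.

      # Greedy sample selection by angular distance between input vectors (illustrative sketch).
      import numpy as np

      def select_by_angle(inputs, threshold_deg=5.0):
          """Keep inputs whose angle to all previously accepted inputs exceeds the threshold."""
          accepted = [0]                                    # result depends on the first sample kept
          cos_thr = np.cos(np.deg2rad(threshold_deg))
          for i in range(1, len(inputs)):
              v = inputs[i] / np.linalg.norm(inputs[i])
              kept = inputs[accepted] / np.linalg.norm(inputs[accepted], axis=1, keepdims=True)
              if np.all(kept @ v < cos_thr):                # every angle larger than the threshold
                  accepted.append(i)
          return np.array(accepted)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 8))                        # e.g. lagged windows of a tidal record
      idx = select_by_angle(X, threshold_deg=20.0)
      print(f"kept {len(idx)} of {len(X)} samples for training")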

  12. Variation and evolution of male sex combs in Drosophila: nature of selection response and theories of genetic variation for sexual traits.

    PubMed

    Ahuja, Abha; Singh, Rama S

    2008-05-01

    We investigated the genetic architecture of variation in male sex comb bristle number, a rapidly evolving secondary sexual character of Drosophila. Twenty-four generations of divergent artificial selection for sex comb bristle number in a heterogeneous population of Drosophila melanogaster resulted in a significant response that was more pronounced in the direction of low bristle numbers. We observed a strong positive correlated response to selection in the corresponding female transverse bristle row. The correlated response in male abdominal and sternopleural bristle numbers, on the other hand, did not follow the same pattern as sex comb bristle number differences between selection lines. Relaxation-of-selection experiments along with mate choice and fecundity assays using the selection lines developed demonstrated the action of stabilizing selection on sex comb bristle number. Our results show (1) substantial genetic variation underlying sex comb bristle number variation; (2) a weak relationship between the sex comb and developmentally related, non-sex bristle systems; and (3) that sexual selection may be a driving force in sex comb evolution, indicating the potential of sex combs to diversify rapidly during population differentiation and speciation. We discuss the implications of these results for theories of genetic variation in display and nondisplay male sex traits.

  13. Polymeric film application for phase change heat transfer

    NASA Astrophysics Data System (ADS)

    Bart, Hans-Jörg; Dreiser, Christian

    2018-06-01

    The paper gives a concise review on polymer film heat exchangers (PFHX) with a focus on polyether ether ketone (PEEK) foil as heat transfer element, mechanically supported by a grid structure. In order to promote PFHX applications, heat transfer performance and wetting behavior are studied in detail. Surface modifications to improve wetting are discussed and correlations are presented for critical Reynolds numbers to sustain a stable liquid film. Scaling phenomena related to surface properties and easily adaptable cleaning-in-place (CIP) procedures are also addressed. The contribution of the foil thickness and material selection on thermal performance is quantified and a correlation for enhanced aqueous film heat transfer for the grid supported PFHX is given. The basic research results and the design criteria enable early stage material selection and conceptual apparatus design.

  14. Material radioassay and selection for the XENON1T dark matter experiment

    NASA Astrophysics Data System (ADS)

    Aprile, E.; Aalbers, J.; Agostini, F.; Alfonsi, M.; Amaro, F. D.; Anthony, M.; Arneodo, F.; Barrow, P.; Baudis, L.; Bauermeister, B.; Benabderrahmane, M. L.; Berger, T.; Breur, P. A.; Brown, A.; Brown, E.; Bruenner, S.; Bruno, G.; Budnik, R.; Bütikofer, L.; Calvén, J.; Cardoso, J. M. R.; Cervantes, M.; Cichon, D.; Coderre, D.; Colijn, A. P.; Conrad, J.; Cussonneau, J. P.; Decowski, M. P.; de Perio, P.; Di Gangi, P.; Di Giovanni, A.; Diglio, S.; Eurin, G.; Fei, J.; Ferella, A. D.; Fieguth, A.; Franco, D.; Fulgione, W.; Gallo Rosso, A.; Galloway, M.; Gao, F.; Garbini, M.; Geis, C.; Goetzke, L. W.; Grandi, L.; Greene, Z.; Grignon, C.; Hasterok, C.; Hogenbirk, E.; Itay, R.; Kaminsky, B.; Kessler, G.; Kish, A.; Landsman, H.; Lang, R. F.; Lellouch, D.; Levinson, L.; Le Calloch, M.; Lin, Q.; Lindemann, S.; Lindner, M.; Lopes, J. A. M.; Manfredini, A.; Maris, I.; Marrodán Undagoitia, T.; Masbou, J.; Massoli, F. V.; Masson, D.; Mayani, D.; Messina, M.; Micheneau, K.; Miguez, B.; Molinario, A.; Murra, M.; Naganoma, J.; Ni, K.; Oberlack, U.; Pakarha, P.; Pelssers, B.; Persiani, R.; Piastra, F.; Pienaar, J.; Piro, M.-C.; Pizzella, V.; Plante, G.; Priel, N.; Rauch, L.; Reichard, S.; Reuter, C.; Rizzo, A.; Rosendahl, S.; Rupp, N.; Saldanha, R.; dos Santos, J. M. F.; Sartorelli, G.; Scheibelhut, M.; Schindler, S.; Schreiner, J.; Schumann, M.; Scotto Lavina, L.; Selvi, M.; Shagin, P.; Shockley, E.; Silva, M.; Simgen, H.; Sivers, M. v.; Stein, A.; Thers, D.; Tiseni, A.; Trinchero, G.; Tunnell, C.; Upole, N.; Wang, H.; Wei, Y.; Weinheimer, C.; Wulf, J.; Ye, J.; Zhang, Y.; Laubenstein, M.; Nisi, S.

    2017-12-01

    The XENON1T dark matter experiment aims to detect weakly interacting massive particles (WIMPs) through low-energy interactions with xenon atoms. To detect such a rare event necessitates the use of radiopure materials to minimize the number of background events within the expected WIMP signal region. In this paper we report the results of an extensive material radioassay campaign for the XENON1T experiment. Using gamma-ray spectroscopy and mass spectrometry techniques, systematic measurements of trace radioactive impurities in over one hundred samples within a wide range of materials were performed. The measured activities allowed for stringent selection and placement of materials during the detector construction phase and provided the input for XENON1T detection sensitivity estimates through Monte Carlo simulations.

  15. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

    The purpose of the paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness with a predefined maximum risk tolerance and minimum expected return. The security returns in the objectives and constraints are assumed to be fuzzy random variables, and the vagueness of these fuzzy random variables is transformed into fuzzy variables similar to trapezoidal numbers. The resulting fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.
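
    For orientation, the short numpy sketch below computes the three crisp ingredients of such a model (expected return, semivariance and skewness) for a fixed weight vector from simulated return scenarios; the fuzzy random treatment and the optimization itself are not reproduced here, and all numbers are illustrative.

      # Crisp portfolio moments used by a mean-semivariance-skewness model (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(0)
      scenarios = rng.normal(loc=[0.01, 0.02, 0.015], scale=[0.02, 0.05, 0.03],
                             size=(500, 3))                 # simulated security returns
      weights = np.array([0.4, 0.3, 0.3])                   # candidate portfolio

      returns = scenarios @ weights
      mean = returns.mean()
      semivariance = np.mean(np.minimum(returns - mean, 0.0) ** 2)   # downside risk
      skewness = np.mean((returns - mean) ** 3) / returns.std() ** 3

      # The model would maximize skewness subject to mean >= r_min and
      # semivariance <= risk_max; here we simply report the three quantities.
      print(mean, semivariance, skewness)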

  16. GR 20 parallel session A3: modified gravity

    NASA Astrophysics Data System (ADS)

    Hořava, Petr; Mohd, Arif; Melby-Thompson, Charles M.; Shawhan, Peter

    2014-05-01

    The parallel session (A3), on "Modified Gravity", enjoyed one of the largest numbers of abstract submissions (over 80), resulting in the selection of 24 oral presentations. The three short papers presented in the following sections are based on the session talks by Arif Mohd on thermodynamics of universal horizons in Einstein-Æther theory, by Charles Melby-Thompson on conformal anomalies in Hořava-Lifshitz gravity, and by Peter Shawhan on the detectability of scalar gravitational waves by LIGO and Virgo. They have been selected as a representative sample, to illustrate some of the best in the remarkable and encouraging variety of topics discussed in the session—ranging from highly theoretical, to phenomenological, observational, and experimental—with all these areas playing an integral part in our quest to understand the limits of standard general relativity.

  17. Neural architecture design based on extreme learning machine.

    PubMed

    Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis

    2013-12-01

    Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons, and the corresponding interconnection weights. This problem has been widely studied, but existing solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides a high generalization capability and a unique solution for the architecture design. Moreover, the selected final network only retains those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
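
    The core ELM step the paper builds on can be sketched in a few lines: random, untrained hidden-layer weights followed by a closed-form least-squares solve for the output weights. The data and layer sizes below are illustrative, and the architecture-pruning logic of the paper is omitted.

      # Minimal Extreme Learning Machine: random hidden weights, least-squares output weights.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 10))                        # inputs
      y = (X[:, 0] + X[:, 3] > 0).astype(float)             # binary targets

      n_hidden = 50
      W = rng.normal(size=(X.shape[1], n_hidden))           # random input weights (never trained)
      b = rng.normal(size=n_hidden)
      H = np.tanh(X @ W + b)                                # hidden-layer activations

      beta, *_ = np.linalg.lstsq(H, y, rcond=None)          # output weights by least squares

      pred = (np.tanh(X @ W + b) @ beta > 0.5).astype(float)
      print("training accuracy:", (pred == y).mean())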

  18. Polymeric film application for phase change heat transfer

    NASA Astrophysics Data System (ADS)

    Bart, Hans-Jörg; Dreiser, Christian

    2018-01-01

    The paper gives a concise review on polymer film heat exchangers (PFHX) with a focus on polyether ether ketone (PEEK) foil as heat transfer element, mechanically supported by a grid structure. In order to promote PFHX applications, heat transfer performance and wetting behavior are studied in detail. Surface modifications to improve wetting are discussed and correlations are presented for critical Reynolds numbers to sustain a stable liquid film. Scaling phenomena related to surface properties and easily adaptable cleaning-in-place (CIP) procedures are further content. The contribution of the foil thickness and material selection on thermal performance is quantified and a correlation for enhanced aqueous film heat transfer for the grid supported PFHX is given. The basic research results and the design criteria enable early stage material selection and conceptual apparatus design.

  19. Determination of the optimal number of components in independent components analysis.

    PubMed

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered as one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. Like other similar methods, the determination of the optimal number of latent variables, in this case, independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide about the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first one, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks, and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested using varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were relevant and in line with expected ones, proving the reliability of the three proposed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
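
    A hedged sketch of the block-splitting idea behind ICA_by_blocks and Random_ICA: split the samples into two random blocks, extract the same number of ICs from each, and count how many components match across the blocks. The correlation threshold and the matching rule here are assumptions made for illustration.

      # Block-splitting stability check for choosing the number of ICs (illustrative sketch).
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      S = rng.laplace(size=(400, 3))                         # three independent sources
      A = rng.normal(size=(3, 20))
      X = S @ A + 0.05 * rng.normal(size=(400, 20))          # mixed signals (samples x variables)

      def matched_components(X, n_ics, rng):
          idx = rng.permutation(len(X))
          half = len(X) // 2
          blocks = X[idx[:half]], X[idx[half:]]
          comps = [FastICA(n_components=n_ics, random_state=0).fit(b).components_
                   for b in blocks]                          # shape (n_ics, n_variables) each
          corr = np.abs(np.corrcoef(comps[0], comps[1])[:n_ics, n_ics:])
          # Count ICs from block 1 that have a strongly correlated counterpart in block 2.
          return int(np.sum(corr.max(axis=1) > 0.9))

      for k in range(1, 7):
          print(k, "ICs ->", matched_components(X, k, rng), "matched across blocks")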

  20. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time

    PubMed Central

    Avellar, Gustavo S. C.; Pereira, Guilherme A. S.; Pimenta, Luciano C. A.; Iscold, Paulo

    2015-01-01

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem’s (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles’ maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs. PMID:26540055

  1. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time.

    PubMed

    Avellar, Gustavo S C; Pereira, Guilherme A S; Pimenta, Luciano C A; Iscold, Paulo

    2015-11-02

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem's (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles' maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs.

  2. Deleting 'irrational' responses from discrete choice experiments: a case of investigating or imposing preferences?

    PubMed

    Lancsar, Emily; Louviere, Jordan

    2006-08-01

    Investigation of the 'rationality' of responses to discrete choice experiments (DCEs) has been a theme of research in health economics. Responses have been deleted from DCEs where they have been deemed by researchers to (a) be 'irrational', defined by such studies as failing tests for non-satiation, or (b) represent lexicographic preferences. This paper outlines a number of reasons why deleting responses from DCEs may be inappropriate after first reviewing the theory underpinning rationality, highlighting that the importance placed on rationality depends on the approach to consumer theory to which one ascribes. The aim of this paper is not to suggest that all preferences elicited via DCEs are rational. Instead, it is to suggest a number of reasons why it may not be the case that all preferences labelled as 'irrational' are indeed so. Hence, deleting responses may result in the removal of valid preferences; induce sample selection bias; and reduce the statistical efficiency and power of the estimated choice models. Further, evidence suggests random utility theory may be able to cope with such preferences. Finally, we discuss a number of implications for the design, implementation and interpretation of DCEs and recommend caution regarding the deletion of preferences from stated preference experiments. Copyright 2006 John Wiley & Sons, Ltd.

  3. 32 CFR 1615.6 - Selective service number.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Selective service number. 1615.6 Section 1615.6 National Defense Other Regulations Relating to National Defense SELECTIVE SERVICE SYSTEM ADMINISTRATION OF... Social Security Account Number will not be used for this purpose. ...

  4. A new randomized Kaczmarz based kernel canonical correlation analysis algorithm with applications to information retrieval.

    PubMed

    Cai, Jia; Tang, Yi

    2018-02-01

    Canonical correlation analysis (CCA) is a powerful statistical tool for detecting the linear relationship between two sets of multivariate variables. A kernel generalization, namely kernel CCA, has been proposed to describe nonlinear relationships between two variables. Although kernel CCA can achieve dimensionality reduction for high-dimensional feature selection problems, it also suffers from the so-called over-fitting phenomenon. In this paper, we consider a new kernel CCA algorithm based on the randomized Kaczmarz method. The main contributions of the paper are: (1) a new kernel CCA algorithm is developed, (2) the theoretical convergence of the proposed algorithm is addressed by means of the scaled condition number, and (3) a lower bound on the minimum number of iterations is presented. We test on both a synthetic dataset and several real-world datasets in cross-language document retrieval and content-based image retrieval to demonstrate the effectiveness of the proposed algorithm. Numerical results demonstrate the performance and efficiency of the new algorithm, which is competitive with several state-of-the-art kernel CCA methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
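
    For reference, the randomized Kaczmarz iteration itself (for a plain linear system Ax = b) is shown below; the kernel CCA embedding described in the paper is not reproduced, and the data are synthetic.

      # Randomized Kaczmarz for Ax = b: rows sampled proportionally to their squared norm.
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(500, 20))
      x_true = rng.normal(size=20)
      b = A @ x_true

      row_norms = np.sum(A ** 2, axis=1)
      probs = row_norms / row_norms.sum()                    # sampling distribution over rows

      x = np.zeros(20)
      for _ in range(5000):
          i = rng.choice(len(A), p=probs)
          x += (b[i] - A[i] @ x) / row_norms[i] * A[i]       # project onto the i-th hyperplane

      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))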

  5. 75 FR 7548 - Amendments to the Select Agents Controls in Export Control Classification Number (ECCN) 1C360 on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ...-91434-01] RIN 0694-AE67 Amendments to the Select Agents Controls in Export Control Classification Number... controls on certain select agents identified in Export Control Classification Number (ECCN) 1C360 on the...) list of select agents and toxins. The changes made by APHIS were part of a biennial review and...

  6. IJS: An Intelligent Junction Selection Based Routing Protocol for VANET to Support ITS Services.

    PubMed

    Bhoi, Sourav Kumar; Khilar, Pabitra Mohan

    2014-01-01

    Selecting junctions intelligently for data transmission provides better intelligent transportation system (ITS) services. The main problem in vehicular communication is the high disturbance of link connectivity due to mobility and low vehicle density. If link conditions are predicted earlier, there is less chance of performance degradation. In this paper, an intelligent junction selection based routing protocol (IJS) is proposed to transmit data along the quickest path, in which the vehicles are mostly connected and have fewer link connectivity problems. In this protocol, a helping vehicle is placed at every junction to control the communication by predicting link failures or network gaps in a route. The helping vehicle at a junction produces a score for every neighboring junction for forwarding the data towards the destination, taking the current traffic information into account, and selects the junction with the minimum score. The IJS protocol is implemented and compared with the GyTAR, A-STAR, and GSR routing protocols. Simulation results show that IJS performs better in terms of average end-to-end delay, network gap encounters, and number of hops.

  7. IJS: An Intelligent Junction Selection Based Routing Protocol for VANET to Support ITS Services

    PubMed Central

    Khilar, Pabitra Mohan

    2014-01-01

    Selecting junctions intelligently for data transmission provides better intelligent transportation system (ITS) services. The main problem in vehicular communication is the high disturbance of link connectivity due to mobility and low vehicle density. If link conditions are predicted earlier, there is less chance of performance degradation. In this paper, an intelligent junction selection based routing protocol (IJS) is proposed to transmit data along the quickest path, in which the vehicles are mostly connected and have fewer link connectivity problems. In this protocol, a helping vehicle is placed at every junction to control the communication by predicting link failures or network gaps in a route. The helping vehicle at a junction produces a score for every neighboring junction for forwarding the data towards the destination, taking the current traffic information into account, and selects the junction with the minimum score. The IJS protocol is implemented and compared with the GyTAR, A-STAR, and GSR routing protocols. Simulation results show that IJS performs better in terms of average end-to-end delay, network gap encounters, and number of hops. PMID:27433485

  8. Extraction Selectivity of a Quaternary Alkylammonium Salt for Trivalent Actinides over Trivalent Lanthanides: Does Extractant Aggregation Play a Role?

    DOE PAGES

    Knight, Andrew W.; Chiarizia, Renato; Soderholm, L.

    2017-05-10

    In this paper, the extraction behavior of a quaternary alkylammonium salt extractant was investigated for its selectivity for trivalent actinides over trivalent lanthanides in nitrate and thiocyanate media. The selectivity was evaluated by solvent extraction experiments through radiochemical analysis of 241Am and 152/154Eu. Solvent extraction distribution and slope-analysis experiments were performed with americium(III) and europium(III) with respect to the ligand (nitrate and thiocyanate), extractant, and metal (europium only) concentrations. Further evaluation of the equilibrium expression that governs the extraction process indicated the appropriate use of the saturation method for estimation of the aggregation state of quaternary ammonium extractants in the organic phase. From the saturation method, we observed an average aggregation number of 5.4 ± 0.8 and 8.5 ± 0.9 monomers/aggregate for nitrate and thiocyanate, respectively. Through a side-by-side comparison of the nitrate and thiocyanate forms, we discuss the potential role of the aggregation in the increased selectivity for trivalent actinides over trivalent lanthanides in thiocyanate media.

  9. Extraction Selectivity of a Quaternary Alkylammonium Salt for Trivalent Actinides over Trivalent Lanthanides: Does Extractant Aggregation Play a Role?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Andrew W.; Chiarizia, Renato; Soderholm, L.

    In this paper, the extraction behavior of a quaternary alkylammonium salt extractant was investigated for its selectivity for trivalent actinides over trivalent lanthanides in nitrate and thiocyanate media. The selectivity was evaluated by solvent extraction experiments through radiochemical analysis of 241Am and 152/154Eu. Solvent extraction distribution and slope-analysis experiments were performed with americium(III) and europium(III) with respect to the ligand (nitrate and thiocyanate), extractant, and metal (europium only) concentrations. Further evaluation of the equilibrium expression that governs the extraction process indicated the appropriate use of the saturation method for estimation of the aggregation state of quaternary ammonium extractants in the organic phase. From the saturation method, we observed an average aggregation number of 5.4 ± 0.8 and 8.5 ± 0.9 monomers/aggregate for nitrate and thiocyanate, respectively. Through a side-by-side comparison of the nitrate and thiocyanate forms, we discuss the potential role of the aggregation in the increased selectivity for trivalent actinides over trivalent lanthanides in thiocyanate media.

  10. Selecting a proper design period for heliostat field layout optimization using Campo code

    NASA Astrophysics Data System (ADS)

    Saghafifar, Mohammad; Gadalla, Mohamed

    2016-09-01

    In this paper, different approaches are considered for calculating the cosine factor that is utilized in the Campo code to expand the heliostat field layout and maximize its annual thermal output. Furthermore, three heliostat fields containing different numbers of mirrors are taken into consideration. The cosine factor is determined using instantaneous and time-averaged approaches. For the instantaneous method, different design days and design hours are selected. For the time-averaged method, daily, monthly, seasonal, and yearly time-averaged cosine factor determinations are considered. Results indicate that instantaneous methods are more appropriate for small-scale heliostat field optimization. Consequently, it is proposed to treat the design period as a second design variable to ensure the best outcome. For medium- and large-scale heliostat fields, selecting an appropriate design period is more important, so it is more reliable to select one of the recommended time-averaged methods to optimize the field layout. The optimum annual weighted efficiencies for the small, medium, and large heliostat fields containing 350, 1460, and 3450 mirrors are 66.14%, 60.87%, and 54.04%, respectively.

  11. Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment.

    PubMed

    Onüt, Semih; Soner, Selin

    2008-01-01

    Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
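
    To show the mechanics, here is a crisp TOPSIS sketch with criteria weights assumed to come from an AHP step; the fuzzy (linguistic-term) ratings used in the paper are not reproduced, and the decision matrix is purely illustrative.

      # Crisp TOPSIS ranking of candidate sites with AHP-style weights (illustrative sketch).
      import numpy as np

      # Rows: candidate sites, columns: criteria (e.g. cost, distance, capacity).
      decision = np.array([[7., 9., 6.],
                           [8., 7., 8.],
                           [9., 6., 8.],
                           [6., 7., 9.]])
      weights = np.array([0.5, 0.3, 0.2])                    # e.g. AHP-derived weights
      benefit = np.array([False, True, True])                # the cost criterion is minimized

      norm = decision / np.linalg.norm(decision, axis=0)     # vector normalization
      weighted = norm * weights

      ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
      anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

      d_plus = np.linalg.norm(weighted - ideal, axis=1)
      d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
      closeness = d_minus / (d_plus + d_minus)               # higher is better

      print("ranking (best first):", np.argsort(-closeness))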

  12. Hypergraph Based Feature Selection Technique for Medical Diagnosis.

    PubMed

    Somu, Nivethitha; Raman, M R Gauthama; Kirthivasan, Kannan; Sriram, V S Shankar

    2016-11-01

    The impact of the internet and information systems across various domains has resulted in the substantial generation of multidimensional datasets. The use of data mining and knowledge discovery techniques to extract the original information contained in multidimensional datasets plays a significant role in exploiting the complete benefit they provide. The presence of a large number of features in high-dimensional datasets incurs a high computational cost in terms of computing power and time. Hence, feature selection techniques are commonly used to build robust machine learning models by selecting a subset of relevant features that projects the maximal information content of the original dataset. In this paper, a novel Rough Set based K-Helly feature selection technique (RSKHT), which hybridizes Rough Set Theory (RST) and the K-Helly property of hypergraph representation, is designed to identify the optimal feature subset or reduct for medical diagnostic applications. Experiments carried out using medical datasets from the UCI repository demonstrate the dominance of RSKHT over other feature selection techniques with respect to reduct size, classification accuracy and time complexity. The performance of RSKHT was validated using the WEKA tool, showing that RSKHT is computationally attractive and flexible over massive datasets.

  13. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    NASA Astrophysics Data System (ADS)

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2016-06-01

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of the populations of chemical species. One approach to accelerate the simulation is to allow multiple reaction firings before performing updates, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach. The problem is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firings. Each reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if this probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating the propensities of reactions.
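
    A simplified, thinning-style sketch of rejection-based selection of the next reaction is given below: candidates are drawn from propensity upper bounds and accepted with probability (true propensity)/(bound). The toy network, the 20% bounds, and the refresh rule are assumptions, not the algorithm exactly as specified in the paper.

      # Thinning-style rejection sampling of the next reaction from propensity upper bounds (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(0)
      x = np.array([100, 0])                                 # species counts: A, B
      stoich = np.array([[-1, +1],                           # reaction 1: A -> B
                         [+1, -1]])                          # reaction 2: B -> A
      rates = np.array([1.0, 0.5])

      def propensities(x):
          return rates * x                                   # mass-action for unimolecular reactions

      t, t_end = 0.0, 1.0
      bounds = propensities(x) * 1.2 + 1e-9                  # crude 20% upper bounds
      while t < t_end:
          a = propensities(x)
          if np.any(a > bounds):                             # state drifted: refresh the bounds
              bounds = a * 1.2 + 1e-9
          total_bound = bounds.sum()
          t += rng.exponential(1.0 / total_bound)            # events generated at the bound rate
          j = rng.choice(len(rates), p=bounds / total_bound) # candidate reaction
          if rng.random() < a[j] / bounds[j]:                # accept with probability a_j / bound_j
              x = x + stoich[j]                              # fire the accepted reaction
      print("final state:", x, "at t >=", round(t, 3))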

  14. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento, Trento

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of the populations of chemical species. One approach to accelerate the simulation is to allow multiple reaction firings before performing updates, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach. The problem is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firings. Each reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if this probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating the propensities of reactions.

  15. Effects of channel tap spacing on delay-lock tracking

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.; Milner, Brian R.; Bogusch, Robert L.

    1995-12-01

    High fidelity simulations of communication links operating through frequency selective fading channels require both accurate channel models and faithful reproduction of the received signal. In modern radio receivers, processing beyond the analog-to-digital converter (A/D) is done digitally, so a high fidelity simulation is actually an emulation of this digital signal processing. The 'simulation' occurs in constructing the output of the A/D. One approach to constructing the A/D output is to convolve the channel impulse response function with the combined impulse response of the transmitted modulation and the A/D. For both link simulations and hardware channel simulators, the channel impulse response function is then generated with a finite number of samples per chip, and the convolution is implemented in a tapped delay line. In this paper we discuss the effects of the channel model tap spacing on the performance of delay locked loops (DLLs) in both direct sequence and frequency hopped spread spectrum systems. A frequency selective fading channel is considered, and the channel impulse response function is constructed with an integer number of taps per modulation symbol or chip. The tracking loop time delay is computed theoretically for this tapped delay line channel model and is compared to the results of high fidelity simulations of actual DLLs. A surprising result is obtained. The performance of the DLL depends strongly on the number of taps per chip. As this number increases the DLL delay approaches the theoretical limit.

  16. Academic Primer Series: Five Key Papers about Study Designs in Medical Education.

    PubMed

    Gottlieb, Michael; Chan, Teresa M; Fredette, Jenna; Messman, Anne; Robinson, Daniel W; Cooney, Robert; Boysen-Osborn, Megan; Sherbino, Jonathan

    2017-06-01

    A proper understanding of study design is essential to creating successful studies. This is also important when reading or peer reviewing publications. In this article, we aimed to identify and summarize key papers that would be helpful for faculty members interested in learning more about study design in medical education research. The online discussions of the 2016-2017 Academic Life in Emergency Medicine Faculty Incubator program included a robust and vigorous discussion about education study design, which highlighted a number of papers on that topic. We augmented this list of papers with further suggestions by expert mentors. Via this process, we created a list of 29 papers in total on the topic of medical education study design. After gathering these papers, our authorship group engaged in a modified Delphi approach to build consensus on the papers that were most valuable for the understanding of proper study design in medical education. We selected the top five most highly rated papers on the topic domain of study design as determined by our study group. We subsequently summarized these papers with respect to their relevance to junior faculty members and to faculty developers. This article summarizes five key papers addressing study design in medical education with discussions and applications for junior faculty members and faculty developers. These papers provide a basis upon which junior faculty members might build for developing and analyzing studies.

  17. Mode Selection Techniques in Variable Mass Flexible Body Modeling

    NASA Technical Reports Server (NTRS)

    Quiocho, Leslie J.; Ghosh, Tushar K.; Frenkel, David; Huynh, An

    2010-01-01

    In developing a flexible-body spacecraft simulation for the Launch Abort System of the Orion vehicle, in which rapid mass depletion takes place, the dynamics problem of time-varying eigenmodes had to be addressed. Three different techniques were implemented, with different trade-offs made between performance and fidelity. A number of technical issues had to be solved in the process. This paper covers the background of the variable mass flexibility problem, the three approaches to simulating it, and the technical issues that were solved in formulating and implementing them.

  18. Robert E. Slaughter Research Award Studies 1975. Research Report. Number 3. Effectiveness of Model Office, Cooperative Office Education, and Office Procedures Courses Based on Employee Satisfaction and Satisfactoriness Eighteen Months after Graduation. [AND] A Study of the Content in Selected Textbooks for the Commonly Offered Basic Business Courses in Secondary Schools.

    ERIC Educational Resources Information Center

    McLean, Gary N.; Jones, L. Eugene

    The two studies which received the 1975 Robert E. Slaughter Research Award in Business and Office Education are summarized in the document. The first paper, entitled "Effectiveness of Model Office, Cooperative Office Education, and Office Procedures Courses Based on Employee Satisfaction and Satisfactoriness Eighteen Months After…

  19. Selected technology for the gas industry

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A number of papers were presented at a conference concerned with the application of technical topics from aerospace activities for the gas industry. The following subjects were covered: general future of fossil fuels in America, exploration for fossil and nuclear fuels from orbital altitudes, technology for liquefied gas, safety considerations relative to fires, explosions, and detonations, gas turbomachinery technology, fluid properties, fluid flow, and heat transfer, NASA information and documentation systems, instrumentation and measurement, materials and life prediction, reliability and quality assurance, and advanced energy systems (including synthetic fuels, energy storage, solar energy, and wind energy).

  20. Recent advances in catchment hydrology

    NASA Astrophysics Data System (ADS)

    van Meerveld, I. H. J.

    2017-12-01

    Despite the consensus that field observations and catchment studies are imperative to understand hydrological processes, to determine the impacts of global change, to quantify the spatial and temporal variability in hydrological fluxes, and to refine and test hydrological models, there is a decline in the number of field studies. This decline and the importance of fieldwork for catchment hydrology have been described in several recent opinion papers. This presentation will summarize these commentaries, describe how catchment studies have evolved over time, and highlight the findings from selected recent studies published in Water Resources Research.
