Science.gov

Sample records for based empirical survey

  1. Risk-adjusted capitation based on the Diagnostic Cost Group Model: an empirical evaluation with health survey information.

    PubMed Central

    Lamers, L M

    1999-01-01

    OBJECTIVE: To evaluate the predictive accuracy of the Diagnostic Cost Group (DCG) model using health survey information. DATA SOURCES/STUDY SETTING: Longitudinal data collected for a sample of members of a Dutch sickness fund. In the Netherlands the sickness funds provide compulsory health insurance coverage for the 60 percent of the population in the lowest income brackets. STUDY DESIGN: A demographic model and DCG capitation models are estimated by means of ordinary least squares, with an individual's annual healthcare expenditures in 1994 as the dependent variable. For subgroups based on health survey information, costs predicted by the models are compared with actual costs. Using stepwise regression procedures a subset of relevant survey variables that could improve the predictive accuracy of the three-year DCG model was identified. Capitation models were extended with these variables. DATA COLLECTION/EXTRACTION METHODS: For the empirical analysis, panel data of sickness fund members were used that contained demographic information, annual healthcare expenditures, and diagnostic information from hospitalizations for each member. In 1993, a mailed health survey was conducted among a random sample of 15,000 persons in the panel data set, with a 70 percent response rate. PRINCIPAL FINDINGS: The predictive accuracy of the demographic model improves when it is extended with diagnostic information from prior hospitalizations (DCGs). A subset of survey variables further improves the predictive accuracy of the DCG capitation models. The predictable profits and losses based on survey information for the DCG models are smaller than for the demographic model. Most persons with predictable losses based on health survey information were not hospitalized in the preceding year. CONCLUSIONS: The use of diagnostic information from prior hospitalizations is a promising option for improving the demographic capitation payment formula. This study suggests that diagnostic information from outpatient utilization is complementary to DCGs in predicting future costs.

  2. Risk-adjusted capitation based on the Diagnostic Cost Group Model: an empirical evaluation with health survey information.

    PubMed

    Lamers, L M

    1999-02-01

    To evaluate the predictive accuracy of the Diagnostic Cost Group (DCG) model using health survey information. Longitudinal data collected for a sample of members of a Dutch sickness fund. In the Netherlands the sickness funds provide compulsory health insurance coverage for the 60 percent of the population in the lowest income brackets. A demographic model and DCG capitation models are estimated by means of ordinary least squares, with an individual's annual healthcare expenditures in 1994 as the dependent variable. For subgroups based on health survey information, costs predicted by the models are compared with actual costs. Using stepwise regression procedures a subset of relevant survey variables that could improve the predictive accuracy of the three-year DCG model was identified. Capitation models were extended with these variables. For the empirical analysis, panel data of sickness fund members were used that contained demographic information, annual healthcare expenditures, and diagnostic information from hospitalizations for each member. In 1993, a mailed health survey was conducted among a random sample of 15,000 persons in the panel data set, with a 70 percent response rate. The predictive accuracy of the demographic model improves when it is extended with diagnostic information from prior hospitalizations (DCGs). A subset of survey variables further improves the predictive accuracy of the DCG capitation models. The predictable profits and losses based on survey information for the DCG models are smaller than for the demographic model. Most persons with predictable losses based on health survey information were not hospitalized in the preceding year. The use of diagnostic information from prior hospitalizations is a promising option for improving the demographic capitation payment formula. This study suggests that diagnostic information from outpatient utilization is complementary to DCGs in predicting future costs.
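The study design above fits capitation models by ordinary least squares and compares their predictive accuracy when diagnostic (DCG) information is added to a purely demographic model. A minimal sketch of that comparison, on simulated data (the variable names `age`, `female`, `dcg` and all cost figures are illustrative inventions, not the study's actual coding):

```python
# Sketch: demographic vs. DCG-extended capitation model, fit by OLS.
# All data are simulated; coefficients and variable coding are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(18, 90, n).astype(float)
female = rng.integers(0, 2, n).astype(float)
dcg = rng.integers(0, 2, n).astype(float)  # 1 = costly prior hospitalization
# Simulated annual cost: demographic trend plus a large DCG effect plus noise
cost = 200 + 15 * age + 100 * female + 2500 * dcg + rng.normal(0, 800, n)

def ols_r2(X, y):
    """Fit OLS with an intercept and return predictions and in-sample R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    pred = X1 @ beta
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return pred, 1 - ss_res / ss_tot

_, r2_demo = ols_r2(np.column_stack([age, female]), cost)
_, r2_dcg = ols_r2(np.column_stack([age, female, dcg]), cost)
print(f"R^2 demographic: {r2_demo:.3f}, with DCG: {r2_dcg:.3f}")
```

On data like these, adding the DCG indicator raises R^2 markedly, mirroring the paper's finding that prior-hospitalization diagnoses improve on the demographic formula.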

  3. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  5. Defining Empirically Based Practice.

    ERIC Educational Resources Information Center

    Siegel, Deborah H.

    1984-01-01

    Provides a definition of empirically based practice, both conceptually and operationally. Describes a study of how research and practice were integrated in the graduate social work program at the School of Social Service Administration, University of Chicago. (JAC)

  7. The ALHAMBRA survey: An empirical estimation of the cosmic variance for merger fraction studies based on close pairs

    NASA Astrophysics Data System (ADS)

    López-Sanjuan, C.; Cenarro, A. J.; Hernández-Monteagudo, C.; Varela, J.; Molino, A.; Arnalte-Mur, P.; Ascaso, B.; Castander, F. J.; Fernández-Soto, A.; Huertas-Company, M.; Márquez, I.; Martínez, V. J.; Masegosa, J.; Moles, M.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Perea, J.; Prada, F.; Quintana, J. M.

    2014-04-01

    Aims: Our goal is to estimate empirically, for the first time, the cosmic variance that affects merger fraction studies based on close pairs. Methods: We compute the merger fraction from photometric redshift close pairs with 10 h^-1 kpc ≤ r_p ≤ 50 h^-1 kpc and Δv ≤ 500 km s^-1 and measure it in the 48 sub-fields of the ALHAMBRA survey. We study the distribution of the measured merger fractions, which follows a log-normal function, and estimate the cosmic variance σ_v as the intrinsic dispersion of the observed distribution. We develop a maximum likelihood estimator to measure a reliable σ_v and avoid the dispersion due to the observational errors (including the Poisson shot noise term). Results: The cosmic variance σ_v of the merger fraction depends mainly on (i) the number density of the populations under study, for both the principal (n_1) and the companion (n_2) galaxy in the close pair, and (ii) the probed cosmic volume V_c. We do not find a significant dependence on either the search radius used to define close companions, the redshift, or the physical selection (luminosity or stellar mass) of the samples. Conclusions: We have estimated from observations the cosmic variance that affects the measurement of the merger fraction by close pairs. We provide a parametrisation of the cosmic variance with n_1, n_2, and V_c: σ_v ∝ n_1^-0.54 V_c^-0.48 (n_2/n_1)^-0.37. Thanks to this prescription, future merger fraction studies based on close pairs can properly account for the cosmic variance in their results. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (IAA-CSIC). Appendix is available in electronic form at http://www.aanda.org
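The quoted parametrisation, σ_v ∝ n_1^-0.54 V_c^-0.48 (n_2/n_1)^-0.37, is a pure scaling law; the abstract gives no normalization constant, so the prefactor `A` below is a placeholder. A small sketch of how the prescription would be applied:

```python
# Sketch of the ALHAMBRA cosmic-variance scaling law. Only the exponents
# come from the abstract; the normalization A is an assumed placeholder.
def sigma_v(n1, n2, vc, A=1.0):
    """Cosmic variance of the merger fraction (arbitrary normalization).

    n1, n2: number densities of principal and companion populations
    vc:     probed cosmic volume
    """
    return A * n1 ** -0.54 * vc ** -0.48 * (n2 / n1) ** -0.37

# Doubling the probed volume lowers sigma_v by a factor 2^-0.48 (~0.72),
# independent of the densities and of the unknown normalization.
ratio = sigma_v(1e-3, 1e-3, 2.0) / sigma_v(1e-3, 1e-3, 1.0)
```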

  8. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  10. An empirical Bayes approach to analyzing recurring animal surveys

    USGS Publications Warehouse

    Johnson, D.H.

    1989-01-01

    Recurring estimates of the size of animal populations are often required by biologists or wildlife managers. Because of cost or other constraints, estimates frequently lack the accuracy desired but cannot readily be improved by additional sampling. This report proposes a statistical method employing empirical Bayes (EB) estimators as alternatives to those customarily used to estimate population size, and evaluates them by a subsampling experiment on waterfowl surveys. EB estimates, especially a simple limited-translation version, were more accurate and provided shorter confidence intervals with greater coverage probabilities than customary estimates.
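The report above advocates empirical Bayes (EB) estimators, including a "limited-translation" version that caps how far any single survey estimate is shrunk. A minimal sketch of that idea for a normal-means setting (the shrinkage formula is the standard EB form; the cap of one standard error and the toy counts are illustrative assumptions, not the paper's values):

```python
# Sketch: empirical Bayes shrinkage toward the ensemble mean, with a
# limited-translation cap on how far any estimate may move.
import numpy as np

def eb_limited_translation(y, sigma2, max_shift=1.0):
    """Shrink each estimate toward the grand mean; limit the shift of any
    single estimate to max_shift standard errors (limited translation)."""
    y = np.asarray(y, float)
    grand = y.mean()
    # Method-of-moments estimate of the between-unit variance
    tau2 = max(y.var(ddof=1) - sigma2, 0.0)
    shrink = sigma2 / (sigma2 + tau2)
    eb = y + shrink * (grand - y)
    cap = max_shift * np.sqrt(sigma2)          # cap in original units
    return y + np.clip(eb - y, -cap, cap)

counts = [120.0, 95.0, 300.0, 110.0]  # hypothetical recurring survey estimates
adjusted = eb_limited_translation(counts, sigma2=400.0)
```

Limiting the translation protects genuinely unusual units (such as the large third estimate) from being over-shrunk toward the ensemble mean.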

  11. Developing Empirically Based Models of Practice.

    ERIC Educational Resources Information Center

    Blythe, Betty J.; Briar, Scott

    1985-01-01

    Over the last decade emphasis has shifted from theoretically based models of practice to empirically based models whose elements are derived from clinical research. These models are defined and a developing model of practice through the use of single-case methodology is examined. Potential impediments to this new role are identified. (Author/BL)

  12. First Results From The Empire Nearby Galaxy Dense Gas Survey

    NASA Astrophysics Data System (ADS)

    Bigiel, Frank

    2016-09-01

    I will present first results from our EMPIRE survey, a large program (~500 hr) at the IRAM 30m telescope to map high critical density gas and shock tracers (e.g., HCN, HCO+, HNC, N2H+) as well as the optically thin 1-0 lines of 13CO and C18O for the first time systematically across 9 prominent, nearby disk galaxies. "How is star formation regulated across disk galaxies?" is the central question framing our science. Specifically, and building on a large suite of available ancillary data from the radio to the UV, we study, among other things, dense gas fractions and star formation efficiencies and how they vary with environment within and among nearby disk galaxies. Of particular interest is how our measurements compare to studies in the Milky Way, which predict a fairly constant star formation efficiency of the dense gas. Already in our first case study, focusing on the prominent nearby spiral galaxy M51, we find significant variations of this quantity across the disk. In my talk, I will present results from a first series of studies, about to be submitted, addressing these questions with our EMPIRE and complementary high-resolution ALMA data. In addition, I will present details of the survey and report on ongoing projects and future directions. I will place our work in context with other work, including studies of dense gas tracers in other galaxies and in particular the Milky Way.

  13. An Analysis of Strata Differences in Higher Education Opportunities (1982-2010)--Based on an Empirical Survey of 16 Chinese Universities

    ERIC Educational Resources Information Center

    Weiyi, Wang

    2015-01-01

    Empirical study results show that in the past 30 years, after slightly expanding in the early 1990s, the differences in overall higher education opportunities among the children of all strata of China have continually shrunk. Regarding different types of higher education opportunities, the differences in access to key universities first expanded,…

  14. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  16. Physicians' attitudes towards health telematics--an empirical survey.

    PubMed

    Langer, B; Wetter, T

    2000-01-01

    Telemedical networks and services have received high attention in professional and scientific media in the recent past. In Germany, some institutions and a few physicians volunteer to experiment with diverse telemedical service offerings. However, much is speculated but little is known about the attitudes and expectations of the majority of physicians in local offices towards this new medium. We therefore conducted an empirical survey using a random regional sample to poll the respective opinions. Encouraged by a high response rate to our paper questionnaire, we offer as conclusions: that physicians are surprisingly realistic about costs and benefits and can therefore be expected to subscribe as soon as benefits become obvious; that this trend increases as offices are taken over or newly established by younger physicians; and that the establishment of networks of comprehensive care offered by health care professionals from different disciplines is regarded as an essential future advantage of telemedical networks.

  17. Empirical Study of Family Background and Higher Education: Relationship to Acceptance Opportunities and Trends--Based on Surveys at a Key Beijing University from 2007 to 2012

    ERIC Educational Resources Information Center

    Silin, Huang; Ziqiang, Xin; Jiawei, Hou

    2015-01-01

    Which family had a child that was accepted at a key university? To investigate the relationship between family background and children obtaining higher education opportunities and developing trends, the authors analyze survey data from 2007 to 2012 at a key university in Beijing. The results show there is a clear trend of enlargement of the…

  18. Differential Weighting: A Survey of Methods and Empirical Studies.

    ERIC Educational Resources Information Center

    Stanley, Julian C.; Wang, Marilyn D.

    The literature on a priori and empirical weighting of test items and test-item options is reviewed. While multiple regression is the best known technique for deriving fixed empirical weights for component variables (such as tests and test items), other methods allow one to derive weights which equalize the effective weights of the component…

  19. Issues and Controversies that Surround Recent Texts on Empirically Supported and Empirically Based Treatments

    ERIC Educational Resources Information Center

    Paul, Howard A.

    2004-01-01

    Since the 1993 APA task force of the Society of Clinical Psychology developed guidelines to apply data-based psychology to the identification of effective psychotherapy, there has been an increasing number of texts focussing on Empirically based Psychotherapy and Empirically Supported Treatments. This manuscript examines recent key texts and…

  20. Prosocial Motivation and Blood Donations: A Survey of the Empirical Literature.

    PubMed

    Goette, Lorenz; Stutzer, Alois; Frey, Beat M

    2010-06-01

    Recent shortages in the supply of blood donations have renewed the interest in how blood donations can be increased temporarily. We survey the evidence on the role of financial and other incentives in eliciting blood donations among donors who are normally willing to donate pro bono. We present the predictions from different empirical/psychological-based theories, with some predicting that incentives are effective while others predict that incentives may undermine prosocial motivation. The evidence suggests that incentives work relatively well in settings in which donors are relatively anonymous, but evidence indicates also that when image concerns become important, incentives may be counterproductive as donors do not want to be seen as greedy.

  1. What should we mean by empirical validation in hypnotherapy: evidence-based practice in clinical hypnosis.

    PubMed

    Alladin, Assen; Sabatini, Linda; Amundson, Jon K

    2007-04-01

    This paper briefly surveys the trend of and controversy surrounding empirical validation in psychotherapy. Empirical validation of hypnotherapy has paralleled the practice of validation in psychotherapy and the professionalization of clinical psychology, in general. This evolution in determining what counts as evidence for bona fide clinical practice has gone from theory-driven clinical approaches in the 1960s and 1970s through critical attempts at categorization of empirically supported therapies in the 1990s on to the concept of evidence-based practice in 2006. Implications of this progression in professional psychology are discussed in the light of hypnosis's current quest for validation and empirical accreditation.

  2. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.

  3. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
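The two records above mention developing watershed-based designs "with existing probabilistic survey methods, including the use of unequal probability weighting". One simple instance of unequal probability weighting is drawing watershed polygons with inclusion probability proportional to size. A sketch under that assumption (the IDs and areas are made up, and real designs such as the spatially balanced GRTS method used in federal stream surveys are considerably more involved):

```python
# Sketch: size-proportional (unequal probability) selection of watershed
# polygons. Areas and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
areas = np.array([12.0, 48.0, 5.0, 30.0, 90.0, 15.0])  # km^2 per watershed
p = areas / areas.sum()                   # draw probability proportional to area
sample = rng.choice(len(areas), size=3, replace=False, p=p)  # watershed indices
```

Larger watersheds are drawn more often; the resulting unequal inclusion probabilities must then be carried into the estimation stage as survey weights.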

  4. Enhanced FMAM based on empirical kernel map.

    PubMed

    Wang, Min; Chen, Songcan

    2005-05-01

    The existing morphological auto-associative memory models based on the morphological operations, typically including morphological auto-associative memories (auto-MAM) proposed by Ritter et al. and our fuzzy morphological auto-associative memories (auto-FMAM), have many attractive advantages such as unlimited storage capacity, one-shot recall speed and good noise-tolerance to single erosive or dilative noise. However, they suffer from the extreme vulnerability to noise of mixing erosion and dilation, resulting in great degradation on recall performance. To overcome this shortcoming, we focus on FMAM and propose an enhanced FMAM (EFMAM) based on the empirical kernel map. Although it is simple, EFMAM can significantly improve the auto-FMAM with respect to the recognition accuracy under hybrid-noise and computational effort. Experiments conducted on the thumbnail-sized faces (28 x 23 and 14 x 11) scaled from the ORL database show the average accuracies of 92%, 90%, and 88% with 40 classes under 10%, 20%, and 30% randomly generated hybrid-noises, respectively, which are far higher than the auto-FMAM (67%, 46%, 31%) under the same noise levels.
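The EFMAM model above is built on an empirical kernel map: each input pattern is represented by its kernel evaluations against a fixed reference set. A minimal sketch of that mapping alone (the Gaussian kernel, the `gamma` value, and the toy points are illustrative choices; the memory model itself is not reproduced here):

```python
# Sketch: empirical kernel map x -> (k(x, r_1), ..., k(x, r_m)) with a
# Gaussian kernel. Kernel choice and data are assumptions for illustration.
import numpy as np

def empirical_kernel_map(X, refs, gamma=0.5):
    """Map each row of X to its kernel evaluations against the reference set."""
    # Squared Euclidean distances between every pattern and every reference
    d2 = ((X[:, None, :] - refs[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

refs = np.array([[0.0, 0.0], [1.0, 1.0]])
X = np.array([[0.0, 0.0], [0.5, 0.5]])
Phi = empirical_kernel_map(X, refs)  # shape (2, 2); Phi[0, 0] = k(x_0, r_0) = 1
```

The mapped patterns `Phi` then feed the downstream associative memory, which is how the kernel map can confer robustness to mixed erosive/dilative noise.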

  5. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  7. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  9. Emotional Risks to Respondents in Survey Research: Some Empirical Evidence

    PubMed Central

    Labott, Susan M.; Johnson, Timothy P.; Fendrich, Michael; Feeny, Norah C.

    2014-01-01

    Some survey research has documented distress in respondents with pre-existing emotional vulnerabilities, suggesting the possibility of harm. In this study, respondents were interviewed about a personally distressing event; mood, stress, and emotional reactions were assessed. Two days later, respondents participated in interventions to either enhance or alleviate the effects of the initial interview. Results indicated that distressing interviews increased stress and negative mood, although no adverse events occurred. Between the interviews, moods returned to baseline. Respondents who again discussed a distressing event reported moods more negative than those who discussed a neutral or a positive event. This study provides evidence that, among nonvulnerable survey respondents, interviews on distressing topics can result in negative moods and stress, but they do not harm respondents. PMID:24169422

  10. Prosocial Motivation and Blood Donations: A Survey of the Empirical Literature

    PubMed Central

    Goette, Lorenz; Stutzer, Alois; Frey, Beat M.

    2010-01-01

    Summary Recent shortages in the supply of blood donations have renewed the interest in how blood donations can be increased temporarily. We survey the evidence on the role of financial and other incentives in eliciting blood donations among donors who are normally willing to donate pro bono. We present the predictions from different empirical/psychological-based theories, with some predicting that incentives are effective while others predict that incentives may undermine prosocial motivation. The evidence suggests that incentives work relatively well in settings in which donors are relatively anonymous, but evidence indicates also that when image concerns become important, incentives may be counterproductive as donors do not want to be seen as greedy. PMID:20737018

  11. A survey on hematology-oncology pediatric AIEOP centers: prophylaxis, empirical therapy and nursing prevention procedures of infectious complications.

    PubMed

    Livadiotti, Susanna; Milano, Giuseppe Maria; Serra, Annalisa; Folgori, Laura; Jenkner, Alessandro; Castagnola, Elio; Cesaro, Simone; Rossi, Mario R; Barone, Angelica; Zanazzo, Giulio; Nesi, Francesca; Licciardello, Maria; De Santis, Raffaella; Ziino, Ottavio; Cellini, Monica; Porta, Fulvio; Caselli, Desiree; Pontrelli, Giuseppe

    2012-01-01

    A nationwide questionnaire-based survey was designed to evaluate the management and prophylaxis of febrile neutropenia in pediatric patients admitted to hematology-oncology and hematopoietic stem cell transplant units. Of the 34 participating centers, 40 and 63%, respectively, continue to prescribe antibacterial and antimycotic prophylaxis in low-risk subjects, and 78 and 94% in transplant patients. Approximately half of the centers prescribe a combination antibiotic regimen as first-line therapy in low-risk patients, and up to 81% in high-risk patients. When initial empirical therapy fails after seven days, 63% of the centers add empirical antimycotic therapy in low- and 81% in high-risk patients. Overall management varies significantly across centers. Preventive nursing procedures are in accordance with international guidelines. This survey is the first to focus on prescribing practices in children with cancer and could help to implement practice guidelines.

  12. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    BARMI, HAMMOU EL; MCKEAGUE, IAN W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
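The hypothesis being tested above is stochastic ordering: one distribution's CDF lies everywhere at or above the other's. The paper's integrated empirical likelihood statistic is too involved to reproduce here, so the sketch below illustrates only the hypothesis itself with a plain one-sided ECDF comparison (a Kolmogorov-Smirnov-style gap), explicitly not the paper's test; the exponential "reign length" data are invented:

```python
# Sketch: a crude one-sided ECDF check for stochastic ordering. This is a
# stand-in illustration, NOT the empirical likelihood test of the paper.
import numpy as np

def one_sided_ecdf_gap(x, y):
    """max_t (F_y(t) - F_x(t)): near zero if x is stochastically larger than y."""
    grid = np.sort(np.concatenate([x, y]))
    fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return float(np.max(fy - fx))

rng = np.random.default_rng(1)
short = rng.exponential(2.0, 200)   # hypothetical reign lengths, one period
long_ = rng.exponential(4.0, 200)   # stochastically longer reigns
gap = one_sided_ecdf_gap(long_, short)   # sizeable: 'short' is not >= 'long_'
```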

  13. Web-Based Surveys: Not Your Basic Survey Anymore

    ERIC Educational Resources Information Center

    Bertot, John Carlo

    2009-01-01

    Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…

  14. Empirically Based Comprehensive Treatment Program for Parasuicide.

    ERIC Educational Resources Information Center

    Clum, George A.; And Others

    1979-01-01

    Suggests secondary parasuicide prevention is the most viable path for future research. Aggressive case findings and primary prevention approaches have failed to reduce suicide attempt rates. A secondary prevention model, based on factors predictive of parasuicide, was developed. Stress reduction and cognitive restructuring were primary goals of…

  15. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings.

  16. Image-Based Empirical Modeling of the Plasmasphere

    NASA Technical Reports Server (NTRS)

    Adrian, Mark L.; Gallagher, D. L.

    2008-01-01

    A new suite of empirical models of plasmaspheric plasma based on remote, global images from the IMAGE EUV instrument is proposed for development. The purpose of these empirical models is to establish the statistical properties of the plasmasphere as a function of conditions. This suite of models will mark the first time the plasmaspheric plume is included in an empirical model. Development of these empirical plasmaspheric models will support synoptic studies (such as for wave propagation and growth, energetic particle loss through collisions and dust transport as influenced by charging) and serves as a benchmark against which physical models can be tested. The ability to know that a specific global density distribution occurs in response to specific magnetospheric and solar wind factors is a huge advantage over all previous in-situ based empirical models. The consequence of creating these new plasmaspheric models will be to provide much higher fidelity and much richer quantitative descriptions of the statistical properties of plasmaspheric plasma in the inner magnetosphere, whether that plasma is in the main body of the plasmasphere, nearby during recovery or in the plasmaspheric plume. Model products to be presented include statistical probabilities for being in the plasmasphere, near thermal He+ density boundaries and the complexity of its spatial structure.

  18. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    ERIC Educational Resources Information Center

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  19. Empirical Likelihood-Based Confidence Interval of ROC Curves.

    PubMed

    Su, Haiyan; Qin, Yongsong; Liang, Hua

    2009-11-01

    In this article we propose an empirical likelihood-based confidence interval for receiver operating characteristic curves which are based on a continuous-scale test. The approach is easily understood, simply implemented, and computationally efficient. The results from our simulation studies indicate that the finite-sample numerical performance slightly outperforms the most promising methods published recently. Two real datasets are analyzed by using the proposed method and the existing bootstrap-based method.

  20. Agent-Based Models in Empirical Social Research

    ERIC Educational Resources Information Center

    Bruch, Elizabeth; Atwell, Jon

    2015-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first…

  1. Responses to Commentaries on Advances in Empirically Based Assessment.

    ERIC Educational Resources Information Center

    McConaughy, Stephanie H.

    1993-01-01

    Author of article (this issue) describing research program to advance assessment of children's behavioral and emotional problems; presenting conceptual framework for multiaxial empirically based assessment; and summarizing research efforts to develop cross-informant scales for scoring parent, teacher, and self-reports responds to commentaries on…

  2. Empirical Data Sets for Agent Based Modeling of Crowd Scenarios

    DTIC Science & Technology

    2009-08-06

UNCLASSIFIED - Approved for Public Release. Crowd research challenges: large numbers; heterogeneous, individual actors; interdependence; language barriers; empirical testing is difficult. Simulations require models based on real data; otherwise they are fiction.

  3. Survey-Based Measurement of Public Management and Policy Networks

    ERIC Educational Resources Information Center

    Henry, Adam Douglas; Lubell, Mark; McCoy, Michael

    2012-01-01

    Networks have become a central concept in the policy and public management literature; however, theoretical development is hindered by a lack of attention to the empirical properties of network measurement methods. This paper compares three survey-based methods for measuring organizational networks: the roster, the free-recall name generator, and…

  5. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  7. Empirical analysis on a keyword-based semantic system

    NASA Astrophysics Data System (ADS)

    Zhang, Zi-Ke; Lü, Linyuan; Liu, Jian-Guo; Zhou, Tao

    2008-12-01

Keywords in scientific articles have found their significance in information filtering and classification. In this article, we empirically investigated statistical characteristics and evolutionary properties of keywords in a very famous journal, namely Proceedings of the National Academy of Sciences of the United States of America (PNAS), including frequency distribution, temporal scaling behavior, and decay factor. The empirical results indicate that the keyword frequency in PNAS approximately follows a Zipf's law with exponent 0.86. In addition, there is a power-law correlation between the cumulative number of distinct keywords and the cumulative number of keyword occurrences. Extensive empirical analysis on some other journals' data is also presented, with the decaying trends of the most popular keywords being monitored. Interestingly, top journals from various subjects share a very similar decaying tendency, while journals with low impact factors exhibit completely different behavior. These empirical characteristics may shed some light on an in-depth understanding of semantic evolutionary behaviors. In addition, the analysis of keyword-based systems is helpful for the design of corresponding recommender systems.
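The Zipf-law fit reported above is straightforward to reproduce. The sketch below (plain NumPy) estimates the exponent by least-squares regression in log-log space; the synthetic keyword-count list is an illustrative stand-in for the PNAS data, not the actual corpus.

```python
import numpy as np

def zipf_exponent(frequencies):
    """Estimate the Zipf exponent s in f(r) ~ r^(-s) by regressing
    log(frequency) on log(rank) for a list of raw keyword counts."""
    freqs = np.sort(np.asarray(frequencies, dtype=float))[::-1]
    ranks = np.arange(1, len(freqs) + 1)
    # Zipf's law: log f = c - s * log r, so the slope gives -s
    slope, _intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return -slope

# Synthetic counts generated to follow f(r) = 1000 * r^(-0.86) exactly
counts = [1000.0 * r ** -0.86 for r in range(1, 201)]
s = zipf_exponent(counts)
```

On real keyword counts, the log-log plot is typically linear only over a middle range of ranks, so in practice one would restrict the fit to that range.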

  8. Determining VA physician requirements through empirically based models.

    PubMed Central

    Lipscomb, J; Kilpatrick, K E; Lee, K L; Pieper, K S

    1995-01-01

OBJECTIVE: As part of a project to estimate physician requirements for the Department of Veterans Affairs, the Institute of Medicine (IOM) developed and tested empirically based models of physician staffing, by specialty, that could be applied to each VA facility. DATA SOURCE/STUDY SETTING: These analyses used selected data on all patient encounters and all facilities in VA's management information systems for FY 1989. STUDY DESIGN: Production functions (PFs), with patient workload dependent on physicians, other providers, and nonpersonnel factors, were estimated for each of 14 patient care areas in a VA medical center. Inverse production functions (IPFs), with physician staffing levels dependent on workload and other factors, were estimated for each of 11 specialty groupings. These models provide complementary approaches to deriving VA physician requirements for patient care and medical education. DATA COLLECTION/EXTRACTION METHODS: All data were assembled by VA and put in analyzable SAS data sets containing FY 1989 workload and staffing variables used in the PFs and IPFs. All statistical analyses reported here were conducted by the IOM. PRINCIPAL FINDINGS: Existing VA data can be used to develop statistically strong, clinically plausible, empirically based models for calculating physician requirements, by specialty. These models can (1) compare current physician staffing in a given setting with systemwide norms and (2) yield estimates of future staffing requirements conditional on future workload. CONCLUSIONS: Empirically based models can play an important role in determining VA physician staffing requirements. VA should test, evaluate, and revise these models on an ongoing basis. PMID:7860320

  9. Unsupervised self-organized mapping: a versatile empirical tool for object selection, classification and redshift estimation in large surveys

    NASA Astrophysics Data System (ADS)

    Geach, James E.

    2012-01-01

We present an application of unsupervised machine learning - the self-organized map (SOM) - as a tool for visualizing, exploring and mining the catalogues of large astronomical surveys. Self-organization culminates in a low-resolution representation of the 'topology' of a parameter volume, and this can be exploited in various ways pertinent to astronomy. Using data from the Cosmological Evolution Survey (COSMOS), we demonstrate two key astronomical applications of the SOM: (i) object classification and selection, using galaxies with active galactic nuclei as an example, and (ii) photometric redshift estimation, illustrating how SOMs can be used as totally empirical predictive tools. With a training set of ~3800 galaxies with z_spec ≤ 1, we achieve photometric redshift accuracies competitive with other (mainly template fitting) techniques that use a similar number of photometric bands [σ(Δz) = 0.03 with a ~2 per cent outlier rate when using u* band to 8 μm photometry]. We also test the SOM as a photo-z tool using the PHoto-z Accuracy Testing (PHAT) synthetic catalogue of Hildebrandt et al., which compares several different photo-z codes using a common input/training set. We find that the SOM can deliver accuracies that are competitive with many of the established template fitting and empirical methods. This technique is not without clear limitations, which are discussed, but we suggest it could be a powerful tool in the era of extremely large ('petabyte') databases where efficient data mining is a paramount concern.
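As a rough illustration of the technique, the following is a minimal self-organized map in NumPy. The grid size, decay schedule, and toy two-feature "photometry" are illustrative assumptions, not the configuration used in the paper; a photo-z application would additionally attach the mean spectroscopic redshift of training galaxies to each trained cell.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a toy self-organized map on rows of `data`
    (n_samples x n_features); returns a grid of weight vectors."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    yy, xx = np.mgrid[0:h, 0:w]          # cell coordinates for the neighbourhood
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)              # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighbourhood
            # Best-matching unit: cell whose weights are closest to x
            dists = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighbourhood pulls nearby cells toward x
            g = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[:, :, None] * (x - weights)
            step += 1
    return weights

# Toy "photometry": two colour-like features for 200 objects
rng = np.random.default_rng(1)
photometry = rng.normal(size=(200, 2))
som = train_som(photometry)
```

The key design point is that training only uses the photometric features; any label (class, redshift) is mapped onto the trained grid afterwards, which is what makes the SOM a "totally empirical" predictive tool.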

  10. The fundamental points of the Second Military Survey of the Habsburg Empire

    NASA Astrophysics Data System (ADS)

    Timár, G.

    2009-04-01

    The Second Military Survey was carried out between 1806 and 1869 in the continuously changing territory of the Habsburg Empire. More than 4000 tiles of the 1:28,800 scale survey sheets cover the Empire, which was the second in territorial extents in Europe after Russia. In the terms of the cartography, the Empire has been divided into eight zones; each zones had its own Cassini-type projection with a center at a geodetically defined fundamental point. The points were the following: - Wien-Stephansdom (valid for Lower and Upper Austria, Hungary, Dalmacy, Moravia and Vorarlberg; latitude=48,20910N; longitude=16,37655E on local datum). - Gusterberg (valid for Bohemia; latitude=48,03903N; longitude=14,13976E). - Schöcklberg (valid for Styria; latitude=47,19899N; longitude=15,46902E). - Krimberg (valid for Illyria and Coastal Land; latitude=45,92903N; longitude=14,47423E). - Löwenberg (valid for Galizia and Bukovina; latitude=49,84889N; longitude=24,04639E). - Hermannstadt (valid for Transylvania; latitude=45,84031N; longitude=24,11297). - Ivanić (valid for Croatia; latitude=45,73924N; longitude=16,42309). - Milano, Duomo San Salvatore (valid for Lombardy, Venezia, Parma and Modena; latitude=45,45944N; longitude=9,18757E) - a simulated fundametal point for Tyrol and Salzburg, several hundred miles north of the territories. The poster shows systematically the fundamental points, their topographic settings and the present situation of the geodetic point sites.

  11. Nonparametric Bayes Factors Based On Empirical Likelihood Ratios

    PubMed Central

    Vexler, Albert; Deng, Wei; Wilding, Gregory E.

    2012-01-01

    Bayes methodology provides posterior distribution functions based on parametric likelihoods adjusted for prior distributions. A distribution-free alternative to the parametric likelihood is use of empirical likelihood (EL) techniques, well known in the context of nonparametric testing of statistical hypotheses. Empirical likelihoods have been shown to exhibit many of the properties of conventional parametric likelihoods. In this article, we propose and examine Bayes factors (BF) methods that are derived via the EL ratio approach. Following Kass & Wasserman [10], we consider Bayes factors type decision rules in the context of standard statistical testing techniques. We show that the asymptotic properties of the proposed procedure are similar to the classical BF’s asymptotic operating characteristics. Although we focus on hypothesis testing, the proposed approach also yields confidence interval estimators of unknown parameters. Monte Carlo simulations were conducted to evaluate the theoretical results as well as to demonstrate the power of the proposed test. PMID:23180904

  12. Space Based Dark Energy Surveys

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    2016-03-01

Dark energy, the name given to the cause of the accelerating expansion of the Universe, is one of the most tantalizing mysteries in modern physics. Current cosmological models hold that dark energy is currently the dominant component of the Universe, but the exact nature of DE remains poorly understood. There are ambitious ground-based surveys underway that seek to understand DE, and NASA is participating in the development of significantly more ambitious space-based surveys planned for the next decade. NASA has provided mission-enabling technology to the European Space Agency's (ESA) Euclid mission in exchange for US scientists to participate in the Euclid mission. NASA is also developing the Wide Field Infrared Survey Telescope-Astrophysics Focused Telescope Asset (WFIRST) mission for possible launch in 2024. WFIRST was the highest-ranked space mission in the Astro2010 Decadal Survey, and the current design uses a 2.4 m space telescope to go beyond what was then envisioned. Understanding DE is one of the primary science goals of WFIRST-AFTA. This talk will review the state of DE, the relevant activities of the Cosmic Structure Interest Group (CoSSIG) of the PhyPAG, and detail the status of and complementarity between Euclid, WFIRST and other ambitious ground-based efforts.

  13. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  14. What Can Student Perception Surveys Tell Us about Teaching? Empirically Testing the Underlying Structure of the Tripod Student Perception Survey

    ERIC Educational Resources Information Center

    Wallace, Tanner LeBaron; Kelcey, Benjamin; Ruzek, Erik

    2016-01-01

    We conducted a theory-based analysis of the underlying structure of the Tripod student perception survey instrument using the Measures of Effective Teaching (MET) database (N = 1,049 middle school math class sections; N = 25,423 students). Multilevel item factor analyses suggested that an alternative bifactor structure best fit the Tripod items,…

  16. Video watermarking with empirical PCA-based decoding.

    PubMed

    Khalilian, Hanieh; Bajic, Ivan V

    2013-12-01

A new method for video watermarking is presented in this paper. In the proposed method, data are embedded in the LL subband of wavelet coefficients, and decoding is performed based on the comparison among the elements of the first principal component resulting from empirical principal component analysis (PCA). The locations for data embedding are selected such that they offer the most robust PCA-based decoding. Data are inserted in the LL subband in an adaptive manner based on the energy of high frequency subbands and visual saliency. Extensive testing was performed under various types of attacks, such as spatial attacks (uniform and Gaussian noise and median filtering), compression attacks (MPEG-2, H.263, and H.264), and temporal attacks (frame repetition, frame averaging, frame swapping, and frame rate conversion). The results show that the proposed method offers improved performance compared with several methods from the literature, especially under additive noise and compression attacks.

  17. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data, explicitly recognizing that, given that a species occupies an area, the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with…
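The occupancy/detection simulation setting described above can be sketched in a few lines; the ψ, p, and site/occasion values below are illustrative assumptions, not the paper's. The naive occupancy estimator (fraction of sites with at least one detection) is biased low whenever p < 1, which is what motivates formal occupancy models.

```python
import numpy as np

def simulate_naive_occupancy(psi, p, n_sites, n_occasions, n_reps=1000, seed=0):
    """Simulate detection/non-detection histories for one species and
    return the mean naive occupancy estimate across replicates."""
    rng = np.random.default_rng(seed)
    estimates = np.empty(n_reps)
    for i in range(n_reps):
        occupied = rng.random(n_sites) < psi                 # true occupancy state
        detections = (rng.random((n_sites, n_occasions)) < p) & occupied[:, None]
        estimates[i] = np.mean(detections.any(axis=1))       # naive estimate
    return estimates.mean()

# A common, easy-to-detect species vs. a rare, hard-to-detect one
common = simulate_naive_occupancy(psi=0.8, p=0.5,  n_sites=40, n_occasions=20)
rare   = simulate_naive_occupancy(psi=0.2, p=0.05, n_sites=40, n_occasions=20)
```

With these assumed values, the common species' estimate sits near its true ψ = 0.8, while the rare, hard-to-detect species is noticeably underestimated (expected value ψ·(1 − (1 − p)^K) ≈ 0.13 rather than 0.2), illustrating why such species demand far more survey effort.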

  18. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-10-01

An empirically based, open source, optoelectronic model is constructed to accurately simulate organic photovoltaic (OPV) devices. Bulk heterojunction OPV devices based on a new low band gap dithienothiophene-diketopyrrolopyrrole donor polymer (P(TBT-DPP)) blended with PC70BM are processed under various conditions, with efficiencies up to 4.7%. The mobilities of electrons and holes, bimolecular recombination coefficients, exciton quenching efficiencies in donor and acceptor domains and optical constants of these devices are measured and input into the simulator to yield photocurrent with less than 7% error. The results from this model not only show carrier activity in the active layer but also elucidate new routes of device optimization by varying donor-acceptor composition as a function of position. Sets of high and low performance devices are investigated and compared side-by-side.

  19. Consumer segmentation based on the level and structure of fruit and vegetable intake: an empirical evidence for US adults from the National Health and Nutrition Examination Survey (NHANES) 2005-2006.

    PubMed

    Demydas, Tetyana

    2011-06-01

To identify consumption patterns of fruit and vegetables within a representative sample of US adults, with a focus on the degree of produce processing, and to explore the sociodemographic, lifestyle and nutritional profiles associated with these patterns. Cross-sectional analysis. Fruit and vegetable (F&V) consumption data were collected using two non-consecutive 24 h recalls. For the purpose of the study, F&V intakes were aggregated into seven subgroups indicating degree of processing, which were then used as inputs into cluster analysis. The 2005-2006 National Health and Nutrition Examination Survey. The sample consisted of 2444 adults aged 20-59 years. Total average F&V intake of the adults was below the recommended level; notably, 20% of the respondents consumed fruit only in the form of juice. Three F&V consumption patterns were identified: 'low-intake F&V consumers' (74% of respondents), 'consumers of healthier F&V options' (13%) and 'intensive fruit juice consumers' (13%). These groups differed markedly in terms of their sociodemographic, lifestyle and health characteristics, such as gender, age, race/ethnicity, education, smoking and weight status. Differences in nutrient profiles were also found, with the 'consumers of healthier F&V options' showing better nutritional quality than the other clusters. Only a small share of US adults combines high F&V intakes with the healthier F&V options that lead to a better nutritional profile. This raises discussion about the need to deliver more specific F&V promotion messages, including advice on healthier preparation methods, especially for specific population groups.

  20. Arts Entrepreneurship Education in the UK and Germany: An Empirical Survey among Lecturers in Fine Art

    ERIC Educational Resources Information Center

    Thom, Marco

    2017-01-01

    Purpose: The purpose of this paper is to report on the current state of arts entrepreneurship education at higher educational institutions (HEIs) in the UK and Germany. It is based on findings from questionnaire surveys among 210 lecturers in fine art at 89 HEIs in the UK and Germany. Design/methodology/approach: This paper explores issues related…

  1. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining the torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, which in this case is a ratchet wrench push and pull operation. What is presented here is a comparison of the computed or predicted results of the model with the actual measured values for the composite motion.
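The least-squares reduction of joint measurements to low-order polynomials described above can be sketched as follows. The functional form (a second-degree polynomial in position θ and velocity ω) and the synthetic "measurements" are illustrative assumptions; the paper's actual tables are per-joint and per-plane.

```python
import numpy as np

def fit_torque_model(position, velocity, torque):
    """Least-squares fit of torque ≈ c0 + c1*θ + c2*θ² + c3*ω + c4*ω²
    to measured joint data; returns the coefficient vector."""
    theta = np.asarray(position, dtype=float)
    omega = np.asarray(velocity, dtype=float)
    # Design matrix of second-degree polynomial terms in each variable
    X = np.column_stack([np.ones_like(theta), theta, theta**2, omega, omega**2])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(torque, dtype=float), rcond=None)
    return coeffs

# Synthetic "measurements" drawn from a known quadratic torque surface
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, np.pi, 100)        # joint position, rad
omega = rng.uniform(-2.0, 2.0, 100)         # joint velocity, rad/s
torque = 10 - 3 * theta + 0.5 * theta**2 + 2 * omega - 0.8 * omega**2
c = fit_torque_model(theta, omega, torque)
```

Once such a polynomial is fitted per joint, evaluating it along a prescribed trajectory of positions and velocities gives the predicted torques for a composite motion such as the ratchet wrench task.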

  2. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  4. Empirical Likelihood-Based ANOVA for Trimmed Means

    PubMed Central

    Velina, Mara; Valeinis, Janis; Greco, Luca; Luta, George

    2016-01-01

    In this paper, we introduce an alternative to Yuen’s test for the comparison of several population trimmed means. This nonparametric ANOVA type test is based on the empirical likelihood (EL) approach and extends the results for one population trimmed mean from Qin and Tsao (2002). The results of our simulation study indicate that for skewed distributions, with and without variance heterogeneity, Yuen’s test performs better than the new EL ANOVA test for trimmed means with respect to control over the probability of a type I error. This finding is in contrast with our simulation results for the comparison of means, where the EL ANOVA test for means performs better than Welch’s heteroscedastic F test. The analysis of a real data example illustrates the use of Yuen’s test and the new EL ANOVA test for trimmed means for different trimming levels. Based on the results of our study, we recommend the use of Yuen’s test for situations involving the comparison of population trimmed means between groups of interest. PMID:27690063
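For reference, the two-sample version of Yuen's test (the comparator recommended above) can be written with NumPy/SciPy as below. The 20% trimming level is the conventional default and the data are synthetic; this is a sketch of the standard test, not the authors' EL ANOVA procedure.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Two-sample Yuen's test for equality of trimmed means.
    Returns the test statistic and a two-sided p-value."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))

    def pieces(a):
        n = len(a)
        g = int(trim * n)
        w = a.copy()
        w[:g] = a[g]                  # winsorize lower tail
        w[n - g:] = a[n - g - 1]      # winsorize upper tail
        h = n - 2 * g                 # effective (trimmed) sample size
        d = np.var(w, ddof=1) * (n - 1) / (h * (h - 1))
        return d, h

    dx, hx = pieces(x)
    dy, hy = pieces(y)
    t = (stats.trim_mean(x, trim) - stats.trim_mean(y, trim)) / np.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx**2 / (hx - 1) + dy**2 / (hy - 1))  # Welch-type df
    return t, 2 * stats.t.sf(abs(t), df)

rng = np.random.default_rng(0)
t_null, p_null = yuen_test(rng.normal(0, 1, 60), rng.normal(0, 1, 60))
t_shift, p_shift = yuen_test(rng.normal(0, 1, 60), rng.normal(2, 1, 60))
```

Winsorized variances appear in the denominator because, after trimming, the remaining order statistics are no longer independent; the Welch-type degrees of freedom handle unequal group variances.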

  5. Are smokers rational addicts? Empirical evidence from the Indonesian Family Life Survey

    PubMed Central

    2011-01-01

Background: Indonesia is one of the largest consumers of tobacco in the world; however, there has been little work done on the economics of tobacco addiction. This study provides an empirical test of the rational addiction (henceforth RA) hypothesis of cigarette demand in Indonesia. Methods: Four estimators (OLS, 2SLS, GMM, and System-GMM) were explored to test the RA hypothesis. The author adopted several diagnostic tests to select the best estimator to overcome the econometric problems posed by the presence of past and future cigarette consumption (suspected endogenous variables). Short-run and long-run price elasticities of cigarette demand were then calculated. The model was applied to pooled individual data derived from three waves of a panel of the Indonesian Family Life Survey spanning the period 1993-2000. Results: The past cigarette consumption coefficients turned out to be positive with a p-value < 1%, implying that cigarettes are indeed an addictive good. The rational addiction hypothesis was rejected in favour of a myopic one. The short-run cigarette price elasticity for males and females was estimated to be -0.38 and -0.57, respectively, and the long-run one was -0.4 and -3.85, respectively. Conclusions: Health policymakers should redesign the current public health campaign against cigarette smoking in the country. Given that the demand for cigarettes is more price sensitive in the long run (and for females) than in the short run (and for males), an increase in the price of cigarettes could lead to a significant fall in cigarette consumption in the long run rather than serving as a constant source of government revenue. PMID:21345229

  6. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-04-01

We develop an empirically based optoelectronic model to accurately simulate the photocurrent in organic photovoltaic (OPV) devices with novel materials, including bulk heterojunction OPV devices based on a new low band gap dithienothiophene-DPP donor polymer, P(TBT-DPP), blended with PC70BM at various donor-acceptor weight ratios and solvent compositions. Our devices exhibit power conversion efficiencies ranging from 1.8% to 4.7% at AM 1.5G. Electron and hole mobilities are determined using space-charge limited current measurements. Bimolecular recombination coefficients are both analytically calculated using slowest-carrier limited Langevin recombination and measured using an electro-optical pump-probe technique. Exciton quenching efficiencies in the donor and acceptor domains are determined from photoluminescence spectroscopy. In addition, dielectric and optical constants are experimentally determined. The photocurrent and its bias dependence that we simulate using the optoelectronic model we develop, which takes into account these physically measured parameters, show less than 7% error with respect to the experimental photocurrent (whether the experimentally or the semi-analytically determined recombination coefficient is used). Free carrier generation and recombination rates of the photocurrent are modeled as a function of the position in the active layer at various applied biases. These results show that while free carrier generation is maximized in the center of the device, free carrier recombination is most dominant near the electrodes even in high performance devices. Such knowledge of carrier activity is essential for the optimization of the active layer by enhancing light trapping and minimizing recombination. Our simulation program is intended to be freely distributed for use in laboratories fabricating OPV devices.

  7. 'The Healthy Migrant Effect' for Mental Health in England: Propensity-score Matched Analysis Using the EMPIRIC Survey.

    PubMed

    Dhadda, Amrit; Greene, Giles

    2017-04-07

Evidence has demonstrated that immigrants have a mental health advantage over the indigenous populations of developed countries. However, much of the evidence base demonstrating this advantage is susceptible to confounding and inadequate adjustment across immigrant and non-immigrant groups, preventing a rigorous assessment of a 'healthy migrant effect'. The aim was to compare the risk of common mental disorders between the immigrant and non-immigrant populations in ethnic minority groups in England. A propensity-score matched analysis was carried out to balance immigrant and non-immigrant groups for known confounders, using the EMPIRIC national survey of Black-Caribbean, Indian, Pakistani and Bangladeshi groups. The mental health of participants was assessed using the validated Revised Clinical Interview Schedule. Immigrant participants were significantly less likely to have a common mental disorder than non-immigrant participants; OR = 0.47 (95% CI 0.40, 0.56). The results from this study demonstrate that a mental health advantage exists in ethnic minority immigrants compared to non-immigrants when the two groups are balanced for confounding factors. This may be due to immigrants possessing certain personality traits, such as "psychological hardiness", that the migration process may select for.
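
The matching step described above can be sketched with greedy 1:1 nearest-neighbour matching on propensity scores. This is a generic illustration, not the EMPIRIC study's actual procedure; the score distributions and the `odds_ratio` helper are hypothetical, and real analyses would first estimate scores from covariates (e.g. via logistic regression).

```python
import numpy as np

def match_nearest(ps_treat, ps_ctrl, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    Returns (treated_index, control_index) pairs; each control is used once."""
    available = set(range(len(ps_ctrl)))
    pairs = []
    for i, p in enumerate(ps_treat):
        if not available:
            break
        j = min(available, key=lambda k: abs(ps_ctrl[k] - p))
        if abs(ps_ctrl[j] - p) <= caliper:  # only match within the caliper
            pairs.append((i, j))
            available.remove(j)
    return pairs

def odds_ratio(cases_a, n_a, cases_b, n_b):
    """Unadjusted odds ratio of an outcome in group A versus group B."""
    return (cases_a / (n_a - cases_a)) / (cases_b / (n_b - cases_b))

# Hypothetical scores for illustration only
rng = np.random.default_rng(1)
ps_immigrant = rng.uniform(0.2, 0.8, 50)
ps_non_immigrant = rng.uniform(0.1, 0.9, 100)
matched = match_nearest(ps_immigrant, ps_non_immigrant)
```

The caliper discards treated units with no sufficiently close control, trading sample size for better covariate balance.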

  8. Assessing formal teaching of ethics in physiology: an empirical survey, patterns, and recommendations.

    PubMed

    Goswami, Nandu; Batzel, Jerry Joseph; Hinghofer-Szalkay, Helmut

    2012-09-01

Ethics should be an important component of physiological education. In this report, we examined to what extent the teaching of ethics is formally incorporated into the physiology curriculum. We carried out an e-mail survey in which we asked recipients whether their institution offered a course or lecture on ethics as part of its physiology teaching, using the following query: "We are now doing an online survey in which we would like to know whether you offer a course or a lecture on ethics as part of your physiology teaching curriculum." The response rate was 53.3%: we received 104 responses out of a total of 195 queries sent. Our responses came from 45 countries. While all of our respondents confirmed that there was a need for ethics during medical education and scientific training, the degree of inclusion of formal ethics in the physiology curriculum varied widely. Our survey showed that, in most cases (69%), including at our Medical University of Graz, ethics is not incorporated into the physiology curriculum. Given this result, we suggest specific topics related to ethics and ethical considerations that could be integrated into the physiology curriculum. We present here a template example of a lecture, "Teaching Ethics in Physiology" (structure, content, examples, and references), which was based on guidelines and case reports provided by experts in this area (e.g., Benos DJ. Ethics revisited. Adv Physiol Educ 25: 189-190, 2001). This lecture, which we are presently using in Graz, could be used as a base that could lead to greater awareness of important ethical issues in students at an early point in the educational process.

  9. Terahertz Spectrum Analysis Based on Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Su, Yunpeng; Zheng, Xiaoping; Deng, Xiaojiao

    2017-08-01

    Precise identification of terahertz absorption peaks for materials with low concentration and high attenuation still remains a challenge. Empirical mode decomposition was applied to terahertz spectrum analysis in order to improve the performance on spectral fingerprints identification. We conducted experiments on water vapor and carbon monoxide respectively with terahertz time domain spectroscopy. By comparing their absorption spectra before and after empirical mode decomposition, we demonstrated that the first-order intrinsic mode function shows absorption peaks clearly in high-frequency range. By comparing the frequency spectra of the sample signals and their intrinsic mode functions, we proved that the first-order function contains most of the original signal's energy and frequency information so that it cannot be left out or replaced by high-order functions in spectral fingerprints detection. Empirical mode decomposition not only acts as an effective supplementary means to terahertz time-domain spectroscopy but also shows great potential in discrimination of materials and prediction of their concentrations.
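
Extraction of the first intrinsic mode function works by "sifting": subtracting the mean of spline envelopes through the local maxima and minima. Below is a minimal sketch of one such sifting loop on a synthetic two-tone signal, not the authors' implementation; real EMD uses a convergence criterion and boundary handling rather than a fixed iteration count.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_first_imf(signal, t, n_iter=10):
    """Approximate the first intrinsic mode function by repeated sifting:
    subtract the mean of cubic-spline envelopes through the local extrema."""
    h = np.asarray(signal, float).copy()
    for _ in range(n_iter):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:
            break  # too few extrema to build envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h -= (upper + lower) / 2.0
    return h

# Two-tone toy signal: the first IMF should isolate the fast component
t = np.linspace(0.0, 1.0, 2000)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
imf1 = sift_first_imf(x, t)
```

Away from the boundaries (where spline extrapolation distorts the envelopes), `imf1` closely tracks the 50 Hz component, which mirrors the paper's finding that the first IMF carries the high-frequency content.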

  10. Assessment of Young Children Using the Achenbach System of Empirically Based Assessment (ASEBA)

    ERIC Educational Resources Information Center

    Rescorla, Leslie A.

    2005-01-01

    After providing a brief review of three other approaches to assessment of preschool children (DSM-IV diagnoses, "Zero to Three" diagnoses, and temperament scales), this paper focuses on the Achenbach System of Empirically Based Assessment (ASEBA). The empirically based assessment paradigm provides user-friendly, cost-effective, reliable,…

  11. Short memory or long memory: an empirical survey of daily rainfall data

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.

    2012-10-01

A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts, proposed by Geweke and Porter-Hudak, to analyze nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test with an F-statistic for the break date was applied; break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries, obtained by splitting the original series at the break date, confirm that there is long memory in the data generating process (DGP). Therefore this evidence shows a true long memory not due to a structural break.

  12. Structural break or long memory: an empirical survey on daily rainfall data sets across Malaysia

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.; Yusop, Z.

    2013-04-01

A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts, proposed by Geweke and Porter-Hudak (1983), to analyse nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test for the break date was applied. Break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries, obtained by splitting the original series at the break date, confirm that there is long memory in the data generating process (DGP). Therefore this evidence shows a true long memory not due to a structural break.

  13. Teaching technical skills to surgical residents: a survey of empirical research.

    PubMed

    Hamstra, Stanley J; Dubrowski, Adam; Backstein, David

    2006-08-01

    We review a series of empirical studies on the use of simulators and bench models in training technical skills and subsequent retention of those skills. We discuss recent research on the transfer of training from bench models and simulators to the clinical setting and provide a theoretical structure to organize the findings. The transfer of training from inanimate bench models and simulators to live patients has recently been demonstrated in a number of areas. The effectiveness of this training is enhanced if focus is placed on the operative, or process-oriented, aspects of the procedure, with suspension of disbelief regarding the physical structure of the training platform. The retention of trained skills is an area of research only beginning to evolve, with recent results suggesting that effective retention can be demonstrated if training is tightly focused and involves an entire procedure. An emerging area of research involves the use of simulators as assessment instruments for high-stakes testing, and recent results involving simulated trauma management support this novel application. Based on these findings, we encourage the use of a wide variety of high- and low-fidelity platforms, with emphasis on training procedural knowledge involving an entire procedure.

  14. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcome broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species-normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts left unchecked. Summary The "empirical turn" in bioethics signals a need for

  15. MAIS: An Empirically-Based Intelligent CBI System.

    ERIC Educational Resources Information Center

    Christensen, Dean L.; Tennyson, Robert D.

    The goal of the programmatic research program for the Minnesota Adaptive Instructional System (MAIS), an intelligent computer-assisted instruction system, is to empirically investigate generalizable instructional variables and conditions that improve learning through the use of adaptive instructional strategies. Research has been initiated in the…

  16. Arduino based radiation survey meter

    SciTech Connect

Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee; Muzakkir, Amir

    2016-01-22

This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce count-per-second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsieverts per hour (μSv/hr). The conversion factor (CF) for converting CPM to μSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 μSv/hr.

  17. Arduino based radiation survey meter

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Muzakkir, Amir; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee

    2016-01-01

This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce count-per-second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsieverts per hour (μSv/hr). The conversion factor (CF) for converting CPM to μSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 μSv/hr.
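
The CPS-to-dose-rate conversion the Arduino performs reduces to scaling the count rate by a calibration factor. The sketch below illustrates the arithmetic only; the conversion factor shown is a hypothetical placeholder, since the real CF comes from the LND7121 data sheet or the calibration procedure the paper describes.

```python
# Hypothetical conversion factor; the real value comes from the tube
# data sheet or from calibration, as described in the paper.
CF_USV_PER_HR_PER_CPM = 0.0057

def cps_to_dose_rate(cps, cf=CF_USV_PER_HR_PER_CPM):
    """Convert a Geiger-Muller count rate (counts/s) to a dose rate in uSv/hr."""
    cpm = cps * 60.0  # counts per second -> counts per minute
    return cpm * cf   # counts per minute -> uSv/hr

rate = cps_to_dose_rate(10)  # 10 CPS = 600 CPM
```

On the microcontroller the same two multiplications run after each Timer1 counting window, so the display update cost is negligible compared with the counting interval.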

  18. An empirical test of short notice surveys in two accreditation programmes.

    PubMed

    Greenfield, David; Moldovan, Max; Westbrook, Mary; Jones, Deborah; Low, Lena; Johnston, Brian; Clark, Stephen; Banks, Margaret; Pawsey, Marjorie; Hinchcliff, Reece; Westbrook, Johanna; Braithwaite, Jeffrey

    2012-02-01

To evaluate short notice surveys in accreditation programmes. Two trials using short notice surveys were conducted independently: a study of 20 healthcare organizations with the Australian Council on Healthcare Standards (ACHS) and a study of 7 general practices with the Australian General Practice Accreditation Limited (AGPAL). Participating organizations volunteered. ACHS and AGPAL selected 17 and 13 surveyors, respectively, and provided training for them on short notice surveys. Each agency's short notice surveys were an abbreviated version of their current advanced notification surveys. Short notice surveys assessed accreditation programme criteria or indicators that corresponded to the Australian Commission on Safety and Quality in Health Care's priority issues. Fifteen (out of 45) ACHS criteria and 48 (out of 174) AGPAL indicators that aligned to the Commission's criteria were evaluated. Participating organizations were given 2 days' notice prior to the short notice surveys. Ratings from the short notice surveys were compared with those from the most recent advanced notification surveys, and statistical tests were performed to detect differences and potential confounding factors. Surveyors and organizational staff completed a post-survey feedback questionnaire which was analysed thematically and by inferential statistics. The short notice survey approach overall produced ratings congruent with the advanced notification survey for both accreditation programmes. However, in both programmes the short notice surveys judged that more organizations would fall short of the accreditation threshold than the previous survey had. Organizations in both programmes were judged to have achieved less successful performance against clinical standards by the short notice survey than the advanced notification survey. There was support from surveyors and organizational staff for the short notice survey approach to be adopted.
However, there were mixed views about the impact of short notice

  19. Empirical relations between sense of coherence and self-efficacy, National Danish Survey.

    PubMed

    Trap, Rete; Rejkjær, Lillan; Hansen, Ebba Holme

    2016-09-01

Salutogenic orientation is a health promotion paradigm focusing on the resources of the individual. This study analyzed the relationship between sense of coherence (SOC) and self-efficacy (SE) based on population data. By conducting an empirical analysis of the two models, we wanted to see whether we could make a valid judgement as to whether both SOC and SE could be utilized in health promotion practice, or whether one is preferable to the other. The study population was randomly selected from the Danish Central Population Register and consisted of five birth-year cohorts (1920, 1930, 1940, 1965 and 1975). The study used the 13-item SOC scale and the general SE scale. The main findings were that the SOC score increased by age cohort (p = 0.0004) and that there is a positive and graded correlation between SOC and SE (r = 0.39; p < 0.0001; adjusted OR = 10.3, CI = 6.7-15.4). We found the strongest association at the lowest level of SOC. For health promotion practice, this finding signifies the importance of focusing on improving SOC in people with a low SOC score, as they are most in need and most likely to increase their SOC level. The finding of higher SOC scores in the older age cohorts indicates that SOC changes over a lifetime. Public health work focusing on lifestyle change by increasing SOC can be effective throughout life; however, early intervention is important. The finding of a positive correlation between SOC and SE indicates that health promotion altering one of the constructs is paralleled in the other.

  20. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

A ground based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  1. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    ERIC Educational Resources Information Center

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008")…

  2. Comparison of empirical, semi-empirical and physically based models of soil hydraulic functions derived for bi-modal soils

    NASA Astrophysics Data System (ADS)

    Kutílek, M.; Jendele, L.; Krejča, M.

    2009-02-01

The accelerated flow in soil pores is responsible for the rapid transport of pollutants from the soil surface to deeper layers and eventually to groundwater. The term preferential flow is used for this type of transport. Our study was aimed at preferential flow realized in the structural porous domain of bi-modal soils. We compared equations describing the soil water retention function h(θ) and the unsaturated hydraulic conductivity K(h), or alternatively K(θ), modified for bi-modal soils, where θ is the soil water content and h is the pressure head. An analytical description of a curve passing through experimental data sets of a soil hydraulic function is typical of an empirical equation characterized by fitting parameters only. If the measured data are described by an equation derived from a physical model without using fitting parameters, we speak of a physically based model. Several transitional subtypes exist between empirical and physically based models; they are denoted as semi-empirical or semi-physical. We tested 3 models of the soil water retention function and 3 models of unsaturated conductivity using experimental data sets of sand, silt, silt loam and loam. All soils used are characterized by the bi-modality of their porous systems. Model efficiency was estimated by RMSE (root mean square error) and RSE (relative square error). The semi-empirical equation of the soil water retention function had the lowest values of RMSE and RSE and was qualified as "optimal" for the formal description of the shape of the water retention function. With this equation, the fit of the modelled data to experiments was the closest. The fitting parameters smoothed the difference between the model and the physical reality of the soil porous media. The physical equation based upon the model of the pore size distribution did not allow exact fitting of the modelled data to the experimental data due to the rigidity and simplicity of the physical model when compared to the real soil
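
The two efficiency measures used to rank the retention and conductivity models can be sketched directly. The RSE definition below (residual sum of squares over total sum of squares about the observed mean) is one common convention; the paper may use a variant.

```python
import numpy as np

def rmse(obs, mod):
    """Root mean square error between observed and modelled values."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return np.sqrt(np.mean((obs - mod) ** 2))

def rse(obs, mod):
    """Relative square error: residual sum of squares scaled by the total
    sum of squares about the observed mean (one common definition)."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return np.sum((obs - mod) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

A perfect fit gives 0 for both; an RSE of 1 means the model does no better than predicting the mean of the observations.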

  3. A Survey of Graduate Training in Empirically Supported and Manualized Treatments: A Preliminary Report

    ERIC Educational Resources Information Center

    Karekla, Maria; Lundgren, Jennifer D.; Forsyth, John P.

    2004-01-01

    The promotion and dissemination of empirically supported (ESTs) and manualized therapies are important, albeit controversial, developments within clinical science and practice. To date, studies evaluating training opportunities and attitudes about such treatments at the graduate, predoctoral internship, and postdoctoral levels have focused on the…

  5. Methods for Evaluating Respondent Attrition in Web-Based Surveys

    PubMed Central

    Sabo, Roy T; Krist, Alex H; Day, Teresa; Cyrus, John; Woolf, Steven H

    2016-01-01

Background Electronic surveys are convenient, cost-effective, and increasingly popular tools for collecting information. While the online platform allows researchers to recruit and enroll more participants, there is an increased risk of participant dropout in Web-based research. Often, these dropout trends are simply reported, adjusted for, or ignored altogether. Objective To propose a conceptual framework that analyzes respondent attrition and demonstrates the utility of these methods with existing survey data. Methods First, we suggest visualization of attrition trends using bar charts and survival curves. Next, we propose a generalized linear mixed model (GLMM) to detect or confirm significant attrition points. Finally, we suggest applications of existing statistical methods to investigate the effect of internal survey characteristics and patient characteristics on dropout. In order to apply this framework, we conducted a case study: a seventeen-item Informed Decision-Making (IDM) module addressing how and why patients make decisions about cancer screening. Results Using the framework, we were able to find significant attrition points at Questions 4, 6, 7, and 9, and were also able to identify participant responses and characteristics associated with dropout at these points and overall. Conclusions When these methods were applied to survey data, significant attrition trends were revealed, both visually and empirically, that can inspire researchers to investigate the factors associated with survey dropout, address whether survey completion is associated with health outcomes, and compare attrition patterns between groups. The framework can be used to extract information beyond simple responses, can be useful during survey development, and can help determine the external validity of survey results. PMID:27876687

  6. Methods for Evaluating Respondent Attrition in Web-Based Surveys.

    PubMed

    Hochheimer, Camille J; Sabo, Roy T; Krist, Alex H; Day, Teresa; Cyrus, John; Woolf, Steven H

    2016-11-22

Electronic surveys are convenient, cost-effective, and increasingly popular tools for collecting information. While the online platform allows researchers to recruit and enroll more participants, there is an increased risk of participant dropout in Web-based research. Often, these dropout trends are simply reported, adjusted for, or ignored altogether. To propose a conceptual framework that analyzes respondent attrition and demonstrates the utility of these methods with existing survey data. First, we suggest visualization of attrition trends using bar charts and survival curves. Next, we propose a generalized linear mixed model (GLMM) to detect or confirm significant attrition points. Finally, we suggest applications of existing statistical methods to investigate the effect of internal survey characteristics and patient characteristics on dropout. In order to apply this framework, we conducted a case study: a seventeen-item Informed Decision-Making (IDM) module addressing how and why patients make decisions about cancer screening. Using the framework, we were able to find significant attrition points at Questions 4, 6, 7, and 9, and were also able to identify participant responses and characteristics associated with dropout at these points and overall. When these methods were applied to survey data, significant attrition trends were revealed, both visually and empirically, that can inspire researchers to investigate the factors associated with survey dropout, address whether survey completion is associated with health outcomes, and compare attrition patterns between groups. The framework can be used to extract information beyond simple responses, can be useful during survey development, and can help determine the external validity of survey results.
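
The attrition visualizations this framework begins with (bar charts of per-item dropout and a survival curve) reduce to simple counting. The sketch below illustrates that counting step on toy data with a hypothetical dropout spike; it is not the authors' GLMM, which would be fitted on top of such counts.

```python
import numpy as np

def attrition_profile(last_item, n_items):
    """Survival curve and per-item dropout hazard from each respondent's
    last answered item (0 = dropped immediately, n_items = completed)."""
    last_item = np.asarray(last_item)
    n = len(last_item)
    surviving = np.array([(last_item >= q).sum() for q in range(n_items + 1)])
    dropped = surviving[:-1] - surviving[1:]          # dropouts right after item q
    hazard = dropped / np.maximum(surviving[:-1], 1)  # share of those still present
    return surviving / n, hazard

# Toy data for a 17-item module: a dropout spike after item 4
last = np.array([17] * 80 + [4] * 15 + [9] * 5)
survival, hazard = attrition_profile(last, 17)
```

Plotting `survival` gives the survival curve and `hazard` the bar chart; spikes in the hazard are the candidate attrition points a GLMM would then test formally.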

  7. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  8. Landfill modelling in LCA - a contribution based on empirical data.

    PubMed

    Obersteiner, Gudrun; Binner, Erwin; Mostbauer, Peter; Salhofer, Stefan

    2007-01-01

Landfills at various stages of development, depending on their age and location, can be found throughout Europe. The type of facility ranges from uncontrolled dumpsites to highly engineered facilities with leachate and gas management. In addition, some landfills are designed to receive untreated waste, while others can receive incineration residues (MSWI) or residues after mechanical biological treatment (MBT). The dimension, type and duration of the emissions from landfills depend on the quality of the disposed waste, the technical design, and the location of the landfill. Environmental impacts are produced by the leachate (heavy metals, organic loading), by emissions into the air (CH4, hydrocarbons, halogenated hydrocarbons) and by the energy or fuel requirements for the operation of the landfill (SO2 and NOx from the production of electricity from fossil fuels). Including landfilling in a life-cycle assessment (LCA) entails several methodological questions (multi-input process, site-specific influence, time dependency). Additionally, no experience is available with regard to the mid-term behaviour (decades) of the relatively new types of landfill (MBT landfills, landfills for residues from MSWI). The present paper focuses on two main issues concerning the modelling of landfills in LCA. Firstly, it is an acknowledged fact that emissions from landfills may prevail for a very long time, often thousands of years or longer; the choice of time frame in the LCA of landfilling may therefore clearly affect the results. Secondly, the reliability of results obtained through a life-cycle assessment depends on the availability and quality of Life Cycle Inventory (LCI) data; therefore the choice of the general approach, a multi-input inventory tool versus empirical results, may also influence the results. In this paper the different approaches concerning time horizon and LCI will be introduced and discussed. In the application of empirical results, the presence of

  9. Exoplanet Demographics with a Space-Based Microlensing Survey

    NASA Astrophysics Data System (ADS)

    Gaudi, B. Scott

    2012-05-01

    Measurements of the frequency of exoplanets over a broad range of planet and host star properties provide fundamental empirical constraints on theories of planet formation and evolution. Because of its unique sensitivity to low-mass, long-period, and free-floating planets, microlensing is an essential complement to our arsenal of planet detection methods. I motivate microlensing surveys for exoplanets, and in particular describe how they can be used to test the currently-favored paradigm for planet formation, as well as inform our understanding of the frequency and potential habitability of low-mass planets located in the habitable zones of their host stars. I explain why a space-based mission is necessary to realize the full potential of microlensing, and outline the expected returns of such surveys. When combined with the results from complementary surveys such as Kepler, a space-based microlensing survey will yield a nearly complete picture of the demographics of planetary systems throughout the Galaxy.

  10. Empirical study of ARFIMA model based on fractional differencing

    NASA Astrophysics Data System (ADS)

    Xiu, Jin; Jin, Yao

    2007-04-01

    In this paper, we studied the long-term memory of the Hong Kong Hang Seng index using MRS analysis, established an ARFIMA model for it, and detailed the procedure of fractional differencing. Furthermore, we compared the ARFIMA model built by this means with one that used first-order differencing instead. The results showed that first-order differencing discards much useful information in the time series. The forecast formula of the ARFIMA model was corrected according to the method of fractional differencing and employed in the empirical study. The forecast performance of the ARFIMA model was not as good as expected, since the model was ineffective in forecasting the Hang Seng index; this conclusion was supported from two different aspects.
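The fractional-differencing step that this record describes can be sketched numerically. The recursion below computes the binomial-expansion weights of the fractional difference operator (1-B)^d and applies them to a series; the series and memory parameter are illustrative, not taken from the paper:

```python
import numpy as np

def frac_diff_weights(d, n):
    """Weights of (1 - B)^d via the recursion w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.zeros(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Fractionally difference x, truncating the infinite expansion at len(x)."""
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    # y_t = sum_{k=0}^{t} w_k * x_{t-k}
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])
```

For d = 1 the weights collapse to (1, -1, 0, ...), recovering ordinary first-order differencing; fractional 0 < d < 1 retains a slowly decaying tail of weights, which is exactly the long-memory information that first-order differencing discards.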

  11. Empirical Mode Decomposition Based Features for Diagnosis and Prognostics of Systems

    DTIC Science & Technology

    2008-04-01

    Report by Hiralal Khatri, Kenneth Ranney, Kwok Tom, and Romeo… (Army Research Laboratory, Adelphi, MD 20783-1197; ARL-TR-4301, April 2008): Empirical Mode Decomposition Based Features for Diagnosis and Prognostics of Systems. The record's snippet also cites related work, e.g. "…bearing fault diagnosis – their effectiveness and flexibilities," Journal of Vibration and Acoustics, ASME, July 2001, and Staszewski, W. J., Structural…

  12. Bridging process-based and empirical approaches to modeling tree growth

    Treesearch

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  13. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    PubMed

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-01-11

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters, including social groups, relationships, and communication variables (also from survey data), are encoded to simulate the social processes that influence community opinion. The model demonstrates its capability to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.
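The opinion-dynamics core of such an agent-based model can be sketched in a few lines. The pairwise-discussion rule, the convergence parameter mu, and all figures below are illustrative stand-ins, not the survey-calibrated parameters or the risk-publics mechanics of the actual model:

```python
import numpy as np

def simulate(n_agents=100, steps=2000, mu=0.3, seed=1):
    """Minimal opinion-dynamics sketch: each step a random pair of agents
    'discusses' water reuse and each moves a fraction mu toward the other.
    Parameters are illustrative, not drawn from the paper's survey data."""
    rng = np.random.default_rng(seed)
    opinions = rng.random(n_agents)          # acceptance in [0, 1]
    for _ in range(steps):
        i, j = rng.choice(n_agents, size=2, replace=False)
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift                 # symmetric update conserves the mean
        opinions[j] -= shift
    return opinions
```

Each discussion pulls a pair of opinions closer while leaving their sum unchanged, so repeated interaction drives the community toward consensus; clustering into distinct opinion groups, as in the paper, requires additional structure such as bounded confidence or social networks.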

  14. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Water-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  16. Attachment-Based Family Therapy: A Review of the Empirical Support.

    PubMed

    Diamond, Guy; Russon, Jody; Levy, Suzanne

    2016-09-01

    Attachment-based family therapy (ABFT) is an empirically supported treatment designed to capitalize on the innate, biological desire for meaningful and secure relationships. The therapy is grounded in attachment theory and provides an interpersonal, process-oriented, trauma-focused approach to treating adolescent depression, suicidality, and trauma. Although a process-oriented therapy, ABFT offers a clear structure and road map to help therapists quickly address attachment ruptures that lie at the core of family conflict. Several clinical trials and process studies have demonstrated empirical support for the model and its proposed mechanism of change. This article provides an overview of the clinical model and the existing empirical support for ABFT.

  17. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    NASA Astrophysics Data System (ADS)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

    Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and characterization of laser-irradiated tissue. However, an accurate and simple analytical equation for estimating diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling of diffuse reflectance from tissue.
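The surface-fitting step can be illustrated with a least-squares fit over a lookup table. The grid, the synthetic reflectance values, and the bilinear functional form below are all assumptions for illustration; a real table would come from Monte Carlo photon transport, and the paper's empirical formula is not reproduced here:

```python
import numpy as np

# Stand-in lookup table: reflectance R on a grid of absorption (mu_a) and
# reduced scattering (mu_s') coefficients. The synthetic surface below only
# illustrates the fitting step; it is not Monte Carlo output.
mu_a = np.linspace(0.01, 1.0, 25)
mu_s = np.linspace(5.0, 50.0, 25)
A, S = np.meshgrid(mu_a, mu_s, indexing="ij")
R = 0.05 - 0.03 * A + 0.004 * S + 0.001 * A * S   # synthetic "lookup table"

# Fit an empirical surface R ~ c0 + c1*mu_a + c2*mu_s' + c3*mu_a*mu_s'
X = np.column_stack([np.ones(A.size), A.ravel(), S.ravel(), (A * S).ravel()])
coef, *_ = np.linalg.lstsq(X, R.ravel(), rcond=None)

def reflectance(a, s):
    """Evaluate the fitted empirical formula at arbitrary optical properties."""
    return coef[0] + coef[1] * a + coef[2] * s + coef[3] * a * s
```

Once fitted, the closed-form surface replaces table interpolation, which is what makes such an empirical formula convenient for inverse problems in tissue optics.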

  18. Empirical Analysis and Refinement of Expert System Knowledge Bases.

    DTIC Science & Technology

    1987-11-30

    Knowledge base refinement is the modification of an existing expert system knowledge base with the goals of localizing specific weaknesses in a… expert system techniques for knowledge acquisition, knowledge base refinement, maintenance, and verification… on the related problems of knowledge base acquisition, maintenance, verification, and learning from experience. The SEEK system was the first expert…

  19. Strong Generative Capacity and the Empirical Base of Linguistic Theory

    PubMed Central

    Ott, Dennis

    2017-01-01

    This Perspective traces the evolution of certain central notions in the theory of Generative Grammar (GG). The founding documents of the field suggested a relation between the grammar, construed as recursively enumerating an infinite set of sentences, and the idealized native speaker that was essentially equivalent to the relation between a formal language (a set of well-formed formulas) and an automaton that recognizes strings as belonging to the language or not. But this early view was later abandoned, when the focus of the field shifted to the grammar's strong generative capacity as recursive generation of hierarchically structured objects as opposed to strings. The grammar is now no longer seen as specifying a set of well-formed expressions and in fact necessarily constructs expressions of any degree of intuitive “acceptability.” The field of GG, however, has not sufficiently acknowledged the significance of this shift in perspective, as evidenced by the fact that (informal and experimentally-controlled) observations about string acceptability continue to be treated as bona fide data and generalizations for the theory of GG. The focus on strong generative capacity, it is argued, requires a new discussion of what constitutes valid empirical evidence for GG beyond observations pertaining to weak generation. PMID:28983268

  20. An Empirical Taxonomy of Youths' Fears: Cluster Analysis of the American Fear Survey Schedule

    ERIC Educational Resources Information Center

    Burnham, Joy J.; Schaefer, Barbara A.; Giesen, Judy

    2006-01-01

    Fears profiles among children and adolescents were explored using the Fear Survey Schedule for Children-American version (FSSC-AM; J.J. Burnham, 1995, 2005). Eight cluster profiles were identified via multistage Euclidean grouping and supported by homogeneity coefficients and replication. Four clusters reflected overall level of fears (i.e., very…

  1. A Survey and Empirical Study of Virtual Reference Service in Academic Libraries

    ERIC Educational Resources Information Center

    Mu, Xiangming; Dimitroff, Alexandra; Jordan, Jeanette; Burclaff, Natalie

    2011-01-01

    Virtual Reference Services (VRS) have high user satisfaction. The main problem is its low usage. We surveyed 100 academic library web sites to understand how VRS are presented. We then conducted a usability study to further test an active VRS model regarding its effectiveness.

  3. On Consumer Self-Direction of Attendant Care Services: An Empirical Analysis of Survey Responses.

    ERIC Educational Resources Information Center

    Asher, Cheryl C.; And Others

    1991-01-01

    The concept of attendant care--provision of personal services to severely disabled individuals--is presented. Data from a survey of about 340 out of 718 consumers of attendant care indicate the existence of a mix of consumer-oriented programs. Consumer preference for a particular program design appeared to be governed by experience. (SLD)

  4. Continuous Assessment in a New Testament Survey Course: Empirically Informed Reflections on an Australian Trial

    ERIC Educational Resources Information Center

    Hussey, Ian

    2017-01-01

    This article reports on a practitioner action research project focused on developing, trialing, and reflecting upon a continuous and formative-assessment plan for a foundational New Testament survey course. Three pedagogical convictions are discussed and drive the design of the assessment. Seven to nine assessment items (depending on level of…

  5. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.
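The translation idea, field data serialized into a uniform XML format that software vendors can map onto their own input schemes, can be sketched with a round-trip through the standard library. The element names below are illustrative only; they are not the actual Home Performance XML (HPXML) schema:

```python
import xml.etree.ElementTree as ET

def home_to_xml(home):
    """Serialize a flat home-performance record into a minimal XML document.
    Element names are hypothetical, not the real HPXML schema."""
    root = ET.Element("Home")
    for key, value in home.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_home(text):
    """Inverse translation: XML back into a flat dict of strings, as a
    vendor-side translator into a simulation tool's input scheme might start."""
    return {child.tag: child.text for child in ET.fromstring(text)}
```

A lossless round-trip like this is the property that lets many homes be modeled expediently: one uniform interchange format, many per-tool translators.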

  6. Cloud Based Processing of Large Photometric Surveys

    NASA Astrophysics Data System (ADS)

    Farivar, R.; Brunner, R. J.; Santucci, R.; Campbell, R.

    2013-10-01

    Astronomy, like many scientific domains, has become a data-rich science. Nowhere is this reflected more clearly than in the growth of large area surveys, such as the recently completed Sloan Digital Sky Survey (SDSS) or the Dark Energy Survey, which will soon obtain petabytes (PB) of imaging data. The data processing on these large surveys is a major challenge. In this paper, we demonstrate a new approach to this common problem. We propose the use of cloud-based technologies (e.g., Hadoop MapReduce) to run a data analysis program (e.g., SExtractor) across a cluster. Using the intermediate key/value pair design of Hadoop, our framework matches objects across different SExtractor invocations to create a unified catalog from all SDSS processed data. We conclude by presenting our experimental results on a 432 core cluster and discuss the lessons we have learned in completing this challenge.
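The key/value matching idea can be illustrated without a Hadoop cluster: a mapper keys each detection by a coarse sky cell, and a reducer merges detections sharing a key into one catalog row. The cell-keying scheme and field names are a simplification assumed for illustration, not the paper's actual matching algorithm:

```python
from collections import defaultdict

def mapper(detection, cell=0.1):
    """Emit a key/value pair keyed by a coarse sky cell (illustrative scheme:
    real cross-matching must also handle objects straddling cell borders)."""
    key = (round(detection["ra"] / cell), round(detection["dec"] / cell))
    return key, detection

def reducer(pairs):
    """Group detections by sky cell and merge each cell into one catalog row,
    averaging positions and counting the contributing observations."""
    cells = defaultdict(list)
    for key, det in pairs:
        cells[key].append(det)
    return {key: {"ra": sum(d["ra"] for d in dets) / len(dets),
                  "dec": sum(d["dec"] for d in dets) / len(dets),
                  "n_obs": len(dets)}
            for key, dets in cells.items()}
```

Hadoop's shuffle phase performs the grouping step for free across a cluster, which is why the intermediate key choice carries most of the design weight in such a pipeline.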

  7. Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study

    ERIC Educational Resources Information Center

    Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos

    2015-01-01

    This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…

  8. Topological phase transition of single-crystal Bi based on empirical tight-binding calculations

    NASA Astrophysics Data System (ADS)

    Ohtsubo, Yoshiyuki; Kimura, Shin-ichi

    2016-12-01

    The topological order of single-crystal Bi and its surface states on the (111) surface are studied in detail based on empirical tight-binding (TB) calculations. New TB parameters are presented that are used to calculate the surface states of semi-infinite single-crystal Bi(111), which agree with the experimental angle-resolved photoelectron spectroscopy results. The influence of the crystal lattice distortion is surveyed and it is revealed that a topological phase transition is driven by in-plane expansion with topologically non-trivial bulk bands. In contrast with the semi-infinite system, the surface-state dispersions on finite-thickness slabs are non-trivial irrespective of the bulk topological order. The role of the interaction between the top and bottom surfaces in the slab is systematically studied, and it is revealed that a very thick slab is required to properly obtain the bulk topological order of Bi from the (111) surface state: above 150 biatomic layers in this case.

  9. Surveying NGO-Military Relations: Empirical Data to Both Confirm and Reject Popular Beliefs

    DTIC Science & Technology

    2011-06-01

    …working with humanitarian actors as perceived by the Military. 3. Summarise the enablers and barriers identified in objectives 1 and 2 in a survey… The findings from the thematic analysis are summarised below for each free-response question. Due to confidentiality issues the raw data… willing to talk to military ("refusniks")…

  10. The unclear status of nonprofit directors: an empirical survey of director liability.

    PubMed

    Siciliano, J; Spiro, G

    1992-01-01

    Nonprofit boards of directors are fiduciaries for the organization. However, there have been various legal interpretations of their duties. The authors review the conflicting standards of conduct that exist, and they report the results of a survey which sampled director opinion concerning the liability issue. The proposition that directors were unaware of their legal responsibilities was supported, and the implications of this finding for organizational procedures and public policy are discussed.

  11. An Empirical Analysis of Knowledge Based Hypertext Navigation

    PubMed Central

    Snell, J.R.; Boyle, C.

    1990-01-01

    Our purpose is to investigate the effectiveness of knowledge-based navigation in a dermatology hypertext network. The chosen domain is a set of dermatology class notes implemented in Hypercard and SINS. The study measured time, number of moves, and success rates for subjects to find solutions to ten questions. The subjects were required to navigate within a dermatology hypertext network in order to find the solutions to a question. Our results indicate that knowledge-based navigation can assist the user in finding information of interest in a fewer number of node visits (moves) than with traditional button-based browsing or keyword searching. The time necessary to find an item of interest was lower for traditional-based methods. There was no difference in success rates for the two test groups.

  12. Obesity, weight status and employability: empirical evidence from a French national survey.

    PubMed

    Paraponaris, Alain; Saliba, Bérengère; Ventelou, Bruno

    2005-07-01

    We investigate the relationship between employability and obesity, particularly how obesity and overweight are associated with the percentage of working years spent unemployed and the ability to regain employment. Data for adults who responded to the 2003 Decennial Health Survey collected by the French National Institute of Statistics and Economic Studies revealed that the percentage of time spent unemployed during working years is significantly higher for each kg/m2 deviation from the mean body mass index (BMI) attained at age 20 and that the probability of regaining employment after a period of unemployment is much lower.

  13. Building Assessment Survey and Evaluation Data (BASE)

    EPA Pesticide Factsheets

    The Building Assessment Survey and Evaluation (BASE) study was a five year study to characterize determinants of indoor air quality and occupant perceptions in representative public and commercial office buildings across the U.S. This data source is the raw data from this study about the indoor air quality.

  14. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha…

  16. Empirically Based School Interventions Targeted at Academic and Mental Health Functioning

    ERIC Educational Resources Information Center

    Hoagwood, Kimberly E.; Olin, S. Serene; Kerker, Bonnie D.; Kratochwill, Thomas R.; Crowe, Maura; Saka, Noa

    2007-01-01

    This review examines empirically based studies of school-based mental health interventions. The review identified 64 out of more than 2,000 articles published between 1990 and 2006 that met methodologically rigorous criteria for inclusion. Of these 64 articles, only 24 examined both mental health "and" educational outcomes. The majority of…

  17. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method fold change criteria are problematic, and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but it is also impervious to fold change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across the Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rates controls between each approach are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offers higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially
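The combination of resampling with empirical Bayes shrinkage can be sketched as follows. The moderated-t form, the shrinkage constants, and the permutation scheme below are schematic simplifications assumed for illustration; they are not the authors' exact Resampling-based empirical Bayes Methods:

```python
import numpy as np

def moderated_t(x, y, s0_sq=1.0, d0=4.0):
    """Empirical-Bayes-style moderated t: per-gene pooled variances are
    shrunk toward a prior value s0_sq with d0 prior degrees of freedom.
    (Shrinkage constants are illustrative, not estimated from the data.)"""
    nx, ny = x.shape[1], y.shape[1]
    d = nx + ny - 2
    sp_sq = ((nx - 1) * x.var(1, ddof=1) + (ny - 1) * y.var(1, ddof=1)) / d
    s_post = (d0 * s0_sq + d * sp_sq) / (d0 + d)
    return (x.mean(1) - y.mean(1)) / np.sqrt(s_post * (1 / nx + 1 / ny))

def resampling_pvalues(x, y, n_perm=500, seed=0):
    """Permutation (resampling) null for the moderated statistic: shuffle
    sample labels, recompute |t|, and count exceedances per gene."""
    rng = np.random.default_rng(seed)
    t_obs = np.abs(moderated_t(x, y))
    data = np.hstack([x, y])
    n = x.shape[1]
    exceed = np.zeros(x.shape[0])
    for _ in range(n_perm):
        perm = rng.permutation(data.shape[1])
        t_null = np.abs(moderated_t(data[:, perm[:n]], data[:, perm[n:]]))
        exceed += t_null >= t_obs
    return (exceed + 1) / (n_perm + 1)
```

Because the null distribution is built by permutation rather than assumed normal, the p-values stay valid for non-normally distributed expression data, which is the robustness property the abstract emphasizes.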

  18. Economic burden of schizophrenia: empirical analyses from a survey in Thailand.

    PubMed

    Phanthunane, Pudtan; Whiteford, Harvey; Vos, Theo; Bertram, Melanie

    2012-03-01

    Evidence consistently indicates that schizophrenia is a costly disease although it is not a high-prevalence disorder. There are a few studies in developing countries but no study in Thailand reporting the cost of schizophrenia from a societal perspective. Health policy makers need to be aware of the cost of health care for people with schizophrenia as well as the economic burden on patients and families. This study aims to provide a detailed breakdown of the costs attributed to schizophrenia including the consumption of public health care resources by people with schizophrenia and the negative consequences on patients and families due to productivity losses. Data from a survey conducted in 2008 among people in treatment for schizophrenia were used to estimate annual medical costs for treatment including outpatient services, hospitalization and patient travel. Indirect costs were estimated for reported productivity losses of patients and families. Uncertainty analysis was performed using Monte Carlo simulation methods. We tested the sensitivity of varying assumptions about market wages to estimate productivity losses. All cost estimates are adjusted to 2008 using the Consumer Price Index and reported in Thai baht (THB). The average annual exchange rate was 33.5 Thai baht to one US dollar in 2008. The annual overall cost of schizophrenia was estimated to be THB 87 000 (USD 2600) (95% CI: 83 000, 92 000) per person or THB 31 000 million (USD 925 million) (95% CI: 26 000, 37 000) for the entire population with schizophrenia in Thailand. Indirect costs due to high unemployment, absenteeism and presenteeism of patients and families accounted for 61% of the total economic burden of schizophrenia. The largest component of direct medical cost was for hospitalizations (50%), followed by outpatient services and drug costs. 
Sensitivity analyses suggest that using labor force survey and socioeconomic status survey provided similar results, while lost productivity when the
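The Monte Carlo uncertainty analysis mentioned above can be sketched as propagating component-level uncertainty into a percentile interval for total cost. The direct/indirect split roughly follows the abstract's 61% indirect share of THB 87 000, but the normal distributions and standard deviations are assumptions for illustration, not the study's inputs:

```python
import numpy as np

rng = np.random.default_rng(42)

def cost_ci(direct_mean, direct_sd, indirect_mean, indirect_sd, n=10_000):
    """Monte Carlo 95% uncertainty interval for total annual cost per person.
    Distributional assumptions (independent normals) are illustrative only."""
    direct = rng.normal(direct_mean, direct_sd, n)      # medical costs (THB)
    indirect = rng.normal(indirect_mean, indirect_sd, n)  # productivity losses
    total = direct + indirect
    lo, hi = np.percentile(total, [2.5, 97.5])
    return total.mean(), lo, hi

# Illustrative figures: ~39% direct, ~61% indirect of THB 87 000 per person
mean, lo, hi = cost_ci(34_000, 2_000, 53_000, 4_000)
```

Drawing each cost component from its own distribution and summing per draw captures how uncertainties combine, which a simple sum of point estimates with separate CIs would not.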

  19. Attitudes towards terminal sedation: an empirical survey among experts in the field of medical ethics

    PubMed Central

    Simon, Alfred; Kar, Magdalene; Hinz, José; Beck, Dietmar

    2007-01-01

    Background "Terminal sedation", understood as the use of sedation in (pre-)terminal patients with treatment-refractory symptoms, is controversially discussed not only within palliative medicine. While supporters consider terminal sedation an indispensable palliative medical treatment option, opponents disapprove of it as "slow euthanasia". Against this background, we interviewed medical ethics experts by questionnaire on the term and the moral acceptance of terminal sedation in order to find out how they think about this topic. We were especially interested in whether experts with a professional medical and nursing background think differently about the topic than experts without this background. Methods The survey was carried out by questionnaire; besides the provided answer options, free-text comments were possible. As test persons we chose the 477 members of the German Academy for Ethics in Medicine, an interdisciplinary society for medical ethics. Results 281 completed questionnaires were returned (response rate = 59%). The majority of persons without medical background regarded "terminal sedation" as an intentional elimination of consciousness until the patient's death occurs; persons with a medical background generally had a broader understanding of the term, including light or intermittent forms of sedation. 98% of the respondents regarded terminal sedation in dying patients with treatment-refractory physical symptoms as acceptable. Situations in which the dying process has not yet started, in which untreatable mental symptoms are the indication for terminal sedation or in which life-sustaining measures are withdrawn during sedation were evaluated as morally difficult. Conclusion The survey reveals a great need for research and discussion on the medical indication as well as on the moral evaluation of terminal sedation. A prerequisite for this is a more precise terminology which describes the circumstances of the sedation. PMID:17437628

  20. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1996-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique.

  1. Toward an Empirically-Based Parametric Explosion Spectral Model

    DTIC Science & Technology

    2011-09-01

    Figure 6. Analysis of the Vp/Vs ratio from the USGS database (Wood, 2007) at Pahute Mesa and Yucca Flat. The ratio as a function of depth… from Leonard and Johnson (1987) and Ferguson (1988) are shown for Pahute Mesa and Yucca Flat, respectively. Based on the distribution, we estimate… constant Vp/Vs ratios of 1.671 and 1.871 at Pahute Mesa and Yucca Flat, respectively. In order to obtain the shear modulus and shear…

  3. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    ERIC Educational Resources Information Center

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  5. Theoretical and empirical dimensions of the Aberdeen Glaucoma Questionnaire: a cross sectional survey and principal component analysis

    PubMed Central

    2013-01-01

    .5% of participants (p < 0.001). Conclusions This paper addresses a methodological gap in the application of classical test theory (CTT) techniques, such as PCA, in instrument development. Labels for empirically-derived factors are often selected intuitively whereas they can inform existing bodies of knowledge if selected on the basis of theoretical construct labels, which are more explicitly defined and which relate to each other in ways that are evidence based. PMID:24268026

  6. An empirical survey of the benefits of implementing pay for safety scheme (PFSS) in the Hong Kong construction industry.

    PubMed

    Chan, Daniel W M; Chan, Albert P C; Choi, Tracy N Y

    2010-10-01

    The Government of the Hong Kong Special Administrative Region (SAR) has implemented different safety initiatives to improve the safety performance of the construction industry over the past decades. The Pay for Safety Scheme (PFSS), one of the effective safety measures launched by the government in 1996, has been widely adopted in public works contracts. Both the accident rate and fatality rate of public sector projects have decreased noticeably over this period. This paper aims to review the current state of application of PFSS in Hong Kong, and attempts to identify and analyze the perceived benefits of PFSS in construction via an industry-wide empirical questionnaire survey. A total of 145 project participants who have gained abundant hands-on experience with PFSS construction projects were requested to complete a survey questionnaire to indicate the relative importance of the benefits identified in relation to PFSS. The perceived benefits were measured and ranked from the perspectives of the client and contractor for cross-comparison. The survey findings suggested the most significant benefits derived from adopting PFSS were: (a) increased safety training; (b) enhanced safety awareness; (c) encouragement of developing a safety management system; and (d) improved safety commitment. A wider application of PFSS should be advocated so as to achieve better safety performance within the construction industry. It is recommended that a similar scheme to the PFSS currently adopted in Hong Kong may be developed for implementation in other regions or countries for international comparisons. Copyright © 2010 Elsevier Ltd and National Safety Council. All rights reserved.

  7. Capability deprivation of people with Alzheimer's disease: An empirical analysis using a national survey.

    PubMed

    Tellez, Juan; Krishnakumar, Jaya; Bungener, Martine; Le Galès, Catherine

    2016-02-01

How can one assess the quality of life of older people, particularly those with Alzheimer's disease, from the point of view of their opportunities to do valued things in life? This paper attempts to answer this question using the capability approach as its theoretical framework. We use data collected on 8841 individuals aged over 60 living in France (the 2008 Disability and Health Household Survey) and propose a latent variable modelling framework to analyse their capabilities in two fundamental dimensions: freedom to perform self-care activities and freedom to participate in the life of the household. Our results show that living as a couple, having children, being mobile and having access to local shops, health facilities and public services enhance both capabilities. Age, household size and male gender (for one of the two capabilities) act as impediments, while the number of impairments reduces both capabilities. We find that people with Alzheimer's disease have a lower level and a smaller range of capabilities (freedom) compared to those without, even when the latter have several impairments. Hence they need special attention in policy-making.

  8. Ethnic density as a buffer for psychotic experiences: findings from a national survey (EMPIRIC).

    PubMed

    Das-Munshi, Jayati; Bécares, Laia; Boydell, Jane E; Dewey, Michael E; Morgan, Craig; Stansfeld, Stephen A; Prince, Martin J

    2012-10-01

    Aetiological mechanisms underlying ethnic density associations with psychosis remain unclear. To assess potential mechanisms underlying the observation that minority ethnic groups experience an increased risk of psychosis when living in neighbourhoods of lower own-group density. Multilevel analysis of nationally representative community-level data (from the Ethnic Minorities Psychiatric Illness Rates in the Community survey), which included the main minority ethnic groups living in England, and a White British group. Structured instruments assessed discrimination, chronic strains and social support. The Psychosis Screening Questionnaire ascertained psychotic experiences. For every ten percentage point reduction in own-group density, the relative odds of reporting psychotic experiences increased 1.07 times (95% CI 1.01-1.14, P = 0.03 (trend)) for the total minority ethnic sample. In general, people living in areas of lower own-group density experienced greater social adversity that was in turn associated with reporting psychotic experiences. People resident in neighbourhoods of higher own-group density experience 'buffering' effects from the social risk factors for psychosis.

  9. An empirical determination of the dust mass absorption coefficient, κd, using the Herschel Reference Survey

    NASA Astrophysics Data System (ADS)

    Clark, Christopher J. R.; Schofield, Simon P.; Gomez, Haley L.; Davies, Jonathan I.

    2016-06-01

We use the published photometry and spectroscopy of 22 galaxies in the Herschel Reference Survey to determine that the value of the dust mass absorption coefficient κd at a wavelength of 500 μm is κ500 = 0.051 (+0.070, −0.026) m² kg⁻¹. We do so by taking advantage of the fact that the dust-to-metals ratio in the interstellar medium of galaxies appears to be constant. We argue that our value for κd supersedes that of James et al., who pioneered this approach for determining κd, because we take advantage of superior data and account for a number of significant systematic effects that they did not consider. We comprehensively incorporate all methodological and observational contributions to establish the uncertainty on our value, which represents a marked improvement on the oft-quoted 'order-of-magnitude' uncertainty on κd. We find no evidence that the value of κd differs significantly between galaxies, or that it correlates with any other measured or derived galaxy properties. We note, however, that the availability of data limits our sample to relatively massive (10^9.7 M⊙ < M⋆ < 10^11.0 M⊙), high-metallicity (8.61 < 12 + log₁₀(O/H) < 8.86) galaxies; future work will allow us to investigate a wider range of systems.
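
The core of the dust-to-metals method is the optically thin modified-blackbody relation S_ν = κ_ν M_d B_ν(T_d) / D², inverted for κ_ν once the dust mass is fixed by the assumed dust-to-metals ratio. The sketch below illustrates only that inversion, with made-up input numbers; it is not the paper's full fitting pipeline.

```python
import math

# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_nu(nu_hz, t_k):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    x = H * nu_hz / (KB * t_k)
    return (2.0 * H * nu_hz**3 / C**2) / math.expm1(x)

def kappa_from_flux(flux_wm2hz, dist_m, dust_mass_kg, t_dust_k,
                    wavelength_m=500e-6):
    """Invert S_nu = kappa_nu * M_d * B_nu(T_d) / D^2 for kappa_nu
    (m^2 kg^-1). In the paper's approach M_d would come from the
    metal mass and an assumed constant dust-to-metals ratio."""
    nu = C / wavelength_m
    return flux_wm2hz * dist_m**2 / (dust_mass_kg * planck_nu(nu, t_dust_k))
```

With illustrative inputs of roughly 1 Jy at 10 Mpc and a ~10^7 M⊙ dust mass at 20 K, this yields a value of order 0.05 m² kg⁻¹, the same order as the paper's quoted κ500.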

  10. Towards an Empirically Based Parametric Explosion Spectral Model

    SciTech Connect

    Ford, S R; Walter, W R; Ruppert, S; Matzel, E; Hauk, T; Gok, R

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  11. Toward an Empirically-based Parametric Explosion Spectral Model

    NASA Astrophysics Data System (ADS)

    Ford, S. R.; Walter, W. R.; Ruppert, S.; Matzel, E.; Hauk, T. F.; Gok, R.

    2010-12-01

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never occurred. We develop a parametric model of the nuclear explosion seismic source spectrum derived from regional phases (Pn, Pg, and Lg) that is compatible with earthquake-based geometrical spreading and attenuation. Earthquake spectra are fit with a generalized version of the Brune spectrum, which is a three-parameter model that describes the long-period level, corner-frequency, and spectral slope at high-frequencies. These parameters are then correlated with near-source geology and containment conditions. There is a correlation of high gas-porosity (low strength) with increased spectral slope. However, there are trade-offs between the slope and corner-frequency, which we try to independently constrain using Mueller-Murphy relations and coda-ratio techniques. The relationship between the parametric equation and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source, and aid in the prediction of observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing.
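
The three-parameter spectral shape described above (long-period level, corner frequency, high-frequency slope) can be sketched as a generalized Brune-type spectrum. This is a generic form consistent with the abstract's description, not the authors' exact parameterization.

```python
import math

def source_spectrum(f, omega0, fc, p):
    """Generalized Brune-type source spectrum: flat at the long-period
    level omega0 for f << fc, rolling off as f^-p for f >> fc.
    The classic Brune (1970) model is the p = 2 special case."""
    return omega0 / (1.0 + (f / fc) ** p)
```

For example, doubling the spectral slope parameter p (as the abstract associates with high gas-porosity media) steepens the high-frequency falloff while leaving the long-period level unchanged, which is exactly the trade-off against corner frequency the authors try to constrain independently.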

  12. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1995-01-01

We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirement documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique. To test the efficacy of PBR, we conducted two runs of a controlled experiment in the environment of the NASA GSFC Software Engineering Laboratory (SEL), using developers from the environment. The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques: PBR and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. When there is a statistically significant distinction, PBR performs better than the subjects' usual technique. However, PBR appears to be more effective on the generic documents than on the NASA documents.

  14. Empirical wind retrieval model based on SAR spectrum measurements

    NASA Astrophysics Data System (ADS)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

ambiguity from polarimetric SAR. A criterion based on the sign of the complex correlation coefficient between the VV and VH signals is applied to select the wind direction. An additional quality control on the wind speed value retrieved with the spectral method is applied: here, we use the direction obtained with the spectral method and the backscattered signal for a CMOD wind speed estimate. The algorithm described above may be refined by the use of numerous SAR data and wind measurements. In the present preliminary work, the first results of processing SAR images combined with in situ data are presented. Our results are compared to those obtained using the previously developed models CMOD and C-2PO for VH polarization and statistical wind retrieval approaches [1]. Acknowledgments. This work is supported by the Russian Foundation for Basic Research (grant 13-05-00852-a). [1] M. Portabella, A. Stoffelen, and J. A. Johannessen, "Toward an optimal inversion method for synthetic aperture radar wind retrieval," Journal of Geophysical Research, vol. 107, no. C8, 2002.

  15. The Dynamics of Value Orientations: An Analysis of the Results of an Empirical Survey.

    ERIC Educational Resources Information Center

    Kosova, L. B.

    1995-01-01

    Defines a value system as a universal, prolonged, consistent structure of priorities that defines an individual's life plan. Reports on a study of individual values among 3,154 Russian adults in 3 major cities. Identifies four generalizations based on the data about social values and social change in Russia. (CFR)

  16. Guidelines for Establishing Coastal Survey Base Lines.

    DTIC Science & Technology

    1981-11-01

1954) and Czerniak (1972b), that contribute to the value of a monument include marking the monument with its station along the base line and its date...foot of the original location and within ±0.05 foot of the original elevation (Czerniak, 1972a), which is normally accurate enough. With the exception of...Memorandum C&GSTM-4, Environmental Science Services Administration, U.S. Coast and Geodetic Survey, Rockville, Md., 1968. CZERNIAK, M.T., "Review of

  17. Use of an Empirically Based Marriage Education Program by Religious Organizations: Results of a Dissemination Trial

    ERIC Educational Resources Information Center

    Markman, Howard J.; Whitton, Sarah W.; Kline, Galena H.; Stanley, Scott M.; Thompson, Huette; St. Peters, Michelle; Leber, Douglas B.; Olmos-Gallo, P. Antonio; Prado, Lydia; Williams, Tamara; Gilbert, Katy; Tonelli, Laurie; Bobulinski, Michelle; Cordova, Allen

    2004-01-01

    We present an evaluation of the extent to which an empirically based couples' intervention program was successfully disseminated in the community. Clergy and lay leaders from 27 religious organizations who were trained to deliver the Prevention and Relationship Enhancement Program (PREP) were contacted approximately yearly for 5 years following…

  18. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…
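
The "empirical" side of the contrast above can be sketched with the standard marginal-reliability formula, which treats the share of observed ability-estimate variance not attributable to measurement error as reliability. This is a generic textbook formula with hypothetical data, not necessarily the study's exact Method 1.

```python
from statistics import pvariance

def empirical_reliability(theta_hats, std_errors):
    """Marginal ("empirical") IRT reliability:
    rel = (var(theta_hat) - mean(SE^2)) / var(theta_hat),
    where the SEs come from the calibrated item parameters.
    The 'expected' variant would replace var(theta_hat) with the
    variance implied by an assumed ability distribution."""
    obs_var = pvariance(theta_hats)
    err_var = sum(se * se for se in std_errors) / len(std_errors)
    return (obs_var - err_var) / obs_var
```

With zero standard errors the estimate is 1.0; larger conditional SEs (e.g., at ability levels a multistage test measures poorly) pull it down.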

  19. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  20. Deriving Empirically-Based Design Guidelines for Advanced Learning Technologies that Foster Disciplinary Comprehension

    ERIC Educational Resources Information Center

    Poitras, Eric; Trevors, Gregory

    2012-01-01

    Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…

  1. Development of an Empirically Based Questionnaire to Investigate Young Students' Ideas about Nature of Science

    ERIC Educational Resources Information Center

    Chen, Sufen; Chang, Wen-Hua; Lieu, Sang-Chong; Kao, Huey-Lien; Huang, Mao-Tsai; Lin, Shu-Fen

    2013-01-01

    This study developed an empirically based questionnaire to monitor young learners' conceptions of nature of science (NOS). The questionnaire, entitled Students' Ideas about Nature of Science (SINOS), measured views on theory-ladenness, use of creativity and imagination, tentativeness of scientific knowledge, durability of scientific knowledge,…

  2. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  3. Task-Based Language Teaching: An Empirical Study of Task Transfer

    ERIC Educational Resources Information Center

    Benson, Susan D.

    2016-01-01

    Since the 1980s, task-based language teaching (TBLT) has enjoyed considerable interest from researchers of second language acquisition (SLA), resulting in a growing body of empirical evidence to support how and to what extent this approach can promote language learning. Although transferability and generalizability are critical assumptions for…

  4. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  5. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  8. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  10. An Empirically-Based Statewide System for Identifying Quality Pre-Kindergarten Programs

    ERIC Educational Resources Information Center

    Williams, Jeffrey M.; Landry, Susan H.; Anthony, Jason L.; Swank, Paul R.; Crawford, April D.

    2012-01-01

    This study presents an empirically-based statewide system that links information about pre-kindergarten programs with children's school readiness scores to certify pre-kindergarten classrooms as promoting school readiness. Over 8,000 children from 1,255 pre-kindergarten classrooms were followed longitudinally for one year. Pre-kindergarten quality…

  11. ECG baseline wander correction based on mean-median filter and empirical mode decomposition.

    PubMed

    Xin, Yi; Chen, Yu; Hao, Wei Tuo

    2014-01-01

A novel approach to ECG baseline wander correction based on a mean-median filter and empirical mode decomposition is presented in this paper. The low-frequency parts of the original signals were removed by the mean-median filter in a nonlinear way to obtain the baseline wander estimate; the series of IMFs was then sifted by t-test after empirical mode decomposition. The proposed method, tested on ECG signals from the MIT-BIH Arrhythmia database and the European ST-T database, is more effective compared with other baseline wander removal methods.
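
The general idea (a nonlinear median pass that is robust to QRS spikes, a mean pass for smoothing, then subtraction of the estimated baseline) can be sketched as follows. This is a minimal mean-median sketch with illustrative window sizes; the paper's method additionally sifts EMD IMFs, which is omitted here.

```python
import statistics

def sliding(values, half, stat):
    """Apply stat over a centered window of width 2*half+1 (edges clamped)."""
    n = len(values)
    return [stat(values[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

def remove_baseline_wander(signal, half=25):
    """Estimate the slow baseline with a median pass (robust to sharp
    QRS complexes) smoothed by a mean pass, then subtract it from the
    signal. Window half-width is an illustrative choice, not tuned."""
    baseline = sliding(signal, half, statistics.median)
    baseline = sliding(baseline, half, statistics.mean)
    return [s - b for s, b in zip(signal, baseline)]
```

On a pure linear drift the interior of the output is flat at zero, since the symmetric median and mean windows both reproduce a linear trend exactly.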

  12. Multi-focus image fusion based on window empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Qin, Xinqiang; Zheng, Jiaoyue; Hu, Gang; Wang, Jiao

    2017-09-01

In order to improve multi-focus image fusion quality, a novel fusion algorithm based on window empirical mode decomposition (WEMD) is proposed. WEMD is an improved form of bidimensional empirical mode decomposition (BEMD): its decomposition process uses an adding-window principle, effectively resolving the signal concealment problem. We used WEMD for multi-focus image fusion and formulated different fusion rules for the bidimensional intrinsic mode function (BIMF) components and the residue component. For fusion of the BIMF components, the concept of the Sum-modified-Laplacian was used and a scheme based on visual feature contrast adopted; when choosing the residue coefficients, a pixel value based on local visibility was selected. We carried out four groups of multi-focus image fusion experiments and compared objective evaluation criteria with three other fusion methods. The experimental results show that the proposed fusion approach is effective and performs better at fusing multi-focus images than some traditional methods.
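
The Sum-modified-Laplacian (SML) focus measure named above is a standard local-sharpness score; a minimal sketch of it, and of the "keep the locally sharper source" decision it supports, follows. Neighbourhood radius and images are illustrative, and this omits the WEMD decomposition itself.

```python
def modified_laplacian(img, x, y):
    """|2*I(x,y) - I(x-1,y) - I(x+1,y)| + |2*I(x,y) - I(x,y-1) - I(x,y+1)|"""
    return (abs(2 * img[y][x] - img[y][x - 1] - img[y][x + 1])
            + abs(2 * img[y][x] - img[y - 1][x] - img[y + 1][x]))

def sml(img, x, y, r=1):
    """Sum-modified-Laplacian over an r-neighbourhood: higher values
    indicate a locally sharper (better focused) region."""
    return sum(modified_laplacian(img, i, j)
               for j in range(y - r, y + r + 1)
               for i in range(x - r, x + r + 1))

def fuse_pixel(img_a, img_b, x, y):
    """Take the pixel from whichever source image is locally sharper."""
    return img_a[y][x] if sml(img_a, x, y) >= sml(img_b, x, y) else img_b[y][x]
```

A defocused (smooth) region scores near zero while a textured in-focus region scores high, so the coefficient-selection rule favours the focused source at each location.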

  13. ['Walkability' and physical activity - results of empirical studies based on the 'Neighbourhood Environment Walkability Scale (NEWS)'].

    PubMed

    Rottmann, M; Mielck, A

    2014-02-01

'Walkability' is mainly assessed with the NEWS questionnaire (Neighbourhood Environment Walkability Scale); in Germany this questionnaire is largely unknown. We try to fill this gap by providing a systematic overview of empirical studies based on the NEWS. A systematic review was conducted of original papers including empirical analyses based on the NEWS. The results are summarised and presented in tables. Altogether, 31 publications could be identified. Most of them focus on associations with the variable 'physical activity', and they often report significant associations with at least some of the scales included in the NEWS. Owing to methodological differences between the studies, it is difficult to compare their results. The concept of 'walkability' should also be established in the German public health discussion. A number of methodological challenges remain to be solved, such as identifying the scales and items in the NEWS that show the strongest associations with individual health behaviours. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.

  15. Self-adaptive image denoising based on bidimensional empirical mode decomposition (BEMD).

    PubMed

    Guo, Song; Luan, Fangjun; Song, Xiaoyu; Li, Changyou

    2014-01-01

To better analyze images with Gaussian white noise, it is necessary to remove the noise before image processing. In this paper, we propose a self-adaptive image denoising method based on bidimensional empirical mode decomposition (BEMD). Firstly, a normal probability plot confirms that the 2D-IMFs of Gaussian white noise images decomposed by BEMD follow the normal distribution. Secondly, an energy estimation equation for the ith 2D-IMF (i = 2, 3, 4, …) is proposed, referencing that of the ith IMF (i = 2, 3, 4, …) obtained by empirical mode decomposition (EMD). Thirdly, the self-adaptive threshold of each 2D-IMF is calculated. Finally, the algorithm of the self-adaptive image denoising method based on BEMD is described. From a practical perspective, the method is applied to denoising magnetic resonance images (MRI) of the brain, and the results show it has better denoising performance compared with other methods.

  16. An Empirical Study for Impacts of Measurement Errors on EHR based Association Studies

    PubMed Central

    Duan, Rui; Cao, Ming; Wu, Yonghui; Huang, Jing; Denny, Joshua C; Xu, Hua; Chen, Yong

    2016-01-01

Over the last decade, Electronic Health Records (EHR) systems have been increasingly implemented at US hospitals. Despite their great potential, the complex and uneven nature of clinical documentation and data quality brings additional challenges for analyzing EHR data. A critical challenge is the information bias due to measurement errors in outcome and covariates. We conducted empirical studies to quantify the impacts of this information bias on association studies. Specifically, we designed our simulation studies based on the characteristics of the Electronic Medical Records and Genomics (eMERGE) Network. Through simulation studies, we quantified the loss of power due to misclassifications in case ascertainment and measurement errors in covariate status extraction, with respect to different levels of misclassification rates, disease prevalence, and covariate frequencies. These empirical findings can help investigators better understand the potential power loss due to misclassification and measurement errors under a variety of conditions in EHR-based association studies. PMID:28269935
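
The mechanism behind the power loss described above can be illustrated analytically: nondifferential misclassification pulls the two groups' observed outcome rates toward each other, shrinking the detectable effect. The sketch below uses a normal-approximation two-proportion z-test with illustrative rates and sample sizes; it is a simplified stand-in for the paper's eMERGE-based simulations.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def misclassify(p, error_rate):
    """Observed prevalence after nondifferential binary misclassification."""
    return p * (1 - error_rate) + (1 - p) * error_rate

def two_prop_power(p1, p2, n_per_group):
    """Approximate power of a two-sided two-proportion z-test at alpha=0.05."""
    se = math.sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    z_crit = 1.959963984540054  # Phi^-1(0.975), i.e. alpha = 0.05 two-sided
    z = abs(p1 - p2) / se
    return norm_cdf(z - z_crit) + norm_cdf(-z - z_crit)

# Illustrative: true 30% vs 20% outcome rates, 300 per group,
# then the same comparison after 20% outcome misclassification.
clean = two_prop_power(0.30, 0.20, 300)
noisy = two_prop_power(misclassify(0.30, 0.2), misclassify(0.20, 0.2), 300)
```

With these made-up numbers the misclassified comparison loses roughly half its power, which is the qualitative pattern the study quantifies across error rates, prevalences, and covariate frequencies.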

  17. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection.

    PubMed

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R; Sokurenko, Evgeni V

    2017-01-01

Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using 20%, 10%, and 30% allowed-resistance thresholds. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients' urine within 25-35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care.
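
The allowed-resistance-threshold logic described above reduces to a simple lookup and filter once a clonotype-specific antibiogram exists. The sketch below uses entirely made-up clonotypes, drugs, and resistance rates to show that decision rule; it is not the study's database or assay.

```python
# Hypothetical clonotype-specific antibiograms: the fraction of isolates
# of each E. coli clonotype resistant to each drug (made-up numbers).
CLONOTYPE_ANTIBIOGRAM = {
    "ST131": {"ciprofloxacin": 0.85, "trimethoprim/sulfamethoxazole": 0.45,
              "nitrofurantoin": 0.05},
    "ST73":  {"ciprofloxacin": 0.04, "trimethoprim/sulfamethoxazole": 0.12,
              "nitrofurantoin": 0.02},
}

def acceptable_empiric_agents(clonotype, max_resistance=0.20):
    """Return drugs whose clonotype-specific resistance rate falls below
    the allowed-resistance threshold (the paper evaluates 10/20/30%)."""
    rates = CLONOTYPE_ANTIBIOGRAM[clonotype]
    return sorted(drug for drug, rate in rates.items() if rate < max_resistance)
```

Under the 20% threshold, a fluoroquinolone-resistant clonotype like the hypothetical "ST131" entry would steer empiric therapy away from ciprofloxacin, while a susceptible clonotype leaves all options open, which is how clonal diagnostics can narrow prescribing without waiting for culture.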

  19. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

    To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multisentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven-year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multisentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  20. An empirically based model for knowledge management in health care organizations.

    PubMed

    Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita

    2016-01-01

    Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of

  1. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education.

  2. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  3. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    NASA Astrophysics Data System (ADS)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Fluorescence of protein has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is attenuated by absorbers and scatterers in tissue, which may lead to errors in estimating the exact protein content of tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods; among them, Monte Carlo based methods yield the highest accuracy. In this work, we have attempted to generate a lookup table from Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue-phantom experiments. The proposed relation can be used for estimating the intrinsic fluorescence of protein in real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.
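The fitting step described above can be sketched generically: tabulate (measured, intrinsic) pairs from a simulation, then fit a parametric empirical relation to the table. The two-parameter saturation form and all values below are assumptions for illustration; the paper's actual functional form and lookup table are not reproduced here:

```python
import numpy as np
from scipy.optimize import curve_fit

def relation(f_measured, a, b):
    """Assumed two-parameter empirical relation mapping measured to
    intrinsic fluorescence (illustrative form, not the paper's)."""
    return a * f_measured / (b + f_measured)

# Synthetic "lookup table" of (measured, intrinsic) pairs, standing in for
# Monte Carlo output; generated here from known parameters a=2.0, b=0.5.
f_meas = np.linspace(0.05, 1.0, 40)
f_intr = relation(f_meas, 2.0, 0.5)

# Fit the empirical relation to the tabulated pairs.
params, _ = curve_fit(relation, f_meas, f_intr, p0=[1.0, 1.0])
print(params)
```

Once fitted, evaluating `relation` is a cheap closed-form call, which is what makes the empirical-relation approach attractive for real-time use compared with running the Monte Carlo model itself.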

  4. Generalised Linear Models Incorporating Population Level Information: An Empirical Likelihood Based Approach

    PubMed Central

    Chaudhuri, Sanjay; Handcock, Mark S.; Rendall, Michael S.

    2011-01-01

    In many situations information from a sample of individuals can be supplemented by population level information on the relationship between a dependent variable and explanatory variables. Inclusion of the population level information can reduce bias and increase the efficiency of the parameter estimates. Population level information can be incorporated via constraints on functions of the model parameters. In general the constraints are nonlinear making the task of maximum likelihood estimation harder. In this paper we develop an alternative approach exploiting the notion of an empirical likelihood. It is shown that within the framework of generalised linear models, the population level information corresponds to linear constraints, which are comparatively easy to handle. We provide a two-step algorithm that produces parameter estimates using only unconstrained estimation. We also provide computable expressions for the standard errors. We give an application to demographic hazard modelling by combining panel survey data with birth registration data to estimate annual birth probabilities by parity. PMID:22740776
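A standard empirical-likelihood formulation of the setting described above (notation mine, not necessarily the paper's): weights $p_i$ on the $n$ sample points maximize the empirical log-likelihood subject to the population-level moment conditions,

```latex
\max_{\theta,\; p_1,\dots,p_n} \sum_{i=1}^{n} \log p_i
\quad \text{s.t.} \quad
p_i \ge 0, \qquad
\sum_{i=1}^{n} p_i = 1, \qquad
\sum_{i=1}^{n} p_i \, g(y_i, x_i; \theta) = 0,
```

where $g$ stacks the model's estimating equations together with the known population-level quantities. The paper's observation is that within a generalised linear model the population-level information enters as constraints linear in the $p_i$, which is what makes a two-step algorithm using only unconstrained estimation feasible.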

  5. Band structure calculation of GaSe-based nanostructures using empirical pseudopotential method

    NASA Astrophysics Data System (ADS)

    Osadchy, A. V.; Volotovskiy, S. G.; Obraztsova, E. D.; Savin, V. V.; Golovashkin, D. L.

    2016-08-01

    In this paper we present the results of band structure computer simulation of GaSe-based nanostructures using the empirical pseudopotential method. Calculations were performed using specially developed software that supports cluster computing. This method significantly reduces the demands on computing resources compared with traditional approaches based on ab initio techniques, while producing adequately comparable results. The use of cluster computing makes it possible to obtain results for structures that require explicitly accounting for a significant number of atoms, such as quantum dots and quantum pillars.

  6. An Empirical Study of Plan-Based Representations of Pascal and Fortran Code.

    DTIC Science & Technology

    1987-06-01

    Computing Laboratory Report No. CCL-0687-0: An Empirical Study of Plan-Based Representations of Pascal and Fortran Code, Scott P. Robertson and Chiung-Chen Yu. Office of Naval Research Contract No. N00014-86-K-0876, Work Unit No. NR 4424203-01. Approved for public release; distribution unlimited. Researchers have argued recently that programmers utilize a plan-based representation when composing or comprehending program code. In a series of studies we…

  7. An empirical mass-loss law for Population II giants from the Spitzer-IRAC survey of Galactic globular clusters

    NASA Astrophysics Data System (ADS)

    Origlia, L.; Ferraro, F. R.; Fabbri, S.; Fusi Pecci, F.; Dalessandro, E.; Rich, R. M.; Valenti, E.

    2014-04-01

    Aims: The main aim of the present work is to derive an empirical mass-loss (ML) law for Population II stars in first and second ascent red giant branches. Methods: We used the Spitzer InfraRed Array Camera (IRAC) photometry obtained in the 3.6-8 μm range of a carefully chosen sample of 15 Galactic globular clusters spanning the entire metallicity range and sampling the vast zoology of horizontal branch (HB) morphologies. We complemented the IRAC photometry with near-infrared data to build suitable color-magnitude and color-color diagrams and identify mass-losing giant stars. Results: We find that while the majority of stars show colors typical of cool giants, some stars show an excess of mid-infrared light that is larger than expected from their photospheric emission and that is plausibly due to dust formation in mass flowing from them. For these stars, we estimate dust and total (gas + dust) ML rates and timescales. We finally calibrate an empirical ML law for Population II red and asymptotic giant branch stars with varying metallicity. We find that at a given red giant branch luminosity only a fraction of the stars are losing mass. From this, we conclude that ML is episodic and is active only a fraction of the time, which we define as the duty cycle. The fraction of mass-losing stars increases by increasing the stellar luminosity and metallicity. The ML rate, as estimated from reasonable assumptions for the gas-to-dust ratio and expansion velocity, depends on metallicity and slowly increases with decreasing metallicity. In contrast, the duty cycle increases with increasing metallicity, with the net result that total ML increases moderately with increasing metallicity, about 0.1 M⊙ every dex in [Fe/H]. For Population II asymptotic giant branch stars, we estimate a total ML of ≤0.1 M⊙, nearly constant with varying metallicity. This work is based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory

  8. Development of Empirically Based Time-to-death Curves for Combat Casualty Deaths in Iraq and Afghanistan

    DTIC Science & Technology

    2015-01-01

    Naval Health Research Center. Development of Empirically Based Time-to-death Curves for Combat Casualty Deaths in Iraq and Afghanistan, Edwin… DOI: 10.1177/1548512914531353 (dms.sagepub.com). …casualties with life-threatening injuries. The curves developed from that research were based on a small dataset (n = 160, with 26 deaths and 134…

  9. Fault Diagnosis of Rotating Machinery Based on an Adaptive Ensemble Empirical Mode Decomposition

    PubMed Central

    Lei, Yaguo; Li, Naipeng; Lin, Jing; Wang, Sizhe

    2013-01-01

    The vibration based signal processing technique is one of the principal tools for diagnosing faults of rotating machinery. Empirical mode decomposition (EMD), as a time-frequency analysis technique, has been widely used to process vibration signals of rotating machinery. But it has the shortcoming of mode mixing in decomposing signals. To overcome this shortcoming, ensemble empirical mode decomposition (EEMD) was proposed accordingly. EEMD is able to reduce the mode mixing to some extent. The performance of EEMD, however, depends on the parameters adopted in the EEMD algorithms. In most of the studies on EEMD, the parameters were selected artificially and subjectively. To solve the problem, a new adaptive ensemble empirical mode decomposition method is proposed in this paper. In the method, the sifting number is adaptively selected, and the amplitude of the added noise changes with the signal frequency components during the decomposition process. The simulation, the experimental and the application results demonstrate that the adaptive EEMD provides the improved results compared with the original EEMD in diagnosing rotating machinery. PMID:24351666

  10. Fault diagnosis of rotating machinery based on an adaptive ensemble empirical mode decomposition.

    PubMed

    Lei, Yaguo; Li, Naipeng; Lin, Jing; Wang, Sizhe

    2013-12-09

    The vibration based signal processing technique is one of the principal tools for diagnosing faults of rotating machinery. Empirical mode decomposition (EMD), as a time-frequency analysis technique, has been widely used to process vibration signals of rotating machinery. But it has the shortcoming of mode mixing in decomposing signals. To overcome this shortcoming, ensemble empirical mode decomposition (EEMD) was proposed accordingly. EEMD is able to reduce the mode mixing to some extent. The performance of EEMD, however, depends on the parameters adopted in the EEMD algorithms. In most of the studies on EEMD, the parameters were selected artificially and subjectively. To solve the problem, a new adaptive ensemble empirical mode decomposition method is proposed in this paper. In the method, the sifting number is adaptively selected, and the amplitude of the added noise changes with the signal frequency components during the decomposition process. The simulation, the experimental and the application results demonstrate that the adaptive EEMD provides the improved results compared with the original EEMD in diagnosing rotating machinery.
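The ensemble idea behind EEMD can be sketched in a few lines: add a different white-noise realization to the signal on each trial, extract the first intrinsic mode function (IMF) by sifting, and average across trials so the noise cancels. This is a minimal numpy/scipy sketch with a fixed sifting number; the paper's contribution, adaptively selecting the sifting number and varying the noise amplitude with the signal's frequency content, is not reproduced here:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(x, t):
    """One sifting pass: subtract the mean of the upper and lower
    cubic-spline envelopes through the local extrema."""
    max_idx = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    min_idx = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    if len(max_idx) < 4 or len(min_idx) < 4:
        return x  # too few extrema to build envelopes
    upper = CubicSpline(t[max_idx], x[max_idx])(t)
    lower = CubicSpline(t[min_idx], x[min_idx])(t)
    return x - (upper + lower) / 2.0

def first_imf(x, t, sift_num=10):
    """Extract the first IMF with a fixed sifting number (the adaptive
    selection proposed in the paper is omitted for brevity)."""
    h = x.copy()
    for _ in range(sift_num):
        h = sift_once(h, t)
    return h

def eemd_first_imf(x, t, n_ensemble=50, noise_std=0.2, seed=0):
    """Average the first IMF over noise-perturbed copies of the signal:
    the core noise-assisted mechanism that reduces mode mixing."""
    rng = np.random.default_rng(seed)
    acc = np.zeros_like(x)
    for _ in range(n_ensemble):
        noisy = x + noise_std * x.std() * rng.standard_normal(x.size)
        acc += first_imf(noisy, t)
    return acc / n_ensemble
```

On a two-tone test signal, the averaged first IMF tracks the higher-frequency component, which is the separation property fault-diagnosis applications rely on.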

  11. A modified cadastral survey system based on GPS/PDA

    NASA Astrophysics Data System (ADS)

    Wang, Huiqing; Wang, Qing; Wu, Xiangyang

    2009-12-01

    Because traditional cadastral survey methods suffer from complex working procedures, long field-survey times, and low efficiency, a modified system based on GPS (Global Positioning System)/PDA (Personal Digital Assistant) combined with TS (Total Station) is proposed. The system emphasizes a TS free-station design for detail surveys where GPS is unavailable, so that control surveys and detail surveys can be processed simultaneously. The system also applies a PDA-based digital drafting method instead of cartographic sketching, to realize a fully digital cadastral survey. The application in Beijing shows that the modified cadastral survey system based on GPS/PDA achieves high efficiency, and its accuracy can meet the requirement of 1:500 large-scale cadastral surveys.

  12. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

    Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of the uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Secondly, we developed a physics-based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during the decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300 K to 1000 K while propagating parametric uncertainty, and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics-based models and found that the uncertainty in the kinetics predicted by the physics-based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum-level model. We found that the species change within the bed occurring during the decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
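Fitting a rate law to scattered literature data, as described above, is commonly done with an Arrhenius model. This is a generic sketch on synthetic data over the same 300–1000 K range; the parameter values are assumed for illustration and are not the UH3 kinetics from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J/(mol K)

def log_arrhenius(T, lnA, Ea):
    """Arrhenius rate law in log form: ln k = ln A - Ea / (R T).
    Fitting in log space keeps the residuals well scaled when k spans
    many orders of magnitude."""
    return lnA - Ea / (R * T)

# Synthetic rate data generated from assumed parameters
# A = 1e6 1/s, Ea = 80 kJ/mol (illustrative, not measured UH3 values).
T = np.linspace(300.0, 1000.0, 30)
k_obs = 1.0e6 * np.exp(-8.0e4 / (R * T))

(lnA_fit, Ea_fit), _ = curve_fit(log_arrhenius, T, np.log(k_obs), p0=[10.0, 5.0e4])
print(np.exp(lnA_fit), Ea_fit)
```

Because the log form is linear in both parameters, the fit recovers the generating values essentially exactly on noiseless data; with scattered literature data, the covariance returned by `curve_fit` gives a first estimate of the parametric uncertainty the abstract discusses.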

  13. Knowledge-based control and case-based diagnosis based upon empirical knowledge and fuzzy logic for the SBR plant.

    PubMed

    Bae, H; Seo, H Y; Kim, S; Kim, Y

    2006-01-01

    Because biological wastewater treatment plants (WWTPs) involve long time-delays and various disturbances, skilled operators generally control the plant manually based on empirical knowledge, and usually diagnose the plant using similar cases experienced in the past. For effective management of the plant, system automation has to be accomplished based upon operating recipes. This paper introduces automatic control and diagnosis based upon the operator's knowledge. Fuzzy logic was employed to design this knowledge-based controller because fuzzy logic can convert linguistic information to rules. The controller can manage the influent and external carbon while considering the loading rate. The input of the controller is not the loading rate itself but the dissolved oxygen (DO) lag-time, which is strongly related to the loading rate. This approach can replace an expensive sensor that measures the loading rate and ammonia concentration in the reactor with a cheaper DO sensor. The proposed controller can assure optimal operation and prevent the over-feeding problem. Case-based diagnosis was achieved by analyzing profile patterns collected in the past. A new test profile was diagnosed by comparing it with template patterns containing normal and abnormal cases. The proposed control and diagnostic system will guarantee the effective and stable operation of WWTPs.
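A fuzzy rule base of the kind described above can be sketched with triangular membership functions and Sugeno-style singleton outputs. Everything here is hypothetical: the membership breakpoints, the dose values, and the assumed direction (longer DO lag-time implying a lower external-carbon dose) are illustrative, not the paper's calibrated rules:

```python
def trimf(x, a, b, c):
    """Triangular membership function with feet a and c and peak b (a < b < c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def carbon_dose(do_lag_min):
    """Map DO lag-time (minutes, hypothetical universe 0-60) to an
    external-carbon dose via three rules, defuzzified by weighted average."""
    short = trimf(do_lag_min, -1.0, 0.0, 30.0)    # rule 1: short lag
    medium = trimf(do_lag_min, 10.0, 30.0, 50.0)  # rule 2: medium lag
    long_ = trimf(do_lag_min, 30.0, 60.0, 61.0)   # rule 3: long lag
    weights = (short, medium, long_)
    doses = (5.0, 2.5, 0.5)  # assumed singleton outputs (arbitrary units)
    total = sum(weights)
    return 0.0 if total == 0.0 else sum(w * d for w, d in zip(weights, doses)) / total
```

The weighted-average defuzzification makes the output vary smoothly between rule outputs, which is the practical reason fuzzy controllers tolerate the imprecise, linguistic knowledge operators actually have.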

  14. Increasing Response Rates to Web-Based Surveys

    ERIC Educational Resources Information Center

    Monroe, Martha C.; Adams, Damian C.

    2012-01-01

    We review a popular method for collecting data--Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well,…

  15. Increasing Your Productivity with Web-Based Surveys

    ERIC Educational Resources Information Center

    Wissmann, Mary; Stone, Brittney; Schuster, Ellen

    2012-01-01

    Web-based survey tools such as Survey Monkey can be used in many ways to increase the efficiency and effectiveness of Extension professionals. This article describes how Survey Monkey has been used at the state and county levels to collect community and internal staff information for the purposes of program planning, administration, evaluation and…

  18. Ab initio based empirical potential applied to tungsten at high pressure

    NASA Astrophysics Data System (ADS)

    Ehemann, Robert C.; Nicklas, Jeremy W.; Park, Hyoungki; Wilkins, John W.

    2017-05-01

    Density-functional theory forces, stresses, and energies comprise a database from which the optimal parameters of a spline-based empirical potential combining Stillinger-Weber and modified embedded-atom forms are determined. The accuracy of the potential is demonstrated by calculations of ideal shear, stacking fault, vacancy migration, elastic constants, and phonons all between 0 and 100 GPa. Consistency with existing models and experiments is demonstrated by predictions of screw dislocation core structure and deformation twinning in a tungsten nanorod. Last, the potential is used to study the stabilization of fcc tungsten at high pressure.

  19. Evaluating Process Quality Based on Change Request Data - An Empirical Study of the Eclipse Project

    NASA Astrophysics Data System (ADS)

    Schackmann, Holger; Schaefer, Henning; Lichter, Horst

    The data routinely collected in change request management systems contain valuable information for monitoring process quality. However, these data are currently utilized in a very limited way. This paper presents an empirical study of the process quality in the product portfolio of the Eclipse project. It is based on a systematic approach for evaluating process quality characteristics using change request data. Results of the study offer insights into the development process of Eclipse. Moreover, the study allows assessing the applicability and limitations of the proposed approach for the evaluation of process quality.

  20. An ISAR imaging algorithm for the space satellite based on empirical mode decomposition theory

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Dong, Chun-zhu

    2014-11-01

    Currently, high resolution imaging of space satellites is a popular topic in the field of radar technology. In contrast with regular targets, a satellite target moves along its trajectory while its solar panel substrate simultaneously changes direction toward the sun to obtain energy. To address this imaging problem, a signal-separation and imaging approach based on empirical mode decomposition (EMD) theory is proposed. The approach separates the signals of the two parts of the satellite target, the main body and the solar panel substrate, and images the target. Simulation experiments demonstrate the validity of the proposed method.

  1. Systematic Review of Empirically Evaluated School-Based Gambling Education Programs.

    PubMed

    Keen, Brittany; Blaszczynski, Alex; Anjoul, Fadi

    2017-03-01

    Adolescent problem gambling prevalence rates are reportedly five times higher than in the adult population. Several school-based gambling education programs have been developed in an attempt to reduce problem gambling among adolescents; however few have been empirically evaluated. The aim of this review was to report the outcome of studies empirically evaluating gambling education programs across international jurisdictions. A systematic review following guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement searching five academic databases: PubMed, Scopus, Medline, PsycINFO, and ERIC, was conducted. A total of 20 papers and 19 studies were included after screening and exclusion criteria were applied. All studies reported intervention effects on cognitive outcomes such as knowledge, perceptions, and beliefs. Only nine of the studies attempted to measure intervention effects on behavioural outcomes, and only five of those reported significant changes in gambling behaviour. Of these five, methodological inadequacies were commonly found including brief follow-up periods, lack of control comparison in post hoc analyses, and inconsistencies and misclassifications in the measurement of gambling behaviour, including problem gambling. Based on this review, recommendations are offered for the future development and evaluation of school-based gambling education programs relating to both methodological and content design and delivery considerations.

  2. Inequality of Higher Education in China: An Empirical Test Based on the Perspective of Relative Deprivation

    ERIC Educational Resources Information Center

    Hou, Liming

    2014-01-01

    The primary goal of this paper is to examine what makes Chinese college students dissatisfied with entrance opportunities for higher education. Based on the author's survey data, we test two parameters which could be a potential cause of this dissatisfaction: 1) distributive inequality, which emphasizes the individual's dissatisfaction caused by…

  3. Family Capital Social Stratification, and Higher Education Attainment--An Empirical Study Based on Jiangsu Province

    ERIC Educational Resources Information Center

    Zhimin, Liu; Yao, Gao

    2015-01-01

    Based on survey data from 14 representative universities in Jiangsu Province, logistic regression is used to analyze the impact of family capital on the quantity and quality of individuals obtaining higher education. The study found that on the whole, family capital has a clearly positive promoting role in the quantity and quality of higher…

  4. Applying species-tree analyses to deep phylogenetic histories: challenges and potential suggested from a survey of empirical phylogenetic studies.

    PubMed

    Lanier, Hayley C; Knowles, L Lacey

    2015-02-01

    Coalescent-based methods for species-tree estimation are becoming a dominant approach for reconstructing species histories from multi-locus data, with most of the studies examining these methodologies focused on recently diverged species. However, deeper phylogenies, such as the datasets that comprise many Tree of Life (ToL) studies, also exhibit gene-tree discordance. This discord may also arise from the stochastic sorting of gene lineages during the speciation process (i.e., reflecting the random coalescence of gene lineages in ancestral populations). It remains unknown whether guidelines regarding methodologies and numbers of loci established by simulation studies at shallow tree depths translate into accurate species relationships for deeper phylogenetic histories. We address this knowledge gap and specifically identify the challenges and limitations of species-tree methods that account for coalescent variance for deeper phylogenies. Using simulated data with characteristics informed by empirical studies, we evaluate both the accuracy of estimated species trees and the characteristics associated with recalcitrant nodes, with a specific focus on whether coalescent variance is generally responsible for the lack of resolution. By determining the proportion of coalescent genealogies that support a particular node, we demonstrate that (1) species-tree methods account for coalescent variance at deep nodes and (2) mutational variance - not gene-tree discord arising from the coalescent - posed the primary challenge for accurate reconstruction across the tree. For example, many nodes were accurately resolved despite predicted discord from the random coalescence of gene lineages, and nodes with poor support were distributed across a range of depths (i.e., they were not restricted to recent divergences). Given their broad taxonomic scope and large sampling of taxa, deep-level phylogenies pose several potential methodological complications including

  5. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, seislet thresholding can be viewed as a simple structural filtering approach. Because of its dependence on precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive: one only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  6. Comparing results from two continental geochemical surveys to world soil composition and deriving Predicted Empirical Global Soil (PEGS2) reference values

    NASA Astrophysics Data System (ADS)

    de Caritat, Patrice; Reimann, Clemens; Bastrakov, E.; Bowbridge, D.; Boyle, P.; Briggs, S.; Brown, D.; Brown, M.; Brownlie, K.; Burrows, P.; Burton, G.; Byass, J.; de Caritat, P.; Chanthapanya, N.; Cooper, M.; Cranfield, L.; Curtis, S.; Denaro, T.; Dhnaram, C.; Dhu, T.; Diprose, G.; Fabris, A.; Fairclough, M.; Fanning, S.; Fidler, R.; Fitzell, M.; Flitcroft, P.; Fricke, C.; Fulton, D.; Furlonger, J.; Gordon, G.; Green, A.; Green, G.; Greenfield, J.; Harley, J.; Heawood, S.; Hegvold, T.; Henderson, K.; House, E.; Husain, Z.; Krsteska, B.; Lam, J.; Langford, R.; Lavigne, T.; Linehan, B.; Livingstone, M.; Lukss, A.; Maier, R.; Makuei, A.; McCabe, L.; McDonald, P.; McIlroy, D.; McIntyre, D.; Morris, P.; O'Connell, G.; Pappas, B.; Parsons, J.; Petrick, C.; Poignand, B.; Roberts, R.; Ryle, J.; Seymon, A.; Sherry, K.; Skinner, J.; Smith, M.; Strickland, C.; Sutton, S.; Swindell, R.; Tait, H.; Tang, J.; Thomson, A.; Thun, C.; Uppill, B.; Wall, K.; Watkins, J.; Watson, T.; Webber, L.; Whiting, A.; Wilford, J.; Wilson, T.; Wygralak, A.; Albanese, S.; Andersson, M.; Arnoldussen, A.; Baritz, R.; Batista, M. J.; Bel-lan, A.; Birke, M.; Cicchella, C.; Demetriades, A.; Dinelli, E.; De Vivo, B.; De Vos, W.; Duris, M.; Dusza-Dobek, A.; Eggen, O. A.; Eklund, M.; Ernstsen, V.; Filzmoser, P.; Finne, T. E.; Flight, D.; Forrester, S.; Fuchs, M.; Fugedi, U.; Gilucis, A.; Gosar, M.; Gregorauskiene, V.; Gulan, A.; Halamić, J.; Haslinger, E.; Hayoz, P.; Hobiger, G.; Hoffmann, R.; Hoogewerff, J.; Hrvatovic, H.; Husnjak, S.; Janik, L.; Johnson, C. C.; Jordan, G.; Kirby, J.; Kivisilla, J.; Klos, V.; Krone, F.; Kwecko, P.; Kuti, L.; Ladenberger, A.; Lima, A.; Locutura, J.; Lucivjansky, P.; Mackovych, D.; Malyuk, B. I.; Maquil, R.; McLaughlin, M.; Meuli, R. G.; Miosic, N.; Mol, G.; Négrel, P.; O'Connor, P.; Oorts, K.; Ottesen, R. T.; Pasieczna, A.; Petersell, V.; Pfleiderer, S.; Poňavič, M.; Prazeres, C.; Rauch, U.; Reimann, C.; Salpeteur, I.; Schedl, A.; Scheib, A.; Schoeters, I.; Sefcik, P.; Sellersjö, E.; Skopljak, F.; Slaninka, I.; Šorša, A.; Srvkota, R.; Stafilov, T.; Tarvainen, T.; Trendavilov, V.; Valera, P.; Verougstraete, V.; Vidojević, D.; Zissimos, A. M.; Zomeni, Z.

    2012-02-01

    Analytical data for 10 major oxides (Al2O3, CaO, Fe2O3, K2O, MgO, MnO, Na2O, P2O5, SiO2 and TiO2), 16 total trace elements (As, Ba, Ce, Co, Cr, Ga, Nb, Ni, Pb, Rb, Sr, Th, V, Y, Zn and Zr), 14 aqua regia extracted elements (Ag, As, Bi, Cd, Ce, Co, Cs, Cu, Fe, La, Li, Mn, Mo and Pb), Loss On Ignition (LOI) and pH from 3526 soil samples from two continents (Australia and Europe) are presented and compared to (1) the composition of the upper continental crust, (2) published world soil average values, and (3) data from other continental-scale soil surveys. It can be demonstrated that average upper continental crust values do not provide reliable estimates for natural concentrations of elements in soils. For many elements there exist substantial differences between published world soil averages and the median concentrations observed on two continents. Direct comparison with other continental datasets is hampered by the fact that often the mean, rather than the statistically more robust median, is reported. Using a database of the worldwide distribution of lithological units, it can be demonstrated that lithology is a poor predictor of soil chemistry. Climate-related processes such as glaciation and weathering are strong modifiers of the geochemical signature inherited from bedrock during pedogenesis. To overcome existing shortcomings of predicted global or world soil geochemical reference values, we propose Preliminary Empirical Global Soil reference values based on analytical results of a representative number of soil samples from two continents (PEGS2).

  7. Semi-empirical versus process-based sea-level projections for the twenty-first century

    NASA Astrophysics Data System (ADS)

    Orlić, Mirko; Pasarić, Zoran

    2013-08-01

    Two dynamical methods are presently used to project sea-level changes during the next century. The process-based method relies on coupled atmosphere-ocean models to estimate the effects of thermal expansion and on sea-level models combined with certain empirical relationships to determine the influence of land-ice mass changes. The semi-empirical method uses various physically motivated relationships between temperature and sea level, with parameters determined from the data, to project total sea level. However, semi-empirical projections far exceed process-based projections. Here, we test the robustness of semi-empirical projections to the underlying assumptions about the inertial and equilibrium responses of sea level to temperature forcing and the impacts of groundwater depletion and dam retention during the twentieth century. Our results show that these projections are sensitive to the dynamics considered and the terrestrial-water corrections applied. For B1, which is a moderate climate-change scenario, the lowest semi-empirical projection of sea-level rise over the twenty-first century equals 62 ± 14 cm. The average value is substantially smaller than previously published semi-empirical projections and is therefore closer to the corresponding process-based values. The standard deviation is larger than the uncertainties of process-based estimates.
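The semi-empirical approach described in this record can be sketched in a few lines. The toy model below integrates the widely used Rahmstorf-type relationship dH/dt = a(T − T0) under a hypothetical warming path; the sensitivity a, base temperature T0, and the temperature scenario are all illustrative, not values from the paper:

```python
import numpy as np

# Toy semi-empirical projection: dH/dt = a * (T(t) - T0).
a = 3.4    # mm per year per kelvin (illustrative sensitivity)
T0 = -0.5  # K (illustrative equilibrium temperature anomaly)

years = np.arange(2000, 2101)
T = 0.02 * (years - 2000)          # hypothetical warming: +2 K by 2100
dHdt = a * (T - T0)                # sea-level rate, mm/yr
H = np.cumsum(dHdt)                # forward-Euler integration, 1-yr steps, mm
```

With these numbers the projected 21st-century rise is about 52 cm; varying a and T0 shifts the result substantially, which is precisely the kind of sensitivity to underlying assumptions the paper examines.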

  8. Sleep disturbances in young and middle-aged adults - Empirical patterns and related factors from an epidemiological survey.

    PubMed

    Rössler, Wulf; Ajdacic-Gross, Vladeta; Glozier, Nick; Rodgers, Stephanie; Haker, Helene; Müller, Mario

    2017-07-21

    Previous research suggests that sleep disorders are strongly associated with other mental health problems. However, sleep problems even below the diagnostic threshold of a sleep disorder are very common in the general population and substantially affect wellbeing and functioning. To broaden the focus beyond severe cases, we explored empirical patterns across the whole spectrum of sleep problems as well as associated clinical and other factors. A representative community sample of N=1274 residents from the canton of Zurich was interviewed for sleep problems and diagnostic criteria for mental disorders, and was given a number of mental health-related psychometric checklists. Based on this broader spectrum of sleep problems we conducted a latent class analysis (LCA) to derive distinct classes of such disturbances. Classes were compared regarding their associations with mental health-relevant and other risk factors. The LCA revealed four classes - no sleep disturbances (72.6%), difficulties initiating and maintaining sleep (15.8%), delayed sleep (5.3%), and severe sleep problems (6.4%). Severe sleep problems were related to female gender and generalized anxiety disorder, while depression was linked to all sleep problem classes. Persons with difficulties initiating and maintaining sleep and with severe sleep problems reported higher levels of psychopathology, burnout and neuroticism, while all sleep problem types were tied to stress-related variables, but not to alcohol use disorder. Sleep problems are highly prevalent in our representative sample of young and middle-aged adults and as such represent a serious public mental health problem. Our findings indicate that sleep problems have a multi-dimensional structure with some differential associations. While all subtypes were associated with poorer mental health and particularly more depression, severe sleep problems appeared to be the sleep subtype seen in agoraphobia and GAD
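The latent class analysis used in this study can be illustrated with a mixture-of-independent-Bernoullis EM fit, which is the standard statistical model behind LCA for binary indicators. The data, items, and two-class structure below are simulated for illustration only; the survey's actual items and four-class solution are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary indicators (1 = problem reported) for 3 hypothetical
# sleep items, generated from two latent classes.
n, k, d = 600, 2, 3
true_p = np.array([[0.9, 0.8, 0.7],    # "sleep problems" class
                   [0.1, 0.1, 0.2]])   # "no problems" class
z = rng.integers(0, k, size=n)
X = (rng.random((n, d)) < true_p[z]).astype(float)

# EM for a mixture of independent Bernoullis -- the model underlying LCA
# with binary indicators.
pi = np.full(k, 1.0 / k)                      # class weights
p = rng.uniform(0.3, 0.7, size=(k, d))        # item-response probabilities
for _ in range(200):
    # E-step: responsibilities r[i, c] proportional to
    # pi_c * prod_j Bernoulli(x_ij; p_cj)
    logr = np.log(pi) + X @ np.log(p).T + (1 - X) @ np.log(1 - p).T
    logr -= logr.max(axis=1, keepdims=True)
    r = np.exp(logr)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update weights and item probabilities from responsibilities
    pi = r.mean(axis=0)
    p = ((r.T @ X) / r.sum(axis=0)[:, None]).clip(1e-6, 1 - 1e-6)

classes = r.argmax(axis=1)                    # hard class assignment
```

Real LCA software additionally compares solutions with different numbers of classes via fit indices (e.g. BIC), which is how a four-class solution such as the study's would be selected.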

  9. Polarizable Empirical Force Field for Hexopyranose Monosaccharides Based on the Classical Drude Oscillator

    PubMed Central

    2015-01-01

    A polarizable empirical force field based on the classical Drude oscillator is presented for the hexopyranose form of selected monosaccharides. Parameter optimization targeted quantum mechanical (QM) dipole moments, solute–water interaction energies, vibrational frequencies, and conformational energies. Validation of the model was based on experimental data on crystals, densities of aqueous-sugar solutions, diffusion constants of glucose, and rotational preferences of the exocyclic hydroxymethyl of d-glucose and d-galactose in aqueous solution as well as additional QM data. Notably, the final model involves a single electrostatic model for all sixteen diastereomers of the monosaccharides, indicating the transferability of the polarizable model. The presented parameters are anticipated to lay the foundation for a comprehensive polarizable force field for saccharides that will be compatible with the polarizable Drude parameters for lipids and proteins, allowing for simulations of glycolipids and glycoproteins. PMID:24564643
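The core idea of the classical Drude oscillator can be shown in a few lines: an auxiliary charge on a harmonic spring reproduces an isotropic atomic polarizability α = q_D²/k_D. The numbers below are illustrative only, not actual CHARMM Drude parameters:

```python
import numpy as np

# A Drude polarizable site: auxiliary charge q_D tethered by a spring k_D.
# Minimizing U(d) = 0.5*k_D*|d|^2 - q_D*E.d gives displacement d = q_D*E/k_D,
# so the induced dipole is mu = q_D*d = (q_D**2/k_D)*E, i.e. alpha = q_D**2/k_D.
q_D = -1.2                        # Drude charge (e), illustrative
k_D = 500.0                       # spring constant (kcal/mol/Angstrom^2), illustrative
E = np.array([0.0, 0.0, 0.05])    # uniform external field (consistent units)

d = q_D * E / k_D                 # equilibrium Drude displacement
mu_induced = q_D * d              # induced dipole
alpha = q_D**2 / k_D              # resulting isotropic polarizability
```

In a simulation the Drude particles are relaxed (or propagated with an extended Lagrangian) at every step, so the electron cloud responds self-consistently to the local field; this is what the additive CHARMM model cannot do.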

  10. New Denoising Method Based on Empirical Mode Decomposition and Improved Thresholding Function

    NASA Astrophysics Data System (ADS)

    Mohguen, Wahiba; Bekka, Raïs El'hadi

    2017-01-01

    This paper presents a new denoising method, called EMD-ITF, based on Empirical Mode Decomposition (EMD) and an Improved Thresholding Function (ITF). EMD is applied to adaptively decompose a noisy signal into intrinsic mode functions (IMFs). All noisy IMFs are then thresholded with the improved thresholding function to suppress noise and improve the signal-to-noise ratio (SNR). The method was tested on simulated and real data, and the results were compared to EMD-based signal denoising using soft thresholding. The results showed the superior performance of EMD-ITF denoising over the traditional approach. Performance was evaluated in terms of SNR in dB and Mean Square Error (MSE).
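The thresholding step of such a scheme can be sketched as follows. Since the paper's exact improved thresholding function is not reproduced here, the `improved_threshold` below is one common variant that zeroes small coefficients like soft thresholding but shrinks large ones less (reducing bias); the universal threshold with a robust noise estimate is likewise a standard choice, not necessarily the paper's. The mock array stands in for a single IMF after decomposition:

```python
import numpy as np

def soft_threshold(x, T):
    return np.sign(x) * np.maximum(np.abs(x) - T, 0.0)

def improved_threshold(x, T, a=2.0):
    # Zeroes |x| < T like soft thresholding, but the shrinkage term
    # T*exp(-a*(|x|-T)) decays for large |x|, so big coefficients are
    # less biased toward zero than under plain soft thresholding.
    out = np.sign(x) * (np.abs(x) - T * np.exp(-a * (np.abs(x) - T)))
    return np.where(np.abs(x) >= T, out, 0.0)

# Apply to a mock noisy IMF (in EMD-ITF this is done per IMF after EMD).
rng = np.random.default_rng(1)
imf = np.where(np.arange(1024) % 128 < 4, 1.0, 0.0) + rng.normal(0, 0.1, 1024)
sigma = np.median(np.abs(imf)) / 0.6745          # robust noise estimate
T = sigma * np.sqrt(2 * np.log(imf.size))        # universal threshold
denoised = improved_threshold(imf, T)
```

Note that the function is continuous at |x| = T (the shrinkage there equals T), which avoids the spurious oscillations that hard thresholding can introduce in the reconstruction.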

  11. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    PubMed

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the conventionally applied low-pass filtering confirmed the effectiveness of the technique in tissue artifact removal.
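The mutual-information part of such an IMF-selection rule can be sketched with a simple histogram estimator. The decision rule and the mock "IMFs" below are illustrative stand-ins, not the paper's exact criteria:

```python
import numpy as np

def mutual_info(x, y, bins=32):
    # Plug-in (2-D histogram) estimate of I(X;Y) in nats.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(5)
t = np.arange(4096) / 100.0
resp_imf = np.sin(2 * np.pi * 0.3 * t)          # respiration-like component
artifact_imf = 0.3 * rng.normal(size=t.size)    # tissue-motion-like component
belt = resp_imf + artifact_imf                  # recorded belt signal

# Rank candidate IMFs by shared information with the recorded signal;
# respiration-dominant IMFs score high, artifact-dominant IMFs score low.
scores = [mutual_info(imf, belt) for imf in (resp_imf, artifact_imf)]
```

A power criterion (e.g. each IMF's fraction of total signal energy) would then be combined with these scores before deciding which IMFs enter the reconstruction.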

  12. Tissue Artifact Removal from Respiratory Signals Based on Empirical Mode Decomposition

    PubMed Central

    Liu, Shaopeng; Gao, Robert X.; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-01-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions (IMFs) for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the conventionally applied low-pass filtering confirmed the effectiveness of the technique in tissue artifact removal. PMID:23325303

  13. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states (see map below) reporting investments explicitly dedicated…

  14. Web-Based Surveys Facilitate Undergraduate Research and Knowledge

    ERIC Educational Resources Information Center

    Grimes, Paul, Ed.; Steele, Scott R.

    2008-01-01

    The author presents Web-based surveying as a valuable tool for achieving quality undergraduate research in upper-level economics courses. Web-based surveys can be employed in efforts to integrate undergraduate research into the curriculum without overburdening students or faculty. The author discusses the value of undergraduate research, notes…

  15. Web-Based Surveys Facilitate Undergraduate Research and Knowledge

    ERIC Educational Resources Information Center

    Grimes, Paul, Ed.; Steele, Scott R.

    2008-01-01

    The author presents Web-based surveying as a valuable tool for achieving quality undergraduate research in upper-level economics courses. Web-based surveys can be employed in efforts to integrate undergraduate research into the curriculum without overburdening students or faculty. The author discusses the value of undergraduate research, notes…

  16. Survey says? A primer on web-based survey design and distribution.

    PubMed

    Oppenheimer, Adam J; Pannucci, Christopher J; Kasten, Steven J; Haase, Steven C

    2011-07-01

    The Internet has changed the way in which we gather and interpret information. Although books were once the exclusive bearers of data, knowledge is now only a keystroke away. The Internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over 100 medical publications have been based on Web-based survey data alone. Because of emerging Internet technologies, Web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, Web-based survey techniques are not without their limitations, namely, recall and response biases. When used properly, however, Web-based surveys can greatly simplify the research process. This article discusses the implications of Web-based surveys and provides guidelines for their effective design and distribution.

  17. An empirical comparison of methods for analyzing correlated data from a discrete choice survey to elicit patient preference for colorectal cancer screening

    PubMed Central

    2012-01-01

    Background A discrete choice experiment (DCE) is a preference survey that asks participants to choose among product portfolios differing in key product characteristics over several choice tasks. Analysis of DCE data must account for within-participant correlation because choices from the same participant are likely to be similar. In this study, we empirically compared some commonly used statistical methods for analyzing DCE data while accounting for within-participant correlation, based on a survey of patient preference for colorectal cancer (CRC) screening tests conducted in Hamilton, Ontario, Canada in 2002. Methods A two-stage DCE design was used to investigate the impact of six attributes on participants' preferences for a CRC screening test and willingness to undertake the test. We compared six models for clustered binary outcomes (logistic and probit regressions using cluster-robust standard errors (SE), random-effects and generalized estimating equation approaches) and three models for clustered nominal outcomes (multinomial logistic and probit regressions with cluster-robust SE and a random-effects multinomial logistic model). We also fitted a bivariate probit model with cluster-robust SE, treating the choices from the two stages as two correlated binary outcomes. The ranking of relative importance between attributes and the estimates of the β coefficients within attributes were used to assess model robustness. Results In total, 468 participants, each completing 10 choices, were analyzed. Similar results were reported for the ranking of relative importance and β coefficients across models for stage-one data evaluating participants' preferences for the test. The six attributes ranked from high to low as follows: cost, specificity, process, sensitivity, preparation and pain. However, the results differed across models for stage-two data evaluating participants' willingness to undertake the tests. Little within-patient correlation (ICC ≈ 0) was

  18. An empirical comparison of methods for analyzing correlated data from a discrete choice survey to elicit patient preference for colorectal cancer screening.

    PubMed

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, Deborah A; Marshall, John K; Thabane, Lehana

    2012-02-20

    A discrete choice experiment (DCE) is a preference survey that asks participants to choose among product portfolios differing in key product characteristics over several choice tasks. Analysis of DCE data must account for within-participant correlation because choices from the same participant are likely to be similar. In this study, we empirically compared some commonly used statistical methods for analyzing DCE data while accounting for within-participant correlation, based on a survey of patient preference for colorectal cancer (CRC) screening tests conducted in Hamilton, Ontario, Canada in 2002. A two-stage DCE design was used to investigate the impact of six attributes on participants' preferences for a CRC screening test and willingness to undertake the test. We compared six models for clustered binary outcomes (logistic and probit regressions using cluster-robust standard errors (SE), random-effects and generalized estimating equation approaches) and three models for clustered nominal outcomes (multinomial logistic and probit regressions with cluster-robust SE and a random-effects multinomial logistic model). We also fitted a bivariate probit model with cluster-robust SE, treating the choices from the two stages as two correlated binary outcomes. The ranking of relative importance between attributes and the estimates of the β coefficients within attributes were used to assess model robustness. In total, 468 participants, each completing 10 choices, were analyzed. Similar results were reported for the ranking of relative importance and β coefficients across models for stage-one data evaluating participants' preferences for the test. The six attributes ranked from high to low as follows: cost, specificity, process, sensitivity, preparation and pain. However, the results differed across models for stage-two data evaluating participants' willingness to undertake the tests. Little within-patient correlation (ICC ≈ 0) was found in stage-one data, but

  19. Survey Response-Related Biases in Contingent Valuation: Concepts, Remedies, and Empirical Application to Valuing Aquatic Plant Management

    Treesearch

    Mark L. Messonnier; John C. Bergstrom; Christopher M. Cornwell; R. Jeff Teasley; H. Ken Cordell

    2000-01-01

    Simple nonresponse and selection biases that may occur in survey research such as contingent valuation applications are discussed and tested. Correction mechanisms for these types of biases are demonstrated. Results indicate the importance of testing and correcting for unit and item nonresponse bias in contingent valuation survey data. When sample nonresponse and...

  20. Network-based empirical Bayes methods for linear models with applications to genomic data.

    PubMed

    Li, Caiyan; Wei, Zhi; Li, Hongzhe

    2010-03-01

    Empirical Bayes methods are widely used in the analysis of microarray gene expression data in order to identify the differentially expressed genes or genes that are associated with other general phenotypes. Available methods often assume that genes are independent. However, genes are expected to function interactively and to form molecular modules to affect the phenotypes. In order to account for regulatory dependency among genes, we propose in this paper a network-based empirical Bayes method for analyzing genomic data in the framework of linear models, where the dependency of genes is modeled by a discrete Markov random field defined on a predefined biological network. This method provides a statistical framework for integrating the known biological network information into the analysis of genomic data. We present an iterated conditional mode algorithm for parameter estimation and for estimating the posterior probabilities using Gibbs sampling. We demonstrate the application of the proposed methods using simulations and analysis of a human brain aging microarray gene expression data set.
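The iterated-conditional-modes step described above can be illustrated on a toy problem: latent indicators z_g for differential expression receive a Gaussian likelihood term plus a Markov-random-field term rewarding agreement with network neighbours. The chain "network", effect size, and coupling β below are all hypothetical, and the known state means stand in for the paper's full parameter estimation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy network-based setting: gene scores x_g and latent states z_g in {0, 1}
# (1 = differentially expressed) on a hypothetical chain network.
n = 200
edges = [(i, i + 1) for i in range(n - 1)]
z_true = np.zeros(n, dtype=int)
z_true[60:120] = 1                              # a module of affected genes
x = rng.normal(2.5 * z_true, 1.0)               # scores: N(0,1) vs N(2.5,1)

nbrs = [[] for _ in range(n)]
for a, b in edges:
    nbrs[a].append(b)
    nbrs[b].append(a)

mu = (0.0, 2.5)                                 # state means (assumed known)
beta = 1.0                                      # hypothetical MRF coupling
z = (x > 1.25).astype(int)                      # initialize by thresholding

# Iterated conditional modes: repeatedly set each z_g to the state that
# maximizes its conditional score (log-likelihood + network agreement).
for _ in range(10):
    for g in range(n):
        score = [-0.5 * (x[g] - mu[s]) ** 2
                 + beta * sum(int(z[j] == s) for j in nbrs[g])
                 for s in (0, 1)]
        z[g] = int(np.argmax(score))

accuracy = float((z == z_true).mean())
```

The MRF term cleans up isolated misclassifications inside the module, which is the intuition for why borrowing strength across a biological network improves on gene-by-gene analysis.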

  1. Polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator.

    PubMed

    He, Xibing; Lopes, Pedro E M; Mackerell, Alexander D

    2013-10-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated in the cases of this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, results in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol as evidenced by the calculated heat of vaporization being in excellent agreement with experiment. Computed condensed-phase data, including crystal lattice parameters and volumes and densities of aqueous solutions, are in better agreement with experimental data as compared to the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules.

  2. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences. PMID:26007714

  3. An empirical model for the plasma environment along Titan's orbit based on Cassini plasma observations

    NASA Astrophysics Data System (ADS)

    Smith, H. Todd; Rymer, Abigail M.

    2014-07-01

    Prior to Cassini's arrival at Saturn, the nitrogen-rich dense atmosphere of Titan was considered a significant, if not dominant, source of heavy ions in Saturn's magnetosphere. While nitrogen was detected in Saturn's magnetosphere based on Cassini observations, Enceladus rather than Titan appears to be the primary source. However, it is difficult to imagine that Titan's dense atmosphere is not a source of nitrogen. In this paper, we apply the Rymer et al. (2009) Titan plasma environment categorization model to the plasma environment along Titan's orbit when Titan is not present. We next categorize the Titan encounters that have occurred since Rymer et al. (2009). We also produce an empirical model giving the probabilistic occurrence of each plasma environment as a function of Saturn local time (SLT). Finally, we summarize the electron energy spectra to allow calculation of more accurate electron-impact interaction rates for each plasma environment category. The combination of this full categorization versus SLT and the empirical model for the electron spectrum is critical for understanding the magnetospheric plasma and will allow more accurate modeling of the Titan plasma torus.

  4. Polarizable Empirical Force Field for Acyclic Poly-Alcohols Based on the Classical Drude Oscillator

    PubMed Central

    He, Xibing; Lopes, Pedro E. M.; MacKerell, Alexander D.

    2014-01-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated in the cases of this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, results in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol as evidenced by calculated heat of vaporization being in excellent agreement with experiment. Computed condensed phase data, including crystal lattice parameters and volumes and densities of aqueous solutions are in better agreement with experimental data as compared to the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules. PMID:23703219

  5. Empirical tests of natural selection-based evolutionary accounts of ADHD: a systematic review.

    PubMed

    Thagaard, Marthe S; Faraone, Stephen V; Sonuga-Barke, Edmund J; Østergaard, Søren D

    2016-10-01

    ADHD is a prevalent and highly heritable mental disorder associated with significant impairment, morbidity and increased rates of mortality. This combination of high prevalence and high morbidity/mortality seen in ADHD and other mental disorders presents a challenge to natural selection-based models of human evolution. Several hypotheses have been proposed in an attempt to resolve this apparent paradox. The aim of this study was to review the evidence for these hypotheses. We conducted a systematic review of the literature on empirical investigations of natural selection-based evolutionary accounts for ADHD in adherence with the PRISMA guideline. The PubMed, Embase, and PsycINFO databases were screened for relevant publications, by combining search terms covering evolution/selection with search terms covering ADHD. The search identified 790 records. Of these, 15 full-text articles were assessed for eligibility, and three were included in the review. Two of these reported on the evolution of the seven-repeat allele of the ADHD-associated dopamine receptor D4 gene, and one reported on the results of a simulation study of the effect of suggested ADHD-traits on group survival. The authors of the three studies interpreted their findings as favouring the notion that ADHD-traits may have been associated with increased fitness during human evolution. However, we argue that none of the three studies really tap into the core symptoms of ADHD, and that their conclusions therefore lack validity for the disorder. This review indicates that the natural selection-based accounts of ADHD have not been subjected to empirical test and therefore remain hypothetical.

  6. Methodologies for Crawler Based Web Surveys.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  7. Signal enhancement based on complex curvelet transform and complementary ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Dong, Lieqian; Wang, Deying; Zhang, Yimeng; Zhou, Datong

    2017-09-01

    Signal enhancement is a necessary step in seismic data processing. In this paper we utilize the complementary ensemble empirical mode decomposition (CEEMD) and complex curvelet transform (CCT) methods to separate signal from random noise and thereby improve the signal-to-noise (S/N) ratio. Firstly, the original noisy data is decomposed into a series of intrinsic mode function (IMF) profiles with the aid of CEEMD. Then the noisy IMFs are transformed into the CCT domain. By choosing different thresholds based on the noise level of each IMF profile, the noise in the original data can be suppressed. Finally, we illustrate the effectiveness of the approach on simulated and field datasets.

  8. A Human ECG Identification System Based on Ensemble Empirical Mode Decomposition

    PubMed Central

    Zhao, Zhidong; Yang, Lei; Chen, Diandian; Luo, Yi

    2013-01-01

    In this paper, a human electrocardiogram (ECG) identification system based on ensemble empirical mode decomposition (EEMD) is designed. A robust preprocessing method comprising noise elimination, heartbeat normalization and quality measurement is proposed to eliminate the effects of noise and heart rate variability. The system is independent of the heart rate. The ECG signal is decomposed into a number of intrinsic mode functions (IMFs) and Welch spectral analysis is used to extract the significant heartbeat signal features. Principal component analysis is used to reduce the dimensionality of the feature space, and the K-nearest neighbors (K-NN) method is applied as the classifier tool. The proposed human ECG identification system was tested on standard MIT-BIH ECG databases: the ST change database, the long-term ST database, and the PTB database. The system achieved an identification accuracy of 95% for 90 subjects, demonstrating the effectiveness of the proposed method in terms of accuracy and robustness. PMID:23698274
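The feature pipeline (Welch spectra, then PCA, then K-NN) can be sketched end to end. The synthetic "heartbeats" below are deterministic spectral signatures standing in for real preprocessed ECG, so the subject count, frequencies, and accuracy are purely illustrative:

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
fs = 250  # Hz, hypothetical sampling rate

# Synthetic stand-ins for heartbeat segments: each "subject" has a fixed
# pair of spectral peaks (the paper instead uses IMFs of real ECG).
def beats(s, n_beats=20):
    f1, f2 = 5 + 3 * s, 20 + 4 * s              # per-subject signature
    t = np.arange(fs) / fs
    return [np.sin(2 * np.pi * f1 * t) + 0.8 * np.sin(2 * np.pi * f2 * t)
            + rng.normal(0, 0.3, fs) for _ in range(n_beats)]

X, y = [], []
for s in range(5):                              # 5 hypothetical subjects
    for b in beats(s):
        _, pxx = welch(b, fs=fs, nperseg=128)   # Welch spectral features
        X.append(pxx)
        y.append(s)
X, y = np.array(X), np.array(y)

# PCA for dimensionality reduction, K-NN as the classifier, as in the paper
# (PCA is fit on all data here for brevity).
Xp = PCA(n_components=5).fit_transform(X)
clf = KNeighborsClassifier(n_neighbors=3).fit(Xp[::2], y[::2])
acc = clf.score(Xp[1::2], y[1::2])              # held-out identification accuracy
```

With well-separated spectral signatures the held-out accuracy is near perfect; on real ECG the preprocessing and EEMD stages do the work of making subjects this separable.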

  9. Distributed optical fiber-based theoretical and empirical methods monitoring hydraulic engineering subjected to seepage velocity

    NASA Astrophysics Data System (ADS)

    Su, Huaizhi; Tian, Shiguang; Cui, Shusheng; Yang, Meng; Wen, Zhiping; Xie, Wei

    2016-09-01

    In order to systematically investigate the general principle and method of monitoring seepage velocity in hydraulic engineering, theoretical analysis and physical experiments were implemented based on distributed fiber-optic temperature sensing (DTS) technology. During the coupling influence analyses between the seepage field and the temperature field in embankment dam or dike engineering, a simplified model was constructed to describe the coupling relationship of the two fields. Different arrangement schemes of optical fiber and measuring approaches of temperature were applied to the model. The inversion analysis idea was further used. A theoretical method of monitoring seepage velocity in hydraulic engineering was finally proposed. A new concept, namely the effective thermal conductivity, was proposed, referring to the thermal conductivity coefficient in the transient hot-wire method. The influence of heat conduction and seepage is well reflected by this new concept, which proved to be a promising basis for an empirical method of monitoring seepage velocity in hydraulic engineering.

  10. Developing empirically based, culturally grounded drug prevention interventions for indigenous youth populations.

    PubMed

    Okamoto, Scott K; Helm, Susana; Pel, Suzanne; McClain, Latoya L; Hill, Amber P; Hayashida, Janai K P

    2014-01-01

    This article describes the relevance of a culturally grounded approach toward drug prevention development for indigenous youth populations. This approach builds drug prevention from the "ground up" (i.e., from the values, beliefs, and worldviews of the youth who are the intended consumers of the program) and is contrasted with efforts that focus on adapting existing drug prevention interventions to fit the norms of different youth ethnocultural groups. The development of an empirically based drug prevention program focused on rural Native Hawaiian youth is described as a case example of culturally grounded drug prevention development for indigenous youth; the impact of this effort on the validity of the intervention and on community engagement and investment in the development of the program is discussed. Finally, implications of this approach for behavioral health services and the development of an indigenous prevention science are discussed.

  11. Riemann Liouville Fractional Integral based Empirical Mode Decomposition for ECG Denoising.

    PubMed

    Jain, Shweta; Bajaj, Varun; Kumar, Anil

    2017-09-18

    Electrocardiogram (ECG) denoising is the most important step in the diagnosis of heart-related diseases, as noise can distort the diagnosis. In this paper, a new method for ECG denoising is proposed, which combines the empirical mode decomposition (EMD) algorithm with Riemann-Liouville (RL) fractional integral filtering. In the proposed method, the noisy ECG signal is decomposed into its intrinsic mode functions (IMFs), from which noisy IMFs are identified by the proposed noisy-IMF identification methodology. RL fractional integral filtering is applied to the noisy IMFs to obtain denoised IMFs; the ECG signal is then reconstructed from the denoised IMFs and the remaining signal-dominant IMFs to obtain a noise-free ECG signal. The proposed methodology is tested on the MIT-BIH arrhythmia database. Its performance, in terms of signal-to-noise ratio (SNR) and mean square error (MSE), is compared with other related fractional-integral and EMD-based ECG denoising methods. The results show that the proposed method achieves efficient noise removal.
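The Riemann-Liouville fractional integral underlying the filtering step can be approximated on sampled data with Grünwald-Letnikov weights. The sketch below is one standard discretization, not necessarily the authors' implementation; it checks itself against the order-1 case, where fractional integration reduces to an ordinary running Riemann sum.

```python
import numpy as np

def rl_fractional_integral(f, alpha, h):
    """Grünwald-Letnikov approximation of the Riemann-Liouville
    fractional integral of order alpha on a uniform grid with step h."""
    n = len(f)
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):                     # recursive GL weights
        g[k] = g[k - 1] * (k - 1 + alpha) / k
    # Discrete convolution of the weights with the signal history.
    return h**alpha * np.array([np.dot(g[:i + 1], f[i::-1]) for i in range(n)])

# Sanity check: order-1 fractional integration of f(t) = 1 gives ~t.
h = 0.01
t = np.arange(0, 1, h)
approx = rl_fractional_integral(np.ones_like(t), alpha=1.0, h=h)
print(approx[-1])  # close to 1.0
```

For denoising, a fractional order 0 < alpha < 1 acts as a tunable low-pass smoother on the selected noisy IMFs.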

  12. Confidence Interval Estimation for Sensitivity to the Early Diseased Stage Based on Empirical Likelihood.

    PubMed

    Dong, Tuochuan; Tian, Lili

    2015-01-01

    Many disease processes can be divided into three stages: the non-diseased stage, the early diseased stage, and the fully diseased stage. To assess the accuracy of diagnostic tests for such diseases, various summary indexes have been proposed, such as the volume under the surface (VUS), the partial volume under the surface (PVUS), and the sensitivity to the early diseased stage given specificity and the sensitivity to the fully diseased stage (P2). This paper focuses on confidence interval estimation for P2 based on empirical likelihood. Simulation studies are carried out to assess the performance of the new methods compared to the existing parametric and nonparametric ones. A real dataset from the Alzheimer's Disease Neuroimaging Initiative (ADNI) is analyzed.

  13. On the pathophysiology of migraine--links for "empirically based treatment" with neurofeedback.

    PubMed

    Kropp, Peter; Siniatchkin, Michael; Gerber, Wolf-Dieter

    2002-09-01

    Psychophysiological data support the concept that migraine is the result of cortical hypersensitivity, hyperactivity, and a lack of habituation. There is evidence that this is a brain-stem-related information-processing dysfunction. This cortical activity shows a periodicity between two migraine attacks, and it may be due to endogenous or exogenous factors. In the few days preceding the next attack, slow cortical potentials are highest and the habituation delay experimentally recorded during contingent negative variation is at a maximum. These striking features of slow cortical potentials are predictors of the next attack. The pronounced negativity can be fed back to the patient. The data support the hypothesis that a change in the amplitudes of slow cortical potentials is caused by altered habituation during the recording session. This kind of neurofeedback can be characterized as "empirically based" because it improves habituation and proves to be clinically effective.

  14. The children of divorce parenting intervention: outcome evaluation of an empirically based program.

    PubMed

    Wolchik, S A; West, S G; Westover, S; Sandler, I N; Martin, A; Lustig, J; Tein, J Y; Fisher, J

    1993-06-01

    Examined efficacy of an empirically based intervention using 70 divorced mothers who participated in a 12-session program or a wait-list condition. The program targeted five putative mediators: quality of the mother-child relationship, discipline, negative divorce events, contact with fathers, and support from nonparental adults. Posttest comparisons showed higher quality mother-child relationships and discipline, fewer negative divorce events, and better mental health outcomes for program participants than controls. More positive program effects occurred for mothers' than children's reports of variables and for families with the poorest initial levels of functioning. Analyses indicated that improvement in the mother-child relationship partially mediated the effects of the program on mental health.

  15. Multi-faults decoupling on turbo-expander using differential-based ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Li, Hongguang; Li, Ming; Li, Cheng; Li, Fucai; Meng, Guang

    2017-09-01

    This paper addresses the decoupling of multiple faults in a turbo-expander rotor system using Differential-based Ensemble Empirical Mode Decomposition (DEEMD). DEEMD is an improved version of DEMD that resolves the problem of mode mixing. The nonlinear behaviors of the turbo-expander under a temperature gradient with crack, rub-impact and pedestal-looseness faults are investigated respectively, so that a baseline for multi-fault decoupling can be established. DEEMD is then applied to the vibration signals of the rotor system with coupling faults acquired by numerical simulation, and the results indicate that DEEMD can successfully decouple the coupled faults and is more efficient than EEMD. DEEMD is also applied to the vibration signal of misalignment coupled with a rub-impact fault obtained during adjustment of the experimental system. The results show that DEEMD can decompose practical multi-fault signals, verifying its industrial applicability.

  16. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis that describes the expected remaining time of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors such as age and gender that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd.
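The MRL function itself is simple to estimate empirically: m(t) = E[T - t | T > t]. The sketch below ignores censoring (which the paper handles) and uses the exponential distribution's memorylessness, m(t) = 1/rate for all t, as a self-check; the data are simulated, not from either study cited above.

```python
import numpy as np

def empirical_mrl(times, t):
    """Empirical mean residual life m(t) = E[T - t | T > t]
    for an uncensored sample (censoring is ignored in this sketch)."""
    residual = times[times > t] - t
    return residual.mean() if residual.size else 0.0

# For exponential lifetimes with rate 1, m(t) = 1 for every t.
rng = np.random.default_rng(42)
sample = rng.exponential(scale=1.0, size=200_000)
m = empirical_mrl(sample, 0.5)
print(m)  # close to 1.0
```

A change-point procedure like the paper's compares estimates of m(t) on either side of a candidate age rather than assuming a parametric form.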

  17. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed, but despite some effort, they perform poorly and are very time consuming. In this paper, therefore, an extension of the PDE-based approach to 2-D space is described in detail and applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, and results are provided for image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.

  18. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
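The empirical hazard curve described above can be sketched as follows: given a stochastic event set of simulated scenarios, each with an annual occurrence rate and a simulated intensity at the site, the mean annual exceedance rate at a depth threshold is the summed rate of scenarios exceeding it. The event set below is entirely hypothetical, not the paper's.

```python
import numpy as np

# Hypothetical stochastic event set: each scenario carries an annual
# occurrence rate and a simulated inundation depth (m) at the site.
rng = np.random.default_rng(1)
n_scenarios = 5000
rates = np.full(n_scenarios, 0.01 / n_scenarios)  # total rate: 0.01/yr
depths = rng.lognormal(mean=0.0, sigma=1.0, size=n_scenarios)

def hazard_curve(depths, rates, thresholds):
    """Empirical mean annual exceedance rate at each depth threshold."""
    return np.array([rates[depths > x].sum() for x in thresholds])

thresholds = np.array([0.5, 1.0, 2.0, 4.0])
lam = hazard_curve(depths, rates, thresholds)
for x, l in zip(thresholds, lam):
    print(f"depth > {x} m: {l:.2e} /yr")
```

The paper's robust variant replaces this raw empirical counting with a Bayesian fit, so far fewer simulated scenarios are needed for a stable curve.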

  19. Embedding empirical mode decomposition within an FPGA-based design: challenges and progress

    NASA Astrophysics Data System (ADS)

    Jones, Jonathan D.; Pei, Jin-Song; Wright, Joseph P.

    2011-04-01

    This paper presents further advancements made in an ongoing project following a series of presentations made at the same SPIE conference in the past. Compared with traditional microprocessor-based systems, rapidly advancing field-programmable gate array (FPGA) technology offers a more powerful, efficient and flexible hardware platform. An FPGA-based design is developed to classify three types of nonlinearities (including linear, hardening and softening) of a single-degree-of-freedom (SDOF) system subjected to free vibration. This significantly advances the team's previous work on using FPGAs for wireless structural health monitoring. The classification is achieved by embedding two important algorithms - empirical mode decomposition (EMD) and backbone curve analysis. A series of systematic efforts is made to embed EMD, which involves cubic spline fitting, in an FPGA-based hardware design. Throughout the process, we take advantage of concurrent operation and strive for a trade-off between computational efficiency and resource utilization. We have started to pursue our work in the context of FPGA-based computation. In particular, handling fixed-point precision is framed under data-path optimization. Our approach for data-path optimization is necessarily manual and thus may not guarantee an optimal design. Nonetheless, our study could provide a baseline case for future work using analytical data-path optimization for this and numerous other powerful algorithms for wireless structural health monitoring.

  20. Web-based application for Data INterpolation Empirical Orthogonal Functions (DINEOF) analysis

    NASA Astrophysics Data System (ADS)

    Tomazic, Igor; Alvera-Azcarate, Aida; Barth, Alexander; Beckers, Jean-Marie

    2014-05-01

    DINEOF (Data INterpolating Empirical Orthogonal Functions) is a powerful tool based on EOF decomposition, developed at the University of Liege/GHER, for the reconstruction of missing data in satellite datasets, as well as for the reduction of noise and detection of outliers. DINEOF is openly available as a series of Fortran routines to be compiled by the user, and as binaries (that can be run directly without any compilation) for both Windows and Linux platforms. In order to facilitate the use of DINEOF and increase the number of interested users, we developed a web-based interface for DINEOF with the necessary parameters available to run a high-quality DINEOF analysis. This includes choosing a variable within the selected dataset, defining a domain and time range, setting filtering criteria based on available variables in the dataset (e.g. quality flag, satellite zenith angle …) and defining the necessary DINEOF parameters. Results, including reconstructed data and calculated EOF modes, will be disseminated in NetCDF format using OpenDAP and a WMS server, allowing easy visualisation and analysis. Initially, we will include several satellite datasets of sea surface temperature and chlorophyll concentration obtained from the MyOcean data centre and already remapped to a regular grid (L3C). Later, based on user requests, we plan to extend the number of datasets available for reconstruction.
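The core idea behind DINEOF, iteratively reconstructing missing values from a truncated EOF (SVD) expansion, can be sketched in a few lines. This is a toy version on a synthetic low-rank field, not the GHER Fortran code, and it omits DINEOF's cross-validated choice of the number of modes.

```python
import numpy as np

def eof_fill(data, mask, n_modes=2, n_iter=200):
    """DINEOF-style gap filling: initialize missing values with the
    field mean, then iteratively replace them with a truncated-SVD
    reconstruction while keeping observed values fixed."""
    filled = np.where(mask, data, data[mask].mean())
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        filled = np.where(mask, data, recon)
    return filled

# Synthetic rank-2 "satellite" field with ~30% missing pixels.
rng = np.random.default_rng(7)
t = np.linspace(0, 2 * np.pi, 60)
field = np.outer(np.sin(t), np.cos(t)) + 0.5 * np.outer(np.cos(2 * t), np.sin(3 * t))
mask = rng.random(field.shape) > 0.3   # True where data are observed
result = eof_fill(field, mask)
err = np.abs(result - field)[~mask].mean()
print(err)  # small reconstruction error on the gaps
```

Real DINEOF runs on space-time matrices (pixels × time steps) and also uses the reconstruction residuals for outlier detection.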

  2. A survey of machine readable data bases

    NASA Technical Reports Server (NTRS)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  3. Empirical source strength correlations for rans-based acoustic analogy methods

    NASA Astrophysics Data System (ADS)

    Kube-McDowell, Matthew Tyndall

    JeNo is a jet noise prediction code based on an acoustic analogy method developed by Mani, Gliebe, Balsa, and Khavaran. Using the flow predictions from a standard Reynolds-averaged Navier-Stokes computational fluid dynamics solver, JeNo predicts the overall sound pressure level and angular spectra for high-speed hot jets over a range of observer angles, with a processing time suitable for rapid design purposes. JeNo models the noise from hot jets as a combination of two types of noise sources: quadrupole sources dependent on velocity fluctuations, which represent the major noise of turbulent mixing, and dipole sources dependent on enthalpy fluctuations, which represent the effects of thermal variation. These two sources are modeled by JeNo as propagating independently into the far field, with no cross-correlation at the observer location. However, high-fidelity computational fluid dynamics solutions demonstrate that this assumption is false. In this thesis, the theory, assumptions, and limitations of the JeNo code are briefly discussed, and a modification to the acoustic analogy method is proposed in which the cross-correlation of the two primary noise sources is allowed to vary with the speed of the jet and the observer location. As a proof-of-concept implementation, an empirical correlation correction function is derived from comparisons between JeNo's noise predictions and a set of experimental measurements taken for the Air Force Aero-Propulsion Laboratory. The empirical correlation correction is then applied to JeNo's predictions for a separate data set of hot jets tested at NASA's Glenn Research Center. Metrics are derived to measure the qualitative and quantitative performance of JeNo's acoustic predictions, and the empirical correction is shown to provide a quantitative improvement in the noise prediction at low observer angles with no freestream flow, and a qualitative improvement in the presence of freestream flow. However, the results also demonstrate

  4. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  5. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U. S. organizations, and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in U. S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  7. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to critically analyse whether these policies can be considered organizational-ethical instruments that support healthcare institutions in taking institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we reinterpreted the results of earlier Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, content, and impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered a dynamic process. Administrators and ethics committees must take responsibility for actively creating an ethical climate that supports care providers who have to deal with ethical dilemmas in their practice.

  8. An Empirical Model of Solar Indices and Hemispheric Power based on DMSP/SSUSI Data

    NASA Astrophysics Data System (ADS)

    Shaikh, D.; Jones, J.

    2014-09-01

    Aurorae are produced by collisions of charged energetic particles, typically electrons, with the Earth's neutral atmosphere, particularly in the high-latitude regions. These particles originate predominantly from the solar wind, traverse the Earth's magnetosphere, and precipitate into the Earth's atmosphere, resulting in emission of radiation in various frequency ranges. During this process, energetic electrons deposit their kinetic energy (tens of keV) in the upper atmosphere. The rate of electron kinetic energy deposited over the northern or southern region is called the electron hemispheric power (Hpe), measured in gigawatts (GW). Since the origin and dynamics of these precipitating charged particles are intimately connected to the kinetic and magnetic activity of the Sun, they can be used as a proxy for many physical processes that drive space weather at Earth. In this paper, we examine the correlations between the Hpe component and various geomagnetic parameters such as Kp, Ap, solar flux and sunspot number. For this purpose, we evaluate one year (2012) of data from the Special Sensor Ultraviolet Spectrographic Imager (SSUSI) on the Defense Meteorological Satellite Program (DMSP) Flight 18 satellite. We find substantially strong correlations between Hpe and Kp, Ap, the sunspot number (SSN) and the solar flux density. The practical applications of our empirical model are multifold. (i) We can determine or forecast the Kp index directly from the electron flux density and use it to drive a variety of space weather models that rely heavily on Kp input. (ii) The Kp and Ap forecasts from our empirical correlation model could complement traditional ground-based magnetometer data.

  9. Drainage Canal Survey, Hickam Air Force Base, Hawaii

    DTIC Science & Technology

    1993-03-01

    AL-TR-1993-0003 (AD-A263 478). Drainage Canal Survey, Hickam Air Force Base, Hawaii. Darrin L. Curtis, Captain, USAF, BSC. Report date: March 1993; final report covering 16-27 March 1992. The Laboratory Water Quality Function conducted a drainage canal survey at Hickam AFB HI from 16 to 27 Mar 92. The scope of the survey was to indicate if current

  10. Target detection for low cost uncooled MWIR cameras based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2014-03-01

    In this work, a novel method for detecting low-intensity fast-moving objects with low-cost Medium Wavelength Infrared (MWIR) cameras is proposed. The method is based on background subtraction in a video sequence obtained with a low-density Focal Plane Array (FPA) of the newly available uncooled lead selenide (PbSe) detectors. Thermal instability, along with the lack of specific electronics and mechanical devices for canceling the effect of distortion, makes background image identification very difficult. As a result, the identification of targets is performed in low signal-to-noise ratio (SNR) conditions, which may considerably restrict the sensitivity of the detection algorithm. These problems are addressed in this work by means of a new technique based on empirical mode decomposition, which accomplishes both drift estimation and target detection. Given that background estimation is the most important stage for detection, a preliminary denoising step enabling better drift estimation is designed. Comparisons are conducted against a denoising technique based on the wavelet transform and against traditional drift estimation methods such as Kalman filtering and the running average. The simulation results show that the proposed scheme has superior performance.
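The running-average baseline mentioned above (one of the drift estimators the EMD-based scheme is compared against) can be sketched as follows, on an entirely synthetic frame sequence with a slowly drifting background and one fast-moving hot spot. This illustrates the comparison baseline, not the paper's EMD detector.

```python
import numpy as np

def detect_targets(frames, alpha=0.05, k=4.0):
    """Classic running-average drift estimation with thresholded
    background subtraction (the baseline, not the EMD scheme)."""
    bg = frames[0].astype(float)
    detections = []
    for frame in frames[1:]:
        resid = frame - bg
        z = resid - np.median(resid)           # remove the common drift lag
        detections.append(np.argwhere(z > k * z.std()))
        bg = (1 - alpha) * bg + alpha * frame  # running-average update
    return detections

# Synthetic MWIR-like sequence: drifting background plus a moving hot spot.
rng = np.random.default_rng(3)
frames = []
for i in range(20):
    f = 100 + 0.1 * i + 0.5 * rng.standard_normal((64, 64))
    f[10, 5 + i] += 20.0                       # target moves one pixel/frame
    frames.append(f)
dets = detect_targets(np.array(frames))
print(dets[-1])  # pixel coordinates flagged in the last frame
```

The slow thermal drift is exactly what makes the plain running average lag behind, which motivates the EMD-based drift estimate in the paper.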

  11. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.

  12. Determinants of Obesity and Associated Population Attributability, South Africa: Empirical Evidence from a National Panel Survey, 2008-2012

    PubMed Central

    Sartorius, Benn; Veerman, Lennert J.; Manyema, Mercy; Chola, Lumbwe; Hofman, Karen

    2015-01-01

    Background: Obesity is a major risk factor for emerging non-communicable diseases (NCDs) in middle-income countries including South Africa (SA). Understanding the multiple and complex determinants of obesity and their true population-attributable impact is critical for informing and developing effective prevention efforts using scientifically based evidence. This study identified contextualised high-impact factors associated with obesity in South Africa. Methods: Analysis of three national cross-sectional (repeated panel) surveys, using multilevel logistic regression and population attributable fraction estimation, allowed for identification of contextualised high-impact factors associated with obesity (BMI>30 kg/m2) among adults (15 years+). Results: Obesity prevalence increased significantly from 23.5% in 2008 to 27.2% in 2012, with a significantly (p-value<0.001) higher prevalence among females (37.9% in 2012) compared to males (13.3% in 2012). Living in formal urban areas, white ethnicity, being married, not exercising and/or being in a higher socio-economic category were significantly associated with male obesity. Females living in formal or informal urban areas, in higher-crime areas, of African/White ethnicity, married, not exercising, in a higher socio-economic category and/or living in households with proportionately higher spending on food (and unhealthy food options) were significantly more likely to be obese. The identified determinants appeared to account for 75% and 43% of male and female obesity respectively. White males had the highest relative gain in obesity from 2008 to 2012. Conclusions: The rising prevalence of obesity in South Africa is significant, and over the past 5 years the rising prevalence of Type-2 diabetes has mirrored this pattern, especially among females. Targeting young adolescent girls should be a priority. Addressing the determinants of obesity will require a multifaceted strategy at both individual and population levels. With rising costs in the
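Population attributable fractions of the kind estimated above are commonly computed with Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)), where p is the exposure prevalence and RR the relative risk. The numbers below are purely illustrative, not the study's estimates.

```python
def paf(prevalence, relative_risk):
    """Levin's population attributable fraction for a single risk factor."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative (hypothetical) figures: 40% of adults exposed, RR = 2.0.
result = round(paf(0.40, 2.0), 3)
print(result)  # → 0.286
```

Multivariable studies like this one instead derive adjusted attributable fractions from the fitted regression model, but the interpretation, the share of cases removable by eliminating the exposure, is the same.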

  13. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators which can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in Thai higher education institutions. The main purpose of this study was to develop empirical indicators of a…

  14. The "Public Opinion Survey of Human Attributes-Stuttering" (POSHA-S): Summary Framework and Empirical Comparisons

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.

    2011-01-01

    Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…

  15. The Interest of Bavarian Primary School Pupils in Geographical Topics and Methods--Selected Results of an Empirical Survey

    ERIC Educational Resources Information Center

    Lorenz, Aline; Roth, Anna; Priese, Carolin; Peukert, Eva; Mertel, Stefanie; Bloß, Susanne; Mehren, Rainer

    2017-01-01

    Interest is a central learning prerequisite for teaching. The article deals with a survey among 1600 primary school pupils in forms 2, 3 and 4 (ages 7-10) in the German federal state of Bavaria who were interviewed on their interest in geographical topics and working methods. They were given a questionnaire including 40 items to indicate their…

  16. Gender Wage Gaps by College Major in Taiwan: Empirical Evidence from the 1997-2003 Manpower Utilization Survey

    ERIC Educational Resources Information Center

    Lin, Eric S.

    2010-01-01

    In this article, we examine the effect of incorporating the fields of study on the explained and unexplained components of the standard Oaxaca decomposition for the gender wage gaps in Taiwan using 1997-2003 Manpower Utilization Survey data. Using several existing and lately developed measures, we inspect the gender wage gap by college major to…

  18. The Utility of Teacher and Student Surveys in Principal Evaluations: An Empirical Investigation. REL 2015-047

    ERIC Educational Resources Information Center

    Liu, Keke; Stuit, David; Springer, Jeff; Lindsay, Jim; Wan, Yinmei

    2014-01-01

    This study examined whether adding student and teacher survey measures to existing principal evaluation measures increases the overall power of the principal evaluation model to explain variation in student achievement across schools. The study was conducted using data from 2011-12 on 39 elementary and secondary schools within a midsize urban…

  20. Exploring physician specialist response rates to web-based surveys.

    PubMed

    Cunningham, Ceara Tess; Quan, Hude; Hemmelgarn, Brenda; Noseworthy, Tom; Beck, Cynthia A; Dixon, Elijah; Samuel, Susan; Ghali, William A; Sykes, Lindsay L; Jetté, Nathalie

    2015-04-09

    Survey research in healthcare is an important tool to collect information about healthcare delivery, service use and overall issues relating to quality of care. Unfortunately, physicians are often a group with low survey response rates, and little research has looked at response rates among physician specialists. For these reasons, the purpose of this project was to explore survey response rates among physician specialists in a large metropolitan Canadian city. As part of a larger project to look at physician payment plans, an online survey about medical billing practices was distributed to 904 physicians from various medical specialties. The primary method for physicians to complete the survey was via the Internet using a well-known and established survey company (www.surveymonkey.com). Multiple methods were used to encourage survey response, such as individual personalized email invitations, multiple reminders, and a draw for three gift certificate prizes. Descriptive statistics were used to assess response rates and reasons for non-response. Overall survey response rate was 35.0%. Response rates varied by specialty: neurology/neurosurgery (46.6%); internal medicine (42.9%); general surgery (29.6%); pediatrics (29.2%); and psychiatry (27.1%). Non-respondents listed lack of time/survey burden as the main reason for not responding to our survey. Our survey results provide a look into the challenges of collecting healthcare research data, where response rates to surveys are often low. The findings presented here should help researchers in planning future survey-based studies. Findings from this study and others suggest smaller monetary incentives for each individual may be a more appropriate way to increase response rates.

  1. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort study including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
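
The matching step of a propensity score analysis like the one above can be sketched as greedy 1:1 nearest-neighbour matching within a caliper. In the study, the scores would come from a model of treatment adequacy on the listed confounders; the scores and identifiers below are purely hypothetical:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    treated/controls: lists of (id, score); each control is used at most
    once. Returns (treated_id, control_id) pairs within the caliper."""
    available = dict(controls)               # control id -> score
    pairs = []
    # Match treated units in score order for determinism.
    for tid, ts in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs

# Toy propensity scores (hypothetical, not the study's data):
treated = [("t1", 0.30), ("t2", 0.62)]
controls = [("c1", 0.28), ("c2", 0.61), ("c3", 0.90)]
print(greedy_match(treated, controls))  # [('t1', 'c1'), ('t2', 'c2')]
```

Outcomes are then compared within the matched pairs, which is what yields the adjusted ORs reported above.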

  2. Evaluation of Physically and Empirically Based Models for the Estimation of Green Roof Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.

    2010-12-01

    Green roofs and other urban green spaces can provide a variety of valuable benefits including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, air pollution mitigation, etc. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is a necessary tool for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications, and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical, yet poorly studied, component of urban green space performance prediction and cite that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup situated on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the 500 m2 green roof of the building, with a roof membrane, drainage layer, 10 cm media depth, and planted with a variety of Sedum species. Soil moisture sensors and qualitative runoff measurements are also recorded in the lysimeter, while a weather station situated on the rooftop records climatologic data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods absent of precipitation and drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for the estimation of ET from green roofs. Results have
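
The mass balance referred to above reduces, during dry intervals with no drainage, to reading AET off the lysimeter's weight loss. A minimal sketch with hypothetical readings (not the study's measurements):

```python
def aet_mm(mass_start_kg, mass_end_kg, area_m2):
    """Actual evapotranspiration from a weighing lysimeter over a dry,
    drainage-free interval: all mass lost is assumed to leave as vapour.
    1 kg of water spread over 1 m^2 corresponds to a 1 mm water depth."""
    mass_loss_kg = mass_start_kg - mass_end_kg
    return mass_loss_kg / area_m2   # mm of water over the interval

# Hypothetical daily readings for a 0.6 m x 1.2 m (0.72 m^2) lysimeter:
print(f"AET = {aet_mm(185.40, 183.96, 0.72):.1f} mm/day")
```

During rainy or draining periods the balance must also subtract precipitation and drainage terms, which is why the study restricts direct AET quantification to dry intervals.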

  3. Web-Based versus Classroom-Based Instruction: An Empirical Comparison of Student Performance

    ERIC Educational Resources Information Center

    Thrasher, Evelyn H.; Coleman, Phillip D.; Atkinson, J. Kirk

    2012-01-01

    Higher education expenditures are being increasingly targeted toward distance learning, with a large portion focused specifically on web-based instruction (WBI). WBI and classroom-based instruction (CBI) tend to offer students diverse options for their education. Thus, it is imperative that colleges and universities have ample, accurate…

  4. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

    An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice, with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project-based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical studies contain flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  5. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation

    PubMed Central

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-01-01

    Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method to reconstruct prompt, informed strain/stress responses at the hot spots of structures based on strain measurements at remote locations. The structural responses measured by a usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Then, two numerical examples (a two-span beam and a 19956-degree-of-freedom simplified airfoil) are used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam commonly used in air vehicles and spacecraft. The experiments collect the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions and a detailed error analysis is also provided. PMID:27537889
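
The reconstruction chain above ends with a linear step: each modal response extracted at a measured location is scaled by a factor derived from the finite-element mode shapes and summed at the unmeasured hot spot. A hypothetical sketch of just that transformation step (the EMD decomposition is assumed already done, and all numbers are invented for illustration):

```python
def reconstruct_hotspot(modal_responses, phi_measured, phi_critical):
    """Map modal responses extracted at a measured location to an
    unmeasured critical location. For each mode i, the FE mode shapes
    give a scale factor phi_critical[i] / phi_measured[i]; summing the
    scaled modal responses reconstructs the hot-spot strain history."""
    n = len(modal_responses[0])
    hotspot = [0.0] * n
    for resp, pm, pc in zip(modal_responses, phi_measured, phi_critical):
        scale = pc / pm
        for t in range(n):
            hotspot[t] += scale * resp[t]
    return hotspot

# Two modes over three time steps (toy numbers):
modes = [[1.0, 2.0, 1.0], [0.5, -0.5, 0.5]]
print(reconstruct_hotspot(modes, phi_measured=[1.0, 1.0],
                          phi_critical=[2.0, 0.5]))  # [2.25, 3.75, 2.25]
```

The per-mode scaling is what makes the extrapolation valid even though no sensor sits at the critical location: mode shapes, unlike raw strains, are a property of the structure itself.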

  6. Predicting Protein Secondary Structure Using Consensus Data Mining (CDM) Based on Empirical Statistics and Evolutionary Information.

    PubMed

    Kandoi, Gaurav; Leelananda, Sumudu P; Jernigan, Robert L; Sen, Taner Z

    2017-01-01

    Predicting the secondary structure of a protein from its sequence still remains a challenging problem. The prediction accuracies remain around 80%, even for very diverse methods. Using evolutionary information and machine learning algorithms in particular has had the most impact. In this chapter, we will first define secondary structures, and then we will review the Consensus Data Mining (CDM) technique, which is based on the robust GOR algorithm and the Fragment Database Mining (FDM) approach. GOR V is an empirical method utilizing a sliding window approach to model the secondary structural elements of a protein by making use of generalized evolutionary information. FDM uses data mining from experimental structure fragments, and is able to successfully predict the secondary structure of a protein by combining experimentally determined structural fragments based on sequence similarities of the fragments. The CDM method combines predictions from GOR V and FDM in a hierarchical manner to produce consensus predictions for secondary structure: if sequence fragments are not available, then it uses GOR V to make the secondary structure prediction. The online server of CDM is available at http://gor.bb.iastate.edu/cdm/.
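
The sliding-window idea behind GOR can be sketched in a few lines: score each residue for helix (H), strand (E), and coil (C) by summing per-residue propensities over a window centred on it, then take the best-scoring state. The propensity numbers below are illustrative placeholders, not the GOR V statistics, and real GOR uses information-theoretic scores over a 17-residue window:

```python
# Toy propensity table (illustrative values only):
PROPENSITY = {
    "A": {"H": 1.4, "E": 0.8, "C": 0.8},
    "V": {"H": 0.9, "E": 1.6, "C": 0.6},
    "G": {"H": 0.5, "E": 0.7, "C": 1.6},
    "L": {"H": 1.3, "E": 1.1, "C": 0.6},
}

def predict_ss(seq, half_window=2):
    """GOR-style prediction: score each residue by summing propensities
    over a sliding window centred on it, then take the best state."""
    pred = []
    for i in range(len(seq)):
        totals = {"H": 0.0, "E": 0.0, "C": 0.0}
        for j in range(max(0, i - half_window),
                       min(len(seq), i + half_window + 1)):
            for state, score in PROPENSITY[seq[j]].items():
                totals[state] += score
        pred.append(max(totals, key=totals.get))
    return "".join(pred)

print(predict_ss("AALLVVGG"))
```

CDM's hierarchy then sits on top of such per-residue calls: FDM's fragment-derived prediction is preferred where similar fragments exist, with the GOR-style call as the fallback.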

  7. An empirically based steady state friction law and implications for fault stability

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Nielsen, S.; Violay, M.; Di Toro, G.

    2016-04-01

    Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1-20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.
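
For context, the classic steady-state rate-and-state friction law that the proposed MFL modifies, and the critical stiffness it implies, can be written down directly. The parameter values below are textbook-style placeholders, not the paper's MFL fit:

```python
import math

def mu_steady_state(v, mu0=0.6, a=0.010, b=0.014, v0=1e-6):
    """Classic steady-state rate-and-state friction (Dieterich/Ruina):
    mu_ss(V) = mu0 + (a - b) * ln(V / V0).
    With a < b the fault is velocity weakening at steady state."""
    return mu0 + (a - b) * math.log(v / v0)

def critical_stiffness(sigma_n, a=0.010, b=0.014, d_c=1e-5):
    """k_c = (b - a) * sigma_n / D_c: a fault patch slips unstably when
    the loading stiffness k falls below this critical value."""
    return (b - a) * sigma_n / d_c

print(mu_steady_state(1e-4))     # friction coefficient at V = 0.1 mm/s
print(critical_stiffness(50e6))  # Pa/m at 50 MPa normal stress
```

The MFL's contribution is precisely that the effective (b - a) is not constant: the enhanced weakening it describes in the 1-20 cm/s range raises the critical stiffness there, creating the peak of potential instability mentioned in the abstract.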

  8. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    Electrocardiogram (ECG) signal is a nonlinear and non-stationary weak signal which reflects whether the heart is functioning normally or abnormally. ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes the recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals like the ECG. EMD combined with other algorithms is a good solution to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.
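
The core of EMD is sifting: bracket the signal between envelopes through its maxima and minima, and subtract the envelope mean; iterating this until the mean is near zero yields the first intrinsic mode function (IMF). The sketch below shows a single, simplified sifting pass using piecewise-linear envelopes (real EMD uses cubic splines and iterates to extract successive IMFs):

```python
def local_extrema(x):
    """Indices of interior local maxima and minima."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            maxima.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            minima.append(i)
    return maxima, minima

def envelope(x, knots):
    """Piecewise-linear envelope through (knot, x[knot]) points,
    extended flat at both ends (real EMD uses cubic splines)."""
    pts = ([(0, x[knots[0]])] + [(k, x[k]) for k in knots]
           + [(len(x) - 1, x[knots[-1]])])
    env = [0.0] * len(x)
    for (i0, y0), (i1, y1) in zip(pts, pts[1:]):
        for i in range(i0, i1 + 1):
            t = (i - i0) / (i1 - i0) if i1 > i0 else 0.0
            env[i] = y0 + t * (y1 - y0)
    return env

def sift_once(x):
    """One sifting pass: subtract the mean of the two envelopes."""
    maxima, minima = local_extrema(x)
    if len(maxima) < 2 or len(minima) < 2:
        return list(x)  # too few extrema: x is a residual trend
    upper = envelope(x, maxima)
    lower = envelope(x, minima)
    return [xi - (u + l) / 2.0 for xi, u, l in zip(x, upper, lower)]
```

Denoising then works by dropping or thresholding the IMFs that carry the noise (the highest-frequency IMFs for high-frequency noise, the lowest for baseline wander) and summing the rest.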

  9. Empirical mode decomposition-based motion artifact correction method for functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Gu, Yue; Han, Junxia; Liang, Zhenhu; Yan, Jiaqing; Li, Zheng; Li, Xiaoli

    2016-01-01

    Functional near-infrared spectroscopy (fNIRS) is a promising technique for monitoring brain activity. However, it is sensitive to motion artifacts. Many methods have been developed for motion correction, such as spline interpolation, wavelet filtering, and kurtosis-based wavelet filtering. We propose a motion correction method based on empirical mode decomposition (EMD), which is applied to segments of data identified as having motion artifacts. The EMD method is adaptive, data-driven, and well suited for nonstationary data. To test the performance of the proposed EMD method and to compare it with other motion correction methods, we used simulated hemodynamic responses added to real resting-state fNIRS data. The EMD method reduced mean squared error in 79% of channels and increased signal-to-noise ratio in 78% of channels. Moreover, it produced the highest Pearson's correlation coefficient between the recovered signal and the original signal, significantly better than the comparison methods (p<0.01, paired t-test). These results indicate that the proposed EMD method is a first-choice method for motion artifact correction in fNIRS.

  10. An empirically based steady state friction law and implications for fault stability

    PubMed Central

    Nielsen, S.; Violay, M.; Di Toro, G.

    2016-01-01

    Abstract Empirically based rate‐and‐state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1–20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest. PMID:27667875

  11. Feasibility of an empirically based program for parents of preschoolers with autism spectrum disorder.

    PubMed

    Dababnah, Sarah; Parish, Susan L

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, The Incredible Years, tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the intervention. Quantitative assessments of the program revealed fidelity was generally maintained, with the exception of program-specific videos. Qualitative data from individual post-intervention interviews reported parents benefited most from child emotion regulation strategies, play-based child behavior skills, parent stress management, social support, and visual resources. More work is needed to further refine the program to address parent self-care, partner relationships, and the diverse behavioral and communication challenges of children across the autism spectrum. Furthermore, parent access and retention could potentially be increased by providing in-home childcare vouchers and a range of times and locations in which to offer the program. The findings suggest The Incredible Years is a feasible intervention for parents seeking additional support for child- and family-related challenges and offers guidance to those communities currently using The Incredible Years or other related parenting programs with families of children with autism spectrum disorder.

  12. Phospholipid-based nonlamellar mesophases for delivery systems: bridging the gap between empirical and rational design.

    PubMed

    Martiel, Isabelle; Sagalowicz, Laurent; Mezzenga, Raffaele

    2014-07-01

    Phospholipids are ubiquitous cell membrane components and relatively well-accepted ingredients due to their natural origin. Phosphatidylcholine (PC) in particular offers a promising alternative to monoglycerides for lyotropic liquid crystalline (LLC) delivery system applications in the food, cosmetics and pharmaceutical industries, provided its strong tendency to form zero-mean curvature lamellar mesophases in water can be overcome. Higher negative curvatures are usually reached through the addition of a third lipid component, forming a ternary diagram phospholipid/water/oil. The initial part of this work summarizes the potential advantages and the challenges of phospholipid-based delivery system applications. In the next part, various ternary PC/water/oil systems are discussed, with a special emphasis on the PC/water/cyclohexane and PC/water/α-tocopherol systems. We report that R-(+)-limonene has a quantitatively similar effect as cyclohexane. The last part is devoted to the theoretical interpretation of the observed phase behaviors. A fruitful parallel is drawn with PC polymer-like reverse micelles, leading to a thermodynamic description in terms of interfacial bending energy. Investigations at the molecular level are reviewed to help in bridging the empirical and theoretical approaches. Predictive rules are finally derived from this wide-ranging overview, thereby opening the way to a future rational design of PC-based LLC delivery systems.

  13. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation.

    PubMed

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-08-16

    Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method to reconstruct prompt, informed strain/stress responses at the hot spots of structures based on strain measurements at remote locations. The structural responses measured by a usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Then, two numerical examples (a two-span beam and a 19956-degree-of-freedom simplified airfoil) are used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam commonly used in air vehicles and spacecraft. The experiments collect the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions and a detailed error analysis is also provided.

  14. Enhancement of lung sounds based on empirical mode decomposition and Fourier transform algorithm.

    PubMed

    Mondal, Ashok; Banerjee, Poulami; Somkuwar, Ajay

    2017-02-01

    Heart sound (HS) signals always interfere during the recording of lung sound (LS) signals. This obscures the features of LS signals and creates confusion about pathological states, if any, of the lungs. In this work, a new method is proposed for reduction of heart sound interference, based on the empirical mode decomposition (EMD) technique and a prediction algorithm. In this approach, first the mixed signal is split into several components in terms of intrinsic mode functions (IMFs). Thereafter, HS-included segments are localized and removed from them. The missing values of the gap thus produced are predicted by a new Fast Fourier Transform (FFT) based prediction algorithm, and the time domain LS signal is reconstructed by taking an inverse FFT of the estimated missing values. The experiments have been conducted on simulated and recorded HS-corrupted LS signals at three different flow rates and various SNR levels. The performance of the proposed method is evaluated by qualitative and quantitative analysis of the results. It is found that the proposed method is superior to the baseline method in terms of quantitative and qualitative measurement. The developed method gives better results compared to the baseline method for different SNR levels. Our method gives a cross correlation index (CCI) of 0.9488, signal to deviation ratio (SDR) of 9.8262, and normalized maximum amplitude error (NMAE) of 26.94 for a 0 dB SNR value. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
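
The idea of Fourier-based gap prediction can be illustrated with periodic extension: take the DFT of a clean segment just before the gap and evaluate its Fourier series at the gap's sample indices. This is only the core idea; the paper's actual estimator is more elaborate. A pure-Python sketch (a naive O(n^2) DFT stands in for an FFT):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (an FFT computes the same)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def predict_gap(clean_segment, gap_len):
    """Estimate missing samples by evaluating the Fourier series of a
    clean preceding segment at sample indices continuing past it,
    i.e. the segment's periodic extension into the gap."""
    n = len(clean_segment)
    coeffs = dft(clean_segment)
    out = []
    for m in range(n, n + gap_len):
        val = sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * m / n)
                  for k in range(n)) / n
        out.append(val.real)
    return out

# A 4-sample sinusoid extends periodically into a 2-sample gap:
print(predict_gap([0.0, 1.0, 0.0, -1.0], 2))
```

For quasi-periodic signals such as lung sounds at a steady flow rate, this kind of extension gives a plausible fill; the reconstructed values are then merged back into each IMF before the signal is recomposed.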

  15. PDE-based nonlinear diffusion techniques for denoising scientific and industrial images: an empirical study

    NASA Astrophysics Data System (ADS)

    Weeratunga, Sisira K.; Kamath, Chandrika

    2002-05-01

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, we focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. We complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. We explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. We also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. Our empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
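
The Perona-Malik scheme referenced above is compact enough to sketch in one dimension: an explicit time-stepping of u_t = div(g(|∇u|)∇u), where the diffusivity g shrinks toward zero at large gradients so noise is smoothed while edges survive. The signal below is hypothetical:

```python
def perona_malik_1d(u, iters=20, dt=0.2, k=2.0):
    """Explicit 1D Perona-Malik diffusion with diffusivity
    g(s) = 1 / (1 + (s/k)^2): smooth regions are averaged while
    sharp edges (large gradients, small g) are preserved.
    Boundary samples are held fixed."""
    u = list(u)
    for _ in range(iters):
        new = list(u)
        for i in range(1, len(u) - 1):
            de = u[i + 1] - u[i]                   # east gradient
            dw = u[i] - u[i - 1]                   # west gradient
            ge = 1.0 / (1.0 + (de / k) ** 2)
            gw = 1.0 / (1.0 + (dw / k) ** 2)
            new[i] = u[i] + dt * (ge * de - gw * dw)
        u = new
    return u

# A noisy step edge: diffusion flattens the noise but keeps the jump.
signal = [0.2, -0.1, 0.1, 0.0, 10.1, 9.9, 10.2, 10.0]
print([round(v, 2) for v in perona_malik_1d(signal)])
```

The parameters the paper studies map directly onto this sketch: the diffusivity function g, the explicit step `dt` (an implicit method would allow larger steps), and the one-sided differences used to discretize the operator.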

  16. An empirically based steady state friction law and implications for fault stability.

    PubMed

    Spagnuolo, E; Nielsen, S; Violay, M; Di Toro, G

    2016-04-16

    Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1-20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.

  17. Polarizable Empirical Force Field for Nitrogen-containing Heteroaromatic Compounds Based on the Classical Drude Oscillator

    PubMed Central

    Lopes, Pedro E. M.; Lamoureux, Guillaume; MacKerell, Alexander D.

    2009-01-01

    The polarizable empirical CHARMM force field based on the classical Drude oscillator has been extended to the nitrogen-containing heteroaromatic compounds pyridine, pyrimidine, pyrrole, imidazole, indole and purine. Initial parameters for the 6-membered rings were based on benzene with non-bond parameter optimization focused on the nitrogen atoms and adjacent carbons and attached hydrogens. In the case of 5-membered rings, parameters were first developed for imidazole and transferred to pyrrole. Optimization of all parameters was performed against an extensive set of quantum mechanical and experimental data. Ab initio data was used for determination of the initial electrostatic parameters, the vibrational analysis, and in the optimization of the relative magnitudes of the Lennard-Jones parameters, through computations of the interactions of dimers of model compounds, model compound-water interactions and interactions of rare gases with model compounds. The absolute values of the Lennard-Jones parameters were determined targeting experimental heats of vaporization, molecular volumes, heats of sublimation, crystal lattice parameters and free energies of hydration. Final scaling of the polarizabilities from the gas phase values by 0.85 was determined by reproduction of the dielectric constants of pyridine and pyrrole. The developed parameter set was extensively validated against additional experimental data such as diffusion constants, heat capacities and isothermal compressibilities, including data as a function of temperature. PMID:19090564

  18. Biomarker-based strategy for early discontinuation of empirical antifungal treatment in critically ill patients: a randomized controlled trial.

    PubMed

    Rouzé, Anahita; Loridant, Séverine; Poissy, Julien; Dervaux, Benoit; Sendid, Boualem; Cornu, Marjorie; Nseir, Saad

    2017-09-22

    The aim of this study was to determine the impact of a biomarker-based strategy on early discontinuation of empirical antifungal treatment. Prospective randomized controlled single-center unblinded study, performed in a mixed ICU. A total of 110 patients were randomly assigned to a strategy in which empirical antifungal treatment duration was determined by (1,3)-β-D-glucan, mannan, and anti-mannan serum assays, performed on day 0 and day 4; or to a routine care strategy, based on international guidelines, which recommend 14 days of treatment. In the biomarker group, early stop recommendation was determined using an algorithm based on the results of biomarkers. The primary outcome was the percentage of survivors discontinuing empirical antifungal treatment early, defined as a discontinuation strictly before day 7. A total of 109 patients were analyzed (one patient withdraw consent). Empirical antifungal treatment was discontinued early in 29 out of 54 patients in the biomarker strategy group, compared with one patient out of 55 in the routine strategy group [54% vs 2%, p < 0.001, OR (95% CI) 62.6 (8.1-486)]. Total duration of antifungal treatment was significantly shorter in the biomarker strategy compared with routine strategy [median (IQR) 6 (4-13) vs 13 (12-14) days, p < 0.0001). No significant difference was found in the percentage of patients with subsequent proven invasive Candida infection, mechanical ventilation-free days, length of ICU stay, cost, and ICU mortality between the two study groups. The use of a biomarker-based strategy increased the percentage of early discontinuation of empirical antifungal treatment among critically ill patients with suspected invasive Candida infection. These results confirm previous findings suggesting that early discontinuation of empirical antifungal treatment had no negative impact on outcome. However, further studies are needed to confirm the safety of this strategy. This trial was registered at Clinical

  19. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or “QCP”), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology in identifying clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model is seen to disagree while the
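
    The core idea of a similarity-based model, as described above, is to reconstruct each new observation as a similarity-weighted combination of remembered normal-state vectors and to alarm on a large reconstruction residual. The sketch below assumes a Gaussian kernel, a fixed bandwidth, and a fixed residual threshold; these choices are illustrative and not the commercial SBM formulation.

```python
import numpy as np

def sbm_estimate(memory, x, bandwidth=1.0):
    """Estimate the expected 'normal' observation as a similarity-weighted
    combination of remembered normal-state vectors (rows of `memory`).
    Kernel choice and bandwidth are illustrative assumptions."""
    d2 = np.sum((memory - x) ** 2, axis=1)      # squared distance to each memory vector
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))    # Gaussian kernel similarity
    w /= w.sum()
    return w @ memory                           # weighted reconstruction of x

def residual_alarm(memory, x, threshold=2.0):
    """Flag possible deterioration when the reconstruction residual is large."""
    r = np.linalg.norm(x - sbm_estimate(memory, x))
    return r > threshold
```

An observation lying inside the envelope of remembered normal states reconstructs almost exactly, while an observation far from every remembered state leaves a large residual and triggers the alarm.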

  20. Benchmarking of a T-wave alternans detection method based on empirical mode decomposition.

    PubMed

    Blanco-Velasco, Manuel; Goya-Esteban, Rebeca; Cruz-Roldán, Fernando; García-Alberola, Arcadi; Rojo-Álvarez, José Luis

    2017-07-01

    T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), used here along with the spectral method (SM), is also tackled. The proposed test bed system is based on the following guidelines: (1) use of open-source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Both sensitivity (Se) and specificity (Sp) are analyzed separately. A nonparametric hypothesis test, based on bootstrap resampling, is also used to determine whether the presence of the EMD block actually improves the performance. The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding sensitivity, the EMD method also outperforms the SM in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. The proposed test setting, designed to analyze the performance, guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias. Copyright © 2017 Elsevier B.V. All rights reserved.
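
    The spectral method referenced above detects alternans as power at 0.5 cycles/beat in a beat-to-beat amplitude series. A minimal sketch follows; the noise-band edges and the k-score form are common conventions but are assumptions here, not the paper's exact configuration.

```python
import numpy as np

def twa_kscore(beat_series):
    """Spectral-method sketch for TWA detection. `beat_series` holds the
    amplitude of one ST-T sample across consecutive beats. Alternans
    appears at 0.5 cycles/beat; the k-score compares that bin with a
    noise reference band (band edges are illustrative assumptions)."""
    n = len(beat_series)
    spec = np.abs(np.fft.rfft(beat_series - np.mean(beat_series))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0)               # cycles per beat
    p_alt = spec[np.argmin(np.abs(freqs - 0.5))]    # power at 0.5 cycles/beat
    noise = spec[(freqs >= 0.40) & (freqs < 0.46)]  # noise reference band
    return (p_alt - noise.mean()) / noise.std()

# An alternating (ABAB...) series plus small noise should give a large k-score
rng = np.random.default_rng(0)
beats = 0.05 * np.array([1, -1] * 64) + 0.005 * rng.standard_normal(128)
```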

  1. Evaluating the compatibility of physics-based deterministic synthetic ground motion with empirical GMPE

    NASA Astrophysics Data System (ADS)

    Baumann, C.; Dalguer, L. A.

    2012-12-01

    Recent development of deterministic physics-based numerical simulations of earthquakes has contributed to substantial advances in our understanding of different aspects of the earthquake mechanism and near-source ground motion. These models have great potential for identifying and predicting the variability of near-source ground motions dominated by source and/or geological effects. These advances have led to increased interest in using suites of physics-based models for reliable prediction of ground motion of future earthquakes for seismic hazard assessment and risk mitigation, particularly in areas with few recorded ground motions. But before using synthetic ground motion, it is important to evaluate the reliability of deterministic synthetic ground motions, particularly their upper frequency limit. Current engineering practice usually uses ground motion quantities estimated from empirical Ground Motion Prediction Equations (GMPEs), such as peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), and spectral ordinates, as input to assess building response for the seismic safety of future and existing structures. It is therefore intuitive and evident to verify the compatibility of synthetic ground motions with current empirical GMPEs. In this study we attempt to do so for a suite of deterministic ground motion simulations generated by earthquake dynamic rupture models. We focus mainly on determining the upper frequency limit at which the synthetic ground motions are compatible with GMPEs. For that purpose we have generated a suite of earthquake dynamic rupture models in a layered 1D velocity structure. The simulations include 360 dynamic rupture models with moment magnitudes in the range of 5.5-7, for three styles of faulting (reverse, normal, and strike-slip), for both buried faults and surface-rupturing faults. Normal stress and frictional strength are depth- and non-depth-dependent. Initial stress distribution follows

  2. The WASP and NGTS ground-based transit surveys

    NASA Astrophysics Data System (ADS)

    Wheatley, P. J.

    2015-10-01

    I will review the current status of ground-based exoplanet transit surveys, using the Wide Angle Search for Planets (WASP) and the Next Generation Transit Survey (NGTS) as specific examples. I will describe the methods employed by these surveys and show how planets from Neptune to Jupiter-size are detected and confirmed around bright stars. I will also give an overview of the remarkably wide range of exoplanet characterization that is made possible with large-telescope follow up of these bright transiting systems. This characterization includes bulk composition and spin-orbit alignment, as well as atmospheric properties such as thermal structure, composition and dynamics. Finally, I will outline how ground-based photometric studies of transiting planets will evolve with the advent of new space-based surveys such as TESS and PLATO.

  3. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    ERIC Educational Resources Information Center

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  4. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  5. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  6. Using an empirical and rule-based modeling approach to map cause of disturbance in U.S

    Treesearch

    Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman

    2015-01-01

    Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...

  8. The Potential for Empirically Based Estimates of Expected Progress for Students with Learning Disabilities: Legal and Conceptual Issues.

    ERIC Educational Resources Information Center

    Stone, C. Addison; Doane, J. Abram

    2001-01-01

    The purpose of this article is to spark discussion regarding the value and feasibility of empirically based procedures for goal setting and evaluation of educational services. Recent legal decisions and policy debates point to the need for clearer criteria in decisions regarding appropriate educational services. Possible roles for school…

  10. PowerPoint-Based Lectures in Business Education: An Empirical Investigation of Student-Perceived Novelty and Effectiveness

    ERIC Educational Resources Information Center

    Burke, Lisa A.; James, Karen E.

    2008-01-01

    The use of PowerPoint (PPT)-based lectures in business classes is prevalent, yet it remains empirically understudied in business education research. The authors investigate whether students in the contemporary business classroom view PPT as a novel stimulus and whether these perceptions of novelty are related to students' self-assessment of…

  11. Empirical Differences in Omission Tendency and Reading Ability in PISA: An Application of Tree-Based Item Response Models

    ERIC Educational Resources Information Center

    Okumura, Taichi

    2014-01-01

    This study examined the empirical differences between the tendency to omit items and reading ability by applying tree-based item response (IRTree) models to the Japanese data of the Programme for International Student Assessment (PISA) held in 2009. For this purpose, existing IRTree models were expanded to contain predictors and to handle…

  13. Robust multitask learning with three-dimensional empirical mode decomposition-based features for hyperspectral classification

    NASA Astrophysics Data System (ADS)

    He, Zhi; Liu, Lin

    2016-11-01

    Empirical mode decomposition (EMD) and its variants have recently been applied for hyperspectral image (HSI) classification due to their ability to extract useful features from the original HSI. However, it remains a challenging task to effectively exploit the spectral-spatial information by the traditional vector or image-based methods. In this paper, a three-dimensional (3D) extension of EMD (3D-EMD) is proposed to naturally treat the HSI as a cube and decompose the HSI into varying oscillations (i.e. 3D intrinsic mode functions (3D-IMFs)). To achieve fast 3D-EMD implementation, 3D Delaunay triangulation (3D-DT) is utilized to determine the distances of extrema, while separable filters are adopted to generate the envelopes. Taking the extracted 3D-IMFs as features of different tasks, robust multitask learning (RMTL) is further proposed for HSI classification. In RMTL, pairs of low-rank and sparse structures are formulated by the trace norm and the l1,2-norm to capture task relatedness and specificity, respectively. Moreover, the optimization problems of RMTL can be efficiently solved by the inexact augmented Lagrangian method (IALM). Compared with several state-of-the-art feature extraction and classification methods, the experimental results conducted on three benchmark data sets demonstrate the superiority of the proposed methods.

  14. Dispelling myths about dissociative identity disorder treatment: an empirically based approach.

    PubMed

    Brand, Bethany L; Loewenstein, Richard J; Spiegel, David

    2014-01-01

    Some claim that treatment for dissociative identity disorder (DID) is harmful. Others maintain that the available data support the view that psychotherapy is helpful. We review the empirical support for both arguments. Current evidence supports the conclusion that phasic treatment consistent with expert consensus guidelines is associated with improvements in a wide range of DID patients' symptoms and functioning, decreased rates of hospitalization, and reduced costs of treatment. Research indicates that poor outcome is associated with treatment that does not specifically involve direct engagement with DID self-states to repair identity fragmentation and to decrease dissociative amnesia. The evidence demonstrates that carefully staged trauma-focused psychotherapy for DID results in improvement, whereas dissociative symptoms persist when not specifically targeted in treatment. The claims that DID treatment is harmful are based on anecdotal cases, opinion pieces, reports of damage that are not substantiated in the scientific literature, misrepresentations of the data, and misunderstandings about DID treatment and the phenomenology of DID. Given the severe symptomatology and disability associated with DID, iatrogenic harm is far more likely to come from depriving DID patients of treatment that is consistent with expert consensus, treatment guidelines, and current research.

  15. Ship classification using nonlinear features of radiated sound: an approach based on empirical mode decomposition.

    PubMed

    Bao, Fei; Li, Chen; Wang, Xinlong; Wang, Qingfu; Du, Shuanping

    2010-07-01

    Classification of ship-radiated underwater sound is one of the most important and challenging subjects in underwater acoustical signal processing. An approach to ship classification is proposed in this work based on analysis of ship-radiated acoustical noise in subspaces of intrinsic mode functions attained via ensemble empirical mode decomposition. It is shown that detection and acquisition of stable and reliable nonlinear features become practically feasible by nonlinear analysis of the time series of individual decomposed components, each of which is simple enough and well represents an oscillatory mode of ship dynamics. Surrogate and nonlinear predictability analyses are conducted to probe and measure the nonlinearity and regularity. The results of both methods, which verify each other, substantiate that ship-radiated noises contain components with deterministic nonlinear features that serve well for efficient classification of ships. The approach perhaps opens an alternative avenue toward object classification and identification. It may also impart a new view of signals as complex as ship-radiated sound.

  16. Percentile-based Empirical Distribution Function Estimates for Performance Evaluation of Healthcare Providers

    PubMed Central

    Paddock, Susan M.; Louis, Thomas A.

    2010-01-01

    Summary: Hierarchical models are widely used to characterize the performance of individual healthcare providers. However, little attention has been devoted to system-wide performance evaluations, the goals of which include identifying extreme (e.g., top 10%) provider performance and developing statistical benchmarks to define high-quality care. Obtaining optimal estimates of these quantities requires estimating the empirical distribution function (EDF) of provider-specific parameters that generate the dataset under consideration. However, the difficulty of obtaining uncertainty bounds for a square-error loss minimizing EDF estimate has hindered its use in system-wide performance evaluations. We therefore develop and study a percentile-based EDF estimate for univariate provider-specific parameters. We compute order statistics of samples drawn from the posterior distribution of provider-specific parameters to obtain relevant uncertainty assessments of an EDF estimate and its features, such as thresholds and percentiles. We apply our method to data from the Medicare End Stage Renal Disease (ESRD) Program, a health insurance program for people with irreversible kidney failure. We highlight the risk of misclassifying providers as exceptionally good or poor performers when uncertainty in statistical benchmark estimates is ignored. Given the high stakes of performance evaluations, statistical benchmarks should be accompanied by precision estimates. PMID:21918583
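
    The order-statistic construction described above can be sketched directly: sort the provider-specific parameters within each posterior draw, then summarize each order statistic across draws. The function names, the (n_draws, n_providers) layout, and the 95% interval choice are illustrative assumptions.

```python
import numpy as np

def edf_percentile_estimate(posterior_draws, q=(2.5, 50, 97.5)):
    """Percentile-based EDF sketch. `posterior_draws` is an array of shape
    (n_draws, n_providers); each row is one posterior draw of all
    provider-specific parameters. Returns percentiles (across draws) of
    each per-draw order statistic."""
    order_stats = np.sort(posterior_draws, axis=1)   # per-draw order statistics
    return {p: np.percentile(order_stats, p, axis=0) for p in q}

def benchmark_threshold(posterior_draws, top_frac=0.10):
    """Point estimate and 95% interval for a 'top 10%' benchmark,
    i.e. the (1 - top_frac) quantile of the provider EDF."""
    k = int(np.ceil((1 - top_frac) * posterior_draws.shape[1])) - 1
    stat = np.sort(posterior_draws, axis=1)[:, k]
    return np.median(stat), np.percentile(stat, [2.5, 97.5])
```

Reporting the interval alongside the point estimate is exactly what guards against misclassifying providers near the benchmark.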

  18. Empirical prediction of Indian summer monsoon rainfall with different lead periods based on global SST anomalies

    NASA Astrophysics Data System (ADS)

    Pai, D. S.; Rajeevan, M.

    2006-02-01

    The main objective of this study was to develop empirical models with different seasonal lead time periods for the long-range prediction of seasonal (June to September) Indian summer monsoon rainfall (ISMR). For this purpose, 13 predictors having significant and stable relationships with ISMR were derived by correlation analysis of global grid-point seasonal Sea-Surface Temperature (SST) anomalies and the tendency in the SST anomalies. The time lags of the seasonal SST anomalies were varied from 1 season to 4 years behind the reference monsoon season. The basic SST data set used was the monthly NOAA Extended Reconstructed Global SST (ERSST) data at 2° × 2° spatial resolution for the period 1951-2003. The time lags of the 13 predictors derived from various areas of all three tropical ocean basins (Indian, Pacific, and Atlantic) varied from 1 season to 3 years. Based on these inter-correlated predictors, 3 predictor subsets A, B, and C were formed with prediction lead times of 0, 1, and 2 seasons, respectively, from the beginning of the monsoon season. The selected principal components (PCs) of these predictor sets were used as the input parameters for models A, B, and C, respectively. The model development period was 1955-1984. The correct model size was derived using the all-possible-regressions procedure and Mallows’ Cp statistic.
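
    The modeling pipeline above (standardize correlated predictors, keep the leading principal components, regress rainfall on them) can be sketched as follows. The predictor construction, number of PCs, and function names are illustrative assumptions; the paper selects model size with the all-possible-regressions procedure rather than a fixed PC count.

```python
import numpy as np

def fit_pc_regression(X, y, n_pcs=3):
    """Principal-component regression sketch: project standardized
    predictors (e.g. lagged SST-anomaly indices) onto their leading PCs
    and regress the target (seasonal rainfall) on them by OLS."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd
    # principal directions via SVD of the standardized predictor matrix
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    pcs = Z @ Vt[:n_pcs].T
    A = np.column_stack([np.ones(len(y)), pcs])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return {"mu": mu, "sd": sd, "Vt": Vt[:n_pcs], "beta": beta}

def predict(model, X_new):
    Z = (X_new - model["mu"]) / model["sd"]
    pcs = Z @ model["Vt"].T
    return np.column_stack([np.ones(len(Z)), pcs]) @ model["beta"]
```

Because the 13 predictors are inter-correlated, a few PCs typically carry most of the usable signal, which is why the PCs rather than the raw predictors serve as model inputs.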

  19. Quantitative analysis of breast DCE-MR images based on ICA and an empirical model

    NASA Astrophysics Data System (ADS)

    Goebl, Sebastian; Plant, Claudia; Lobbes, Marc; Meyer-Bäse, Anke

    2012-06-01

    DCE-MRI represents an important tool for detecting subtle kinetic changes in breast lesion tissue. Non-mass-like breast lesions exhibit atypical dynamical behavior compared to mass-like lesions and pose a challenge to a computer-aided diagnosis system. Yet the correct diagnosis of these tumors represents an important step towards early prevention. We apply Independent Component Analysis (ICA) to DCE-MRI images to extract kinetic tumor curves. We use a known empirical mathematical model to automatically identify the tumor curves from the ICA result. Filtering out noise, our technique is superior to traditional ROI-based analysis in capturing the kinetic characteristics of the tumor curves. These typical characteristics enable us to determine the optimal number of independent components for ICA. Another benefit of our method is the segmentation of tumor tissue, which is superior to segmentation from MR subtraction images. Our aim is an optimal extraction of tumor curves to provide a better basis for kinetic analysis and to distinguish between benign and malignant lesions, especially for the challenging non-mass-like breast lesions.

  20. Empirical Study of User Preferences Based on Rating Data of Movies.

    PubMed

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings.

  2. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has an obvious effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and better support interpretation. Its tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
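
    The 1D grid SOM used for the clustering step can be sketched minimally: each grid node holds a prototype waveform, and each training sample pulls its best-matching node and that node's grid neighbours toward it. All hyperparameters (node count, learning-rate and neighbourhood schedules) are illustrative assumptions, and the EMD de-noising step is omitted here.

```python
import numpy as np

def train_1d_som(data, n_nodes=8, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D grid SOM sketch for waveform clustering."""
    rng = np.random.default_rng(seed)
    # initialize prototypes from samples spread across the data set
    W = data[np.linspace(0, len(data) - 1, n_nodes).astype(int)].copy()
    grid = np.arange(n_nodes)
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)                       # decaying learning rate
        sigma = max(sigma0 * (1 - e / epochs), 0.5)       # shrinking neighbourhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.sum((W - x) ** 2, axis=1))  # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)                 # pull bmu and neighbours
    return W

def classify(W, data):
    """Assign each waveform to its best-matching SOM node (a facies label)."""
    return np.array([np.argmin(np.sum((W - x) ** 2, axis=1)) for x in data])
```

After training, waveforms from distinct facies map to different regions of the node grid, and the node index serves as the facies label.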

  3. Empirical mode decomposition of digital mammograms for the statistical based characterization of architectural distortion.

    PubMed

    Zyout, Imad; Togneri, Roberto

    2015-01-01

    Among the different and common mammographic signs of early-stage breast cancer, architectural distortion is the most difficult to identify. In this paper, we propose a new multiscale statistical texture analysis to characterize the presence of architectural distortion by distinguishing between textural patterns of architectural distortion and normal breast parenchyma. The proposed approach first applies the bidimensional empirical mode decomposition algorithm to decompose each mammographic region of interest into a set of adaptive and data-driven two-dimensional intrinsic mode function (IMF) layers that capture details or high-frequency oscillations of the input image. Then, a model-based approach is applied to IMF histograms to acquire first-order statistics. The normalized entropy measure is also computed from each IMF and used as a complementary textural feature for the recognition of architectural distortion patterns. For evaluating the proposed architectural distortion characterization approach, we used a mammographic dataset of 187 true positive regions (i.e. depicting architectural distortion) and 887 true negative (normal parenchyma) regions extracted from the DDSM database. Using the proposed multiscale textural features and a nonlinear support vector machine classifier, the best classification performance achieved, in terms of the area under the receiver operating characteristic curve (Az value), was 0.88.
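
    The normalized entropy feature mentioned above is straightforward to compute from an IMF's amplitude histogram. A minimal sketch, where the bin count is an illustrative assumption:

```python
import numpy as np

def normalized_entropy(imf, n_bins=64):
    """Normalized Shannon entropy of an IMF's amplitude histogram,
    used as a texture feature. Dividing by log2(n_bins) maps the
    value into [0, 1] (0 = one bin occupied, 1 = uniform histogram)."""
    hist, _ = np.histogram(imf, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]                                  # drop empty bins (0·log 0 := 0)
    return float(-(p * np.log2(p)).sum() / np.log2(n_bins))
```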

  4. Polarizable Empirical Force Field for Aromatic Compounds Based on the Classical Drude Oscillator

    PubMed Central

    Lopes, Pedro E. M.; Lamoureux, Guillaume; Roux, Benoit; MacKerell, Alexander D.

    2008-01-01

    The polarizable empirical CHARMM force field based on the classical Drude oscillator has been extended to the aromatic compounds benzene and toluene. Parameters were optimized for benzene and then transferred directly to toluene, with parameters for the methyl moiety of toluene taken from the previously published work on the alkanes. Optimization of all parameters was performed against an extensive set of quantum mechanical and experimental data. Ab initio data was used for determination of the electrostatic parameters, the vibrational analysis, and in the optimization of the relative magnitudes of the Lennard-Jones parameters. The absolute values of the Lennard-Jones parameters were determined by comparing computed and experimental heats of vaporization, molecular volumes, free energies of hydration and dielectric constants. The newly developed parameter set was extensively tested against additional experimental data such as vibrational spectra in the condensed phase, diffusion constants, heat capacities at constant pressure and isothermal compressibilities including data as a function of temperature. Moreover, the structure of liquid benzene, liquid toluene and of solutions of each in water were studied. In the case of benzene, the computed and experimental total distribution function were compared, with the developed model shown to be in excellent agreement with experiment. PMID:17388420

  5. An empirically based simulation of group foraging in the harvesting ant, Messor pergandei.

    PubMed

    Plowes, Nicola J R; Ramsch, Kai; Middendorf, Martin; Hölldobler, Bert

    2014-01-07

    We present an empirically based group model of foraging interactions in Messor pergandei, the Sonoran desert harvesting ant. M. pergandei colonies send out daily foraging columns consisting of tens of thousands of individual ants. Each day, the directions of the columns may change depending on resource availability and neighbor interactions. If neighboring columns meet, ants fight, and subsequent foraging is suppressed. M. pergandei colonies face a general problem present in many systems: dynamic spatial partitioning in a constantly changing environment while simultaneously minimizing negative competitive interactions with multiple neighbors. Our simulation model of a population of column foragers is spatially explicit and includes neighbor interactions. We study how different behavioral strategies influence resource exploitation and space use for different nest distributions and densities. Column foraging in M. pergandei is adapted to the spatial and temporal properties of the ants' natural habitat. Resource and space use is maximized at both the colony and the population level by a model with a behavioral strategy that includes learning and fast forgetting rates. Copyright © 2013 Elsevier Ltd. All rights reserved.
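    The learning and fast-forgetting strategy can be caricatured as an exponential update of per-direction preference weights. This is a toy sketch with made-up function names, rates, and values, not the authors' actual model:

```python
# Toy update of a colony's preference weights over foraging directions.
# learn_rate rewards directions that yielded food; forget_rate decays
# all weights, letting the colony re-explore after fights or depletion.
def update_weights(weights, rewards, learn_rate=0.5, forget_rate=0.3):
    return [(1 - forget_rate) * w + learn_rate * r
            for w, r in zip(weights, rewards)]

def choose_direction(weights):
    return max(range(len(weights)), key=lambda i: weights[i])

w = [1.0, 1.0, 1.0, 1.0]              # four candidate column directions
w = update_weights(w, [0, 1, 0, 0])   # food found in direction 1
w = update_weights(w, [0, 1, 0, 0])
assert choose_direction(w) == 1
w = update_weights(w, [0, -1, 0, 0])  # fight with a neighbour: suppress
```

A fast forgetting rate keeps old rewards from locking the colony onto a direction where a neighbouring column now forages.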

  6. Interdigitated silver-polymer-based antibacterial surface system activated by oligodynamic iontophoresis - an empirical characterization study.

    PubMed

    Shirwaiker, Rohan A; Wysk, Richard A; Kariyawasam, Subhashinie; Voigt, Robert C; Carrion, Hector; Nembhard, Harriet Black

    2014-02-01

    There is a pressing need to control the occurrences of nosocomial infections due to their detrimental effects on patient well-being and the rising treatment costs. To prevent the contact transmission of such infections via health-critical surfaces, a prophylactic surface system that consists of an interdigitated array of oppositely charged silver electrodes with polymer separations and utilizes oligodynamic iontophoresis has been recently developed. This paper presents a systematic study that empirically characterizes the effects of the surface system parameters on its antibacterial efficacy, and validates the system's effectiveness. In the first part of the study, a fractional factorial design of experiments (DOE) was conducted to identify the statistically significant system parameters. The data were used to develop a first-order response surface model to predict the system's antibacterial efficacy based on the input parameters. In the second part of the study, the effectiveness of the surface system was validated by evaluating it against four bacterial species responsible for several nosocomial infections - Staphylococcus aureus, Escherichia coli, Pseudomonas aeruginosa, and Enterococcus faecalis - alongside non-antibacterial polymer (acrylic) control surfaces. The system demonstrated statistically significant efficacy against all four bacteria. The results indicate that given a constant total effective surface area, the system designed with micro-scale features (minimum feature width: 20 μm) and activated by 15 μA direct current will provide the most effective antibacterial prophylaxis.
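    For the DOE part of the study, a main effect in a two-level factorial design is simply the difference between the mean responses at the factor's high and low levels. A minimal sketch with hypothetical data (the factor coding and efficacy values are illustrative, not the study's measurements):

```python
# Main-effect estimate in a two-level factorial design: the difference
# between the mean response at the factor's high (+1) and low (-1) levels.
def main_effect(levels, responses):
    high = [y for x, y in zip(levels, responses) if x == +1]
    low = [y for x, y in zip(levels, responses) if x == -1]
    return sum(high) / len(high) - sum(low) / len(low)

# Hypothetical antibacterial efficacy (log reduction) for four runs,
# coded by the level of one factor (e.g. activation current):
current_level = [-1, -1, +1, +1]
efficacy = [1.2, 1.0, 2.9, 3.1]
effect = main_effect(current_level, efficacy)  # ~ 1.9 (3.0 - 1.1)
```

Effects estimated this way feed directly into a first-order response surface model, whose coefficients are half the corresponding main effects.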

  7. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    NASA Astrophysics Data System (ADS)

    Le, Chengfeng; Lehrter, John C.; Hu, Chuanmin; Obenour, Daniel R.

    2016-03-01

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L⁻¹) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and volume were related to Moderate Resolution Imaging Spectroradiometer-derived monthly estimates of river plume area (km²) and average inner-shelf chlorophyll a concentration (Chl a, mg m⁻³). River plume area in June was negatively related to midsummer hypoxic area (km²) and volume (km³), while July inner-shelf Chl a was positively related to hypoxic area and volume. Multiple regression models using river plume area and Chl a as independent variables accounted for most of the variability in hypoxic area (R² = 0.92) and volume (R² = 0.89). These models explain more variation in hypoxic area than models using Mississippi River nutrient loads as independent variables. The results also support the hypothesis that confinement of the river plume to the inner shelf is an important mechanism controlling hypoxic area and volume in this region.
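    A two-predictor multiple regression of the kind described can be sketched with ordinary least squares via the normal equations. The data below are synthetic (generated from made-up coefficients), not the study's measurements:

```python
# Ordinary least squares for y = b0 + b1*x1 + b2*x2 via the normal
# equations (X'X) b = X'y, solved with Gauss-Jordan elimination.
def ols2(x1, x2, y):
    X = [[1.0, a, b] for a, b in zip(x1, x2)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * v for r, v in zip(X, y)) for i in range(3)]
    M = [row + [t] for row, t in zip(XtX, Xty)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))  # partial pivot
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Synthetic example: hypoxic area decreasing with plume area and
# increasing with chlorophyll (coefficients here are made up).
plume = [10.0, 12.0, 8.0, 11.0, 9.0]
chla = [3.0, 2.5, 4.0, 2.0, 3.5]
area = [5.0 - 0.2 * p + 1.5 * c for p, c in zip(plume, chla)]
b0, b1, b2 = ols2(plume, chla, area)
```

With noiseless synthetic data the fit recovers the generating coefficients exactly, so the sketch doubles as a self-check of the solver.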

  8. Extracting and compensating for FOG vibration error based on improved empirical mode decomposition with masking signal.

    PubMed

    Chen, Xiyuan; Wang, Wei

    2017-05-01

    Vibration is an important error source in fiber-optic gyroscopes (FOGs), and extracting and compensating for vibration signals is an important way to eliminate this error and improve FOG accuracy. To better decompose the vibration signal, a new algorithm based on empirical mode decomposition (EMD) with a masking signal is proposed in this paper. The masking signal is a sinusoidal signal whose frequency and amplitude are selected using improved particle swarm optimization (PSO). The proposed algorithm is called adaptive masking EMD (AM-EMD). First, the optimal frequency value and range of the masking signal are analyzed and presented. Then, an optimal decomposition of the vibration signal is obtained by using PSO to find the optimal frequency and amplitude of the masking signal. Finally, extraction and compensation of the vibration signal are completed according to the mean values of the intrinsic mode functions (IMFs) and the correlation coefficients between the IMFs and the vibration signal. Experiments show that the new method decomposes the signal more accurately than traditional methods and yields higher compensation precision.
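    The final compensation step, selecting IMFs by their correlation with the vibration signal and subtracting them, might be sketched as below. The IMFs here are stand-in arrays rather than outputs of the masking-EMD stage, and the threshold is an arbitrary choice:

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def compensate(signal, imfs, reference, threshold=0.8):
    """Subtract the IMFs strongly correlated with the vibration reference."""
    picked = [imf for imf in imfs if abs(pearson(imf, reference)) > threshold]
    if not picked:
        return list(signal)
    return [s - sum(vals) for s, vals in zip(signal, zip(*picked))]

t = [i / 100 for i in range(200)]
vib = [0.5 * math.sin(2 * math.pi * 20 * x) for x in t]  # vibration component
drift = [0.01 * x for x in t]                            # slow FOG trend
raw = [v + d for v, d in zip(vib, drift)]
clean = compensate(raw, [vib, drift], vib)  # removes only the vibration IMF
```

The drift IMF correlates weakly with the 20 Hz reference, so only the vibration component is removed and the gyro trend survives.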

  9. Empirical likelihood-based confidence intervals for length-biased data

    PubMed Central

    Ning, J.; Qin, J.; Asgharian, M.; Shen, Y.

    2013-01-01

    Logistic or other constraints often preclude the possibility of conducting incident cohort studies. A feasible alternative in such cases is to conduct a cross-sectional prevalent cohort study for which we recruit prevalent cases, i.e. subjects who have already experienced the initiating event, say the onset of a disease. When the interest lies in estimating the lifespan between the initiating event and a terminating event, say death for instance, such subjects may be followed prospectively until the terminating event or loss to follow-up, whichever happens first. It is well known that prevalent cases have, on average, longer lifespans. As such they do not constitute a representative random sample from the target population; they comprise a biased sample. If the initiating events are generated from a stationary Poisson process, the so-called stationarity assumption, this bias is called length bias. The current literature on length-biased sampling lacks a simple method for estimating the margin of errors of commonly used summary statistics. We fill this gap using the empirical likelihood-based confidence intervals by adapting this method to right-censored length-biased survival data. Both large and small sample behaviors of these confidence intervals are studied. We illustrate our method using a set of data on survival with dementia, collected as part of the Canadian Study of Health and Aging. PMID:23027662
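    As a complement to the abstract (and distinct from its empirical-likelihood intervals), a standard point estimate under length bias is the harmonic mean, since E[1/X] = 1/μ under the length-biased density x f(x)/μ. A quick simulation sketch with made-up population values:

```python
import random

def harmonic_mean(xs):
    """Consistent estimate of the population mean from a length-biased
    sample (a standard result, not this paper's empirical-likelihood
    interval)."""
    return len(xs) / sum(1.0 / x for x in xs)

random.seed(1)
population = [random.uniform(1.0, 9.0) for _ in range(20000)]
# Length-biased draw: selection probability proportional to x.
sample = random.choices(population, weights=population, k=5000)
naive = sum(sample) / len(sample)   # biased upward (E[X^2]/E[X])
corrected = harmonic_mean(sample)   # close to the true mean of ~5
```

The naive sample mean here converges to E[X²]/E[X] (about 6.07 for this population), while the harmonic-mean estimate recovers the unbiased target.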

  10. Multivariate Empirical Mode Decomposition Based Signal Analysis and Efficient-Storage in Smart Grid

    SciTech Connect

    Liu, Lu; Albright, Austin P; Rahimpour, Alireza; Guo, Jiandong; Qi, Hairong; Liu, Yilu

    2017-01-01

    Wide-area measurement systems (WAMSs) are used in smart grids to enable efficient monitoring of grid dynamics. However, the overwhelming amount of data and severe noise contamination often impede effective and efficient analysis and storage of WAMS-generated measurements. To solve this problem, we propose a novel framework built on Multivariate Empirical Mode Decomposition (MEMD), a fully data-driven approach to analyzing non-stationary signals; we call the framework MEMD-based Signal Analysis (MSA). The frequency measurements are treated as a linear superposition of different oscillatory components and noise. The low-frequency components, corresponding to the long-term trend and inter-area oscillations, are grouped and compressed by MSA using the mean shift clustering algorithm. The higher-frequency components, mostly noise and potentially part of high-frequency inter-area oscillations, are analyzed using Hilbert spectral analysis and delineated by their statistical behavior. Experiments on both synthetic and real-world data show that the proposed framework can capture characteristics such as trends and inter-area oscillations while reducing the data storage requirements.
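    The mean shift grouping step can be illustrated in one dimension. This is a flat-kernel sketch over hypothetical IMF mean frequencies, with an arbitrary bandwidth; it is not the paper's implementation:

```python
# Flat-kernel mean shift in one dimension, sketching how IMFs could be
# grouped by their mean frequency (values below are hypothetical, in Hz).
def mean_shift_1d(points, bandwidth=0.5, iters=50):
    modes = list(points)
    for _ in range(iters):
        modes = [
            sum(p for p in points if abs(p - m) <= bandwidth)
            / max(1, len([p for p in points if abs(p - m) <= bandwidth]))
            for m in modes
        ]
    # Collapse modes that converged to (nearly) the same location.
    clusters = []
    for m in sorted(modes):
        if not clusters or m - clusters[-1] > 1e-6:
            clusters.append(m)
    return clusters

freqs = [0.05, 0.08, 0.10, 1.9, 2.0, 2.1]  # slow trend vs oscillation band
groups = mean_shift_1d(freqs)              # two cluster centres
```

The two recovered centres separate the slow trend band from the oscillatory band without fixing the number of clusters in advance, which is the property that makes mean shift attractive here.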

  11. The mature minor: some critical psychological reflections on the empirical bases.

    PubMed

    Partridge, Brian C

    2013-06-01

    Moral and legal notions engaged in clinical ethics should possess not only analytic clarity but also a sound basis in empirical findings. The latter condition brings into question the expansion of the mature minor exception. The mature minor exception in United States healthcare law has served to enable those under the legal age to consent to medical treatment. Although originally developed primarily for minors in emergency or quasi-emergency need of health care, it was expanded, especially from the 1970s, to cover unemancipated minors older than 14 years. This expansion initially appeared plausible, given psychological data showing the intellectual capacity of minors over 14 to recognize the causal connection between their choices and the consequences of those choices. However, subsequent psychological studies have shown that minors generally fail to have realistic affective and evaluative appreciations of the consequences of their decisions, because they tend to over-emphasize short-term benefits and underestimate long-term risks. Also, unlike most decision-makers over 21, the decisions of minors are more often marked by a lack of adequate impulse control, all of which is reflected in the far higher involvement of adolescents in acts of violence, intentional injury, and serious automobile accidents. These effects are more evident in circumstances that elicit elevated affective responses. The advent of brain imaging has allowed the actual visualization of qualitative differences in how minors and persons over the age of 21 generally assess risks and benefits and make decisions. In most people under the age of 21, subcortical systems are not adequately checked by the prefrontal systems involved in adult executive decisions. The neuroanatomical and psychological model developed by Casey, Jones, and Somerville offers an empirical insight into the qualitative differences in the neuroanatomical and neuropsychological bases

  12. Cefepime and amikacin as empirical therapy in patients with febrile neutropaenia: a single-centre phase II prospective survey.

    PubMed

    Mebis, J; Vandeplassche, S; Goossens, H; Berneman, Z N

    2009-01-01

    The aim of this survey was to prospectively evaluate the effectiveness of combination therapy with cefepime and amikacin in the initial treatment of haematology patients with febrile neutropaenia. Two hundred and twenty (220) episodes of febrile neutropaenia were analysed in 54 males and 82 females (median age 58 years); most patients had severe neutropaenia, with a neutrophil count below 100 in 72% of all episodes. Microbiological infection was confirmed in 72 cases (32.8%). Sixty-one (61) bacteria were isolated from blood cultures, of which 22 were identified as Gram-negative and 38 as Gram-positive. Sixty-three (63) episodes (28.6%) were clinically documented, and 85 episodes (38.6%) were fever of unknown origin. Clinical cure was achieved in 123 febrile episodes (56%) after initiation of the current antibiotic protocol; another 22 patients (10%) became afebrile after modification of the initial antibiotic regimen 48 hours or longer after treatment initiation. In 61 cases (27.7%) there was persistent or recurrent fever; these cases were considered treatment failures. Eight patients (3.6%) died during the study. This survey demonstrates that combination therapy with cefepime and amikacin can be considered an effective treatment for febrile neutropaenia in high-risk haematological patients in our centre, which has a high incidence of resistance among Gram-negative bacteria.

  13. School-based survey participation: oral health and BMI survey of Ohio third graders.

    PubMed

    Detty, Amber M R

    2013-09-01

    During the 2009-2010 school year, the Ohio Department of Health conducted a statewide oral health and body mass index (BMI) screening survey among third grade children. This marked the fifth school-based survey of the oral health of Ohio children since 1987. At 50%, the participation rate of the 2009-2010 oral health and BMI survey was the lowest ever experienced. This study aimed to identify factors associated with participation rates in a school-based survey. A stratified random sample of 377 schools was drawn from the list of 1,742 Ohio public elementary schools with third grade students. All third grade children in the sampled schools with parent or guardian consent received an oral health screening and height/weight measurement by trained health professionals. Participation rates at the school level were then combined with data on school characteristics and survey implementation. Predictors of school form return, participation, and refusal rates were assessed by generalized linear modeling (GLM). High student mobility and larger school size were associated with lower form return (p = 0.000 and p = 0.001, respectively) and lower participation rates (p = 0.000 and p = 0.005, respectively). Surveying in the fall or spring (as opposed to winter) significantly decreased form return (p = 0.001 and p = 0.016, respectively) and participation rates (p = 0.008 and p = 0.002, respectively), while being surveyed by internal staff (versus external screeners) significantly increased form return (p = 0.003) and participation rates (p = 0.001). Efforts to increase participation should focus on schools with higher student mobility and larger size. Additionally, participation could be improved by using internal staff and surveying during winter.

  14. A novel signal compression method based on optimal ensemble empirical mode decomposition for bearing vibration signals

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Tse, Peter W.

    2013-01-01

    Today, remote machine condition monitoring is popular due to continuous advances in wireless communication. The bearing is the most frequently and easily failed component in many rotating machines. To accurately identify the type of bearing fault, large amounts of vibration data need to be collected. However, the volume of transmitted data cannot be too high because the bandwidth of wireless communication is limited. To solve this problem, the data are usually compressed before transmission to a remote maintenance center. This paper proposes a novel signal compression method that can substantially reduce the amount of data to be transmitted without sacrificing the accuracy of fault identification. The proposed signal compression method is based on ensemble empirical mode decomposition (EEMD), an effective method for adaptively decomposing a vibration signal into different bands of signal components, termed intrinsic mode functions (IMFs). An optimization method was designed to automatically select appropriate EEMD parameters for the analyzed signal, in particular the appropriate level of the added white noise in the EEMD method. An index termed the relative root-mean-square error was used to evaluate the decomposition performance under different noise levels to find the optimal level. After applying the optimal EEMD method to a vibration signal, the IMF relating to the bearing fault can be extracted from the original vibration signal. Compressing this signal component leaves a much smaller proportion of data samples to be retained for transmission and later reconstruction. The proposed compression method was also compared with the popular wavelet compression method. Experimental results demonstrate that the optimization of EEMD parameters can automatically find appropriate EEMD parameters for the analyzed signals, and that the IMF-based compression method provides a higher compression ratio, while retaining the bearing defect

  15. An empirically based tool for analyzing morbidity associated with operations for congenital heart disease

    PubMed Central

    Jacobs, Marshall L.; O’Brien, Sean M.; Jacobs, Jeffrey P.; Mavroudis, Constantine; Lacour-Gayet, Francois; Pasquali, Sara K.; Welke, Karl; Pizarro, Christian; Tsai, Felix; Clarke, David R.

    2013-01-01

    Objective: Congenital heart surgery outcomes analysis requires reliable methods of estimating the risk of adverse outcomes. Contemporary methods focus primarily on mortality or rely on expert opinion to estimate morbidity associated with different procedures. We created an objective, empirically based index that reflects statistically estimated risk of morbidity by procedure. Methods: Morbidity risk was estimated using data from 62,851 operations in the Society of Thoracic Surgeons Congenital Heart Surgery Database (2002-2008). Model-based estimates with 95% Bayesian credible intervals were calculated for each procedure’s average risk of major complications and average postoperative length of stay. These 2 measures were combined into a composite morbidity score. A total of 140 procedures were assigned scores ranging from 0.1 to 5.0 and sorted into 5 relatively homogeneous categories. Results: Model-estimated risk of major complications ranged from 1.0% for simple procedures to 38.2% for truncus arteriosus with interrupted aortic arch repair. Procedure-specific estimates of average postoperative length of stay ranged from 2.9 days for simple procedures to 42.6 days for a combined atrial switch and Rastelli operation. Spearman rank correlation between raw rates of major complication and average postoperative length of stay was 0.82 in procedures with n greater than 200. Rate of major complications ranged from 3.2% in category 1 to 30.0% in category 5. Aggregate average postoperative length of stay ranged from 6.3 days in category 1 to 34.0 days in category 5. Conclusions: Complication rates and postoperative length of stay provide related but not redundant information about morbidity. The Morbidity Scores and Categories provide an objective assessment of risk associated with operations for congenital heart disease, which should facilitate comparison of outcomes across cohorts with differing case mixes. PMID:22835225
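    The Spearman rank correlation reported above (0.82) is computed from rank differences. A minimal implementation, using the no-ties formula and hypothetical per-procedure values rather than the registry data:

```python
# Spearman rank correlation via 1 - 6*sum(d^2)/(n*(n^2-1)); this simple
# formula assumes no tied values, which holds for the toy data below.
def spearman(xs, ys):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-procedure values: complication rate (%) vs stay (days).
rate = [1.0, 3.5, 8.2, 15.0, 38.2]
stay = [2.9, 6.3, 9.0, 20.0, 42.6]
assert spearman(rate, stay) == 1.0  # perfectly monotone toy example
```

Because Spearman uses ranks only, it measures the monotone agreement between the two morbidity measures without assuming a linear relationship, which suits skewed length-of-stay data.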

  16. Patterns of antimicrobial therapy in severe nosocomial infections: empiric choices, proportion of appropriate therapy, and adaptation rates--a multicentre, observational survey in critically ill patients.

    PubMed

    Vogelaers, Dirk; De Bels, David; Forêt, Frédéric; Cran, Sophie; Gilbert, Eric; Schoonheydt, Karen; Blot, Stijn

    2010-04-01

    This prospective, observational multicentre (n=24) study investigated relationships between antimicrobial choices and rates of empiric appropriate or adequate therapy, and subsequent adaptation of therapy, in 171 ICU patients with severe nosocomial infections. Appropriate antibiotic therapy was defined as in vitro susceptibility of the causative pathogen plus a clinical response to the agent administered. In non-microbiologically documented infections, therapy was considered adequate in the case of a favourable clinical response within 5 days. Patients had pneumonia (n=127; 66 ventilator-associated), intra-abdominal infection (n=23), and bloodstream infection (n=21). Predominant pathogens were Pseudomonas aeruginosa (n=29), Escherichia coli (n=26), Staphylococcus aureus (n=22), and Enterobacter aerogenes (n=21). In 49.6% of infections multidrug-resistant (MDR) bacteria were involved, mostly extended-spectrum beta-lactamase (ESBL)-producing Enterobacteriaceae and MDR non-fermenting Gram-negative bacteria. Prior antibiotic exposure and hospitalisation in a general ward prior to ICU admission were risk factors for MDR. Empiric therapy was appropriate/adequate in 63.7% of cases. Empiric schemes were classified according to coverage of (i) ESBL-producing Enterobacteriaceae and non-fermenting Gram-negative bacteria ("meropenem-based"), (ii) non-fermenting Gram-negative bacteria (schemes with an antipseudomonal agent), and (iii) first-line agents covering neither ESBL-producing Enterobacteriaceae nor non-fermenting Gram-negative bacteria. Meropenem-based schemes allowed for significantly higher rates of appropriate/adequate therapy (p<0.001). This benefit remained when only patients without risk factors for MDR were considered (p=0.021). In 106 patients (61%) empiric therapy was modified: in 60 cases following initial inappropriate/inadequate therapy, and in 46 patients in order to refine empiric therapy. In this study reflecting real-life practice, first-line use of meropenem provided significantly

  17. Implementing community-based provider participation in research: an empirical study

    PubMed Central

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  18. Vandenberg Air Force Base Emission Survey.

    DTIC Science & Technology

    1983-01-01

    meteorological conditions. These procedures are described in the following sections. The Toxic Hazard Corridor (THC) forecast is the method by which...areas downwind of planned vents and possible accidental spills of toxic chemicals. A THC forecast must be requested from Base Weather immediately...prior to the start of the operations for which the THC is needed. A THC must be computed for the following operations: (1) Transfer of toxic

  19. An empirical comparison of character-based and coalescent-based approaches to species delimitation in a young avian complex.

    PubMed

    McKay, Bailey D; Mays, Herman L; Wu, Yuchun; Li, Hui; Yao, Cheng-Te; Nishiumi, Isao; Zou, Fasheng

    2013-10-01

    The process of discovering species is a fundamental responsibility of systematics. Recently, there has been growing interest in coalescent-based methods of species delimitation aimed at objectively identifying species early in the divergence process. However, few empirical studies have compared these new methods with character-based approaches to discovering species. In this study, we applied both a character-based and a coalescent-based approach to delimit species in a closely related avian complex, the light-vented/Taiwan bulbul (Pycnonotus sinensis/Pycnonotus taivanus). Population aggregation analyses of plumage, mitochondrial, and 13 nuclear intron character data sets produced conflicting species hypotheses, with plumage data suggesting three species, mitochondrial data suggesting two species, and nuclear intron data suggesting one species. Such conflict is expected among recently diverged species, and by integrating all sources of data, we delimited three species verified with independently congruent character evidence, as well as a more weakly supported fourth species identified by a single character. Attempts to validate species hypotheses using Bayesian Phylogenetics and Phylogeography (BPP), a coalescent-based method of species delimitation, revealed several issues that can seemingly affect statistical support for species recognition. We found that θ priors had a dramatic impact on speciation probabilities, with lower values consistently favouring splitting and higher values consistently favouring lumping. More resolved guide trees also resulted in overall higher speciation probabilities. Finally, we found suggestive evidence that BPP is sensitive to the divergent effects of nonrandom mating caused by intraspecific processes such as isolation-by-distance, and therefore BPP may not be a conservative method for delimiting independently evolving population lineages. Based on these concerns, we questioned the reliability of BPP results and based our

  20. Advantages and limitations of web-based surveys: evidence from a child mental health survey.

    PubMed

    Heiervang, Einar; Goodman, Robert

    2011-01-01

    Web-based surveys may have advantages related to the speed and cost of data collection as well as data quality. However, they may be biased by low and selective participation. We predicted that such biases would distort point estimates such as average symptom level or prevalence, but not patterns of associations with putative risk factors. A structured psychiatric interview was administered to parents in two successive surveys of child mental health. In 2003, parents were interviewed face-to-face, whereas in 2006 they completed the interview online. In both surveys, interviews were preceded by paper questionnaires covering child and family characteristics. The rate of parents logging onto the web site was comparable to the response rate for face-to-face interviews, but the rate of full response (completing all sections of the interview) was much lower for web-based interviews. Full response was less frequent for non-traditional families, immigrant parents, and less educated parents. Participation bias affected point estimates of psychopathology but had little effect on associations with putative risk factors. The time and cost of full web-based interviews were only a quarter of those for face-to-face interviews. Web-based surveys may be performed faster and at lower cost than more traditional approaches with personal interviews. Selective participation seems a particular threat to point estimates of psychopathology, while patterns of associations are more robust.

  1. Electronic and optical properties of semiconductors: A study based on the empirical tight binding model

    SciTech Connect

    Lew, Yan Voon, L.C.

    1993-01-01

    This study is a theoretical investigation of the electronic and optical properties of intrinsic semiconductors using the orthogonal empirical tight binding model. An analysis of the bulk properties of semiconductors with the zincblende, diamond and rocksalt structures has been carried out. The author has extended the work of others to higher order in the interaction integrals and derived new parameter sets for certain semiconductors that better fit the experimental data over the Brillouin zone. The Hamiltonian of the heterostructures is built up layer by layer from the parameters of the bulk constituents. The second part of this work examines a number of applications of the theory. A new microscopic derivation of the intervalley deformation potentials is presented within the tight binding representation, and a number of conduction-band deformation potentials of bulk semiconductors are computed. The author has also studied the electronic states in heterostructures and has shown theoretically the possibility of barrier localization of above-barrier states in a multivalley heterostructure using a multiband calculation. Another result is the proposal for a new "type-II" lasing mechanism in short-period GaAs/AlAs superlattices. As for the work on the optical properties, a new formalism, based on the generalized Feynman-Hellmann theorem, for computing interband optical matrix elements has been obtained and used to compute the linear and second-order nonlinear optical properties of a number of bulk semiconductors and semiconductor heterostructures. In agreement with the one-band effective-mass calculations of other groups, the more elaborate calculations show that the intersubband oscillator strengths of quantum wells can be greatly enhanced over the bulk interband values.
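    The empirical tight binding idea, fitting a small set of on-site and hopping parameters and reading band energies off the resulting Hamiltonian, can be illustrated with the textbook one-dimensional s-band chain. The parameters below are arbitrary placeholders, not the thesis's fitted sets:

```python
import math

# Nearest-neighbour s-band tight-binding dispersion for a 1-D chain:
#   E(k) = eps0 - 2*t*cos(k*a)
# eps0 is the on-site energy and t the hopping integral; in empirical
# tight binding these would be fitted to experimental or ab initio data.
def band_energy(k, eps0=0.0, t=1.0, a=1.0):
    return eps0 - 2.0 * t * math.cos(k * a)

ks = [math.pi * i / 100 for i in range(101)]  # k from 0 to pi/a
band = [band_energy(k) for k in ks]
bandwidth = max(band) - min(band)             # equals 4*|t|
```

The bandwidth of 4|t| makes the role of the hopping parameter concrete: fitting t to reproduce a measured bandwidth is the simplest instance of the empirical fitting described in the abstract.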

  2. A Scaling-based Robust Empirical Model of Stream Dissolved Oxygen for the Eastern United States

    NASA Astrophysics Data System (ADS)

    Siddik, M. A. Z.; Abdul-Aziz, O. I.; Ishtiaq, K. S.

    2016-12-01

    We predicted the diurnal cycles of hourly dissolved oxygen (DO) in streams using a scaling-based empirical model. A single reference observation from each DO cycle was used as a scaling parameter to convert the DO cycles into a single dimensionless diurnal curve, which was then estimated by employing an extended stochastic harmonic algorithm (ESHA). Hourly DO observations for the growing season (May-August) during 2008-2015 from sixteen USGS water quality monitoring stations in the eastern U.S. were used for model calibration and validation. The study sites incorporated gradients in climate (tropical vs. temperate), land use (rural vs. urban vs. forest vs. coastal), and catchment size (2.4 to 184.0 mi²), representing different USEPA level III ecoregions. The estimated model parameters showed notable spatiotemporal robustness, collapsing into narrow ranges across the growing seasons and study sites. DO predicted using site-specific, temporally averaged model parameters and a day-specific single reference observation exhibited good model fitting efficiency and accuracy. Model performance was also assessed by simulating the DO time series using a regional-scale parameter set obtained by spatiotemporally aggregating (averaging) the estimated parameters for all sites. Further, model robustness to individual and simultaneous perturbations in parameters was determined by calculating analytical sensitivity and uncertainty measures. The study continues our previous research toward developing a regional-scale predictive model of diurnal DO cycles. The model can be used to estimate missing data in observed fine-resolution DO time series across the eastern USA with a single set of parameters and limited observations. The fine-resolution DO time series will be useful for dynamically assessing the general health of aquatic ecosystems.
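    The scaling idea, normalizing each diurnal cycle by a single reference observation and then fitting a harmonic description, can be sketched as follows. The data, the choice of reference, and the use of plain first-harmonic Fourier coefficients (in place of ESHA) are all simplifications:

```python
import math

# Divide each diurnal DO cycle by one reference observation (here the
# value at hour 0) to get a dimensionless curve.
def dimensionless(cycle, ref_index=0):
    ref = cycle[ref_index]
    return [v / ref for v in cycle]

# First-harmonic Fourier coefficients of an hourly curve.
def first_harmonic(curve):
    n = len(curve)
    a = 2 / n * sum(v * math.cos(2 * math.pi * i / n) for i, v in enumerate(curve))
    b = 2 / n * sum(v * math.sin(2 * math.pi * i / n) for i, v in enumerate(curve))
    return a, b

# Synthetic 24-hour DO cycle (mg/L) peaking in the afternoon.
do = [8.0 + 1.5 * math.sin(2 * math.pi * (h - 6) / 24) for h in range(24)]
a1, b1 = first_harmonic(dimensionless(do))
amplitude = math.sqrt(a1 * a1 + b1 * b1)   # relative diurnal swing
```

After scaling, the harmonic amplitude is relative to the reference value, which is what allows one set of dimensionless parameters to serve sites with very different absolute DO levels.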

  3. An altimetry-based gravest empirical mode south of Africa: 1. Development and validation

    NASA Astrophysics Data System (ADS)

    Swart, Sebastiaan; Speich, Sabrina; Ansorge, Isabelle J.; Lutjeharms, Johann R. E.

    2010-03-01

    Hydrographic transects of the Antarctic Circumpolar Current (ACC) south of Africa are projected into baroclinic stream function space parameterized by pressure and dynamic height. This produces a two-dimensional gravest empirical mode (GEM) that captures more than 97% of the total density and temperature variance in the ACC domain. Weekly maps of absolute dynamic topography data, derived from satellite altimetry, are combined with the GEM to obtain a 16 year time series of temperature and salinity fields. The time series of thermohaline fields are compared with independent in situ observations. The residuals decrease sharply below the thermocline, and through the entire water column the mean root-mean-square (RMS) error is 0.15°C, 0.02, and 0.02 kg m⁻³ for temperature, salinity, and density, respectively. The positions of ACC fronts are followed in time using satellite altimetry data. These locations correspond to both the observed and GEM-based positions. The available temperature and salinity information allow one to calculate the baroclinic zonal velocity field between the surface and 2500 dbar. This is compared with velocity measurements from repeat hydrographic transects at the GoodHope line. The net accumulated transports of the ACC derived from these different methods are within 1-3 Sv of each other. Similarly, GEM-produced cross-sectional velocities at 300 dbar compare closely to the observed data, with the RMS difference not exceeding 0.03 m s⁻¹. The continuous time series of thermohaline fields described here are further exploited to understand the dynamic nature of the ACC fronts in the region, as described by Swart and Speich (2010).

  4. Surveying ourselves: Examining the use of a web-based approach for a physician survey

    PubMed Central

    Matteson, Kristen A.; Anderson, Britta L.; Pinto, Stephanie B.; Lopes, Vrishali; Schulkin, Jay; Clark, Melissa A.

    2011-01-01

    A survey was distributed, using a sequential mixed-mode approach, to a national sample of obstetrician-gynecologists. Differences between responses to the web-based mode and the on-paper mode were compared to determine if there were systematic differences between respondents. Only two differences in respondents between the two modes were identified. University-based physicians were more likely to complete the web-based mode than private practice physicians. Mail respondents reported a greater volume of endometrial ablations compared to online respondents. The web-based mode had better data quality than the paper-based mailed mode in terms of less missing and inappropriate responses. Together, these findings suggest that, although a few differences were identified, the web-based survey mode attained adequate representativeness and improved data quality. Given the metrics examined for this study, exclusive use of web-based data collection may be appropriate for physician surveys with a minimal reduction in sample coverage and without a reduction in data quality. PMID:21190952

  6. How much does participatory flood management contribute to stakeholders' social capacity building? Empirical findings based on a triangulation of three evaluation approaches

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Menzel, S.; Home, R.

    2013-06-01

    Recent literature suggests that dialogic forms of risk communication are more effective in building stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of unequivocal empirical evidence on the relevance of these effects, mainly due to the methodological limitations of the existing evaluation approaches. In our paper we aim to elicit the contribution of participatory river revitalisation projects to stakeholders' social capacity building by triangulating the findings of three evaluation studies based on different approaches: a field-experimental approach, a qualitative long-term ex-post approach, and a cross-sectional household survey. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvement should be more explicitly designed as a tool for long-term social learning.

  7. A survey of the infrastructure for children's mental health services: implications for the implementation of empirically supported treatments (ESTs).

    PubMed

    Schoenwald, Sonja K; Chapman, Jason E; Kelleher, Kelly; Hoagwood, Kimberly Eaton; Landsverk, John; Stevens, Jack; Glisson, Charles; Rolls-Reutz, Jennifer

    2008-03-01

    A structured interview survey of directors of a large national sample (n = 200) of mental health service organizations treating children examined the governance, financing, staffing, services, and implementation practices of these organizations, as well as director ratings of factors important to the implementation of new treatments and services. Descriptive analyses showed that private organizations financing services with public (particularly Medicaid) funds are prevalent, and that employment of professional staff, clinical supervision and training, productivity requirements, and outcomes monitoring are common. Results of random effects regression models (RRMs) evaluating associations between governance, financing, and organizational characteristics and the use of new treatments and services showed that for-profit organizations were more likely to implement such treatments, while organizations with more licensed clinical staff and weekly clinical supervision in place were less likely to do so. Results of RRMs evaluating director ratings of three factors (fit with existing implementation practices, infrastructure support, and organizational mission and support) suggest that these factors are more important to public than to private organizations. Implications for EST implementation and future research are described.

  8. Target-based fiber assignment for large survey spectrographs

    NASA Astrophysics Data System (ADS)

    Schaefer, Christoph E. R.; Makarem, Laleh; Kneib, Jean-Paul

    2016-07-01

    Next-generation massive spectroscopic survey projects must process a massive number of targets, and the preparation of subsequent observations should be feasible in a reasonable amount of time. We present a fast algorithm for target assignment that scales as O(log(n)). Our proposed algorithm follows a target-based approach, which enables assigning large numbers of targets to their positioners quickly and with very high assignment efficiency. We also discuss additional optimization of the fiber positioning problem to take positioner collisions into account, and how to use the algorithm for an optimal survey strategy. We apply our target-based algorithm in the context of the MOONS project.
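
    A minimal sketch of the target-based idea, with hypothetical names and geometry (the actual MOONS algorithm, its spatial index, and its collision handling are not reproduced here): each target, taken in priority order, claims the nearest still-free positioner within its patrol radius.

```python
import math

def assign_targets(targets, positioners, patrol_radius):
    """Greedy target-based assignment. The inner scan over positioners is
    linear for clarity; a k-d tree over positioner centers would make each
    lookup O(log n), which is where the quoted scaling comes from."""
    free = set(range(len(positioners)))
    assignment = {}
    for ti, (tx, ty) in enumerate(targets):
        best, best_d = None, patrol_radius
        for pi in free:
            d = math.hypot(tx - positioners[pi][0], ty - positioners[pi][1])
            if d <= best_d:
                best, best_d = pi, d
        if best is not None:
            assignment[ti] = best   # target ti observed by positioner best
            free.discard(best)
    return assignment

# Three positioners on a line, three targets within reach of one each
positioners = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
targets = [(0.1, 0.0), (1.9, 0.0), (1.0, 0.2)]
assignment = assign_targets(targets, positioners, patrol_radius=0.5)
# assignment == {0: 0, 1: 2, 2: 1}
```

    Iterating over targets rather than positioners is what lets the scheme prioritize science targets directly and reach high assignment efficiency on large target lists.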

  9. THE COS-HALOS SURVEY: AN EMPIRICAL DESCRIPTION OF METAL-LINE ABSORPTION IN THE LOW-REDSHIFT CIRCUMGALACTIC MEDIUM

    SciTech Connect

    Werk, Jessica K.; Prochaska, J. Xavier; Tripp, Todd M.; O'Meara, John M.; Peeples, Molly S.

    2013-02-15

    We present the equivalent width and column density measurements for low and intermediate ionization states of the circumgalactic medium (CGM) surrounding 44 low-z, L ≈ L* galaxies drawn from the COS-Halos survey. These measurements are derived from far-UV transitions observed in HST/COS and Keck/HIRES spectra of background quasars within an impact parameter R < 160 kpc of the targeted galaxies. The data show significant metal-line absorption for 33 of the 44 galaxies, including quiescent systems, revealing the common occurrence of a cool (T ≈ 10⁴-10⁵ K), metal-enriched CGM. The detection rates and column densities derived for these metal lines decrease with increasing impact parameter, a trend we interpret as a declining metal surface density profile for the CGM. A comparison of the relative column densities of adjacent ionization states indicates that the gas is predominantly ionized. The large surface density in metals demands a large reservoir of metals and gas in the cool CGM (very conservatively, M_CGM^cool > 10⁹ M_⊙), which likely traces a distinct density and/or temperature regime from the highly ionized CGM traced by O⁺⁵ absorption. The large dispersion in absorption strengths (including non-detections) suggests that the cool CGM traces a wide range of densities or a mix of local ionizing conditions. Lastly, the kinematics inferred from the metal-line profiles are consistent with the cool CGM being bound to the dark matter halos hosting the galaxies; this gas may serve as fuel for future star formation. Future work will leverage this data set to provide estimates of the mass, metallicity, dynamics, and origin of the cool CGM in low-z, L* galaxies.

  10. Self-reported dependence on mobile phones in young adults: A European cross-cultural empirical survey

    PubMed Central

    Lopez-Fernandez, Olatz; Kuss, Daria J.; Romo, Lucia; Morvan, Yannick; Kern, Laurence; Graziani, Pierluigi; Rousseau, Amélie; Rumpf, Hans-Jürgen; Bischof, Anja; Gässler, Ann-Kathrin; Schimmenti, Adriano; Passanisi, Alessia; Männikkö, Niko; Kääriänen, Maria; Demetrovics, Zsolt; Király, Orsolya; Chóliz, Mariano; Zacarés, Juan José; Serra, Emilia; Griffiths, Mark D.; Pontes, Halley M.; Lelonek-Kuleta, Bernadeta; Chwaszcz, Joanna; Zullino, Daniele; Rochat, Lucien; Achab, Sophia; Billieux, Joël

    2017-01-01

    Background and aims Despite many positive benefits, mobile phone use can be associated with harmful and detrimental behaviors. The aim of this study was twofold: to examine (a) cross-cultural patterns of perceived dependence on mobile phones in ten European countries, first, grouped in four different regions (North: Finland and UK; South: Spain and Italy; East: Hungary and Poland; West: France, Belgium, Germany, and Switzerland), and second by country, and (b) how socio-demographics, geographic differences, mobile phone usage patterns, and associated activities predicted this perceived dependence. Methods A sample of 2,775 young adults (aged 18–29 years) were recruited in different European Universities who participated in an online survey. Measures included socio-demographic variables, patterns of mobile phone use, and the dependence subscale of a short version of the Problematic Mobile Phone Use Questionnaire (PMPUQ; Billieux, Van der Linden, & Rochat, 2008). Results The young adults from the Northern and Southern regions reported the heaviest use of mobile phones, whereas perceived dependence was less prevalent in the Eastern region. However, the proportion of highly dependent mobile phone users was more elevated in Belgium, UK, and France. Regression analysis identified several risk factors for increased scores on the PMPUQ dependence subscale, namely using mobile phones daily, being female, engaging in social networking, playing video games, shopping and viewing TV shows through the Internet, chatting and messaging, and using mobile phones for downloading-related activities. Discussion and conclusions Self-reported dependence on mobile phone use is influenced by frequency and specific application usage. PMID:28425777

  12. Effect of a large increase in cigarette tax on cigarette consumption: an empirical analysis of cross-sectional survey data.

    PubMed

    Lee, Jie-Min

    2008-10-01

    This study used cigarette price elasticity estimates to assess the possible effects on cigarette consumption of a large increase in cigarette tax. It also investigated different responses to the cigarette tax increase among smokers from different socio-economic backgrounds and with different smoking characteristics. Cross-sectional study on 483 valid questionnaires completed during a telephone survey of current smokers aged 15 years and above from all 23 major cities and counties in Taiwan. This study analysed the willingness of current smokers to quit smoking or reduce cigarette consumption when faced with a tax increase of NT$22 per pack, which would raise the price of cigarettes by 44%. The Tobit regression model and the maximum likelihood method were used to estimate cigarette demand elasticity. Estimation results yielded a cigarette price elasticity of -0.29 in connection with a 44% increase in the price of cigarettes. This suggests that smokers will have relatively little response to such an event. The most significant response to the price increase was found among women, low-income smokers, moderately addicted smokers, and smokers who regularly purchase low-price cigarettes. A 44% increase in the price of cigarettes would reduce the average annual per capita cigarette consumption in Taiwan by 14.86 packs; a reduction of 12.87%. The tax increase would also boost the Government's cigarette tax revenue by approximately NT$41.4 billion, and increase cigarette merchants' income by approximately NT$27.4 billion. Since current cigarette prices are low in Taiwan and smokers are relatively insensitive to cigarette price hikes, a large increase in cigarette tax would reduce cigarette consumption effectively, and would also increase the Government's cigarette tax revenue and cigarette merchants' income. Clearly, such a tax would create a win-win outcome for the Government, cigarette merchants and smokers, and it is therefore recommended.
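
    The headline figures are consistent with a first-order elasticity approximation; a quick illustrative check (rounding aside):

```python
elasticity = -0.29      # estimated price elasticity of cigarette demand
price_increase = 0.44   # 44% price rise from the NT$22-per-pack tax

# First-order approximation: %change in consumption ≈ elasticity × %change in price
consumption_change = elasticity * price_increase
# -0.1276, i.e. roughly -12.8%, close to the reported 12.87% reduction
```

    The small residual difference is expected, since the Tobit estimates need not behave exactly like a constant-elasticity demand curve over a 44% price jump.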

  13. An Empirically-based Steady-state Friction Law and its Implications for Fault Stability

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Nielsen, S. B.; Di Toro, G.; Violay, M.

    2015-12-01

    Empirically based rate-and-state friction laws (RSFL) have been proposed to model the dependence of friction forces on slip and time. The relevance of the RSFL for earthquake mechanics is that a few constitutive parameters (e.g., A-B = dτ/dlog(V), with τ and V the shear stress and slip rate, respectively) allow us to define the stability conditions of a fault. According to the RSFL, if A-B > 0, τ increases with V (rate-hardening behavior), resulting in unconditionally stable behavior; if A-B < 0, τ decreases with V (rate-weakening behavior), potentially resulting in unstable behavior leading to dynamic runaway. Given that τ at steady-state conditions also allows us to define a critical fault stiffness, the RSFL determine a condition of stability for faults as their stiffness approaches the critical value. However, the conditions of fault stability, determined by the critical stiffness under the assumption of either rate-weakening or rate-hardening behavior, might be restrictive, given that frictional properties change appreciably as a function of slip or slip rate. Moreover, the RSFL were determined from experiments conducted at sub-seismic slip rates (< 1 cm/s), and their extrapolation to earthquake deformation conditions remains questionable in light of the experimental evidence of large dynamic weakening at seismic slip rates and the plethora of slip events that characterize the seismic cycle. Here, we propose a modified RSFL based on a review of a large published and unpublished dataset of rock-friction experiments performed with different testing machines (rotary shear, bi-axial, tri-axial). The modified RSFL is valid at steady-state conditions from sub-seismic to seismic slip rates (0.1 μm/s
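
    The stability criterion quoted above reduces to the sign of A-B; a trivial sketch (the function name is hypothetical):

```python
def stability_regime(a_minus_b):
    """Classify steady-state frictional behavior by the sign of
    A - B = dτ/dlog(V), as in the rate-and-state framework."""
    if a_minus_b > 0:
        return "rate-hardening (unconditionally stable)"
    if a_minus_b < 0:
        return "rate-weakening (potentially unstable)"
    return "velocity-neutral"
```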

  14. Simulation of Long Lived Tracers Using an Improved Empirically Based Two-Dimensional Model Transport Algorithm

    NASA Technical Reports Server (NTRS)

    Fleming, E. L.; Jackman, C. H.; Stolarski, R. S.; Considine, D. B.

    1998-01-01

    We have developed a new empirically-based transport algorithm for use in our GSFC two-dimensional transport and chemistry model. The new algorithm contains planetary wave statistics, and parameterizations to account for the effects due to gravity waves and equatorial Kelvin waves. As such, this scheme utilizes significantly more information compared to our previous algorithm which was based only on zonal mean temperatures and heating rates. The new model transport captures much of the qualitative structure and seasonal variability observed in long lived tracers, such as: isolation of the tropics and the southern hemisphere winter polar vortex; the well mixed surf-zone region of the winter sub-tropics and mid-latitudes; the latitudinal and seasonal variations of total ozone; and the seasonal variations of mesospheric H2O. The model also indicates a double peaked structure in methane associated with the semiannual oscillation in the tropical upper stratosphere. This feature is similar in phase but is significantly weaker in amplitude compared to the observations. The model simulations of carbon-14 and strontium-90 are in good agreement with observations, both in simulating the peak in mixing ratio at 20-25 km, and the decrease with altitude in mixing ratio above 25 km. We also find mostly good agreement between modeled and observed age of air determined from SF6 outside of the northern hemisphere polar vortex. However, observations inside the vortex reveal significantly older air compared to the model. This is consistent with the model deficiencies in simulating CH4 in the northern hemisphere winter high latitudes and illustrates the limitations of the current climatological zonal mean model formulation. The propagation of seasonal signals in water vapor and CO2 in the lower stratosphere showed general agreement in phase, and the model qualitatively captured the observed amplitude decrease in CO2 from the tropics to midlatitudes. However, the simulated seasonal

  15. Universal Design for Instruction in Postsecondary Education: A Systematic Review of Empirically Based Articles

    ERIC Educational Resources Information Center

    Roberts, Kelly D.; Park, Hye Jin; Brown, Steven; Cook, Bryan

    2011-01-01

    Universal Design for Instruction (UDI) in postsecondary education is a relatively new concept/framework that has generated significant support. The purpose of this literature review was to examine existing empirical research, including qualitative, quantitative, and mixed methods, on the use of UDI (and related terms) in postsecondary education.…

  16. Understanding Transactional Distance in Web-Based Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Chandra, Aruna; DePaolo, Concetta A.; Simmons, Lakisha L.

    2016-01-01

    Transactional distance is an important pedagogical theory in distance education that calls for more empirical support. The purpose of this study was to verify the theory by operationalizing and examining the relationship of (1) dialogue, structure and learner autonomy to transactional distance, and (2) environmental factors and learner demographic…

  17. Model Selection for Equating Testlet-Based Tests in the NEAT Design: An Empirical Study

    ERIC Educational Resources Information Center

    He, Wei; Li, Feifei; Wolfe, Edward W.; Mao, Xia

    2012-01-01

    For tests solely composed of testlets, the local item independence assumption tends to be violated. This study, using empirical data from a large-scale state assessment program, investigated the effects of using different models on equating results under the non-equivalent group anchor-test (NEAT) design. Specifically, the…

  18. Comparisons of experiment with cellulose models based on electronic structure and empirical force field theories

    USDA-ARS?s Scientific Manuscript database

    Studies of cellobiose conformations with HF/6-31G* and B3LYP/6-31+G* quantum theory [1] gave a reference for studies with much faster empirical methods such as MM3, MM4, CHARMM and AMBER. The quantum studies also enable a substantial reduction in the number of exo-cyclic group orientations that...

  20. Hospital readiness for undertaking evidence-based practice: A survey.

    PubMed

    Nguyen, Thi Ngoc Minh; Wilson, Anne

    2016-12-01

    Despite the fact that evidence-based practice has increasing emphasis in health care, organizations are not always prepared for its implementation. Identifying organizational preparedness for implementing evidence-based practice is desirable prior to application. A cross-sectional survey was developed to explore nurses' perception of organizational support for evidence-based practice and was implemented via a self-enumerated survey completed by 234 nurses. Data were analyzed with descriptive and inferential statistics. Nurses reported that implementation of evidence-based practice is complex and fraught with challenges because of a lack of organizational support. A conceptual framework comprising three key factors: information resources, nursing leadership, and organizational infrastructure was proposed to assist health authorities in the implementation of evidence-based practice. Suggestions of how organizations can be more supportive of research utilization in practice include establishing a library, journal clubs/mentoring programs, nurses' involvement in decision-making at unit level, and a local nursing association.

  1. Numerical simulation of bubble departure in subcooled pool boiling based on non-empirical boiling and condensation model

    NASA Astrophysics Data System (ADS)

    Ose, Y.; Kunugi, T.

    2013-07-01

    In this study, in order to clarify the heat transfer characteristics of subcooled boiling phenomena and to discuss their mechanism, a non-empirical boiling and condensation model has been adopted for numerical simulation. This model consists of an improved phase-change model and a consideration of a relaxation time based on the quasi-thermal equilibrium hypothesis. Transient three-dimensional numerical simulations based on MARS (Multi-interface Advection and Reconstruction Solver) with the non-empirical boiling and condensation model were conducted for the behavior of an isolated boiling bubble in a subcooled pool. The subcooled bubble behaviors, such as the growth process of the nucleate bubble on the heating surface and the condensation and extinction behaviors after departure from the heating surface, were investigated. In this paper, the bubble departure behavior from the heating surface is discussed in detail. The overall numerical results were in very good agreement with the experimental results.

  2. Using an index-based approach to assess the population-level appropriateness of empirical antibiotic therapy

    PubMed Central

    Ciccolini, M.; Spoorenberg, V.; Geerlings, S. E.; Prins, J. M.; Grundmann, H.

    2015-01-01

    Objectives The population-level appropriateness of empirical antibiotic therapy can be conventionally measured by ascertainment of treatment coverage. This method involves a complex resource-intensive case-by-case assessment of the prescribed antibiotic treatment and the resistance of the causative microorganism. We aimed to develop an alternative approach based, instead, on the use of routinely available surveillance data. Methods We calculated a drug effectiveness index by combining three simple aggregated metrics: relative frequency of aetiological agents, level of resistance and relative frequency of antibiotic use. To evaluate the applicability of our approach, we used this metric to estimate the population-level appropriateness of guideline-compliant and non-guideline-compliant empirical treatment regimens in the context of the Dutch national guidelines for complicated urinary tract infections. Results The drug effectiveness index agrees within 5% with results obtained with the conventional approach based on a case-by-case ascertainment of treatment coverage. Additionally, we estimated that the appropriateness of 2008 antibiotic prescribing regimens would have declined by up to 4% by year 2011 in the Netherlands due to the emergence and expansion of antibiotic resistance. Conclusions The index-based framework can be an alternative approach to the estimation of point values and counterfactual trends in population-level empirical treatment appropriateness. In resource-constrained settings, where empirical prescribing is most prevalent and comprehensive studies to directly measure appropriateness may not be a practical proposition, an index-based approach could provide useful information to aid in the development and monitoring of antibiotic prescription guidelines. PMID:25164311
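
    A hedged sketch of an index of this kind: the paper combines pathogen frequency, resistance level, and prescribing frequency, and the multiplicative expected-coverage form below is an assumption for illustration, as are all the names and numbers.

```python
# All pathogens, drugs, and proportions below are illustrative, not the paper's data.
pathogen_freq = {"E. coli": 0.7, "K. pneumoniae": 0.3}   # relative aetiology
drug_use = {"ciprofloxacin": 0.6, "ceftriaxone": 0.4}    # relative prescribing
resistance = {  # proportion of isolates of each pathogen resistant to each drug
    ("E. coli", "ciprofloxacin"): 0.15,
    ("E. coli", "ceftriaxone"): 0.05,
    ("K. pneumoniae", "ciprofloxacin"): 0.10,
    ("K. pneumoniae", "ceftriaxone"): 0.08,
}

# Expected probability that an empirically chosen drug covers the causative pathogen
index = sum(
    f * u * (1.0 - resistance[(p, d)])
    for p, f in pathogen_freq.items()
    for d, u in drug_use.items()
)
# index ≈ 0.895 for these made-up surveillance figures
```

    The appeal of such an index is that each of the three inputs is a routinely collected aggregate, so no case-by-case chart review is needed.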

  3. Upscaling Empirically Based Conceptualisations to Model Tropical Dominant Hydrological Processes for Historical Land Use Change

    NASA Astrophysics Data System (ADS)

    Toohey, R.; Boll, J.; Brooks, E.; Jones, J.

    2009-12-01

    Surface runoff and percolation to ground water are two hydrological processes of concern to the Atlantic slope of Costa Rica because of their impacts on flooding and drinking water contamination. As per legislation, the Costa Rican Government funds land use management from the farm to the regional scale to improve or conserve hydrological ecosystem services. In this study, we examined how land use (e.g., forest, coffee, sugar cane, and pasture) affects hydrological response at the point, plot (1 m2), and the field scale (1-6 ha) to empirically conceptualize the dominant hydrological processes in each land use. Using our field data, we upscaled these conceptual processes into a physically-based distributed hydrological model at the field, watershed (130 km2), and regional (1500 km2) scales. At the point and plot scales, the presence of macropores and large roots promoted greater vertical percolation and subsurface connectivity in the forest and coffee field sites. The lack of macropores and large roots, plus the addition of management artifacts (e.g., surface compaction and a plough layer), altered the dominant hydrological processes by increasing lateral flow and surface runoff in the pasture and sugar cane field sites. Macropores and topography were major influences on runoff generation at the field scale. Also at the field scale, antecedent moisture conditions suggest a threshold behavior as a temporal control on surface runoff generation. However, in this tropical climate with very intense rainstorms, annual surface runoff was less than 10% of annual precipitation at the field scale. Significant differences in soil and hydrological characteristics observed at the point and plot scales appear to have less significance when upscaled to the field scale. At the point and plot scales, percolation acted as the dominant hydrological process in this tropical environment. However, at the field scale for sugar cane and pasture sites, saturation-excess runoff increased as

  4. Place Based Assistance Tools: Networking and Resident Surveys.

    ERIC Educational Resources Information Center

    Department of Housing and Urban Development, Washington, DC.

    "Place-based assistance" is not a new concept. Asking what people want and finding ways to give it to them sounds simplistic, but it can result in "win-win" solutions in which everyone involved benefits. This document is a guide to using networking and surveys of residents to determine community needs. Some case studies show…

  5. Older stands characterized and estimated from sample-based surveys

    Treesearch

    Margaret S. Devall; Victor A. Rudis

    1991-01-01

    Old growth criteria from the literature are applied to existing data from systematic sample-based surveys to obtain estimates of detailed attributes for private as well as public stands in the Interior Highlands of Arkansas and Oklahoma. Approximately one-fourth of the region's forest is mature. With the most stringent old growth criteria applied, less than 2% of the forested...

  6. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  8. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  10. The outcome of non-carbapenem-based empirical antibacterial therapy and VRE colonisation in patients with hematological malignancies.

    PubMed

    Gedik, H; Yildirmak, T; Simsek, F; Kanturk, A; Aydın, D; Anca, D; Yokus, O; Demirel, N

    2013-06-01

    Febrile neutropenia (FN) is generally a complication of cancer chemotherapy. We retrospectively evaluated febrile neutropenia episodes and their outcomes with respect to modification rates of non-carbapenem-based empirical antibacterial therapy and vancomycin-resistant enterococcus (VRE) colonisation leading to VRE bacteremia in patients with hematological malignancies. All consecutive patients older than 14 years of age who developed febrile neutropenia episodes due to hematological malignancies at the hematology department from September 2010 to November 2011 were included in the study. In total, 86 consecutive neutropenic patients and their 151 febrile episodes were evaluated. The mean MASCC prognostic index score was 18.72 ± 9.43. Among the 86 patients, 28 experienced a total of 30 bacteremia episodes of bacterial origin. Modification rates of empirical monotherapy and combination therapy were statistically similar (P = 0.840). Our results suggest that initiating non-carbapenem-based therapy does not provide high response rates in the treatment of febrile neutropenia attacks. Nevertheless, non-carbapenem-based empirical therapy provides benefits with regard to cost-effectiveness and antimicrobial stewardship when local antibiotic resistance patterns of gram-negative bacteria are considered. Patients who are colonized with VRE are more likely to develop bacteremia with VRE strains as a result of invasive procedures and the severe damage to mucosal barriers observed in this group of patients.

  11. Space-based infrared surveys of small bodies

    NASA Astrophysics Data System (ADS)

    Mommert, M.

    2014-07-01

    Most small bodies in the Solar System are too small and too distant to be spatially resolved, precluding a direct diameter derivation. Furthermore, measurements of the optical brightness alone only allow a rough estimate of the diameter, since the surface albedo is usually unknown and can have values between about 3 % and 60 % or more. The degeneracy can be resolved by considering the thermal emission of these objects, which is less prone to albedo effects and mainly a function of the diameter. Hence, the combination of optical and thermal-infrared observational data provides a means to independently derive an object's diameter and albedo. This technique is used in asteroid thermal models or more sophisticated thermophysical models (see, e.g., [1]). Infrared observations require cryogenic detectors and/or telescopes, depending on the actual wavelength range observed. Observations from the ground are additionally compromised by the variable transparency of Earth's atmosphere in major portions of the infrared wavelength ranges. Hence, space-based infrared telescopes, providing stable conditions and significantly better sensitivities than ground-based telescopes, are now used routinely to exploit this wavelength range. Two observation strategies are used with space-based infrared observatories: Space-based Infrared All-Sky Surveys. Asteroid surveys in the thermal infrared are less prone to albedo-related discovery bias compared to surveys with optical telescopes, providing a more complete picture of small body populations. The first space-based infrared survey of Solar System small bodies was performed with the Infrared Astronomical Satellite (IRAS) for 10 months in 1983. In the course of the 'IRAS Minor Planet Survey' [2], 2228 asteroids (3 new discoveries) and more than 25 comets (6 new discoveries) were observed. More recent space-based infrared all-sky asteroid surveys were performed by Akari (launched 2006) and the Wide-field Infrared Survey Explorer (WISE
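The diameter-albedo degeneracy described above follows from the standard photometric size relation D = (1329 km / sqrt(p_V)) * 10^(-H/5), where H is the absolute magnitude and p_V the geometric albedo. A minimal sketch (the H value below is hypothetical, chosen only to illustrate the spread across the quoted albedo range of roughly 3% to 60%):

```python
import math

def diameter_km(abs_mag_h: float, geometric_albedo: float) -> float:
    """Standard asteroid size relation: D = 1329 km / sqrt(p_V) * 10^(-H/5)."""
    return 1329.0 / math.sqrt(geometric_albedo) * 10.0 ** (-abs_mag_h / 5.0)

# The same optical brightness (H = 15, hypothetical) admits very different
# sizes across the plausible albedo interval:
dark = diameter_km(15.0, 0.03)    # low-albedo (dark surface) interpretation
bright = diameter_km(15.0, 0.60)  # high-albedo (bright surface) interpretation
print(f"dark: {dark:.2f} km, bright: {bright:.2f} km")
```

A thermal-infrared flux measurement constrains the diameter almost independently of albedo, which is exactly how combining optical and infrared data breaks this degeneracy.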

  12. Selecting Great Lakes streams for lampricide treatment based on larval sea lamprey surveys

    USGS Publications Warehouse

    Christie, Gavin C.; Adams, Jean V.; Steeves, Todd B.; Slade, Jeffrey W.; Cuddy, Douglas W.; Fodale, Michael F.; Young, Robert J.; Kuc, Miroslaw; Jones, Michael L.

    2003-01-01

    The Empiric Stream Treatment Ranking (ESTR) system is a data-driven, model-based, decision tool for selecting Great Lakes streams for treatment with lampricide, based on estimates from larval sea lamprey (Petromyzon marinus) surveys conducted throughout the basin. The 2000 ESTR system was described and applied to larval assessment surveys conducted from 1996 to 1999. A comparative analysis of stream survey and selection data was conducted and improvements to the stream selection process were recommended. Streams were selected for treatment based on treatment cost, predicted treatment effectiveness, and the projected number of juvenile sea lampreys produced. On average, lampricide treatments were applied annually to 49 streams with 1,075 ha of larval habitat, killing 15 million larval and 514,000 juvenile sea lampreys at a total cost of $5.3 million, and marginal and mean costs of $85 and $10 per juvenile killed. The numbers of juvenile sea lampreys killed for given treatment costs showed a pattern of diminishing returns with increasing investment. Of the streams selected for treatment, those with > 14 ha of larval habitat targeted 73% of the juvenile sea lampreys for 60% of the treatment cost. Suggested improvements to the ESTR system were to improve accuracy and precision of model estimates, account for uncertainty in estimates, include all potentially productive streams in the process (not just those surveyed in the current year), consider the value of all larvae killed during treatment (not just those predicted to metamorphose the following year), use lake-specific estimates of damage, and establish formal suppression targets.
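The core ranking logic described above (selecting streams by treatment cost relative to juveniles produced, under a budget, with diminishing returns) can be sketched as a greedy selection. All stream names and numbers below are hypothetical; the actual ESTR system also uses predicted treatment effectiveness and model-based larval estimates:

```python
# Hypothetical stream records: lampricide treatment cost and projected
# juvenile sea lampreys produced (numbers are illustrative only).
streams = [
    {"name": "Brule",    "cost": 250_000, "juveniles": 60_000},
    {"name": "Ford",     "cost": 90_000,  "juveniles": 30_000},
    {"name": "Misery",   "cost": 120_000, "juveniles": 10_000},
    {"name": "Sturgeon", "cost": 400_000, "juveniles": 50_000},
]

# Rank by marginal cost per juvenile killed (cheapest control first).
ranked = sorted(streams, key=lambda s: s["cost"] / s["juveniles"])

def select_within_budget(ranked, budget):
    """Greedily pick treatable streams in rank order until funds run out."""
    chosen, spent = [], 0
    for s in ranked:
        if spent + s["cost"] <= budget:
            chosen.append(s["name"])
            spent += s["cost"]
    return chosen, spent

print(select_within_budget(ranked, 500_000))
```

Sorting by marginal cost makes the diminishing-returns pattern explicit: each additional dollar of budget buys fewer juveniles killed as the ranking descends.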

  13. Empirical model of equatorial electrojet based on ground-based magnetometer data during solar minimum in fall

    NASA Astrophysics Data System (ADS)

    Hamid, Nurul Shazana Abdul; Liu, Huixin; Uozumi, Teiji; Yoshikawa, Akimasa

    2015-12-01

    In this study, we constructed an empirical model of the equatorial electrojet (EEJ), including local time and longitudinal dependence, based on simultaneous data from 12 magnetometer stations located in six longitude sectors. The analysis was carried out using the equatorial electrojet index, EUEL, calculated from the geomagnetic northward H component. The magnetic EEJ strength is calculated as the difference between the normalized EUEL index of the magnetic dip equator station and the normalized EUEL index of an off-dip equator station located beyond the EEJ band. The analysis showed that this current is always strongest in the South American sector, regardless of local time (LT), and weakest in the Indian sector between 0900 and 1000 LT, shifting to the African sector between 1100 and 1400 LT. These longitudinal variations of the EEJ roughly follow variations of the inverse of the main field strength along the dip equator, except in the Indian and Southeast Asian sectors. The results showed that the EEJ component derived from the model exhibits a pattern similar to the EEJ measured from ground data during noontime, mainly before 1300 LT.
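The station-differencing step described above can be sketched as follows. The baseline-removal normalization used here is an assumption for illustration; the actual normalization applied to the EUEL index may differ, and all index values are hypothetical:

```python
def normalize(euel, night_baseline):
    """Remove an assumed quiet nighttime baseline from an EUEL series
    (the real EUEL normalization scheme may differ)."""
    return [v - night_baseline for v in euel]

def eej_strength(euel_dip, euel_off, base_dip, base_off):
    """EEJ strength = normalized EUEL at the dip-equator station minus
    normalized EUEL at an off-dip station outside the EEJ band."""
    dip = normalize(euel_dip, base_dip)
    off = normalize(euel_off, base_off)
    return [d - o for d, o in zip(dip, off)]

# Hypothetical hourly H-component index values around local noon (nT):
dip_station = [20.0, 60.0, 100.0, 80.0]
off_station = [10.0, 20.0, 25.0, 20.0]
print(eej_strength(dip_station, off_station, 20.0, 10.0))
```

Subtracting the off-dip station removes the global ring-current and magnetospheric contribution, isolating the ionospheric electrojet signal.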

  14. Market-Based Multirobot Coordination: A Comprehensive Survey and Analysis

    DTIC Science & Technology

    2005-12-01

    Market-Based Multirobot Coordination: A Comprehensive Survey and Analysis. Nidhi Kalra, Robert Zlot, M. Bernardine Dias, Anthony Stentz. CMU-RI-TR-05-16. … This paper meets this need in three ways. First, it provides a tutorial on market-based approaches by discussing the motivating philosophy

  15. Survey-based Indices for Nursing Home Quality Incentive Reimbursement

    PubMed Central

    Willemain, Thomas R.

    1983-01-01

    Incentive payments are a theoretically appealing complement to nursing home quality assurance systems that rely on regulatory enforcement. However, the practical aspects of incentive program design are not yet well understood. After reviewing the rationale for incentive approaches and recent State and Federal initiatives, the article considers a basic program design issue: creating an index of nursing home quality. It focuses on indices constructed from routine licensure and certification survey results, because State initiatives have relied heavily on these readily accessible data. It also suggests a procedure for creating a survey-based index and discusses a sampling of implementation issues. PMID:10309858

  16. Systematic mapping study of data mining-based empirical studies in cardiology.

    PubMed

    Kadi, Ilham; Idri, Ali; Fernandez-Aleman, José Luis

    2017-07-01

    Data mining provides the methodology and technology to transform huge amounts of data into useful information for decision making. It is a powerful process for extracting knowledge and discovering new patterns embedded in large data sets. Data mining has been increasingly used in medicine, particularly in cardiology. In fact, data mining applications can greatly benefit all parties involved in cardiology, such as patients, cardiologists and nurses. This article aims to perform a systematic mapping study so as to analyze and synthesize empirical studies on the application of data mining techniques in cardiology. A total of 142 articles published between 2000 and 2015 were therefore selected, studied and analyzed according to the four following criteria: year and channel of publication, research type, medical task and empirical type. The results of this mapping study are discussed and a list of recommendations for researchers and cardiologists is provided.

  17. The processing of rotor startup signals based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Gai, Guanghong

    2006-01-01

    In this paper, we applied the empirical mode decomposition method to analyse rotor startup signals, which are non-stationary and contain a great deal of information beyond that available from stationary running signals. The methodology developed in this paper decomposes the original startup signals into intrinsic oscillation modes, or intrinsic mode functions (IMFs). We then obtained the rotating-frequency components for Bode diagram plots from the corresponding IMFs, according to the characteristics of the rotor system. The method can obtain a precise critical speed without complex hardware support. The low-frequency components were extracted from these IMFs in the vertical and horizontal directions. Utilising these components, we constructed a drift locus of the rotor revolution centre, which provides significant information for fault diagnosis of rotating machinery. We also showed that the empirical mode decomposition method is more precise than a Fourier filter for extracting low-frequency components.
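The sifting procedure at the heart of empirical mode decomposition can be sketched compactly. This is a simplified illustration, not the authors' implementation: real EMD codes use cubic-spline envelopes and more careful stopping criteria, whereas this sketch uses linear interpolation to stay dependency-free beyond numpy:

```python
import numpy as np

def sift(x, n_imfs=2, max_iter=50, tol=1e-3):
    """Minimal EMD sketch: repeatedly subtract the mean of the upper and
    lower extrema envelopes until a near-symmetric oscillation (IMF) remains,
    then peel it off the residue."""
    t = np.arange(len(x))
    residue = x.astype(float).copy()
    imfs = []
    for _ in range(n_imfs):
        h = residue.copy()
        for _ in range(max_iter):
            maxima = [i for i in range(1, len(h) - 1)
                      if h[i] >= h[i - 1] and h[i] >= h[i + 1]]
            minima = [i for i in range(1, len(h) - 1)
                      if h[i] <= h[i - 1] and h[i] <= h[i + 1]]
            if len(maxima) < 2 or len(minima) < 2:
                break  # not enough extrema to build envelopes
            env_hi = np.interp(t, maxima, h[maxima])
            env_lo = np.interp(t, minima, h[minima])
            mean_env = (env_hi + env_lo) / 2.0
            h = h - mean_env
            if np.sum(mean_env ** 2) / (np.sum(h ** 2) + 1e-12) < tol:
                break  # envelope mean is negligible: accept as IMF
        imfs.append(h)
        residue = residue - h
    return imfs, residue

# Two superposed tones: the first IMF should track the faster oscillation,
# leaving the slow component in the later IMF / residue.
t = np.arange(512)
x = np.sin(2 * np.pi * t / 16.0) + 0.5 * np.sin(2 * np.pi * t / 128.0)
imfs, residue = sift(x)
```

By construction the decomposition is complete: the IMFs plus the residue reconstruct the original signal, which is what makes EMD usable as an adaptive filter bank for the low-frequency extraction described above.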

  18. Coalitional affiliation as a missing link between ethnic polarization and well-being: An empirical test from the European Social Survey.

    PubMed

    Firat, Rengin B; Boyer, Pascal

    2015-09-01

    Many studies converge in suggesting (a) that ethnic and racial minorities fare worse than host populations in reported well-being and objective measures of health and (b) that ethnic/racial diversity has a negative impact on various measures of social trust and well-being, including in the host or majority population. However, there is much uncertainty about the processes that connect diversity variables with personal outcomes. In this paper, we are particularly interested in different levels of coalitional affiliation, which refers to people's social allegiances that guide their expectations of social support, in-group strength and cohesion. We operationalize coalitional affiliation as the extent to which people rely on a homogeneous social network, and we measure it with indicators of friendships across ethnic boundaries and frequency of contact with friends. Using multi-level models and data from the European Social Survey (Round 1, 2002-2003) for 19 countries, we demonstrate that coalitional affiliation provides an empirically reliable, as well as theoretically coherent, explanation for various effects of ethnic/racial diversity.

  19. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationships among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility mediates the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning, and exploitative learning plays a significant role in both radical and incremental business model innovation.

  20. Empirically Based Leadership: Integrating the Science of Psychology in Building a Better Leadership Model

    DTIC Science & Technology

    2013-02-01

    and Olga Epitropaki, "Transformational Leadership and Moral Reasoning," Journal of Applied Psychology 87(2) (2002): 304-11. 9. Daniel Goleman… beyond the scope of this paper, so the discussion in this section will primarily focus on the key areas of interest within the empirical literature on… in recent years has been the focus of considerable attention in relationship to leadership efficacy. Emotional intelligence involves an awareness

  1. A Reliability Test of a Complex System Based on Empirical Likelihood

    PubMed Central

    Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic. We can therefore obtain a confidence interval for the reliability and make statistical inferences. Simulation studies also demonstrate the theoretical results. PMID:27760130
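The minimal-path description of system reliability used above can be illustrated with a small Monte Carlo sketch: the system works on a trial if every component of at least one minimal path works. The component names, reliabilities, and path structure below are hypothetical:

```python
import random

def system_reliability(min_paths, comp_rel, n_trials=200_000, seed=1):
    """Monte Carlo estimate of system reliability: the system is up
    when some minimal path has all of its components up."""
    random.seed(seed)
    comps = sorted({c for path in min_paths for c in path})
    ok = 0
    for _ in range(n_trials):
        up = {c: random.random() < comp_rel[c] for c in comps}
        if any(all(up[c] for c in path) for path in min_paths):
            ok += 1
    return ok / n_trials

# Hypothetical system: a two-component series path in parallel with a
# single-component path. Exact reliability: 1 - (1 - 0.9*0.9)(1 - 0.9) = 0.981.
paths = [("a", "b"), ("c",)]
rel = {"a": 0.9, "b": 0.9, "c": 0.9}
print(system_reliability(paths, rel))
```

The empirical likelihood method in the paper addresses the harder setting where the component reliabilities themselves are unknown and must be inferred from subsystem data.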

  2. Epicentral Location of Regional Seismic Events Based on Empirical Green’s Functions from Ambient Noise

    DTIC Science & Technology

    2010-09-01

    located and characterized by the University of Utah Seismic Stations (UUSS) and by the Department of Earth and Atmospheric Sciences at Saint Louis… of sources at different depths; e.g., earthquakes within Earth's crust, volcanic explosions, meteoritic impacts, explosions, mine collapses, or… not require knowledge of Earth structure. ● It works for weak events where the detection of body wave phases may be problematic. ● The empirical

  3. Survey on Existing Science Gateways - Based on DEGREE

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Claus, Steffen

    2010-05-01

    Science Gateways gather community-developed specific tools, applications, and data that is usually combined and presented in a graphical user interface which is customized to the needs of the target user community. Science Gateways serve as a single point of entry for the users and are usually represented by fat clients or web portals. Part of the DEGREE project (Dissemination and Exploitation of Grids in Earth Science) was a state-of-the-art survey of portal usage in Earth Science (ES) applications. This survey considered a list of 29 portals, including 17 ES portals and 12 generic developments coming from outside of the ES domain. The survey identified three common usage types of ES portals, including data dissemination (e.g. observational data), collaboration as well as usage of Grid-based resources (e.g. for processing of ES datasets). Based on these three usage types, key requirements could be extracted. These requirements were furthermore used for a feature comparison with existing portal developments coming from outside of the ES domain. This presentation gives an overview of the results of the survey (including a feature comparison of ES and non-ES portals). Furthermore, three portals are discussed in detail, one for each usage type (data dissemination, collaboration, Grid-based).

  4. I-SOLV: a new surface-based empirical model for computing solvation free energies.

    PubMed

    Wang, Renxiao; Lin, Fu; Xu, Yong; Cheng, Tiejun

    2007-07-01

    We have developed a new empirical model, I-SOLV, for computing solvation free energies of neutral organic molecules. It computes the solvation free energy of a solute molecule by summing up the contributions from its component atoms. The contribution from a certain atom is determined by the solvent-accessible surface area as well as the surface tension of this atom. A total of 49 atom types are implemented in our model for classifying C, N, O, S, P, F, Cl, Br and I in common organic molecules. Their surface tensions are parameterized by using a data set of 532 neutral organic molecules with experimentally measured solvation free energies. A head-to-head comparison of our model with several other solvation models was performed on a test set of 82 molecules. Our model outperformed other solvation models, including widely used PB/SA and GB/SA models, with a mean unsigned error as low as 0.39 kcal/mol. Our study has demonstrated again that well-developed empirical solvation models are not necessarily less accurate than more sophisticated theoretical models. Empirical models may serve as appealing alternatives due to their simplicity and accuracy.
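The additive structure of the model above (solvation free energy as a sum of per-atom SASA times surface-tension contributions) can be sketched directly. The atom types and surface-tension values below are invented for illustration; the real I-SOLV model parameterizes 49 atom types against 532 measured molecules:

```python
# Hypothetical per-atom-type surface tensions in kcal/(mol*A^2);
# these numbers are illustrative, NOT the fitted I-SOLV parameters.
SURFACE_TENSION = {
    "C.aromatic": 0.012,
    "C.aliphatic": 0.008,
    "O.hydroxyl": -0.051,
    "N.amine": -0.043,
}

def solvation_free_energy(atoms):
    """Sum each atom's solvent-accessible surface area (A^2) weighted by
    the surface tension of its atom type."""
    return sum(sasa * SURFACE_TENSION[atype] for atype, sasa in atoms)

# Toy 'molecule' as (atom type, SASA in A^2) pairs:
phenol_like = [("C.aromatic", 120.0), ("O.hydroxyl", 25.0)]
print(f"{solvation_free_energy(phenol_like):.3f} kcal/mol")
```

In the real model the SASA values come from a geometric surface calculation and the surface tensions from least-squares fitting to experimental solvation free energies.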

  5. Empirical model of the thermospheric mass density based on CHAMP satellite observations

    NASA Astrophysics Data System (ADS)

    Liu, Huixin; Hirano, Takashi; Watanabe, Shigeto

    2013-02-01

    The decadal observations from CHAMP satellite have provided ample information on the Earth's upper thermosphere, reshaping our understandings of the vertical coupling in the atmosphere and near-Earth space. An empirical model of the thermospheric mass density is constructed from these high-resolution observations using the multivariable least-squares fitting method. It describes the density variation with latitude, longitude, height, local time, season, and solar and geomagnetic activity levels within the altitude range of 350-420 km. It represents well prominent thermosphere structures like the equatorial mass density anomaly (EMA) and the wave-4 longitudinal pattern. Furthermore, the empirical model reveals two distinct features. First, the EMA is found to have a clear altitude dependence, with its crests moving equatorward with increasing altitude. Second, the equinoctial asymmetry is found to strongly depend on solar cycle, with its magnitude and phase being strongly regulated by solar activity levels. The equinoctial density maxima occur significantly after the actual equinox dates toward solar minimum, which may signal growing influence from the lower atmosphere forcing. This empirical model provides an instructive tool in exploring thermospheric density structures and dynamics. It can also be easily incorporated into other models to have a more accurate description of the background thermosphere, for both scientific and practical purposes.
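The multivariable least-squares fitting mentioned above can be sketched on synthetic data. The basis functions and coefficients here are hypothetical stand-ins for the model's actual parameterization of latitude, local time, season, and activity levels:

```python
import numpy as np

rng = np.random.default_rng(0)
lt = rng.uniform(0.0, 24.0, 500)      # local time, hours
lat = rng.uniform(-90.0, 90.0, 500)   # latitude, degrees

# Hypothetical "truth": mean level + diurnal harmonic + a latitude term.
true_coef = np.array([4.0, 1.2, -0.5, 0.3])
X = np.column_stack([
    np.ones_like(lt),                  # mean density level
    np.cos(2 * np.pi * lt / 24.0),     # diurnal cosine term
    np.sin(2 * np.pi * lt / 24.0),     # diurnal sine term
    np.cos(np.radians(lat)),           # simple latitude dependence
])
density = X @ true_coef + rng.normal(0.0, 0.01, lt.size)  # small noise

# Multivariable least squares recovers the coefficients.
coef, *_ = np.linalg.lstsq(X, density, rcond=None)
print(coef)
```

The real model adds height, longitude, seasonal, and solar/geomagnetic activity terms to the design matrix, but the fitting machinery is the same.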

  6. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
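A design-based estimate of the kind discussed above, for mean mollusk density under simple random sampling of quadrats with a finite-population correction, can be sketched as follows (the counts and sampling frame are hypothetical):

```python
import math
import statistics

def srs_density_estimate(counts, quadrat_area_m2, n_population_units):
    """Design-based mean density and standard error under simple random
    sampling of quadrats, with finite-population correction."""
    densities = [c / quadrat_area_m2 for c in counts]
    n = len(densities)
    mean = statistics.mean(densities)
    fpc = 1.0 - n / n_population_units          # finite-population correction
    se = math.sqrt(statistics.variance(densities) / n * fpc)
    return mean, se

# Hypothetical mussel counts from 5 of 100 possible 0.25 m^2 quadrats:
mean, se = srs_density_estimate([0, 2, 1, 3, 0], 0.25, 100)
print(f"{mean:.1f} per m^2 +/- {se:.2f}")
```

For sparsely distributed populations, the article's point is that this design-based SRS estimator can be much less precise than adaptive cluster sampling or a model-based alternative, even though it requires no distributional assumptions.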

  7. Child fluorosis in Chhattisgarh, India: a community-based survey.

    PubMed

    Vilasrao, Gitte Sunil; Kamble, K M; Sabat, Ramanath N

    2014-11-01

    To assess the prevalence and type of fluorosis among children from randomly selected villages of Chhattisgarh, and its relationship with fluoride levels in drinking water, a community-based door-to-door survey was conducted in the sampled villages of seven districts of Chhattisgarh state during 2013-14. Field case definitions were used for labelling the types of fluorosis. The fluoride concentration in drinking water was estimated by the ion-selective electrode method. The prevalence of fluorosis ranged between 12 and 44% among children in the surveyed districts. The fluoride levels in drinking water of the selected villages were in the range of 0.1-9.0 ppm. Dental and skeletal fluorosis is endemic among children in the surveyed districts of Chhattisgarh State, and is related to drinking water with a fluoride content of ≥1.5 ppm.

  8. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools. That is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may facilitate the possibility of modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical data-based software accuracy test suite.

  9. Smartphone-Based, Self-Administered Intervention System for Alcohol Use Disorders: Theory and Empirical Evidence Basis

    PubMed Central

    Dulin, Patrick L.; Gonzalez, Vivian M.; King, Diane K.; Giroux, Danielle; Bacon, Samantha

    2013-01-01

    Advances in mobile technology provide an opportunity to deliver in-the-moment interventions to individuals with alcohol use disorders, yet availability of effective “apps” that deliver evidence-based interventions is scarce. We developed an immediately available, portable, smartphone-based intervention system whose purpose is to provide stand-alone, self-administered assessment and intervention. In this paper, we describe how theory and empirical evidence, combined with smartphone functionality contributed to the construction of a user-friendly, engaging alcohol intervention. With translation in mind, we discuss how we selected appropriate intervention components including assessments, feedback and tools, that work together to produce the hypothesized outcomes. PMID:24347811

  10. Does community-based conservation shape favorable attitudes among locals? an empirical study from nepal.

    PubMed

    Mehta, J N; Heinen, J T

    2001-08-01

    Like many developing countries, Nepal has adopted a community-based conservation (CBC) approach in recent years to manage its protected areas mainly in response to poor park-people relations. Among other things, under this approach the government has created new "people-oriented" conservation areas, formed and devolved legal authority to grassroots-level institutions to manage local resources, fostered infrastructure development, promoted tourism, and provided income-generating trainings to local people. Of interest to policy-makers and resource managers in Nepal and worldwide is whether this approach to conservation leads to improved attitudes on the part of local people. It is also important to know if personal costs and benefits associated with various intervention programs, and socioeconomic and demographic characteristics influence these attitudes. We explore these questions by looking at the experiences in Annapurna and Makalu-Barun Conservation Areas, Nepal, which have largely adopted a CBC approach in policy formulation, planning, and management. The research was conducted during 1996 and 1997; the data collection methods included random household questionnaire surveys, informal interviews, and review of official records and published literature. The results indicated that the majority of local people held favorable attitudes toward these conservation areas. Logistic regression results revealed that participation in training, benefit from tourism, wildlife depredation issue, ethnicity, gender, and education level were the significant predictors of local attitudes in one or the other conservation area. We conclude that the CBC approach has potential to shape favorable local attitudes and that these attitudes will be mediated by some personal attributes.

  11. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve human expertise in hydrological forecasting, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where considerable human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches ensure good post-processing of the hydrological ensemble, allowing a clear improvement in the reliability, skill and sharpness of the ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
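The "empirical approach" of stratifying archived perfect-forecast errors by streamflow quantile class, then dressing a new forecast with the errors of its class, can be sketched as follows. This simplified illustration omits the lead-time stratification and uses equal-count breakpoints; all forecast/observation values are hypothetical:

```python
def fit_error_classes(fcsts, obs, n_classes=3):
    """Archive perfect-forecast errors (obs - fcst) and group them by a
    streamflow quantile class of the forecast value."""
    errs = [o - f for f, o in zip(fcsts, obs)]
    order = sorted(fcsts)
    # class breakpoints at equal-count quantiles of the archived forecasts
    cuts = [order[len(order) * k // n_classes] for k in range(1, n_classes)]
    classes = [[] for _ in range(n_classes)]
    for f, e in zip(fcsts, errs):
        k = sum(f >= c for c in cuts)
        classes[k].append(e)
    return cuts, classes

def dress(fcst, cuts, classes):
    """Dress a deterministic forecast with the archived errors of its class,
    producing an uncertainty ensemble around it."""
    k = sum(fcst >= c for c in cuts)
    return [fcst + e for e in classes[k]]

# Hypothetical archive of perfect forecasts and matching observations:
fcsts = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
obs   = [2.0, 2.0, 4.0, 5.0, 5.0, 7.0, 8.0, 8.0, 10.0]
cuts, classes = fit_error_classes(fcsts, obs)
print(dress(5.0, cuts, classes))
```

The dynamical approach described in the abstract additionally conditions the error sub-samples on the recent streamflow variation, which lets the dressed spread react to rising or falling limbs of the hydrograph.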

  12. Space-based infrared near-Earth asteroid survey simulation

    NASA Astrophysics Data System (ADS)

    Tedesco, Edward F.; Muinonen, Karri; Price, Stephan D.

    2000-08-01

    We demonstrate the efficiency and effectiveness of using a satellite-based sensor with visual and infrared focal plane arrays to search for that subclass of Near-Earth Objects (NEOs) with orbits largely interior to the Earth's orbit. A space-based visual-infrared system could detect approximately 97% of the Atens and 64% of the IEOs (the, as yet hypothetical, objects with orbits entirely Interior to Earth's Orbit) with diameters greater than 1 km in a 5-year mission and obtain orbits, albedos and diameters for all of them; the respective percentages with diameters greater than 500 m are 90% and 60%. Incidental to the search for Atens and IEOs, we found that 70% of all Earth-Crossing Asteroids (ECAs) with diameters greater than 1 km, and 50% of those with diameters greater than 500 m, would also be detected. These are the results of a feasibility study; optimizing the concept presented would result in greater levels of completion. The cost of such a space-based system is estimated to be within a factor of two of the cost of a ground-based system capable of about 21st magnitude, which would provide only orbits and absolute magnitudes and require decades to reach these completeness levels. In addition to obtaining albedos and diameters for the asteroids discovered in the space-based survey, a space-based visual-infrared system would obtain the same information on virtually all NEOs of interest. A combined space-based and ground-based survey would be highly synergistic in that each can concentrate on what it does best and each complements the strengths of the other. The ground-based system would discover the majority of Amors and Apollos and provide long-term follow-up on all the NEOs discovered in both surveys. The space-based system would discover the majority of Atens and IEOs and provide albedos and diameters on all the NEOs discovered in both surveys and most previously discovered NEOs as well. Thus, an integrated ground- and space-based system could accomplish

  13. [DGRW-update: neurology--from empirical strategies towards evidence based interventions].

    PubMed

    Schupp, W

    2011-12-01

Stroke, multiple sclerosis (MS), traumatic brain injuries (TBI) and neuropathies are the most important diseases in neurological rehabilitation financed by the German Pension Insurance. The primary goal is vocational (re)integration. Driven by multiple findings of neuroscience research, the traditional holistic approach with mainly empirically derived strategies was developed further and improved by new evidence-based interventions. This process had been, and continues to be, necessary to meet the health-economic pressure for ever shorter and more efficient rehabilitation measures. Evidence-based interventions refer to symptom-oriented measures, to team-management concepts, as well as to education and psychosocial interventions. Drug therapy and/or neurophysiological measures can be added to increase neuroregeneration and neuroplasticity. Evidence-based aftercare concepts support the sustainability and steadiness of rehabilitation results. Mirror therapy, robot-assisted training, mental training, task-specific training, and above all constraint-induced movement therapy (CIMT) can restore motor arm and hand functions. Treadmill training and robot-assisted training improve stance and gait. Botulinum toxin injections in combination with physical and redressing methods are superior in managing spasticity. Guideline-oriented management of associated pain syndromes (myofascial, neuropathic, complex-regional/dystrophic) improves primary outcome and quality of life. Drug therapy with so-called co-analgesics and physical therapy play an important role in pain management. Swallowing disorders lead to higher mortality and morbidity in the acute phase; stepwise diagnostics (screening, endoscopy, radiology) and specific swallowing therapy can reduce these risks and frequently restore normal eating and drinking. In our modern industrial societies communicative and cognitive disturbances are more impairing than the above-mentioned disorders. Speech and language therapy (SLT) is dominant in

  14. Time-frequency analysis of neuronal populations with instantaneous resolution based on noise-assisted multivariate empirical mode decomposition.

    PubMed

    Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E

    2016-07-15

Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but neuronal responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, the temporal and frequency dynamics of neural populations under sensory stimulation have usually been analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven, template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information from neurophysiological data (deep vibrissal nerve and visual cortex multiunit recordings) that was not evident using linear approaches with fixed bases, such as Fourier analysis. Texture discrimination performance increased when NA-MEMD plus the Hilbert transform was implemented, compared to linear techniques, and cortical oscillatory population activity was analyzed with precise time-frequency resolution. NA-MEMD plus the Hilbert transform is thus an improved method for analyzing neuronal population oscillatory dynamics, overcoming the linearity and stationarity assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.
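As a rough, hypothetical illustration of the Hilbert step only (not the authors' NA-MEMD pipeline, which additionally requires a multivariate EMD implementation to extract the intrinsic mode functions): given one IMF, the instantaneous frequency is the derivative of the unwrapped phase of the analytic signal.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(imf, fs):
    """Instantaneous frequency (Hz) of one IMF via the Hilbert transform."""
    analytic = hilbert(imf)                     # analytic signal
    phase = np.unwrap(np.angle(analytic))       # unwrapped instantaneous phase
    return np.diff(phase) * fs / (2.0 * np.pi)  # phase derivative -> frequency

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
imf = np.sin(2 * np.pi * 40 * t)   # a pure 40 Hz oscillation standing in for an IMF
inst_f = instantaneous_frequency(imf, fs)
# away from the edges, the estimate should hover near 40 Hz
```

Edge samples are distorted by the transform's boundary effects, which is why robust statistics (median over the interior) are typically used when summarizing IF.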

  15. Development of a Web-Based Survey for Monitoring Daily Health and its Application in an Epidemiological Survey

    PubMed Central

    Ohkusa, Yasushi; Akahane, Manabu; Sano, Tomomi; Okabe, Nobuhiko; Imamura, Tomoaki

    2011-01-01

Background Early detection of symptoms arising from exposure to pathogens, harmful substances, or environmental changes is required for timely intervention. The administration of Web-based questionnaires is a potential method for collecting information from a sample population. Objective The objective of our study was to develop a Web-based daily questionnaire for health (WDQH) for symptomatic surveillance. Methods We adopted two different survey methods to develop the WDQH: an Internet panel survey, which included participants already registered with an Internet survey company, and the Tokyo Consumers’ Co-operative Union (TCCU) Internet survey, in cooperation with the Japanese Consumers’ Co-operative Union, which recruited participants by website advertising. The Internet panel survey participants were paid a fee every day for providing answers, and the survey was repeated twice with modified surveys and collection methods: Internet Panel Survey I was conducted every day, and Internet Panel Survey II was conducted every 3 days to reduce costs. We examined whether the survey remained valid when health conditions for day 1 were reported over a 3-day period, and whether the response rate would vary among groups with different incentives. In the TCCU survey, participants were paid a fee only for initially registering, and health information was provided in return for survey completion. The WDQH included the demographic details of participants and prompted them by email to answer questions about the presence of various symptoms. Health information collected by the WDQH was then used for the syndromic surveillance of infection. Results Response rates averaged 47.3% for Internet Panel Survey I, 42.7% for Internet Panel Survey II, and 40.1% for the TCCU survey. During a seasonal influenza epidemic, the WDQH detected a rapid increase in the number of participants with fever through the early aberration reporting system. Conclusions We developed a health observation method

  16. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set

    PubMed Central

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-01-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P=5.0 × 10−8, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were Psig=3.24 × 10−8 (AFR), 9.26 × 10−8 (EUR), 1.83 × 10−7 (AMR), 1.61 × 10−7 (EAS) and 9.46 × 10−8 (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded Psig=3.25 × 10−8 (ALL) and 4.20 × 10−8 (ΔAFR). Our results indicate that the current threshold (P=5.0 × 10−8) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples. PMID:27305981
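A toy, hypothetical sketch of the simulation idea behind such empirical thresholds (tiny sizes and simple correlation tests in place of genome-scale GWAS on 1000 Genomes data): repeatedly scan null phenotypes against a genotype matrix, record the minimum p-value of each scan, and read the family-wise 5% threshold off the 5th percentile of the minimum-p distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_ind, n_snp, n_rep = 200, 500, 200   # toy sizes; the paper works at genome scale

# random genotypes (0/1/2 allele counts), standardized per variant
geno = rng.binomial(2, 0.3, size=(n_ind, n_snp)).astype(float)
gz = (geno - geno.mean(axis=0)) / geno.std(axis=0)

min_p = np.empty(n_rep)
for i in range(n_rep):
    y = rng.normal(size=n_ind)                     # null phenotype: no true signal
    r = gz.T @ ((y - y.mean()) / y.std()) / n_ind  # per-SNP sample correlations
    t = r * np.sqrt((n_ind - 2) / (1.0 - r ** 2))  # correlation t-statistics
    p = 2.0 * stats.t.sf(np.abs(t), df=n_ind - 2)
    min_p[i] = p.min()                             # strongest hit of this null scan

p_sig = float(np.quantile(min_p, 0.05))  # empirical family-wise 5% threshold
```

With independent variants this lands near the Bonferroni value 0.05/500; the interesting regime in the paper arises because real variants are in linkage disequilibrium, so the effective number of tests (and hence the threshold) differs by population.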

  17. Measuring microscopic evolution processes of complex networks based on empirical data

    NASA Astrophysics Data System (ADS)

    Chi, Liping

    2015-04-01

Aiming at understanding the microscopic mechanisms of complex systems in the real world, we measure the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree remains almost unchanged over the whole time range. At each time step the external links attached to a new node number about c = 1.1, and the internal links added between existing nodes number approximately m = 8. The Scientific Collaboration data are the cumulative result of all authors from 1893 up to the considered year; there is no deletion of nodes or links (r = q = 0), and the external and internal links at each time step are c = 1.04 and m = 0, respectively. The degree-distribution exponents (p(k) ∼ k^(−γ)) measured for these two empirical datasets, γ_data, are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide insight into capturing the microscopic dynamical processes that govern the network topology.
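A minimal, hypothetical sketch of the per-step bookkeeping on toy sets (real AS or collaboration snapshots would be parsed from data files, and the rates r, q, c, m averaged over many steps): compare two consecutive snapshots and split the added links into external (touching a new node) and internal (between pre-existing nodes).

```python
# Toy snapshots of a growing network; each snapshot is (set of nodes, set of edges).
def evolution_step(nodes_t, edges_t, nodes_t1, edges_t1):
    """Decompose one snapshot-to-snapshot transition into the quantities
    discussed above: node/link deletions and external vs internal link additions."""
    new_nodes = nodes_t1 - nodes_t
    added = edges_t1 - edges_t
    external = {e for e in added if e[0] in new_nodes or e[1] in new_nodes}
    return {
        "nodes_added": len(new_nodes),
        "nodes_deleted": len(nodes_t - nodes_t1),
        "links_deleted": len(edges_t - edges_t1),
        "links_external": len(external),          # links attached to new nodes (c)
        "links_internal": len(added - external),  # links between existing nodes (m)
    }

n0, e0 = {1, 2, 3}, {(1, 2), (2, 3)}
n1, e1 = {1, 2, 3, 4}, {(1, 2), (2, 3), (3, 4), (1, 3)}  # node 4 joins
step = evolution_step(n0, e0, n1, e1)  # (3,4) is external, (1,3) internal
```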

  19. 1946 Dominican Republic Tsunami: Field Survey based on Eyewitness Interviews

    NASA Astrophysics Data System (ADS)

    Fritz, Hermann M.; Martinez, Claudio; Salado, Juan; Rivera, Wagner; Duarte, Leoncio

    2017-04-01

On 4 August 1946 an Mw 8.1 earthquake struck off the north-eastern shore of Hispaniola Island, resulting in a destructive tsunami with on the order of one hundred fatalities in the Dominican Republic and observed runup in Puerto Rico. In the far field, tsunami waves were recorded on some tide gauges on the Atlantic coast of the United States of America. The earthquake devastated the Dominican Republic, extended into Haiti, and shook many other islands; it was one of the strongest earthquakes reported in the Caribbean since colonial times. The immediate earthquake reconnaissance surveys focused on earthquake damage and were conducted in September 1946 (Lynch and Bodle, 1948; Small, 1948). The 1946 Dominican Republic tsunami eyewitness-based field survey took place in three phases: 18 to 21 March 2014, 1 to 3 September 2014, and 9 to 11 May 2016. The International Tsunami Survey Team (ITST) covered more than 400 km of coastline along the northern Dominican Republic, from the easternmost tip at Punta Cana to La Isabela, some 70 km from the border with Haiti. The survey team documented tsunami runup, flow depth, inundation distances, sea-level drawdown, coastal erosion and co-seismic land-level changes based on eyewitnesses interviewed on site using established protocols. The early-afternoon earthquake resulted in detailed survival stories, with excellent eyewitness observations recounted almost 70 years later with lucidity. The Dominican Republic survey data include 29 runup and tsunami height measurements at 21 locations. The tsunami impacts peaked with maximum tsunami heights exceeding 5 m at a cluster of locations between Cabrera and El Limon. A maximum tsunami height of 8 m, likely associated with splash-up, was measured at Playa Boca Nueva. Tsunami inundation distances of 600 m or more were measured at Las Terrenas and Playa Rincon on the Samana Peninsula. Some locations were surveyed twice, in 2014 and 2016, which made it possible to estimate current coastal erosion rates.

  20. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

There has been increasing interest in the use of web-based surveys, rather than paper-based surveys, for collecting data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection, especially when computer-assisted…

  2. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey

    PubMed Central

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L.; Bryant, Richard A.; Li, Meng; Qian, Meng; Brown, Adam D.

    2015-01-01

Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient Health Questionnaire-9 (PHQ-9). These findings revealed that among the human rights advocates who completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work. PMID:26700305

  4. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies of DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP Server with GPAC, an MP4Client (GPAC) with an open HEVC-based DASH client, and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions, such as packet loss, bandwidth and delay, on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.
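The client-side switching described above can be illustrated with the classic throughput-based adaptation rule. This is a generic, hypothetical sketch (the paper's GPAC client has its own heuristics): pick the highest representation whose bitrate fits within a safety margin of the measured throughput.

```python
def select_representation(bitrates_bps, measured_throughput_bps, safety=0.8):
    """Throughput-based DASH adaptation: choose the highest bitrate that
    fits within a safety margin of the measured network throughput."""
    feasible = [b for b in sorted(bitrates_bps)
                if b <= safety * measured_throughput_bps]
    # fall back to the lowest representation when nothing fits
    return feasible[-1] if feasible else min(bitrates_bps)

ladder = [500_000, 1_000_000, 2_500_000, 5_000_000]   # hypothetical bitrate ladder
choice_good = select_representation(ladder, 4_000_000)  # ample bandwidth
choice_bad = select_representation(ladder, 400_000)     # congested link
```

The `safety` margin absorbs throughput-estimation noise; production players additionally smooth the estimate and consider buffer occupancy to avoid oscillating between representations.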

  5. Comparisons of ground motions from the 1999 Chi-Chi, earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Boore, D.M.

    2001-01-01

This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted from the empirically based equations for periods less than about 1 sec, by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions at longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: long-duration wave trains are present in the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those of motions from eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast, and the sediments underlying the eastern stations are probably shallower and have higher velocity than those under the western stations.

  6. HIV testing in national population-based surveys: experience from the Demographic and Health Surveys.

    PubMed Central

    Mishra, Vinod; Vaessen, Martin; Boerma, J. Ties; Arnold, Fred; Way, Ann; Barrere, Bernard; Cross, Anne; Hong, Rathavuth; Sangha, Jasbir

    2006-01-01

    OBJECTIVES: To describe the methods used in the Demographic and Health Surveys (DHS) to collect nationally representative data on the prevalence of human immunodeficiency virus (HIV) and assess the value of such data to country HIV surveillance systems. METHODS: During 2001-04, national samples of adult women and men in Burkina Faso, Cameroon, Dominican Republic, Ghana, Mali, Kenya, United Republic of Tanzania and Zambia were tested for HIV. Dried blood spot samples were collected for HIV testing, following internationally accepted ethical standards. The results for each country are presented by age, sex, and urban versus rural residence. To estimate the effects of non-response, HIV prevalence among non-responding males and females was predicted using multivariate statistical models for those who were tested, with a common set of predictor variables. RESULTS: Rates of HIV testing varied from 70% among Kenyan men to 92% among women in Burkina Faso and Cameroon. Despite large differences in HIV prevalence between the surveys (1-16%), fairly consistent patterns of HIV infection were observed by age, sex and urban versus rural residence, with considerably higher rates in urban areas and in women, especially at younger ages. Analysis of non-response bias indicates that although predicted HIV prevalence tended to be higher in non-tested males and females than in those tested, the overall effects of non-response on the observed national estimates of HIV prevalence are insignificant. CONCLUSIONS: Population-based surveys can provide reliable, direct estimates of national and regional HIV seroprevalence among men and women irrespective of pregnancy status. Survey data greatly enhance surveillance systems and the accuracy of national estimates in generalized epidemics. PMID:16878227
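The non-response adjustment described above can be sketched in miniature. This is a hypothetical toy (synthetic data, a plain maximum-likelihood logistic fit standing in for the surveys' multivariate models): fit the model on tested individuals, predict prevalence for the non-tested, and combine.

```python
import numpy as np
from scipy.optimize import minimize

def fit_logistic(X, y):
    """Plain maximum-likelihood logistic regression (intercept + slopes)."""
    Xd = np.column_stack([np.ones(len(X)), X])
    def nll(beta):
        z = Xd @ beta
        return np.sum(np.logaddexp(0.0, z) - y * z)  # negative log-likelihood
    return minimize(nll, np.zeros(Xd.shape[1]), method="BFGS").x

def predict_prob(beta, X):
    Xd = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-(Xd @ beta)))

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))                       # stand-ins for age, residence, etc.
p_true = 1 / (1 + np.exp(-(-2.0 + 0.8 * X[:, 0])))
status = rng.binomial(1, p_true)                  # hypothetical serostatus
tested = rng.random(n) < 0.8                      # ~80% agree to testing

beta = fit_logistic(X[tested], status[tested])
pred = predict_prob(beta, X[~tested])             # predicted prevalence, non-tested
adjusted = np.concatenate([status[tested], pred]).mean()
```

In this toy, testing is independent of status, so the adjusted estimate barely moves; the surveys' finding was precisely that the real-world adjustment was similarly small.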

  7. Empirical rainfall thresholds and copula based IDF curves for shallow landslides and flash floods

    NASA Astrophysics Data System (ADS)

    Bezak, Nejc; Šraj, Mojca; Brilly, Mitja; Mikoš, Matjaž

    2015-04-01

Large mass movements, such as deep-seated landslides or large debris flows, and flash floods can endanger human lives and cause huge environmental and economic damage in hazard areas. The main objective of the study was to investigate the characteristics of selected extreme rainfall events that triggered landslides and caused flash floods in Slovenia in the last 25 years. Seven extreme events that occurred in Slovenia (Europe) in the period 1990-2014, causing 17 casualties and about 500 million euros of economic loss, were analysed in this study. Post-event analyses showed that the rainfall characteristics triggering flash floods differ from those triggering landslides: landslides were triggered by longer-duration rainfall events (up to one or a few weeks), whereas flash floods were triggered by short-duration events (a few hours to one or two days). The sensitivity analysis results indicate that the inter-event time variable, defined as the minimum duration of the period without rain between two consecutive rainfall events, and the sample definition methodology can have a significant influence on the position of rainfall events in the intensity-duration space, on the constructed intensity-duration-frequency (IDF) curves, and on the relationship between the empirical rainfall threshold curves and the IDF curves constructed using the copula approach. Empirical rainfall threshold curves (ID curves) were also evaluated for the selected extreme events. The results indicate that a combination of several empirical rainfall thresholds with an appropriately dense rainfall measuring network can be used as part of an early warning system for the initiation of landslides and debris flows. However, different rainfall threshold curves should be used for lowland and mountainous areas in Slovenia. Furthermore, the intensity-duration-frequency (IDF) relationship was constructed using Frank copula functions for 16 pluviographic meteorological stations in Slovenia using the high resolution
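Empirical ID threshold curves of the kind mentioned above are commonly power laws I = α·D^β fitted as a lower envelope of triggering events. A hypothetical sketch on made-up data (the study's events and fitting procedure are not reproduced here): least-squares in log-log space, then shift the intercept down so every event lies on or above the curve.

```python
import numpy as np

def id_threshold(durations_h, intensities_mmh):
    """Fit a lower-envelope power-law threshold I = alpha * D**beta to
    triggering rainfall events: log-log least squares, intercept shifted
    down to the lowest event so all events sit on or above the curve."""
    logD = np.log10(np.asarray(durations_h, float))
    logI = np.log10(np.asarray(intensities_mmh, float))
    beta, b0 = np.polyfit(logD, logI, 1)          # slope, intercept
    b0 += (logI - (beta * logD + b0)).min()       # drop to the lowest event
    return 10.0 ** b0, beta

D = np.array([1, 2, 6, 12, 24, 48, 96])          # event durations (h), hypothetical
I = np.array([40, 28, 15, 10, 7, 4.5, 3])        # mean intensities (mm/h), hypothetical
alpha, beta = id_threshold(D, I)
predicted = alpha * D ** beta   # threshold curve: at or below every event
```

Operational thresholds usually use a low quantile of the residuals instead of the strict minimum, trading a few missed events for fewer false alarms.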

  8. Measurement of COPD Severity Using a Survey-Based Score

    PubMed Central

    Omachi, Theodore A.; Katz, Patricia P.; Yelin, Edward H.; Iribarren, Carlos; Blanc, Paul D.

    2010-01-01

Background: A comprehensive survey-based COPD severity score is useful for epidemiologic and health outcomes research. We previously developed and validated the survey-based COPD Severity Score without using lung function or other physiologic measurements. In this study, we aimed to further validate the severity score in a different COPD cohort using a combination of patient-reported and objective physiologic measurements. Methods: Using data from the Function, Living, Outcomes, and Work cohort study of COPD, we evaluated the concurrent and predictive validity of the COPD Severity Score among 1,202 subjects. The survey instrument is a 35-point score based on symptoms, medication and oxygen use, and prior hospitalization or intubation for COPD. Subjects were systematically assessed using a structured telephone survey, spirometry, and 6-min walk testing. Results: We found evidence to support the concurrent validity of the score. Higher COPD Severity Score values were associated with poorer FEV1 (r = −0.38), FEV1% predicted (r = −0.40), Body mass, Obstruction, Dyspnea, Exercise Index (r = 0.57), and distance walked in 6 min (r = −0.43) (P < .0001 in all cases). Greater COPD severity was also related to poorer generic physical health status (r = −0.49) and disease-specific health-related quality of life (r = 0.57) (P < .0001). The score also demonstrated predictive validity: it was associated with a greater prospective risk of acute exacerbation of COPD, defined as ED visits (hazard ratio [HR], 1.31; 95% CI, 1.24-1.39), hospitalizations (HR, 1.59; 95% CI, 1.44-1.75), or either measure of hospital-based care for COPD (HR, 1.34; 95% CI, 1.26-1.41) (P < .0001 in all cases). Conclusion: The COPD Severity Score is a valid survey-based measure of disease-specific severity, in terms of both concurrent and predictive validity. The score is a psychometrically sound instrument for use in epidemiologic and outcomes research in COPD. PMID:20040611
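The concurrent-validity correlations quoted above (e.g. r = −0.40 against FEV1% predicted) are plain Pearson coefficients between the survey score and a physiologic measure. A hypothetical sketch on synthetic data (not the FLOW cohort):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
severity = rng.uniform(0, 35, 300)        # hypothetical 35-point survey scores
# synthetic FEV1 % predicted, declining with severity plus noise
fev1_pct = 90 - 1.2 * severity + rng.normal(0, 10, 300)

r, p = stats.pearsonr(severity, fev1_pct)  # concurrent validity: expect r < 0
```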

  9. University-Based Evaluation Training Programs in the United States 1980-2008: An Empirical Examination

    ERIC Educational Resources Information Center

    LaVelle, John M.; Donaldson, Stewart I.

    2010-01-01

    Evaluation practice has grown in leaps and bounds in recent years. In contrast, the most recent survey data suggest that there has been a sharp decline in the number and strength of preservice evaluation training programs in the United States. In an effort to further understand this curious trend, an alternative methodology was used to examine the…

  11. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  13. Conceptual and Empirical Bases of Readability Formulas. Technical Report No. 392.

    ERIC Educational Resources Information Center

    Anderson, Richard C.; Davison, Alice

    The problems arising from treating word and sentence complexity as the direct causes of difficulty in comprehension are surveyed in this paper from the perspective of readability formulas. The basic choices and assumptions made in the development and use of readability formulas are discussed in relation to the larger question of text…

  14. An Empirical Model-based MOE for Friction Reduction by Slot-Ejected Polymer Solutions in an Aqueous Environment

    DTIC Science & Technology

    2007-12-21

Date: 21 December 2007. Prepared by: Dr. John G. Pierce. Under Contract No.: N00014-06-C-0535. Submitted to: Office of Naval Research. Distribution/Availability Statement: Approved for public release; distribution

  15. An automatic electroencephalography blinking artefact detection and removal method based on template matching and ensemble empirical mode decomposition.

    PubMed

    Bizopoulos, Paschalis A; Al-Ani, Tarik; Tsalikakis, Dimitrios G; Tzallas, Alexandros T; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2013-01-01

    Electrooculographic (EOG) artefacts are one of the most common causes of Electroencephalogram (EEG) distortion. In this paper, we propose a method for EOG Blinking Artefacts (BAs) detection and removal from EEG. Normalized Correlation Coefficient (NCC), based on a predetermined BA template library was used for detecting the BA. Ensemble Empirical Mode Decomposition (EEMD) was applied to the contaminated region and a statistical algorithm determined which Intrinsic Mode Functions (IMFs) correspond to the BA. The proposed method was applied in simulated EEG signals, which were contaminated with artificially created EOG BAs, increasing the Signal-to-Error Ratio (SER) of the EEG Contaminated Region (CR) by 35 dB on average.
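
    As an illustration of the detection stage, the sliding normalized-correlation match against a blink template can be sketched as follows (a minimal reconstruction, not the authors' code: the template shape, threshold, and skip length are assumptions, and the EEMD removal stage is omitted):

```python
import numpy as np

def ncc(window, template):
    """Normalized Correlation Coefficient between a signal window and a template."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

def detect_blinks(eeg, template, threshold=0.8):
    """Slide the template over the signal; report start indices where the NCC
    exceeds the threshold, skipping past each detected artefact region."""
    n, m = len(eeg), len(template)
    hits, i = [], 0
    while i <= n - m:
        if ncc(eeg[i:i + m], template) >= threshold:
            hits.append(i)
            i += m
        else:
            i += 1
    return hits

# toy demo: one blink-like bump embedded in low-amplitude noise
rng = np.random.default_rng(0)
template = np.hanning(50)                 # stylized blink shape (an assumption)
eeg = 0.05 * rng.standard_normal(500)
eeg[200:250] += template                  # inject one artefact at sample 200
print(detect_blinks(eeg, template))
```

    In the paper, each region flagged this way would then be passed to EEMD so that the artefact-related IMFs can be identified and subtracted.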

  16. Bearing fault detection based on hybrid ensemble detector and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Georgoulas, George; Loutas, Theodore; Stylios, Chrysostomos D.; Kostopoulos, Vassilis

    2013-12-01

    Aiming at more efficient fault diagnosis, this research work presents an integrated anomaly detection approach for seeded bearing faults. Vibration signals from normal bearings and bearings with three different fault locations, as well as different fault sizes and loading conditions are examined. The Empirical Mode Decomposition and the Hilbert Huang transform are employed for the extraction of a compact feature set. Then, a hybrid ensemble detector is trained using data coming only from the normal bearings and it is successfully applied for the detection of any deviation from the normal condition. The results prove the potential use of the proposed scheme as a first stage of an alarm signalling system for the detection of bearing faults irrespective of their loading condition.
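
    The train-on-normal-only idea can be sketched with stand-in statistical features and a simple per-feature Gaussian novelty detector (illustrative assumptions throughout: the paper extracts its features via EMD and the Hilbert-Huang transform and uses a hybrid ensemble detector, neither of which is reproduced here):

```python
import numpy as np

def features(sig):
    """Crude stand-in for the paper's EMD/Hilbert-Huang features: RMS and kurtosis."""
    rms = np.sqrt(np.mean(sig ** 2))
    z = (sig - sig.mean()) / sig.std()
    return np.array([rms, np.mean(z ** 4)])

class NoveltyDetector:
    """Per-feature Gaussian fitted on normal data only; large z-scores flag anomalies."""
    def fit(self, X):
        self.mu = X.mean(axis=0)
        self.sigma = X.std(axis=0) + 1e-12
        return self
    def is_anomaly(self, x, k=4.0):
        return bool(np.any(np.abs((x - self.mu) / self.sigma) > k))

rng = np.random.default_rng(1)
normal = np.stack([features(rng.standard_normal(2048)) for _ in range(50)])
det = NoveltyDetector().fit(normal)           # trained on normal bearings only

healthy = features(rng.standard_normal(2048))
t = np.arange(2048)
faulty = features(rng.standard_normal(2048) + 3.0 * np.sin(2 * np.pi * t / 64))
print(det.is_anomaly(healthy), det.is_anomaly(faulty))
```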

  17. Inferring causal molecular networks: empirical assessment through a community-based effort.

    PubMed

    Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-04-01

    It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense.

  18. Stability evaluation of short-circuiting gas metal arc welding based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Wang, Kehong; Zhou, Zhilan; Zhou, Xiaoxiao; Fang, Jimi

    2017-03-01

    The arc of gas metal arc welding (GMAW) contains abundant information about its stability and droplet transition, which can be effectively characterized by extracting the arc electrical signals. In this study, ensemble empirical mode decomposition (EEMD) was used to evaluate the stability of electrical current signals. The welding electrical signals were first decomposed by EEMD, and then transformed to a Hilbert–Huang spectrum and a marginal spectrum. The marginal spectrum is an approximate distribution of amplitude with frequency of signals, and can be described by a marginal index. Analysis of various welding process parameters showed that the marginal index of current signals increased when the welding process was more stable, and vice versa. Thus EEMD combined with the marginal index can effectively uncover the stability and droplet transition of GMAW.
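
    The marginal-spectrum step can be sketched for already-extracted IMFs (EEMD itself is omitted; the binning scheme and the FFT-based analytic signal are implementation choices, and the paper's marginal index is not reproduced):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (the classic frequency-domain Hilbert construction)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def marginal_spectrum(imfs, fs, nbins=64):
    """Accumulate instantaneous amplitude into instantaneous-frequency bins:
    a discrete Hilbert marginal spectrum of the given IMFs."""
    edges = np.linspace(0.0, fs / 2.0, nbins + 1)
    spec = np.zeros(nbins)
    for imf in imfs:
        z = analytic_signal(imf)
        amp = np.abs(z)[:-1]
        freq = np.diff(np.unwrap(np.angle(z))) * fs / (2.0 * np.pi)
        idx = np.clip(np.digitize(np.abs(freq), edges) - 1, 0, nbins - 1)
        np.add.at(spec, idx, amp)
    return edges, spec

# a single 50 Hz "IMF" sampled at 1 kHz: the marginal spectrum should peak near 50 Hz
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
edges, spec = marginal_spectrum([np.sin(2 * np.pi * 50 * t)], fs)
print(edges[np.argmax(spec)])
```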

  19. Ab initio based empirical potential used to study the mechanical properties of molybdenum

    NASA Astrophysics Data System (ADS)

    Park, Hyoungki; Fellinger, Michael R.; Lenosky, Thomas J.; Tipton, William W.; Trinkle, Dallas R.; Rudin, Sven P.; Woodward, Christopher; Wilkins, John W.; Hennig, Richard G.

    2012-06-01

    Density-functional theory energies, forces, and elastic constants determine the parametrization of an empirical, modified embedded-atom method potential for molybdenum. The accuracy and transferability of the potential are verified by comparison to experimental and density-functional data for point defects, phonons, thermal expansion, surface and stacking fault energies, and ideal shear strength. Searching the energy landscape predicted by the potential using a genetic algorithm verifies that it reproduces not only the correct bcc ground state of molybdenum but also all low-energy metastable phases. The potential is also applicable to the study of plastic deformation and used to compute energies, core structures, and Peierls stresses of screw and edge dislocations.

  20. Transportability of an empirically supported dissonance-based prevention program for eating disorders.

    PubMed

    Perez, Marisol; Becker, Carolyn Black; Ramirez, Ana

    2010-06-01

    This study sought to evaluate the degree to which positive effects remained when a well-studied cognitive dissonance eating disorder prevention program was disseminated through a large national sorority under naturalistic conditions. All participants underwent a 2-session program run by peer facilitators. The sample included 182 undergraduate women from a local chapter of a national sorority at a large public university. Analyses revealed that the program significantly reduced body dissatisfaction, thin-ideal internalization, dietary restraint, and the use of the media as a source of information about beauty. Importantly, effect sizes were maintained at 5-month and 1-year follow-up. These findings demonstrate that empirically supported programs can remain effective when disseminated with careful training in large social systems.

  1. Avian survey and field guide for Osan Air Base, Korea.

    SciTech Connect

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and from the survey results, determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with the EGS Criterion 13-3.h.

  2. Gold price analysis based on ensemble empirical model decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has attracted growing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Mode Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology that combines the two methods and applies it to gold price analysis. It comprises three steps: firstly, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Secondly, the IMFs are further processed, with unimportant ones re-grouped, and a new set of data called Virtual Intrinsic Mode Functions (VIMFs) is reconstructed. Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail, according to their trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also examine in depth how these factors affect the gold price. Regression analysis was also conducted to verify the findings. Results from the empirical studies in the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.

  3. Population-based absolute risk estimation with survey data

    PubMed Central

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
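
    The core absolute-risk computation with a piecewise exponential baseline can be illustrated numerically (a minimal sketch with made-up hazard values; the paper's relative-risk modeling, survey weighting, and variance estimation are not shown):

```python
import numpy as np

def absolute_risk(breaks, h_cause, h_other):
    """Absolute risk of the event of interest over [breaks[0], breaks[-1]],
    with piecewise-constant cause-specific (h_cause) and competing (h_other)
    hazards: the integral of h1(t) * exp(-cumulative all-cause hazard)."""
    risk, cum = 0.0, 0.0
    for dt, h1, h2 in zip(np.diff(breaks), h_cause, h_other):
        tot = h1 + h2
        # closed form of the integral over one interval of constant hazards
        risk += (h1 / tot) * np.exp(-cum) * (1.0 - np.exp(-tot * dt))
        cum += tot * dt
    return risk

# 10-year risk with two 5-year hazard bands (illustrative numbers)
r = absolute_risk([0, 5, 10], h_cause=[0.01, 0.02], h_other=[0.03, 0.04])
print(round(r, 4))  # smaller than 1 - exp(-0.15) because competing deaths intervene
```

    With the competing hazard set to zero the function reduces to the familiar 1 - exp(-cumulative hazard), which is a convenient sanity check.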

  4. Road rage: prevalence pattern and web based survey feasibility.

    PubMed

    Mina, Shaily; Verma, Rohit; Balhara, Yatan Pal Singh; Ul-Hasan, Shiraz

    2014-01-01

    Introduction. Incidents of road rage are on the rise in India, but the literature on the subject is lacking. There is an increasing recognition that web-based interventions can effectively deliver public-health-related messages. Objective. The aim was to quantitatively evaluate risk factors among motor vehicle drivers using an internet-based survey. Methods. Facebook users were evaluated using the Life Orientation Test-Revised (LOT-R) and the Driving Anger Scale (DAS). Results. An adequate response rate of 65.9% and satisfactory reliability with sizable correlation were obtained for both scales. Age was found to be positively correlated with LOT-R scores (r = 0.21; P = 0.02) and negatively correlated with DAS scores (r = -0.19; P = 0.03). Years of education were correlated with LOT-R scores (r = 0.26; P = 0.005) but not DAS scores (r = -0.14; P = 0.11). LOT-R scores did not correlate with DAS scores. Conclusion. There is a high prevalence of anger among drivers in India, particularly among younger males. A short web survey formatted in easy-to-use question language makes conducting such a study online feasible.

  5. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout the entire clinical applications from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performances of graphics processors, improved programming support, and excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for the starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  6. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout the entire clinical applications from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performances of graphics processors, improved programming support, and excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for the starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  7. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  8. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  9. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  10. Rapid Mapping Method Based on Free Blocks of Surveys

    NASA Astrophysics Data System (ADS)

    Yu, Xianwen; Wang, Huiqing; Wang, Jinling

    2016-06-01

    When producing large-scale maps (larger than 1:2000) of cities or towns, obstruction from buildings makes measuring mapping control points difficult and laborious. In order to avoid measuring mapping control points and to shorten the time of fieldwork, this paper proposes a quick mapping method. This method adjusts many free blocks of surveys together and transforms the points from all free blocks into the same coordinate system. The entire surveying area is divided into many free blocks, and connection points are set on the boundaries between them. An independent coordinate system for every free block is established via completely free station technology, and the coordinates of the connection points, detail points, and control points of every free block in the corresponding independent coordinate system are obtained based on poly-directional open traverses. Error equations are established based on the connection points, which are adjusted together to obtain the transformation parameters. All points are transformed from the independent coordinate systems to a transitional coordinate system via the transformation parameters. Several control points are then measured by GPS in a geodetic coordinate system, so that all points can be transformed from the transitional coordinate system to the geodetic coordinate system. This paper presents the implementation process and mathematical formulas of the new method in detail, and gives the formula for estimating the precision of surveys. An example demonstrates that the precision of the new method can meet large-scale mapping needs.
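
    The block-to-block transformation step can be illustrated with a least-squares 2-D similarity (Helmert) transform estimated from connection points (a simplified sketch with invented coordinates; the paper adjusts all blocks jointly rather than one pair of frames at a time):

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2-D similarity (Helmert) transform dst = s*R*src + t,
    parameterized as x' = a*x - b*y + tx, y' = b*x + a*y + ty."""
    A, L = [], []
    for (x, y), (X, Y) in zip(src, dst):
        A.append([x, -y, 1.0, 0.0]); L.append(X)
        A.append([y,  x, 0.0, 1.0]); L.append(Y)
    p, *_ = np.linalg.lstsq(np.array(A), np.array(L), rcond=None)
    return p  # (a, b, tx, ty)

def apply_similarity(p, pts):
    a, b, tx, ty = p
    pts = np.asarray(pts, float)
    return np.column_stack([a * pts[:, 0] - b * pts[:, 1] + tx,
                            b * pts[:, 0] + a * pts[:, 1] + ty])

# connection points known in a free-block frame (src) and in the target frame (dst)
src = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
theta, s = np.deg2rad(30.0), 1.0002
true_p = (s * np.cos(theta), s * np.sin(theta), 5000.0, 2000.0)
dst = apply_similarity(true_p, src)

p = fit_similarity(src, dst)
print(np.allclose(p, true_p))
```

    With more than two connection points the system is overdetermined, which is what allows the residuals to feed a precision estimate.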

  11. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549

  12. A survey of artificial immune system based intrusion detection.

    PubMed

    Yang, Hua; Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted.
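
    As a concrete instance of the generation-algorithm aspect of the framework, the classic negative-selection algorithm can be sketched on a toy 2-D real-valued encoding (the encoding, radii, and data are invented for illustration; the survey itself covers many AIS variants):

```python
import numpy as np

def train_detectors(self_set, n_detectors, r, rng):
    """Negative selection: random candidates become detectors only if they
    match no 'self' sample, i.e. lie farther than r from all of them."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = rng.random(2)
        if np.min(np.linalg.norm(self_set - cand, axis=1)) > r:
            detectors.append(cand)
    return np.array(detectors)

def is_intrusion(x, detectors, r):
    """Flag an activity vector as abnormal if any detector matches it."""
    return bool(np.min(np.linalg.norm(detectors - x, axis=1)) <= r)

rng = np.random.default_rng(4)
self_set = rng.random((200, 2)) * 0.4            # 'self': normal activity vectors
detectors = train_detectors(self_set, 300, 0.15, rng)
print(is_intrusion(np.array([0.9, 0.9]), detectors, 0.15),
      is_intrusion(np.array([0.1, 0.1]), detectors, 0.15))
```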

  13. Empirical Methods for Predicting Eutrophication in Impoundments. Report 1. Phase I. Data Base Development.

    DTIC Science & Technology

    1981-05-01

    Methods are presented for estimating volume and area variations with elevation, required for volume-averaging of water quality data and for calculating material loadings, based on data in the EPA National Eutrophication Survey Compendium. The morphometric profiles have been tested by comparing reported volumes at any elevation with the integral of reported areas with respect to depth.

  14. RN's experiences of sex-based and sexual harassment--an empirical study.

    PubMed

    Madison, J

    1997-01-01

    A survey of 317 registered nurses enrolled in tertiary post-registration courses found that two thirds of the 197 respondents had encountered sexual harassment in the workplace. A quarter of these nurses identified medical officers, and 22.1% identified co-workers, as their harassers. This paper identifies the harassing behaviours the respondents experienced, their responses to the behaviour, and the effects the harassment had on them.

  15. Accounting protesting and warm glow bidding in Contingent Valuation surveys considering the management of environmental goods--an empirical case study assessing the value of protecting a Natura 2000 wetland area in Greece.

    PubMed

    Grammatikopoulou, Ioanna; Olsen, Søren Bøye

    2013-11-30

    Based on a Contingent Valuation survey aiming to reveal the willingness to pay (WTP) for conservation of a wetland area in Greece, we show how protest and warm glow motives can be taken into account when modeling WTP. In a sample of more than 300 respondents, we find that 54% of the positive bids are rooted to some extent in warm glow reasoning while 29% of the zero bids can be classified as expressions of protest rather than preferences. In previous studies, warm glow bidders are only rarely identified while protesters are typically identified and excluded from further analysis. We test for selection bias associated with simple removal of both protesters and warm glow bidders in our data. Our findings show that removal of warm glow bidders does not significantly distort WTP whereas we find strong evidence of selection bias associated with removal of protesters. We show how to correct for such selection bias by using a sample selection model. In our empirical sample, using the typical approach of removing protesters from the analysis, the value of protecting the wetland is significantly underestimated by as much as 46% unless correcting for selection bias.

  16. Vision-based traffic surveys in urban environments

    NASA Astrophysics Data System (ADS)

    Chen, Zezhi; Ellis, Tim; Velastin, Sergio A.

    2016-09-01

    This paper presents a state-of-the-art vision-based vehicle detection and type classification system for performing traffic surveys from a roadside closed-circuit television camera. Vehicles are detected using background subtraction based on a Gaussian mixture model that can cope with vehicles that become stationary over a significant period of time. Vehicle silhouettes are described using a combination of shape and appearance features based on an intensity-based pyramid histogram of orientation gradients (HOG). Classification is performed using a support vector machine trained on a small set of hand-labeled silhouette exemplars. These exemplars are identified using a model-based preclassifier that utilizes calibrated images mapped by Google Earth to provide accurately surveyed scene geometry matched to visible image landmarks. Kalman filters track the vehicles to enable classification by majority voting over several consecutive frames. The system counts vehicles and separates them into four categories: car, van, bus, and motorcycle (including bicycles). Experiments with real-world data have been undertaken to evaluate system performance; vehicle detection rates of 96.45% and classification accuracy of 95.70% have been achieved on this data.
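
    The descriptor idea can be illustrated with a bare-bones global histogram of gradient orientations (real HOG adds cells, blocks, and normalization, and the paper classifies with an SVM; the toy stripe images below are assumptions):

```python
import numpy as np

def orientation_histogram(img, nbins=9):
    """A minimal HOG-style descriptor: one global histogram of gradient
    orientations (0-180 deg) weighted by gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    hist, _ = np.histogram(ang, bins=nbins, range=(0.0, 180.0), weights=mag)
    return hist / (np.linalg.norm(hist) + 1e-12)

# two synthetic "silhouettes": horizontal stripes vs vertical stripes
h = np.zeros((32, 32)); h[::4, :] = 1.0   # horizontal edges -> vertical gradients
v = np.zeros((32, 32)); v[:, ::4] = 1.0   # vertical edges -> horizontal gradients
print(np.argmax(orientation_histogram(h)), np.argmax(orientation_histogram(v)))
```

    Feeding such descriptors, computed for labeled silhouettes, to a linear SVM would complete the classification stage.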

  17. Development of the knowledge-based and empirical combined scoring algorithm (KECSA) to score protein-ligand interactions.

    PubMed

    Zheng, Zheng; Merz, Kenneth M

    2013-05-24

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the Protein Databank (PDB). Forty-nine (49) types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training set and enthalpy/entropy of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets, to compare the reproducibility of KECSA, with respect to two empirical score functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the root mean square error (RMSE). LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed that would allow KECSA to be more sensitive to subtle changes in ligand structure.
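
    The LJ form at the heart of the reference-state idea is easy to state; the sketch below evaluates a 12-6 potential whose well depth and minimum position would, in KECSA, come from PDB-derived statistics (the numbers here are invented for illustration):

```python
import numpy as np

def lj(r, eps, rmin):
    """12-6 Lennard-Jones energy with well depth eps at separation rmin."""
    q = (rmin / r) ** 6
    return eps * (q * q - 2.0 * q)

# one hypothetical atom-pair type: well depth 0.8 (arbitrary units) at 3.5 angstroms
r = np.linspace(2.5, 8.0, 200)
e = lj(r, eps=0.8, rmin=3.5)
print(round(float(r[np.argmin(e)]), 2), round(float(e.min()), 3))
```

    The minimum sits at rmin with energy -eps, which is what makes the pair (eps, rmin) a natural target for fitting against pairwise-distance statistics.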

  18. A Compound fault diagnosis for rolling bearings method based on blind source separation and ensemble empirical mode decomposition.

    PubMed

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong interfering noise, which makes it difficult to separate weak fault signals through conventional methods such as FFT-based envelope detection, wavelet transform, or empirical mode decomposition applied individually. In order to improve the diagnosis of compound faults in rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by EEMD to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix of ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in compound fault separation, which works not only for the outer race defect but also for the roller defect and the unbalance fault of the experimental system.
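
    The cross-correlation selection criterion can be sketched as follows (an illustrative reconstruction: the threshold and the stand-in "IMFs" are assumptions, and EEMD/ICA themselves are not reproduced):

```python
import numpy as np

def select_imfs(signal, imfs, threshold=0.3):
    """Keep the indices of IMFs whose normalized cross-correlation with the
    original signal exceeds the threshold; these become the ICA input matrix."""
    s = (signal - signal.mean()) / signal.std()
    keep = []
    for k, imf in enumerate(imfs):
        z = (imf - imf.mean()) / (imf.std() + 1e-12)
        if abs(np.mean(s * z)) >= threshold:
            keep.append(k)
    return keep

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 4000)
fault = np.sin(2 * np.pi * 35 * t)          # stand-in for a fault-related IMF
noise = 0.1 * rng.standard_normal(4000)     # stand-in for a noise-only IMF
signal = fault + noise
print(select_imfs(signal, [fault, noise]))
```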

  19. Development of the Knowledge-based & Empirical Combined Scoring Algorithm (KECSA) to Score Protein-Ligand Interactions

    PubMed Central

    Zheng, Zheng

    2013-01-01

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the PDB. Forty-nine types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training set and enthalpy/entropy of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets to compare the reproducibility of KECSA with respect to two empirical score functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the RMSE. LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed which would make KECSA more sensitive to subtle changes in ligand structure. PMID:23560465

  20. [Willingness to accept an Internet-based mobility platform in different age cohorts. Empiric results of the project S-Mobil 100].

    PubMed

    Beil, J; Cihlar, V; Kruse, A

    2015-02-01

    The aim of the project S-Mobil 100 is to develop and implement a prototype of an internet-based, generation-appropriate mobility platform in the model region Siegen-Wittgenstein. In the context of an empirical preliminary study, use of technology, experience with technology, general attitudes towards technology, general technology commitment, and the willingness to accept the mobility platform were investigated in different age cohorts. The investigation was carried out using a written survey based on a standardized questionnaire. The sample of 358 persons aged 40-90 years was divided in four age cohorts (40-54, 55-64, 65-74, and 75 + years). Our results show a high willingness to accept the mobility platform in the overall sample. Age, residence, income, and general technology commitment were significant predictors for the judgment of the platform. Although there were group differences in accepting the mobility platform, the older cohorts are also open-minded towards this new technology.

  1. Relevant modes selection method based on Spearman correlation coefficient for laser signal denoising using empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Duan, Yabo; Song, Chengtian

    2016-12-01

    Empirical mode decomposition (EMD) is a recently proposed nonlinear and nonstationary laser signal denoising method. A noisy signal is broken down using EMD into oscillatory components that are called intrinsic mode functions (IMFs). Thresholding-based denoising and correlation-based partial reconstruction of IMFs are the two main research directions for EMD-based denoising. Similar to other decomposition-based denoising approaches, EMD-based denoising methods require a reliable threshold to determine which IMFs are noise components and which IMFs are noise-free components. In this work, we propose a new approach in which each IMF is first denoised using EMD interval thresholding (EMD-IT), and then a robust thresholding process based on Spearman correlation coefficient is used for relevant modes selection. The proposed method tackles the problem using a thresholding-based denoising approach coupled with partial reconstruction of the relevant IMFs. Other traditional denoising methods, including correlation-based EMD partial reconstruction (EMD-Correlation), discrete Fourier transform and wavelet-based methods, are investigated to provide a comparison with the proposed technique. Simulation and test results demonstrate the superior performance of the proposed method when compared with the other methods.
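The relevant-mode selection step can be illustrated with a small sketch (a generic illustration, not the authors' code: the IMFs would normally come from an EMD routine after interval thresholding, and the fixed correlation threshold here stands in for the paper's robust thresholding process):

```python
import numpy as np
from scipy.stats import spearmanr

def select_relevant_imfs(imfs, signal, threshold=0.5):
    """Rank each IMF by |Spearman rho| against the noisy input and keep
    those above a threshold; return kept indices and their partial sum."""
    rhos = []
    for imf in imfs:
        rho, _ = spearmanr(imf, signal)
        rhos.append(abs(rho))
    keep = [i for i, r in enumerate(rhos) if r >= threshold]
    recon = np.sum([imfs[i] for i in keep], axis=0)
    return keep, recon

# Toy demo: a signal-like mode is kept, an uncorrelated noise mode is not.
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t)
noise_mode = np.random.default_rng(0).standard_normal(500)
keep, recon = select_relevant_imfs([noise_mode, signal], signal)
```

Rank correlation is preferred over Pearson correlation here because it is less sensitive to the heavy-tailed noise residue in the low-order IMFs.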

  2. Prospects for Gaia and other space-based surveys .

    NASA Astrophysics Data System (ADS)

    Bailer-Jones, Coryn A. L.

    Gaia is a fully-approved all-sky astrometric and photometric survey due for launch in 2011. It will measure accurate parallaxes and proper motions for everything brighter than G=20 (ca. 10^9 stars). Its primary objective is to study the composition, origin and evolution of our Galaxy from the 3D structure, 3D velocities, abundances and ages of its stars. In some respects it can be considered as a cosmological survey at redshift zero. Several other upcoming space-based surveys, in particular JWST and Herschel, will study star and galaxy formation in the early (high-redshift) universe. In this paper I briefly describe these missions, as well as SIM and Jasmine, and explain why they need to observe from space. I then discuss some Galactic science contributions of Gaia concerning dark matter, the search for substructure, stellar populations and the mass-luminosity relation. The Gaia data are complex and require the development of novel analysis methods; here I summarize the principle of the astrometric processing. In the last two sections I outline how the Gaia data can be exploited in connection with other observational and theoretical work in order to build up a more comprehensive picture of galactic evolution.

  3. A Multi-wavelength Study of Nearby Galaxies Based on Molecular Line Surveys: MIPS Observations

    NASA Astrophysics Data System (ADS)

    Fazio, Giovanni; Wang, Zhong; Bush, Stephanie; Cox, Thomas J.; Keto, Eric; Pahre, Michael; Rosolowsky, Erik; Smith, Howard

    2008-03-01

    Dense molecular gas, warm dust, and hot ionized gas are different components of the multi-step transformation of cold gas into stars and star clusters. While empirical laws on star formation in galaxies have been established based on global measurements of these components, substantial galaxy-to-galaxy variations still exist and remain unexplained. To understand the mechanisms that induce and regulate star formation and thus galaxy evolution, we need to study processes on the local scales of typical star forming regions and giant molecular clouds. In a set of pilot studies, we analyzed the Spitzer and Galex data of nearby giant spirals M31, M33 and M99, and compared with the new interferometric CO maps of matching angular resolution. We found evidence that variations in local condition, environmental effects, and viewing geometry may explain much of the large scatter in the empirical relationships. Based on the success of this initial investigation, we have collected high-resolution CO images of 63 late-type galaxies from several large surveys, and we are working on obtaining a complete set of Spitzer and Galex data for these galaxies. A companion Spitzer archival research program will re-examine the existing observations along with CO, HI, UV and optical data, focusing on correlations in spatially resolved, individual star-forming regions. Here we propose MIPS imaging of the 11 galaxies in our CO sample that have not already been observed by Spitzer. A GO proposal will request IRAC time for these galaxies, which are a significant addition to our study because they substantially increase the fraction of gas-rich late types in the full sample. Insight from this program will be applicable not only to nearby systems, but also to high-redshift galaxies for which only integrated quantities are measurable.

  4. Empirical application of empathy enhancing program based on movement concept for married couples in conflict

    PubMed Central

    Kim, Soo-Yeon; Kang, Hye-Won; Chung, Yong-Chul; Park, Seungha

    2013-01-01

    In the field of marital therapy, it is known that couple movement programs help married couples faced with conflict situations to rebuild the relationship and maintain family homeostasis. The purpose of this study was to configure and apply a kinesthetic empathy program and to assess its effectiveness for married couples in conflict. To achieve the research aims, a qualitative research method was conducted with three couples (six people) who were participating in an expressive movement program for this study. The study used the focus group interview method for collecting data, with interviews mixing semi-structured and unstructured questionnaires. The results were as follows. First, through the kinesthetic empathy enhancing program, one could develop self-awareness and emotional attunement. Second, the results showed the relationship between intention and empathy: “knowing the spouse’s hidden intention” is a significant factor in understanding others. Third, the kinesthetic empathy program could complement general marriage counseling programs. The results of this study provide empirical evidence that a movement program functions as an empathy enhancer through the process of perceiving, feeling, thinking, and interacting with others. PMID:24278896

  5. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches

    PubMed Central

    Ma, Matthew Huei-Ming

    2016-01-01

    Good quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI), and detrended fluctuation analysis (DFA) are collated, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR patterns of 951 asystole patients were analyzed for quality of CPR delivered. There was no significant difference observed in the CPR-related IMF peak-to-peak interval analysis for patients younger or older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, there is a difference for the CI (p < 0.05). The results show that the patient group younger than 60 years has a higher survival rate, with high complexity of the CPR-IMF amplitude differences. PMID:27529068

  6. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises are composed of multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  7. Spectral analysis of Hall-effect thruster plasma oscillations based on the empirical mode decomposition

    SciTech Connect

    Kurzyna, J.; Mazouffre, S.; Lazurenko, A.; Albarede, L.; Bonhomme, G.; Makowski, K.; Dudeck, M.; Peradzynski, Z.

    2005-12-15

    Hall-effect thruster plasma oscillations recorded by means of probes located at the channel exit are analyzed using the empirical mode decomposition (EMD) method. This self-adaptive technique makes it possible to decompose a nonstationary signal into a set of intrinsic modes, and acts as a very efficient filter that separates the contributions of different underlying physical mechanisms. Applying the Hilbert transform to the whole set of modes allows peculiar events to be identified and assigned a range of instantaneous frequency and power. In addition to 25 kHz breathing-type oscillations, which are unambiguously identified, the EMD approach confirms the existence of oscillations with instantaneous frequencies in the range of 100-500 kHz, typical for ion transit-time oscillations. Modeling of high-frequency modes (ν ≈ 10 MHz) resulting from EMD of measured wave forms supports the idea that high-frequency plasma oscillations originate from electron-density perturbations propagating azimuthally with the electron drift velocity.
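The Hilbert step described above, which attaches an instantaneous frequency to each intrinsic mode, can be sketched as follows (a generic illustration of the Hilbert-Huang idea, not the authors' analysis code; the sampling rate and test tone are made up):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(mode, fs):
    """Instantaneous frequency (Hz) of one intrinsic mode via the analytic
    signal: f[n] = (fs / 2*pi) * delta(unwrapped phase)."""
    phase = np.unwrap(np.angle(hilbert(mode)))
    return np.diff(phase) * fs / (2.0 * np.pi)

# Toy check: a 25 kHz tone sampled at 1 MHz should sit near 25 kHz,
# mimicking the breathing-mode band reported in the abstract.
fs = 1.0e6
t = np.arange(0, 2000) / fs
mode = np.sin(2 * np.pi * 25e3 * t)
f_inst = instantaneous_frequency(mode, fs)
```

On real probe data this is applied per IMF, so that each physical mechanism (breathing mode, transit-time oscillations, azimuthal drift waves) gets its own time-resolved frequency track.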

  8. Inferring causal molecular networks: empirical assessment through a community-based effort

    PubMed Central

    Hill, Steven M.; Heiser, Laura M.; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K.; Carlin, Daniel E.; Zhang, Yang; Sokolov, Artem; Paull, Evan O.; Wong, Chris K.; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V.; Favorov, Alexander V.; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W.; Long, Byron L.; Noren, David P.; Bisberg, Alexander J.; Mills, Gordon B.; Gray, Joe W.; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A.; Fertig, Elana J.; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M.; Spellman, Paul T.; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-01-01

    Inferring molecular networks is a central challenge in computational biology. However, it has remained unclear whether causal, rather than merely correlational, relationships can be effectively inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge that focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results constitute the most comprehensive assessment of causal network inference in a mammalian setting carried out to date and suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess the causal validity of inferred molecular networks. PMID:26901648

  9. Agent-Based Distributed Data Mining: A Survey

    NASA Astrophysics Data System (ADS)

    Moemeng, Chayapol; Gorodetsky, Vladimir; Zuo, Ziye; Yang, Yong; Zhang, Chengqi

    Distributed data mining originated from the need to mine over decentralised data sources. Data mining techniques operating in such a complex environment must cope with considerable dynamics, since changes in the system can affect its overall performance. Agent computing, whose aim is to deal with complex systems, has revealed opportunities to improve distributed data mining systems in a number of ways. This paper surveys the integration of multi-agent systems and distributed data mining, also known as agent-based distributed data mining, in terms of significance, system overview, existing systems, and research trends.

  10. Modified detrended fluctuation analysis based on empirical mode decomposition for the characterization of anti-persistent processes

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Gu, Gao-Feng; Zhou, Wei-Xing

    2011-11-01

    Detrended fluctuation analysis (DFA) is a simple but very efficient method for investigating the power-law long-term correlations of non-stationary time series, in which a detrending step is necessary to obtain the local fluctuations at different timescales. We propose to determine the local trends through empirical mode decomposition (EMD) and perform the detrending operation by removing the EMD-based local trends, which gives an EMD-based DFA method. Similarly, we also propose a modified multifractal DFA algorithm, called an EMD-based MFDFA. The performance of the EMD-based DFA and MFDFA methods is assessed with extensive numerical experiments based on fractional Brownian motion and multiplicative cascading processes. We find that the EMD-based DFA method performs better than the classic DFA method in the determination of the Hurst index when the time series is strongly anticorrelated, and the EMD-based MFDFA method outperforms the traditional MFDFA method when the moment order q of the detrended fluctuations is positive. We apply the EMD-based MFDFA to the 1-min data of the Shanghai Stock Exchange Composite Index, and the presence of multifractality is confirmed. We also analyze the daily Austrian electricity prices and confirm their anti-persistence.
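The detrending step that DFA hinges on, and that the EMD variant replaces with EMD-derived local trends, can be illustrated with a compact sketch of classic polynomial DFA (an illustration of the baseline method only, not the authors' EMD-based code; the scale choices are arbitrary):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Classic DFA: integrate to a profile, detrend each window with a
    local polynomial, return the RMS fluctuation F(s) per scale s."""
    y = np.cumsum(x - np.mean(x))                 # profile of the series
    fluct = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        residuals = []
        for seg in segments:
            coef = np.polyfit(t, seg, order)      # local polynomial trend
            residuals.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(residuals)))
    return np.array(fluct)

def hurst(x, scales):
    """Hurst exponent as the slope of log F(s) vs. log s."""
    slope, _ = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)
    return slope

# Uncorrelated noise should yield a Hurst exponent near 0.5.
rng = np.random.default_rng(0)
h = hurst(rng.standard_normal(10000), scales=[16, 32, 64, 128, 256])
```

Anti-persistent series give slopes below 0.5, which is exactly the regime where the paper finds EMD-based local trends improve on polynomial detrending.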

  11. Half of obsessive-compulsive disorder cases misdiagnosed: vignette-based survey of primary care physicians.

    PubMed

    Glazier, Kimberly; Swing, Matt; McGinn, Lata K

    2015-06-01

    Medical settings are the primary mode of care for mental health problems; physicians' abilities with regard to psychiatric diagnosis and treatment recommendations are therefore essential. While misdiagnosis can occur across all psychiatric conditions, the heterogeneous nature of obsessive-compulsive disorder (OCD) may place this condition at elevated risk for misidentification. The study's aim was to assess primary care physicians' ability to identify OCD. The study was cross-sectional in design. An online, vignette-based survey was emailed to 1,172 physicians from 5 major medical hospitals in the Greater New York Area. The email included a link to the survey, which consisted of 1 of 8 randomized OCD vignettes; each vignette focused on one of the following common manifestations of OCD: obsessions regarding aggression, contamination, fear of saying things, homosexuality, pedophilia, religion, somatic concerns, or symmetry. Participants provided diagnostic impressions and treatment recommendations for the individual described in the vignette. Data collection took place from December 10, 2012, through January 18, 2013. Two hundred eight physicians completed the survey. The overall misidentification rate was 50.5%. Vignette type was the strongest predictor of a correct OCD response (Wald χ²(7) = 40.58; P < .0001). Misidentification rates by vignette were homosexuality (84.6%), aggression (80.0%), saying certain things (73.9%), pedophilia (70.8%), somatic concerns (40.0%), religion (37.5%), contamination (32.3%), and symmetry (3.70%). Participants who misidentified the OCD vignette were less likely to recommend a first-line empirically supported treatment (cognitive-behavioral therapy [CBT] = 46.7%, selective serotonin reuptake inhibitor [SSRI] = 8.6%) compared to participants who correctly identified the OCD vignette (CBT = 66.0%, SSRI = 35.0%). Antipsychotic recommendation rates were elevated among incorrect versus correct responders (12.4% vs 1.9%). Elevated OCD

  12. Empirical evidence for identical band gaps in substituted C₆₀ and C₇₀ based fullerenes

    SciTech Connect

    Mattias Andersson, L.; Tanaka, Hideyuki

    2014-01-27

    Optical absorptance data, and a strong correlation between solar cell open circuit voltages and the ionization potentials of a wide range of differently substituted fullerene acceptors, are presented as empirical evidence for identical, or at least very similar, band gaps in all substituted C₆₀ and C₇₀ based fullerenes. Both the number and kind of substituents in this study are sufficiently varied to imply generality. While the band gaps of the fullerenes remain the same for all the different substitutions, their ionization potentials vary greatly in a span of more than 0.4 eV. The merits and drawbacks of using these results together with photoelectron based techniques to determine relative fullerene energy levels for, e.g., organic solar cell applications compared to more direct electrochemical methods are also discussed.

  13. Selection bias of Internet panel surveys: a comparison with a paper-based survey and national governmental statistics in Japan.

    PubMed

    Tsuboi, Satoshi; Yoshida, Honami; Ae, Ryusuke; Kojo, Takao; Nakamura, Yosikazu; Kitamura, Kunio

    2015-03-01

    To investigate the selection bias of an Internet panel survey organized by a commercial company. A descriptive study was conducted. The authors compared the characteristics of the Internet panel survey with a national paper-based survey and with national governmental statistics in Japan. The participants in the Internet panel survey were composed of more women, were older, and resided in large cities. Regardless of age and sex, the prevalence of highly educated people in the Internet panel survey was higher than in the paper-based survey and the national statistics. In men, the prevalence of heavy drinkers among the 30- to 49-year-old population and of habitual smokers among the 20- to 49-year-old population in the Internet panel survey was lower than what was found in the national statistics. The estimated characteristics of commercial Internet panel surveys were quite different from the national statistical data. In a commercial Internet panel survey, selection bias should not be underestimated. © 2012 APJPH.

  14. An all-atom structure-based potential for proteins: bridging minimal models with all-atom empirical forcefields.

    PubMed

    Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N

    2009-05-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Go) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function.

  15. An All-atom Structure-Based Potential for Proteins: Bridging Minimal Models with All-atom Empirical Forcefields

    PubMed Central

    Whitford, Paul C.; Noel, Jeffrey K.; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y.; Onuchic, José N.

    2012-01-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Gō) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include 1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature 2) folding mechanisms are robust to variations of the energetic parameters 3) protein folding free energy barriers can be manipulated through parametric modifications 4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model 5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Since this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function. PMID:18837035

  16. Developing an Internet-based Survey to Collect Program Cost Data

    ERIC Educational Resources Information Center

    Caffray, Christine M.; Chatterji, Pinka

    2009-01-01

    This manuscript describes the development and testing of an Internet-based cost survey that was designed by the authors for the National Assembly on School-Based Health Care (NASBHC) to capture the costs of school-based health programs. The intent of the survey was twofold. First, the survey was designed to collect comprehensive data on costs in a…

  17. Optical dosimetry probes to validate Monte Carlo and empirical-method-based NIR dose planning in the brain.

    PubMed

    Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M

    2016-12-01

    Three-dimensional photon dosimetry in tissue is critical in designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC simulations and the dosimetry probe measurements averaged 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis, for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² > 0.99), with average errors of 10.1%, 45.2%, and 22.1% relative to probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s, versus 8 h for the GPU-based Monte Carlo, for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.
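The abstract does not specify the form of the empirical photon-transport model. As an illustration of the kind of fast analytic surrogate often used in place of full Monte Carlo, here is the standard diffusion-approximation fluence for a continuous-wave point source in an infinite homogeneous medium (the optical coefficients below are assumed placeholder values, not the paper's):

```python
import numpy as np

def fluence_diffusion(r_cm, mu_a, mu_s_prime, power=1.0):
    """CW point-source fluence in the diffusion approximation:
    phi(r) = P * exp(-mu_eff * r) / (4 * pi * D * r),
    with D = 1 / (3 * (mu_a + mu_s')) and mu_eff = sqrt(3 * mu_a * (mu_a + mu_s'))."""
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))               # diffusion coefficient (cm)
    mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))  # effective attenuation (1/cm)
    r = np.asarray(r_cm, dtype=float)
    return power * np.exp(-mu_eff * r) / (4.0 * np.pi * D * r)

# Assumed, illustrative NIR optical properties (not from the paper):
# absorption mu_a = 0.1 /cm, reduced scattering mu_s' = 10 /cm.
phi = fluence_diffusion(np.array([0.5, 1.0, 2.0]), mu_a=0.1, mu_s_prime=10.0)
```

A closed-form expression like this evaluates in microseconds per point, which is the kind of speedup (seconds versus hours) the abstract reports for its empirical model relative to the GPU Monte Carlo.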

  18. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    PubMed

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and the data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in the 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved as compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for successful imaging of defects in a B-scan.

  19. Combining Empirical Relationships with Data Based Mechanistic Modeling to Inform Solute Tracer Investigations across Stream Orders

    NASA Astrophysics Data System (ADS)

    Herrington, C.; Gonzalez-Pinzon, R.; Covino, T. P.; Mortensen, J.

    2015-12-01

    Solute transport studies in streams and rivers often begin with the introduction of conservative and reactive tracers into the water column. Information on the transport of these substances is then captured within tracer breakthrough curves (BTCs) and used to estimate, for instance, travel times and dissolved nutrient and carbon dynamics. Traditionally, these investigations have been limited to systems with small discharges (< 200 L/s) and with small reach lengths (< 500 m), partly due to the need for a priori information on the reach's hydraulic characteristics (e.g., channel geometry, resistance and dispersion coefficients) to predict arrival times, times to peak concentrations of the solute, and mean travel times. Current techniques to acquire these channel characteristics through preliminary tracer injections become cost prohibitive at higher stream orders, and the use of semi-continuous water quality sensors for collecting real-time information may be affected by erroneous readings that are masked by high turbidity (e.g., nitrate signals with SUNA instruments or fluorescence measures) and/or high total dissolved solids (e.g., making the use of salt tracers such as NaCl prohibitively expensive) in larger systems. Additionally, a successful time-of-travel study is valuable for only a single discharge and river stage. We have developed a method to predict tracer BTCs and inform sampling frequencies at small and large stream orders using empirical relationships developed from multiple tracer injections spanning several orders of magnitude in discharge and reach length. This method was successfully tested in 1st to 8th order systems along the Middle Rio Grande River Basin in New Mexico, USA.

  20. Changing Healthcare Providers’ Behavior during Pediatric Inductions with an Empirically-based Intervention

    PubMed Central

    Martin, Sarah R.; Chorney, Jill MacLaren; Tan, Edwin T.; Fortier, Michelle A.; Blount, Ronald L.; Wald, Samuel H.; Shapiro, Nina L.; Strom, Suzanne L.; Patel, Swati; Kain, Zeev N.

    2011-01-01

    Background Each year over 4 million children experience significant levels of preoperative anxiety, which has been linked to poor recovery outcomes. Healthcare providers (HCPs) and parents represent key resources for children to help them manage their preoperative anxiety. The present study reports on the development and preliminary feasibility testing of a new intervention designed to change HCP and parent perioperative behaviors that have previously been reported to be associated with children’s coping and stress behaviors before surgery. Methods An empirically derived intervention, Provider-Tailored Intervention for Perioperative Stress, was developed to train HCPs to increase behaviors that promote children’s coping and decrease behaviors that may exacerbate children’s distress. Rates of HCP behaviors were coded and compared between pre-intervention and post-intervention. Additionally, rates of parents’ behaviors were compared between those who interacted with HCPs before training and those who interacted with HCPs after the intervention. Results Effect sizes indicated that HCPs who underwent training demonstrated increases in rates of desired behaviors (range: 0.22 to 1.49) and decreases in rates of undesired behaviors (range: 0.15 to 2.15). Additionally, parents, who were indirectly trained, also demonstrated changes in their rates of desired (range: 0.30 to 0.60) and undesired behaviors (range: 0.16 to 0.61). Conclusions The intervention successfully modified HCP and parent behaviors. It represents a potentially new clinical way to decrease anxiety in children. A multi-site randomized controlled trial, recently funded by the National Institute of Child Health and Human Development, is about to start and will examine the efficacy of this intervention in reducing children’s preoperative anxiety and improving children’s postoperative recovery. PMID:21606826

  1. Towards a critical evaluation of an empirical and volume-based solvation function for ligand docking

    PubMed Central

    Muniz, Heloisa S.

    2017-01-01

Molecular docking is an important tool for the discovery of new biologically active molecules, given that the receptor structure is known. The rapid growth in the number of proteins with known structure is providing an excellent environment for the development of new methods and the improvement of current ones. The evaluation of solvation energies stands out among the challenges in modeling receptor-ligand interactions, especially in the context of molecular docking, where a fast yet accurate evaluation ought to be achieved. Here we evaluated a variation of the desolvation energy model proposed by Stouten (Stouten P.F.W. et al, Molecular Simulation, 1993, 10: 97–120), or SV model. The SV model showed a linear correlation with experimentally determined solvation energies, as available in the FreeSolv database. However, when used in retrospective docking simulations with the DUD, charge-matched DUD and DUD-Enhanced benchmarks, the SV model resulted in poorer enrichments when compared to a pure force field model with no correction for solvation effects. The data provided here are consistent with other empirical solvation models employed in the context of molecular docking and indicate that a good model to account for solvent effects is still a goal to achieve. On the other hand, despite its inability to improve the enrichment of retrospective simulations, the SV solvation model showed an interesting ability to reduce the number of molecules with net charge -2 and -3 e among the top-scored molecules in a prospective test. PMID:28323889
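The first validation step the abstract describes, checking whether model-predicted solvation energies correlate linearly with experimental values such as those in FreeSolv, can be sketched as a plain Pearson-correlation check. The energies below are invented illustrative values, not FreeSolv data, and `pearson_r` is a hypothetical helper, not the authors' code.

```python
# Hypothetical sketch: assess a solvation model by correlating its predicted
# solvation energies with experimentally determined ones (as done against
# FreeSolv in the abstract). The numbers are illustrative only.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative predicted vs. experimental hydration free energies (kcal/mol).
predicted    = [-2.1, -4.8, -0.9, -6.3, -3.5]
experimental = [-2.5, -5.1, -1.2, -6.8, -3.1]
print(round(pearson_r(predicted, experimental), 3))
```

A correlation near 1 on such a benchmark is necessary but, as the abstract notes, not sufficient: the SV model correlated well with FreeSolv yet still hurt docking enrichment.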

  2. Two-sample density-based empirical likelihood tests for incomplete data in application to a pneumonia study.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2011-07-01

In clinical trials examining the incidence of pneumonia, it is common practice to measure infection via both invasive and non-invasive procedures. In the context of a recently completed randomized trial comparing two treatments, the invasive procedure was only utilized in certain scenarios, owing to the added risk involved, and only when the level of the non-invasive procedure surpassed a given threshold. Hence, what was observed was bivariate data with a pattern of missingness in the invasive variable dependent upon the value of the observed non-invasive observation within a given pair. In order to compare two treatments with bivariate observed data exhibiting this pattern of missingness, we developed a semi-parametric methodology utilizing the density-based empirical likelihood approach to provide a non-parametric approximation to Neyman-Pearson-type test statistics. This novel empirical likelihood approach has both parametric and non-parametric components. The non-parametric component utilizes the observations for the non-missing cases, while the parametric component is utilized to tackle the cases where observations are missing with respect to the invasive variable. The method is illustrated through its application to the actual data obtained in the pneumonia study and is shown to be an efficient and practical method. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Empirically supported methods of short-term psychodynamic therapy in depression - towards an evidence-based unified protocol.

    PubMed

    Leichsenring, Falk; Schauenburg, Henning

    2014-12-01

There is evidence that psychotherapy is helpful in depressive disorders, with no significant differences between psychotherapies. For psychodynamic therapy (PDT), various models have proved to be efficacious. Thus, the evidence for PDT is "scattered" between different forms of PDT, which also implies problems for training in psychotherapy and for transferring research to clinical practice. A unified protocol based on empirically supported methods of PDT in depression may contribute to solving these problems. We conducted a systematic search for randomized controlled trials fulfilling the following criteria: (a) individual psychodynamic therapy (PDT) of depressive disorders, (b) treatment manuals or manual-like guidelines, (c) PDT proved to be efficacious compared to control conditions, (d) reliable measures for diagnosis and outcome, and (e) adult patients. Fourteen RCTs fulfilled the inclusion criteria. A systematic review of the applied methods of PDT identified seven treatment components. A high consistency between components was found. The components were conceptualized in the form of seven interrelated treatment modules. A unified psychodynamic protocol for depression may enhance the empirical status of PDT, facilitate both training in psychotherapy and the transfer of research to clinical practice, and may have an impact on the health care system. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. An Empirical Study of Neural Network-Based Audience Response Technology in a Human Anatomy Course for Pharmacy Students.

    PubMed

    Fernández-Alemán, José Luis; López-González, Laura; González-Sequeros, Ofelia; Jayne, Chrisina; López-Jiménez, Juan José; Carrillo-de-Gea, Juan Manuel; Toval, Ambrosio

    2016-04-01

    This paper presents an empirical study of a formative neural network-based assessment approach by using mobile technology to provide pharmacy students with intelligent diagnostic feedback. An unsupervised learning algorithm was integrated with an audience response system called SIDRA in order to generate states that collect some commonality in responses to questions and add diagnostic feedback for guided learning. A total of 89 pharmacy students enrolled on a Human Anatomy course were taught using two different teaching methods. Forty-four students employed intelligent SIDRA (i-SIDRA), whereas 45 students received the same training but without using i-SIDRA. A statistically significant difference was found between the experimental group (i-SIDRA) and the control group (traditional learning methodology), with T (87) = 6.598, p < 0.001. In four MCQs tests, the difference between the number of correct answers in the first attempt and in the last attempt was also studied. A global effect size of 0.644 was achieved in the meta-analysis carried out. The students expressed satisfaction with the content provided by i-SIDRA and the methodology used during the process of learning anatomy (M = 4.59). The new empirical contribution presented in this paper allows instructors to perform post hoc analyses of each particular student's progress to ensure appropriate training.
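The abstract says i-SIDRA uses an unsupervised algorithm to "generate states that collect some commonality in responses" and attach diagnostic feedback to them. The real system uses a neural network; as a hedged toy sketch of the state idea only, one can group students whose MCQ answer vectors match, with all names and responses below invented:

```python
# Hypothetical sketch of the "states" idea behind i-SIDRA: collapse students'
# answer vectors into shared states (groups with a common response pattern),
# to which diagnostic feedback could then be attached. The actual system uses
# an unsupervised neural network; this toy groups exact patterns only.
from collections import defaultdict

def response_states(answers):
    """Map each distinct response pattern to the students who produced it."""
    states = defaultdict(list)
    for student, pattern in answers.items():
        states[tuple(pattern)].append(student)
    return dict(states)

# Illustrative responses to a 4-question MCQ test (chosen option per question).
answers = {
    "s1": ["A", "C", "B", "D"],
    "s2": ["A", "C", "B", "D"],   # same response profile as s1
    "s3": ["B", "C", "B", "A"],
}
states = response_states(answers)
print(len(states))  # → 2 distinct states
```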

  5. Neural Representation. A Survey-Based Analysis of the Notion

    PubMed Central

    Vilarroya, Oscar

    2017-01-01

    The word representation (as in “neural representation”), and many of its related terms, such as to represent, representational and the like, play a central explanatory role in neuroscience literature. For instance, in “place cell” literature, place cells are extensively associated with their role in “the representation of space.” In spite of its extended use, we still lack a clear, universal and widely accepted view on what it means for a nervous system to represent something, on what makes a neural activity a representation, and on what is re-presented. The lack of a theoretical foundation and definition of the notion has not hindered actual research. My aim here is to identify how active scientists use the notion of neural representation, and eventually to list a set of criteria, based on actual use, that can help in distinguishing between genuine or non-genuine neural-representation candidates. In order to attain this objective, I present first the results of a survey of authors within two domains, place-cell and multivariate pattern analysis (MVPA) research. Based on the authors’ replies, and on a review of neuroscientific research, I outline a set of common properties that an account of neural representation seems to require. I then apply these properties to assess the use of the notion in two domains of the survey, place-cell and MVPA studies. I conclude by exploring a shift in the notion of representation suggested by recent literature. PMID:28900406

  7. Neighborhood social capital and adult health: an empirical test of a Bourdieu-based model.

    PubMed

    Carpiano, Richard M

    2007-09-01

    Drawing upon Bourdieu's [1986. The forms of capital. In: Richardson, J.G. (Ed.), Handbook of Theory and Research for the Sociology of Education. Greenwood, New York, pp. 241-258.] social capital theory, I test a conceptual model of neighborhood conditions and social capital - considering relationships between neighborhood social capital forms (social support, social leverage, informal social control, and neighborhood organization participation) and adult health behaviors (smoking, binge drinking) and perceived health, as well as interactions between neighborhood social capital and individuals' access to that social capital. Analyzing Los Angeles Family and Neighborhood Survey data linked with tract level census data, results suggest that specific social capital forms were directly associated with both positive and negative health outcomes. Additionally, residents' neighborhood attachment moderated relationships between various social capital forms and health. Future studies should consider social capital resources and the role of differential access to such resources for promoting or compromising health.

  8. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    PubMed Central

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191
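The two-end synchronized-measurement algorithms this survey reviews share a simple core idea, which can be sketched in hedged form (all phasor values below are invented): with synchronized voltage and current phasors at both line ends and the total series line impedance Z, the fault voltage computed from either end must agree, yielding a closed-form per-unit fault distance m.

```python
# Minimal sketch of the classic two-end synchronized fault-location relation:
#   Va - m*Z*Ia = Vb - (1 - m)*Z*Ib
# solved for the per-unit distance m from terminal A:
#   m = (Va - Vb + Z*Ib) / (Z*(Ia + Ib))
# Illustrative complex phasors only; not an algorithm from the survey itself.

def fault_distance(Va, Ia, Vb, Ib, Z):
    """Per-unit distance to fault from terminal A (two-end method)."""
    m = (Va - Vb + Z * Ib) / (Z * (Ia + Ib))
    return m.real  # imaginary part is ~0 for consistent inputs

# Build consistent synthetic measurements for a fault at m = 0.3.
Z = 2 + 16j                        # total line series impedance (ohms)
Ia, Ib = 1.2 - 0.4j, 0.8 - 0.1j    # currents flowing toward the fault
Vf = 10 + 3j                       # voltage at the fault point
Va = Vf + 0.3 * Z * Ia
Vb = Vf + 0.7 * Z * Ib
print(round(fault_distance(Va, Ia, Vb, Ib, Z), 6))  # → 0.3
```

Synchronization matters because both phasor pairs must share a common time reference; unsynchronized variants must additionally estimate the phase offset between terminals.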

  10. Sensor Systems Based on FPGAs and Their Applications: A Survey

    PubMed Central

    de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah

    2012-01-01

    In this manuscript, we present a survey of designs and implementations of research sensor nodes that rely on FPGAs, either based upon standalone platforms or as a combination of microcontroller and FPGA. Several current challenges in sensor networks are distinguished and linked to the features of modern FPGAs. As it turns out, low-power optimized FPGAs are able to enhance the computation of several types of algorithms in terms of speed and power consumption in comparison to microcontrollers of commercial sensor nodes. We show that architectures based on the combination of microcontrollers and FPGA can play a key role in the future of sensor networks, in fields where processing capabilities such as strong cryptography, self-testing and data compression, among others, are paramount.

  11. Basewide environmental baseline survey, Reese Air Force Base, Texas

    SciTech Connect

    1996-11-26

This Environmental Baseline Survey (EBS) has been prepared to document the environmental condition of real property at Reese Air Force Base (AFB), Texas, resulting from the storage, release, and disposal of hazardous substances and petroleum products and their derivatives over the installation's history. Although primarily a management tool, this EBS is also used by the Air Force to meet its obligations under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), 42 U.S. Code Section 9620(h), as amended by the Community Environmental Response Facilitation Act (CERFA) (Public Law 102-426). Table ES-1 lists all Category 1 uncontaminated property associated with Reese AFB based on information obtained through a records search, interviews, and visual inspections at Reese AFB, and Figures ES-1a and ES-1b depict their locations.

  12. Holding-based network of nations based on listed energy companies: An empirical study on two-mode affiliation network of two sets of actors

    NASA Astrophysics Data System (ADS)

    Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili

    2016-05-01

    Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.
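The derivative-network construction the abstract describes, projecting a two-mode affiliation network (shareholders affiliated with listed companies) onto a one-mode network of nations, can be sketched as follows. The holdings, funds, companies, and nationalities below are invented for illustration; the helper name `nation_network` is hypothetical.

```python
# Hypothetical sketch of projecting a two-mode (shareholder, company)
# affiliation network onto a one-mode nation-to-nation holding network:
# an edge (nation_i, nation_j) counts holdings by nation_i shareholders
# in nation_j companies. All data below is invented.
from collections import Counter

def nation_network(holdings, shareholder_nation, company_nation):
    """Project (shareholder, company) affiliations onto nation-nation edges."""
    edges = Counter()
    for shareholder, company in holdings:
        edges[(shareholder_nation[shareholder], company_nation[company])] += 1
    return edges

holdings = [("fundA", "co1"), ("fundA", "co2"), ("fundB", "co1")]
shareholder_nation = {"fundA": "US", "fundB": "CN"}
company_nation = {"co1": "DE", "co2": "US"}
print(nation_network(holdings, shareholder_nation, company_nation))
```

Edge weights could equally be shareholding percentages rather than counts; the topological measures the study analyzes (degree, strength, etc.) are then computed on this derived weighted network.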

  13. A new multivariate empirical mode decomposition method for improving the performance of SSVEP-based brain-computer interface

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Feng; Atal, Kiran; Xie, Sheng-Quan; Liu, Quan

    2017-08-01

    Objective. Accurate and efficient detection of steady-state visual evoked potentials (SSVEP) in electroencephalogram (EEG) is essential for the related brain-computer interface (BCI) applications. Approach. Although the canonical correlation analysis (CCA) has been applied extensively and successfully to SSVEP recognition, the spontaneous EEG activities and artifacts that often occur during data recording can deteriorate the recognition performance. Therefore, it is meaningful to extract a few frequency sub-bands of interest to avoid or reduce the influence of unrelated brain activity and artifacts. This paper presents an improved method to detect the frequency component associated with SSVEP using multivariate empirical mode decomposition (MEMD) and CCA (MEMD-CCA). EEG signals from nine healthy volunteers were recorded to evaluate the performance of the proposed method for SSVEP recognition. Main results. We compared our method with CCA and temporally local multivariate synchronization index (TMSI). The results suggest that the MEMD-CCA achieved significantly higher accuracy in contrast to standard CCA and TMSI. It gave the improvements of 1.34%, 3.11%, 3.33%, 10.45%, 15.78%, 18.45%, 15.00% and 14.22% on average over CCA at time windows from 0.5 s to 5 s and 0.55%, 1.56%, 7.78%, 14.67%, 13.67%, 7.33% and 7.78% over TMSI from 0.75 s to 5 s. The method outperformed the filter-based decomposition (FB), empirical mode decomposition (EMD) and wavelet decomposition (WT) based CCA for SSVEP recognition. Significance. The results demonstrate the ability of our proposed MEMD-CCA to improve the performance of SSVEP-based BCI.
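The CCA baseline that MEMD-CCA builds on can be sketched with numpy: correlate the multichannel EEG with sine/cosine reference sets at each candidate stimulation frequency and pick the frequency with the largest first canonical correlation. The EEG below is simulated, and none of the MEMD preprocessing from the paper is included; this is only the standard CCA step.

```python
# Minimal numpy sketch of CCA-based SSVEP frequency recognition (the baseline
# the paper's MEMD-CCA improves on). Simulated signals; illustrative only.
import numpy as np

def first_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def reference(freq, t, harmonics=2):
    """Sine/cosine reference set for one candidate stimulation frequency."""
    cols = []
    for h in range(1, harmonics + 1):
        cols += [np.sin(2 * np.pi * h * freq * t),
                 np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(cols)

fs, dur = 250.0, 2.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)
# Simulated 3-channel EEG with a 10 Hz SSVEP buried in noise.
ssvep = np.sin(2 * np.pi * 10 * t)
eeg = np.column_stack([ssvep + 0.5 * rng.standard_normal(t.size)
                       for _ in range(3)])

candidates = [8.0, 10.0, 12.0]
scores = [first_canonical_corr(eeg, reference(f, t)) for f in candidates]
print(candidates[int(np.argmax(scores))])  # → 10.0
```

The paper's contribution is to apply MEMD first, so that CCA operates on frequency sub-bands of interest rather than on raw EEG contaminated by artifacts.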

  14. Data-based empirical model reduction as an approach to data mining

    NASA Astrophysics Data System (ADS)

    Ghil, M.

    2012-12-01

Science is very much about finding order in chaos, patterns in oodles of data, signal in noise, and so on. One can see any scientific description as a model of the data, whether verbal, statistical or dynamical. In this talk, I will provide an approach to such descriptions that relies on constructing nonlinear, stochastically forced models, via empirical model reduction (EMR). EMR constructs a low-order nonlinear system of prognostic equations driven by stochastic forcing; it estimates both the dynamical operator and the properties of the driving noise directly from observations or from a high-order model's simulation. The multi-level EMR structure for modeling the stochastic forcing allows one to capture feedback between high- and low-frequency components of the variability, thus parameterizing the "fast scales," often referred to as the "noise," in terms of the memory of the "slow" scales, referred to as the "signal." EMR models have been shown to capture quite well features of the high-dimensional data sets involved, in the frequency domain as well as in the spatial domain. Illustrative examples will involve capturing correctly patterns in data sets that are either purely observational or generated by high-end models. They will be selected from intraseasonal variability of the mid-latitude atmosphere, seasonal-to-interannual variability of the sea surface temperature field, and air-sea interaction in the Southern Ocean. The work described in this talk is joint with M.D. Chekroun, D. Kondrashov, S. Kravtsov, and A.W. Robertson. Recent results on using a modified and improved form of EMR modeling for predictive purposes will be provided in a separate talk by D. Kondrashov, M. Chekroun and M. Ghil on "Data-Driven Model Reduction and Climate Prediction: Nonlinear Stochastic, Energy-Conserving Models With Memory Effects."
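The core EMR step, estimating a dynamical operator and the driving-noise statistics directly from data, can be caricatured at its simplest (one level, linear, no memory terms) by regressing one-step increments on the state. The operator and forcing amplitude below are invented; full EMR adds nonlinear terms and further regression levels for the residual.

```python
# One-level, linear caricature of empirical model reduction (EMR):
# regress increments on the state, x_{n+1} - x_n ≈ A x_n + noise, to recover
# both the dynamical operator A and the residual ("noise") amplitude.
import numpy as np

rng = np.random.default_rng(1)
A_true = np.array([[-0.10,  0.05],
                   [-0.05, -0.20]])   # invented "true" operator

# Simulate the stochastically forced linear system.
n, x = 20000, np.zeros(2)
X = np.empty((n, 2))
for i in range(n):
    X[i] = x
    x = x + A_true @ x + 0.1 * rng.standard_normal(2)

# Least-squares estimate of A from one-step increments.
dX = X[1:] - X[:-1]
B, *_ = np.linalg.lstsq(X[:-1], dX, rcond=None)
A_hat = B.T
resid = dX - X[:-1] @ A_hat.T
print(np.max(np.abs(A_hat - A_true)) < 0.05)   # operator recovered closely
print(round(float(resid.std()), 2))            # ≈ the 0.1 forcing amplitude
```

In the multi-level EMR the residual `resid` would itself be modeled by a further regression, capturing the memory of the slow variables in the fast "noise".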

  15. Multiscale Detrended Cross-Correlation Analysis of Traffic Time Series Based on Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-04-01

In this paper, we propose multiscale detrended cross-correlation analysis (MSDCCA) to detect the long-range power-law cross-correlation of the considered signals in the presence of nonstationarity. To improve performance and obtain better robustness, we further introduce empirical mode decomposition (EMD) to eliminate noise effects and propose the MSDCCA method combined with EMD, called the MS-EDXA method, and then systematically investigate the multiscale cross-correlation structure of real traffic signals. We apply the MSDCCA and MS-EDXA methods to study the cross-correlations in three situations: velocity and volume on one lane, velocities at the present and the next moment, and velocities on adjacent lanes, and further compare their spectrums respectively. When the difference between the spectrums of MSDCCA and MS-EDXA becomes unobvious, there is a crossover which denotes the turning point of the difference. The crossover results from the competition between the noise effects in the original signals and the intrinsic fluctuation of the traffic signals, and divides the plot of spectrums into two regions. In all three cases, the MS-EDXA method increases the average of the local scaling exponents, decreases their standard deviation, and provides a relatively stable persistent cross-correlated scaling behavior, which makes the analysis more precise and more robust and improves performance after the noise is removed. Applying the MS-EDXA method avoids the inaccurate characterization of the multiscale cross-correlation structure at short scales, including the spectrum minimum, the range of the spectrum fluctuation and the general trend, which are caused by the noise in the original signals. We conclude that traffic velocity and volume are long-range cross-correlated, which accords with their actual evolution, while velocities at the present and the next moment and velocities on adjacent lanes reflect strong cross-correlations both in temporal and
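The detrended cross-correlation machinery underlying MSDCCA (before the EMD-based noise removal the paper adds) can be sketched as the standard DCCA coefficient: integrate both series, remove a local linear trend in windows of size n, and normalize the detrended cross-covariance by the two detrended variances. The helper name `dcca_rho` and the test series are illustrative, not the paper's traffic data.

```python
# Minimal sketch of the detrended cross-correlation coefficient
# rho_DCCA(n) = F2_xy(n) / (F_x(n) * F_y(n)) at one window scale n.
import numpy as np

def dcca_rho(x, y, n):
    """DCCA cross-correlation coefficient at window scale n."""
    X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
    t = np.arange(n)
    f2x = f2y = f2xy = 0.0
    windows = 0
    for start in range(0, len(X) - n + 1, n):
        xs, ys = X[start:start + n], Y[start:start + n]
        # Remove the local linear trend in each window.
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx * rx)
        f2y += np.mean(ry * ry)
        f2xy += np.mean(rx * ry)
        windows += 1
    return (f2xy / windows) / np.sqrt((f2x / windows) * (f2y / windows))

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(2000))   # nonstationary series
y = x + 0.1 * rng.standard_normal(2000)    # strongly coupled partner
print(dcca_rho(x, y, 50) > 0.9)  # strongly cross-correlated → near 1
```

Scanning n over a range of scales yields the multiscale spectrum the paper analyzes; the MS-EDXA variant applies EMD first so noise modes do not distort the short-scale end of that spectrum.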

  16. [Utilization of CAP Survey, Based on Questionnaire Results from Survey Participants].

    PubMed

    Hirano, Akiko; Ohno, Hiroie

    2015-08-01

The survey provided by the College of American Pathologists (CAP) is chosen as one of the proficiency testing programs in Japan, and, recently, the number of participating facilities has increased. CAP provides 754 programs, and more than 1,000 tests were provided in 2014. Materials selected from CAP surveys are translated in Japan as the "CAP global inter-laboratory comparison program" under the instruction of the Japanese Society of Laboratory Medicine (JSLM), and 68 programs and 261 items are provided. The total number of participating facilities was 174. CAP itself and the other services CAP provides are not well known, while recognition of "the CAP survey as the proficiency test" has increased. The question "What is CAP and the CAP survey?" was analyzed on the basis of the questionnaire surveys conducted in 2014, and the advantages of the CAP survey and how to utilize it were considered. A questionnaire survey about the CAP survey was conducted among Japanese participants in 2014. Fifty-three questions were asked about their satisfaction level, intended use, and suggested improvements. Eighty replies were analyzed. As a result, most CAP survey participants are satisfied. They intend to use the CAP survey mainly for their quality control. Furthermore, they can continuously monitor their systems throughout all testing phases, as the survey involves several shipments a year and several specimens per mailing. This helps in laboratory performance improvement. The Evaluation and Participant Summary Report (PSR) also effectively improve the laboratories' performance. CAP-accredited laboratories are required to participate in all survey programs concerning the test menu which they provide. Therefore, they have become accustomed to reviewing the evaluation and performing self-evaluation, with a high usage rate of the Evaluation and PSR of the CAP survey. The questionnaire proved that performing the CAP survey properly enhanced the laboratories' quality control, and this meets the

  17. An Empirical Study of Instructor Adoption of Web-Based Learning Systems

    ERIC Educational Resources Information Center

    Wang, Wei-Tsong; Wang, Chun-Chieh

    2009-01-01

    For years, web-based learning systems have been widely employed in both educational and non-educational institutions. Although web-based learning systems are emerging as a useful tool for facilitating teaching and learning activities, the number of users is not increasing as fast as expected. This study develops an integrated model of instructor…

  18. An Empirical Analysis of the Antecedents of Web-Based Learning Continuance

    ERIC Educational Resources Information Center

    Chiu, Chao-Min; Sun, Szu-Yuan; Sun, Pei-Chen; Ju, Teresa L.

    2007-01-01

    Like any other product, service and Web-based application, the success of Web-based learning depends largely on learners' satisfaction and other factors that will eventually increase learners' intention to continue using it. This paper integrates the concept of subjective task value and fairness theory to construct a model for investigating the…

  19. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence

    ERIC Educational Resources Information Center

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the…

  1. Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.

    ERIC Educational Resources Information Center

    McCarthy, John C.; And Others

    1993-01-01

    Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…

  2. Formula-Based Public School Funding System in Victoria: An Empirical Analysis of Equity

    ERIC Educational Resources Information Center

    Bandaranayake, Bandara

    2013-01-01

This article explores the formula-based school funding system in the state of Victoria, Australia, where state funds are directly allocated to schools based on a range of equity measures. The impact of Victoria's funding system for education in terms of alleviating inequality and disadvantage is contentious, to say the least. It is difficult to…

  3. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    ERIC Educational Resources Information Center

    Henggeler, Scott W.; Sheidow, Ashli J.

    2012-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief…

  4. Constructions, Semantic Compatibility, and Coercion: An Empirical Usage-Based Approach

    ERIC Educational Resources Information Center

    Yoon, Soyeon

    2012-01-01

    This study investigates the nature of semantic compatibility between constructions and lexical items that occur in them in relation with language use, and the related concept, coercion, based on a usage-based approach to language, in which linguistic knowledge (grammar) is grounded in language use. This study shows that semantic compatibility…

  5. Meta-Analysis of Group Learning Activities: Empirically Based Teaching Recommendations

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Foels, Rob

    2012-01-01

    Teaching researchers commonly employ group-based collaborative learning approaches in Teaching of Psychology teaching activities. However, the authors know relatively little about the effectiveness of group-based activities in relation to known psychological processes associated with group dynamics. Therefore, the authors conducted a meta-analytic…

  6. Probabilistic Algorithms, Integration, and Empirical Evaluation for Disambiguating Multiple Selections in Frustum-Based Pointing

    DTIC Science & Technology

    2006-06-01


  8. Resident fatigue in otolaryngology residents: a Web based survey.

    PubMed

    Nida, Andrew M; Googe, Benjamin J; Lewis, Andrea F; May, Warren L

    2016-01-01

Resident fatigue has become a point of emphasis in medical education, and its effects on otolaryngology residents and their patients require further study. The purpose of our study was to evaluate the prevalence and nature of fatigue in otolaryngology residents, evaluate various quality-of-life measures, and investigate associations of increased fatigue with resident safety. Study design: anonymous survey. Setting: Internet based. Participants: United States allopathic otolaryngology residents. Interventions: none. The survey topics included demographics, residency structure, sleep habits and perceived stress. Responses were correlated with a concurrent Epworth Sleepiness Scale questionnaire to evaluate the effects of fatigue on resident training and quality of life. 190 residents responded to the survey, with 178 completing the Epworth Sleepiness Scale questionnaire. Results revealed a mean Epworth Sleepiness Scale score of 9.9±5.1 with a median of 10.0, indicating that a significant number of otolaryngology residents are excessively sleepy. Statistically significant correlations between Epworth Sleepiness Scale score and sex, region, hours of sleep, and work hours were found. Residents taking in-house call had significantly fewer hours of sleep compared to those taking home call (p=0.01). Residents on "head and neck" rotations (typically consisting of a large proportion of head and neck oncologic surgery) tended to have higher Epworth Sleepiness Scale scores and had significantly fewer hours of sleep (p=.003) and greater work hours (p<.001). Additionally, residents who reported no needle-stick-type incidents or near motor vehicle accidents had significantly lower mean Epworth Sleepiness Scale scores. Only 37.6% of respondents approve of the most recent Accreditation Council for Graduate Medical Education work hour restrictions, and 14% reported averaging greater than 80 hours of work per week. A substantial number of otolaryngology residents are excessively sleepy. Our data suggest that the effects of fatigue play a role in resident well-being and resident safety. Copyright © 2016
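For context on the scores reported above: the Epworth instrument sums eight situational dozing items, each self-rated 0 to 3, into a 0-24 total, with totals above 10 conventionally read as excessive daytime sleepiness (the cohort's mean of 9.9 and median of 10.0 sit right at that threshold). A minimal scoring sketch, with invented item ratings:

```python
# Hedged sketch of Epworth scale scoring: eight 0-3 dozing ratings summed
# to a 0-24 total; totals above 10 suggest excessive daytime sleepiness.
# The ratings below are illustrative, not study data.

def epworth_score(item_ratings):
    """Total Epworth score from eight 0-3 dozing ratings."""
    if len(item_ratings) != 8 or any(not 0 <= r <= 3 for r in item_ratings):
        raise ValueError("expected eight ratings in the range 0-3")
    return sum(item_ratings)

ratings = [2, 1, 0, 3, 1, 2, 1, 1]   # one hypothetical resident's answers
total = epworth_score(ratings)
print(total, "excessively sleepy" if total > 10 else "within normal range")
```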

  9. Developing integrated clinical reasoning competencies in dental students using scaffolded case-based learning - empirical evidence.

    PubMed

    Postma, T C; White, J G

    2016-08-01

    This study provides empirical evidence of the development of integrated clinical reasoning in the discipline-based School of Dentistry, University of Pretoria, South Africa. Students were exposed to case-based learning in comprehensive patient care (CPC) in the preclinical year of study, scaffolded by means of the four-component instructional design model for complex learning. Progress test scores of third- to fifth-year dental students, who received case-based teaching and learning in the third year (2009-2011), were compared to the scores of preceding fourth- and fifth-year cohorts. These fourth- and fifth-year cohorts received content-based teaching concurrently with their clinical training in CPC. The progress test consisted of a complex case study and 32 MCQs on tracer conditions. Students had to gather the necessary information and had to make diagnostic and treatment-planning decisions. Preclinical students who participated in the case-based teaching and learning achieved similar scores compared to final-year students who received lecture-based teaching and learning. Final-year students who participated in the case-based learning made three more correct clinical decisions per student, compared to those who received content-based teaching. Students struggled more with treatment-planning than with diagnostic decisions. The scaffolded case-based learning appears to contribute to accurate clinical decisions when compared to lecture-based teaching. It is suggested that the development of integrated reasoning competencies starts as early as possible in a dental curriculum, perhaps even in the preclinical year of study. Treatment-planning should receive particular attention. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Fault identification of rotor-bearing system based on ensemble empirical mode decomposition and self-zero space projection analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Fan; Zhu, Zhencai; Li, Wei; Zhou, Gongbo; Chen, Guoan

    2014-07-01

    Accurately identifying faults in rotor-bearing systems by analyzing vibration signals, which are nonlinear and nonstationary, is challenging. To address this issue, a new approach based on ensemble empirical mode decomposition (EEMD) and self-zero space projection analysis is proposed in this paper. This method seeks to identify faults appearing in a rotor-bearing system using simple algebraic calculations and projection analyses. First, EEMD is applied to decompose the collected vibration signals into a set of intrinsic mode functions (IMFs) for features. Second, these extracted features under various mechanical health conditions are used to design a self-zero space matrix according to space projection analysis. Finally, the so-called projection indicators are calculated to identify the rotor-bearing system's faults with simple decision logic. Experiments are implemented to test the reliability and effectiveness of the proposed approach. The results show that this approach can accurately identify faults in rotor-bearing systems.
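    The ensemble step that distinguishes EEMD from plain EMD can be sketched briefly: decompose several noise-perturbed copies of the signal and average the resulting IMFs, so that the added white noise cancels while mode mixing is reduced. The minimal sifting-based EMD below is an illustrative stand-in, not the authors' implementation; the envelope construction, stopping rules, and parameters (`n_sift`, `max_imfs`, `noise_std`) are simplified assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, n_sift=10):
    """Extract one IMF from x by repeated sifting with cubic-spline envelopes."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_sift):
        mx = argrelextrema(h, np.greater)[0]
        mn = argrelextrema(h, np.less)[0]
        if len(mx) < 2 or len(mn) < 2:
            break
        upper = CubicSpline(mx, h[mx])(t)
        lower = CubicSpline(mn, h[mn])(t)
        h = h - (upper + lower) / 2.0   # subtract the local mean envelope
    return h

def emd(x, max_imfs=5):
    """Decompose x into IMFs plus a residual (minimal EMD sketch)."""
    imfs, res = [], x.copy()
    for _ in range(max_imfs):
        if len(argrelextrema(res, np.greater)[0]) < 2:
            break
        imf = sift(res)
        imfs.append(imf)
        res = res - imf
    return imfs, res

def eemd(x, n_ens=20, noise_std=0.1, seed=0):
    """Ensemble EMD: average IMFs over noise-perturbed copies of x."""
    rng = np.random.default_rng(seed)
    stacks = []
    for _ in range(n_ens):
        imfs, _ = emd(x + rng.normal(0.0, noise_std * x.std(), len(x)))
        stacks.append(imfs)
    n = min(len(s) for s in stacks)
    return [np.mean([s[k] for s in stacks], axis=0) for k in range(n)]
```

    By construction the IMFs plus the residual sum back to the input signal, which makes the decomposition easy to sanity-check before any feature extraction.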

  11. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    PubMed Central

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-01-01

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer-mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed to artifact reduction from head movements as well as a method to detect blinking signals for mouse control. Kalman filter is used as state estimator for mouse position control and jitter removal. The detection rate obtained in average was 94.9%. Experimental setup and some obtained results are presented. PMID:23948873
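    The jitter-removal role of the Kalman filter mentioned above can be illustrated with a one-axis constant-velocity model. This is a generic textbook filter, not the authors' code; the noise parameters `q` and `r` are placeholders that would need tuning against real gyroscope data.

```python
import numpy as np

def kalman_smooth(z, dt=1.0, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter for one cursor axis.
    z: noisy position measurements; returns filtered positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                   # state transition
    H = np.array([[1.0, 0.0]])                              # observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                     # process noise
    R = np.array([[r]])                                     # measurement noise
    x = np.array([[z[0]], [0.0]])                           # initial [pos, vel]
    P = np.eye(2)
    out = []
    for meas in z:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = np.array([[meas]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)
```

    Running one such filter per screen axis smooths head-movement jitter while still tracking steady pointer motion, since velocity is part of the state.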

  13. Re-reading nursing and re-writing practice: towards an empirically based reformulation of the nursing mandate.

    PubMed

    Allen, Davina

    2004-12-01

    This article examines field studies of nursing work published in the English language between 1993 and 2003 as the first step towards an empirically based reformulation of the nursing mandate. A decade of ethnographic research reveals that, contrary to contemporary theories which promote an image of nursing work centred on individualised unmediated caring relationships, in real-life practice the core nursing contribution is that of the healthcare mediator. Eight bundles of activity that comprise this intermediary role are described utilising evidence from the literature. The mismatch between nursing's culture and ideals and the structure and constraints of the work setting is a chronic source of practitioner dissatisfaction. It is argued that the profession has little to gain by pursuing an agenda of holistic patient care centred on emotional intimacy and that an alternative occupational mandate focused on the healthcare mediator function might make for more humane health services and a more viable professional future.

  14. Empirically Supported Treatments in Psychotherapy: Towards an Evidence-Based or Evidence-Biased Psychology in Clinical Settings?

    PubMed Central

    Castelnuovo, Gianluca

    2010-01-01

    The field of research and practice in psychotherapy has been deeply influenced by two different approaches: the empirically supported treatments (ESTs) movement, linked with the evidence-based medicine (EBM) perspective, and the “Common Factors” approach, typically connected with the “Dodo Bird Verdict”. Regarding the first perspective, a list of ESTs has been maintained in the mental health field since 1998, and criteria for “well-established” and “probably efficacious” treatments have been defined. The development of these paradigms was motivated by the emergence of a “managerial” approach, and of related remuneration systems, among mental health providers and insurance companies. In this article ESTs are presented, underlining also some possible criticisms. Finally, complementary approaches that could add different kinds of evidence to psychotherapy research, in comparison with the traditional EBM approach, are presented. PMID:21833197

  15. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Several methods have recently been developed to correct such baseline signals; however, most of them cannot estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). The method was evaluated on both simulated data and in-vivo (1)H-MRS human brain signals. Results demonstrate the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
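    The baseline-as-slow-components idea can be sketched independently of any particular EMD library: given a set of decomposition components, treat those with very few zero crossings (the lowest-frequency ones, including the residual) as baseline and subtract their sum from the spectrum. The `max_crossings` threshold is an illustrative assumption, not a value from the paper.

```python
import numpy as np

def zero_crossings(c):
    """Count sign changes; a rough proxy for a component's mean frequency."""
    return int(np.sum(np.signbit(c[:-1]) != np.signbit(c[1:])))

def correct_baseline(spectrum, components, max_crossings=4):
    """Treat the slowest components (few zero crossings) as baseline
    and subtract their sum from the spectrum."""
    baseline = np.zeros_like(spectrum)
    for c in components:
        if zero_crossings(c) <= max_crossings:
            baseline += c
    return spectrum - baseline, baseline
```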

  16. Signal-to-Noise Ratio Enhancement Based on Empirical Mode Decomposition in Phase-Sensitive Optical Time Domain Reflectometry Systems.

    PubMed

    Qin, Zengguang; Chen, Hui; Chang, Jun

    2017-08-14

    We propose a novel denoising method based on empirical mode decomposition (EMD) to improve the signal-to-noise ratio (SNR) for vibration sensing in phase-sensitive optical time domain reflectometry (φ-OTDR) systems. Raw Rayleigh backscattering traces are decomposed into a series of intrinsic mode functions (IMFs) and a residual component using an EMD algorithm. High frequency noise is eliminated by removing several IMFs at the position without vibration selected by the Pearson correlation coefficient (PCC). When the pulse width is 50 ns, the SNR of location information for the vibration events of 100 Hz and 1.2 kHz is increased to as high as 42.52 dB and 39.58 dB, respectively, with a 2 km sensing fiber, which demonstrates the excellent performance of this new method.
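    The PCC-based selection rule can be sketched as follows: keep only the IMFs that correlate well with the raw trace and discard the rest as noise before reconstruction. This is a simplified illustration with a placeholder `threshold`, not the paper's exact procedure, which evaluates the PCC at positions without vibration.

```python
import numpy as np

def pcc(a, b):
    """Pearson correlation coefficient between two 1-D arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def denoise_by_pcc(raw, imfs, threshold=0.2):
    """Reconstruct the signal from IMFs whose correlation with the raw
    trace exceeds the threshold; weakly correlated IMFs are dropped as noise."""
    return np.sum([c for c in imfs if abs(pcc(c, raw)) >= threshold], axis=0)
```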

  17. Restoration of images degraded by signal-dependent noise based on energy minimization: an empirical study

    NASA Astrophysics Data System (ADS)

    Bajić, Buda; Lindblad, Joakim; Sladoje, Nataša

    2016-07-01

    Most energy minimization-based restoration methods are developed for signal-independent Gaussian noise. The assumption of Gaussian noise distribution leads to a quadratic data fidelity term, which is appealing in optimization. When an image is acquired with a photon counting device, it contains signal-dependent Poisson or mixed Poisson-Gaussian noise. We quantify the loss in performance that occurs when a restoration method suited for Gaussian noise is applied to mixed noise. Signal-dependent noise can be treated by methods based either on the classical maximum a posteriori (MAP) probability approach or on a variance stabilization approach (VST). We compare the performance of these approaches on a large set of images and observe that VST-based methods outperform those based on MAP in both quality of restoration and computational efficiency. We also quantify the improvement achieved by utilizing Huber regularization instead of classical total variation regularization. The conclusion of our study is a recommendation to utilize a VST-based approach combined with regularization by the Huber potential for restoration of images degraded by blur and signal-dependent noise. This combination provides a robust and flexible method with good performance and high speed.
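    The Huber potential recommended in the conclusion is easy to state: quadratic for small arguments (smooth, like a Gaussian-noise term) and linear for large ones (edge-preserving, like total variation). A minimal sketch, with `delta` as a tunable crossover threshold:

```python
import numpy as np

def huber(t, delta=1.0):
    """Huber potential: 0.5*t^2 for |t| <= delta, linear beyond."""
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t ** 2, delta * (a - 0.5 * delta))

def huber_regularizer(u, delta=0.05):
    """Sum of Huber potentials over horizontal and vertical image gradients,
    a smooth edge-preserving alternative to total variation."""
    return huber(np.diff(u, axis=0), delta).sum() + huber(np.diff(u, axis=1), delta).sum()
```

    In a restoration energy this term replaces the total variation regularizer; its differentiability is one reason it can be faster to optimize.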

  18. Empirical model of O+-H+ transition height based on topside sounder data

    NASA Astrophysics Data System (ADS)

    Marinov, P.; Kutiev, I.; Watanabe, S.

    2004-01-01

    A new model of the O+-H+ transition height (denoted THM) is developed, based on vertical electron density profiles from topside ionosondes. The model provides the transition height as a function of month of the year, local time, geomagnetic latitude, longitude, and solar flux F10.7. To define TH, the O+ scale height is approximated by the lowest gradient in the measured profile and the O+ profile is reconstructed. TH is taken at the height where the O+ density becomes half of the total electron density. The model database contains 170,033 TH values, sufficiently sampling all parameter ranges. THM describes the transition height by a multivariable polynomial composed of Chebyshev and trigonometric basis functions, fitted to the data in the five-dimensional space. The model results are compared with other available models. The comparison shows that THM predictions agree in general with those of the other models, but THM variations with latitude, longitude, and local time have larger amplitudes.
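    The TH definition used in the abstract (the height where the O+ density falls to half the total electron density) reduces to a root-finding step on the measured profile. A minimal sketch using linear interpolation between grid points; the array names are illustrative:

```python
import numpy as np

def transition_height(h, n_oplus, n_e):
    """Return the height where the O+/Ne ratio first drops to one half,
    using linear interpolation between profile grid points."""
    ratio = n_oplus / n_e
    if ratio[0] <= 0.5 or np.all(ratio > 0.5):
        raise ValueError("O+ fraction does not cross 0.5 on this grid")
    idx = int(np.argmax(ratio <= 0.5))        # first point at/below one half
    r0, r1 = ratio[idx - 1], ratio[idx]
    frac = (0.5 - r0) / (r1 - r0)
    return h[idx - 1] + frac * (h[idx] - h[idx - 1])
```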

  19. Performance-based management and quality of work: an empirical assessment.

    PubMed

    Falzon, Pierre; Nascimento, Adelaide; Gaudart, Corinne; Piney, Cécile; Dujarier, Marie-Anne; Germe, Jean-François

    2012-01-01

    In France, in the private sector as in the public sector, performance-based management tends to become a norm. Performance-based management is supposed to improve service quality, productivity and efficiency, transparency of allotted means and achieved results, and to better focus the activity of employees and of the whole organization. This text reports a study conducted for the French Ministry of Budget by a team of researchers in ergonomics, sociology and management science, in order to assess the impact of performance-based management on employees, on teams and on work organization. About 100 interviews were conducted with employees of all categories and 6 working groups were set up in order to discuss and validate or amend our first analyses. Results concern several aspects: workload and work intensification, indicators and performance management and the transformation of jobs induced by performance management.

  20. Process-based models not always better than empirical models for simulating budburst of Norway spruce and birch in Europe.

    PubMed

    Olsson, Cecilia; Jönsson, Anna Maria

    2014-11-01

    Budburst models have mainly been developed to capture the processes of individual trees, and they vary in their complexity and plant physiological realism. We evaluated how well eleven models capture the variation in budburst of birch and Norway spruce in Germany, Austria, the United Kingdom and Finland. The comparison was based on the models' performance in relation to their underlying physiological assumptions, using four different calibration schemes. The models were not able to accurately simulate the timing of budburst. In general, the models overestimated the temperature effect and thereby simulated budburst too early in the United Kingdom and too late in Finland. Among the better-performing models were three models based on the growing degree day concept, with or without day length or chilling, and an empirical model based on spring temperatures. These models were also the models least influenced by the calibration data. For birch the best calibration scheme was based on multiple sites in either Germany or Europe, and for Norway spruce the best scheme included multiple sites in Germany or cold years of all sites. Most model and calibration combinations indicated greater bias with higher spring temperatures, mostly simulating earlier than observed budburst. © 2014 John Wiley & Sons Ltd.
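    The growing degree day (GDD) concept behind the better-performing models is simple to state: daily forcing is the exceedance of mean temperature over a base temperature, and budburst is predicted on the first day the accumulated forcing reaches a species-specific requirement. A minimal sketch; `t_base` and `gdd_requirement` are placeholder values, not calibrated parameters from the paper:

```python
import numpy as np

def budburst_day(daily_mean_temp, t_base=5.0, gdd_requirement=150.0, start_day=0):
    """Predict the budburst day index with a growing-degree-day model:
    accumulate max(T - t_base, 0) from start_day until the requirement is met.
    Returns None if the requirement is never reached."""
    temps = np.asarray(daily_mean_temp, dtype=float)[start_day:]
    forcing = np.maximum(temps - t_base, 0.0)
    hits = np.nonzero(np.cumsum(forcing) >= gdd_requirement)[0]
    return start_day + int(hits[0]) if hits.size else None
```

    Chilling and day-length variants add extra gating conditions before forcing accumulates, but the accumulation step itself is the same.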

  1. Ticking the boxes: a survey of workplace-based assessments

    PubMed Central

    Gilberthorpe, Thomas; Sarfo, Maame Duku; Lawrence-Smith, Geoff

    2016-01-01

    Aims and method: To survey the quality of workplace-based assessments (WPBAs) through retrospective analysis of completed WPBA forms against training targets derived from the Royal College of Psychiatrists' Portfolio Online. Results: Almost a third of assessments analysed showed no divergence in assessment scores across the varied assessment domains, and there was poor correlation between domain scores and the nature of comments provided by assessors. Of the assessments that suggested action points, only half were considered sufficiently ‘specific’ and ‘achievable’ to be useful for trainees' learning. Clinical implications: WPBA is not currently being utilised to its full potential as a formative assessment tool, and more widespread audit is needed to establish whether this is a local or a national issue. PMID:27087994

  2. A survey of population-based drug databases in Canada.

    PubMed Central

    Miller, E; Blatman, B; Einarson, T R

    1996-01-01

    OBJECTIVE: To identify the population-based drug databases in Canada and to determine their comprehensiveness and accessibility for performing pharmacoepidemiologic and outcomes research. DESIGN: Survey (four-part mailed questionnaire). SETTING: Public and private third-party drug plans across Canada. PARTICIPANTS: All provincial and territorial drug plan or pharmacare managers as well as selected private plan managers including health benefit consultants, group insurers and claims adjudicators/pharmacy benefit managers (CA/PBMs). OUTCOME MEASURES: Patient, drug and pharmacy information; potential for electronic linkages to other provincial databases (e.g., physician, hospital, vital statistics); accessibility of information; population profile. RESULTS: Of the 32 recipients of the questionnaire 29 (91%) responded and 18 (56%) completed the survey. Most databases were reported to contain patient information (e.g., patient identification number, age, sex and medication history) and prescription drug information (e.g., drug identification number, strength, quantity and cost). Six provinces and one territory reported the capability to link to other databases (e.g., hospital and physician databases). One CA/PBM reported some links to selected long-term disability data. All of the government databases except those in British Columbia and the Yukon Territory allowed use of the data for research purposes. Manitoba and Saskatchewan included all residents of the province in their database; the others included selected groups (e.g., residents 65 years of age or older, people on social assistance or people covered by private group insurance). CONCLUSION: A number of public and private population-based databases are available for use in pharmacoepidemiologic and outcomes research. PMID:8653645

  3. Homogeneity in Community-Based Rape Prevention Programs: Empirical Evidence of Institutional Isomorphism

    ERIC Educational Resources Information Center

    Townsend, Stephanie M.; Campbell, Rebecca

    2007-01-01

    This study examined the practices of 24 community-based rape prevention programs. Although these programs were geographically dispersed throughout one state, they were remarkably similar in their approach to rape prevention programming. DiMaggio and Powell's (1991) theory of institutional isomorphism was used to explain the underlying causes of…

  4. Perceptions of the Effectiveness of System Dynamics-Based Interactive Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Qudrat-Ullah, Hassan

    2010-01-01

    The use of simulations in general and of system dynamics simulation based interactive learning environments (SDILEs) in particular is well recognized as an effective way of improving users' decision making and learning in complex, dynamic tasks. However, the effectiveness of SDILEs in classrooms has rarely been evaluated. This article describes…

  5. Empirical Investigation into Motives for Choosing Web-Based Distance Learning Programs

    ERIC Educational Resources Information Center

    Alkhattabi, Mona

    2016-01-01

    Today, in association with rapid social and economic changes, there is an increasing level of demand for distance and online learning programs. This study will focus on identifying the main motivational factors for choosing a web-based distance-learning program. Moreover, it will investigate how these factors relate to age, gender, marital status…

  6. Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools

    ERIC Educational Resources Information Center

    Samdal, Oddrun; Rowling, Louise

    2011-01-01

    Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…

  8. An Adaptive E-Learning System Based on Students' Learning Styles: An Empirical Study

    ERIC Educational Resources Information Center

    Drissi, Samia; Amirat, Abdelkrim

    2016-01-01

    Personalized e-learning implementation is recognized as one of the most interesting research areas in the distance web-based education. Since the learning style of each learner is different one must fit e-learning with the different needs of learners. This paper presents an approach to integrate learning styles into adaptive e-learning hypermedia.…

  10. Young Readers' Narratives Based on a Picture Book: Model Readers and Empirical Readers

    ERIC Educational Resources Information Center

    Hoel, Trude

    2015-01-01

    The article presents parts of a research project whose aim is to investigate six- to seven-year-old children's language use in storytelling. The children's oral texts are based on the wordless picture book "Frog, Where Are You?", which has been, and still remains, a frequent tool for collecting narratives from children. The Frog story…

  13. An Empirical Typology of Residential Care/Assisted Living Based on a Four-State Study

    ERIC Educational Resources Information Center

    Park, Nan Sook; Zimmerman, Sheryl; Sloane, Philip D.; Gruber-Baldini, Ann L.; Eckert, J. Kevin

    2006-01-01

    Purpose: Residential care/assisted living describes diverse facilities providing non-nursing home care to a heterogeneous group of primarily elderly residents. This article derives typologies of assisted living based on theoretically and practically grounded evidence. Design and Methods: We obtained data from the Collaborative Studies of Long-Term…

  14. Teaching Standards-Based Group Work Competencies to Social Work Students: An Empirical Examination

    ERIC Educational Resources Information Center

    Macgowan, Mark J.; Vakharia, Sheila P.

    2012-01-01

    Objectives: Accreditation standards and challenges in group work education require competency-based approaches in teaching social work with groups. The Association for the Advancement of Social Work with Groups developed Standards for Social Work Practice with Groups, which serve as foundation competencies for professional practice. However, there…

  16. Learning Environment in Light of Positional, Institutional, and Cultural Interpretations: An Empirically-Based Conceptual Analysis

    ERIC Educational Resources Information Center

    Kovac, Velibor Bobo; Lund, Ingrid; Omdal, Heidi

    2017-01-01

    This study explores the possibility that the concept of learning environment (LE) is understood and interpreted differently by various users, depending on their relative positions in the educational system, institutional affiliation, and cultural heritage. The study employs a qualitative approach and is based on 14 semistructured separate…

  17. An Empirical Base for Teaching the Past Tense in German as a Foreign Language.

    ERIC Educational Resources Information Center

    Watzinger-Tharp, Johanna

    1994-01-01

    Offers an analysis of native speech data as a base for teaching the past tense in German. Interviews with native speakers revealed that the present perfect serves as the dominant past tense for main verbs; modal verbs, the copula "sein,""haben," and certain formulaic expressions occur in the preterite; and past tense use is linked to contextual…

  18. Web-based Educational Media: Issues and Empirical Test of Learning.

    ERIC Educational Resources Information Center

    Radhakrishnan, Senthil; Bailey, James E.

    This paper addresses issues and cost benefits of World Wide Web-based education systems. It presents the results of an effort to identify problems that arise when considering this media and suggests conceptual solutions to some of these problems. To evaluate these solutions, a prototype system was built and tested in an engineering classroom; the…

  19. Preparation, Practice, and Performance: An Empirical Examination of the Impact of Standards-Based Instruction on Secondary Students' Math and Science Achievement

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2009-01-01

    For almost two decades proponents of educational reform have advocated the use of standards-based education in maths and science classrooms for improving teacher practices, increasing student learning, and raising the quality of maths and science instruction. This study empirically examined the impact of specific standards-based teacher…

  20. Racism, health status, and birth outcomes: results of a participatory community-based intervention and health survey.

    PubMed

    Carty, Denise C; Kruger, Daniel J; Turner, Tonya M; Campbell, Bettina; DeLoney, E Hill; Lewis, E Yvonne

    2011-02-01

    Many community-based participatory research (CBPR) partnerships address social determinants of health as a central consideration. However, research studies that explicitly address racism are scarce in the CBPR literature, and there is a dearth of available community-generated data to empirically examine how racism influences health disparities at the local level. In this paper, we provide results of a cross-sectional, population-based health survey conducted in the urban areas of Genesee and Saginaw Counties in Michigan to assess how a sustained community intervention to reduce racism and infant mortality influenced knowledge, beliefs, and experiences of racism and to explore how perceived racism is associated with self-rated health and birth outcomes. We used ANOVA and regression models to compare the responses of intervention participants and non-participants as well as African Americans and European Americans (N = 629). We found that intervention participants reported greater acknowledgment of the enduring and differential impact of racism in comparison to the non-intervention participants. Moreover, survey analyses revealed that racism was associated with health in the following ways: (1) experiences of racial discrimination predicted self-rated physical health, mental health, and smoking status; (2) perceived racism against one's racial group predicted lower self-rated physical health; and (3) emotional responses to racism-related experiences were marginally associated with lower birth-weight births in the study sample. Our study bolsters the published findings on perceived racism and health outcomes and highlights the usefulness of CBPR and community surveys to empirically investigate racism as a social determinant of health.

  1. Empirical Characteristics of Family-Based Linkage to a Complex Trait: the ADIPOQ Region and Adiponectin Levels

    PubMed Central

    Hellwege, Jacklyn N.; Palmer, Nicholette D.; Brown, W. Mark; Ziegler, Julie T.; An, S. Sandy; Guo, Xiuqing; Chen, Y.-D. Ida; Taylor, Kent; Hawkins, Gregory A.; Ng, Maggie C.Y.; Speliotes, Elizabeth K.; Lorenzo, Carlos; Norris, Jill M.; Rotter, Jerome I.; Wagenknecht, Lynne E.; Langefeld, Carl D.; Bowden, Donald W.

    2014-01-01

    We previously identified a low frequency (1.1%) coding variant (G45R; rs200573126) in the adiponectin gene (ADIPOQ) which was the basis for a multipoint microsatellite linkage signal (LOD = 8.2) for plasma adiponectin levels in Hispanic families. We have empirically evaluated the ability of data from targeted common variants, exome chip genotyping, and genome-wide association study (GWAS) data to detect linkage and association to adiponectin protein levels at this locus. Simple two-point linkage and association analyses were performed in 88 Hispanic families (1150 individuals) using 10,958 SNPs on chromosome 3. Approaches were compared for their ability to map the functional variant, G45R, which was strongly linked (two-point LOD = 20.98) and powerfully associated (p-value = 8.1×10⁻⁵⁰). Over 450 SNPs within a broad 61 Mb interval around rs200573126 showed nominal evidence of linkage (LOD > 3) but only four other SNPs in this region were associated with p-values < 1.0×10⁻⁴. When G45R was accounted for, the maximum LOD score across the interval dropped to 4.39 and the best p-value was 1.1×10⁻⁵. Linked and/or associated variants ranged in frequency (0.0018 to 0.50) and type (coding, non-coding) and had little detectable linkage disequilibrium with rs200573126 (r² < 0.20). In addition, the two-point linkage approach empirically outperformed multipoint microsatellite and multipoint SNP analysis. In the absence of data for rs200573126, family-based linkage analysis using a moderately dense SNP dataset, including both common and low frequency variants, resulted in stronger evidence for an adiponectin locus than association data alone. Thus, linkage analysis can be a useful tool to facilitate identification of high impact genetic variants. PMID:25447270
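    For orientation, the two-point LOD statistic referred to throughout is a log10 likelihood ratio comparing a hypothesized recombination fraction θ against free recombination (θ = 0.5). The sketch below uses the simplest phase-known, fully informative meiosis model, which is far simpler than the variance-component machinery actually used in such studies:

```python
import math

def lod_score(n_recombinant, n_total, theta):
    """Two-point LOD score for phase-known meioses:
    log10 of L(theta) / L(theta = 0.5)."""
    r, n = n_recombinant, n_total
    l_theta = (theta ** r) * ((1.0 - theta) ** (n - r))
    l_null = 0.5 ** n
    return math.log10(l_theta / l_null)
```

    By convention a LOD above 3 is taken as significant evidence of linkage, which is the threshold the abstract applies when counting nominally linked SNPs.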

  2. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    PubMed Central

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho-kwang

    2014-01-01

Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). The cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: compounds whose cations have larger ambient-pressure radii have a higher transition pressure. PMID:25417655

  3. First Empirical Evaluation of Outcomes for Mentalization-Based Group Therapy for Adolescents With BPD.

    PubMed

    Bo, Sune; Sharp, Carla; Beck, Emma; Pedersen, Jesper; Gondan, Matthias; Simonsen, Erik

    2016-08-15

Adolescent borderline personality disorder (BPD) is a devastating disorder, and it is essential to identify and treat the disorder in its early course. A total of 34 female Danish adolescents between 15 and 18 years old participated in 1 year of structured mentalization-based group therapy. Twenty-five adolescents completed the study, of whom the majority (23) displayed improvement regarding borderline symptoms, depression, self-harm, peer attachment, parent attachment, mentalizing, and general psychopathology. Enhanced trust in peers and parents in combination with improved mentalizing capacity was associated with a greater decline in borderline symptoms, thereby pointing to a candidate mechanism responsible for the efficacy of the treatment. The current study provides a promising rationale for the further development and evaluation of group-format mentalization-based treatment for adolescents with borderline traits. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Contextual effects in school-based violence prevention programs: a conceptual framework and empirical review.

    PubMed

    Ozer, Emily J

    2006-05-01

This paper reviews the theoretical and practical importance of studying contextual factors in school-based violence prevention programs and provides a framework for evaluating factors at the classroom, school, and community/district level. Sixty-two published papers describing 38 different programs were reviewed; of these, 16 were identified that reported data on contextual effects or discussed possible contextual effects on the intervention. The small number of studies precludes definitive conclusions regarding contextual effects in school-based violence prevention programs, but suggests (a) some evidence for contextual effects on program outcomes, and (b) interdependence of context and implementation factors in influencing outcomes. Editors' Strategic Implications: This review suggests that contextual effects are important to school violence prevention, as context can influence outcomes directly and through interactions with implementation factors. Consequently, characteristics of the classroom, school, and community contexts should be considered by practitioners when implementing prevention programs and measured by researchers studying the processes and outcomes of these programs.

  5. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
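The claim that a priori function counts alone explain only part of effort variation amounts to computing r² between counts and recorded effort. A minimal Python sketch with invented project data (not the study's nineteen systems):

```python
# r^2 between a priori function counts and development effort: if it is
# well below 1, function counts by themselves leave most effort variation
# unexplained. Counts and effort hours below are invented for illustration.
from statistics import mean

def r_squared(xs, ys):
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

counts = [120, 300, 150, 420, 200, 520, 180, 260]   # function counts
effort = [900, 2100, 1500, 2600, 2300, 3200, 800, 2500]  # person-hours
r2 = r_squared(counts, effort)
```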

  6. Evidence-based architectural and space design supports Magnet® empirical outcomes.

    PubMed

    Ecoff, Laurie; Brown, Caroline E

    2010-12-01

    This department expands nursing leaders' knowledge and competencies in health facility design. The editor of this department, Dr Jaynelle Stichler, asked guest authors, Drs Ecoff and Brown, to describe the process of using the conceptual models of a nursing evidence-based practice model and the Magnet Recognition Program® as a structured process to lead decision making in the planning and design processes and to achieve desired outcomes in hospital design.

  7. Why Culture Matters: An Empirically-Based Pre-Deployment Training Program

    DTIC Science & Technology

    2005-09-01

and psychomotor) which can be viewed as categories that describe the goals of a learner-centered training process (AFM 36-2234 1993; Cook 2000; AFM...to train to the higher levels of learning, it enables learner-centered training and not just subjective or preference-based teacher-oriented topical...external • Thought Patterns • Analytic - Relational • Theoretical learning and knowledge - Experiential or kinesthetic learning and knowledge

  8. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    PubMed Central

    Henggeler, Scott W.; Sheidow, Ashli J.

    2011-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief strategic family therapy. In addition to summarizing the theoretical and clinical bases of these treatments, their results in efficacy and effectiveness trials are examined with particular emphasis on any demonstrated capacity to achieve favorable outcomes when implemented by real world practitioners in community practice settings. Special attention is also devoted to research on purported mechanisms of change as well as the long-term sustainability of outcomes achieved by these treatment models. Importantly, we note that the developers of each of the models have developed quality assurance systems to support treatment fidelity and youth and family outcomes; and the developers have formed purveyor organizations to facilitate the large scale transport of their respective treatments to community settings nationally and internationally. PMID:22283380

  9. Empirical evaluation of analytical models for parallel relational data-base queries. Master's thesis

    SciTech Connect

    Denham, M.C.

    1990-12-01

This thesis documents the design and implementation of three parallel join algorithms to be used in the verification of analytical models developed by Kearns. Kearns developed a set of analytical models for a variety of relational database queries. These models serve as tools for the design of parallel relational database systems. Each of Kearns' models is classified as either single step or multiple step. The single step models reflect queries that require only one operation while the multiple step models reflect queries that require multiple operations. Three parallel join algorithms were implemented based upon Kearns' models. Two are based upon single step join models and one is based upon a multiple step join model. They are implemented on an Intel iPSC/1 parallel computer. The single step join algorithms include the parallel nested-loop join and the bucket (or hash) join. The multiple step algorithm that was implemented is a pipelined version of the bucket join. The results show that, within the constraints of the test cases run, all three models are accurate to within about 8.5%, and they should prove useful in the design of parallel relational database systems.
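Of the algorithms mentioned, the bucket (hash) join is easy to sketch: build a hash table on one relation's join key, then probe it with the other relation. A serial, in-memory Python illustration with invented relation contents (the thesis's versions run partitioned across iPSC/1 nodes):

```python
# Minimal in-memory hash (bucket) join: bucket the left relation's tuples
# by join key, then probe the buckets with each tuple of the right relation.
from collections import defaultdict

def hash_join(left, right, key):
    buckets = defaultdict(list)          # build phase
    for row in left:
        buckets[row[key]].append(row)
    joined = []
    for row in right:                    # probe phase
        for match in buckets.get(row[key], []):
            joined.append({**match, **row})
    return joined

# Invented example relations (column names are ours, not the thesis's).
emp = [{"dept": 1, "name": "a"}, {"dept": 2, "name": "b"}]
dep = [{"dept": 1, "dname": "x"}, {"dept": 3, "dname": "y"}]
result = hash_join(emp, dep, "dept")
```

A parallel version partitions both relations across processors by hashing the key, so each node can run exactly this serial join on its own partition.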

  10. An Empirical Study of the Transmission Power Setting for Bluetooth-Based Indoor Localization Mechanisms

    PubMed Central

    Castillo-Cara, Manuel; Lovón-Melgarejo, Jesús; Bravo-Rocca, Gusseppe; Orozco-Barbosa, Luis; García-Varea, Ismael

    2017-01-01

    Nowadays, there is a great interest in developing accurate wireless indoor localization mechanisms enabling the implementation of many consumer-oriented services. Among the many proposals, wireless indoor localization mechanisms based on the Received Signal Strength Indication (RSSI) are being widely explored. Most studies have focused on the evaluation of the capabilities of different mobile device brands and wireless network technologies. Furthermore, different parameters and algorithms have been proposed as a means of improving the accuracy of wireless-based localization mechanisms. In this paper, we focus on the tuning of the RSSI fingerprint to be used in the implementation of a Bluetooth Low Energy 4.0 (BLE4.0) Bluetooth localization mechanism. Following a holistic approach, we start by assessing the capabilities of two Bluetooth sensor/receiver devices. We then evaluate the relevance of the RSSI fingerprint reported by each BLE4.0 beacon operating at various transmission power levels using feature selection techniques. Based on our findings, we use two classification algorithms in order to improve the setting of the transmission power levels of each of the BLE4.0 beacons. Our main findings show that our proposal can greatly improve the localization accuracy by setting a custom transmission power level for each BLE4.0 beacon. PMID:28590413
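A common baseline for RSSI-fingerprint localization of the kind described is nearest-neighbour matching: compare a new RSSI vector against labelled training fingerprints and return the closest cell. A minimal Python sketch with invented readings and cell names (the paper evaluates more elaborate classifiers and feature selection):

```python
# Nearest-neighbour fingerprint classifier: each fingerprint is a vector
# of RSSI readings (dBm), one per BLE beacon, labelled with the cell it
# was collected in. All values below are invented for illustration.
import math

def nearest_cell(fingerprints, sample):
    """fingerprints: list of (rssi_vector, cell_label) pairs."""
    return min(fingerprints, key=lambda fp: math.dist(fp[0], sample))[1]

train = [
    ((-55, -80, -91), "cell_A"),
    ((-78, -52, -88), "cell_B"),
    ((-90, -85, -50), "cell_C"),
]
observed = (-57, -79, -92)   # new reading from the three beacons
cell = nearest_cell(train, observed)
```

Tuning each beacon's transmission power, as the study does, effectively reshapes these RSSI vectors so that cells become easier to separate.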

  11. An Empirical Study of the Transmission Power Setting for Bluetooth-Based Indoor Localization Mechanisms.

    PubMed

    Castillo-Cara, Manuel; Lovón-Melgarejo, Jesús; Bravo-Rocca, Gusseppe; Orozco-Barbosa, Luis; García-Varea, Ismael

    2017-06-07

    Nowadays, there is a great interest in developing accurate wireless indoor localization mechanisms enabling the implementation of many consumer-oriented services. Among the many proposals, wireless indoor localization mechanisms based on the Received Signal Strength Indication (RSSI) are being widely explored. Most studies have focused on the evaluation of the capabilities of different mobile device brands and wireless network technologies. Furthermore, different parameters and algorithms have been proposed as a means of improving the accuracy of wireless-based localization mechanisms. In this paper, we focus on the tuning of the RSSI fingerprint to be used in the implementation of a Bluetooth Low Energy 4.0 (BLE4.0) Bluetooth localization mechanism. Following a holistic approach, we start by assessing the capabilities of two Bluetooth sensor/receiver devices. We then evaluate the relevance of the RSSI fingerprint reported by each BLE4.0 beacon operating at various transmission power levels using feature selection techniques. Based on our findings, we use two classification algorithms in order to improve the setting of the transmission power levels of each of the BLE4.0 beacons. Our main findings show that our proposal can greatly improve the localization accuracy by setting a custom transmission power level for each BLE4.0 beacon.

  12. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    PubMed Central

    Liu, Chung-Feng; Tsai, Yung-Chieh; Jang, Fong-Lin

    2013-01-01

    The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a web-based infertile PHR system and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors will influence the behavioral intentions (BI) of infertile patients to use the PHR. From ninety participants from a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHR and improve the quality of the physician-patient relationship to increase patients’ intention of using PHR. PMID:24142185

  13. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User’s Head Movement

    PubMed Central

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can be different, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not provide it publicly. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system shows high performance in terms of accuracy, user convenience and interest. PMID:27589768

  14. Empirical force field for cisplatin based on quantum dynamics data: case study of new parameterization scheme for coordination compounds.

    PubMed

    Yesylevskyy, S; Cardey, Bruno; Kraszewski, S; Foley, Sarah; Enescu, Mironel; da Silva, Antônio M; Dos Santos, Hélio F; Ramseyer, Christophe

    2015-10-01

Parameterization of molecular complexes containing a metallic compound, such as cisplatin, is challenging due to the unconventional coordination nature of the bonds which involve platinum atoms. In this work, we develop a new methodology of parameterization for such compounds based on quantum dynamics (QD) calculations. We show that the coordination bonds and angles are more flexible than in normal covalent compounds. The influence of explicit solvent is also shown to be crucial to determine the flexibility of cisplatin in quantum dynamics simulations. Two empirical topologies of cisplatin were produced by fitting its atomic fluctuations against QD in vacuum and QD with an explicit first solvation shell of water molecules, respectively. A third topology, built in a standard way from the static optimized structure, was used for comparison. The latter leads to an excessively rigid molecule and exhibits much smaller fluctuations of the bonds and angles than QD reveals. It is shown that accounting for the high flexibility of the cisplatin molecule is needed for an adequate description of its first hydration shell. MD simulations with the flexible QD-based topology also reveal a significant decrease of the barrier of passive diffusion of cisplatin across the model lipid bilayer. These results confirm that flexibility of organometallic compounds is an important feature to be considered in classical molecular dynamics topologies. The proposed methodology based on QD simulations provides a systematic way of building such topologies.
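The idea of fitting a topology to observed fluctuations can be illustrated with the simplest harmonic-bond case, where equipartition relates the force constant to the mean-square fluctuation, k = kB·T/⟨Δr²⟩: larger observed fluctuations imply a softer bond. A Python sketch with invented numbers (the paper's fitting procedure is more involved than this one-parameter case):

```python
# Harmonic force constant from bond-length fluctuations via equipartition:
# k = kB * T / <dr^2>.  Sample deviations below are invented, not the
# paper's QD trajectory data.
KB = 0.0019872041  # Boltzmann constant, kcal/(mol*K)

def harmonic_k_from_fluctuations(dr_samples, temperature):
    var = sum(d * d for d in dr_samples) / len(dr_samples)  # <dr^2>, Angstrom^2
    return KB * temperature / var  # kcal/(mol*Angstrom^2)

# bond-length deviations (Angstrom) from a hypothetical trajectory
devs = [0.02, -0.03, 0.01, -0.02, 0.03, -0.01]
k = harmonic_k_from_fluctuations(devs, 300.0)
```

A topology fit from a static optimized structure typically yields stiffer constants than this fluctuation-based route, matching the paper's observation that the standard topology is excessively rigid.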

  15. Synthesizing Results From Empirical Research on Computer-Based Scaffolding in STEM Education

    PubMed Central

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2016-01-01

Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has synthesized the results of these studies. This review addresses that need by synthesizing the results of 144 experimental studies (333 outcomes) on the effects of computer-based scaffolding designed to assist the full range of STEM learners (primary through adult education) as they navigated ill-structured, problem-centered curricula. Results of our random effect meta-analysis (a) indicate that computer-based scaffolding showed a consistently positive (ḡ = 0.46) effect on cognitive outcomes across various contexts of use, scaffolding characteristics, and levels of assessment and (b) shed light on many scaffolding debates, including the roles of customization (i.e., fading and adding) and context-specific support. Specifically, scaffolding's influence on cognitive outcomes did not vary with context-specificity, the presence or absence of scaffolding change, or the logic by which scaffolding change was implemented. Scaffolding's influence was greatest when measured at the principles level and among adult learners. Still, scaffolding's effect was substantial and significantly greater than zero across all age groups and assessment levels. These results suggest that scaffolding is a highly effective intervention across levels of different characteristics and can largely be designed in many different ways while still being highly effective. PMID:28344365
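The pooling machinery behind a random-effects meta-analysis like this one is typically the DerSimonian-Laird estimator: weight each study by the inverse of its within-study variance plus an estimated between-study variance. A compact Python sketch with invented study effects (not the review's 333 outcomes):

```python
# DerSimonian-Laird random-effects pooling of standardized effect sizes.
# Effects and variances below are invented for illustration.
def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)

g = [0.30, 0.55, 0.42, 0.61, 0.20]      # per-study Hedges' g
v = [0.02, 0.03, 0.015, 0.05, 0.04]     # per-study sampling variances
pooled = dersimonian_laird(g, v)
```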

  16. An empirical analysis of exposure-based regulation to abate toxic air pollution

    SciTech Connect

    Marakovits, D.M.; Considine, T.J.

    1996-11-01

    Title III of the 1990 Clean Air Act Amendments requires the Environmental Protection Agency to regulate 189 air toxics, including emissions from by-product coke ovens. Economists criticize the inefficiency of uniform standards, but Title III makes no provision for flexible regulatory instruments. Environmental health scientists suggest that population exposure, not necessarily ambient air quality, should motivate environmental air pollution policies. Using an engineering-economic model of the United States steel industry, we estimate that an exposure-based policy can achieve the same level of public health as coke oven emissions standards and can reduce compliance costs by up to 60.0%. 18 refs., 3 figs., 1 tab.

  17. The weight of unfinished plate: A survey based characterization of restaurant food waste in Chinese cities.

    PubMed

    Wang, Ling-En; Liu, Gang; Liu, Xiaojie; Liu, Yao; Gao, Jun; Zhou, Bin; Gao, Si; Cheng, Shengkui

    2017-08-01

Consumer food waste has attracted increasing public, academic, and political attention in recent years, due to its adverse resource, environmental, and socioeconomic impacts. The scales and patterns of consumer food waste, especially in developing countries, however, remain poorly understood, which may hinder the global effort of reducing food waste. In this study, based on a direct weighing method and a survey of 3557 tables in 195 restaurants in four case cities, we investigated the amount and patterns of restaurant food waste in China in 2015. Food waste per capita per meal in the four cities was 93 g, consisting mainly of vegetables (29%), rice (14%), aquatic products (11%), wheat (10%), and pork (8%). This equals approximately 11 kg per capita per year and is not far from that of western countries, although per capita GDP of China is still much lower. We also found that food waste per capita per meal varies considerably by city (Chengdu and Lhasa higher than Shanghai and Beijing), consumer group (tourists higher than local residents), restaurant category (more waste in larger restaurants), and purpose of meal (friends gathering and business banquet higher than working meal and private dining). Our pilot study provides, to the best of our knowledge, the first empirically determined scales and patterns of restaurant food waste in Chinese cities, and could help set targeted interventions and benchmark national food waste reduction targets. Copyright © 2017 Elsevier Ltd. All rights reserved.
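The aggregation step behind such per-group comparisons, mean per-capita waste within groups such as city or consumer type, is simple to sketch. The table weights and group labels below are invented, not the study's data:

```python
# Group-wise mean of per-capita waste per meal from table-level weighings.
from collections import defaultdict

def mean_waste_by_group(records):
    """records: (group, waste_grams_per_capita) pairs."""
    totals = defaultdict(lambda: [0.0, 0])
    for group, grams in records:
        totals[group][0] += grams
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}

tables = [("Chengdu", 120), ("Chengdu", 110), ("Beijing", 70),
          ("Beijing", 80), ("Lhasa", 130), ("Shanghai", 75)]
means = mean_waste_by_group(tables)
```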

  18. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    DOE PAGES

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; ...

    2014-11-24

Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: compounds whose cations have larger ambient-pressure radii have a higher transition pressure.

  19. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    SciTech Connect

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho -kwang

    2014-11-24

Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: compounds whose cations have larger ambient-pressure radii have a higher transition pressure.

  20. The influence of land urbanization on landslides: An empirical estimation based on Chinese provincial panel data.

    PubMed

    Li, Gerui; Lei, Yalin; Yao, Huajun; Wu, Sanmang; Ge, Jianping

    2017-04-10

This study used panel data for 28 provinces and municipalities in China from 2003 to 2014 to investigate the relationship between land urbanization and landslides by building panel models for a national sample and subsamples from the three regions of China, and examined landslide prevention measures in light of that relationship. The results showed that 1) at the national level, the percentage of built-up area is negatively, and road density positively, associated with the number of landslides. 2) At the regional level, the improvement of landslide prevention measures with increasing economic development appears only for built-up areas. The percentage of built-up area increases the number of landslides in the western region and decreases it in the central and eastern regions; the degree of decrease in the eastern region is larger than in the central region. Road density increases the number of landslides in each region, and the degree increases gradually from west to east. 3) The effect of landslide prevention funding is not obvious. Although landslide prevention funding decreases the number of landslides at the national level, the magnitude of the decrease is small, and in the western and eastern regions (unlike the central region) it did not effectively decrease the number of landslides. We propose a series of policy implications based on these results that may help to improve landslide prevention measures.
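Provincial panel models of this kind commonly use the fixed-effects ("within") estimator: demean each variable inside each province to absorb time-invariant provincial effects, then run OLS on the demeaned data. A one-regressor Python sketch with invented data (the study's models include multiple regressors):

```python
# Fixed-effects (within) estimator for a one-regressor panel:
# demean x and y within each group, then pool the demeaned data in OLS.
from statistics import mean
from collections import defaultdict

def within_estimator(panel):
    """panel: list of (province, x, y) tuples; returns the within slope."""
    by_prov = defaultdict(list)
    for prov, x, y in panel:
        by_prov[prov].append((x, y))
    num = den = 0.0
    for rows in by_prov.values():
        mx = mean(x for x, _ in rows)
        my = mean(y for _, y in rows)
        for x, y in rows:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

# Invented two-province, three-year panel.
data = [("A", 1, 2.1), ("A", 2, 4.0), ("A", 3, 6.2),
        ("B", 1, 10.9), ("B", 2, 13.1), ("B", 3, 15.0)]
beta = within_estimator(data)
```

Note how the large level difference between provinces A and B does not bias the slope: the within transformation removes it before estimation.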