Science.gov

Sample records for based empirical survey

  1. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS.
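The forward-model-and-invert strategy the abstract describes can be sketched numerically. This is a toy one-column version with invented trap parameters (the fraction captured and the release profile are illustrative assumptions, not the calibrated ACS/WFC values): the forward model trails a fraction of each pixel's charge into the pixels behind it, and the correction iteratively adjusts the estimated original column until the forward model reproduces the observed, trailed values.

```python
import numpy as np

def forward_readout(col, trap_frac=0.02, tail=(0.6, 0.3, 0.1)):
    """Toy forward model of CTE losses on one detector column.

    During readout a fraction of each pixel's charge is captured by traps
    and re-emitted into the trailing pixels with the given release profile.
    Illustrative numbers only, not the published ACS/WFC trap parameters.
    """
    out = col.astype(float).copy()
    for i in range(len(out)):
        captured = trap_frac * out[i]
        out[i] -= captured
        for k, frac in enumerate(tail, start=1):
            if i + k < len(out):
                out[i + k] += captured * frac
    return out

def correct_cte(observed, n_iter=8):
    """Invert the forward model iteratively: nudge the estimate of the
    original column until pushing it through the readout model reproduces
    the observed (trailed) pixel values."""
    estimate = observed.astype(float).copy()
    for _ in range(n_iter):
        estimate += observed - forward_readout(estimate)
    return estimate
```

Because the per-transfer losses are small, the fixed-point iteration contracts quickly; a handful of iterations recovers the original column to well below the noise floor in this toy setup.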

  2. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi R.

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555.

  3. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  4. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  5. Experts' attitudes towards medical futility: an empirical survey from Japan

    PubMed Central

    Bagheri, Alireza; Asai, Atsushi; Ida, Ryuichi

    2006-01-01

    Background The current debate about medical futility is mostly driven by theoretical and personal perspectives, and there is a lack of empirical data documenting experts' and public attitudes towards medical futility. Methods To examine the attitudes of Japanese experts in fields relevant to medical futility, a questionnaire survey was conducted among the members of the Japan Association for Bioethics. A total of 108 completed questionnaires were returned, giving a response rate of 50.9%. Among the respondents, 62% were healthcare professionals (HCPs) and 37% were non-healthcare professionals (Non-HCPs). Results The majority of respondents (67.6%) believed that a physician's refusal to provide or continue a treatment on the ground of futility judgment could never be morally justified, but 22.2% approved such refusal with conditions. In the case of physiologically futile care, three-quarters believed that a physician should inform the patient/family of his futility judgment and that it would be the patient who could decide what should be done next, based on his/her value judgment. However, more than 10% said that a physician should ask about a patient's values and goals, but that the final decision should be left to the doctor, not the patient. There was no statistically significant difference between HCPs and Non-HCPs (p = 0.676). Of respondents, 67.6% believed that practical guidelines set up by the health authority would be helpful in futility judgment. Conclusion The results show that there is no support for physicians' unilateral decision-making on futile care. This survey highlights medical futility as an emerging issue in Japanese healthcare and emphasizes the need for public discussion and policy development. PMID:16764732

  6. Acculturating human experimentation: an empirical survey in France

    PubMed Central

    Amiel, Philippe; Mathieu, Séverine; Fagot-Largeault, Anne

    2001-01-01

    Preliminary results of an empirical study of human experimentation practices are presented and contrasted with those of a survey conducted a hundred years ago when clinical research, although tolerated, was culturally deviant. Now that biomedical research is both authorized and controlled, its actors (sponsors, committees, investigators, subjects) come out with heterogeneous rationalities, and they appear to be engaged in a transactional process of negotiating their rationales with one another. In the European context “protective” of subjects, surprisingly the subjects we interviewed (and especially patient-subjects) were creative and revealed an aptitude for integrating experimental medicine into common culture. PMID:11445883

  7. An empirical Bayes approach to analyzing recurring animal surveys

    USGS Publications Warehouse

    Johnson, D.H.

    1989-01-01

    Recurring estimates of the size of animal populations are often required by biologists or wildlife managers. Because of cost or other constraints, estimates frequently lack the accuracy desired but cannot readily be improved by additional sampling. This report proposes a statistical method employing empirical Bayes (EB) estimators as alternatives to those customarily used to estimate population size, and evaluates them by a subsampling experiment on waterfowl surveys. EB estimates, especially a simple limited-translation version, were more accurate and provided shorter confidence intervals with greater coverage probabilities than customary estimates.
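The "simple limited-translation" estimator the abstract highlights can be sketched as follows. This is an illustrative James-Stein-style empirical Bayes shrinkage with a known sampling standard deviation and the Efron-Morris limited-translation rule, not a reconstruction of Johnson's exact estimator: each survey estimate is shrunk toward the grand mean, but no estimate may be moved more than a fixed multiple of its sampling error.

```python
import numpy as np

def eb_limited_translation(y, sigma, c=1.0):
    """Empirical Bayes shrinkage of survey estimates y (each with known
    sampling s.d. sigma) toward their grand mean, with the limited-
    translation rule: no estimate is moved more than c*sigma from its
    direct survey value. Illustrative sketch only."""
    y = np.asarray(y, dtype=float)
    mu = y.mean()
    # method-of-moments estimate of the between-site variance tau^2
    tau2 = max(y.var(ddof=1) - sigma**2, 0.0)
    shrink = sigma**2 / (sigma**2 + tau2) if (sigma**2 + tau2) > 0 else 1.0
    eb = mu + (1.0 - shrink) * (y - mu)
    # limit translation: clip each shift to +/- c*sigma
    shift = np.clip(eb - y, -c * sigma, c * sigma)
    return y + shift
```

When the sampling error dominates the between-site spread, the shrunken estimates have markedly lower mean squared error than the direct estimates, which mirrors the subsampling result reported in the abstract.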

  8. Empirical estimates of cumulative refraction errors associated with procedurally constrained levelings based on the Gaithersburg-Tucson Refraction Tests of the National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Castle, Robert O.; Gilmore, Thomas D.; Mark, Robert K.; Shaw, Roger H.

    1985-05-01

    Analyses of results of the National Geodetic Survey's leveling refraction tests indicate that the standard deviation about the mean (σ) for high-scale minus low-scale rod readings closely correlates with measured refraction error. Use of this relation in conjunction with values for σ obtained from routinely constrained surveys provides a basis for estimating the refraction error associated with levelings of stipulated order and class.

  9. Empirical estimates of cumulative refraction errors associated with procedurally constrained levelings based on the Gaithersburg- Tucson refraction tests of the National Geodetic Survey.

    USGS Publications Warehouse

    Castle, R.O.; Gilmore, T.D.; Mark, R.K.; Shaw, R.H.

    1985-01-01

    Analyses of results of the National Geodetic Survey's leveling refraction tests indicate that the standard deviation about the mean (σ) for high-scale minus low-scale rod readings closely correlates with measured refraction error. Use of this relation in conjunction with values for σ obtained from routinely constrained surveys provides a basis for estimating the refraction error associated with levelings of stipulated order and class. -Authors

  10. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  11. Searching for Evidence-Based Practice: A Survey of Empirical Studies on Curricular Interventions Measuring and Reporting Fidelity of Implementation Published during 2004-2013

    ERIC Educational Resources Information Center

    Missett, Tracy C.; Foster, Lisa H.

    2015-01-01

    In an environment of accountability, the development of evidence-based practices is expected. To demonstrate that a practice is evidence based, quality indicators of rigorous methodology should be present including indications that teachers implementing an intervention have done so with fidelity to its design. Because evidence-based practices…

  12. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. © Springer Science + Business Media, Inc. 2005.
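One of the building blocks named above, unequal probability weighting, can be sketched with systematic probability-proportional-to-size (PPS) selection. The size measure (e.g. watershed area or stream length) and the resulting Horvitz-Thompson weights here are a generic illustration, not the specific designs used in the case studies.

```python
import numpy as np

def systematic_pps_sample(sizes, n, rng):
    """Systematic probability-proportional-to-size (PPS) selection of n
    units (e.g. watersheds weighted by area or stream length), a standard
    way to implement an unequal-probability survey design. Returns the
    selected indices and their Horvitz-Thompson design weights 1/pi_i."""
    sizes = np.asarray(sizes, dtype=float)
    pi = n * sizes / sizes.sum()          # target inclusion probabilities
    if np.any(pi > 1):
        raise ValueError("some units are certainties; remove them first")
    cum = np.cumsum(pi)                   # partition [0, n] into unit intervals
    start = rng.uniform(0, 1)
    hits = start + np.arange(n)           # systematic hits, one per unit step
    idx = np.searchsorted(cum, hits)      # unit whose interval contains each hit
    return idx, 1.0 / pi[idx]
```

A convenient sanity check on the weights: the Horvitz-Thompson estimate of the total of the size variable itself is exact under PPS, since each selected unit contributes sizes[i]/pi[i] = total/n.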

  13. Functional somatic symptoms and hypochondriasis. A survey of empirical studies.

    PubMed

    Kellner, R

    1985-08-01

    Empirical studies suggest the following main conclusions: functional somatic symptoms are extremely common; a large proportion appear to be caused by physiologic activity and tend to be aggravated by emotion. Hypochondriacal patients misunderstand the nature and significance of these symptoms and believe that they are evidence of serious disease. Hypochondriasis can be a part of another syndrome, usually an affective one, or it can be a primary disorder. The prevalence differs between cultures and social classes. Constitutional factors, disease in the family in childhood, and previous disease predispose to hypochondriasis. Various stressors can be precipitating events. Selective perception of symptoms, motivated by fear of disease, and subsequent increase in anxiety with more somatic symptoms appear to be links in the vicious cycle of the hypochondriacal reaction. Psychotherapy as well as psychotropic drugs are effective in the treatment of functional somatic symptoms. There are no adequate controlled studies of psychotherapy in hypochondriasis, and the recommended treatments are based on studies with similar disorders. The prognosis of treated hypochondriasis is good in a substantial proportion of patients. PMID:2861797

  14. Empirical validation and application of the computing attitudes survey

    NASA Astrophysics Data System (ADS)

    Dorn, Brian; Tew, Allison Elliott

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning Attitudes about Science Survey and measures novice to expert attitude shifts about the nature of knowledge and problem solving in computer science. Factor analysis with a large, multi-institutional data-set identified and confirmed five subscales on the CAS related to different facets of attitudes measured on the survey. We then used the CAS in a pre-post format to demonstrate its usefulness in studying attitude shifts during CS1 courses and its responsiveness to varying instructional conditions. The most recent version of the CAS is provided in its entirety along with a discussion of the conditions under which its validity has been demonstrated.

  15. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  16. An Empirical Examination of IRT Information for School Climate Surveys

    ERIC Educational Resources Information Center

    Mo, Lun; Yang, Fang; Hu, Xiangen

    2011-01-01

    School climate surveys are widely applied in school districts across the nation to collect information about teacher efficacy, principal leadership, school safety, students' activities, and so forth. They enable school administrators to understand and address many issues on campus when used in conjunction with other student and staff data.…

  17. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  18. Emotional Risks to Respondents in Survey Research: Some Empirical Evidence

    PubMed Central

    Labott, Susan M.; Johnson, Timothy P.; Fendrich, Michael; Feeny, Norah C.

    2014-01-01

    Some survey research has documented distress in respondents with pre-existing emotional vulnerabilities, suggesting the possibility of harm. In this study, respondents were interviewed about a personally distressing event; mood, stress, and emotional reactions were assessed. Two days later, respondents participated in interventions to either enhance or alleviate the effects of the initial interview. Results indicated that distressing interviews increased stress and negative mood, although no adverse events occurred. Between the interviews, moods returned to baseline. Respondents who again discussed a distressing event reported moods more negative than those who discussed a neutral or a positive event. This study provides evidence that, among nonvulnerable survey respondents, interviews on distressing topics can result in negative moods and stress, but they do not harm respondents. PMID:24169422

  19. School-Based Management and Paradigm Shift in Education an Empirical Study

    ERIC Educational Resources Information Center

    Cheng, Yin Cheong; Mok, Magdalena Mo Ching

    2007-01-01

    Purpose: This paper aims to report empirical research investigating how school-based management (SBM) and paradigm shift (PS) in education are closely related to teachers' student-centered teaching and students' active learning in a sample of Hong Kong secondary schools. Design/methodology/approach: It is a cross-sectional survey research…

  20. An Empirical Keying Approach to Academic Prediction with the Guilford-Zimmerman Temperament Survey

    ERIC Educational Resources Information Center

    Jacobs, Rick; Manese, Wilfredo

    1977-01-01

    The present study was conducted to establish a scoring key for the Guilford Zimmerman Temperament Survey appropriate for predicting academic performance. Results point to the utility of non-cognitive measures in predicting academic performance, particularly when scoring keys tailored to the specific situation are empirically derived. Suggestions…

  1. Modelling complex survey data with population level information: an empirical likelihood approach

    PubMed Central

    Oguz-Alper, M.; Berger, Y. G.

    2016-01-01

    Survey data are often collected with unequal probabilities from a stratified population. In many modelling situations, the parameter of interest is a subset of a set of parameters, with the others treated as nuisance parameters. We show that in this situation the empirical likelihood ratio statistic follows a chi-squared distribution asymptotically, under stratified single and multi-stage unequal probability sampling, with negligible sampling fractions. Simulation studies show that the empirical likelihood confidence interval may achieve better coverages and has more balanced tail error rates than standard approaches involving variance estimation, linearization or resampling. PMID:27279669

  2. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  3. Survey-Based Measurement of Public Management and Policy Networks

    ERIC Educational Resources Information Center

    Henry, Adam Douglas; Lubell, Mark; McCoy, Michael

    2012-01-01

    Networks have become a central concept in the policy and public management literature; however, theoretical development is hindered by a lack of attention to the empirical properties of network measurement methods. This paper compares three survey-based methods for measuring organizational networks: the roster, the free-recall name generator, and…

  4. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings. PMID:26980128

  5. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    ERIC Educational Resources Information Center

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  6. Image-Based Empirical Modeling of the Plasmasphere

    NASA Technical Reports Server (NTRS)

    Adrian, Mark L.; Gallagher, D. L.

    2008-01-01

    A new suite of empirical models of plasmaspheric plasma based on remote, global images from the IMAGE EUV instrument is proposed for development. The purpose of these empirical models is to establish the statistical properties of the plasmasphere as a function of conditions. This suite of models will mark the first time the plasmaspheric plume is included in an empirical model. Development of these empirical plasmaspheric models will support synoptic studies (such as for wave propagation and growth, energetic particle loss through collisions and dust transport as influenced by charging) and serves as a benchmark against which physical models can be tested. The ability to know that a specific global density distribution occurs in response to specific magnetospheric and solar wind factors is a huge advantage over all previous in-situ-based empirical models. The consequence of creating these new plasmaspheric models will be to provide much higher fidelity and much richer quantitative descriptions of the statistical properties of plasmaspheric plasma in the inner magnetosphere, whether that plasma is in the main body of the plasmasphere, nearby during recovery or in the plasmaspheric plume. Model products to be presented include statistical probabilities of being in the plasmasphere or near thermal He+ density boundaries, along with characterizations of the complexity of the plasmasphere's spatial structure.

  7. Responses to Commentaries on Advances in Empirically Based Assessment.

    ERIC Educational Resources Information Center

    McConaughy, Stephanie H.

    1993-01-01

    Author of article (this issue) describing research program to advance assessment of children's behavioral and emotional problems; presenting conceptual framework for multiaxial empirically based assessment; and summarizing research efforts to develop cross-informant scales for scoring parent, teacher, and self-reports responds to commentaries on…

  8. Denoising ECG signal based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Zhi-dong, Zhao; Liu, Juan; Wang, Sheng-tao

    2011-10-01

    The electrocardiogram (ECG) has been used extensively for the detection of heart disease. Frequently the signal is corrupted by various kinds of noise, such as muscle noise, electromyogram (EMG) interference, and instrument noise. In this paper, a new ECG denoising method is proposed based on the recently developed ensemble empirical mode decomposition (EEMD). The noisy ECG signal is decomposed into a series of intrinsic mode functions (IMFs). The statistically significant information content is identified using an empirical energy model of the IMFs. Noisy ECG signals collected from clinical recordings are processed using the method. The results show that, in contrast to traditional methods, the novel method achieves optimal denoising of the ECG signal.
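EEMD's core idea, decomposing many noise-perturbed copies of the signal and averaging the resulting modes, can be sketched as follows. The full sifting procedure is replaced here by a stand-in two-band split (a deliberate simplification for illustration); a real implementation would extract each IMF by iterative sifting on envelope means.

```python
import numpy as np

def toy_decompose(x, width=9):
    """Stand-in for real EMD sifting: split the signal into a 'fast'
    component (signal minus moving average) and a 'slow' residue. A real
    EEMD would extract several intrinsic mode functions here."""
    kernel = np.ones(width) / width
    slow = np.convolve(x, kernel, mode="same")
    return np.stack([x - slow, slow])

def eemd(x, n_trials=100, noise_std=0.1, rng=None, decompose=toy_decompose):
    """Ensemble empirical mode decomposition, sketched: decompose many
    noise-perturbed copies of the signal and average the resulting modes,
    so the added white noise cancels while mode mixing is reduced."""
    rng = rng or np.random.default_rng(0)
    modes = np.zeros_like(decompose(x))
    for _ in range(n_trials):
        perturbed = x + rng.normal(0.0, noise_std, size=x.shape)
        modes += decompose(perturbed)
    return modes / n_trials
```

Because each trial's modes sum exactly to that trial's perturbed signal, the ensemble-averaged modes sum back to the original signal up to the residual mean of the added noise, which shrinks as 1/sqrt(n_trials).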

  9. Space Based Dark Energy Surveys

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    2016-03-01

    Dark energy, the name given to the cause of the accelerating expansion of the Universe, is one of the most tantalizing mysteries in modern physics. Current cosmological models hold that dark energy is the dominant component of the Universe, but its exact nature remains poorly understood. Ambitious ground-based surveys that seek to understand dark energy are underway, and NASA is participating in the development of significantly more ambitious space-based surveys planned for the next decade. NASA has provided mission-enabling technology to the European Space Agency's (ESA) Euclid mission in exchange for participation by US scientists in the Euclid mission. NASA is also developing the Wide Field Infrared Survey Telescope-Astrophysics Focused Telescope Asset (WFIRST-AFTA) mission for possible launch in 2024. WFIRST was the highest-ranked space mission in the Astro2010 Decadal Survey, and the current design uses a 2.4 m space telescope to go beyond what was then envisioned. Understanding dark energy is one of the primary science goals of WFIRST-AFTA. This talk will review the state of dark energy research, the relevant activities of the Cosmic Structure Interest Group (CoSSIG) of the PhysPAG, and the status of and complementarity between Euclid, WFIRST, and other ambitious ground-based efforts.

  10. Unsupervised self-organized mapping: a versatile empirical tool for object selection, classification and redshift estimation in large surveys

    NASA Astrophysics Data System (ADS)

    Geach, James E.

    2012-01-01

    We present an application of unsupervised machine learning - the self-organized map (SOM) - as a tool for visualizing, exploring and mining the catalogues of large astronomical surveys. Self-organization culminates in a low-resolution representation of the 'topology' of a parameter volume, and this can be exploited in various ways pertinent to astronomy. Using data from the Cosmological Evolution Survey (COSMOS), we demonstrate two key astronomical applications of the SOM: (i) object classification and selection, using galaxies with active galactic nuclei as an example, and (ii) photometric redshift estimation, illustrating how SOMs can be used as totally empirical predictive tools. With a training set of ˜3800 galaxies with zspec ≤ 1, we achieve photometric redshift accuracies competitive with other (mainly template-fitting) techniques that use a similar number of photometric bands [σ(Δz) = 0.03 with a ˜2 per cent outlier rate when using u* band to 8 μm photometry]. We also test the SOM as a photo-z tool using the PHoto-z Accuracy Testing (PHAT) synthetic catalogue of Hildebrandt et al., which compares several different photo-z codes using a common input/training set. We find that the SOM can deliver accuracies that are competitive with many of the established template-fitting and empirical methods. This technique is not without clear limitations, which are discussed, but we suggest it could be a powerful tool in the era of extremely large ('petabyte') databases where efficient data mining is a paramount concern.
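A minimal SOM-based photo-z estimator along these lines can be sketched as follows. The grid size, learning schedule, and synthetic two-band "photometry" are illustrative assumptions, not the paper's configuration: a small SOM is trained on photometric vectors, then each node is assigned the mean spectroscopic redshift of the training galaxies that map to it.

```python
import numpy as np

def train_som(data, grid=(8, 8), n_iter=2000, rng=None):
    """Minimal self-organized map: nodes on a 2-D grid learn the topology
    of the input space (here, galaxy photometry). Illustrative sketch."""
    rng = rng or np.random.default_rng(0)
    h, w = grid
    nodes = rng.normal(size=(h * w, data.shape[1]))
    yy, xx = np.mgrid[0:h, 0:w]
    coords = np.column_stack([yy.ravel(), xx.ravel()]).astype(float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        frac = t / n_iter
        lr = 0.5 * (1 - frac) + 0.01 * frac               # decaying learning rate
        radius = max(h, w) / 2 * (1 - frac) + 0.5 * frac  # shrinking neighborhood
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        theta = np.exp(-d2 / (2 * radius ** 2))
        nodes += lr * theta[:, None] * (x - nodes)
    return nodes

def som_photoz(nodes, train_phot, train_z, test_phot):
    """Empirical photo-z: map each galaxy to its best-matching SOM node and
    predict the mean spectroscopic redshift of training galaxies there."""
    def bmus(X):
        return np.argmin(((X[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)
    cell = bmus(train_phot)
    z_node = np.full(len(nodes), train_z.mean())
    for c in np.unique(cell):
        z_node[c] = train_z[cell == c].mean()
    return z_node[bmus(test_phot)]
```

The estimator is "totally empirical" in the abstract's sense: no spectral templates are involved, only the mapping between photometry and known redshifts in the training set.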

  11. Advanced airfoil design empirically based transonic aircraft drag buildup technique

    NASA Technical Reports Server (NTRS)

    Morrison, W. D., Jr.

    1976-01-01

    To systematically investigate the potential of advanced airfoils in advanced preliminary design studies, empirical relationships were derived, based on available wind tunnel test data, through which total drag is determined while recognizing all major aircraft geometric variables. This technique recognizes a single design lift coefficient and Mach number for each aircraft. Using this technique, drag polars are derived for all Mach numbers up to M_design + 0.05 and for lift coefficients from C_L,design − 0.40 to C_L,design + 0.20.

  12. The fundamental points of the Second Military Survey of the Habsburg Empire

    NASA Astrophysics Data System (ADS)

    Timár, G.

    2009-04-01

    The Second Military Survey was carried out between 1806 and 1869 in the continuously changing territory of the Habsburg Empire. More than 4000 tiles of the 1:28,800 scale survey sheets cover the Empire, which was the second-largest in territorial extent in Europe after Russia. In cartographic terms, the Empire was divided into eight zones; each zone had its own Cassini-type projection centered at a geodetically defined fundamental point. The points were the following: - Wien-Stephansdom (valid for Lower and Upper Austria, Hungary, Dalmatia, Moravia and Vorarlberg; latitude=48.20910N; longitude=16.37655E on local datum). - Gusterberg (valid for Bohemia; latitude=48.03903N; longitude=14.13976E). - Schöcklberg (valid for Styria; latitude=47.19899N; longitude=15.46902E). - Krimberg (valid for Illyria and the Coastal Land; latitude=45.92903N; longitude=14.47423E). - Löwenberg (valid for Galicia and Bukovina; latitude=49.84889N; longitude=24.04639E). - Hermannstadt (valid for Transylvania; latitude=45.84031N; longitude=24.11297E). - Ivanić (valid for Croatia; latitude=45.73924N; longitude=16.42309E). - Milano, Duomo San Salvatore (valid for Lombardy, Venezia, Parma and Modena; latitude=45.45944N; longitude=9.18757E). - A simulated fundamental point for Tyrol and Salzburg, several hundred miles north of those territories. The poster systematically presents the fundamental points, their topographic settings, and the present condition of the geodetic point sites.

  13. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice

    PubMed Central

    Shean, Glenn D.

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for changes needed to foster implementation are discussed. Limitations of PORT therapy listings that are based on therapy outcome efficacy studies are discussed, and cross-cultural and course and outcome studies of correlates of recovery are summarized. PMID:23738068

  14. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice.

    PubMed

    Shean, Glenn D

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for changes needed to foster implementation are discussed. Limitations of PORT therapy listings that are based on therapy outcome efficacy studies are discussed, and cross-cultural and course and outcome studies of correlates of recovery are summarized. PMID:23738068

  15. An empirical analysis of the demand for sleep: Evidence from the American Time Use Survey.

    PubMed

    Ásgeirsdóttir, Tinna Laufey; Ólafsson, Sigurður Páll

    2015-12-01

    Using data from the American Time Use Survey, this paper empirically examined the demand for sleep, with special attention to its opportunity cost represented by wages. Variation in the unemployment rate by state was also used to investigate the cyclical nature of sleep duration. We conducted separate estimations for males and females, as well as for those who received a fixed salary and hourly wages. The findings predominantly revealed no relationship between sleep duration and the business cycle. However, an inverse relationship between sleep duration and wages was detected. This is in accordance with sleep duration being an economic choice variable, rather than a predetermined subtraction of the 24-h day. Although the inverse relationship was not significant in all the estimations for salaried subjects, it was consistent and strong for subjects who received hourly wages. For instance, elasticity measures were −.03 for those who received hourly wages and −.003 for those who received a fixed salary. PMID:26603429

  16. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  17. An empirically based electrosource horizon lead-acid battery model

    SciTech Connect

    Moore, S.; Eshani, M.

    1996-09-01

An empirically based mathematical model of a lead-acid battery for use in Texas A&M University's Electrically Peaking Hybrid (ELPH) computer simulation is presented. The battery model is intended to overcome intuitive difficulties with currently available models by employing direct relationships between state of charge, voltage, and power demand. The model input is the power demand or load. Model outputs include voltage, an instantaneous battery efficiency coefficient, and a state-of-charge indicator. A time- and current-dependent voltage hysteresis is employed to ensure correct voltage tracking under the highly transient loads inherent in a hybrid electric drivetrain.

  18. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km(2) of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with

  19. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km2 of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with
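The site-versus-duration trade-off explored in these records can be illustrated with a toy Monte Carlo sketch. All parameter values below are illustrative, not taken from the study, and the naive occupancy estimator stands in for a full occupancy model:

```python
import random

def simulate_design(n_sites, n_occasions, psi, p, n_reps=500, seed=1):
    """Monte Carlo RMSE of a naive occupancy estimator for one survey design.

    Each site is occupied with probability psi; an occupied site is
    detected on each occasion with probability p.  The naive estimator
    divides the fraction of sites with >= 1 detection by the chance of
    detecting an occupying species at least once, p* = 1 - (1 - p)**K.
    """
    rng = random.Random(seed)
    p_star = 1.0 - (1.0 - p) ** n_occasions
    sq_errors = []
    for _ in range(n_reps):
        detected_sites = 0
        for _ in range(n_sites):
            occupied = rng.random() < psi
            if occupied and any(rng.random() < p for _ in range(n_occasions)):
                detected_sites += 1
        psi_hat = min(1.0, (detected_sites / n_sites) / p_star)
        sq_errors.append((psi_hat - psi) ** 2)
    return (sum(sq_errors) / n_reps) ** 0.5

# Illustrative comparison: the same total camera-days, allocated differently.
few_long = simulate_design(n_sites=20, n_occasions=120, psi=0.3, p=0.05)
many_short = simulate_design(n_sites=120, n_occasions=20, psi=0.3, p=0.05)
```

Re-running `simulate_design` over a grid of site/occasion combinations reproduces the qualitative finding above: error generally falls with total effort, but how best to allocate that effort depends on whether the species is common or rare and easy or hard to detect.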

  20. Comparison between empirical and physically based models of atmospheric correction

    NASA Astrophysics Data System (ADS)

    Mandanici, E.; Franci, F.; Bitelli, G.; Agapiou, A.; Alexakis, D.; Hadjimitsis, D. G.

    2015-06-01

A number of methods have been proposed for the atmospheric correction of multispectral satellite images, based either on atmosphere modelling or on the images themselves. Full radiative transfer models require a lot of ancillary information about the atmospheric conditions at acquisition time, whereas image-based methods cannot account for all of the phenomena involved. Therefore, the aim of this paper is the comparison of different atmospheric correction methods for multispectral satellite images. The experimentation was carried out on a study area located in the catchment of the Yialias river, 20 km south of Nicosia, the capital of Cyprus. The following models, both empirical and physically based, were tested: dark object subtraction, QUAC, empirical line, 6SV, and FLAASH. They were applied to a Landsat 8 multispectral image. The spectral signatures of ten different land cover types were measured during a field campaign in 2013, and 15 samples were collected for laboratory measurements in a second campaign in 2014. A GER 1500 spectroradiometer was used; this instrument records electromagnetic radiation from 350 up to 1050 nm in 512 channels, each covering about 1.5 nm. The measured spectral signatures were used to simulate the reflectance values for the multispectral sensor bands by applying relative spectral response filters. These data were taken as ground truth to assess the accuracy of the different image correction models. The results do not allow us to establish which method is the most accurate. The physics-based methods describe the shape of the signatures better, whereas the image-based models perform better regarding the overall albedo.
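Of the methods compared, dark object subtraction is the simplest image-based correction: it assumes the darkest pixels in each band should have near-zero reflectance and treats any offset as atmospheric path radiance. A minimal sketch (using a low percentile rather than the absolute minimum is a common robustness tweak and an assumption here, not the paper's implementation):

```python
import numpy as np

def dark_object_subtraction(bands, percentile=0.01):
    """Per-band dark-object subtraction (DOS).

    bands: array of shape (n_bands, rows, cols) holding at-sensor
    radiance or DN values.  The 'dark object' level of each band is
    taken as a low percentile of its pixel distribution and subtracted,
    with negative results clipped to zero.
    """
    corrected = np.empty(np.shape(bands), dtype=float)
    for i, band in enumerate(bands):
        dark = np.percentile(band, percentile)
        corrected[i] = np.clip(np.asarray(band, dtype=float) - dark, 0.0, None)
    return corrected
```

Because DOS only removes an additive offset per band, it cannot model multiplicative effects such as transmittance, which is one reason the physics-based models reproduce signature shapes better.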

  1. State of Research on Giftedness and Gifted Education: A Survey of Empirical Studies Published during 1998-2010 (April)

    ERIC Educational Resources Information Center

    Dai, David Yun; Swanson, Joan Ann; Cheng, Hongyu

    2011-01-01

    This study surveyed 1,234 empirical studies on giftedness, gifted education, and creativity during 1998-2010 (April), using PsycINFO database and targeted journals as main sources, with respect to main topics these studies focused on, methods they used for investigation, and the conceptual spaces they traversed. Four main research topics emerged…

  2. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining the torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, which in this case is a ratchet wrench push and pull operation. What is presented here is a comparison of the computed or predicted results of the model with the actual measured values for the composite motion.
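The regression step described above can be sketched as an ordinary least-squares fit. The paper reduces the data to single-variable second-degree polynomials; the stand-in below instead fits one full quadratic surface in position and velocity, purely for illustration:

```python
import numpy as np

def fit_torque_surface(position, velocity, torque):
    """Least-squares fit of torque ~ a0 + a1*x + a2*v + a3*x^2 + a4*v^2
    + a5*x*v over measured joint positions x and velocities v."""
    x, v, t = (np.asarray(a, dtype=float) for a in (position, velocity, torque))
    design = np.column_stack([np.ones_like(x), x, v, x ** 2, v ** 2, x * v])
    coef, *_ = np.linalg.lstsq(design, t, rcond=None)
    return coef

def predict_torque(coef, x, v):
    """Evaluate the fitted quadratic surface at one (position, velocity)."""
    return float(coef @ np.array([1.0, x, v, x ** 2, v ** 2, x * v]))
```

A table of such fitted coefficients, one per joint and rotational plane, is then enough to predict torques along a composite motion such as the ratchet-wrench push/pull.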

  3. Developing an empirical base for clinical nurse specialist education.

    PubMed

    Stahl, Arleen M; Nardi, Deena; Lewandowski, Margaret A

    2008-01-01

    This article reports on the design of a clinical nurse specialist (CNS) education program using National Association of Clinical Nurse Specialists (NACNS) CNS competencies to guide CNS program clinical competency expectations and curriculum outcomes. The purpose is to contribute to the development of an empirical base for education and credentialing of CNSs. The NACNS CNS core competencies and practice competencies in all 3 spheres of influence guided the creation of clinical competency grids for this university's practicum courses. This project describes the development, testing, and application of these clinical competency grids that link the program's CNS clinical courses with the NACNS CNS competencies. These documents guide identification, tracking, measurement, and evaluation of the competencies throughout the clinical practice portion of the CNS program. This ongoing project will continue to provide data necessary to the benchmarking of CNS practice competencies, which is needed to evaluate the effectiveness of direct practice performance and the currency of graduate nursing education. PMID:18438164

  4. The Gaia-ESO Survey: Empirical determination of the precision of stellar radial velocities and projected rotation velocities

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Lewis, J.; Koposov, S. E.; Sacco, G. G.; Randich, S.; Gilmore, G.; Asplund, M.; Binney, J.; Bonifacio, P.; Drew, J. E.; Feltzing, S.; Ferguson, A. M. N.; Micela, G.; Neguerela, I.; Prusti, T.; Rix, H.-W.; Vallenari, A.; Alfaro, E. J.; Allende Prieto, C.; Babusiaux, C.; Bensby, T.; Blomme, R.; Bragaglia, A.; Flaccomio, E.; Francois, P.; Hambly, N.; Irwin, M.; Korn, A. J.; Lanzafame, A. C.; Pancino, E.; Recio-Blanco, A.; Smiljanic, R.; Van Eck, S.; Walton, N.; Bayo, A.; Bergemann, M.; Carraro, G.; Costado, M. T.; Damiani, F.; Edvardsson, B.; Franciosini, E.; Frasca, A.; Heiter, U.; Hill, V.; Hourihane, A.; Jofré, P.; Lardo, C.; de Laverny, P.; Lind, K.; Magrini, L.; Marconi, G.; Martayan, C.; Masseron, T.; Monaco, L.; Morbidelli, L.; Prisinzano, L.; Sbordone, L.; Sousa, S. G.; Worley, C. C.; Zaggia, S.

    2015-08-01

Context. The Gaia-ESO Survey (GES) is a large public spectroscopic survey at the European Southern Observatory Very Large Telescope. Aims: A key aim is to provide precise radial velocities (RVs) and projected equatorial velocities (vsini) for representative samples of Galactic stars, which will complement information obtained by the Gaia astrometry satellite. Methods: We present an analysis to empirically quantify the size and distribution of uncertainties in RV and vsini using spectra from repeated exposures of the same stars. Results: We show that the uncertainties vary as simple scaling functions of signal-to-noise ratio (S/N) and vsini, that the uncertainties become larger with increasing photospheric temperature, but that the dependence on stellar gravity, metallicity and age is weak. The underlying uncertainty distributions have extended tails that are better represented by Student's t-distributions than by normal distributions. Conclusions: Parametrised results are provided, which enable estimates of the RV precision for almost all GES measurements, and estimates of the vsini precision for stars in young clusters, as a function of S/N, vsini and stellar temperature. The precision of individual high S/N GES RV measurements is 0.22-0.26 km s-1, dependent on instrumental configuration. Based on observations collected with the FLAMES spectrograph at the VLT/UT2 telescope (Paranal Observatory, ESO, Chile), for the Gaia-ESO Large Public Survey (188.B-3002). Full Table 2 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/580/A75

  5. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  6. Arduino based radiation survey meter

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Muzakkir, Amir; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee

    2016-01-01

This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an ATmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce counts-per-second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsievert per hour (μSv/hr). The conversion factor (CF) for converting CPM to μSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 µSv/hr.
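The count-rate-to-dose-rate conversion described here amounts to scaling the measured rate by a calibration constant. A minimal sketch, with a placeholder conversion factor that is typical of small GM tubes but is NOT the LND7121 datasheet value used in the paper:

```python
def dose_rate_uSv_per_h(cps, cf_uSv_per_h_per_cpm=0.0057):
    """Convert a Geiger-Muller count rate (counts per second) to an
    approximate dose rate in uSv/hr.

    cf_uSv_per_h_per_cpm: conversion factor in uSv/hr per count-per-minute;
    the 0.0057 default is a hypothetical placeholder, to be replaced by the
    datasheet or calibration value for the actual tube.
    """
    cpm = cps * 60.0  # counts per second -> counts per minute
    return cpm * cf_uSv_per_h_per_cpm
```

Comparing the datasheet-derived factor against one obtained from a calibration source, as the paper does, is what validates the linearity of this simple scaling over the instrument's working range.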

  7. Noise cancellation in IR video based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2013-05-01

Currently there is a huge demand for simple low-cost IR cameras for both civil and military applications, among which one of the most common is the surveillance of restricted access zones. In the design of low-cost IR cameras it is necessary to avoid several elements present in more sophisticated cameras, such as refrigeration systems and temperature control of the detectors, and to forgo a mechanical modulator of the incident radiation (chopper). Consequently, the detection algorithms must reliably separate the target signal from high noise and drift caused by temporal variations of the background image of the scene, plus the additional drift due to the thermal instability of the detectors. A very important step towards this goal is the design of a preprocessing stage to eliminate noise. Thus, in this work we propose using the Empirical Mode Decomposition (EMD) method to attain this objective. To evaluate the quality of the reconstructed clean signal, the average-to-peak ratio is used to assess the effectiveness in reconstructing the waveform of the signal from the target. We compare the EMD method with another classical method of noise cancellation based on the Discrete Wavelet Transform (DWT). The simulation results show that the proposed EMD-based scheme performs better than the traditional ones.

  8. A joint model of household space heat production and consumption: Empirical evidence from a Belgian micro-data survey

    SciTech Connect

    Cuijpers, C.

    1995-12-31

Households are faced with increasing regulation to improve energy conservation and energy efficiency for environmental reasons. Understanding how a house produces space heat and how energy consumption can be reduced becomes a keystone in designing energy and environmental policies. This paper provides empirical evidence on household behavior in the context of house heating. A joint household space heat production and consumption model is developed and empirically implemented. Attention is devoted mainly to the intermediate role of the characteristics of the house, with special reference to insulation levels, which determine the ability of the house to convert energy into heat. House heat levels are characterized, and empirical support for the so-called 'rebound' effect is shown. The econometric model is specified for a single-period cross-section regression estimation. The database is drawn from the 1987-88 Belgian Household Expenditure Survey.

  9. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    ERIC Educational Resources Information Center

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008") and their…

  10. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

    Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcomed broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting, however it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked. Summary The "empirical turn" in bioethics signals a need for

  11. Short memory or long memory: an empirical survey of daily rainfall data

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.

    2012-10-01

A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts, proposed by Geweke and Porter-Hudak, to analyze nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test with an F-statistic for the break date was applied; break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirm that there is long memory in the data generating process (DGP). Therefore this evidence shows true long memory, not an artifact of structural breaks.
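The Geweke and Porter-Hudak (GPH) procedure used in both records regresses low-frequency log-periodogram ordinates on log(4 sin²(λ/2)); the negated slope estimates the fractional differencing parameter d. A plain-Python sketch (direct DFT, no tapering, and the default bandwidth m = n^0.5 is just one common choice):

```python
import cmath
import math

def gph_estimate(x, bandwidth_exp=0.5):
    """Geweke/Porter-Hudak log-periodogram estimate of the fractional
    differencing parameter d (illustrative sketch)."""
    n = len(x)
    mean = sum(x) / n
    m = int(n ** bandwidth_exp)  # number of low Fourier frequencies used
    log_i, log_f = [], []
    for j in range(1, m + 1):
        lam = 2.0 * math.pi * j / n
        # Periodogram ordinate I(lam) via a direct DFT of the demeaned series.
        dft = sum((x[t] - mean) * cmath.exp(-1j * lam * t) for t in range(n))
        log_i.append(math.log(abs(dft) ** 2 / (2.0 * math.pi * n)))
        log_f.append(math.log(4.0 * math.sin(lam / 2.0) ** 2))
    # OLS regression of log I(lam_j) on log(4 sin^2(lam_j/2)); slope = -d.
    fbar, ibar = sum(log_f) / m, sum(log_i) / m
    slope = sum((f - fbar) * (i - ibar) for f, i in zip(log_f, log_i))
    slope /= sum((f - fbar) ** 2 for f in log_f)
    return -slope
```

On a short-memory series the estimate should be near zero and on a once-integrated series near one; distinguishing genuine intermediate d from break-induced pseudo-long-memory is exactly why the papers re-estimate d on the break-partitioned subseries.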

  12. Structural break or long memory: an empirical survey on daily rainfall data sets across Malaysia

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.; Yusop, Z.

    2013-04-01

    A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractional integrated I (d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts proposed by Geweke and Porter-Hudak (1983) to analyse nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least square (OLS)-based cumulative sum (CUSUM) test for the break date was applied. Break dates were detected in all data sets. The data sets were partitioned according to their respective break date, and a further test for long memory was applied for all subseries. Results show that all subseries follows the same pattern as the original series. The estimate of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirms that there is a long memory in the data generating process (DGP). Therefore this evidence shows a true long memory not due to structural break.

  13. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

In this research a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed based on empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. Thus, this methodology is foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of the microstructures with their performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of the desired property. This technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.

  14. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forest. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  15. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

Water-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  16. A Survey of Graduate Training in Empirically Supported and Manualized Treatments: A Preliminary Report

    ERIC Educational Resources Information Center

    Karekla, Maria; Lundgren, Jennifer D.; Forsyth, John P.

    2004-01-01

    The promotion and dissemination of empirically supported (ESTs) and manualized therapies are important, albeit controversial, developments within clinical science and practice. To date, studies evaluating training opportunities and attitudes about such treatments at the graduate, predoctoral internship, and postdoctoral levels have focused on the…

  17. Cloud Based Processing of Large Photometric Surveys

    NASA Astrophysics Data System (ADS)

    Farivar, R.; Brunner, R. J.; Santucci, R.; Campbell, R.

    2013-10-01

    Astronomy, as is the case with many scientific domains, has entered the realm of being a data rich science. Nowhere is this reflected more clearly than in the growth of large area surveys, such as the recently completed Sloan Digital Sky Survey (SDSS) or the Dark Energy Survey, which will soon obtain PB of imaging data. The data processing on these large surveys is a major challenge. In this paper, we demonstrate a new approach to this common problem. We propose the use of cloud-based technologies (e.g., Hadoop MapReduce) to run a data analysis program (e.g., SExtractor) across a cluster. Using the intermediate key/value pair design of Hadoop, our framework matches objects across different SExtractor invocations to create a unified catalog from all SDSS processed data. We conclude by presenting our experimental results on a 432 core cluster and discuss the lessons we have learned in completing this challenge.
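The intermediate key/value design mentioned above can be illustrated in miniature: quantizing sky coordinates into cell keys makes co-located detections from different SExtractor runs collide in the reduce step. The coordinate fields and cell size below are hypothetical, and a real pipeline would also handle detections that straddle cell boundaries:

```python
from collections import defaultdict

def cell_key(ra_deg, dec_deg, cell_deg=0.001):
    """Quantize sky coordinates so that nearby detections share a key."""
    return (round(ra_deg / cell_deg), round(dec_deg / cell_deg))

def map_phase(catalogs):
    """Map stage: emit (cell key, (catalog name, detection)) pairs."""
    for cat_name, detections in catalogs.items():
        for det in detections:
            yield cell_key(det["ra"], det["dec"]), (cat_name, det)

def reduce_phase(pairs):
    """Reduce stage: group values by key; each group approximates one
    object in the unified catalog."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups
```

Hadoop performs the same shuffle-and-group step between map and reduce automatically, which is what lets the matching scale across a cluster.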

  18. Empirical Evaluation of Social, Psychological, & Educational Construct Variables Used in NCES Surveys. Working Paper Series.

    ERIC Educational Resources Information Center

    Freidlin, Boris; Salvucci, Sameena

    The purpose of this study was to provide an analysis and evaluation of composite variables in the National Education Longitudinal Study of 1988 (NELS:88) and School and Staffing Survey (SASS) in a way that provides guidance to the staff of the National Center for Education Statistics (NCES) in the more effective use of survey resources. The study…

  19. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  20. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    ERIC Educational Resources Information Center

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  1. Empirical Analysis and Refinement of Expert System Knowledge Bases

    PubMed Central

    Weiss, Sholom M.; Politakis, Peter; Ginsberg, Allen

    1986-01-01

    Recent progress in knowledge base refinement for expert systems is reviewed. Knowledge base refinement is characterized by the constrained modification of rule-components in an existing knowledge base. The goals are to localize specific weaknesses in a knowledge base and to improve an expert system's performance. Systems that automate some aspects of knowledge base refinement can have a significant impact on the related problems of knowledge base acquisition, maintenance, verification, and learning from experience. The SEEK empirical analysis and refinement system is reviewed and its successor system, SEEK2, is introduced. Important areas for future research in knowledge base refinement are described.

  2. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    NASA Astrophysics Data System (ADS)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

    Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and characterization of laser-irradiated tissue. However, no accurate and simple analytical equation yet exists for estimating diffuse reflectance from turbid media. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling of diffuse reflectance from tissue.
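    The lookup-table-plus-surface-fit workflow can be illustrated with a toy stand-in for the Monte Carlo table; the placeholder reflectance surface and the polynomial form below are assumptions made for the sketch, not the authors' simulated data or published formula:

```python
import numpy as np

# Stand-in for a Monte Carlo lookup table: diffuse reflectance R tabulated
# over absorption mu_a and reduced scattering mu_s'. The placeholder
# diffusion-like surface below is invented purely to illustrate the
# surface-fitting step.
mu_a = np.linspace(0.01, 1.0, 20)   # absorption coefficient, mm^-1
mu_sp = np.linspace(0.5, 3.0, 20)   # reduced scattering coefficient, mm^-1
A, S = np.meshgrid(mu_a, mu_sp)
R = S / (S + 7.0 * A)               # placeholder "Monte Carlo" surface

# Candidate empirical formula: a low-order polynomial in (mu_a, mu_s'),
# fitted to the lookup table by linear least squares.
X = np.column_stack([np.ones(A.size), A.ravel(), S.ravel(),
                     (A * S).ravel(), A.ravel()**2, S.ravel()**2])
coef, *_ = np.linalg.lstsq(X, R.ravel(), rcond=None)
R_fit = (X @ coef).reshape(R.shape)

# Compare the fitted surface against the table, as the authors do
# (they report <1% variance for their formula; this toy fit is cruder).
rel_err = np.abs(R_fit - R) / R
```

    The authors' formula achieves sub-1% agreement with their table; choosing a functional form that matches the physics (rather than this generic polynomial) is what makes that possible.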

  3. Marginal Products of the Survey and Principles of Economics Courses: Empirical Information for Curriculum Decision Making.

    ERIC Educational Resources Information Center

    Butler, Thomas

    In order to determine the most appropriate economics prerequisites for selected courses at Thomas Nelson Community College, a study was conducted during Winter 1980 to compare the relative effectiveness of a one-quarter survey course in economics with a three-quarter principles course. The study involved the pre- and post-testing of students…

  4. A Survey and Empirical Study of Virtual Reference Service in Academic Libraries

    ERIC Educational Resources Information Center

    Mu, Xiangming; Dimitroff, Alexandra; Jordan, Jeanette; Burclaff, Natalie

    2011-01-01

    Virtual Reference Services (VRS) have high user satisfaction. The main problem is their low usage. We surveyed 100 academic library web sites to understand how VRS are presented. We then conducted a usability study to further test an active VRS model regarding its effectiveness.

  5. An Empirical Taxonomy of Youths' Fears: Cluster Analysis of the American Fear Survey Schedule

    ERIC Educational Resources Information Center

    Burnham, Joy J.; Schaefer, Barbara A.; Giesen, Judy

    2006-01-01

    Fears profiles among children and adolescents were explored using the Fear Survey Schedule for Children-American version (FSSC-AM; J.J. Burnham, 1995, 2005). Eight cluster profiles were identified via multistage Euclidean grouping and supported by homogeneity coefficients and replication. Four clusters reflected overall level of fears (i.e., very…

  6. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  7. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  8. Empirically Based School Interventions Targeted at Academic and Mental Health Functioning

    ERIC Educational Resources Information Center

    Hoagwood, Kimberly E.; Olin, S. Serene; Kerker, Bonnie D.; Kratochwill, Thomas R.; Crowe, Maura; Saka, Noa

    2007-01-01

    This review examines empirically based studies of school-based mental health interventions. The review identified 64 out of more than 2,000 articles published between 1990 and 2006 that met methodologically rigorous criteria for inclusion. Of these 64 articles, only 24 examined both mental health "and" educational outcomes. The majority of…

  9. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1996-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique.

  10. Polymer electrolyte membrane fuel cell fault diagnosis based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Damour, Cédric; Benne, Michel; Grondin-Perez, Brigitte; Bessafi, Miloud; Hissel, Daniel; Chabriat, Jean-Pierre

    2015-12-01

    A diagnosis tool for water management is relevant to improve the reliability and lifetime of polymer electrolyte membrane fuel cells (PEMFCs). This paper presents a novel signal-based diagnosis approach, based on Empirical Mode Decomposition (EMD), dedicated to PEMFCs. EMD is an empirical, intuitive, direct and adaptive signal processing method, without pre-determined basis functions. The proposed diagnosis approach relies on the decomposition of the FC output voltage to detect and isolate flooding and drying faults. The low computational cost of EMD, the reduced number of required measurements, and the high accuracy of flooding and drying fault diagnosis make this approach a promising online diagnosis tool for PEMFC degraded modes management.

  11. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
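    The recognition task described here can be sketched with toy PFAs: score an observed action trace under each strategy's automaton and pick the most likely. For brevity this sketch uses deterministic next-states (real PFAs also allow stochastic state choices), and both strategies are invented examples, not the paper's Roomba models:

```python
import math

# Toy PFAs: transitions maps state -> symbol -> (next_state, probability).
pfa_alternate = {0: {"a": (1, 0.9), "b": (1, 0.1)},
                 1: {"a": (0, 0.1), "b": (0, 0.9)}}
pfa_repeat = {0: {"a": (0, 0.8), "b": (1, 0.2)},
              1: {"a": (0, 0.2), "b": (1, 0.8)}}

def log_likelihood(pfa, trace, start=0):
    """Log-probability of an observed action trace under one PFA."""
    state, ll = start, 0.0
    for symbol in trace:
        state, p = pfa[state][symbol]
        ll += math.log(p)
    return ll

def recognize(models, trace):
    """Behavioral Recognition: return the strategy whose PFA assigns
    the observed trace the highest likelihood."""
    return max(models, key=lambda name: log_likelihood(models[name], trace))

models = {"alternate": pfa_alternate, "repeat": pfa_repeat}
```

    Here `recognize(models, "ababab")` returns "alternate" while `recognize(models, "aaaaaa")` returns "repeat"; Behavioral Cloning corresponds to learning the transition probabilities themselves from the training traces.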

  12. Capability deprivation of people with Alzheimer's disease: An empirical analysis using a national survey.

    PubMed

    Tellez, Juan; Krishnakumar, Jaya; Bungener, Martine; Le Galès, Catherine

    2016-02-01

    How can one assess the quality of life of older people--particularly those with Alzheimer's disease--from the point of view of their opportunities to do valued things in life? This paper is an attempt to answer this question using as a theoretical framework the capability approach. We use data collected on 8841 individuals above 60 living in France (the 2008 Disability and Health Household Survey) and propose a latent variable modelling framework to analyse their capabilities in two fundamental dimensions: freedom to perform self-care activities and freedom to participate in the life of the household. Our results show that living as a couple, having children, being mobile and having access to local shops, health facilities and public services enhance both capabilities. Age, household size and male gender (for one of the two capabilities) act as impediments, while the number of impairments reduces both capabilities. We find that people with Alzheimer's disease have a lower level and a smaller range of capabilities (freedom) when compared to those without, even when the latter have several impairments. Hence they need special attention in policy-making. PMID:26773293

  13. An empirical determination of the dust mass absorption coefficient, κd, using the Herschel Reference Survey

    NASA Astrophysics Data System (ADS)

    Clark, Christopher J. R.; Schofield, Simon P.; Gomez, Haley L.; Davies, Jonathan I.

    2016-06-01

    We use the published photometry and spectroscopy of 22 galaxies in the Herschel Reference Survey to determine that the value of the dust mass absorption coefficient κd at a wavelength of 500 μm is κ_500 = 0.051^{+0.070}_{-0.026} m^2 kg^{-1}. We do so by taking advantage of the fact that the dust-to-metals ratio in the interstellar medium of galaxies appears to be constant. We argue that our value for κd supersedes that of James et al. - who pioneered this approach for determining κd - because we take advantage of superior data, and account for a number of significant systematic effects that they did not consider. We comprehensively incorporate all methodological and observational contributions to establish the uncertainty on our value, which represents a marked improvement on the oft-quoted `order-of-magnitude' uncertainty on κd. We find no evidence that the value of κd differs significantly between galaxies, or that it correlates with any other measured or derived galaxy properties. We note, however, that the availability of data limits our sample to relatively massive (10^{9.7} < M⋆ < 10^{11.0} M⊙), high-metallicity (8.61 < 12 + log_{10}(O/H) < 8.86) galaxies; future work will allow us to investigate a wider range of systems.

  14. Towards an Empirically Based Parametric Explosion Spectral Model

    SciTech Connect

    Ford, S R; Walter, W R; Ruppert, S; Matzel, E; Hauk, T; Gok, R

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  15. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1995-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique. To test the efficacy of PBR, we conducted two runs of a controlled experiment in the environment of the NASA GSFC Software Engineering Laboratory (SEL), using developers from that environment. The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques: PBR and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. Where there is a statistically significant distinction, PBR performs better than the subjects' usual technique. However, PBR appears to be more effective on the generic documents than on the NASA documents.

  16. An Empirically Based Error-Model for Radar Rainfall Estimates

    NASA Astrophysics Data System (ADS)

    Ciach, G. J.

    2004-05-01

    Mathematical modeling of the way radar rainfall (RR) approximates the physical truth is a prospective method to quantify the RR uncertainties. In this approach one can represent RR in the form of an "observation equation," that is, as a function of the corresponding true rainfall and a random error process. The error process describes the cumulative effect of all the sources of RR uncertainties. We present the results of our work on the identification and estimation of this relationship. They are based on the Level II reflectivity data from the WSR-88D radar in Tulsa, Oklahoma, and rainfall measurements from 23 surrounding Oklahoma Mesonet raingauges. Accumulation intervals from one hour to one day were analyzed using this sample. The raingauge accumulations were used as an approximation of the true rainfall in this study. The RR error-model that we explored is factorized into a deterministic distortion, which is a function of the true rainfall, and a multiplicative random error factor that is a positive-valued random variable. The distribution of the error factor depends on the true rainfall; however, its expectation in this representation is always equal to one (all the biases are modeled by the deterministic component). With this constraint, the deterministic distortion function can be defined as the conditional mean of RR conditioned on the true rainfall. We use nonparametric regression to estimate the deterministic distortion, and the variance and quantiles of the random error factor, as functions of the true rainfall. The results show that the deterministic distortion is a nonlinear function of the true rainfall that indicates systematic overestimation of weak rainfall and underestimation of strong rainfall (conditional bias). The standard deviation of the error factor is a decreasing function of the true rainfall that ranges from about 0.8 for weak rainfall to about 0.3 for strong rainfall.
For larger time-scales, both the deterministic distortion and the…
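    The observation-equation setup above (RR as a deterministic distortion h(r) of true rainfall times a mean-one multiplicative error factor) can be sketched with synthetic data; the binned-mean regression below stands in for the authors' nonparametric estimator, and all distributions and numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for gauge "true" rainfall r and a radar estimate RR,
# built to mimic the paper's findings: a conditional bias plus a mean-one
# multiplicative error whose spread shrinks as rainfall grows.
r = rng.gamma(shape=1.5, scale=4.0, size=5000)        # "true" rainfall, mm
h_true = 2.0 * r ** 0.8                               # deterministic distortion
sigma = 0.5 / np.sqrt(1.0 + r)
e = np.exp(rng.normal(-sigma**2 / 2.0, sigma))        # E[e | r] = 1 by construction
rr = h_true * e                                       # observation equation: RR = h(r) * e

# Nonparametric estimate of h(r) = E[RR | r] via equal-count (quantile) bins;
# this binned-mean regression stands in for a smoother estimator.
bins = np.quantile(r, np.linspace(0.0, 1.0, 21))
idx = np.clip(np.digitize(r, bins) - 1, 0, 19)
h_hat = np.array([rr[idx == k].mean() for k in range(20)])

# Empirical error factor and its conditional standard deviation per bin.
e_hat = rr / h_hat[idx]
sd = np.array([e_hat[idx == k].std() for k in range(20)])
```

    With the bin means in hand, per-bin quantiles of `e_hat` give the conditional quantiles of the error factor that the study also reports.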

  17. A Survey of UML Based Regression Testing

    NASA Astrophysics Data System (ADS)

    Fahad, Muhammad; Nadeem, Aamer

    Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature that show testers how to build a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and find that UML models help in identifying changes for regression test selection effectively. We survey the existing UML-based regression testing techniques and provide an analysis matrix to give a quick insight into prominent features of the literature. We discuss open research issues that remain to be addressed for UML-based regression testing, such as managing and reducing the size of the regression test suite and prioritizing test cases under tight schedules and limited resources.

  18. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    PubMed Central

    Reiber, Karin

    2011-01-01

    The project „Evidence-based Nursing Education – Preparatory Stage“, funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach – which extends beyond the aims of this project – is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a…

  19. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  20. An Empirically-Based Statewide System for Identifying Quality Pre-Kindergarten Programs

    ERIC Educational Resources Information Center

    Williams, Jeffrey M.; Landry, Susan H.; Anthony, Jason L.; Swank, Paul R.; Crawford, April D.

    2012-01-01

    This study presents an empirically-based statewide system that links information about pre-kindergarten programs with children's school readiness scores to certify pre-kindergarten classrooms as promoting school readiness. Over 8,000 children from 1,255 pre-kindergarten classrooms were followed longitudinally for one year. Pre-kindergarten quality…

  1. Development of an Empirically Based Questionnaire to Investigate Young Students' Ideas about Nature of Science

    ERIC Educational Resources Information Center

    Chen, Sufen; Chang, Wen-Hua; Lieu, Sang-Chong; Kao, Huey-Lien; Huang, Mao-Tsai; Lin, Shu-Fen

    2013-01-01

    This study developed an empirically based questionnaire to monitor young learners' conceptions of nature of science (NOS). The questionnaire, entitled Students' Ideas about Nature of Science (SINOS), measured views on theory-ladenness, use of creativity and imagination, tentativeness of scientific knowledge, durability of scientific knowledge,…

  2. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  3. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  4. Multisystemic Therapy: An Empirically Supported, Home-Based Family Therapy Approach.

    ERIC Educational Resources Information Center

    Sheidow, Ashli J.; Woodford, Mark S.

    2003-01-01

    Multisystemic Therapy (MST) is a well-validated, evidenced-based treatment for serious clinical problems presented by adolescents and their families. This article is an introduction to the MST approach and outlines key clinical features, describes the theoretical underpinnings, and discusses the empirical support for MST's effectiveness with a…

  5. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  6. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  7. Use of an Empirically Based Marriage Education Program by Religious Organizations: Results of a Dissemination Trial

    ERIC Educational Resources Information Center

    Markman, Howard J.; Whitton, Sarah W.; Kline, Galena H.; Stanley, Scott M.; Thompson, Huette; St. Peters, Michelle; Leber, Douglas B.; Olmos-Gallo, P. Antonio; Prado, Lydia; Williams, Tamara; Gilbert, Katy; Tonelli, Laurie; Bobulinski, Michelle; Cordova, Allen

    2004-01-01

    We present an evaluation of the extent to which an empirically based couples' intervention program was successfully disseminated in the community. Clergy and lay leaders from 27 religious organizations who were trained to deliver the Prevention and Relationship Enhancement Program (PREP) were contacted approximately yearly for 5 years following…

  8. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  9. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  10. Empirical metallicity-dependent calibrations of effective temperature against colours for dwarfs and giants based on interferometric data

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Liu, X.-W.; Yuan, H.-B.; Xiang, M.-S.; Chen, B.-Q.; Zhang, H.-W.

    2015-12-01

    We present empirical metallicity-dependent calibrations of effective temperature against colours for dwarfs of luminosity classes IV and V and for giants of luminosity classes II and III, based on a collection from the literature of about two hundred nearby stars with direct effective temperature measurements of better than 2.5 per cent. The calibrations are valid for an effective temperature range 3100-10 000 K for dwarfs of spectral types M5 to A0 and 3100-5700 K for giants of spectral types K5 to G5. A total of 21 colours for dwarfs and 18 colours for giants of bands of four photometric systems, i.e. the Johnson (UBV R_J I_J JHK), the Cousins (R_C I_C), the Sloan Digital Sky Survey (gr) and the Two Micron All Sky Survey (JHKs), have been calibrated. Restricted by the metallicity range of the current sample, the calibrations are mainly applicable for disc stars ([Fe/H] ≳ - 1.0). The normalized percentage residuals of the calibrations are typically 2.0 and 1.5 per cent for dwarfs and giants, respectively. Some systematic discrepancies at various levels are found between the current scales and those available in the literature (e.g. those based on the infrared flux method or spectroscopy). Based on the current calibrations, we have re-determined the colours of the Sun. We have also investigated the systematic errors in effective temperatures yielded by the current on-going large-scale low- to intermediate-resolution stellar spectroscopic surveys. We show that the calibration of colour (g - Ks) presented in this work provides an invaluable tool for the estimation of stellar effective temperature for those on-going or upcoming surveys.
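    Calibrations of this kind are commonly written as low-order polynomials in a colour C and [Fe/H] for θ = 5040 K / Teff. The sketch below fits such a form to synthetic placeholder "stars"; the toy coefficients and scatter are assumptions for illustration, not the paper's published calibration:

```python
import numpy as np

# Polynomial calibration form:
#   theta = 5040 / Teff = b0 + b1*C + b2*C^2 + b3*C*[Fe/H] + b4*[Fe/H] + b5*[Fe/H]^2
rng = np.random.default_rng(1)
C = rng.uniform(0.3, 1.5, 200)              # a (g - Ks)-like colour (toy range)
feh = rng.uniform(-1.0, 0.4, 200)           # [Fe/H], disc-star range
theta = 0.54 + 0.35 * C - 0.02 * C**2 + 0.01 * C * feh + 0.02 * feh  # toy truth
theta += rng.normal(0.0, 0.005, 200)        # measurement scatter

# Linear least squares on the polynomial terms.
X = np.column_stack([np.ones_like(C), C, C**2, C * feh, feh, feh**2])
b, *_ = np.linalg.lstsq(X, theta, rcond=None)

def teff(colour, metallicity):
    """Effective temperature (K) from the fitted toy calibration."""
    terms = np.array([1.0, colour, colour**2, colour * metallicity,
                      metallicity, metallicity**2])
    return 5040.0 / (terms @ b)
```

    For the toy inputs, `teff(1.0, 0.0)` should recover roughly 5040/0.87 ≈ 5790 K; the metallicity cross-terms are what make the calibration metallicity-dependent.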

  11. Understanding why users tag: A survey of tagging motivation literature and results from an empirical study

    PubMed Central

    Strohmaier, Markus; Körner, Christian; Kern, Roman

    2012-01-01

    While recent progress has been achieved in understanding the structure and dynamics of social tagging systems, we know little about the underlying user motivations for tagging, and how they influence resulting folksonomies and tags. This paper addresses three issues related to this question. (1) What distinctions of user motivations are identified by previous research, and in what ways are the motivations of users amenable to quantitative analysis? (2) To what extent does tagging motivation vary across different social tagging systems? (3) How does variability in user motivation influence resulting tags and folksonomies? In this paper, we present measures to detect whether a tagger is primarily motivated by categorizing or describing resources, and apply these measures to datasets from seven different tagging systems. Our results show that (a) users’ motivation for tagging varies not only across, but also within tagging systems, and that (b) tag agreement among users who are motivated by categorizing resources is significantly lower than among users who are motivated by describing resources. Our findings are relevant for (1) the development of tag-based user interfaces, (2) the analysis of tag semantics and (3) the design of search algorithms for social tagging systems. PMID:23471473
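    One simple way to operationalize the tag-agreement comparison in this abstract is the mean pairwise Jaccard overlap between the tag sets different users assign to the same resource. The measure and the toy annotations below are invented for illustration, not the paper's exact metric or data:

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap of two tag sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def tag_agreement(annotations):
    """Mean pairwise Jaccard overlap of the tag sets that different users
    assign to the same resource. Description-motivated tagging (shared,
    content-derived tags) pushes this up; idiosyncratic personal category
    vocabularies push it down. `annotations`: resource -> {user: tag set}."""
    scores = [jaccard(u, v)
              for users in annotations.values()
              for u, v in combinations(users.values(), 2)]
    return sum(scores) / len(scores) if scores else 0.0

# Toy annotations for the two motivation styles.
describers = {"photo1": {"ann": {"sunset", "beach"},
                         "bob": {"sunset", "sea", "beach"}}}
categorizers = {"photo1": {"ann": {"todo", "inspiration"},
                           "bob": {"vacation2009"}}}
```

    Here agreement is 2/3 for the describer pair and 0 for the categorizer pair, mirroring the paper's finding that agreement is significantly lower among categorization-motivated users.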

  12. A comparison of web-based and paper-based survey methods: testing assumptions of survey mode and response cost.

    PubMed

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-10-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and mixed-mode surveys. The participants included evaluators from the American Evaluation Association (AEA). Results included that mixed-mode, while more expensive, had higher response rates. PMID:19605623

  13. 78 FR 54464 - Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based... above-referenced proceeding, of Premier Empire Energy, LLC's application for market-based rate...

  14. Increasing Response Rates to Web-Based Surveys

    ERIC Educational Resources Information Center

    Monroe, Martha C.; Adams, Damian C.

    2012-01-01

We review a popular method for collecting data--Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well,…

  15. Increasing Your Productivity with Web-Based Surveys

    ERIC Educational Resources Information Center

    Wissmann, Mary; Stone, Brittney; Schuster, Ellen

    2012-01-01

    Web-based survey tools such as Survey Monkey can be used in many ways to increase the efficiency and effectiveness of Extension professionals. This article describes how Survey Monkey has been used at the state and county levels to collect community and internal staff information for the purposes of program planning, administration, evaluation and…

  16. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support. PMID:25778674

  17. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from empirical and theoretical models showed excellent agreement with those of experiments of other references within acceptable error.
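The Zener-Hollomon formulation referenced above combines strain rate and temperature into a single parameter, Z = ε̇ exp(Q/RT), from which flow stress follows via the standard hyperbolic-sine constitutive law. A minimal sketch, with placeholder material constants (Q, A, alpha, n) that are illustrative only, not the values fitted for functionally graded steels:

```python
import numpy as np

R = 8.314  # J/(mol*K), universal gas constant

def flow_stress(strain_rate, T, Q, A, alpha, n):
    """Flow stress from the Zener-Hollomon parameter via the standard
    hyperbolic-sine law:
        Z = strain_rate * exp(Q / (R*T))
        sigma = (1/alpha) * asinh((Z/A)**(1/n))
    """
    Z = strain_rate * np.exp(Q / (R * T))
    return np.arcsinh((Z / A) ** (1.0 / n)) / alpha

# Hypothetical constants of the right order of magnitude for hot-worked steel.
sigma = flow_stress(strain_rate=0.1, T=1273.0,
                    Q=380e3, A=1e14, alpha=0.012, n=5.0)
print(f"{sigma:.1f} MPa")  # units follow from 1/alpha
```

As expected from the model, flow stress rises with strain rate (larger Z) and falls with temperature (smaller Z).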

  18. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multi-sentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven-year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multi-sentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  19. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education. PMID:22987194

  20. An empirical mass-loss law for Population II giants from the Spitzer-IRAC survey of Galactic globular clusters

    NASA Astrophysics Data System (ADS)

    Origlia, L.; Ferraro, F. R.; Fabbri, S.; Fusi Pecci, F.; Dalessandro, E.; Rich, R. M.; Valenti, E.

    2014-04-01

    Aims: The main aim of the present work is to derive an empirical mass-loss (ML) law for Population II stars in first and second ascent red giant branches. Methods: We used the Spitzer InfraRed Array Camera (IRAC) photometry obtained in the 3.6-8 μm range of a carefully chosen sample of 15 Galactic globular clusters spanning the entire metallicity range and sampling the vast zoology of horizontal branch (HB) morphologies. We complemented the IRAC photometry with near-infrared data to build suitable color-magnitude and color-color diagrams and identify mass-losing giant stars. Results: We find that while the majority of stars show colors typical of cool giants, some stars show an excess of mid-infrared light that is larger than expected from their photospheric emission and that is plausibly due to dust formation in mass flowing from them. For these stars, we estimate dust and total (gas + dust) ML rates and timescales. We finally calibrate an empirical ML law for Population II red and asymptotic giant branch stars with varying metallicity. We find that at a given red giant branch luminosity only a fraction of the stars are losing mass. From this, we conclude that ML is episodic and is active only a fraction of the time, which we define as the duty cycle. The fraction of mass-losing stars increases by increasing the stellar luminosity and metallicity. The ML rate, as estimated from reasonable assumptions for the gas-to-dust ratio and expansion velocity, depends on metallicity and slowly increases with decreasing metallicity. In contrast, the duty cycle increases with increasing metallicity, with the net result that total ML increases moderately with increasing metallicity, about 0.1 M⊙ every dex in [Fe/H]. For Population II asymptotic giant branch stars, we estimate a total ML of ≤0.1 M⊙, nearly constant with varying metallicity. This work is based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory

  1. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states (see map below) reporting investments explicitly dedicated…

  2. Web-Based Surveys Facilitate Undergraduate Research and Knowledge

    ERIC Educational Resources Information Center

    Grimes, Paul, Ed.; Steele, Scott R.

    2008-01-01

    The author presents Web-based surveying as a valuable tool for achieving quality undergraduate research in upper-level economics courses. Web-based surveys can be employed in efforts to integrate undergraduate research into the curriculum without overburdening students or faculty. The author discusses the value of undergraduate research, notes…

  3. Survey Says? A Primer on Web-based Survey Design and Distribution

    PubMed Central

    Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.

    2011-01-01

The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347

  4. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of the uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Second, we developed a physics-based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during the decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300 K to 1000 K while propagating parametric uncertainty, and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics-based models and found that the uncertainty in the kinetics predicted by the physics-based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum-level model. We found that the species change within the bed occurring during the decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
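The empirical-model step described above, fitting scattered literature data and quantifying the spread, can be illustrated with a log-linear Arrhenius fit: k = k0·exp(-Ea/RT) is linear in ln k versus 1/T, so ordinary least squares recovers Ea and k0 directly. The data points below are invented stand-ins, not actual UH3 measurements:

```python
import numpy as np

# Hypothetical scattered measurements: temperature (K) vs. observed rate
# constant (1/s); stand-ins for compiled literature data.
T = np.array([500.0, 550.0, 600.0, 650.0, 700.0])
k_obs = np.array([2.1e-5, 1.5e-4, 8.0e-4, 3.6e-3, 1.2e-2])

R = 8.314  # J/(mol*K)
# Fit ln k = ln k0 - (Ea/R) * (1/T) by linear least squares.
slope, intercept = np.polyfit(1.0 / T, np.log(k_obs), 1)
Ea = -slope * R          # activation energy, J/mol
k0 = np.exp(intercept)   # pre-exponential factor, 1/s

# Residual scatter in ln k quantifies the data spread around the fit.
resid = np.log(k_obs) - (intercept + slope / T)
print(f"Ea = {Ea/1e3:.0f} kJ/mol, k0 = {k0:.2e} 1/s, "
      f"ln-k scatter = {resid.std():.2f}")
```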

  5. Methodologies for Crawler Based Web Surveys.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  6. Scan MDCs for GPS-Based Gamma Radiation Surveys.

    PubMed

    Alecksen, Tyler; Whicker, Randy

    2016-08-01

    A method for estimating the minimum detectable concentration of a contaminant radionuclide in soil when scanning with gamma radiation detectors (known as the "scan MDC") is described in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM). This paper presents an alternate method for estimating scan MDCs for GPS-based gamma surveys based on detector efficiencies modeled with the probabilistic Monte Carlo N-Particle Extended (MCNPX) Transport simulation code. Results are compared to those provided in MARSSIM. An extensive database of MCNPX-based detection efficiencies has been developed to represent a variety of gamma survey applications and potential scanning configurations (detector size, scan height, size of contaminated soil volume, etc.), and an associated web-based user interface has been developed to provide survey designers and regulators with access to a reasonably wide range of calculated scan MDC values for survey planning purposes. PMID:27356162

  7. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel

  8. Inequality of Higher Education in China: An Empirical Test Based on the Perspective of Relative Deprivation

    ERIC Educational Resources Information Center

    Hou, Liming

    2014-01-01

    The primary goal of this paper is to examine what makes Chinese college students dissatisfied with entrance opportunities for higher education. Based on the author's survey data, we test two parameters which could be a potential cause of this dissatisfaction: 1) distributive inequality, which emphasizes the individual's dissatisfaction caused by…

  9. Advances on Empirical Mode Decomposition-based Time-Frequency Analysis Methods in Hydrocarbon Detection

    NASA Astrophysics Data System (ADS)

    Chen, H. X.; Xue, Y. J.; Cao, J.

    2015-12-01

Empirical mode decomposition (EMD), a data-driven adaptive decomposition method not limited by time-frequency uncertainty spreading, has proved more suitable for seismic signals, which are nonlinear and non-stationary. Compared with Fourier-based and wavelet-based time-frequency methods, EMD-based time-frequency methods have higher temporal and spatial resolution and yield hydrocarbon interpretations with more statistical significance. The empirical mode decomposition algorithm has evolved from EMD to ensemble EMD (EEMD) to complete ensemble EMD (CEEMD). Even though EMD-based time-frequency methods offer many promising features for analyzing and processing geophysical data, they have some limitations. This presentation offers a comparative study on hydrocarbon detection using seven EMD-based time-frequency analysis methods: (1) EMD combined with the Hilbert transform (HT); (2) the normalized Hilbert transform (NHT) and the HU method, each combined with the HT, as improved time-frequency analysis methods; (3) EMD combined with Teager-Kaiser energy (EMD/TK); (4) EMD combined with the wavelet transform (EMDWave), as a seismic attenuation estimation method; and (5) EEMD- and CEEMD-based time-frequency analysis methods used as highlight-volumes technology. The differences between these methods in hydrocarbon detection will be discussed. The question of obtaining a meaningful instantaneous frequency via the HT, and mode-mixing issues in EMD, will be analysed. The work was supported by NSFC under grant Nos. 41430323, 41404102 and 41274128.
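All the EMD variants listed above share the final Hilbert-transform step: each intrinsic mode function (IMF) is converted to an analytic signal whose phase derivative gives instantaneous frequency versus time. A minimal sketch of that step only (the sifting itself is omitted; a mono-component chirp stands in for an IMF):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
imf = np.cos(2 * np.pi * (20 * t + 15 * t**2))  # chirp sweeping 20 -> 50 Hz

# Analytic signal -> unwrapped phase -> instantaneous frequency (Hz).
analytic = hilbert(imf)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs

# Interior samples should track the chirp's 20 + 30*t Hz law
# (edge samples are distorted by the Hilbert transform's end effects).
print(inst_freq[100], inst_freq[800])  # close to 23 Hz and 44 Hz
```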

  10. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U. S. organizations, and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in U. S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  11. A survey of machine readable data bases

    NASA Technical Reports Server (NTRS)

    Matlock, P.

    1981-01-01

Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  12. Implementation of an empirically based drug and violence prevention and intervention program in public school settings.

    PubMed

    Cunningham, P B; Henggeler, S W

    2001-06-01

    Describes the implementation of a collaborative preventive intervention project (Healthy Schools) designed to reduce levels of bullying and related antisocial behaviors in children attending two urban middle schools serving primarily African American students. These schools have high rates of juvenile violence, as reflected by suspensions and expulsions for behavioral problems. Using a quasi-experimental design, empirically based drug and violence prevention programs, Bullying Prevention and Project ALERT, are being implemented at each middle school. In addition, an intensive evidence-based intervention, multisystemic therapy, is being used to target students at high risk of expulsion and court referral. Hence, the proposed project integrates both universal approaches to prevention and a model that focuses on indicated cases. Targeted outcomes, by which the effectiveness of this comprehensive school-based program will be measured, are reduced youth violence, reduced drug use, and improved psychosocial functioning of participating youth. PMID:11393922

  13. Comparing results from two continental geochemical surveys to world soil composition and deriving Predicted Empirical Global Soil (PEGS2) reference values

    NASA Astrophysics Data System (ADS)

de Caritat, Patrice; Reimann, Clemens; Bastrakov, E.; Bowbridge, D.; Boyle, P.; Briggs, S.; Brown, D.; Brown, M.; Brownlie, K.; Burrows, P.; Burton, G.; Byass, J.; de Caritat, P.; Chanthapanya, N.; Cooper, M.; Cranfield, L.; Curtis, S.; Denaro, T.; Dhnaram, C.; Dhu, T.; Diprose, G.; Fabris, A.; Fairclough, M.; Fanning, S.; Fidler, R.; Fitzell, M.; Flitcroft, P.; Fricke, C.; Fulton, D.; Furlonger, J.; Gordon, G.; Green, A.; Green, G.; Greenfield, J.; Harley, J.; Heawood, S.; Hegvold, T.; Henderson, K.; House, E.; Husain, Z.; Krsteska, B.; Lam, J.; Langford, R.; Lavigne, T.; Linehan, B.; Livingstone, M.; Lukss, A.; Maier, R.; Makuei, A.; McCabe, L.; McDonald, P.; McIlroy, D.; McIntyre, D.; Morris, P.; O'Connell, G.; Pappas, B.; Parsons, J.; Petrick, C.; Poignand, B.; Roberts, R.; Ryle, J.; Seymon, A.; Sherry, K.; Skinner, J.; Smith, M.; Strickland, C.; Sutton, S.; Swindell, R.; Tait, H.; Tang, J.; Thomson, A.; Thun, C.; Uppill, B.; Wall, K.; Watkins, J.; Watson, T.; Webber, L.; Whiting, A.; Wilford, J.; Wilson, T.; Wygralak, A.; Albanese, S.; Andersson, M.; Arnoldussen, A.; Baritz, R.; Batista, M. J.; Bel-lan, A.; Birke, M.; Cicchella, C.; Demetriades, A.; Dinelli, E.; De Vivo, B.; De Vos, W.; Duris, M.; Dusza-Dobek, A.; Eggen, O. A.; Eklund, M.; Ernstsen, V.; Filzmoser, P.; Finne, T. E.; Flight, D.; Forrester, S.; Fuchs, M.; Fugedi, U.; Gilucis, A.; Gosar, M.; Gregorauskiene, V.; Gulan, A.; Halamić, J.; Haslinger, E.; Hayoz, P.; Hobiger, G.; Hoffmann, R.; Hoogewerff, J.; Hrvatovic, H.; Husnjak, S.; Janik, L.; Johnson, C. C.; Jordan, G.; Kirby, J.; Kivisilla, J.; Klos, V.; Krone, F.; Kwecko, P.; Kuti, L.; Ladenberger, A.; Lima, A.; Locutura, J.; Lucivjansky, P.; Mackovych, D.; Malyuk, B. I.; Maquil, R.; McLaughlin, M.; Meuli, R. G.; Miosic, N.; Mol, G.; Négrel, P.; O'Connor, P.; Oorts, K.; Ottesen, R. T.; Pasieczna, A.; Petersell, V.; Pfleiderer, S.; Poňavič, M.; Prazeres, C.; Rauch, U.; Reimann, C.; Salpeteur, I.; Schedl, A.; Scheib, A.; Schoeters, I.; Sefcik, P.; Sellersjö, E.; Skopljak, F.; Slaninka, I.; Šorša, A.; Srvkota, R.; Stafilov, T.; Tarvainen, T.; Trendavilov, V.; Valera, P.; Verougstraete, V.; Vidojević, D.; Zissimos, A. M.; Zomeni, Z.

    2012-02-01

Analytical data for 10 major oxides (Al2O3, CaO, Fe2O3, K2O, MgO, MnO, Na2O, P2O5, SiO2 and TiO2), 16 total trace elements (As, Ba, Ce, Co, Cr, Ga, Nb, Ni, Pb, Rb, Sr, Th, V, Y, Zn and Zr), 14 aqua regia extracted elements (Ag, As, Bi, Cd, Ce, Co, Cs, Cu, Fe, La, Li, Mn, Mo and Pb), Loss On Ignition (LOI) and pH from 3526 soil samples from two continents (Australia and Europe) are presented and compared to (1) the composition of the upper continental crust, (2) published world soil average values, and (3) data from other continental-scale soil surveys. It can be demonstrated that average upper continental crust values do not provide reliable estimates for natural concentrations of elements in soils. For many elements there exist substantial differences between published world soil averages and the median concentrations observed on two continents. Direct comparison with other continental datasets is hampered by the fact that often the mean, instead of the statistically more robust median, is reported. Using a database of the worldwide distribution of lithological units, it can be demonstrated that lithology is a poor predictor of soil chemistry. Climate-related processes such as glaciation and weathering are strong modifiers of the geochemical signature inherited from bedrock during pedogenesis. To overcome existing shortcomings of predicted global or world soil geochemical reference values, we propose Predicted Empirical Global Soil reference values based on analytical results of a representative number of soil samples from two continents (PEGS2).
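The abstract's point about reporting the mean versus the statistically more robust median is easy to demonstrate on skewed data. A small sketch with synthetic, roughly lognormal concentrations (arbitrary units, not survey data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Geochemical concentrations are typically right-skewed (approximately
# lognormal), so the mean is dragged upward by a few high values while
# the median stays near the bulk of the data.
conc = rng.lognormal(mean=1.0, sigma=1.0, size=10000)  # arbitrary units

print(f"mean   = {conc.mean():.2f}")
print(f"median = {np.median(conc):.2f}")
# For this lognormal, the theoretical median is exp(1) ~ 2.72 and the
# theoretical mean exp(1 + 1/2) ~ 4.48: the mean sits well above the median.
```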

  14. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-07-01

The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach compared with the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.
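The dip-separation idea, splitting a section into a handful of dip bands before filtering each band independently, can be illustrated with a plain f-k fan filter, used here as a simple stand-in for the adaptive EMD-based dip filter of the abstract (in the f-k domain, a linear event with slope p lies on the line k = p·f, so binning samples by the apparent slope k/f separates dips):

```python
import numpy as np

def fk_dip_bands(section, n_bands, dt=0.004, dx=10.0):
    """Split a 2-D section (time x trace) into n_bands dip bands
    by partitioning the f-k plane on the apparent slope k/f."""
    nt, nx = section.shape
    F = np.fft.fft2(section)
    f = np.fft.fftfreq(nt, dt)[:, None]   # temporal frequency, Hz
    k = np.fft.fftfreq(nx, dx)[None, :]   # wavenumber, 1/m
    # Apparent slope k/f; samples with f == 0 are assigned slope 0.
    slope = np.where(np.abs(f) > 0, k / np.where(f == 0, 1, f), 0.0)
    edges = np.linspace(slope.min(), slope.max(), n_bands + 1)
    # Every f-k sample falls into exactly one band, so the bands
    # partition the plane and sum back to the original section.
    idx = np.clip(np.digitize(slope, edges) - 1, 0, n_bands - 1)
    return [np.real(np.fft.ifft2(F * (idx == b))) for b in range(n_bands)]

rng = np.random.default_rng(1)
sec = rng.standard_normal((64, 32))
bands = fk_dip_bands(sec, 5)
print(np.allclose(sum(bands), sec))  # True: the bands tile the f-k plane
```

In the dip-separated strategy, a denoiser (seislet thresholding in the abstract) would be applied to each band before summation.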

  15. Determination of knock characteristics in spark ignition engines: an approach based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Li, Ning; Yang, Jianguo; Zhou, Rui; Liang, Caiping

    2016-04-01

    Knock is one of the major constraints to improve the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determine the knock characteristics in SI engines. By adding a uniformly distributed and finite white Gaussian noise, the EEMD can preserve signal continuity in different scales and therefore alleviates the mode-mixing problem occurring in the classic empirical mode decomposition (EMD). The feasibilities of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from combustion chamber and the vibration signal measured from cylinder head are investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and vibration signal, even in initial stage of knock. Finally, by comparing the application results with those obtained by short-time Fourier transform (STFT), Wigner-Ville distribution (WVD) and discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated.
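The noise-assisted ensemble averaging that distinguishes EEMD from classic EMD can be sketched generically: decompose many white-noise-perturbed copies of the signal and average the components, so the added noise cancels while mode mixing between realizations is averaged out. To keep the sketch self-contained, a moving-average trend/detail split stands in for the full EMD sifting:

```python
import numpy as np

def decompose(x, width=11):
    """Placeholder two-component decomposition (detail + smooth trend).
    In real EEMD this would be EMD sifting into several IMFs; a moving
    average stands in here to keep the sketch self-contained."""
    kernel = np.ones(width) / width
    trend = np.convolve(x, kernel, mode="same")
    return np.array([x - trend, trend])

def eemd(x, n_ensemble=100, noise_std=0.2, seed=0):
    """Ensemble averaging as in EEMD: decompose many noise-perturbed
    copies of the signal and average component-wise."""
    rng = np.random.default_rng(seed)
    acc = np.zeros_like(decompose(x))
    for _ in range(n_ensemble):
        noisy = x + noise_std * x.std() * rng.standard_normal(x.size)
        acc += decompose(noisy)
    return acc / n_ensemble

t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * t  # oscillation + slow drift
detail, trend = eemd(x)
```

In knock detection, the same averaging would be applied to the in-cylinder pressure or vibration signal, with the knock signature expected in the high-frequency components.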

  17. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on Web portals where user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine the positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent's action on a post, its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent's action. The simulations are performed for the case of a constant flux of agents, and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure, comparable with the ones in the empirical system of popular posts. In view of purely emotion-driven agent actions, this type of comparison provides a quantitative measure of the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative

  18. Empirical calibrations of optical absorption-line indices based on the stellar library MILES

    NASA Astrophysics Data System (ADS)

    Johansson, Jonas; Thomas, Daniel; Maraston, Claudia

    2010-07-01

    Stellar population models of absorption-line indices are an important tool for the analysis of stellar population spectra. The indices are most accurately modelled through empirical calibrations against stellar parameters such as effective temperature, metallicity and surface gravity, the so-called fitting functions. Here we present new empirical fitting functions for the 25 optical Lick absorption-line indices based on the new stellar library Medium resolution INT Library of Empirical Spectra (MILES). The major improvements with respect to the Lick/IDS library are the better sampling of stellar parameter space, a generally higher signal-to-noise ratio and a careful flux calibration. In fact, we find that errors on individual index measurements in MILES are considerably smaller than in Lick/IDS. Instead, we find the rms of the residuals between the final fitting functions and the data to be dominated by errors in the stellar parameters. We provide fitting functions for both Lick/IDS and MILES spectral resolutions and compare our results with other fitting functions in the literature. A FORTRAN 90 code is available online in order to simplify the implementation in stellar population models. We further calculate the offsets in index measurements between the Lick/IDS system and a flux-calibrated system. For this purpose, we use the three libraries MILES, ELODIE and STELIB. We find that offsets are negligible in some cases, most notably for the widely used indices Hβ, Mgb, Fe5270 and Fe5335. In a number of cases, however, the difference between the flux-calibrated library and Lick/IDS is significant, with the offsets depending on index strengths. Interestingly, there is no general agreement between the three libraries for a large number of indices, which hampers the derivation of a universal offset between the Lick/IDS and flux-calibrated systems.
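    As an illustration of the fitting-function idea, the sketch below fits a synthetic absorption-line index as a low-order polynomial in inverse temperature, metallicity and gravity by least squares. All data and coefficients here are invented for the example; the actual MILES fitting functions use higher-order terms and real stellar measurements.

```python
import numpy as np

# Hypothetical illustration: fit an absorption-line index as a low-order
# polynomial "fitting function" of effective temperature, metallicity and
# surface gravity, in the spirit of Lick-style fitting functions.
rng = np.random.default_rng(0)
n = 200
theta = 5040.0 / rng.uniform(3500, 7000, n)   # inverse temperature
feh = rng.uniform(-2.0, 0.5, n)               # metallicity [Fe/H]
logg = rng.uniform(0.5, 5.0, n)               # surface gravity

# Synthetic "index" with known coefficients plus noise (illustration only)
index = 1.0 + 2.0 * theta - 0.5 * feh + 0.1 * logg + rng.normal(0, 0.01, n)

# Design matrix with linear terms (real fitting functions use higher orders)
X = np.column_stack([np.ones(n), theta, feh, logg])
coef, *_ = np.linalg.lstsq(X, index, rcond=None)

# rms of residuals between the fitted function and the data
residual_rms = np.sqrt(np.mean((X @ coef - index) ** 2))
```

In the real calibration the residual rms is compared against the index measurement errors, which is how the paper concludes that stellar-parameter errors dominate.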

  19. Polarizable Empirical Force Field for Hexopyranose Monosaccharides Based on the Classical Drude Oscillator

    PubMed Central

    2015-01-01

    A polarizable empirical force field based on the classical Drude oscillator is presented for the hexopyranose form of selected monosaccharides. Parameter optimization targeted quantum mechanical (QM) dipole moments, solute–water interaction energies, vibrational frequencies, and conformational energies. Validation of the model was based on experimental data on crystals, densities of aqueous sugar solutions, diffusion constants of glucose, and rotational preferences of the exocyclic hydroxymethyl group of d-glucose and d-galactose in aqueous solution, as well as additional QM data. Notably, the final model involves a single electrostatic model for all sixteen diastereomers of the monosaccharides, indicating the transferability of the polarizable model. The presented parameters are anticipated to lay the foundation for a comprehensive polarizable force field for saccharides that will be compatible with the polarizable Drude parameters for lipids and proteins, allowing for simulations of glycolipids and glycoproteins. PMID:24564643

  20. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task, since they play a crucial role in e-commerce nowadays. This letter reports an empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena that well characterize the selection mechanism of web users, and outline the relevance of these phenomena to the information recommendation problem.
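    A minimal sketch of how such a user-object bipartite network can be represented as selection pairs and its degree distribution computed. The toy user and object names are invented, not drawn from the audioscrobbler.com or del.icio.us datasets, and the paper's collaborative-similarity index (whose exact definition is not reproduced in this summary) is not shown.

```python
from collections import Counter

# Toy user-object bipartite network as (user, object) selection pairs
edges = [("u1", "songA"), ("u1", "songB"), ("u2", "songA"),
         ("u3", "songA"), ("u3", "songB"), ("u3", "songC")]

user_degree = Counter(u for u, _ in edges)      # selections per user
object_degree = Counter(o for _, o in edges)    # popularity per object

# Degree distribution P(k): fraction of users with degree k
n_users = len(user_degree)
pk = {k: v / n_users for k, v in Counter(user_degree.values()).items()}
```

On the real datasets the same two quantities, user-degree and object-degree distributions, are the starting point for the correlation analyses described above.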

  1. Polarizable Empirical Force Field for Acyclic Poly-Alcohols Based on the Classical Drude Oscillator

    PubMed Central

    He, Xibing; Lopes, Pedro E. M.; MacKerell, Alexander D.

    2014-01-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated in the case of this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, results in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol, as evidenced by the calculated heat of vaporization being in excellent agreement with experiment. Computed condensed-phase data, including crystal lattice parameters and volumes and densities of aqueous solutions, are in better agreement with experimental data as compared to the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules. PMID:23703219

  2. The WASP and NGTS ground-based transit surveys

    NASA Astrophysics Data System (ADS)

    Wheatley, P. J.

    2015-10-01

    I will review the current status of ground-based exoplanet transit surveys, using the Wide Angle Search for Planets (WASP) and the Next Generation Transit Survey (NGTS) as specific examples. I will describe the methods employed by these surveys and show how planets from Neptune to Jupiter-size are detected and confirmed around bright stars. I will also give an overview of the remarkably wide range of exoplanet characterization that is made possible with large-telescope follow up of these bright transiting systems. This characterization includes bulk composition and spin-orbit alignment, as well as atmospheric properties such as thermal structure, composition and dynamics. Finally, I will outline how ground-based photometric studies of transiting planets will evolve with the advent of new space-based surveys such as TESS and PLATO.

  3. Developing Empirically Based, Culturally Grounded Drug Prevention Interventions for Indigenous Youth Populations

    PubMed Central

    Okamoto, Scott K.; Helm, Susana; Pel, Suzanne; McClain, Latoya L.; Hill, Amber P.; Hayashida, Janai K. P.

    2012-01-01

    This article describes the relevance of a culturally grounded approach toward drug prevention development for indigenous youth populations. This approach builds drug prevention from the “ground up” (ie, from the values, beliefs, and worldviews of the youth that are the intended consumers of the program), and is contrasted with efforts that focus on adapting existing drug prevention interventions to fit the norms of different youth ethnocultural groups. The development of an empirically based drug prevention program focused on rural Native Hawaiian youth is described as a case example of culturally grounded drug prevention development for indigenous youth, and the impact of this effort on the validity of the intervention and on community engagement and investment in the development of the program is discussed. Finally, implications of this approach for behavioral health services and the development of an indigenous prevention science are discussed. PMID:23188485

  4. A Human ECG Identification System Based on Ensemble Empirical Mode Decomposition

    PubMed Central

    Zhao, Zhidong; Yang, Lei; Chen, Diandian; Luo, Yi

    2013-01-01

    In this paper, a human electrocardiogram (ECG) identification system based on ensemble empirical mode decomposition (EEMD) is designed. A robust preprocessing method comprising noise elimination, heartbeat normalization and quality measurement is proposed to eliminate the effects of noise and heart rate variability. The system is independent of the heart rate. The ECG signal is decomposed into a number of intrinsic mode functions (IMFs) and Welch spectral analysis is used to extract the significant heartbeat signal features. Principal component analysis is used to reduce the dimensionality of the feature space, and the K-nearest neighbors (K-NN) method is applied as the classifier. The proposed human ECG identification system was tested on standard MIT-BIH ECG databases: the ST change database, the long-term ST database, and the PTB database. The system achieved an identification accuracy of 95% for 90 subjects, demonstrating the effectiveness of the proposed method in terms of accuracy and robustness. PMID:23698274
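    A toy sketch of the feature-extraction and classification stages described above (Welch spectra, PCA via SVD, 1-nearest-neighbour) on synthetic periodic signals. The EEMD decomposition itself is omitted; the synthetic signals simply stand in for the IMF-derived heartbeat signals, and all sampling parameters are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
fs = 250  # sampling rate in Hz (illustrative)

def make_beat(f0):
    # Toy periodic signal standing in for an IMF-derived heartbeat signal
    t = np.arange(0, 4, 1 / fs)
    return np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)

# Two "subjects" distinguished by dominant frequency, several beats each
signals = [make_beat(5) for _ in range(5)] + [make_beat(15) for _ in range(5)]
labels = np.array([0] * 5 + [1] * 5)

# Welch power spectra as features
feats = np.array([welch(s, fs=fs, nperseg=256)[1] for s in signals])

# PCA via SVD to reduce the feature dimensionality
centered = feats - feats.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:3].T

# Leave-one-out 1-nearest-neighbour identification
correct = 0
for i in range(len(reduced)):
    d = np.linalg.norm(reduced - reduced[i], axis=1)
    d[i] = np.inf  # exclude the query sample itself
    correct += labels[d.argmin()] == labels[i]
accuracy = correct / len(reduced)
```

The real system applies the same pipeline to IMFs of preprocessed ECG beats and reports accuracy over 90 subjects rather than two synthetic classes.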

  5. Painting by numbers: nanoparticle-based colorants in the post-empirical age.

    PubMed

    Klupp Taylor, Robin N; Seifrt, Frantisek; Zhuromskyy, Oleksandr; Peschel, Ulf; Leugering, Günter; Peukert, Wolfgang

    2011-06-17

    The visual appearance of the artificial world is largely governed by films or composites containing particles with at least one dimension smaller than a micron. Over the past century and a half, the optical properties of such materials have been scrutinized and a broad range of colorant products, based mostly on empirical microstructural improvements, has been developed. With the advent of advanced synthetic approaches capable of tailoring particle shape, size and composition on the nanoscale, the question of what is the optimum particle for a certain optical property can no longer be answered solely by experimentation. Instead, new and improved computational approaches are required to invert the structure-function relationship. This progress report reviews the development in our understanding of this relationship and indicates recent examples of how theoretical design is taking an increasingly important role in the search for enhanced or multifunctional colorants. PMID:21538592

  6. Distributed optical fiber-based theoretical and empirical methods monitoring hydraulic engineering subjected to seepage velocity

    NASA Astrophysics Data System (ADS)

    Su, Huaizhi; Tian, Shiguang; Cui, Shusheng; Yang, Meng; Wen, Zhiping; Xie, Wei

    2016-09-01

    In order to systematically investigate the general principles and methods of monitoring seepage velocity in hydraulic engineering, theoretical analyses and physical experiments were carried out based on distributed fiber-optic temperature sensing (DTS) technology. To analyze the coupling between the seepage field and the temperature field in embankment dam or dike engineering, a simplified model was constructed to describe the relationship between the two fields. Different arrangement schemes for the optical fiber and different temperature-measurement approaches were applied to the model, and inversion analysis was further used. On this basis, a theoretical method for monitoring seepage velocity in hydraulic engineering is proposed. A new concept, the effective thermal conductivity, is introduced by analogy with the thermal conductivity coefficient in the transient hot-wire method. This quantity reflects the combined influence of heat conduction and seepage, and proves to be a promising basis for an empirical method of monitoring seepage velocity in hydraulic engineering.

  7. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis; it describes the expected remaining lifetime of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors, such as age and gender, that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are also given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26936529
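    The quantity under study can be written compactly. The specific empirical-likelihood test statistic is not reproduced in this summary, so only the MRL function and the change-point formulation are sketched (τ denotes the hypothesized change point, S the survival function of the lifetime T):

```latex
% Mean residual life of a lifetime T with survival function S(t):
m(t) \;=\; \mathbb{E}\!\left[\,T - t \mid T > t\,\right]
      \;=\; \frac{\int_{t}^{\infty} S(u)\,\mathrm{d}u}{S(t)},
\qquad
\text{change-point model: } \;
m(t) \;=\;
\begin{cases}
m_{1}(t), & t \le \tau,\\[2pt]
m_{2}(t), & t > \tau.
\end{cases}
```

The detection procedure tests whether $m_{1}$ and $m_{2}$ differ, using an empirical-likelihood ratio evaluated under right censoring.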

  8. Survey of projection-based immersive displays

    NASA Astrophysics Data System (ADS)

    Wright, Dan

    2000-05-01

    Projection-based immersive displays are rapidly becoming the visualization system of choice for applications requiring the comprehension of complex datasets and the collaborative sharing of insights. The wide variety of display configurations can be grouped into five categories: benches, flat-screen walls, curved-screen theaters, concave-screen domes and spatially-immersive rooms. Each has its strengths and weaknesses, with the appropriateness of each dependent on one's application and budget. The paper outlines the components common to all projection-based displays and describes the characteristics of each particular category. Key image metrics, implementation considerations and immersive display trends are also considered.

  9. Project Surveys Community Based Organizations High Schools

    ERIC Educational Resources Information Center

    Black Issues in Higher Education, 2004

    2004-01-01

    A new report aims to promote awareness at the state level of the importance of Community Based Organizations (CBO) high schools, highlighting the promising lessons these schools hold for improving the educational outcomes of youth at risk of school failure or dropping out. This document briefly analyzes the report, "CBO High Schools: Their Value…

  10. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    ERIC Educational Resources Information Center

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  11. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  12. Determinants of Obesity and Associated Population Attributability, South Africa: Empirical Evidence from a National Panel Survey, 2008-2012

    PubMed Central

    Sartorius, Benn; Veerman, Lennert J.; Manyema, Mercy; Chola, Lumbwe; Hofman, Karen

    2015-01-01

    Background Obesity is a major risk factor for emerging non-communicable diseases (NCDs) in middle-income countries, including South Africa (SA). Understanding the multiple and complex determinants of obesity and their true population-attributable impact is critical for informing and developing effective prevention efforts using scientifically based evidence. This study identified contextualised high-impact factors associated with obesity in South Africa. Methods Analysis of three national cross-sectional (repeated panel) surveys, using multilevel logistic regression and population attributable fraction estimation, allowed for identification of contextualised high-impact factors associated with obesity (BMI > 30 kg/m2) among adults (15 years and older). Results Obesity prevalence increased significantly from 23.5% in 2008 to 27.2% in 2012, with a significantly (p-value < 0.001) higher prevalence among females (37.9% in 2012) compared to males (13.3% in 2012). Living in formal urban areas, white ethnicity, being married, not exercising and/or being in a higher socio-economic category were significantly associated with male obesity. Females living in formal or informal urban areas or higher-crime areas, of African/White ethnicity, married, not exercising, in a higher socio-economic category and/or living in households with proportionately higher spending on food (and unhealthy food options) were significantly more likely to be obese. The identified determinants appeared to account for 75% and 43% of male and female obesity, respectively. White males had the highest relative gain in obesity from 2008 to 2012. Conclusions The rising prevalence of obesity in South Africa is significant, and over the past 5 years the rising prevalence of Type-2 diabetes has mirrored this pattern, especially among females. Targeting young adolescent girls should be a priority. Addressing the determinants of obesity will require a multifaceted strategy and action at both individual and population levels. With rising costs in the
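    Population attributable fractions of the kind reported above are often summarized with Levin's classic formula; the sketch below is illustrative only, since the study itself used multilevel regression-based estimation, and the example numbers are invented.

```python
def levin_paf(prevalence, relative_risk):
    """Population attributable fraction via Levin's classic formula:
    PAF = p(RR - 1) / (1 + p(RR - 1)), where p is the exposure prevalence.
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. a risk factor present in 40% of the population with RR = 2
paf = levin_paf(0.4, 2.0)  # 0.4 / 1.4, i.e. about 28.6% of cases attributable
```

The PAF answers the question the conclusions rely on: what fraction of obesity cases would be removed if a given determinant were eliminated.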

  13. Using Web-Based Surveys to Conduct Counseling Research.

    ERIC Educational Resources Information Center

    Granello, Darcy Haag; Wheaton, Joe E.

    In spite of the increased use of the Internet for data collection, there is little published research about the process of data collection online. That is, discipline specific studies publish the results of their web-based surveys in discipline-specific journals, but little information is available on the process of Internet-based data collection.…

  14. Survey of Online Access to Social Science Data Bases.

    ERIC Educational Resources Information Center

    Donati, Robert

    Until very recently there was little computer access to comprehensive bibliographic data bases in the social sciences. Now online searching of several directly relevant files is made possible through services such as the Lockheed DIALOG system. These data bases are briefly surveyed, with emphasis on content, structure, and strategy appropriate for…

  15. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and especially nursing homes are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to critically analyse whether these policies can be considered organizational-ethical instruments that support healthcare institutions in taking their institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of results of former Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, the content, and the impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice. PMID:24420744

  16. The Utility of Teacher and Student Surveys in Principal Evaluations: An Empirical Investigation. REL 2015-047

    ERIC Educational Resources Information Center

    Liu, Keke; Stuit, David; Springer, Jeff; Lindsay, Jim; Wan, Yinmei

    2014-01-01

    This study examined whether adding student and teacher survey measures to existing principal evaluation measures increases the overall power of the principal evaluation model to explain variation in student achievement across schools. The study was conducted using data from 2011-12 on 39 elementary and secondary schools within a midsize urban…

  17. The "Public Opinion Survey of Human Attributes-Stuttering" (POSHA-S): Summary Framework and Empirical Comparisons

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.

    2011-01-01

    Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…

  18. Gender Wage Gaps by College Major in Taiwan: Empirical Evidence from the 1997-2003 Manpower Utilization Survey

    ERIC Educational Resources Information Center

    Lin, Eric S.

    2010-01-01

    In this article, we examine the effect of incorporating the fields of study on the explained and unexplained components of the standard Oaxaca decomposition for the gender wage gaps in Taiwan using 1997-2003 Manpower Utilization Survey data. Using several existing and lately developed measures, we inspect the gender wage gap by college major to…

  19. Empirical mode decomposition based background removal and de-noising in polarization interference imaging spectrometer.

    PubMed

    Zhang, Chunmin; Ren, Wenyi; Mu, Tingkui; Fu, Lili; Jia, Chenling

    2013-02-11

    Based on empirical mode decomposition (EMD), background removal and de-noising procedures for the data taken by a polarization interference imaging spectrometer (PIIS) are implemented. Through numerical simulation, the data processing methods are shown to be effective. The assumption that the noise mostly exists in the first intrinsic mode function is verified, and the parameters in the EMD thresholding de-noising methods are determined. For comparison, wavelet- and windowed-Fourier-transform-based thresholding de-noising methods are introduced. The de-noised results are evaluated by the SNR, spectral resolution and peak value of the de-noised spectra. All the methods are used to suppress the effects of Gaussian and Poisson noise. The de-noising efficiency is higher for spectra contaminated by Gaussian noise. The interferogram obtained by the PIIS is processed by the proposed methods; both the interferogram without background and the noise-free spectrum are obtained effectively. The adaptive and robust EMD-based methods are effective for background removal and de-noising in the PIIS. PMID:23481716
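    The thresholding step that underlies such EMD de-noising can be sketched as follows. The universal soft-threshold shown here is a standard choice rather than necessarily the paper's exact rule, and the IMFs themselves are assumed to be computed upstream by an EMD routine (a pure-noise array stands in for the first IMF).

```python
import numpy as np

def universal_soft_threshold(imf):
    """Soft-threshold one intrinsic mode function.

    Sketch of the thresholding step only; the IMF is assumed to come
    from an EMD decomposition performed upstream.
    """
    sigma = np.median(np.abs(imf)) / 0.6745        # robust noise estimate (MAD)
    thr = sigma * np.sqrt(2 * np.log(imf.size))    # universal threshold
    return np.sign(imf) * np.maximum(np.abs(imf) - thr, 0.0)

rng = np.random.default_rng(2)
noisy_imf = rng.normal(0, 1, 1024)   # pure noise should shrink to nearly zero
cleaned = universal_soft_threshold(noisy_imf)
```

Because the noise concentrates in the first IMF, thresholding it and recombining the modes removes most of the noise while leaving the slowly varying interferogram content intact.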

  20. Target detection for low cost uncooled MWIR cameras based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2014-03-01

    In this work, a novel method for detecting low intensity fast moving objects with low cost Medium Wavelength Infrared (MWIR) cameras is proposed. The method is based on background subtraction in a video sequence obtained with a low density Focal Plane Array (FPA) of the newly available uncooled lead selenide (PbSe) detectors. Thermal instability, along with the lack of specific electronics and mechanical devices for canceling the effect of distortion, makes background image identification very difficult. As a result, the identification of targets is performed in low signal to noise ratio (SNR) conditions, which may considerably restrict the sensitivity of the detection algorithm. These problems are addressed in this work by means of a new technique based on empirical mode decomposition, which accomplishes drift estimation and target detection. Given that background estimation is the most important stage for detection, a preliminary denoising step enabling better drift estimation is designed. Comparisons are conducted against a denoising technique based on the wavelet transform, and also with traditional drift estimation methods such as Kalman filtering and the running average. The results reported by the simulations show that the proposed scheme has superior performance.
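    One of the baseline drift estimators mentioned above, the running average, can be sketched as a simple background-subtraction detector. The frame sizes, thresholding rule and synthetic target here are illustrative assumptions; the paper's own method replaces this background model with EMD-based drift estimation.

```python
import numpy as np

def detect_targets(frames, alpha=0.05, k=4.0):
    """Running-average background subtraction (a baseline drift estimator).

    alpha: background update rate; k: detection threshold in units of the
    frame-difference standard deviation. Both values are illustrative.
    """
    bg = frames[0].astype(float)
    masks = []
    for f in frames[1:]:
        diff = np.abs(f - bg)
        thr = k * diff.std() + 1e-9
        masks.append(diff > thr)              # candidate target pixels
        bg = (1 - alpha) * bg + alpha * f     # update background estimate
    return masks

rng = np.random.default_rng(3)
frames = [rng.normal(100, 1, (32, 32)) for _ in range(10)]
frames[-1][16, 16] += 50                      # synthetic bright moving target
masks = detect_targets(np.array(frames))
```

On real PbSe FPA data the thermal drift violates the static-background assumption of this baseline, which is the motivation for the EMD-based drift estimation proposed in the paper.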

  1. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies the region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth of field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research has implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768

  2. Open-circuit sensitivity model based on empirical parameters for a capacitive-type MEMS acoustic sensor

    NASA Astrophysics Data System (ADS)

    Lee, Jaewoo; Jeon, J. H.; Je, C. H.; Lee, S. Q.; Yang, W. S.; Lee, S.-G.

    2016-03-01

    An empirical-based open-circuit sensitivity model for a capacitive-type MEMS acoustic sensor is presented. To intuitively evaluate the characteristic of the open-circuit sensitivity, the empirical-based model is proposed and analysed by using a lumped spring-mass model and a pad test sample without a parallel plate capacitor for the parasitic capacitance. The model is composed of three different parameter groups: empirical, theoretical, and mixed data. The empirical residual stress from the measured pull-in voltage of 16.7 V and the measured surface topology of the diaphragm was extracted as +13 MPa, resulting in the effective spring constant of 110.9 N/m. The parasitic capacitance for two probing pads including the substrate part was 0.25 pF. Furthermore, to verify the proposed model, the modelled open-circuit sensitivity was compared with the measured value. The MEMS acoustic sensor had an open-circuit sensitivity of -43.0 dBV/Pa at 1 kHz with a bias of 10 V, while the modelled open-circuit sensitivity was -42.9 dBV/Pa, which showed good agreement in the range from 100 Hz to 18 kHz. This validates the empirical-based open-circuit sensitivity model for designing capacitive-type MEMS acoustic sensors.

  3. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators that can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in higher education institutions in Thailand. The main purpose of this study was to develop empirical indicators of a…

  4. Evaluation of Physically and Empirically Based Models for the Estimation of Green Roof Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.

    2010-12-01

    Green roofs and other urban green spaces can provide a variety of valuable benefits, including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, air pollution mitigation, etc. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is a necessary tool for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications, and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical yet poorly studied component of urban green space performance prediction, and cite that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup situated on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the 500 m2 green roof of the building, with a roof membrane, drainage layer, 10 cm media depth, and planted with a variety of Sedum species. Soil moisture sensors and qualitative runoff measurements are also recorded in the lysimeter, while a weather station situated on the rooftop records climatologic data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods absent of precipitation and drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for the estimation of ET from green roofs. Results have
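    One widely used empirically based PET model is the Hargreaves equation; the sketch below is illustrative (the abstract does not list which specific PET formulations the study compared against the lysimeter-measured AET, and the example inputs are invented).

```python
import math

def hargreaves_pet(t_mean, t_max, t_min, ra):
    """Hargreaves reference evapotranspiration (mm/day).

    ra: extraterrestrial radiation expressed in mm/day of equivalent
    evaporation; temperatures in degrees Celsius.
    """
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# e.g. a warm summer day (illustrative values)
pet = hargreaves_pet(t_mean=25.0, t_max=32.0, t_min=18.0, ra=15.0)
```

Comparing such temperature-based PET estimates against weighed-lysimeter AET is exactly the kind of evaluation the study describes, since the agricultural assumptions behind the empirical coefficients may not hold on a green roof.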

  5. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
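    The propensity score-based matching used for adjustment can be sketched in miniature. The data here are synthetic, a single confounder stands in for the clinical covariates, and for brevity the propensity score is taken as known rather than estimated by logistic regression as in practice.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
severity = rng.normal(0, 1, n)             # confounder (e.g. severity score)

# "Treatment" (here: inadequate empirical therapy) more likely with
# higher severity, creating confounding that matching must remove.
p_treat = 1 / (1 + np.exp(-severity))
treated = rng.random(n) < p_treat

# Propensity scores: taken as the true model for brevity; in practice
# they are estimated, e.g. by fitting a logistic regression.
ps = p_treat

# 1:1 nearest-neighbour matching (with replacement) on the propensity score
treated_idx = np.where(treated)[0]
control_idx = np.where(~treated)[0]
matches = [(i, control_idx[np.abs(ps[control_idx] - ps[i]).argmin()])
           for i in treated_idx]
```

After matching, outcomes (e.g. 30-day mortality) are compared within matched pairs, which is what yields the adjusted odds ratios reported above.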

  6. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice, with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project-based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  7. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation.

    PubMed

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-01-01

Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method for prompt, informed reconstruction of strain/stress responses at the hot spots of a structure based on strain measurements at remote locations. The structural responses measured by a usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Two numerical examples (a two-span beam and a 19,956-degree-of-freedom simplified airfoil) are then used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam of the kind commonly used in air vehicles and spacecraft. The experiments collect the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions, and a detailed error analysis is also provided. PMID:27537889
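The decomposition step above can be roughly illustrated with a single-IMF sifting pass in the spirit of EMD. This is a simplified sketch, not the authors' code: canonical EMD uses cubic-spline envelopes and a convergence-based stopping criterion, whereas this version uses linear interpolation and a fixed number of sifting iterations.

```python
import numpy as np

def extract_imf(x, n_sift=10):
    """One sifting pass of EMD: repeatedly subtract the mean of the upper
    and lower extrema envelopes. Linear interpolation stands in for the
    cubic splines of canonical EMD."""
    h = np.asarray(x, dtype=float).copy()
    idx = np.arange(len(h))
    for _ in range(n_sift):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema to build envelopes
        upper = np.interp(idx, maxima, h[maxima])
        lower = np.interp(idx, minima, h[minima])
        h -= (upper + lower) / 2.0  # remove the local envelope mean
    return h

# A fast oscillation (a "modal response") riding on a slow trend
t = np.linspace(0.0, 1.0, 500)
signal = np.sin(2 * np.pi * 25 * t) + 0.5 * t
imf = extract_imf(signal)
residue = signal - imf  # the decomposition is exact by construction
```

Repeating the extraction on the residue yields the subsequent, slower IMFs; the full method then maps each modal response through the finite-element transformation equations.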

  8. A hybrid filtering method based on a novel empirical mode decomposition for friction signals

    NASA Astrophysics Data System (ADS)

    Li, Chengwei; Zhan, Liwei

    2015-12-01

During a measurement, the measured signal usually contains noise. To remove the noise while preserving the important features of the signal, we introduce a hybrid filtering method that uses a new intrinsic mode function (NIMF) and a modified Hausdorff distance. The NIMF is defined as the difference between the noisy signal and each intrinsic mode function (IMF), which is obtained by empirical mode decomposition (EMD), ensemble EMD, complementary ensemble EMD, or complete ensemble EMD with adaptive noise (CEEMDAN). Relevant modes are selected based on the similarity between the first NIMF and the rest of the NIMFs. With this filtering method, EMD and its improved versions are used to filter simulated and friction signals. The friction signal between an airplane tire and the runway is recorded during a simulated airplane touchdown and features spikes of various amplitudes plus noise. The filtering effectiveness of the four hybrid filtering methods is compared and discussed. The results show that the filtering method based on CEEMDAN outperforms the other signal filtering methods.
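The NIMF construction follows directly from its definition (noisy signal minus each IMF) and can be sketched as below. The `select_modes` helper is hypothetical, and Pearson correlation is used here as a simple stand-in for the modified Hausdorff distance of the paper.

```python
import numpy as np

def select_modes(x, imfs, threshold=0.95):
    """Form NIMF_k = x - IMF_k, then keep the modes whose NIMF is most
    similar to the first NIMF (correlation as the similarity measure)."""
    nimfs = x - imfs  # one NIMF per IMF row, by broadcasting
    ref = nimfs[0]
    sims = np.array([np.corrcoef(ref, n)[0, 1] for n in nimfs])
    return np.where(sims >= threshold)[0], sims

# Toy decomposition: pretend EMD returned a noise mode and a signal mode
t = np.linspace(0.0, 1.0, 500)
clean = np.sin(2 * np.pi * 2 * t)
noise = 0.3 * np.sin(2 * np.pi * 80 * t)  # deterministic "noise" mode
x = clean + noise
imfs = np.vstack([noise, clean])
selected, sims = select_modes(x, imfs)
```

Here the first IMF is the noise-like mode, so its NIMF (the signal with that mode removed) serves as the reference, and only modes whose removal leaves a similar signal are flagged as relevant.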

  9. Feasibility of an empirically based program for parents of preschoolers with autism spectrum disorder.

    PubMed

    Dababnah, Sarah; Parish, Susan L

    2016-01-01

This article reports on the feasibility of implementing an existing empirically based program, The Incredible Years, tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the intervention. Quantitative assessments of the program revealed fidelity was generally maintained, with the exception of program-specific videos. Qualitative data from individual post-intervention interviews indicated that parents benefited most from child emotion regulation strategies, play-based child behavior skills, parent stress management, social support, and visual resources. More work is needed to further refine the program to address parent self-care, partner relationships, and the diverse behavioral and communication challenges of children across the autism spectrum. Furthermore, parent access and retention could potentially be increased by providing in-home childcare vouchers and a range of times and locations in which to offer the program. The findings suggest The Incredible Years is a feasible intervention for parents seeking additional support for child- and family-related challenges, and offer guidance to those communities currently using The Incredible Years or other related parenting programs with families of children with autism spectrum disorder. PMID:25717131

  10. Empirical mode decomposition-based motion artifact correction method for functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Gu, Yue; Han, Junxia; Liang, Zhenhu; Yan, Jiaqing; Li, Zheng; Li, Xiaoli

    2016-01-01

Functional near-infrared spectroscopy (fNIRS) is a promising technique for monitoring brain activity. However, it is sensitive to motion artifacts. Many methods have been developed for motion correction, such as spline interpolation, wavelet filtering, and kurtosis-based wavelet filtering. We propose a motion correction method based on empirical mode decomposition (EMD), which is applied to segments of data identified as having motion artifacts. The EMD method is adaptive, data-driven, and well suited for nonstationary data. To test the performance of the proposed EMD method and to compare it with other motion correction methods, we used simulated hemodynamic responses added to real resting-state fNIRS data. The EMD method reduced mean squared error in 79% of channels and increased signal-to-noise ratio in 78% of channels. Moreover, it produced the highest Pearson's correlation coefficient between the recovered signal and the original signal, significantly better than the comparison methods (p<0.01, paired t-test). These results indicate that the proposed EMD method is a first-choice method for motion artifact correction in fNIRS.
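The two evaluation metrics reported above (mean squared error and signal-to-noise ratio against the known simulated response) can be computed as follows. This is a generic sketch of the metrics only, not the authors' correction method; the toy artifact and 10% residual are made-up values.

```python
import numpy as np

def mse(estimate, truth):
    """Mean squared error between a recovered signal and the ground truth."""
    return np.mean((estimate - truth) ** 2)

def snr_db(estimate, truth):
    """SNR in dB: power of the true signal over power of the residual error."""
    residual = estimate - truth
    return 10.0 * np.log10(np.sum(truth ** 2) / np.sum(residual ** 2))

# Simulated hemodynamic response, a motion artifact, and a partial correction
truth = np.sin(np.linspace(0.0, 2 * np.pi, 200))
artifact = np.zeros(200)
artifact[80:120] = 1.0               # step-like motion artifact
corrupted = truth + artifact
corrected = truth + 0.1 * artifact   # correction leaves a 10% residual
```

Comparing `mse`/`snr_db` of the corrected channel against the uncorrected one, channel by channel, yields the percentages quoted in the abstract.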

  11. An empirically based steady state friction law and implications for fault stability

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Nielsen, S.; Violay, M.; Di Toro, G.

    2016-04-01

Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces on slip and time. The relevance of RSFLs for earthquake mechanics is that a few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable given the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on a review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of substantial velocity weakening in the 1-20 cm/s range, resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.
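For context, the classical steady-state rate-and-state relation and the associated spring-slider critical stiffness can be written in a few lines. This sketch shows the standard Dieterich-Ruina steady-state form, not the modified law (MFL) proposed in the abstract, and the parameter values are illustrative only.

```python
import numpy as np

def mu_ss(v, mu0=0.60, a=0.010, b=0.015, v0=1e-6):
    """Classical steady-state rate-and-state friction:
    mu_ss(V) = mu0 + (a - b) * ln(V / V0).
    With b > a the fault is velocity weakening."""
    return mu0 + (a - b) * np.log(v / v0)

def critical_stiffness(sigma_n, d_c, a=0.010, b=0.015):
    """Spring-slider stability threshold k_c = (b - a) * sigma_n / D_c;
    stick-slip instability requires system stiffness k < k_c."""
    return (b - a) * sigma_n / d_c

weakening = mu_ss(1e-2) < mu_ss(1e-6)               # friction drops as V rises
k_c = critical_stiffness(sigma_n=100e6, d_c=10e-6)  # Pa/m
```

The MFL's key departure is that the effective (b - a) term is no longer constant with velocity, so k_c develops a peak in the 1-20 cm/s range rather than varying monotonically.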

  12. Neglected Value of Small Population-based Surveys: A Comparison with Demographic and Health Survey Data

    PubMed Central

    Langston, Anne C.; Sarriot, Eric G.

    2015-01-01

We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729

  13. Neglected value of small population-based surveys: a comparison with demographic and health survey data.

    PubMed

    Langston, Anne C; Prosnitz, Debra M; Sarriot, Eric G

    2015-03-01

    We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729

  14. Evaluating the compatibility of physics-based deterministic synthetic ground motion with empirical GMPE

    NASA Astrophysics Data System (ADS)

    Baumann, C.; Dalguer, L. A.

    2012-12-01

Recent development of deterministic physics-based numerical simulations of earthquakes has contributed to substantial advances in our understanding of different aspects related to the earthquake mechanism and near source ground motion. These models have greater potential for identifying and predicting the variability of near-source ground motions dominated by the source and/or geological effects. These advances have led to increased interest in using suites of physics-based models for reliable prediction of ground motion of future earthquakes for seismic hazard assessment and risk mitigation, particularly in areas where there are few recorded ground motions. But before synthetic ground motions are used, it is important to evaluate their reliability, particularly the upper frequency limit. Current engineering practice usually uses ground motion quantities estimated from empirical Ground Motion Predicting Equations (GMPE), such as peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), and spectral ordinates, as input to assess building response for seismic safety of future and existing structures. It is therefore natural to verify the compatibility of synthetic ground motions with current empirical GMPE. In this study we do so for a suite of deterministic ground motions generated by dynamic earthquake rupture models. We focus mainly on determining the upper frequency limit at which the synthetic ground motions are compatible with GMPE. For that purpose we have generated a suite of dynamic earthquake rupture models in a layered 1D velocity structure. The simulations include 360 dynamic rupture models with moment magnitudes in the range of 5.5-7, for three styles of faulting (reverse, normal and strike slip), for both buried faults and surface rupturing faults. Normal stress and frictional strength are depth and non-depth dependent. 
Initial stress distribution follows
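Compatibility checks of this kind are typically done in natural-log units against the GMPE median. Below is a minimal sketch using a generic placeholder attenuation form rather than any published GMPE; the coefficients are invented for illustration.

```python
import numpy as np

def gmpe_ln_pga(m, r_km, coeffs=(-3.5, 1.0, 1.2, 10.0)):
    """Generic attenuation form ln(PGA) = c0 + c1*M - c2*ln(R + c3).
    The coefficients are illustrative placeholders, not a published GMPE."""
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * m - c2 * np.log(r_km + c3)

def log_residuals(pga_synthetic, m, r_km):
    """Residual in ln units between synthetic PGA and the GMPE median;
    a near-zero mean and modest spread suggest compatibility."""
    return np.log(pga_synthetic) - gmpe_ln_pga(m, r_km)

# For the demo, synthetic PGAs placed exactly on the placeholder median
m, r = 6.5, np.array([5.0, 20.0, 50.0])
pga = np.exp(gmpe_ln_pga(m, r))
res = log_residuals(pga, m, r)
```

Repeating the residual computation per spectral ordinate, and noting the frequency at which the residual distribution starts to drift, gives an estimate of the upper frequency limit of the synthetics.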

  15. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    ERIC Educational Resources Information Center

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  16. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  17. PowerPoint-Based Lectures in Business Education: An Empirical Investigation of Student-Perceived Novelty and Effectiveness

    ERIC Educational Resources Information Center

    Burke, Lisa A.; James, Karen E.

    2008-01-01

    The use of PowerPoint (PPT)-based lectures in business classes is prevalent, yet it remains empirically understudied in business education research. The authors investigate whether students in the contemporary business classroom view PPT as a novel stimulus and whether these perceptions of novelty are related to students' self-assessment of…

  18. Empirical Differences in Omission Tendency and Reading Ability in PISA: An Application of Tree-Based Item Response Models

    ERIC Educational Resources Information Center

    Okumura, Taichi

    2014-01-01

    This study examined the empirical differences between the tendency to omit items and reading ability by applying tree-based item response (IRTree) models to the Japanese data of the Programme for International Student Assessment (PISA) held in 2009. For this purpose, existing IRTree models were expanded to contain predictors and to handle…

  19. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They present evidence that an evaluation survey based on a theory that…

  20. Acute traumatic brain injury: is current management evidence based? An empirical analysis of systematic reviews.

    PubMed

    Lei, Jin; Gao, Guoyi; Jiang, Jiyao

    2013-04-01

Traumatic brain injury (TBI) is a major health and socioeconomic problem worldwide with a high rate of death and long-term disability. Previous studies have summarized evidence from large-scale randomized trials, finding no intervention showing convincing efficacy for acute TBI management. The present empirical study set out to assess another crucial component of the evidence base, the systematic review, which contributes substantially to evidence-based health care, in terms of clinical issues, methodological aspects, and implications for practice and research. A total of 44 systematic reviews pertaining to therapeutic interventions for acute TBI were identified through electronic database searching, clinical guideline retrieval, and expert consultation, of which 21 were published in the Cochrane Library and 23 in peer-reviewed journals. Their methodological quality was generally satisfactory, with a median Overview Quality Assessment Questionnaire score of 5.5 (interquartile range 2-7). Cochrane reviews were of better quality than regular journal reviews. Twenty-nine high-quality reviews provided no conclusive evidence for the 22 investigated interventions, except for an adverse effect of corticosteroids. Fewer than one-third of the component trials reported adequate allocation concealment. Other methodological flaws in design, for example ignoring heterogeneity within the TBI population, also contributed to the failure of past clinical research. Based on the above findings, evidence from both systematic reviews and clinical trials does not fully support current management of acute TBI. Translating laboratory success into clinical effect remains a unique challenge. Accordingly, it may be time to rethink future practice and clinical research in TBI. PMID:23151044

  1. Ship classification using nonlinear features of radiated sound: an approach based on empirical mode decomposition.

    PubMed

    Bao, Fei; Li, Chen; Wang, Xinlong; Wang, Qingfu; Du, Shuanping

    2010-07-01

Classification of ship-radiated underwater sound is one of the most important and challenging subjects in underwater acoustical signal processing. An approach to ship classification is proposed in this work based on analysis of ship-radiated acoustical noise in subspaces of intrinsic mode functions obtained via ensemble empirical mode decomposition. It is shown that detection and acquisition of stable and reliable nonlinear features become practically feasible through nonlinear analysis of the time series of individual decomposed components, each of which is simple enough and well represents an oscillatory mode of ship dynamics. Surrogate and nonlinear predictability analyses are conducted to probe and measure the nonlinearity and regularity. The results of both methods, which verify each other, substantiate that ship-radiated noises contain components with deterministic nonlinear features that serve well for efficient classification of ships. The approach may open an alternative avenue toward object classification and identification, and may also offer a new view of signals as complex as ship-radiated sound. PMID:20649216

  2. Percentile-based Empirical Distribution Function Estimates for Performance Evaluation of Healthcare Providers

    PubMed Central

    Paddock, Susan M.; Louis, Thomas A.

    2010-01-01

Hierarchical models are widely used to characterize the performance of individual healthcare providers. However, little attention has been devoted to system-wide performance evaluations, the goals of which include identifying extreme (e.g., top 10%) provider performance and developing statistical benchmarks to define high-quality care. Obtaining optimal estimates of these quantities requires estimating the empirical distribution function (EDF) of provider-specific parameters that generate the dataset under consideration. However, the difficulty of obtaining uncertainty bounds for a squared-error-loss-minimizing EDF estimate has hindered its use in system-wide performance evaluations. We therefore develop and study a percentile-based EDF estimate for univariate provider-specific parameters. We compute order statistics of samples drawn from the posterior distribution of provider-specific parameters to obtain relevant uncertainty assessments of an EDF estimate and its features, such as thresholds and percentiles. We apply our method to data from the Medicare End Stage Renal Disease (ESRD) Program, a health insurance program for people with irreversible kidney failure. We highlight the risk of misclassifying providers as exceptionally good or poor performers when uncertainty in statistical benchmark estimates is ignored. Given the high stakes of performance evaluations, statistical benchmarks should be accompanied by precision estimates. PMID:21918583
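The order-statistics idea can be sketched in a few lines: sort the provider-specific parameters within each posterior draw, then summarize across draws. The `edf_percentiles` helper below is a hypothetical simplification of the authors' procedure, applied to synthetic posterior draws.

```python
import numpy as np

def edf_percentiles(draws, q):
    """draws: (n_samples, n_providers) posterior draws of provider-specific
    parameters. Sorting within each draw gives the order statistics; a
    percentile across draws then summarizes the EDF with uncertainty."""
    ordered = np.sort(draws, axis=1)        # order statistics per draw
    return np.percentile(ordered, q, axis=0)

# Synthetic posterior draws around 20 hypothetical provider effects
rng = np.random.default_rng(1)
true_effects = np.linspace(-1.0, 1.0, 20)
draws = true_effects + rng.normal(0.0, 0.3, size=(400, 20))

median_edf = edf_percentiles(draws, 50)     # point estimate of the EDF
band_lo = edf_percentiles(draws, 10)        # lower uncertainty band
band_hi = edf_percentiles(draws, 90)        # upper uncertainty band
```

The spread between the bands at a benchmark threshold (say, the top decile) quantifies how confidently a provider can be classified as an extreme performer.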

  3. Pseudo-empirical Likelihood-Based Method Using Calibration for Longitudinal Data with Drop-Out

    PubMed Central

    Chen, Baojiang; Zhou, Xiao-Hua; Chan, Kwun Chuen Gary

    2014-01-01

In observational studies, interest mainly lies in estimation of the population-level relationship between the explanatory variables and dependent variables, and the estimation is often undertaken using a sample of longitudinal data. In some situations, the longitudinal data sample features biases and loss of estimation efficiency due to non-random drop-out. However, inclusion of population-level information can increase estimation efficiency. In this paper we propose an empirical likelihood-based method to incorporate population-level information in a longitudinal study with drop-out. The population-level information is incorporated via constraints on functions of the parameters, and non-random drop-out bias is corrected by using a weighted generalized estimating equations method. We provide a three-step estimation procedure that makes computation easier. Some commonly used methods are compared in simulation studies, which demonstrate that our proposed method can correct the non-random drop-out bias and increase the estimation efficiency, especially for small sample size or when the missing proportion is high. In some situations, the efficiency improvement is substantial. Finally, we apply this method to an Alzheimer’s disease study. PMID:25587200
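The drop-out correction can be illustrated with inverse-probability weighting, the core idea behind the weighted GEE step described above. This is a toy sketch with assumed (known) observation probabilities; in a real analysis those probabilities would themselves be estimated from the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two strata with very different outcome levels
n = 5000
stratum = rng.integers(0, 2, size=n)
y = np.where(stratum == 1, 10.0, 0.0) + rng.normal(0.0, 1.0, size=n)

# Non-random drop-out: the high-outcome stratum is rarely observed
p_obs = np.where(stratum == 1, 0.3, 0.9)
observed = rng.random(n) < p_obs

naive = y[observed].mean()  # biased: ignores the selective drop-out
# Inverse-probability weights undo the selection, as in weighted GEE
w = 1.0 / p_obs[observed]
weighted = np.sum(w * y[observed]) / np.sum(w)
truth = y.mean()
```

The naive complete-case mean is pulled far below the population mean, while the weighted estimate recovers it; the empirical-likelihood constraints in the paper add population-level information on top of this correction.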

  4. Empirical Study of User Preferences Based on Rating Data of Movies.

    PubMed

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings. PMID:26735847

  5. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    NASA Astrophysics Data System (ADS)

    Le, Chengfeng; Lehrter, John C.; Hu, Chuanmin; Obenour, Daniel R.

    2016-03-01

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L-1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and volume were related to Moderate Resolution Imaging Spectroradiometer-derived monthly estimates of river plume area (km2) and average, inner shelf chlorophyll a concentration (Chl a, mg m-3). River plume area in June was negatively related with midsummer hypoxic area (km2) and volume (km3), while July inner shelf Chl a was positively related to hypoxic area and volume. Multiple regression models using river plume area and Chl a as independent variables accounted for most of the variability in hypoxic area (R2 = 0.92) or volume (R2 = 0.89). These models explain more variation in hypoxic area than models using Mississippi River nutrient loads as independent variables. The results here also support a hypothesis that confinement of the river plume to the inner shelf is an important mechanism controlling hypoxia area and volume in this region.
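A two-predictor regression of the form used here (hypoxic area as a function of June plume area and July Chl a) can be sketched with ordinary least squares. The numbers below are made-up placeholders, not the paper's data; they are constructed with a negative plume effect and a positive Chl a effect to mirror the reported signs.

```python
import numpy as np

def ols_r2(y, *predictors):
    """Fit y ~ intercept + predictors by least squares; return (coef, R^2)."""
    X = np.column_stack([np.ones_like(y)] + list(predictors))
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
    return coef, r2

# Made-up annual values standing in for June plume area, July Chl a,
# and midsummer hypoxic area
plume = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
chl = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
hypoxic_area = 20.0 - 0.5 * plume + 2.0 * chl
coef, r2 = ols_r2(hypoxic_area, plume, chl)
```

The fitted coefficient signs (negative on plume area, positive on Chl a) and the R² are the quantities the abstract reports for the real satellite-derived series.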

  6. Polarizable Empirical Force Field for Aromatic Compounds Based on the Classical Drude Oscillator

    PubMed Central

    Lopes, Pedro E. M.; Lamoureux, Guillaume; Roux, Benoit; MacKerell, Alexander D.

    2008-01-01

    The polarizable empirical CHARMM force field based on the classical Drude oscillator has been extended to the aromatic compounds benzene and toluene. Parameters were optimized for benzene and then transferred directly to toluene, with parameters for the methyl moiety of toluene taken from the previously published work on the alkanes. Optimization of all parameters was performed against an extensive set of quantum mechanical and experimental data. Ab initio data was used for determination of the electrostatic parameters, the vibrational analysis, and in the optimization of the relative magnitudes of the Lennard-Jones parameters. The absolute values of the Lennard-Jones parameters were determined by comparing computed and experimental heats of vaporization, molecular volumes, free energies of hydration and dielectric constants. The newly developed parameter set was extensively tested against additional experimental data such as vibrational spectra in the condensed phase, diffusion constants, heat capacities at constant pressure and isothermal compressibilities including data as a function of temperature. Moreover, the structure of liquid benzene, liquid toluene and of solutions of each in water were studied. In the case of benzene, the computed and experimental total distribution function were compared, with the developed model shown to be in excellent agreement with experiment. PMID:17388420

  7. Empirical Study of User Preferences Based on Rating Data of Movies

    PubMed Central

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings. PMID:26735847

  8. Empirical prediction of Indian summer monsoon rainfall with different lead periods based on global SST anomalies

    NASA Astrophysics Data System (ADS)

    Pai, D. S.; Rajeevan, M.

    2006-02-01

The main objective of this study was to develop empirical models with different seasonal lead time periods for the long range prediction of seasonal (June to September) Indian summer monsoon rainfall (ISMR). For this purpose, 13 predictors having significant and stable relationships with ISMR were derived by the correlation analysis of global grid point seasonal Sea-Surface Temperature (SST) anomalies and the tendency in the SST anomalies. The time lags of the seasonal SST anomalies were varied from 1 season to 4 years behind the reference monsoon season. The basic SST data set used was the monthly NOAA Extended Reconstructed Global SST (ERSST) data at 2° × 2° spatial grid for the period 1951-2003. The time lags of the 13 predictors derived from various areas of all three tropical ocean basins (Indian, Pacific and Atlantic Oceans) varied from 1 season to 3 years. Based on these inter-correlated predictors, 3 predictor subsets A, B and C were formed with prediction lead time periods of 0, 1 and 2 seasons, respectively, from the beginning of the monsoon season. The selected principal components (PCs) of these predictor sets were used as the input parameters for the models A, B and C, respectively. The model development period was 1955-1984. The correct model size was derived using the all-possible-regressions procedure and Mallows' Cp statistic.
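Using selected PCs as regression inputs, as described above, amounts to principal-component regression. Below is a generic sketch with synthetic data standing in for the SST-derived predictors and rainfall series; the `pcr_fitted` helper is an illustration, not the authors' model.

```python
import numpy as np

def pcr_fitted(X, y, n_pc):
    """Principal-component regression: standardize the predictors, project
    onto the leading n_pc principal components, and least-squares fit y on
    those PC scores. Returns the fitted values."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_pc].T                   # PC scores as model inputs
    A = np.column_stack([np.ones(len(y)), scores])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

# Synthetic stand-ins for SST-derived predictors and seasonal rainfall
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
rain = X @ np.array([1.0, 2.0, 0.0, 0.0, 1.0]) + 3.0
fitted = pcr_fitted(X, rain, n_pc=5)
```

Retaining only the leading PCs (n_pc < 5) trades some fit for stability against the strong inter-correlation among the predictors, which is the reason PCs are used in the study.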

  9. The mature minor: some critical psychological reflections on the empirical bases.

    PubMed

    Partridge, Brian C

    2013-06-01

Moral and legal notions engaged in clinical ethics should possess not only analytic clarity but also a sound basis in empirical findings. The latter condition brings into question the expansion of the mature minor exception. The mature minor exception in the healthcare law of the United States has served to enable those under the legal age to consent to medical treatment. Although originally developed primarily for minors in emergency or quasi-emergency need for health care, it was expanded especially from the 1970s in order to cover unemancipated minors older than 14 years. This expansion initially appeared plausible, given psychological data that showed the intellectual capacity of minors over 14 to recognize the causal connection between their choices and the consequences of their choices. However, subsequent psychological studies have shown that minors generally fail to have realistic affective and evaluative appreciations of the consequences of their decisions, because they tend to over-emphasize short-term benefits and underestimate long-term risks. Also, unlike most decision-makers over 21, the decisions of minors are more often marked by the lack of adequate impulse control, all of which is reflected in the far higher involvement of adolescents in acts of violence, intentional injury, and serious automobile accidents. These effects are more evident in circumstances that elicit elevated affective responses. The advent of brain imaging has allowed the actual visualization of qualitative differences between how minors versus persons over the age of 21 generally assess risks and benefits and make decisions. In the case of most under the age of 21, subcortical systems are not adequately checked by the prefrontal systems that are involved in adult executive decisions. 
The neuroanatomical and psychological model developed by Casey, Jones, and Somerville offers an empirical insight into the qualitative differences in the neuroanatomical and neuropsychological bases

  10. Selective Survey of Online Access to Social Science Data Bases

    ERIC Educational Resources Information Center

    Donati, Robert

    1977-01-01

Data bases in the social sciences are briefly surveyed with emphasis on content, coverage, and currency. Coverage of certain topics is quantitatively compared among several social science and more general files. Techniques for online thesaurus utilization and a systematic application of the same search strategies across multiple files are…

  11. Place Based Assistance Tools: Networking and Resident Surveys.

    ERIC Educational Resources Information Center

    Department of Housing and Urban Development, Washington, DC.

    "Place-based assistance" is not a new concept. Asking what people want and finding ways to give it to them sounds simplistic, but it can result in "win-win" solutions in which everyone involved benefits. This document is a guide to using networking and surveys of residents to determine community needs. Some case studies show networking and surveys…

  12. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  13. Space-based infrared surveys of small bodies

    NASA Astrophysics Data System (ADS)

    Mommert, M.

    2014-07-01

    Most small bodies in the Solar System are too small and too distant to be spatially resolved, precluding a direct diameter derivation. Furthermore, measurements of the optical brightness alone only allow a rough estimate of the diameter, since the surface albedo is usually unknown and can have values between about 3 % and 60 % or more. The degeneracy can be resolved by considering the thermal emission of these objects, which is less prone to albedo effects and mainly a function of the diameter. Hence, the combination of optical and thermal-infrared observational data provides a means to independently derive an object's diameter and albedo. This technique is used in asteroid thermal models or more sophisticated thermophysical models (see, e.g., [1]). Infrared observations require cryogenic detectors and/or telescopes, depending on the actual wavelength range observed. Observations from the ground are additionally compromised by the variable transparency of Earth's atmosphere in major portions of the infrared wavelength ranges. Hence, space-based infrared telescopes, providing stable conditions and significantly better sensitivities than ground-based telescopes, are now used routinely to exploit this wavelength range. Two observation strategies are used with space-based infrared observatories: Space-based Infrared All-Sky Surveys. Asteroid surveys in the thermal infrared are less prone to albedo-related discovery bias compared to surveys with optical telescopes, providing a more complete picture of small body populations. The first space-based infrared survey of Solar System small bodies was performed with the Infrared Astronomical Satellite (IRAS) for 10 months in 1983. In the course of the 'IRAS Minor Planet Survey' [2], 2228 asteroids (3 new discoveries) and more than 25 comets (6 new discoveries) were observed. More recent space-based infrared all-sky asteroid surveys were performed by Akari (launched 2006) and the Wide-field Infrared Survey Explorer (WISE

  14. Survey-based Indices for Nursing Home Quality Incentive Reimbursement

    PubMed Central

    Willemain, Thomas R.

    1983-01-01

Incentive payments are a theoretically appealing complement to nursing home quality assurance systems that rely on regulatory enforcement. However, the practical aspects of incentive program design are not yet well understood. After reviewing the rationale for incentive approaches and recent State and Federal initiatives, the article considers a basic program design issue: creating an index of nursing home quality. It focuses on indices constructed from routine licensure and certification survey results, because State initiatives have relied heavily on these readily accessible data. It also suggests a procedure for creating a survey-based index and discusses a sampling of implementation issues. PMID:10309858
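One simple form such a survey-based index could take is a severity-weighted count of deficiency citations; the weights and facilities below are hypothetical illustrations, not taken from the article:

```python
# Illustrative severity weights for deficiency citations drawn from
# licensure/certification survey results (hypothetical values).
weights = {"minor": 1.0, "moderate": 3.0, "severe": 9.0}

def quality_index(deficiencies, weights):
    """Weighted deficiency score: a higher value indicates lower quality."""
    return sum(weights[sev] * n for sev, n in deficiencies.items())

facility_a = {"minor": 4, "moderate": 1, "severe": 0}
facility_b = {"minor": 0, "moderate": 2, "severe": 1}

ia = quality_index(facility_a, weights)
ib = quality_index(facility_b, weights)
```

Under this sketch, facility B (one severe and two moderate deficiencies) scores worse than facility A despite having fewer citations in total, which is the point of severity weighting.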

  15. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
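As a hedged illustration of the design-based side, a stratified-sampling estimate of mean mollusk density might look like the sketch below; the strata, sizes, and counts are invented for the example:

```python
import numpy as np

# Design-based estimate of mean density (per sampling unit) under
# stratified random sampling; data are illustrative.
strata = {
    # stratum: (number of units in stratum N_h, sampled counts y_hi)
    "riffle": (200, np.array([12, 9, 15, 11])),
    "pool":   (800, np.array([1, 0, 2, 0, 1, 3])),
}

N = sum(Nh for Nh, _ in strata.values())

# Stratified (design-based) estimator: sum_h (N_h / N) * ybar_h
ybar_st = sum((Nh / N) * y.mean() for Nh, y in strata.values())
```

The estimator weights each stratum mean by the stratum's share of the population of sampling units, which is what makes the inference design-based: its validity rests on the randomization, not on a model for the counts.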

  16. Selecting Great Lakes streams for lampricide treatment based on larval sea lamprey surveys

    USGS Publications Warehouse

    Christie, Gavin C.; Adams, Jean V.; Steeves, Todd B.; Slade, Jeffrey W.; Cuddy, Douglas W.; Fodale, Michael F.; Young, Robert J.; Kuc, Miroslaw; Jones, Michael L.

    2003-01-01

    The Empiric Stream Treatment Ranking (ESTR) system is a data-driven, model-based, decision tool for selecting Great Lakes streams for treatment with lampricide, based on estimates from larval sea lamprey (Petromyzon marinus) surveys conducted throughout the basin. The 2000 ESTR system was described and applied to larval assessment surveys conducted from 1996 to 1999. A comparative analysis of stream survey and selection data was conducted and improvements to the stream selection process were recommended. Streams were selected for treatment based on treatment cost, predicted treatment effectiveness, and the projected number of juvenile sea lampreys produced. On average, lampricide treatments were applied annually to 49 streams with 1,075 ha of larval habitat, killing 15 million larval and 514,000 juvenile sea lampreys at a total cost of $5.3 million, and marginal and mean costs of $85 and $10 per juvenile killed. The numbers of juvenile sea lampreys killed for given treatment costs showed a pattern of diminishing returns with increasing investment. Of the streams selected for treatment, those with > 14 ha of larval habitat targeted 73% of the juvenile sea lampreys for 60% of the treatment cost. Suggested improvements to the ESTR system were to improve accuracy and precision of model estimates, account for uncertainty in estimates, include all potentially productive streams in the process (not just those surveyed in the current year), consider the value of all larvae killed during treatment (not just those predicted to metamorphose the following year), use lake-specific estimates of damage, and establish formal suppression targets.

  17. A novel signal compression method based on optimal ensemble empirical mode decomposition for bearing vibration signals

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Tse, Peter W.

    2013-01-01

Today, remote machine condition monitoring is popular due to the continuous advancement in wireless communication. The bearing is the most frequently and easily failed component in many rotating machines. To accurately identify the type of bearing fault, large amounts of vibration data need to be collected. However, the volume of transmitted data cannot be too high because the bandwidth of wireless communication is limited. To solve this problem, the data are usually compressed before transmission to a remote maintenance center. This paper proposes a novel signal compression method that can substantially reduce the amount of data to be transmitted without sacrificing the accuracy of fault identification. The proposed method is based on ensemble empirical mode decomposition (EEMD), an effective method for adaptively decomposing a vibration signal into different bands of signal components, termed intrinsic mode functions (IMFs). An optimization method was designed to automatically select appropriate EEMD parameters for the analyzed signal, in particular the appropriate level of the added white noise in the EEMD method. An index termed the relative root-mean-square error was used to evaluate the decomposition performance under different noise levels to find the optimal level. After applying the optimal EEMD method to a vibration signal, the IMF relating to the bearing fault can be extracted from the original vibration signal. Compressing this signal component leaves a much smaller proportion of data samples to be retained for transmission and later reconstruction. The proposed compression method was also compared with the popular wavelet compression method.
Experimental results demonstrate that the optimization of EEMD parameters can automatically find appropriate EEMD parameters for the analyzed signals, and the IMF-based compression method provides a higher compression ratio, while retaining the bearing defect
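The relative root-mean-square error index used above to rank candidate noise levels can be sketched as follows; the candidate "reconstructions" here are simple noisy stand-ins for actual EEMD runs at different added-noise levels, so only the selection rule itself is illustrated:

```python
import numpy as np

def rrmse(x, x_hat):
    """Relative root-mean-square error between a signal and a reconstruction."""
    return np.sqrt(np.mean((x - x_hat) ** 2) / np.mean(x ** 2))

# Stand-ins for reconstructions obtained with different added-noise levels.
rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 20 * np.pi, 2000))
candidates = {0.1: x + rng.normal(0, 0.05, x.size),
              0.2: x + rng.normal(0, 0.20, x.size),
              0.4: x + rng.normal(0, 0.40, x.size)}

# Selection rule: keep the noise level whose reconstruction minimizes RRMSE.
best_level = min(candidates, key=lambda k: rrmse(x, candidates[k]))
```

In the actual method, each candidate would be the signal reconstructed from an EEMD run with that white-noise level, and the minimizing level would be used for the final decomposition.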

  18. Survey on Existing Science Gateways - Based on DEGREE

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Claus, Steffen

    2010-05-01

Science Gateways gather community-developed tools, applications, and data, usually combined and presented in a graphical user interface that is customized to the needs of the target user community. Science Gateways serve as a single point of entry for their users and are usually implemented as fat clients or web portals. Part of the DEGREE project (Dissemination and Exploitation of Grids in Earth Science) was a state-of-the-art survey of portal usage in Earth Science (ES) applications. This survey considered a list of 29 portals, including 17 ES portals and 12 generic developments coming from outside the ES domain. The survey identified three common usage types of ES portals: data dissemination (e.g. observational data), collaboration, and use of Grid-based resources (e.g. for processing of ES datasets). Based on these three usage types, key requirements could be extracted. These requirements were furthermore used for a feature comparison with existing portal developments from outside the ES domain. This presentation gives an overview of the results of the survey (including a feature comparison of ES and non-ES portals). Furthermore, three portals are discussed in detail, one for each usage type (data dissemination, collaboration, Grid-based).

  19. Implementing community-based provider participation in research: an empirical study

    PubMed Central

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  20. Questionnaire surveys: methodological and epistemological problems for field-based ethnopharmacologists.

    PubMed

    Edwards, Sarah; Nebel, Sabine; Heinrich, Michael

    2005-08-22

    The classical scientific approach is empirical. One of the favoured means of gathering quantitative data in the health and social sciences, including ethnopharmacology and medical ethnobotany, is by use of questionnaires. However, while there are numerous published articles discussing the importance of questionnaire content, the fact that questionnaires themselves may be inappropriate in a number of cultural contexts, even where literacy is not a factor, is usually ignored. In this paper, the authors will address the main issues posed by the use of questionnaire surveys, using case studies based on their own personal experiences as ethnopharmacologists 'in the field'. The pros and cons of qualitative and quantitative research and the use of alternative means to elicit quantitative data will be discussed. PMID:16009518

  1. How much does participatory flood management contribute to stakeholders' social capacity building? Empirical findings based on a triangulation of three evaluation approaches

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Menzel, S.; Home, R.

    2013-06-01

Recent literature suggests that dialogic forms of risk communication are more effective in building stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of univocal empirical evidence on the relevance of these effects, mainly due to the methodological limitations of existing evaluation approaches. In our paper we aim to elicit the contribution of participatory river revitalisation projects to stakeholders' social capacity building by triangulating the findings of three evaluation studies based on different approaches: a field-experimental, a qualitative long-term ex-post, and a cross-sectional household survey approach. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvements should be more explicitly designed as tools for long-term social learning.

  2. Assessment of diffuse trace metal inputs into surface waters - Combining empirical estimates with process based simulations

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Steinz, André; Schmidt, Jürgen

    2015-04-01

As a result of mining activities since the 13th century, surface waters of the German Mulde catchment suffer from deleterious dissolved and sediment-attached lead (Pb) and zinc (Zn) inputs. The leaching rate of trace metals with drainage water is a significant criterion for assessing trace metal concentrations of soils and the associated risks of groundwater pollution. However, the vertical transport rates of trace metals in soils are difficult to quantify. Monitoring is restricted to small lysimeter plots, which limits the transferability of results. Additionally, the solid-liquid transfer conditions in soils are highly variable, primarily due to the fluctuating retention time of percolating soil water. In contrast, lateral sediment-attached trace metal inputs are mostly associated with soil erosion and the resulting sediment inputs into surface waters. Since soil erosion by water is related to rare single events, monitoring and empirical estimates reveal visible shortcomings. This gap in knowledge can only be closed by process-based model calculations. These calculations have to consider that Pb and Zn are predominantly attached to the fine-grained soil particles (<0.063 mm). The selective nature of soil erosion causes a preferential transport of these fine particles, while less contaminated larger particles remain on site. Consequently, trace metals are enriched in the eroded sediment compared to the source soil. This paper aims to introduce both a new method that allows the assessment of trace metal leaching rates from contaminated top soils under standardised transfer conditions and a process-based modelling approach for sediment-attached trace metal inputs into surface waters. Pb and Zn leaching rates amount to 20 Mg ha-1 yr-1 and 114 Mg ha-1 yr-1, respectively. Deviations from the observed dissolved trace metal yields at the Bad Düben gauging station are caused by plant uptake and subsoil retention.
Sediment-attached Pb and Zn input rates amount to 114 Mg ha-1 yr

  3. THE COS-HALOS SURVEY: AN EMPIRICAL DESCRIPTION OF METAL-LINE ABSORPTION IN THE LOW-REDSHIFT CIRCUMGALACTIC MEDIUM

    SciTech Connect

    Werk, Jessica K.; Prochaska, J. Xavier; Tripp, Todd M.; O'Meara, John M.; Peeples, Molly S.

    2013-02-15

We present the equivalent width and column density measurements for low and intermediate ionization states of the circumgalactic medium (CGM) surrounding 44 low-z, L ≈ L* galaxies drawn from the COS-Halos survey. These measurements are derived from far-UV transitions observed in HST/COS and Keck/HIRES spectra of background quasars within an impact parameter R < 160 kpc to the targeted galaxies. The data show significant metal-line absorption for 33 of the 44 galaxies, including quiescent systems, revealing the common occurrence of a cool (T ≈ 10^4-10^5 K), metal-enriched CGM. The detection rates and column densities derived for these metal lines decrease with increasing impact parameter, a trend we interpret as a declining metal surface density profile for the CGM. A comparison of the relative column densities of adjacent ionization states indicates that the gas is predominantly ionized. The large surface density in metals demands a large reservoir of metals and gas in the cool CGM (very conservatively, M^cool_CGM > 10^9 M_Sun), which likely traces a distinct density and/or temperature regime from the highly ionized CGM traced by O^+5 absorption. The large dispersion in absorption strengths (including non-detections) suggests that the cool CGM traces a wide range of densities or a mix of local ionizing conditions. Lastly, the kinematics inferred from the metal-line profiles are consistent with the cool CGM being bound to the dark matter halos hosting the galaxies; this gas may serve as fuel for future star formation. Future work will leverage this data set to provide estimates on the mass, metallicity, dynamics, and origin of the cool CGM in low-z, L* galaxies.

  4. Competence-based demands made of senior physicians: an empirical study to evaluate leadership competencies.

    PubMed

    Lehr, Bosco; Ostermann, Herwig; Schubert, Harald

    2011-01-01

As a result of increased economising in German hospitals, changes are evolving in how the deployment of senior medical staff is organised, and new demands are made of senior hospital management. Leadership competencies in the training and development of physicians are of prime importance to the successful perception of managerial responsibilities. The present study investigates the actual and targeted demands of leadership made of senior medical staff in terms of how these demands are perceived. To this end, the demands of leadership were surveyed using a competence-based questionnaire and investigated with a view to professional-development potential, using the example of the senior management of psychiatric hospitals in Germany. In all, the results show high ratings in personal performance, the greatest significance being attributed to value-oriented competence in the actual assessment of demands on leadership. Besides gender-specific differences in the actual assessments of single fields of competence, the greatest differences between the targeted and the actual demands are, in all, shown to be in the competencies of self-management and communication. Competence-based core areas in leadership can be demonstrated for the professional development of physicians and an adaptive mode of procedure deduced. PMID:22176981

  5. Simulation of Long Lived Tracers Using an Improved Empirically Based Two-Dimensional Model Transport Algorithm

    NASA Technical Reports Server (NTRS)

    Fleming, E. L.; Jackman, C. H.; Stolarski, R. S.; Considine, D. B.

    1998-01-01

    We have developed a new empirically-based transport algorithm for use in our GSFC two-dimensional transport and chemistry model. The new algorithm contains planetary wave statistics, and parameterizations to account for the effects due to gravity waves and equatorial Kelvin waves. As such, this scheme utilizes significantly more information compared to our previous algorithm which was based only on zonal mean temperatures and heating rates. The new model transport captures much of the qualitative structure and seasonal variability observed in long lived tracers, such as: isolation of the tropics and the southern hemisphere winter polar vortex; the well mixed surf-zone region of the winter sub-tropics and mid-latitudes; the latitudinal and seasonal variations of total ozone; and the seasonal variations of mesospheric H2O. The model also indicates a double peaked structure in methane associated with the semiannual oscillation in the tropical upper stratosphere. This feature is similar in phase but is significantly weaker in amplitude compared to the observations. The model simulations of carbon-14 and strontium-90 are in good agreement with observations, both in simulating the peak in mixing ratio at 20-25 km, and the decrease with altitude in mixing ratio above 25 km. We also find mostly good agreement between modeled and observed age of air determined from SF6 outside of the northern hemisphere polar vortex. However, observations inside the vortex reveal significantly older air compared to the model. This is consistent with the model deficiencies in simulating CH4 in the northern hemisphere winter high latitudes and illustrates the limitations of the current climatological zonal mean model formulation. The propagation of seasonal signals in water vapor and CO2 in the lower stratosphere showed general agreement in phase, and the model qualitatively captured the observed amplitude decrease in CO2 from the tropics to midlatitudes. However, the simulated seasonal

  6. Model Selection for Equating Testlet-Based Tests in the NEAT Design: An Empirical Study

    ERIC Educational Resources Information Center

    He, Wei; Li, Feifei; Wolfe, Edward W.; Mao, Xia

    2012-01-01

For tests solely composed of testlets, the local item independence assumption tends to be violated. This study, using empirical data from a large-scale state assessment program, investigated the effects of using different models on equating results under the non-equivalent group anchor-test (NEAT) design. Specifically, the…

  7. Universal Design for Instruction in Postsecondary Education: A Systematic Review of Empirically Based Articles

    ERIC Educational Resources Information Center

    Roberts, Kelly D.; Park, Hye Jin; Brown, Steven; Cook, Bryan

    2011-01-01

    Universal Design for Instruction (UDI) in postsecondary education is a relatively new concept/framework that has generated significant support. The purpose of this literature review was to examine existing empirical research, including qualitative, quantitative, and mixed methods, on the use of UDI (and related terms) in postsecondary education.…

  8. Comparisons of experiment with cellulose models based on electronic structure and empirical force field theories

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Studies of cellobiose conformations with HF/6-31G* and B3LYP/6-31+G*quantum theory [1] gave a reference for studies with the much faster empirical methods such as MM3, MM4, CHARMM and AMBER. The quantum studies also enable a substantial reduction in the number of exo-cyclic group orientations that...

  9. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

There has been increasing interest in using web-based surveys, rather than paper-based surveys, to collect data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection, especially when computer-assisted…

  10. Teacher-Reported Use of Empirically Validated and Standards-Based Instructional Approaches in Secondary Mathematics

    ERIC Educational Resources Information Center

    Gagnon, Joseph Calvin; Maccini, Paula

    2007-01-01

    A random sample of 167 secondary special and general educators who taught math to students with emotional and behavioral disorders (EBD) and learning disabilities (LD) responded to a mail survey. The survey examined teacher perceptions of (a) definition of math; (b) familiarity with course topics; (c) effectiveness of methods courses; (d)…

  11. HIV testing in national population-based surveys: experience from the Demographic and Health Surveys.

    PubMed Central

    Mishra, Vinod; Vaessen, Martin; Boerma, J. Ties; Arnold, Fred; Way, Ann; Barrere, Bernard; Cross, Anne; Hong, Rathavuth; Sangha, Jasbir

    2006-01-01

OBJECTIVES: To describe the methods used in the Demographic and Health Surveys (DHS) to collect nationally representative data on the prevalence of human immunodeficiency virus (HIV) and assess the value of such data to country HIV surveillance systems. METHODS: During 2001-04, national samples of adult women and men in Burkina Faso, Cameroon, Dominican Republic, Ghana, Mali, Kenya, United Republic of Tanzania and Zambia were tested for HIV. Dried blood spot samples were collected for HIV testing, following internationally accepted ethical standards. The results for each country are presented by age, sex, and urban versus rural residence. To estimate the effects of non-response, HIV prevalence among non-responding males and females was predicted using multivariate statistical models fitted to those who were tested, with a common set of predictor variables. RESULTS: Rates of HIV testing varied from 70% among Kenyan men to 92% among women in Burkina Faso and Cameroon. Despite large differences in HIV prevalence between the surveys (1-16%), fairly consistent patterns of HIV infection were observed by age, sex and urban versus rural residence, with considerably higher rates in urban areas and in women, especially at younger ages. Analysis of non-response bias indicates that although predicted HIV prevalence tended to be higher in non-tested males and females than in those tested, the overall effects of non-response on the observed national estimates of HIV prevalence are insignificant. CONCLUSIONS: Population-based surveys can provide reliable, direct estimates of national and regional HIV seroprevalence among men and women irrespective of pregnancy status. Survey data greatly enhance surveillance systems and the accuracy of national estimates in generalized epidemics. PMID:16878227
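The non-response check described above (predicting prevalence for non-tested individuals from a model fitted to those who were tested) can be sketched with synthetic data; the covariates and coefficients are illustrative, and a simple linear probability model stands in for the survey's multivariate models:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic respondents: intercept, urban (1) vs rural (0), and age.
n = 500
X = np.column_stack([np.ones(n),
                     rng.integers(0, 2, n),      # urban indicator
                     rng.integers(15, 50, n)])   # age in years
true_beta = np.array([-0.05, 0.08, 0.003])       # hypothetical coefficients
p = np.clip(X @ true_beta, 0, 1)
hiv = rng.random(n) < p                          # simulated serostatus

tested = rng.random(n) < 0.85                    # ~15% non-response

# Fit a linear probability model on the tested subsample only...
beta, *_ = np.linalg.lstsq(X[tested], hiv[tested].astype(float), rcond=None)

# ...then predict prevalence for the non-tested from their covariates.
prev_tested = float(hiv[tested].mean())
prev_nontested_pred = float((X[~tested] @ beta).mean())
```

Comparing `prev_nontested_pred` with `prev_tested` indicates whether non-response plausibly biases the observed estimate; here response is random by construction, so the two agree closely.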

  12. Empirical model of equatorial electrojet based on ground-based magnetometer data during solar minimum in fall

    NASA Astrophysics Data System (ADS)

    Hamid, Nurul Shazana Abdul; Liu, Huixin; Uozumi, Teiji; Yoshikawa, Akimasa

    2015-12-01

In this study, we constructed an empirical model of the equatorial electrojet (EEJ), including local time and longitudinal dependence, based on simultaneous data from 12 magnetometer stations located in six longitude sectors. The analysis used the equatorial electrojet index, EUEL, calculated from the geomagnetic northward H component. The magnetic EEJ strength is calculated as the difference between the normalized EUEL index of the magnetic dip equator station and the normalized EUEL index of an off-dip-equator station located beyond the EEJ band. The analysis showed that this current is always strongest in the South American sector, regardless of local time (LT), and weakest in the Indian sector between 0900 and 1000 LT, shifting to the African sector between 1100 and 1400 LT. These longitudinal variations of the EEJ roughly follow variations of the inverse main field strength along the dip equator, except in the Indian and Southeast Asian sectors. The EEJ component derived from the model exhibits a pattern similar to the EEJ measured from ground data during noontime, mainly before 1300 LT.
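The EEJ strength computation described in the abstract (the difference of normalized EUEL indices at a dip-equator and an off-dip-equator station) can be sketched as follows, with hypothetical hourly H-component index values:

```python
import numpy as np

def normalize(euel):
    """Normalize an EUEL index by removing its quiet-time baseline
    (here assumed to be the mean of the first six, nighttime, hours)."""
    return euel - euel[:6].mean()

# Hypothetical hourly EUEL values (nT) at two stations.
dip = np.array([2., 1., 1., 2., 1., 1., 20., 45., 80., 95., 70., 30.])
off = np.array([1., 1., 2., 1., 1., 2., 10., 18., 25., 28., 22., 12.])

# EEJ strength: dip-equator index minus the off-dip-equator index,
# which removes the global/ring-current contribution common to both.
eej = normalize(dip) - normalize(off)
peak_lt = int(np.argmax(eej))   # hour index of maximum EEJ strength
```

Subtracting the off-equator station isolates the electrojet's contribution because both stations share the non-EEJ magnetospheric signal.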

  13. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey.

    PubMed

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L; Bryant, Richard A; Li, Meng; Qian, Meng; Brown, Adam D

    2015-01-01

Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient Health Questionnaire-9 (PHQ-9). These findings revealed that among the human rights advocates who completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work. PMID:26700305

  14. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey

    PubMed Central

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L.; Bryant, Richard A.; Li, Meng; Qian, Meng; Brown, Adam D.

    2015-01-01

    Human rights advocates play a critical role in promoting respect for human rights worldwide, engaging in a broad range of strategies including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of advocates. This study examined the mental health profile of human rights advocates and the risk factors associated with their psychological functioning. A total of 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience, and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient Health Questionnaire-9 (PHQ-9). The findings revealed that among the human rights advocates who completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that, after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism, and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field would benefit from further empirical research, as well as additional workplace education and training programs on enhancing resilience in the context of human rights work. PMID:26700305

  15. [DGRW-update: neurology--from empirical strategies towards evidence based interventions].

    PubMed

    Schupp, W

    2011-12-01

    Stroke, multiple sclerosis (MS), traumatic brain injury (TBI), and neuropathies are the most important diseases in neurological rehabilitation financed by the German Pension Insurance. The primary goal is vocational (re)integration. Driven by multiple findings of neuroscience research, the traditional holistic approach with mainly empirically derived strategies has been developed further and improved by new evidence-based interventions. This process had been, and continues to be, necessary to meet the health-economic pressure for ever shorter and more efficient rehabilitation measures. Evidence-based interventions refer to symptom-oriented measures, to team-management concepts, as well as to education and psychosocial interventions. Drug therapy and/or neurophysiological measures can be added to increase neuroregeneration and neuroplasticity. Evidence-based aftercare concepts support the sustainability and steadiness of rehabilitation results. Mirror therapy, robot-assisted training, mental training, task-specific training, and above all constraint-induced movement therapy (CIMT) can restore motor arm and hand functions. Treadmill training and robot-assisted training improve stance and gait. Botulinum toxin injections in combination with physical and redressing methods are superior in managing spasticity. Guideline-oriented management of associated pain syndromes (myofascial, neuropathic, complex-regional=dystrophic) improves primary outcome and quality of life. Drug therapy with so-called co-analgesics and physical therapy play an important role in pain management. Swallowing disorders lead to higher mortality and morbidity in the acute phase; stepwise diagnostics (screening, endoscopy, radiology) and specific swallowing therapy can reduce these risks and frequently restore normal eating and drinking. In our modern industrial societies, communicative and cognitive disturbances are more impairing than the above-mentioned disorders. Speech and language therapy (SLT) is dominant in

  16. Upscaling Empirically Based Conceptualisations to Model Tropical Dominant Hydrological Processes for Historical Land Use Change

    NASA Astrophysics Data System (ADS)

    Toohey, R.; Boll, J.; Brooks, E.; Jones, J.

    2009-12-01

    Surface runoff and percolation to ground water are two hydrological processes of concern on the Atlantic slope of Costa Rica because of their impacts on flooding and drinking water contamination. As per legislation, the Costa Rican Government funds land use management from the farm to the regional scale to improve or conserve hydrological ecosystem services. In this study, we examined how land use (e.g., forest, coffee, sugar cane, and pasture) affects hydrological response at the point, plot (1 m²), and field (1-6 ha) scales to empirically conceptualize the dominant hydrological processes in each land use. Using our field data, we upscaled these conceptual processes into a physically-based distributed hydrological model at the field, watershed (130 km²), and regional (1500 km²) scales. At the point and plot scales, the presence of macropores and large roots promoted greater vertical percolation and subsurface connectivity in the forest and coffee field sites. The lack of macropores and large roots, plus the addition of management artifacts (e.g., surface compaction and a plough layer), altered the dominant hydrological processes by increasing lateral flow and surface runoff in the pasture and sugar cane field sites. Macropores and topography were major influences on runoff generation at the field scale. Also at the field scale, antecedent moisture conditions suggest a threshold behavior as a temporal control on surface runoff generation. However, in this tropical climate with very intense rainstorms, annual surface runoff was less than 10% of annual precipitation at the field scale. Significant differences in soil and hydrological characteristics observed at the point and plot scales appear to have less significance when upscaled to the field scale. At the point and plot scales, percolation acted as the dominant hydrological process in this tropical environment. However, at the field scale for sugar cane and pasture sites, saturation-excess runoff increased as

  17. Avian survey and field guide for Osan Air Base, Korea.

    SciTech Connect

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter Wing BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and, from the survey results, determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with EGS Criterion 13-3.h.

  18. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for beginners and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  19. Population-based absolute risk estimation with survey data.

    PubMed

    Kovalchik, Stephanie A; Pfeiffer, Ruth M

    2014-04-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
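
    The core quantity in this abstract, absolute risk under competing events, can be illustrated numerically. Below is a minimal sketch, not the authors' survey-weighted estimator: the probability of cause k by time t is the integral of that cause's hazard weighted by the probability of having survived all causes. The hazards and time grid are made-up assumptions.

```python
import numpy as np

def absolute_risk(t_grid, hazards, k):
    """Cumulative probability of cause k on t_grid, given piecewise-constant
    cause-specific hazards and competing events (all other causes)."""
    dt = np.diff(t_grid, prepend=0.0)
    total = sum(hazards)                   # all-cause hazard on the grid
    surv = np.exp(-np.cumsum(total * dt))  # overall survival S(t)
    return np.cumsum(hazards[k] * surv * dt)

# Two competing causes with assumed constant hazards.
t = np.linspace(0.0, 10.0, 1001)
h_cvd = np.full_like(t, 0.02)
h_cancer = np.full_like(t, 0.01)
risk_cvd = absolute_risk(t, [h_cvd, h_cancer], 0)
```

    With constant hazards this reproduces the closed form (λ_k/λ)(1 − e^(−λt)); the paper's estimator additionally handles individualized relative risks and complex-survey variance.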

  20. Road Rage: Prevalence Pattern and Web Based Survey Feasibility

    PubMed Central

    Verma, Rohit; Balhara, Yatan Pal Singh; Ul-Hasan, Shiraz

    2014-01-01

    Introduction. Incidents of road rage are on the rise in India, but the literature on this aspect is lacking. There is an increasing realization that web-based interventions can effectively deliver public-health-related messages. Objective. The aim was to quantitatively evaluate risk factors among motor vehicle drivers using an internet-based survey. Methods. Facebook users were evaluated using the Life Orientation Test-Revised (LOT-R) and the Driving Anger Scale (DAS). Results. An adequate response rate of 65.9% and satisfactory reliability with sizable correlation were obtained for both scales. Age was found to be positively correlated with LOT-R scores (r = 0.21; P = 0.02) and negatively correlated with DAS scores (r = −0.19; P = 0.03). Years of education were correlated with LOT-R scores (r = 0.26; P = 0.005) but not DAS scores (r = −0.14; P = 0.11). LOT-R scores did not correlate with DAS scores. Conclusion. There is a high prevalence of anger among drivers in India, particularly among younger males. A short web survey formatted in easy-to-use question language makes conducting an online survey feasible. PMID:24864226

  1. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for beginners and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration, and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  2. The EMPIRE Survey: Systematic Variations in the Dense Gas Fraction and Star Formation Efficiency from Full-disk Mapping of M51

    NASA Astrophysics Data System (ADS)

    Bigiel, Frank; Leroy, Adam K.; Jiménez-Donaire, Maria J.; Pety, Jérôme; Usero, Antonio; Cormier, Diane; Bolatto, Alberto; Garcia-Burillo, Santiago; Colombo, Dario; González-García, Manuel; Hughes, Annie; Kepley, Amanda A.; Kramer, Carsten; Sandstrom, Karin; Schinnerer, Eva; Schruba, Andreas; Schuster, Karl; Tomicic, Neven; Zschaechner, Laura

    2016-05-01

    We present the first results from the EMPIRE survey, an IRAM large program that is mapping tracers of high-density molecular gas across the disks of nine nearby star-forming galaxies. Here, we present new maps of the 3 mm transitions of HCN, HCO+, and HNC across the whole disk of our pilot target, M51. As expected, dense gas correlates with tracers of recent star formation, filling the “luminosity gap” between Galactic cores and whole galaxies. In detail, we show that both the fraction of gas that is dense, f_dense, traced by HCN/CO, and the rate at which dense gas forms stars, SFE_dense, traced by IR/HCN, depend on environment in the galaxy. The sense of the dependence is that high-surface-density, high molecular gas fraction regions of the galaxy show high dense gas fractions and low dense gas star formation efficiencies. This agrees with recent results for individual pointings by Usero et al. but using unbiased whole-galaxy maps. It also agrees qualitatively with the behavior observed contrasting our own Solar Neighborhood with the central regions of the Milky Way. The sense of the trends can be explained if the dense gas fraction tracks interstellar pressure but star formation occurs only in regions of high density contrast.

  3. Rapid Mapping Method Based on Free Blocks of Surveys

    NASA Astrophysics Data System (ADS)

    Yu, Xianwen; Wang, Huiqing; Wang, Jinling

    2016-06-01

    When producing large-scale maps (larger than 1:2000) of cities or towns, obstruction from buildings makes measuring mapping control points a difficult and heavy task. In order to avoid measuring mapping control points and to shorten fieldwork time, this paper proposes a rapid mapping method. The method adjusts many free blocks of surveys together and transforms the points from all free blocks into the same coordinate system. The entire surveying area is divided into many free blocks, and connection points are set on the boundaries between free blocks. An independent coordinate system for every free block is established via completely free station technology, and the coordinates of the connection points, detail points, and control points of every free block in the corresponding independent coordinate system are obtained based on poly-directional open traverses. Error equations are established based on the connection points, which are adjusted together to obtain the transformation parameters. All points are transformed from the independent coordinate systems to a transitional coordinate system via the transformation parameters. Several control points are then measured by GPS in a geodetic coordinate system, so that all the points can be transformed from the transitional coordinate system to the geodetic coordinate system. In this paper, the implementation process and mathematical formulas of the new method are presented in detail, and a formula to estimate the precision of surveys is given. An example demonstrates that the precision of the new method can meet large-scale mapping needs.
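
    The transformation step described above can be sketched in code. The following is a hypothetical illustration, not the paper's implementation: a 2D similarity (Helmert) transform is estimated by linear least squares from connection points known in two coordinate systems, then used to move a block's remaining points into the target system.

```python
import numpy as np

def fit_helmert(src, dst):
    """Estimate a, b, tx, ty with x' = a*x - b*y + tx and y' = b*x + a*y + ty
    (a = s*cos(theta), b = s*sin(theta)) from matched connection points."""
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack([src[:, 0], -src[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([src[:, 1], src[:, 0], np.zeros(n), np.ones(n)])
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params

def apply_helmert(p, pts):
    a, b, tx, ty = p
    return np.column_stack([a * pts[:, 0] - b * pts[:, 1] + tx,
                            b * pts[:, 0] + a * pts[:, 1] + ty])
```

    With enough shared connection points, blocks can be chained into a common transitional system before the final GPS-based transformation to geodetic coordinates.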

  4. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationship among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility mediates the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning, and exploitative learning plays a significant role in both radical and incremental business model innovation.

  5. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549
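
    As one concrete example of the "generation algorithm" aspect of the framework discussed above, a classic AIS technique is negative selection: candidate detectors are generated randomly and censored against a self set, so any detector match later signals non-self. The sketch below, with binary strings, an r-contiguous-bits matcher, and illustrative parameters, is a generic textbook version rather than any specific system from the survey.

```python
import random

def matches(a, b, r):
    """r-contiguous-bits rule: True if a and b agree on r consecutive positions."""
    run = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, n, length=16, r=8, rng=None):
    """Keep random candidate detectors that match nothing in the self set."""
    rng = rng or random.Random(0)
    detectors = []
    while len(detectors) < n:
        cand = [rng.randint(0, 1) for _ in range(length)]
        if not any(matches(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(pattern, detectors, r=8):
    """Anything a detector matches is, by construction, non-self."""
    return any(matches(pattern, d, r) for d in detectors)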

  6. A survey of artificial immune system based intrusion detection.

    PubMed

    Yang, Hua; Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549

  7. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    ERIC Educational Resources Information Center

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  8. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools: that is, a test of the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may make it possible to model many homes expediently, and thus to implement software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical-data-based software accuracy test suite.

  9. Using a Process-Based Numerical Model and Simple Empirical Relationships to Evaluate CO2 Fluxes from Agricultural Soils.

    NASA Astrophysics Data System (ADS)

    Buchner, J.; Simunek, J.; Dane, J. H.; King, A. P.; Lee, J.; Rolston, D. E.; Hopmans, J. W.

    2007-12-01

    Carbon dioxide emissions from an agricultural field in the Sacramento Valley, California, were evaluated using the process-based SOILCO2 module of the HYDRUS-1D software package and a simple empirical model. CO2 fluxes, meteorological variables, soil temperatures, and water contents were measured during the years 2004-2006 at multiple locations in an agricultural field, half of which had been subjected to standard tillage and the other half to minimum tillage. Furrow irrigation was applied on a regular basis. While HYDRUS-1D simulates dynamic interactions between soil water contents, temperatures, soil CO2 concentrations, and soil respiration by numerically solving the partial differential equations for water flow (Richards) and for heat and CO2 transport (convection-dispersion), the empirical model is based on simple reduction functions closely resembling the CO2 production function of SOILCO2. It is assumed in this function that overall CO2 production in the soil profile is the sum of soil and plant respiration, optimal values of which are affected by time, depth, water contents, temperatures, soil salinity, and CO2 concentrations in the soil profile. The effect of these environmental factors is introduced using various reduction functions that multiply the optimal soil CO2 production. While the SOILCO2 module assumes that CO2 is produced in the soil profile and then transported, depending mainly on water contents, toward the soil surface, the empirical model relates CO2 emissions directly to the various environmental factors. It was shown that both the numerical model and the simple reduction functions could predict the CO2 fluxes across the soil surface reasonably well. Regression coefficients between measured CO2 emissions and those predicted by the numerical and simple empirical models are compared.
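
    The multiplicative reduction-function idea can be sketched as follows. The functional forms and parameter values here are illustrative assumptions, not the actual SOILCO2 production function: optimal CO2 production is scaled by separate factors for temperature and soil water content.

```python
def f_temperature(T, Q10=2.0, T_ref=20.0):
    """Q10-type temperature response, equal to 1 at the reference temperature."""
    return Q10 ** ((T - T_ref) / 10.0)

def f_water(theta, theta_opt=0.30, theta_res=0.05):
    """Linear soil-water reduction: 0 at residual, 1 at optimal water content."""
    return max(0.0, min(1.0, (theta - theta_res) / (theta_opt - theta_res)))

def co2_production(P_opt, T, theta):
    """Optimal production scaled by the environmental reduction factors."""
    return P_opt * f_temperature(T) * f_water(theta)
```

    Additional factors (depth, salinity, CO2 concentration) would enter the product the same way.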

  10. Does community-based conservation shape favorable attitudes among locals? An empirical study from Nepal.

    PubMed

    Mehta, J N; Heinen, J T

    2001-08-01

    Like many developing countries, Nepal has adopted a community-based conservation (CBC) approach in recent years to manage its protected areas, mainly in response to poor park-people relations. Among other things, under this approach the government has created new "people-oriented" conservation areas, formed and devolved legal authority to grassroots-level institutions to manage local resources, fostered infrastructure development, promoted tourism, and provided income-generating training to local people. Of interest to policy-makers and resource managers in Nepal and worldwide is whether this approach to conservation leads to improved attitudes on the part of local people. It is also important to know whether personal costs and benefits associated with various intervention programs, and socioeconomic and demographic characteristics, influence these attitudes. We explore these questions by looking at the experiences in the Annapurna and Makalu-Barun Conservation Areas, Nepal, which have largely adopted a CBC approach in policy formulation, planning, and management. The research was conducted during 1996 and 1997; the data collection methods included random household questionnaire surveys, informal interviews, and review of official records and published literature. The results indicated that the majority of local people held favorable attitudes toward these conservation areas. Logistic regression results revealed that participation in training, benefits from tourism, wildlife depredation issues, ethnicity, gender, and education level were significant predictors of local attitudes in one or the other conservation area. We conclude that the CBC approach has the potential to shape favorable local attitudes and that these attitudes will be mediated by some personal attributes. PMID:11443381

  11. Prospects for Gaia and other space-based surveys .

    NASA Astrophysics Data System (ADS)

    Bailer-Jones, Coryn A. L.

    Gaia is a fully-approved all-sky astrometric and photometric survey due for launch in 2011. It will measure accurate parallaxes and proper motions for everything brighter than G=20 (ca. 10^9 stars). Its primary objective is to study the composition, origin and evolution of our Galaxy from the 3D structure, 3D velocities, abundances and ages of its stars. In some respects it can be considered as a cosmological survey at redshift zero. Several other upcoming space-based surveys, in particular JWST and Herschel, will study star and galaxy formation in the early (high-redshift) universe. In this paper I briefly describe these missions, as well as SIM and Jasmine, and explain why they need to observe from space. I then discuss some Galactic science contributions of Gaia concerning dark matter, the search for substructure, stellar populations and the mass-luminosity relation. The Gaia data are complex and require the development of novel analysis methods; here I summarize the principle of the astrometric processing. In the last two sections I outline how the Gaia data can be exploited in connection with other observational and theoretical work in order to build up a more comprehensive picture of galactic evolution.

  12. Smartphone-Based, Self-Administered Intervention System for Alcohol Use Disorders: Theory and Empirical Evidence Basis

    PubMed Central

    Dulin, Patrick L.; Gonzalez, Vivian M.; King, Diane K.; Giroux, Danielle; Bacon, Samantha

    2013-01-01

    Advances in mobile technology provide an opportunity to deliver in-the-moment interventions to individuals with alcohol use disorders, yet availability of effective “apps” that deliver evidence-based interventions is scarce. We developed an immediately available, portable, smartphone-based intervention system whose purpose is to provide stand-alone, self-administered assessment and intervention. In this paper, we describe how theory and empirical evidence, combined with smartphone functionality contributed to the construction of a user-friendly, engaging alcohol intervention. With translation in mind, we discuss how we selected appropriate intervention components including assessments, feedback and tools, that work together to produce the hypothesized outcomes. PMID:24347811

  13. Data Management for Ground-Based Science Surveys at CASU

    NASA Astrophysics Data System (ADS)

    Irwin, Mike

    2015-12-01

    In this talk I will review the data management facilities at CASU for handling large scale ground-based imaging and spectroscopic surveys. The overarching principle for all science data processing at CASU is to provide an end-to-end system that attempts to deliver fully calibrated optimally extracted data products ready for science use. The talk will outline our progress in achieving this and how end users visualize the state-of-play of the data processing and interact with the final products via our internal data repository.

  14. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on statistical modeling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill, and sharpness of ensemble forecasts. The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
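
    The empirical-approach idea of binning past forecast errors by streamflow quantile class can be sketched as follows. The binning scheme and data are invented for illustration and omit the lead-time dimension: a new forecast is "dressed" with resampled errors from its own class to produce ensemble members.

```python
import numpy as np

def build_error_classes(past_fc, past_obs, edges):
    """Group past errors (obs - forecast) by the forecast's quantile class."""
    errors = past_obs - past_fc
    cls = np.digitize(past_fc, edges)
    return {c: errors[cls == c] for c in np.unique(cls)}

def dress_forecast(fc, error_classes, edges, n=50, rng=None):
    """Dress a raw forecast with resampled errors from its own class."""
    rng = rng or np.random.default_rng(0)
    c = int(np.digitize([fc], edges)[0])
    sample = rng.choice(error_classes[c], size=n, replace=True)
    return fc + sample  # n ensemble members around the raw forecast
```

    A dynamical variant would condition the error sample on recent streamflow variation as well as the quantile class.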

  15. Measuring microscopic evolution processes of complex networks based on empirical data

    NASA Astrophysics Data System (ADS)

    Chi, Liping

    2015-04-01

    Aiming at understanding the microscopic mechanisms of complex systems in the real world, we perform measurements that characterize the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree remains almost unchanged over the whole time range. At each time step, the external links attached to a new node number about c = 1.1 and the internal links added between existing nodes approximately m = 8. The Scientific Collaboration data are the cumulative result of all the authors from 1893 up to the considered year. There is no deletion of nodes or links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents γ_data of the degree distributions p(k) ∼ k^(−γ) of these two empirical datasets are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide an insight into capturing the microscopic dynamical processes that govern the network topology.
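
    The exponent γ_data can be extracted from an empirical degree list with the standard continuous maximum-likelihood estimator. This generic sketch is not the authors' exact fitting procedure; the synthetic degrees are an assumption for the sanity check.

```python
import math
import random

def powerlaw_exponent(degrees, k_min=1.0):
    """Continuous MLE: gamma = 1 + n / sum(ln(k_i / k_min)) over k_i >= k_min."""
    ks = [k for k in degrees if k >= k_min]
    return 1.0 + len(ks) / sum(math.log(k / k_min) for k in ks)

# Sanity check on synthetic degrees drawn from p(k) ~ k^(-2.5), k >= 1,
# via inverse-transform sampling.
rng = random.Random(1)
samples = [(1.0 - rng.random()) ** (-1.0 / 1.5) for _ in range(20000)]
gamma_hat = powerlaw_exponent(samples)
```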

  16. Surveying converter lining erosion state based on laser measurement technique

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Shi, Tielin; Yang, Shuzi

    1998-08-01

    It is very important to survey the erosion state of a steelmaking converter lining in real time so as to optimize the technological process, extend converter durability and reduce steelmaking production costs. This paper gives a practical method based on laser measurement techniques. It presents the basic principle of the measurement method, the composition of the measurement system and research on the key technological problems. The method is based on laser range finding to net points on the surface of the surveyed converter lining, together with angle finding for the laser beams. The angle signals also help realize the automatic scanning function. The laser signals are modulated and encoded. In the meantime, wavelet analysis and other filtering algorithms are adopted to denoise the data and extract useful information. The main ideas of algorithms such as net-point measuring path planning and optimal placement of the measurement device are also given, in order to improve the measurement precision and real-time performance of the system.
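
    The wavelet-denoising step can be illustrated with a minimal one-level Haar shrinkage. This is a sketch under assumed choices (Haar basis, soft thresholding); the abstract does not specify which wavelet or threshold rule the system actually uses.

    ```python
    def haar_denoise(signal, threshold):
        """One-level Haar wavelet shrinkage: split the signal into
        approximation and detail coefficients, soft-threshold the details
        (which carry most of the high-frequency noise), then reconstruct."""
        assert len(signal) % 2 == 0
        s2 = 2 ** 0.5
        half = len(signal) // 2
        approx = [(signal[2 * i] + signal[2 * i + 1]) / s2 for i in range(half)]
        detail = [(signal[2 * i] - signal[2 * i + 1]) / s2 for i in range(half)]

        def soft(d):
            # shrink toward zero; small (noise-dominated) details vanish
            if d > threshold:
                return d - threshold
            if d < -threshold:
                return d + threshold
            return 0.0

        detail = [soft(d) for d in detail]
        out = []
        for a, d in zip(approx, detail):
            out.append((a + d) / s2)
            out.append((a - d) / s2)
        return out
    ```

    On a noisy step-like profile, the reconstruction is closer to the clean profile than the raw measurement, which is the behaviour the denoising stage relies on.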

  17. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons

    PubMed Central

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D.

    2016-01-01

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
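
    Method 1 above (classical Monte Carlo p-value evaluation) can be sketched generically. The function name and the add-one correction are illustrative conventions, not the `vxdbel` implementation.

    ```python
    import random

    def monte_carlo_pvalue(observed_stat, null_stat, n_sim=2000, seed=1):
        """Monte Carlo p-value: simulate the test statistic under the null
        hypothesis and count how often it is at least as extreme as the
        observed value. `null_stat(rng)` is a user-supplied draw under H0."""
        rng = random.Random(seed)
        sims = [null_stat(rng) for _ in range(n_sim)]
        # add-one correction keeps the estimate strictly positive
        return (1 + sum(s >= observed_stat for s in sims)) / (n_sim + 1)
    ```

    For example, with the null statistic taken as the mean of 20 standard normal draws, an observed value of 0.8 yields a very small p-value, while an observed value of 0 yields a p-value near 0.5.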

  18. Sensor Systems Based on FPGAs and Their Applications: A Survey

    PubMed Central

    de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah

    2012-01-01

    In this manuscript, we present a survey of designs and implementations of research sensor nodes that rely on FPGAs, either based upon standalone platforms or as a combination of microcontroller and FPGA. Several current challenges in sensor networks are distinguished and linked to the features of modern FPGAs. As it turns out, low-power optimized FPGAs are able to enhance the computation of several types of algorithms in terms of speed and power consumption in comparison to microcontrollers of commercial sensor nodes. We show that architectures based on the combination of microcontrollers and FPGA can play a key role in the future of sensor networks, in fields where processing capabilities such as strong cryptography, self-testing and data compression, among others, are paramount.

  19. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    PubMed Central

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191

  20. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  1. Conceptual and Empirical Bases of Readability Formulas. Technical Report No. 392.

    ERIC Educational Resources Information Center

    Anderson, Richard C.; Davison, Alice

    The problems arising from treating word and sentence complexity as the direct causes of difficulty in comprehension are surveyed in this paper from the perspective of readability formulas. The basic choices and assumptions made in the development and use of readability formulas are discussed in relation to the larger question of text…

  2. University-Based Evaluation Training Programs in the United States 1980-2008: An Empirical Examination

    ERIC Educational Resources Information Center

    LaVelle, John M.; Donaldson, Stewart I.

    2010-01-01

    Evaluation practice has grown in leaps and bounds in recent years. In contrast, the most recent survey data suggest that there has been a sharp decline in the number and strength of preservice evaluation training programs in the United States. In an effort to further understand this curious trend, an alternative methodology was used to examine the…

  3. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

    Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality when compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies of DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP server with GPAC, an MP4Client (GPAC) with an OpenHEVC-based DASH client, and a NetEm box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.
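
    The client-side switching described above ("the client switches to its desired quality, resolution or bitrate based on the available network bandwidth") amounts to throughput-based rate adaptation. A minimal sketch, assuming a simple safety-margin rule and an illustrative bitrate ladder (neither is specified by the paper):

    ```python
    def select_representation(bitrates_kbps, measured_kbps, safety=0.9):
        """Pick the highest DASH representation whose bitrate fits within a
        safety margin of the measured throughput; fall back to the lowest
        representation when even that does not fit."""
        ladder = sorted(bitrates_kbps)
        budget = safety * measured_kbps
        chosen = ladder[0]  # lowest quality as fallback
        for b in ladder:
            if b <= budget:
                chosen = b
        return chosen
    ```

    With a ladder of 500/1000/2000/4000 kbps and a measured throughput of 2500 kbps, the 90% budget is 2250 kbps, so the 2000 kbps representation is selected.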

  4. Comparisons of ground motions from the 1999 Chi-Chi, earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Boore, D.M.

    2001-01-01

    This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted from the empirically based equations for periods less than about 1 sec by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions for longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: Long-duration wave trains are present on the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those of motions from eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast; the sediments underlying the eastern stations are probably shallower and have higher velocity than those under the western stations.

  5. Empirical rainfall thresholds and copula based IDF curves for shallow landslides and flash floods

    NASA Astrophysics Data System (ADS)

    Bezak, Nejc; Šraj, Mojca; Brilly, Mitja; Mikoš, Matjaž

    2015-04-01

    Large mass movements, like deep-seated landslides or large debris flows, and flash floods can endanger human lives and cause huge environmental and economic damage in hazard areas. The main objective of the study was to investigate the characteristics of selected extreme rainfall events which triggered landslides and caused flash floods in Slovenia in the last 25 years. Seven extreme events, which occurred in Slovenia (Europe) in the last 25 years (1990-2014) and caused 17 casualties and about 500 million Euros of economic loss, were analysed in this study. Post-event analyses showed that the rainfall characteristics triggering flash floods and landslides are different: landslides were triggered by longer-duration rainfall events (up to one or a few weeks), and flash floods by short-duration rainfall events (a few hours to one or two days). The sensitivity analysis results indicate that the inter-event time variable, defined as the minimum duration of the period without rain between two consecutive rainfall events, and the sample definition methodology can have a significant influence on the position of rainfall events in the intensity-duration space, on the constructed intensity-duration-frequency (IDF) curves and on the relationship between the empirical rainfall threshold curves and the IDF curves constructed using the copula approach. The empirical rainfall threshold curves (ID curves) were also evaluated for the selected extreme events. The results indicate that a combination of several empirical rainfall thresholds with an appropriately dense rainfall measuring network can be used as part of an early warning system for the initiation of landslides and debris flows. However, different rainfall threshold curves should be used for lowland and mountainous areas in Slovenia. Furthermore, the intensity-duration-frequency (IDF) relationship was constructed using the Frank copula functions for 16 pluviographic meteorological stations in Slovenia using the high resolution
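
    Empirical intensity-duration (ID) thresholds of the kind evaluated above are conventionally written as a power law, I = α·D^β. A minimal sketch of an exceedance check follows; the coefficients are placeholders for illustration, not the fitted Slovenian values.

    ```python
    def exceeds_id_threshold(intensity_mm_h, duration_h, alpha=8.0, beta=-0.6):
        """Check a rainfall event against a power-law intensity-duration
        threshold I = alpha * D**beta (alpha, beta are illustrative).
        Returns True when the event lies on or above the threshold curve."""
        return intensity_mm_h >= alpha * duration_h ** beta
    ```

    Because beta is negative, long-duration events exceed the threshold at much lower intensities than short ones, which matches the qualitative split between landslide-triggering and flash-flood-triggering rainfall described in the abstract.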

  6. Cyclone optimization based on a new empirical model for pressure drop

    SciTech Connect

    Ramachandran, G.; Leith, D.; Dirgo, J.; Feldman, H.

    1991-01-01

    An empirical model for predicting the pressure drop across a cyclone, developed by Dirgo, is presented. The model was developed through a statistical analysis of pressure drop data for 98 cyclone designs. The model is shown to perform better than the pressure drop models of Shepherd and Lapple, Alexander, First, Stairmand, and Barth. This model is used with the efficiency model of Iozia and Leith to develop an optimization curve which predicts the minimum pressure drop and the dimension ratios of the optimized cyclone for a given aerodynamic cut diameter, d_50. The effect of variation in cyclone height, cyclone diameter, and flow on the optimization is determined. The optimization results are used to develop a design procedure for optimized cyclones.

  7. Empirical mode decomposition-based facial pose estimation inside video sequences

    NASA Astrophysics Data System (ADS)

    Qing, Chunmei; Jiang, Jianmin; Yang, Zhijing

    2010-03-01

    We describe a new pose-estimation algorithm that integrates the strengths of both empirical mode decomposition (EMD) and mutual information. While mutual information is exploited to measure the similarity between facial images in order to estimate poses, EMD is exploited to decompose input facial images into a number of intrinsic mode function (IMF) components, which redistribute the effects of noise, expression changes, and illumination variations such that, when the input facial image is described by the selected IMF components, all these negative effects can be minimized. Extensive experiments were carried out in comparison with existing representative techniques, and the results show that the proposed algorithm achieves better pose-estimation performance with robustness to noise corruption, illumination variation, and facial expressions.
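
    The mutual-information similarity measure used above can be sketched with a plain joint-histogram estimator. This is a generic sketch (flattened integer-valued grayscale images, an assumed bin count), not the paper's exact estimator.

    ```python
    import math

    def mutual_information(img_a, img_b, bins=8):
        """Histogram-based mutual information (in nats) between two
        equal-length flattened grayscale images with integer pixel values
        in [0, 256). Higher values indicate more similar images."""
        assert len(img_a) == len(img_b)
        n = len(img_a)
        joint, pa, pb = {}, {}, {}
        for x, y in zip(img_a, img_b):
            i, j = x * bins // 256, y * bins // 256
            joint[(i, j)] = joint.get((i, j), 0) + 1
            pa[i] = pa.get(i, 0) + 1
            pb[j] = pb.get(j, 0) + 1
        mi = 0.0
        for (i, j), c in joint.items():
            # p(x,y) * log( p(x,y) / (p(x) p(y)) ), counts rewritten as c*n/(pa*pb)
            mi += (c / n) * math.log(c * n / (pa[i] * pb[j]))
        return mi
    ```

    An image compared with itself gives maximal mutual information (the entropy of its binned histogram), while comparison with a constant image gives zero, which is why the measure can rank candidate poses by similarity.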

  8. Inferring causal molecular networks: empirical assessment through a community-based effort.

    PubMed

    Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-04-01

    It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense. PMID:26901648

  9. Investigation of an empirical probability measure based test for multivariate normality

    SciTech Connect

    Booker, J.M.; Johnson, M.E.; Beckman, R.J.

    1984-01-01

    Foutz (1980) derived a goodness-of-fit test for a hypothesis specifying a continuous, p-variate distribution. The test statistic is both distribution-free and independent of p. In adapting the Foutz test for multivariate normality, we consider using χ² and rescaled beta variates in constructing statistically equivalent blocks. The Foutz test is compared to other multivariate normality tests developed by Hawkins (1981) and Malkovich and Afifi (1973). The set of alternative distributions tested includes Pearson types II and VII, Johnson translations, Plackett, and distributions arising from Khintchine's theorem. Univariate alternatives from the general class developed by Johnson et al. (1980) were also used. An empirical study confirms the independence of the test statistic from p even when parameters are estimated. In general, the Foutz test is less conservative under the null hypothesis but has poorer power under most alternatives than the other tests.

  10. Population forecasts and confidence intervals for Sweden: a comparison of model-based and empirical approaches.

    PubMed

    Cohen, J E

    1986-02-01

    This paper compares several methods of generating confidence intervals for forecasts of population size. Two rest on a demographic model for age-structured populations with stochastic fluctuations in vital rates. Two rest on empirical analyses of past forecasts of population sizes of Sweden at five-year intervals from 1780 to 1980 inclusive. Confidence intervals produced by the different methods vary substantially. The relative sizes differ in the various historical periods. The narrowest intervals offer a lower bound on uncertainty about the future. Procedures for estimating a range of confidence intervals are tentatively recommended. A major lesson is that finitely many observations of the past and incomplete theoretical understanding of the present and future can justify at best a range of confidence intervals for population projections. Uncertainty attaches not only to the point forecasts of future population, but also to the estimates of those forecasts' uncertainty. PMID:3484356
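
    The empirical approach described above (confidence intervals derived from the errors of past forecasts) can be sketched as bracketing a new point forecast with quantiles of historical actual-to-forecast ratios. The function and the quantile indexing rule are illustrative assumptions, not the paper's procedure.

    ```python
    def empirical_interval(point_forecast, past_ratios, coverage=0.8):
        """Empirical forecast interval: scale a point forecast by the lower
        and upper quantiles of historical (actual / forecast) ratios."""
        r = sorted(past_ratios)
        tail = (1 - coverage) / 2
        lo_idx = int(tail * (len(r) - 1))
        hi_idx = int((1 - tail) * (len(r) - 1))
        return point_forecast * r[lo_idx], point_forecast * r[hi_idx]
    ```

    With eleven historical ratios spaced evenly from 0.90 to 1.10 and an 80% coverage target, a forecast of 1000 is bracketed by roughly (920, 1080). As the abstract cautions, such intervals are themselves uncertain: they depend on how representative the past errors are of the future.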

  11. Gold price analysis based on ensemble empirical model decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has received growing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Mode Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology which combines the two methods and applies it to gold price analysis. It comprises three steps: first, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Second, the IMFs are further processed, with unimportant ones re-grouped, to reconstruct a new set of data called Virtual Intrinsic Mode Functions (VIMFs). Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail, according to their trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also examine in depth how these factors affect the gold price. Regression analysis has also been conducted to verify the findings. Results from the empirical studies in the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.

  12. Accounting protesting and warm glow bidding in Contingent Valuation surveys considering the management of environmental goods--an empirical case study assessing the value of protecting a Natura 2000 wetland area in Greece.

    PubMed

    Grammatikopoulou, Ioanna; Olsen, Søren Bøye

    2013-11-30

    Based on a Contingent Valuation survey aiming to reveal the willingness to pay (WTP) for conservation of a wetland area in Greece, we show how protest and warm glow motives can be taken into account when modeling WTP. In a sample of more than 300 respondents, we find that 54% of the positive bids are rooted to some extent in warm glow reasoning while 29% of the zero bids can be classified as expressions of protest rather than preferences. In previous studies, warm glow bidders are only rarely identified while protesters are typically identified and excluded from further analysis. We test for selection bias associated with simple removal of both protesters and warm glow bidders in our data. Our findings show that removal of warm glow bidders does not significantly distort WTP whereas we find strong evidence of selection bias associated with removal of protesters. We show how to correct for such selection bias by using a sample selection model. In our empirical sample, using the typical approach of removing protesters from the analysis, the value of protecting the wetland is significantly underestimated by as much as 46% unless correcting for selection bias. PMID:24091158

  13. Ticking the boxes: a survey of workplace-based assessments

    PubMed Central

    Gilberthorpe, Thomas; Sarfo, Maame Duku; Lawrence-Smith, Geoff

    2016-01-01

    Aims and method To survey the quality of workplace-based assessments (WPBAs) through retrospective analysis of completed WPBA forms against training targets derived from the Royal College of Psychiatrists' Portfolio Online. Results Almost a third of assessments analysed showed no divergence in assessment scores across the varied assessment domains and there was poor correlation between domain scores and the nature of comments provided by assessors. Of the assessments that suggested action points only half were considered to be sufficiently ‘specific’ and ‘achievable’ to be useful for trainees' learning. Clinical implications WPBA is not currently being utilised to its full potential as a formative assessment tool and more widespread audit is needed to establish whether this is a local or a national issue. PMID:27087994

  15. Simulations of the Structure and Properties of Large Icosahedral Boron Clusters Based on a Novel Semi-Empirical Hamiltonian

    NASA Astrophysics Data System (ADS)

    Tandy, Paul; Yu, Ming; Jayanthi, C. S.; Wu, Shi-Yu; Condensed Matter Theory Group Team

    2013-03-01

    The successful development of a parameterized semi-empirical Hamiltonian (SCED-LCAO) for boron, based on an LCAO framework using an sp3 basis set, will be discussed. The semi-empirical Hamiltonian contains the environment dependency and electron screening effects of a many-body Hamiltonian and allows for charge self-consistency. We have optimized the parameters of the SCED-LCAO Hamiltonian for boron by fitting the properties (e.g., the binding energy, bond length, etc.) of boron sheets, small clusters and boron alpha to first-principles DFT calculations. Although extended phases of boron alpha and beta have been studied, large boron clusters with icosahedral structures, such as those cut from boron alpha, are difficult if not impossible to simulate with ab initio methods. We will demonstrate the effectiveness of the SCED-LCAO Hamiltonian in studying icosahedral boron clusters containing up to 800 atoms and will report on some novel boron clusters and computational speed. Support has been provided by the Dillion Fellowship.

  16. Development of the Knowledge-based & Empirical Combined Scoring Algorithm (KECSA) to Score Protein-Ligand Interactions

    PubMed Central

    Zheng, Zheng

    2013-01-01

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the PDB. Forty-nine types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training-set dependence and the enthalpy/entropy treatment of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets to compare the reproducibility of KECSA with respect to two empirical score functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the RMSE. LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed which would make KECSA more sensitive to subtle changes in ligand structure. PMID:23560465
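
    The 12-6 Lennard-Jones form to which the abstract relates its statistical potential is standard, and its two parameters have a direct geometric reading: the well depth ε and the zero-crossing distance σ. A minimal sketch (the KECSA parameter values themselves are not reproduced here):

    ```python
    def lj(r, epsilon, sigma):
        """12-6 Lennard-Jones pair potential:
        V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6).
        V(sigma) = 0, and the minimum is V(2**(1/6)*sigma) = -eps."""
        sr6 = (sigma / r) ** 6
        return 4.0 * epsilon * (sr6 * sr6 - sr6)
    ```

    Fitting a statistical pair potential to this form therefore amounts to extracting, per atom-pair type, an effective interaction strength (ε) and contact distance (σ) from the PDB-derived distance statistics.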

  17. A Compound fault diagnosis for rolling bearings method based on blind source separation and ensemble empirical mode decomposition.

    PubMed

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong confounding noise, which makes it difficult to separate the weak fault signals through conventional means such as FFT-based envelope detection, wavelet transform or empirical mode decomposition individually. In order to improve the diagnosis of compound faults in rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix of ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in compound fault separation, which works not only for the outer race defect, but also for the roller defect and the unbalance fault of the experimental system. PMID:25289644
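
    The cross-correlation selection step can be sketched as keeping only the IMFs whose correlation with the measured signal exceeds a threshold. The threshold value and function names below are illustrative assumptions; the paper's exact criterion is not reproduced.

    ```python
    def corrcoef(x, y):
        """Pearson correlation coefficient of two equal-length sequences
        (assumes neither sequence is constant)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    def select_imfs(imfs, original, threshold=0.3):
        """Cross-correlation criterion: keep the IMFs that correlate
        strongly with the measured signal; these form the input matrix
        for the subsequent ICA stage."""
        return [imf for imf in imfs if abs(corrcoef(imf, original)) >= threshold]
    ```

    An IMF that is a scaled copy of the signal is retained (correlation 1), while an uncorrelated oscillation is discarded, so noise-dominated modes do not enter the ICA separation.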

  18. An Empirical Model of Saturn's Current Sheet Based on Global MHD Modeling of Saturn's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Hansen, K. C.; Nickerson, J. S.; Gombosi, T. I.

    2014-12-01

    Cassini observations imply that during southern summer Saturn's magnetospheric current sheet is displaced northward above the rotational equator and should be similarly displaced southward during northern summer [C.S. Arridge et al., Warping of Saturn's magnetospheric and magnetotail current sheets, Journal of Geophysical Research, Vol. 113, August 2008]. Arridge et al. show that Cassini data from the noon, midnight and dawn local time sectors clearly indicate this bending and they present an azimuthally independent model to describe this bowl shaped geometry. We have used our global MHD model, BATS-R-US/SWMF, to study Saturn's magnetospheric current sheet under different solar wind dynamic pressures and solar zenith angle conditions. We find that under typical conditions the current sheet does bend upward and take on a basic shape similar to the Arridge model in the noon, midnight, and dawn sectors. However, the MHD model results show significant variations from the Arridge model including the degree of bending, variations away from a simple bowl shape, non-uniformity across local time sectors, drastic deviations in the dusk sector, and a dependence on the solar wind dynamic pressure. We will present a detailed description of our 3D MHD model results and the characteristics of the current sheet in the model. We will point out variations from the Arridge model. In addition, we will present a new empirical model of Saturn's current sheet that attempts to characterize the dependences on the local time sector and the solar wind dynamic pressure.

  19. Consequences of asymmetric competition between resident and invasive defoliators: a novel empirically based modelling approach.

    PubMed

    Ammunét, Tea; Klemola, Tero; Parvinen, Kalle

    2014-03-01

    Invasive species can have profound effects on a resident community via indirect interactions among community members. While long periodic cycles in population dynamics can make the experimental observation of the indirect effects difficult, modelling the possible effects on an evolutionary time scale may provide the much needed information on the potential threats of the invasive species on the ecosystem. Using empirical data from a recent invasion in northernmost Fennoscandia, we applied adaptive dynamics theory and modelled the long term consequences of the invasion by the winter moth into the resident community. Specifically, we investigated the outcome of the observed short-term asymmetric preferences of generalist predators and specialist parasitoids on the long term population dynamics of the invasive winter moth and resident autumnal moth sharing these natural enemies. Our results indicate that coexistence after the invasion is possible. However, the outcome of the indirect interaction on the population dynamics of the moth species was variable and the dynamics might not be persistent on an evolutionary time scale. In addition, the indirect interactions between the two moth species via shared natural enemies were able to cause asynchrony in the population cycles corresponding to field observations from previous sympatric outbreak areas. Therefore, the invasion may cause drastic changes in the resident community, for example by prolonging outbreak periods of birch-feeding moths, increasing the average population densities of the moths or, alternatively, leading to extinction of the resident moth species or to equilibrium densities of the two, formerly cyclic, herbivores. PMID:24380810

  20. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so, it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, the article builds on recent work which suggests that hospital enterprises comprise multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected, including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher- and lower-performing ES configurations. Following literal replication across the in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  1. Spectral analysis of Hall-effect thruster plasma oscillations based on the empirical mode decomposition

    SciTech Connect

    Kurzyna, J.; Mazouffre, S.; Lazurenko, A.; Albarede, L.; Bonhomme, G.; Makowski, K.; Dudeck, M.; Peradzynski, Z.

    2005-12-15

    Hall-effect thruster plasma oscillations recorded by means of probes located at the channel exit are analyzed using the empirical mode decomposition (EMD) method. This self-adaptive technique makes it possible to decompose a nonstationary signal into a set of intrinsic modes, and acts as a very efficient filter that separates the contributions of different underlying physical mechanisms. Applying the Hilbert transform to the whole set of modes makes it possible to identify peculiar events and to assign them a range of instantaneous frequency and power. In addition to 25 kHz breathing-type oscillations, which are unambiguously identified, the EMD approach confirms the existence of oscillations with instantaneous frequencies in the range of 100-500 kHz, typical of ion transit-time oscillations. Modeling of high-frequency modes (ν ≈ 10 MHz) resulting from EMD of measured waveforms supports the idea that high-frequency plasma oscillations originate from electron-density perturbations propagating azimuthally with the electron drift velocity.
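
The Hilbert step of the procedure described above can be illustrated compactly. The sketch below is not the authors' code: it builds the analytic signal with an FFT (the standard construction behind Hilbert-transform routines) and extracts the instantaneous frequency of a synthetic 25 kHz tone standing in for a breathing-type mode; the sampling rate and signal are assumed for illustration.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (the construction used by standard
    Hilbert-transform routines): zero out negative frequencies, double
    the positive ones, then inverse-transform."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 100e3                           # assumed sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
x = np.sin(2 * np.pi * 25e3 * t)     # stand-in for a 25 kHz breathing mode
z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz
```

In the EMD workflow this is applied to each intrinsic mode separately, yielding the time-resolved frequency and power ranges mentioned above.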

  2. Quantitative hydrogen analysis in minerals based on a semi-empirical approach

    NASA Astrophysics Data System (ADS)

    Kristiansson, P.; Borysiuk, M.; Ros, L.; Skogby, H.; Abdel, N.; Elfman, M.; Nilsson, E. J. C.; Pallon, J.

    2013-07-01

    Hydrogen normally occurs as hydroxyl ions related to defects at specific crystallographic sites in mineral structures, and is normally characterized by infrared spectroscopy (FTIR). For quantification purposes the FTIR technique has proven to be less precise, since calibrations against independent methods are needed. Hydrogen analysis by the NMP technique can solve many of these problems, owing to its low detection limit, high lateral resolution, insignificant matrix effects and ability to discriminate surface-adsorbed water. The technique has been shown to work both on thin samples and on thicker geological samples. To avoid disturbance from surface contamination, the hydrogen is analyzed inside semi-thick geological samples. The technique used is an elastic recoil technique in which both the incident projectile (proton) and the recoiled hydrogen are detected in coincidence in a segmented detector. Both the traditional annular system, with the detector divided into two halves, and the new double-sided silicon strip detector (DSSSD) have been used. In this work we present an upgraded version of the technique, studying two sets of mineral standards combined with pre-sample charge normalization. To reduce the data-processing time, we suggest a very simple semi-empirical approach for the data evaluation. The advantages and drawbacks of the approach are discussed and a possible extension of the model is suggested.

  3. Inferring causal molecular networks: empirical assessment through a community-based effort

    PubMed Central

    Hill, Steven M.; Heiser, Laura M.; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K.; Carlin, Daniel E.; Zhang, Yang; Sokolov, Artem; Paull, Evan O.; Wong, Chris K.; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V.; Favorov, Alexander V.; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W.; Long, Byron L.; Noren, David P.; Bisberg, Alexander J.; Mills, Gordon B.; Gray, Joe W.; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A.; Fertig, Elana J.; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M.; Spellman, Paul T.; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-01-01

    Inferring molecular networks is a central challenge in computational biology. However, it has remained unclear whether causal, rather than merely correlational, relationships can be effectively inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge that focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results constitute the most comprehensive assessment of causal network inference in a mammalian setting carried out to date and suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess the causal validity of inferred molecular networks. PMID:26901648
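
The submitted networks were scored for causal validity against held-out interventional data. As a generic illustration of rank-based scoring (not the challenge's exact metric), the AUROC of predicted edge confidences against a gold standard can be computed with the Mann-Whitney formulation; the confidence values below are hypothetical.

```python
def auroc(pos_scores, neg_scores):
    """Probability that a randomly chosen true edge is ranked above a
    randomly chosen absent edge; ties count as 0.5 (Mann-Whitney AUROC)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical submitted confidences: true causal edges vs. absent edges.
auc = auroc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2, 0.1])  # 11/12, about 0.917
```

A score of 0.5 corresponds to random edge ranking, 1.0 to perfect separation of causal from non-causal edges.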

  4. Development of Items for a Pedagogical Content Knowledge Test Based on Empirical Analysis of Pupils' Errors

    NASA Astrophysics Data System (ADS)

    Jüttner, Melanie; Neuhaus, Birgit J.

    2012-05-01

    In view of the lack of instruments for measuring biology teachers' pedagogical content knowledge (PCK), this article reports on a study about the development of PCK items for measuring teachers' knowledge of pupils' errors and ways of dealing with them. The study investigated 9th and 10th grade German pupils' (n = 461) drawings in an achievement test about the knee-jerk in biology, which were analysed using inductive qualitative content analysis. The empirical data were used for the development of the items in the PCK test. The validity of the items was assessed with think-aloud interviews of German secondary school teachers (n = 5). Once the items were finalised, reliability was estimated from the results of German secondary school biology teachers (n = 65) who took the PCK test. The results indicated that these items are satisfactorily reliable (Cronbach's alpha values ranged from 0.60 to 0.65). We suggest that a larger sample size and American biology teachers be used in further studies. The findings of this study about teachers' professional knowledge from the PCK test could provide new information about the influence of teachers' knowledge on their pupils' understanding of biology and their possible errors in learning biology.
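
For reference, the Cronbach's alpha statistic quoted above can be computed directly from a pupils-by-items score matrix via alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The scores below are hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_persons x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical item scores (rows: pupils, columns: test items).
data = [[1, 2], [2, 3], [3, 4], [4, 5]]
alpha = cronbach_alpha(data)  # two perfectly correlated items -> exactly 1.0
```

Items that differ only by a constant are perfectly consistent, so alpha is exactly 1; the study's 0.60-0.65 range indicates moderate internal consistency.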

  5. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches.

    PubMed

    Sadrawi, Muammar; Sun, Wei-Zen; Ma, Matthew Huei-Ming; Dai, Chun-Yi; Abbod, Maysam F; Shieh, Jiann-Shing

    2016-01-01

    Good quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI), and detrended fluctuation analysis (DFA) are computed, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR patterns of 951 asystole patients were analyzed for the quality of CPR delivered. There was no significant difference in the CPR-related IMF peak-to-peak interval analysis between patients younger and older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, a difference was noted for the CI (p < 0.05). The results show that patients younger than 60 years of age have a higher survival rate, associated with high complexity of the CPR-IMF amplitude differences.
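
Of the nonlinear measures listed, sample entropy is easy to sketch. The following is a minimal illustration of SampEn(m, r) on synthetic signals, not the authors' EEMD pipeline; m = 2 and r = 0.2 times the standard deviation are conventional defaults.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r): minus the log of the conditional
    probability that subsequences matching for m points (Chebyshev
    distance <= r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    N = len(x)

    def match_count(mm):
        templates = np.array([x[i:i + mm] for i in range(N - mm)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    B = match_count(m)        # matches of length m
    A = match_count(m + 1)    # matches of length m + 1 (A <= B)
    return -np.log(A / B)

rng = np.random.default_rng(0)
s_sine = sample_entropy(np.sin(np.linspace(0.0, 8.0 * np.pi, 300)))
s_noise = sample_entropy(rng.standard_normal(300))
```

A regular signal yields lower SampEn than noise; in the study, such complexity measures are computed on the CPR-related IMFs rather than on the raw ECG.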

  6. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches

    PubMed Central

    Ma, Matthew Huei-Ming

    2016-01-01

    Good quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI), and detrended fluctuation analysis (DFA) are computed, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR patterns of 951 asystole patients were analyzed for the quality of CPR delivered. There was no significant difference in the CPR-related IMF peak-to-peak interval analysis between patients younger and older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, a difference was noted for the CI (p < 0.05). The results show that patients younger than 60 years of age have a higher survival rate, associated with high complexity of the CPR-IMF amplitude differences. PMID:27529068

  7. Improving the empirical model for plasma nitrided AISI 316L corrosion resistance based on Mössbauer spectroscopy

    NASA Astrophysics Data System (ADS)

    Campos, M.; de Souza, S. D.; de Souza, S.; Olzon-Dionysio, M.

    2011-11-01

    Traditional plasma nitriding treatments using temperatures ranging from approximately 650 to 730 K can improve the wear resistance, corrosion resistance and surface hardness of stainless steels. The nitrided layer consists of several iron nitrides: the cubic γ' phase (Fe4N), the hexagonal ε phase (Fe2-3N) and a nitrogen-supersaturated solid phase γN. An empirical model has been proposed to explain the corrosion resistance of nitrided AISI 316L and ASTM F138 samples based on Mössbauer spectroscopy results: the larger the ratio between the ε and γ' phase fractions of a sample, the better its corrosion resistance. In this work, this model is examined using new results for AISI 316L samples nitrided under the same gas composition and temperature as before, but at a different pressure, for 3, 4 and 5 h. The sample nitrided for 4 h, whose ε/γ' ratio is maximum (= 0.73), shows a slightly better response than the other two samples, nitrided for 5 and 3 h (ε/γ' = 0.72 and 0.59, respectively). Moreover, these samples show very similar behavior; this set of samples was therefore not suitable for testing the empirical model. However, the comparison between the present potentiodynamic polarization curves and those obtained previously at 4 and 4.5 torr could indicate that the corrosion resistance of the sample presenting only the γN phase was the worst of all. The empirical model in its current form thus cannot fully explain the corrosion response and should be improved by including the γN phase.

  8. Empirical evidence for identical band gaps in substituted C60 and C70 based fullerenes

    SciTech Connect

    Mattias Andersson, L.; Tanaka, Hideyuki

    2014-01-27

    Optical absorptance data, and a strong correlation between solar cell open-circuit voltages and the ionization potentials of a wide range of differently substituted fullerene acceptors, are presented as empirical evidence for identical, or at least very similar, band gaps in all substituted C60- and C70-based fullerenes. Both the number and kind of substituents in this study are sufficiently varied to imply generality. While the band gaps of the fullerenes remain the same for all the different substitutions, their ionization potentials vary greatly, spanning more than 0.4 eV. The merits and drawbacks of using these results together with photoelectron-based techniques to determine relative fullerene energy levels for, e.g., organic solar cell applications, compared to more direct electrochemical methods, are also discussed.

  9. An all-atom structure-based potential for proteins: bridging minimal models with all-atom empirical forcefields.

    PubMed

    Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N

    2009-05-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Go) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function. PMID:18837035
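
The native-contact energetics of structure-based (Go) models can be sketched with the Lennard-Jones-like 10-12 form often used for such contacts. This is an illustrative sketch under assumed parameters, not the exact potential of this paper.

```python
import math

def go_contact_energy(r, r_native, eps=1.0):
    """Lennard-Jones-like 10-12 native-contact term of the kind used in
    structure-based (Go) models: V(r) = eps*(5*(r0/r)**12 - 6*(r0/r)**10),
    with a minimum of depth -eps exactly at the native distance r_native.
    Units and parameter values here are illustrative."""
    s = r_native / r
    return eps * (5.0 * s**12 - 6.0 * s**10)

v_min = go_contact_energy(1.0, 1.0)    # -eps at the native separation
v_near = go_contact_energy(1.05, 1.0)  # shallower just off the minimum
```

Placing the energy minimum at the native separation for every contact is what biases the funneled landscape toward the folded structure.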

  10. Changing Healthcare Providers’ Behavior during Pediatric Inductions with an Empirically-based Intervention

    PubMed Central

    Martin, Sarah R.; Chorney, Jill MacLaren; Tan, Edwin T.; Fortier, Michelle A.; Blount, Ronald L.; Wald, Samuel H.; Shapiro, Nina L.; Strom, Suzanne L.; Patel, Swati; Kain, Zeev N.

    2011-01-01

    Background: Each year over 4 million children experience significant levels of preoperative anxiety, which has been linked to poor recovery outcomes. Healthcare providers (HCPs) and parents represent key resources for children to help them manage their preoperative anxiety. The present study reports on the development and preliminary feasibility testing of a new intervention designed to change HCP and parent perioperative behaviors that have previously been reported to be associated with children's coping and stress behaviors before surgery. Methods: An empirically derived intervention, Provider-Tailored Intervention for Perioperative Stress, was developed to train HCPs to increase behaviors that promote children's coping and decrease behaviors that may exacerbate children's distress. Rates of HCP behaviors were coded and compared between pre-intervention and post-intervention. Additionally, rates of behaviors of parents who interacted with HCPs before training were compared with those of parents interacting with HCPs after the intervention. Results: Effect sizes indicated that HCPs who underwent training demonstrated increases in rates of desired behaviors (range: 0.22 to 1.49) and decreases in rates of undesired behaviors (range: 0.15 to 2.15). Additionally, parents, who were indirectly trained, also demonstrated changes in their rates of desired (range: 0.30 to 0.60) and undesired behaviors (range: 0.16 to 0.61). Conclusions: The intervention successfully modified HCP and parent behaviors and represents a potentially new clinical approach to decreasing anxiety in children. A multi-site randomized controlled trial, recently funded by the National Institute of Child Health and Human Development, is about to start and will examine the efficacy of this intervention in reducing children's preoperative anxiety and improving their postoperative recovery. PMID:21606826