Science.gov

Sample records for based empirical survey

  1. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS.

  2. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi R.

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555.

  3. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  4. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  5. Experts' attitudes towards medical futility: an empirical survey from Japan

    PubMed Central

    Bagheri, Alireza; Asai, Atsushi; Ida, Ryuichi

    2006-01-01

    Background The current debate about medical futility is mostly driven by theoretical and personal perspectives, and there is a lack of empirical data documenting experts' and public attitudes towards medical futility. Methods To examine the attitudes of Japanese experts in fields relevant to medical futility, a questionnaire survey was conducted among the members of the Japan Association for Bioethics. A total of 108 completed questionnaires were returned, giving a response rate of 50.9%. Among the respondents, 62% were healthcare professionals (HCPs) and 37% were non-healthcare professionals (Non-HCPs). Results The majority of respondents (67.6%) believed that a physician's refusal to provide or continue a treatment on the grounds of a futility judgment could never be morally justified, but 22.2% approved such refusal with conditions. In the case of physiologically futile care, three-quarters believed that a physician should inform the patient/family of his futility judgment and that it would be the patient who could decide what should be done next, based on his/her value judgment. However, more than 10% said that a physician should ask about a patient's values and goals, but that the final decision was left to the doctor, not the patient. There was no statistically significant difference between HCPs and Non-HCPs (p = 0.676). Of respondents, 67.6% believed that practical guidelines set up by the health authority would be helpful in futility judgment. Conclusion The results show that there is no support for physicians' unilateral decision-making on futile care. This survey highlights medical futility as an emerging issue in Japanese healthcare and emphasizes the need for public discussion and policy development. PMID:16764732

  6. Acculturating human experimentation: an empirical survey in France

    PubMed Central

    Amiel, Philippe; Mathieu, Séverine; Fagot-Largeault, Anne

    2001-01-01

    Preliminary results of an empirical study of human experimentation practices are presented and contrasted with those of a survey conducted a hundred years ago, when clinical research, although tolerated, was culturally deviant. Now that biomedical research is both authorized and controlled, its actors (sponsors, committees, investigators, subjects) bring heterogeneous rationalities to it, and they appear to be engaged in a transactional process of negotiating their rationales with one another. In the European context, which is “protective” of subjects, the subjects we interviewed (especially patient-subjects) were surprisingly creative and revealed an aptitude for integrating experimental medicine into common culture. PMID:11445883

  7. An empirical Bayes approach to analyzing recurring animal surveys

    USGS Publications Warehouse

    Johnson, D.H.

    1989-01-01

    Recurring estimates of the size of animal populations are often required by biologists or wildlife managers. Because of cost or other constraints, estimates frequently lack the accuracy desired but cannot readily be improved by additional sampling. This report proposes a statistical method employing empirical Bayes (EB) estimators as alternatives to those customarily used to estimate population size, and evaluates them by a subsampling experiment on waterfowl surveys. EB estimates, especially a simple limited-translation version, were more accurate and provided shorter confidence intervals with greater coverage probabilities than customary estimates.
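    The shrinkage scheme this abstract describes can be sketched as follows. This is a generic illustration only: the function name, the method-of-moments split of total variance into prior and sampling components, and the `max_shift` cap standing in for the paper's "limited-translation" variant are all assumptions, not the authors' exact estimator.

```python
import statistics

def eb_shrink(estimates, sampling_var, max_shift=None):
    """Empirical Bayes shrinkage of raw survey estimates toward the grand mean.

    estimates    : raw population estimates, one per site or year
    sampling_var : (assumed known) sampling variance of each raw estimate
    max_shift    : optional cap on |EB - raw|, a 'limited-translation' variant
    """
    grand = statistics.fmean(estimates)
    total_var = statistics.variance(estimates)       # between-site + sampling
    prior_var = max(total_var - sampling_var, 0.0)   # method-of-moments split
    b = prior_var / (prior_var + sampling_var)       # shrinkage weight in [0, 1]
    out = []
    for y in estimates:
        eb = grand + b * (y - grand)                 # shrink toward grand mean
        if max_shift is not None:
            # limited translation: never move more than max_shift from the raw value
            eb = min(max(eb, y - max_shift), y + max_shift)
        out.append(eb)
    return out
```

    For example, `eb_shrink([120, 95, 160, 140, 80], sampling_var=400.0)` pulls each noisy count part of the way toward the grand mean of 119, and adding `max_shift=5.0` bounds how far any single estimate can be moved, which is the point of the limited-translation idea: protect extreme (and possibly genuine) values from over-shrinkage.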

  8. Empirical estimates of cumulative refraction errors associated with procedurally constrained levelings based on the Gaithersburg-Tucson Refraction Tests of the National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Castle, Robert O.; Gilmore, Thomas D.; Mark, Robert K.; Shaw, Roger H.

    1985-05-01

    Analyses of results of the National Geodetic Survey's leveling refraction tests indicate that the standard deviation about the mean (σ) for high-scale minus low-scale rod readings closely correlates with measured refraction error. Use of this relation in conjunction with values for σ obtained from routinely constrained surveys provides a basis for estimating the refraction error associated with levelings of stipulated order and class.

  9. Empirical estimates of cumulative refraction errors associated with procedurally constrained levelings based on the Gaithersburg- Tucson refraction tests of the National Geodetic Survey.

    USGS Publications Warehouse

    Castle, R.O.; Gilmore, T.D.; Mark, R.K.; Shaw, R.H.

    1985-01-01

    Analyses of results of the National Geodetic Survey's leveling refraction tests indicate that the standard deviation about the mean (σ) for high-scale minus low-scale rod readings closely correlates with measured refraction error. Use of this relation in conjunction with values for σ obtained from routinely constrained surveys provides a basis for estimating the refraction error associated with levelings of stipulated order and class. - Authors

  10. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  11. Searching for Evidence-Based Practice: A Survey of Empirical Studies on Curricular Interventions Measuring and Reporting Fidelity of Implementation Published during 2004-2013

    ERIC Educational Resources Information Center

    Missett, Tracy C.; Foster, Lisa H.

    2015-01-01

    In an environment of accountability, the development of evidence-based practices is expected. To demonstrate that a practice is evidence based, quality indicators of rigorous methodology should be present including indications that teachers implementing an intervention have done so with fidelity to its design. Because evidence-based practices…

  12. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs. © Springer Science + Business Media, Inc. 2005.

  13. Functional somatic symptoms and hypochondriasis. A survey of empirical studies.

    PubMed

    Kellner, R

    1985-08-01

    Empirical studies suggest the following main conclusions: functional somatic symptoms are extremely common; a large proportion appear to be caused by physiologic activity and tend to be aggravated by emotion. Hypochondriacal patients misunderstand the nature and significance of these symptoms and believe that they are evidence of serious disease. Hypochondriasis can be a part of another syndrome, usually an affective one, or it can be a primary disorder. The prevalence differs between cultures and social classes. Constitutional factors, disease in the family in childhood, and previous disease predispose to hypochondriasis. Various stressors can be precipitating events. Selective perception of symptoms, motivated by fear of disease, and subsequent increase in anxiety with more somatic symptoms appear to be links in the vicious cycle of the hypochondriacal reaction. Psychotherapy as well as psychotropic drugs are effective in the treatment of functional somatic symptoms. There are no adequate controlled studies of psychotherapy in hypochondriasis, and the recommended treatments are based on studies with similar disorders. The prognosis of treated hypochondriasis is good in a substantial proportion of patients. PMID:2861797

  14. Empirical validation and application of the computing attitudes survey

    NASA Astrophysics Data System (ADS)

    Dorn, Brian; Tew, Allison Elliott

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning Attitudes about Science Survey and measures novice to expert attitude shifts about the nature of knowledge and problem solving in computer science. Factor analysis with a large, multi-institutional data-set identified and confirmed five subscales on the CAS related to different facets of attitudes measured on the survey. We then used the CAS in a pre-post format to demonstrate its usefulness in studying attitude shifts during CS1 courses and its responsiveness to varying instructional conditions. The most recent version of the CAS is provided in its entirety along with a discussion of the conditions under which its validity has been demonstrated.

  15. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  16. An Empirical Examination of IRT Information for School Climate Surveys

    ERIC Educational Resources Information Center

    Mo, Lun; Yang, Fang; Hu, Xiangen

    2011-01-01

    School climate surveys are widely applied in school districts across the nation to collect information about teacher efficacy, principal leadership, school safety, students' activities, and so forth. They enable school administrators to understand and address many issues on campus when used in conjunction with other student and staff data.…

  17. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  18. Emotional Risks to Respondents in Survey Research: Some Empirical Evidence

    PubMed Central

    Labott, Susan M.; Johnson, Timothy P.; Fendrich, Michael; Feeny, Norah C.

    2014-01-01

    Some survey research has documented distress in respondents with pre-existing emotional vulnerabilities, suggesting the possibility of harm. In this study, respondents were interviewed about a personally distressing event; mood, stress, and emotional reactions were assessed. Two days later, respondents participated in interventions to either enhance or alleviate the effects of the initial interview. Results indicated that distressing interviews increased stress and negative mood, although no adverse events occurred. Between the interviews, moods returned to baseline. Respondents who again discussed a distressing event reported moods more negative than those who discussed a neutral or a positive event. This study provides evidence that, among nonvulnerable survey respondents, interviews on distressing topics can result in negative moods and stress, but they do not harm respondents. PMID:24169422

  19. School-Based Management and Paradigm Shift in Education an Empirical Study

    ERIC Educational Resources Information Center

    Cheng, Yin Cheong; Mok, Magdalena Mo Ching

    2007-01-01

    Purpose: This paper aims to report empirical research investigating how school-based management (SBM) and paradigm shift (PS) in education are closely related to teachers' student-centered teaching and students' active learning in a sample of Hong Kong secondary schools. Design/methodology/approach: It is a cross-sectional survey research…

  20. An Empirical Keying Approach to Academic Prediction with the Guilford-Zimmerman Temperament Survey

    ERIC Educational Resources Information Center

    Jacobs, Rick; Manese, Wilfredo

    1977-01-01

    The present study was conducted to establish a scoring key for the Guilford Zimmerman Temperament Survey appropriate for predicting academic performance. Results point to the utility of non-cognitive measures in predicting academic performance, particularly when scoring keys tailored to the specific situation are empirically derived. Suggestions…

  1. Modelling complex survey data with population level information: an empirical likelihood approach

    PubMed Central

    Oguz-Alper, M.; Berger, Y. G.

    2016-01-01

    Survey data are often collected with unequal probabilities from a stratified population. In many modelling situations, the parameter of interest is a subset of a set of parameters, with the others treated as nuisance parameters. We show that in this situation the empirical likelihood ratio statistic follows a chi-squared distribution asymptotically, under stratified single and multi-stage unequal probability sampling, with negligible sampling fractions. Simulation studies show that the empirical likelihood confidence interval may achieve better coverages and has more balanced tail error rates than standard approaches involving variance estimation, linearization or resampling. PMID:27279669
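    The empirical likelihood ratio statistic referred to above can be illustrated in its simplest setting, the mean of an i.i.d. sample, rather than the stratified unequal-probability survey case the paper actually treats. The sketch below is that textbook case only: it profiles out the weights by solving the usual Lagrange-multiplier equation by bisection, and all names are illustrative.

```python
import math

def el_log_ratio(x, mu, tol=1e-12):
    """-2 log empirical-likelihood ratio for the mean mu of sample x.

    Valid only when mu lies strictly inside (min(x), max(x)). Solves
    sum z_i / (1 + lam * z_i) = 0 for the Lagrange multiplier lam
    (z_i = x_i - mu) by bisection, then returns 2 * sum log(1 + lam * z_i),
    which is asymptotically chi-squared with 1 degree of freedom.
    """
    z = [xi - mu for xi in x]
    zmax, zmin = max(z), min(z)
    if not (zmin < 0 < zmax):
        raise ValueError("mu must lie strictly inside the sample range")
    # admissible lam keeps every implied weight 1 + lam*z_i positive
    lo = -1.0 / zmax * (1 - 1e-10)
    hi = -1.0 / zmin * (1 - 1e-10)
    def g(lam):
        return sum(zi / (1.0 + lam * zi) for zi in z)
    # g is strictly decreasing in lam, so bisection finds the unique root
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log1p(lam * zi) for zi in z)
```

    The statistic is zero at the sample mean and grows as the hypothesized mean moves away from it; comparing it against a chi-squared quantile gives the confidence intervals whose coverage the paper's simulations evaluate.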

  2. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  3. Survey-Based Measurement of Public Management and Policy Networks

    ERIC Educational Resources Information Center

    Henry, Adam Douglas; Lubell, Mark; McCoy, Michael

    2012-01-01

    Networks have become a central concept in the policy and public management literature; however, theoretical development is hindered by a lack of attention to the empirical properties of network measurement methods. This paper compares three survey-based methods for measuring organizational networks: the roster, the free-recall name generator, and…

  4. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings. PMID:26980128

  5. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    ERIC Educational Resources Information Center

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  6. Image-Based Empirical Modeling of the Plasmasphere

    NASA Technical Reports Server (NTRS)

    Adrian, Mark L.; Gallagher, D. L.

    2008-01-01

    A new suite of empirical models of plasmaspheric plasma based on remote, global images from the IMAGE EUV instrument is proposed for development. The purpose of these empirical models is to establish the statistical properties of the plasmasphere as a function of conditions. This suite of models will mark the first time the plasmaspheric plume is included in an empirical model. Development of these empirical plasmaspheric models will support synoptic studies (such as for wave propagation and growth, energetic particle loss through collisions and dust transport as influenced by charging) and serves as a benchmark against which physical models can be tested. The ability to know that a specific global density distribution occurs in response to specific magnetospheric and solar wind factors is a huge advantage over all previous in-situ based empirical models. The consequence of creating these new plasmaspheric models will be to provide much higher fidelity and much richer quantitative descriptions of the statistical properties of plasmaspheric plasma in the inner magnetosphere, whether that plasma is in the main body of the plasmasphere, nearby during recovery or in the plasmaspheric plume. Model products to be presented include statistical probabilities for being in the plasmasphere, near thermal He+ density boundaries and the complexity of its spatial structure.

  7. Responses to Commentaries on Advances in Empirically Based Assessment.

    ERIC Educational Resources Information Center

    McConaughy, Stephanie H.

    1993-01-01

    Author of article (this issue) describing research program to advance assessment of children's behavioral and emotional problems; presenting conceptual framework for multiaxial empirically based assessment; and summarizing research efforts to develop cross-informant scales for scoring parent, teacher, and self-reports responds to commentaries on…

  8. Denoising ECG signal based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Zhi-dong, Zhao; Liu, Juan; Wang, Sheng-tao

    2011-10-01

    The electrocardiogram (ECG) has been used extensively for the detection of heart disease. Frequently the signal is corrupted by various kinds of noise, such as muscle noise, electromyogram (EMG) interference, and instrument noise. In this paper, a new ECG denoising method is proposed based on the recently developed ensemble empirical mode decomposition (EEMD). The noisy ECG signal is decomposed into a series of intrinsic mode functions (IMFs). The statistically significant information content is built by the empirical energy model of the IMFs. A noisy ECG signal collected from a clinical recording is processed using the method. The results show that, in contrast to traditional methods, the novel denoising method can achieve optimal denoising of the ECG signal.
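    The defining step of EEMD is noise-assisted ensemble averaging: decompose many independently noise-perturbed copies of the signal and average the corresponding modes so the added noise cancels. The sketch below shows only that step; the two-band moving-average split is a crude stand-in for real EMD sifting (which extracts IMFs via envelope interpolation), and every name here is illustrative rather than taken from the paper.

```python
import random

def smooth(x, w=5):
    # moving average used as a stand-in "slow mode" extractor;
    # real EEMD extracts modes by EMD sifting, not a fixed filter
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def decompose(x):
    slow = smooth(x)
    fast = [xi - si for xi, si in zip(x, slow)]
    return fast, slow          # two "modes" standing in for a set of IMFs

def eemd_like(x, n_trials=200, noise_std=0.1, seed=0):
    """Noise-assisted ensemble averaging, the core idea of EEMD:
    add white noise, decompose each noisy copy, average the modes."""
    rng = random.Random(seed)
    n = len(x)
    acc_fast, acc_slow = [0.0] * n, [0.0] * n
    for _ in range(n_trials):
        noisy = [xi + rng.gauss(0.0, noise_std) for xi in x]
        fast, slow = decompose(noisy)
        acc_fast = [a + f for a, f in zip(acc_fast, fast)]
        acc_slow = [a + s for a, s in zip(acc_slow, slow)]
    return ([a / n_trials for a in acc_fast],
            [a / n_trials for a in acc_slow])
```

    Because the added noise has zero mean, averaging over trials recovers modes close to those of the clean signal while discouraging the mode-mixing that plain EMD suffers from; the paper's method then keeps or thresholds IMFs according to its empirical energy model, which is not reproduced here.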

  9. Space Based Dark Energy Surveys

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    2016-03-01

    Dark energy, the name given to the cause of the accelerating expansion of the Universe, is one of the most tantalizing mysteries in modern physics. Current cosmological models hold that dark energy is now the dominant component of the Universe, but the exact nature of DE remains poorly understood. There are ambitious ground-based surveys underway that seek to understand DE, and NASA is participating in the development of significantly more ambitious space-based surveys planned for the next decade. NASA has provided mission-enabling technology to the European Space Agency's (ESA) Euclid mission in exchange for the participation of US scientists in the Euclid mission. NASA is also developing the Wide Field Infrared Survey Telescope-Astrophysics Focused Telescope Asset (WFIRST) mission for possible launch in 2024. WFIRST was the highest-ranked space mission in the Astro2010 Decadal Survey, and the current design uses a 2.4 m space telescope to go beyond what was then envisioned. Understanding DE is one of the primary science goals of WFIRST-AFTA. This talk will review the state of DE, the relevant activities of the Cosmic Structure Interest Group (CoSSIG) of the PhysPAG, and detail the status of, and complementarity between, Euclid, WFIRST, and other ambitious ground-based efforts.

  10. Unsupervised self-organized mapping: a versatile empirical tool for object selection, classification and redshift estimation in large surveys

    NASA Astrophysics Data System (ADS)

    Geach, James E.

    2012-01-01

    We present an application of unsupervised machine learning - the self-organized map (SOM) - as a tool for visualizing, exploring and mining the catalogues of large astronomical surveys. Self-organization culminates in a low-resolution representation of the 'topology' of a parameter volume, and this can be exploited in various ways pertinent to astronomy. Using data from the Cosmological Evolution Survey (COSMOS), we demonstrate two key astronomical applications of the SOM: (i) object classification and selection, using galaxies with active galactic nuclei as an example, and (ii) photometric redshift estimation, illustrating how SOMs can be used as totally empirical predictive tools. With a training set of ˜3800 galaxies with z_spec ≤ 1, we achieve photometric redshift accuracies competitive with other (mainly template-fitting) techniques that use a similar number of photometric bands [σ(Δz) = 0.03 with a ˜2 per cent outlier rate when using u* band to 8 μm photometry]. We also test the SOM as a photo-z tool using the PHoto-z Accuracy Testing (PHAT) synthetic catalogue of Hildebrandt et al., which compares several different photo-z codes using a common input/training set. We find that the SOM can deliver accuracies that are competitive with many of the established template-fitting and empirical methods. This technique is not without clear limitations, which are discussed, but we suggest it could be a powerful tool in the era of extremely large ('petabyte') databases, where efficient data mining is a paramount concern.
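    The SOM-as-empirical-photo-z idea can be sketched with a minimal map: train node weights on a feature space, then label each node with the mean spectroscopic redshift of the training objects it attracts, and predict by node lookup. The sketch below uses a toy one-dimensional "colour" feature and tiny 4x4 map, far from the multi-band COSMOS setup in the paper; all function names, the learning-rate and radius schedules, and the mean-per-node labeling rule are illustrative choices.

```python
import math, random

def train_som(data, rows=4, cols=4, iters=2000, seed=0):
    """Train a tiny rectangular SOM on `data` (a list of feature vectors)."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    pos = [(i // cols, i % cols) for i in range(rows * cols)]
    for t in range(iters):
        frac = t / iters
        lr = 0.5 * (1 - frac)                        # decaying learning rate
        radius = max(rows, cols) / 2 * (1 - frac) + 0.5
        x = rng.choice(data)
        bmu = min(range(len(nodes)),                 # best-matching unit
                  key=lambda k: sum((a - b) ** 2 for a, b in zip(nodes[k], x)))
        for k, w in enumerate(nodes):
            d2 = (pos[k][0] - pos[bmu][0]) ** 2 + (pos[k][1] - pos[bmu][1]) ** 2
            h = math.exp(-d2 / (2 * radius * radius))   # neighbourhood kernel
            nodes[k] = [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]
    return nodes

def bmu_index(nodes, x):
    return min(range(len(nodes)),
               key=lambda k: sum((a - b) ** 2 for a, b in zip(nodes[k], x)))

def predict_z(nodes, train_x, train_z, x):
    """Empirical photo-z: mean spec-z of training objects sharing x's node."""
    target = bmu_index(nodes, x)
    zs = [z for xi, z in zip(train_x, train_z) if bmu_index(nodes, xi) == target]
    return sum(zs) / len(zs) if zs else sum(train_z) / len(train_z)
```

    The prediction is "totally empirical" in the abstract's sense: no template or physical model enters, only the training set's redshift distribution within each cell of the map.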

  11. Advanced airfoil design empirically based transonic aircraft drag buildup technique

    NASA Technical Reports Server (NTRS)

    Morrison, W. D., Jr.

    1976-01-01

    To systematically investigate the potential of advanced airfoils in advanced preliminary design studies, empirical relationships were derived, based on available wind-tunnel test data, through which total drag is determined, recognizing all major aircraft geometric variables. This technique recognizes a single design lift coefficient and Mach number for each aircraft. Using this technique, drag polars are derived for all Mach numbers up to M_design + 0.05 and for lift coefficients from C_L,design - 0.40 to C_L,design + 0.20.

  12. The fundamental points of the Second Military Survey of the Habsburg Empire

    NASA Astrophysics Data System (ADS)

    Timár, G.

    2009-04-01

    The Second Military Survey was carried out between 1806 and 1869 in the continuously changing territory of the Habsburg Empire. More than 4000 tiles of the 1:28,800 scale survey sheets cover the Empire, which was then the second largest in Europe in territorial extent, after Russia. Cartographically, the Empire was divided into eight zones; each zone had its own Cassini-type projection centred on a geodetically defined fundamental point. The points were the following: - Wien-Stephansdom (valid for Lower and Upper Austria, Hungary, Dalmatia, Moravia and Vorarlberg; latitude=48.20910N; longitude=16.37655E on local datum). - Gusterberg (valid for Bohemia; latitude=48.03903N; longitude=14.13976E). - Schöcklberg (valid for Styria; latitude=47.19899N; longitude=15.46902E). - Krimberg (valid for Illyria and Coastal Land; latitude=45.92903N; longitude=14.47423E). - Löwenberg (valid for Galizia and Bukovina; latitude=49.84889N; longitude=24.04639E). - Hermannstadt (valid for Transylvania; latitude=45.84031N; longitude=24.11297). - Ivanić (valid for Croatia; latitude=45.73924N; longitude=16.42309). - Milano, Duomo San Salvatore (valid for Lombardy, Venezia, Parma and Modena; latitude=45.45944N; longitude=9.18757E) - a simulated fundamental point for Tyrol and Salzburg, several hundred miles north of the territories. The poster systematically presents the fundamental points, their topographic settings and the present situation of the geodetic point sites.

  13. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice

    PubMed Central

    Shean, Glenn D.

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for changes needed to foster implementation are discussed. Limitations of PORT therapy listings that are based on therapy outcome efficacy studies are discussed, and cross-cultural and course and outcome studies of correlates of recovery are summarized. PMID:23738068

  15. An empirical analysis of the demand for sleep: Evidence from the American Time Use Survey.

    PubMed

    Ásgeirsdóttir, Tinna Laufey; Ólafsson, Sigurður Páll

    2015-12-01

    Using data from the American Time Use Survey, this paper empirically examined the demand for sleep, with special attention to its opportunity cost represented by wages. Variation in the unemployment rate by state was also used to investigate the cyclical nature of sleep duration. We conducted separate estimations for males and females, as well as for those who received a fixed salary and hourly wages. The findings predominantly revealed no relationship between sleep duration and the business cycle. However, an inverse relationship between sleep duration and wages was detected. This is in accordance with sleep duration being an economic choice variable, rather than a predetermined subtraction of the 24-h day. Although the inverse relationship was not significant in all the estimations for salaried subjects, it was consistent and strong for subjects who received hourly wages. For instance, elasticity measures were −.03 for those who received hourly wages and −.003 for those who received a fixed salary. PMID:26603429

  16. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351
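
    As a minimal illustration of the kind of model the article discusses, the sketch below implements a toy Schelling-style segregation model: agents of two types move when too few of their neighbors share their type. All parameters (grid size, satisfaction threshold, step count) are arbitrary choices for the example, not values from the article.

```python
import random

def schelling(width=20, height=20, empty_frac=0.2, threshold=0.5,
              steps=50, seed=42):
    """Toy Schelling segregation model on a toroidal grid.
    Cell values: 0 = empty, 1 and 2 = the two agent types."""
    rng = random.Random(seed)
    cells = [0] * int(width * height * empty_frac)
    rest = width * height - len(cells)
    cells += [1] * (rest // 2) + [2] * (rest - rest // 2)
    rng.shuffle(cells)
    grid = [cells[i * width:(i + 1) * width] for i in range(height)]

    def unhappy(x, y):
        """An agent is unhappy if the like-type share of its
        occupied neighbors falls below the threshold."""
        me = grid[y][x]
        if me == 0:
            return False
        like = other = 0
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dx == dy == 0:
                    continue
                n = grid[(y + dy) % height][(x + dx) % width]
                if n == me:
                    like += 1
                elif n != 0:
                    other += 1
        total = like + other
        return total > 0 and like / total < threshold

    for _ in range(steps):
        movers = [(x, y) for y in range(height) for x in range(width)
                  if unhappy(x, y)]
        empties = [(x, y) for y in range(height) for x in range(width)
                   if grid[y][x] == 0]
        if not movers:
            break
        x, y = rng.choice(movers)        # one unhappy agent relocates
        ex, ey = rng.choice(empties)     # to a random empty cell
        grid[ey][ex], grid[y][x] = grid[y][x], 0
    return grid
```

    Empirical calibration, in the spirit of the article, would replace the arbitrary threshold and population mix with values estimated from data.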

  17. An empirically based electrosource horizon lead-acid battery model

    SciTech Connect

    Moore, S.; Eshani, M.

    1996-09-01

    An empirically based mathematical model of a lead-acid battery for use in Texas A&M University's Electrically Peaking Hybrid (ELPH) computer simulation is presented. The battery model is intended to overcome intuitive difficulties with currently available models by employing direct relationships between state of charge, voltage, and power demand. The model input is the power demand or load. Model outputs include voltage, an instantaneous battery efficiency coefficient, and a state-of-charge indicator. A time- and current-dependent voltage hysteresis is employed to ensure correct voltage tracking under the highly transient loads inherent in a hybrid electric drivetrain.
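
    The relationships the abstract describes (voltage and efficiency tied directly to state of charge and power demand) can be sketched as a single simulation step. The polynomial coefficients below are illustrative placeholders, not the Electrosource Horizon fits from the paper.

```python
def battery_step(soc, power_w, dt_s, capacity_wh=1200.0):
    """One step of a toy empirical battery model.
    Returns (new state of charge, terminal voltage, efficiency)."""
    v_oc = 11.0 + 2.0 * soc                  # open-circuit voltage fit
    if power_w > 0:                          # discharging
        eff = 0.85 + 0.10 * soc              # efficiency falls as SOC drops
        energy_wh = power_w * dt_s / 3600.0 / eff
    else:                                    # regenerative charging
        eff = 0.95
        energy_wh = power_w * dt_s / 3600.0 * eff
    soc = min(1.0, max(0.0, soc - energy_wh / capacity_wh))
    sag = 0.002 * power_w / max(v_oc, 1.0)   # crude ohmic voltage sag
    return soc, v_oc - sag, eff
```

    A drivetrain simulation would call this once per timestep with the instantaneous power demand, exactly the input/output split the abstract describes.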

  18. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with
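
    The simulation approach described above can be sketched in a few lines: generate detection histories for sites occupied with probability ψ and detected daily with probability p, then recover (ψ, p) by maximizing the standard single-season occupancy likelihood. This is a simplified grid-search illustration, not the authors' code; site counts and parameter values are arbitrary.

```python
import math
import random

def simulate(n_sites, n_days, psi, p, rng):
    """Detection histories: site occupied w.p. psi; if occupied,
    detected each day independently w.p. p."""
    return [[1 if occ and rng.random() < p else 0
             for _ in range(n_days)]
            for occ in (rng.random() < psi for _ in range(n_sites))]

def fit_occupancy(hist):
    """Grid-search maximum likelihood for (psi, p) under the basic
    single-season occupancy model."""
    n_days = len(hist[0])
    detections = [sum(h) for h in hist]
    best, best_ll = None, -math.inf
    for i in range(1, 99, 2):
        for j in range(1, 99, 2):
            psi, p = i / 100, j / 100
            ll = 0.0
            for d in detections:
                if d > 0:   # site certainly occupied
                    ll += (math.log(psi) + d * math.log(p)
                           + (n_days - d) * math.log(1 - p))
                else:       # occupied-but-missed, or truly unoccupied
                    ll += math.log(psi * (1 - p) ** n_days + 1 - psi)
            if ll > best_ll:
                best, best_ll = (psi, p), ll
    return best
```

    Repeating this over many simulated datasets for each (sites, days) combination reproduces the kind of design comparison the study performs.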

  20. Comparison between empirical and physically based models of atmospheric correction

    NASA Astrophysics Data System (ADS)

    Mandanici, E.; Franci, F.; Bitelli, G.; Agapiou, A.; Alexakis, D.; Hadjimitsis, D. G.

    2015-06-01

    A number of methods have been proposed for the atmospheric correction of multispectral satellite images, based either on atmosphere modelling or on the images themselves. Full radiative transfer models require a lot of ancillary information about the atmospheric conditions at acquisition time, whereas image-based methods cannot account for all the phenomena involved. The aim of this paper is therefore the comparison of different atmospheric correction methods for multispectral satellite images. The experiments were carried out on a study area located in the catchment of the Yialias river, 20 km south of Nicosia, the capital of Cyprus. The following models, both empirical and physically based, were tested: dark object subtraction, QUAC, empirical line, 6SV, and FLAASH. They were applied to a Landsat 8 multispectral image. The spectral signatures of ten different land cover types were measured during a field campaign in 2013, and 15 samples were collected for laboratory measurements in a second campaign in 2014. A GER 1500 spectroradiometer was used; this instrument records electromagnetic radiation from 350 to 1050 nm in 512 channels, each covering about 1.5 nm. The measured spectral signatures were used to simulate reflectance values for the multispectral sensor bands by applying relative spectral response filters. These data were taken as ground truth to assess the accuracy of the different image correction models. The results do not establish which method is the most accurate: the physics-based methods better describe the shape of the signatures, whereas the image-based models perform better regarding the overall albedo.
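
    Of the models compared, dark object subtraction is the simplest to illustrate: the darkest pixels in a band are assumed to have near-zero surface reflectance, so their value is attributed to atmospheric path radiance and subtracted from the whole band. A minimal sketch, not the exact implementation used in the paper:

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.1):
    """Subtract the dark-object value (a low percentile of the band)
    from every pixel, clipping negatives to zero."""
    dark = np.percentile(band, percentile)
    return np.clip(band - dark, 0, None)
```

    Physically based models such as 6SV or FLAASH replace this single scene-derived offset with a full radiative transfer computation, which is why they need the ancillary atmospheric data the abstract mentions.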

  1. State of Research on Giftedness and Gifted Education: A Survey of Empirical Studies Published during 1998-2010 (April)

    ERIC Educational Resources Information Center

    Dai, David Yun; Swanson, Joan Ann; Cheng, Hongyu

    2011-01-01

    This study surveyed 1,234 empirical studies on giftedness, gifted education, and creativity during 1998-2010 (April), using PsycINFO database and targeted journals as main sources, with respect to main topics these studies focused on, methods they used for investigation, and the conceptual spaces they traversed. Four main research topics emerged…

  2. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least-squares regression technique into a table of single-variable, second-degree polynomial equations giving torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, in this case a ratchet wrench push-and-pull operation. Presented here is a comparison of the computed (predicted) results of the model with the actual measured values for the composite motion.

  3. Developing an empirical base for clinical nurse specialist education.

    PubMed

    Stahl, Arleen M; Nardi, Deena; Lewandowski, Margaret A

    2008-01-01

    This article reports on the design of a clinical nurse specialist (CNS) education program using National Association of Clinical Nurse Specialists (NACNS) CNS competencies to guide CNS program clinical competency expectations and curriculum outcomes. The purpose is to contribute to the development of an empirical base for education and credentialing of CNSs. The NACNS CNS core competencies and practice competencies in all 3 spheres of influence guided the creation of clinical competency grids for this university's practicum courses. This project describes the development, testing, and application of these clinical competency grids that link the program's CNS clinical courses with the NACNS CNS competencies. These documents guide identification, tracking, measurement, and evaluation of the competencies throughout the clinical practice portion of the CNS program. This ongoing project will continue to provide data necessary to the benchmarking of CNS practice competencies, which is needed to evaluate the effectiveness of direct practice performance and the currency of graduate nursing education. PMID:18438164

  4. The Gaia-ESO Survey: Empirical determination of the precision of stellar radial velocities and projected rotation velocities

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Lewis, J.; Koposov, S. E.; Sacco, G. G.; Randich, S.; Gilmore, G.; Asplund, M.; Binney, J.; Bonifacio, P.; Drew, J. E.; Feltzing, S.; Ferguson, A. M. N.; Micela, G.; Neguerela, I.; Prusti, T.; Rix, H.-W.; Vallenari, A.; Alfaro, E. J.; Allende Prieto, C.; Babusiaux, C.; Bensby, T.; Blomme, R.; Bragaglia, A.; Flaccomio, E.; Francois, P.; Hambly, N.; Irwin, M.; Korn, A. J.; Lanzafame, A. C.; Pancino, E.; Recio-Blanco, A.; Smiljanic, R.; Van Eck, S.; Walton, N.; Bayo, A.; Bergemann, M.; Carraro, G.; Costado, M. T.; Damiani, F.; Edvardsson, B.; Franciosini, E.; Frasca, A.; Heiter, U.; Hill, V.; Hourihane, A.; Jofré, P.; Lardo, C.; de Laverny, P.; Lind, K.; Magrini, L.; Marconi, G.; Martayan, C.; Masseron, T.; Monaco, L.; Morbidelli, L.; Prisinzano, L.; Sbordone, L.; Sousa, S. G.; Worley, C. C.; Zaggia, S.

    2015-08-01

    Context. The Gaia-ESO Survey (GES) is a large public spectroscopic survey at the European Southern Observatory Very Large Telescope. Aims: A key aim is to provide precise radial velocities (RVs) and projected equatorial velocities (vsini) for representative samples of Galactic stars, which will complement information obtained by the Gaia astrometry satellite. Methods: We present an analysis to empirically quantify the size and distribution of uncertainties in RV and vsini using spectra from repeated exposures of the same stars. Results: We show that the uncertainties vary as simple scaling functions of signal-to-noise ratio (S/N) and vsini, that the uncertainties become larger with increasing photospheric temperature, but that the dependence on stellar gravity, metallicity and age is weak. The underlying uncertainty distributions have extended tails that are better represented by Student's t-distributions than by normal distributions. Conclusions: Parametrised results are provided, which enable estimates of the RV precision for almost all GES measurements, and estimates of the vsini precision for stars in young clusters, as a function of S/N, vsini and stellar temperature. The precision of individual high-S/N GES RV measurements is 0.22-0.26 km s^-1, dependent on instrumental configuration. Based on observations collected with the FLAMES spectrograph at the VLT/UT2 telescope (Paranal Observatory, ESO, Chile), for the Gaia-ESO Large Public Survey (188.B-3002). Full Table 2 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/580/A75

  5. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  6. Arduino based radiation survey meter

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Muzakkir, Amir; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee

    2016-01-01

    This paper presents the design of a new digital radiation survey meter based on an LND7121 Geiger-Müller tube detector and an ATmega328P microcontroller. The survey meter prototype is developed on the Arduino Uno platform. The microcontroller's 16-bit Timer1 is used as an external pulse counter to produce counts-per-second (CPS) measurements. The conversion from CPS to dose rate is also performed by the Arduino, which displays results in microsieverts per hour (µSv/h). The conversion factor (CF) for converting CPM to µSv/h derived from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 µSv/h.
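
    The count-to-dose-rate conversion described above is straightforward to sketch. The conversion factor below is a placeholder for illustration, not the LND7121 data-sheet value:

```python
def cps_to_dose_rate(counts, interval_s, cf_usvh_per_cpm=0.0057):
    """Convert raw GM-tube pulse counts over an interval to an
    estimated dose rate in µSv/h via a counts-per-minute (CPM)
    conversion factor."""
    cps = counts / interval_s   # counts per second
    cpm = cps * 60.0            # counts per minute
    return cpm * cf_usvh_per_cpm
```

    On the actual device this arithmetic runs on the microcontroller each display update, using the pulse total accumulated by Timer1.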

  7. Noise cancellation in IR video based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2013-05-01

    Currently there is a huge demand for simple low-cost IR cameras for both civil and military applications, among which one of the most common is the surveillance of restricted-access zones. The design of a low-cost IR camera must forgo several elements present in more sophisticated cameras, such as detector refrigeration and temperature control, as well as a mechanical modulator (chopper) of the incident radiation. Consequently, the detection algorithms must reliably separate the target signal from the high noise and drift caused by temporal variations of the scene's background image, and from the additional drift due to the thermal instability of the detectors. A very important step towards this goal is the design of a preprocessing stage to eliminate noise, and in this work we propose using the Empirical Mode Decomposition (EMD) method to attain this objective. To evaluate the quality of the reconstructed clean signal, the average-to-peak ratio is used to assess how well the waveform of the target signal is reconstructed. We compare the EMD method with a classical noise-cancellation method based on the Discrete Wavelet Transform (DWT). Simulation results show that the proposed EMD-based scheme performs better than traditional ones.
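
    A heavily simplified version of EMD's sifting step can be sketched as follows: local extrema are located, upper and lower envelopes are built through them, and the envelope mean is repeatedly subtracted to isolate the fastest oscillatory component (the first intrinsic mode function, which is typically noise-dominated). Standard EMD uses cubic-spline envelopes and a data-driven stopping criterion; this sketch uses linear interpolation and a fixed iteration count for brevity.

```python
import numpy as np

def sift_one_imf(x, n_iter=8):
    """Extract one intrinsic mode function by simplified sifting:
    linear-interpolation envelopes instead of cubic splines."""
    h = x.astype(float)
    idx = np.arange(len(h))
    for _ in range(n_iter):
        maxima = [i for i in range(1, len(h) - 1)
                  if h[i] >= h[i - 1] and h[i] >= h[i + 1]]
        minima = [i for i in range(1, len(h) - 1)
                  if h[i] <= h[i - 1] and h[i] <= h[i + 1]]
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = np.interp(idx, maxima, h[maxima])
        lower = np.interp(idx, minima, h[minima])
        h = h - (upper + lower) / 2.0   # remove the local mean
    return h
```

    Denoising a pixel's temporal series then amounts to subtracting the first (noisiest) IMF from the raw signal, which is the role of the preprocessing stage described in the abstract.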

  8. A joint model of household space heat production and consumption: Empirical evidence from a Belgian micro-data survey

    SciTech Connect

    Cuijpers, C.

    1995-12-31

    Households face increasing regulation intended to improve energy conservation and energy efficiency for environmental reasons. Understanding how a house produces space heat and how energy consumption can be reduced is therefore a keystone in designing energy and environmental policies. This paper provides empirical evidence on household behavior in the context of house heating. A joint household space-heat production and consumption model is developed and empirically implemented. Attention is devoted mainly to the intermediate role of the characteristics of the house, with special reference to insulation levels, which determine the ability of the house to convert energy into heat. House heat levels are characterized, and empirical support for so-called 'rebound' effects is shown. The econometric model is specified for a single-period cross-section regression estimation. The database is drawn from the 1987-88 Belgian Household Expenditure Survey.

  9. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    ERIC Educational Resources Information Center

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008") and their…

  10. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

    Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcome broadening of the discipline, with increased integration of social and life scientists into the field and of ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent emergence of "Evidence-Based Ethics" attests to this phenomenon; the approach should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species-normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate, unchecked, with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts. Summary The "empirical turn" in bioethics signals a need for

  11. Short memory or long memory: an empirical survey of daily rainfall data

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.

    2012-10-01

    A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed the procedure for estimating the fractional differencing parameter in semiparametric contexts proposed by Geweke and Porter-Hudak to analyze nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test with an F-statistic for the break date was applied, and break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirm that there is long memory in the data generating process (DGP). This evidence therefore indicates true long memory rather than an artifact of structural breaks.
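
    The Geweke-Porter-Hudak estimator used in the paper is a log-periodogram regression: the log periodogram at the first m Fourier frequencies is regressed on -log(4 sin²(λ/2)), and the slope estimates the fractional differencing parameter d. A minimal sketch; the bandwidth choice m = n^0.5 is one common convention, not necessarily the authors':

```python
import numpy as np

def gph_estimate(x, power=0.5):
    """Geweke-Porter-Hudak log-periodogram estimate of d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** power)                        # number of low frequencies
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n                  # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    periodogram = np.abs(dft[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(lam / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope                               # estimate of d
```

    For short memory d ≈ 0; for a fractionally integrated process 0 < d < 0.5 indicates stationary long memory, which is the distinction the paper tests on each subseries.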

  12. Structural break or long memory: an empirical survey on daily rainfall data sets across Malaysia

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.; Yusop, Z.

    2013-04-01

    A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed the procedure for estimating the fractional differencing parameter in semiparametric contexts proposed by Geweke and Porter-Hudak (1983) to analyse nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test for the break date was applied, and break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirm that there is long memory in the data generating process (DGP). This evidence therefore indicates true long memory rather than an artifact of structural breaks.

  13. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

    In this research a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed through empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. The methodology is therefore foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of microstructures to their performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) that is adaptable, flexible, and transparent. The GUI encompasses all parameters associated with the determination of the desired material property, so as to create models that provide an accurate estimate of that property. The technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.
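
    Of the learning methods mentioned, a radial basis function network is easy to sketch: the response is modeled as a weighted sum of Gaussian bumps centered on chosen points, with weights found by linear least squares. This is a generic illustration of the technique, not the STEAM implementation:

```python
import numpy as np

def fit_rbf(x, y, centers, width):
    """Solve for RBF network weights w in Phi w = y (least squares),
    where Phi holds Gaussian basis responses."""
    phi = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w

def rbf_predict(x, centers, width, w):
    """Evaluate the fitted RBF network at new inputs."""
    phi = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    return phi @ w
```

    In the framework's setting, x would be loading and geometry parameters and y the measured response of test structures, giving an empirical surrogate model of the material behavior.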

  14. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors that record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on the signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated, including the signal's point of origin, strength, and polarity. Locating the lightning strike uses algorithms based on long-established triangulation techniques. Determining the event's original signal strength relies on the behavior of the generated magnetic field over distance and time: in general, the signal undergoes geometric dispersion and environmental attenuation as it propagates. Knowledge of that radial behavior, together with the signal strengths received at the detecting sites, permits an extrapolation back to the original strength of the lightning strike. The same dispersion and attenuation also limit the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served, which limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. One network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest, so knowledge of lightning strike characteristics over this expanse is of particular value. I have been developing a process that determines the DE over the region [3]; in turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
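
    The extrapolation idea can be illustrated with the simplest possible propagation model: if received amplitude falls off as 1/r (geometric dispersion only, ignoring the environmental attenuation the report accounts for), each detector's reading yields an estimate of the source strength, and the estimates can be averaged. The distances and amplitudes below are made up for the example:

```python
def source_strength(readings):
    """Estimate original signal strength S0 from detector readings,
    assuming pure 1/r attenuation: S_received = S0 / r.
    readings: list of (distance, measured_amplitude) pairs."""
    estimates = [amp * r for r, amp in readings]
    return sum(estimates) / len(estimates)
```

    A real network would add an exponential environmental-attenuation term, and the minimum detectable amplitude at each site is what bounds the detection efficiency over the service area.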

  15. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Water-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  16. A Survey of Graduate Training in Empirically Supported and Manualized Treatments: A Preliminary Report

    ERIC Educational Resources Information Center

    Karekla, Maria; Lundgren, Jennifer D.; Forsyth, John P.

    2004-01-01

    The promotion and dissemination of empirically supported (ESTs) and manualized therapies are important, albeit controversial, developments within clinical science and practice. To date, studies evaluating training opportunities and attitudes about such treatments at the graduate, predoctoral internship, and postdoctoral levels have focused on the…

  17. Cloud Based Processing of Large Photometric Surveys

    NASA Astrophysics Data System (ADS)

    Farivar, R.; Brunner, R. J.; Santucci, R.; Campbell, R.

    2013-10-01

    Astronomy, as is the case with many scientific domains, has entered the realm of data-rich science. Nowhere is this reflected more clearly than in the growth of large-area surveys, such as the recently completed Sloan Digital Sky Survey (SDSS) or the Dark Energy Survey, which will soon obtain petabytes of imaging data. Data processing for these large surveys is a major challenge. In this paper, we demonstrate a new approach to this common problem: we propose the use of cloud-based technologies (e.g., Hadoop MapReduce) to run a data analysis program (e.g., SExtractor) across a cluster. Using Hadoop's intermediate key/value pair design, our framework matches objects across different SExtractor invocations to create a unified catalog from all SDSS processed data. We conclude by presenting our experimental results on a 432-core cluster and discuss the lessons we have learned in completing this challenge.
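
    The key/value matching idea can be sketched without Hadoop: the map step keys each detection by a coarse sky-position bin, and the reduce step groups detections sharing a key into one unified catalog entry. The binning-by-rounding rule below is a simplification for illustration; a real pipeline would use a proper spatial match radius:

```python
from collections import defaultdict

def map_detections(run_id, detections):
    """Map step: key each (ra, dec, flux) detection by a coarse
    position bin so the same object from different SExtractor
    runs shares a key."""
    for ra, dec, flux in detections:
        yield (round(ra, 3), round(dec, 3)), (run_id, flux)

def reduce_matches(mapped):
    """Reduce step: group values by key to form one catalog entry
    per matched object."""
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)
    return dict(groups)
```

    Hadoop performs the same shuffle-by-key between the map and reduce phases, but distributed across the cluster rather than in a single in-memory dictionary.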

  18. Empirical Evaluation of Social, Psychological, & Educational Construct Variables Used in NCES Surveys. Working Paper Series.

    ERIC Educational Resources Information Center

    Freidlin, Boris; Salvucci, Sameena

    The purpose of this study was to provide an analysis and evaluation of composite variables in the National Education Longitudinal Study of 1988 (NELS:88) and School and Staffing Survey (SASS) in a way that provides guidance to the staff of the National Center for Education Statistics (NCES) in the more effective use of survey resources. The study…

  19. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  20. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    ERIC Educational Resources Information Center

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  1. Empirical Analysis and Refinement of Expert System Knowledge Bases

    PubMed Central

    Weiss, Sholom M.; Politakis, Peter; Ginsberg, Allen

    1986-01-01

Recent progress in knowledge base refinement for expert systems is reviewed. Knowledge base refinement is characterized by the constrained modification of rule-components in an existing knowledge base. The goals are to localize specific weaknesses in a knowledge base and to improve an expert system's performance. Systems that automate some aspects of knowledge base refinement can have a significant impact on the related problems of knowledge base acquisition, maintenance, verification, and learning from experience. The SEEK empirical analysis and refinement system is reviewed and its successor system, SEEK2, is introduced. Important areas for future research in knowledge base refinement are described.

  2. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    NASA Astrophysics Data System (ADS)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and characterization of laser-irradiated tissue. However, an accurate and simple analytical equation for estimating diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface-fitting method. The variance between the Monte Carlo lookup-table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling diffuse reflectance from tissue.
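The lookup-table-to-formula step can be illustrated with a toy fit: tabulate reflectance against the optical properties, then least-squares fit a simple parametric surface. The table values and functional form below are invented for illustration; the paper's actual table comes from Monte Carlo photon transport and its formula from 2-D surface fitting.

```python
import math

# Stand-in "Monte Carlo lookup table": diffuse reflectance R tabulated
# against absorption (mu_a) and reduced scattering (mu_s) coefficients.
# This toy table follows R = 0.6 * albedo**8 exactly; a real table would
# hold Monte Carlo photon-transport results.
table = []
for mu_a in (0.05, 0.1, 0.2, 0.4):          # mm^-1
    for mu_s in (5.0, 10.0, 20.0):          # mm^-1
        albedo = mu_s / (mu_s + mu_a)       # transport albedo
        table.append((albedo, 0.6 * albedo ** 8))

# Fit the empirical formula R ~ exp(c0) * albedo**c1 by ordinary least
# squares in log-log space (closed form for a single predictor).
xs = [math.log(a) for a, _ in table]
ys = [math.log(r) for _, r in table]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
c1 = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
      / sum((x - xbar) ** 2 for x in xs))
c0 = ybar - c1 * xbar

def reflectance(mu_a, mu_s):
    """Diffuse reflectance from the fitted empirical formula."""
    albedo = mu_s / (mu_s + mu_a)
    return math.exp(c0) * albedo ** c1

# Worst relative deviation between table and formula (the paper reports
# < 1% for its real fit; the toy fit is essentially exact).
rel_err = max(abs(r - reflectance_fit) / r
              for (a, r), reflectance_fit
              in ((entry, math.exp(c0) * entry[0] ** c1) for entry in table))
```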

  3. An Empirical Taxonomy of Youths' Fears: Cluster Analysis of the American Fear Survey Schedule

    ERIC Educational Resources Information Center

    Burnham, Joy J.; Schaefer, Barbara A.; Giesen, Judy

    2006-01-01

    Fears profiles among children and adolescents were explored using the Fear Survey Schedule for Children-American version (FSSC-AM; J.J. Burnham, 1995, 2005). Eight cluster profiles were identified via multistage Euclidean grouping and supported by homogeneity coefficients and replication. Four clusters reflected overall level of fears (i.e., very…

  4. Marginal Products of the Survey and Principles of Economics Courses: Empirical Information for Curriculum Decision Making.

    ERIC Educational Resources Information Center

    Butler, Thomas

    In order to determine the most appropriate economics prerequisites for selected courses at Thomas Nelson Community College, a study was conducted during Winter 1980 to compare the relative effectiveness of a one-quarter survey course in economics with a three-quarter principles course. The study involved the pre- and post-testing of students…

  5. A Survey and Empirical Study of Virtual Reference Service in Academic Libraries

    ERIC Educational Resources Information Center

    Mu, Xiangming; Dimitroff, Alexandra; Jordan, Jeanette; Burclaff, Natalie

    2011-01-01

Virtual Reference Services (VRS) enjoy high user satisfaction; their main problem is low usage. We surveyed 100 academic library web sites to understand how VRS are presented. We then conducted a usability study to further test an active VRS model regarding its effectiveness.

  6. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  7. Empirically Based School Interventions Targeted at Academic and Mental Health Functioning

    ERIC Educational Resources Information Center

    Hoagwood, Kimberly E.; Olin, S. Serene; Kerker, Bonnie D.; Kratochwill, Thomas R.; Crowe, Maura; Saka, Noa

    2007-01-01

    This review examines empirically based studies of school-based mental health interventions. The review identified 64 out of more than 2,000 articles published between 1990 and 2006 that met methodologically rigorous criteria for inclusion. Of these 64 articles, only 24 examined both mental health "and" educational outcomes. The majority of…

  8. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  9. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1996-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique.

  10. Polymer electrolyte membrane fuel cell fault diagnosis based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Damour, Cédric; Benne, Michel; Grondin-Perez, Brigitte; Bessafi, Miloud; Hissel, Daniel; Chabriat, Jean-Pierre

    2015-12-01

Diagnosis tools for water management are relevant to improving the reliability and lifetime of polymer electrolyte membrane fuel cells (PEMFCs). This paper presents a novel signal-based diagnosis approach, based on Empirical Mode Decomposition (EMD), dedicated to PEMFCs. EMD is an empirical, intuitive, direct and adaptive signal processing method, without pre-determined basis functions. The proposed diagnosis approach relies on the decomposition of the FC output voltage to detect and isolate flooding and drying faults. The low computational cost of EMD, the reduced number of required measurements, and the high accuracy of flooding and drying fault diagnosis make this approach a promising online diagnosis tool for the management of PEMFC degraded modes.
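As a rough illustration of the signal-based idea (not the authors' algorithm), one can split the cell voltage into a fast component and a slow residue and apply threshold rules. Real EMD extracts intrinsic mode functions via spline envelopes of the signal's extrema; the moving average below merely stands in for that decomposition, and all thresholds and decision rules are hypothetical.

```python
def moving_average(x, w=5):
    """Centred moving average; here a crude stand-in for the slowly
    varying residue EMD would leave after extracting the fast IMFs."""
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1])
            / len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def diagnose(voltage, ripple_thresh=0.05, drift_thresh=0.1):
    """Hypothetical rule: strong high-frequency ripple -> flooding;
    steady downward drift of the slow component -> drying."""
    residue = moving_average(voltage)
    fast = [v - r for v, r in zip(voltage, residue)]
    ripple = (sum(f * f for f in fast) / len(fast)) ** 0.5   # RMS
    drift = residue[0] - residue[-1]
    if ripple > ripple_thresh:
        return "flooding"
    if drift > drift_thresh:
        return "drying"
    return "normal"

# Synthetic voltage traces (V): oscillatory under flooding, sagging under drying.
flooding_v = [0.7 + (0.1 if i % 2 == 0 else -0.1) for i in range(20)]
drying_v = [0.7 - 0.01 * i for i in range(20)]
```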

  11. Capability deprivation of people with Alzheimer's disease: An empirical analysis using a national survey.

    PubMed

    Tellez, Juan; Krishnakumar, Jaya; Bungener, Martine; Le Galès, Catherine

    2016-02-01

How can one assess the quality of life of older people--particularly those with Alzheimer's disease--from the point of view of their opportunities to do valued things in life? This paper is an attempt to answer this question using the capability approach as a theoretical framework. We use data collected on 8841 individuals aged over 60 living in France (the 2008 Disability and Health Household Survey) and propose a latent variable modelling framework to analyse their capabilities in two fundamental dimensions: freedom to perform self-care activities and freedom to participate in the life of the household. Our results show that living as a couple, having children, being mobile and having access to local shops, health facilities and public services enhance both capabilities. Age, household size and male gender (for one of the two capabilities) act as impediments, while the number of impairments reduces both capabilities. We find that people with Alzheimer's disease have a lower level and a smaller range of capabilities (freedom) when compared to those without, even when the latter have several impairments. Hence they need special attention in policy-making. PMID:26773293

  12. An empirical determination of the dust mass absorption coefficient, κd, using the Herschel Reference Survey

    NASA Astrophysics Data System (ADS)

    Clark, Christopher J. R.; Schofield, Simon P.; Gomez, Haley L.; Davies, Jonathan I.

    2016-06-01

We use the published photometry and spectroscopy of 22 galaxies in the Herschel Reference Survey to determine that the value of the dust mass absorption coefficient κd at a wavelength of 500 μm is κ500 = 0.051 (+0.070, −0.026) m² kg⁻¹. We do so by taking advantage of the fact that the dust-to-metals ratio in the interstellar medium of galaxies appears to be constant. We argue that our value for κd supersedes that of James et al. - who pioneered this approach for determining κd - because we take advantage of superior data, and account for a number of significant systematic effects that they did not consider. We comprehensively incorporate all methodological and observational contributions to establish the uncertainty on our value, which represents a marked improvement on the oft-quoted 'order-of-magnitude' uncertainty on κd. We find no evidence that the value of κd differs significantly between galaxies, or that it correlates with any other measured or derived galaxy properties. We note, however, that the availability of data limits our sample to relatively massive (10^9.7 < M⋆ < 10^11.0 M⊙), high-metallicity (8.61 < 12 + log10(O/H) < 8.86) galaxies; future work will allow us to investigate a wider range of systems.
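The dust-to-metals argument reduces to simple arithmetic: the SED-based dust mass, M_d = F_ν d² / (κ_ν B_ν(T)), is set equal to a metals-based dust mass (dust-to-metals ratio times the ISM metal mass), and κ_ν is solved for. A sketch with illustrative numbers (not the paper's measured values) lands in the same 0.01-0.1 m² kg⁻¹ range as the quoted result:

```python
import math

# Physical constants (SI)
H = 6.626e-34      # Planck constant, J s
K_B = 1.381e-23    # Boltzmann constant, J/K
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # megaparsec, m

def planck(nu, temp):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    return 2 * H * nu ** 3 / C ** 2 / (math.exp(H * nu / (K_B * temp)) - 1)

def kappa_from_metals(flux_jy, dist_mpc, t_dust, m_metals_msun,
                      dust_to_metals=0.5, wav=500e-6):
    """Infer kappa at `wav` by equating the SED-based dust mass,
    F_nu d^2 / (kappa B_nu(T)), with the metals-based dust mass,
    (dust-to-metals ratio) * M_metals."""
    nu = C / wav
    f_si = flux_jy * 1e-26                    # Jy -> W m^-2 Hz^-1
    d = dist_mpc * MPC
    m_dust = dust_to_metals * m_metals_msun * M_SUN
    return f_si * d ** 2 / (m_dust * planck(nu, t_dust))

# Illustrative inputs: a galaxy at 20 Mpc with 0.5 Jy at 500 um,
# 20 K dust, and 2e7 Msun of ISM metals.
kappa_500 = kappa_from_metals(0.5, 20.0, 20.0, 2e7)   # ~0.1 m^2/kg
```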

  13. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
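A PFA-based recogniser reduces to likelihood comparison: score the observed action trace under each strategy's automaton and pick the argmax. A deliberately tiny sketch with single-state automata and made-up strategy names (real behaviour models would have many states and learned probabilities):

```python
import math

# Each strategy is a probabilistic finite automaton (PFA):
# state -> action -> (probability, next_state).
SPIRAL = {0: {"turn": (0.8, 0), "forward": (0.2, 0)}}
WALL_FOLLOW = {0: {"turn": (0.2, 0), "forward": (0.8, 0)}}

def log_likelihood(pfa, actions, state=0):
    """Log-probability of an observed action trace under one PFA."""
    total = 0.0
    for action in actions:
        if action not in pfa[state]:
            return float("-inf")        # trace impossible under this model
        prob, state = pfa[state][action]
        total += math.log(prob)
    return total

def recognise(models, actions):
    """Behavioral Recognition: the strategy most likely to have produced
    the observed trace."""
    return max(models, key=lambda name: log_likelihood(models[name], actions))

models = {"spiral": SPIRAL, "wall_follow": WALL_FOLLOW}
trace = ["turn", "turn", "forward", "turn"]
```

Behavioral Cloning would go one step further and estimate the transition probabilities themselves from the observed traces.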

  14. Towards an Empirically Based Parametric Explosion Spectral Model

    SciTech Connect

    Ford, S R; Walter, W R; Ruppert, S; Matzel, E; Hauk, T; Gok, R

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  15. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1995-01-01

We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique. To test the efficacy of PBR, we conducted two runs of a controlled experiment in the environment of the NASA GSFC Software Engineering Laboratory (SEL), using developers from that environment. The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques: PBR and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. Where there is a statistically significant distinction, PBR performs better than the subjects' usual technique. However, PBR appears to be more effective on the generic documents than on the NASA documents.

  16. An Empirically Based Error-Model for Radar Rainfall Estimates

    NASA Astrophysics Data System (ADS)

    Ciach, G. J.

    2004-05-01

Mathematical modeling of the way radar rainfall (RR) approximates the physical truth is a prospective method to quantify the RR uncertainties. In this approach one can represent RR in the form of an "observation equation," that is, as a function of the corresponding true rainfall and a random error process. The error process describes the cumulative effect of all the sources of RR uncertainties. We present the results of our work on the identification and estimation of this relationship. They are based on the Level II reflectivity data from the WSR-88D radar in Tulsa, Oklahoma, and rainfall measurements from 23 surrounding Oklahoma Mesonet raingauges. Accumulation intervals from one hour to one day were analyzed using this sample. The raingauge accumulations were used as an approximation of the true rainfall in this study. The RR error-model that we explored is factorized into a deterministic distortion, which is a function of the true rainfall, and a multiplicative random error factor that is a positive-valued random variable. The distribution of the error factor depends on the true rainfall; however, its expectation in this representation is always equal to one (all the biases are modeled by the deterministic component). With this constraint, the deterministic distortion function can be defined as the conditional mean of RR conditioned on the true rainfall. We use nonparametric regression to estimate the deterministic distortion, and the variance and quantiles of the random error factor, as functions of the true rainfall. The results show that the deterministic distortion is a nonlinear function of the true rainfall that indicates systematic overestimation of weak rainfall and underestimation of strong rainfall (conditional bias). The standard deviation of the error factor is a decreasing function of the true rainfall that ranges from about 0.8 for weak rainfall to about 0.3 for strong rainfall. 
For larger time-scales, both the deterministic distortion and the
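The observation-equation structure described above (RR = deterministic distortion × mean-one random factor) can be emulated with synthetic data, and the distortion recovered by a binned conditional mean, a crude stand-in for the study's nonparametric regression. All numbers below are synthetic; the distortion exponent and error spread are invented to reproduce the qualitative conditional bias.

```python
import random
random.seed(42)

def simulate(n=5000):
    """Gauge ("true") rain rates r and radar estimates RR = h(r) * eps."""
    sigma = 0.3
    pairs = []
    for _ in range(n):
        r = random.uniform(0.5, 20.0)                 # true rain, mm/h
        h = 1.5 * r ** 0.85                           # distortion: over weak, under strong
        eps = random.lognormvariate(-sigma ** 2 / 2, sigma)   # E[eps] = 1
        pairs.append((r, h * eps))
    return pairs

def binned_distortion(pairs, nbins=10, rmax=20.0):
    """Estimate h(r) = E[RR | r] by a binned conditional mean."""
    bins = [[] for _ in range(nbins)]
    for r, rr in pairs:
        bins[min(int(r / rmax * nbins), nbins - 1)].append((r, rr))
    return [(sum(r for r, _ in b) / len(b), sum(rr for _, rr in b) / len(b))
            for b in bins if b]

curve = binned_distortion(simulate())
ratios = [h_hat / r_bar for r_bar, h_hat in curve]   # >1 means overestimation
```

The recovered ratio h(r)/r falls from above one at weak rain to below one at strong rain, mirroring the conditional bias reported in the abstract.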

  17. A Survey of UML Based Regression Testing

    NASA Astrophysics Data System (ADS)

    Fahad, Muhammad; Nadeem, Aamer

Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature to guide testers in building a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and show that UML models help in effectively identifying changes for regression test selection. We survey the existing UML-based regression testing techniques and provide an analysis matrix to give a quick insight into prominent features of the literature. We also discuss open research issues, such as managing and reducing the size of the regression test suite and prioritizing test cases under tight schedules and limited resources, that remain to be addressed for UML-based regression testing.

  18. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    PubMed Central

    Reiber, Karin

    2011-01-01

The project "Evidence-based Nursing Education - Preparatory Stage", funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research already carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach - which extends beyond the aims of this project - is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a

  19. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  20. An Empirically-Based Statewide System for Identifying Quality Pre-Kindergarten Programs

    ERIC Educational Resources Information Center

    Williams, Jeffrey M.; Landry, Susan H.; Anthony, Jason L.; Swank, Paul R.; Crawford, April D.

    2012-01-01

    This study presents an empirically-based statewide system that links information about pre-kindergarten programs with children's school readiness scores to certify pre-kindergarten classrooms as promoting school readiness. Over 8,000 children from 1,255 pre-kindergarten classrooms were followed longitudinally for one year. Pre-kindergarten quality…

  1. Development of an Empirically Based Questionnaire to Investigate Young Students' Ideas about Nature of Science

    ERIC Educational Resources Information Center

    Chen, Sufen; Chang, Wen-Hua; Lieu, Sang-Chong; Kao, Huey-Lien; Huang, Mao-Tsai; Lin, Shu-Fen

    2013-01-01

    This study developed an empirically based questionnaire to monitor young learners' conceptions of nature of science (NOS). The questionnaire, entitled Students' Ideas about Nature of Science (SINOS), measured views on theory-ladenness, use of creativity and imagination, tentativeness of scientific knowledge, durability of scientific knowledge,…

  2. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  3. Use of an Empirically Based Marriage Education Program by Religious Organizations: Results of a Dissemination Trial

    ERIC Educational Resources Information Center

    Markman, Howard J.; Whitton, Sarah W.; Kline, Galena H.; Stanley, Scott M.; Thompson, Huette; St. Peters, Michelle; Leber, Douglas B.; Olmos-Gallo, P. Antonio; Prado, Lydia; Williams, Tamara; Gilbert, Katy; Tonelli, Laurie; Bobulinski, Michelle; Cordova, Allen

    2004-01-01

    We present an evaluation of the extent to which an empirically based couples' intervention program was successfully disseminated in the community. Clergy and lay leaders from 27 religious organizations who were trained to deliver the Prevention and Relationship Enhancement Program (PREP) were contacted approximately yearly for 5 years following…

  4. Multisystemic Therapy: An Empirically Supported, Home-Based Family Therapy Approach.

    ERIC Educational Resources Information Center

    Sheidow, Ashli J.; Woodford, Mark S.

    2003-01-01

    Multisystemic Therapy (MST) is a well-validated, evidenced-based treatment for serious clinical problems presented by adolescents and their families. This article is an introduction to the MST approach and outlines key clinical features, describes the theoretical underpinnings, and discusses the empirical support for MST's effectiveness with a…

  5. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  6. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  7. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  8. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  9. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  10. Understanding why users tag: A survey of tagging motivation literature and results from an empirical study

    PubMed Central

    Strohmaier, Markus; Körner, Christian; Kern, Roman

    2012-01-01

    While recent progress has been achieved in understanding the structure and dynamics of social tagging systems, we know little about the underlying user motivations for tagging, and how they influence resulting folksonomies and tags. This paper addresses three issues related to this question. (1) What distinctions of user motivations are identified by previous research, and in what ways are the motivations of users amenable to quantitative analysis? (2) To what extent does tagging motivation vary across different social tagging systems? (3) How does variability in user motivation influence resulting tags and folksonomies? In this paper, we present measures to detect whether a tagger is primarily motivated by categorizing or describing resources, and apply these measures to datasets from seven different tagging systems. Our results show that (a) users’ motivation for tagging varies not only across, but also within tagging systems, and that (b) tag agreement among users who are motivated by categorizing resources is significantly lower than among users who are motivated by describing resources. Our findings are relevant for (1) the development of tag-based user interfaces, (2) the analysis of tag semantics and (3) the design of search algorithms for social tagging systems. PMID:23471473
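The reported agreement effect can be quantified with a simple measure such as the mean pairwise Jaccard similarity of the tag sets different users assign to one resource; the paper's actual measures may differ, and the user and tag data below are invented.

```python
from itertools import combinations

def tag_agreement(tagging):
    """Mean pairwise Jaccard similarity of users' tag sets for one resource."""
    sets = [set(tags) for tags in tagging.values()]
    pairs = list(combinations(sets, 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Describers converge on shared descriptive tags for a resource...
describers = {"u1": ["python", "tutorial"],
              "u2": ["python", "tutorial"],
              "u3": ["python", "howto"]}
# ...while categorizers attach idiosyncratic personal category labels.
categorizers = {"u1": ["toread"], "u2": ["work-stuff"], "u3": ["misc"]}
```

On these toy inputs the describers score well above the categorizers, matching the direction of the paper's finding on tag agreement.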

  11. Empirical metallicity-dependent calibrations of effective temperature against colours for dwarfs and giants based on interferometric data

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Liu, X.-W.; Yuan, H.-B.; Xiang, M.-S.; Chen, B.-Q.; Zhang, H.-W.

    2015-12-01

We present empirical metallicity-dependent calibrations of effective temperature against colours for dwarfs of luminosity classes IV and V and for giants of luminosity classes II and III, based on a collection from the literature of about two hundred nearby stars with direct effective temperature measurements of better than 2.5 per cent. The calibrations are valid for an effective temperature range 3100-10 000 K for dwarfs of spectral types M5 to A0 and 3100-5700 K for giants of spectral types K5 to G5. A total of 21 colours for dwarfs and 18 colours for giants of bands of four photometric systems, i.e. the Johnson (UBVR_J I_J JHK), the Cousins (R_C I_C), the Sloan Digital Sky Survey (gr) and the Two Micron All Sky Survey (JHK_s), have been calibrated. Restricted by the metallicity range of the current sample, the calibrations are mainly applicable to disc stars ([Fe/H] ≳ −1.0). The normalized percentage residuals of the calibrations are typically 2.0 and 1.5 per cent for dwarfs and giants, respectively. Some systematic discrepancies at various levels are found between the current scales and those available in the literature (e.g. those based on the infrared flux method or spectroscopy). Based on the current calibrations, we have re-determined the colours of the Sun. We have also investigated the systematic errors in effective temperatures yielded by the current on-going large-scale low- to intermediate-resolution stellar spectroscopic surveys. We show that the calibration of colour (g - Ks) presented in this work provides an invaluable tool for the estimation of stellar effective temperature for those on-going or upcoming surveys.
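Calibrations of this family are commonly published as polynomials in colour and [Fe/H] for θ_eff = 5040 K / T_eff, so applying one is a one-line evaluation. The coefficients below are invented placeholders for a (g - Ks)-like dwarf colour, not the paper's fitted values.

```python
def teff_from_colour(colour, feh, coeffs):
    """Evaluate a metallicity-dependent colour-temperature calibration of
    the common polynomial form theta_eff = 5040 K / T_eff."""
    b0, b1, b2, b3, b4, b5 = coeffs
    theta = (b0 + b1 * colour + b2 * colour ** 2
             + b3 * colour * feh + b4 * feh + b5 * feh ** 2)
    return 5040.0 / theta

# Placeholder coefficients (illustrative only, NOT from this paper).
COEFFS = (0.50, 0.23, 0.00, 0.01, -0.02, 0.00)

t_solar_colour = teff_from_colour(1.6, 0.0, COEFFS)   # roughly solar T_eff
```

Redder colours map to lower temperatures, and the cross terms in colour × [Fe/H] carry the metallicity dependence that distinguishes these calibrations from metallicity-free ones.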

  12. A comparison of web-based and paper-based survey methods: testing assumptions of survey mode and response cost.

    PubMed

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-10-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and mixed-mode surveys. The participants included evaluators from the American Evaluation Association (AEA). Results included that mixed-mode, while more expensive, had higher response rates. PMID:19605623

  13. 78 FR 54464 - Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based... above-referenced proceeding, of Premier Empire Energy, LLC's application for market-based rate...

  14. Increasing Response Rates to Web-Based Surveys

    ERIC Educational Resources Information Center

    Monroe, Martha C.; Adams, Damian C.

    2012-01-01

    We review a popular method for collecting data--Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well,…

  15. Increasing Your Productivity with Web-Based Surveys

    ERIC Educational Resources Information Center

    Wissmann, Mary; Stone, Brittney; Schuster, Ellen

    2012-01-01

    Web-based survey tools such as Survey Monkey can be used in many ways to increase the efficiency and effectiveness of Extension professionals. This article describes how Survey Monkey has been used at the state and county levels to collect community and internal staff information for the purposes of program planning, administration, evaluation and…

  16. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support. PMID:25778674

  17. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from the empirical and theoretical models showed excellent agreement, within acceptable error, with the experimental results of other references.
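The Zener-Hollomon formulation mentioned above can be sketched as follows. The hyperbolic-sine flow-stress law is the standard Z-H form; all material constants here are hypothetical placeholders, not the generalized constants derived in the article:

```python
import numpy as np

# Zener-Hollomon parameter and the standard hyperbolic-sine flow-stress law:
#   Z = strain_rate * exp(Q / (R * T))
#   sigma = (1/alpha) * asinh((Z / A)**(1/n))
R = 8.314          # gas constant, J/(mol K)
Q = 350e3          # activation energy, J/mol (placeholder)
A = 1.0e13         # structure factor, 1/s (placeholder)
alpha = 0.012      # stress multiplier, 1/MPa (placeholder)
n = 5.0            # stress exponent (placeholder)

def flow_stress(strain_rate, T):
    """Flow stress (MPa) from the Z-H constitutive equation."""
    Z = strain_rate * np.exp(Q / (R * T))
    return np.arcsinh((Z / A) ** (1.0 / n)) / alpha

# Higher temperature lowers the flow stress; higher strain rate raises it.
print(flow_stress(0.1, 1173.0), flow_stress(0.1, 1273.0))
```

The temperature and strain-rate trends shown by the two printed values are the behavior the empirical model above is fitted to capture.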

  18. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

    To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multisentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven-year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multi-sentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  19. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education. PMID:22987194

  20. An empirical mass-loss law for Population II giants from the Spitzer-IRAC survey of Galactic globular clusters

    NASA Astrophysics Data System (ADS)

    Origlia, L.; Ferraro, F. R.; Fabbri, S.; Fusi Pecci, F.; Dalessandro, E.; Rich, R. M.; Valenti, E.

    2014-04-01

    Aims: The main aim of the present work is to derive an empirical mass-loss (ML) law for Population II stars in first and second ascent red giant branches. Methods: We used the Spitzer InfraRed Array Camera (IRAC) photometry obtained in the 3.6-8 μm range of a carefully chosen sample of 15 Galactic globular clusters spanning the entire metallicity range and sampling the vast zoology of horizontal branch (HB) morphologies. We complemented the IRAC photometry with near-infrared data to build suitable color-magnitude and color-color diagrams and identify mass-losing giant stars. Results: We find that while the majority of stars show colors typical of cool giants, some stars show an excess of mid-infrared light that is larger than expected from their photospheric emission and that is plausibly due to dust formation in mass flowing from them. For these stars, we estimate dust and total (gas + dust) ML rates and timescales. We finally calibrate an empirical ML law for Population II red and asymptotic giant branch stars with varying metallicity. We find that at a given red giant branch luminosity only a fraction of the stars are losing mass. From this, we conclude that ML is episodic and is active only a fraction of the time, which we define as the duty cycle. The fraction of mass-losing stars increases by increasing the stellar luminosity and metallicity. The ML rate, as estimated from reasonable assumptions for the gas-to-dust ratio and expansion velocity, depends on metallicity and slowly increases with decreasing metallicity. In contrast, the duty cycle increases with increasing metallicity, with the net result that total ML increases moderately with increasing metallicity, about 0.1 M⊙ every dex in [Fe/H]. For Population II asymptotic giant branch stars, we estimate a total ML of ≤0.1 M⊙, nearly constant with varying metallicity. This work is based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory

  1. Web-Based Surveys Facilitate Undergraduate Research and Knowledge

    ERIC Educational Resources Information Center

    Grimes, Paul, Ed.; Steele, Scott R.

    2008-01-01

    The author presents Web-based surveying as a valuable tool for achieving quality undergraduate research in upper-level economics courses. Web-based surveys can be employed in efforts to integrate undergraduate research into the curriculum without overburdening students or faculty. The author discusses the value of undergraduate research, notes…

  2. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states reporting investments explicitly dedicated…

  3. Survey Says? A Primer on Web-based Survey Design and Distribution

    PubMed Central

    Oppenheimer, Adam J.; Pannucci, Christopher J.; Kasten, Steven J.; Haase, Steven C.

    2011-01-01

    The internet has changed the way in which we gather and interpret information. While books were once the exclusive bearers of data, knowledge is now only a keystroke away. The internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over one hundred medical publications have been based on web-based survey data alone. Due to emerging internet technologies, web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, web-based survey techniques are not without their limitations, namely recall and response biases. When used properly, however, web-based surveys can greatly simplify the research process. This article discusses the implications of web-based surveys and provides guidelines for their effective design and distribution. PMID:21701347

  4. Methodologies for Crawler Based Web Surveys.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  5. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

    Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of the uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Secondly, we developed a physics based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during the decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300 K to 1000 K while propagating parametric uncertainty and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics based models and found that the uncertainty in the kinetics predicted by the physics based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum level model. We found that the species change within the bed occurring during the decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
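An empirical kinetics model of the kind described above can be sketched with an Arrhenius rate constant and a first-order decomposition law. Both the rate form and the numerical constants below are illustrative placeholders, not the fitted UH3 kinetics of the study:

```python
import numpy as np

R = 8.314        # gas constant, J/(mol K)
Ea = 120e3       # activation energy, J/mol (placeholder)
A0 = 5e6         # pre-exponential factor, 1/s (placeholder)

def decomposed_fraction(t, T):
    """Fraction of hydride decomposed after time t (s) at temperature T (K),
    assuming first-order kinetics with an Arrhenius rate constant."""
    k = A0 * np.exp(-Ea / (R * T))
    return 1.0 - np.exp(-k * t)

# Decomposition accelerates strongly with temperature, as in the 300-1000 K
# sweep described above.
for T in (500.0, 700.0, 900.0):
    print(T, decomposed_fraction(60.0, T))
```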

  6. Scan MDCs for GPS-Based Gamma Radiation Surveys.

    PubMed

    Alecksen, Tyler; Whicker, Randy

    2016-08-01

    A method for estimating the minimum detectable concentration of a contaminant radionuclide in soil when scanning with gamma radiation detectors (known as the "scan MDC") is described in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM). This paper presents an alternate method for estimating scan MDCs for GPS-based gamma surveys based on detector efficiencies modeled with the probabilistic Monte Carlo N-Particle Extended (MCNPX) Transport simulation code. Results are compared to those provided in MARSSIM. An extensive database of MCNPX-based detection efficiencies has been developed to represent a variety of gamma survey applications and potential scanning configurations (detector size, scan height, size of contaminated soil volume, etc.), and an associated web-based user interface has been developed to provide survey designers and regulators with access to a reasonably wide range of calculated scan MDC values for survey planning purposes. PMID:27356162
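A MARSSIM-style scan MDC calculation can be sketched as below. The structure (a minimum detectable count rate from the detectability index d' and the background counts in one observation interval, degraded by surveyor efficiency and converted through a detector response factor) follows the MARSSIM approach, but every numerical input, including the response factor that would come from MCNPX-modelled efficiencies, is an illustrative placeholder:

```python
import math

def scan_mdc(bkg_cpm, interval_s, d_prime=1.38, p=0.5, cpm_per_pci_g=200.0):
    """Scan MDC (pCi/g) for a gamma detector scanning over soil.

    bkg_cpm        background count rate (counts per minute)
    interval_s     observation interval (s)
    d_prime        detectability index for the chosen decision error rates
    p              surveyor efficiency
    cpm_per_pci_g  detector response per unit soil concentration
                   (placeholder standing in for a modelled efficiency)
    """
    b_i = bkg_cpm * interval_s / 60.0                    # counts per interval
    mdcr = d_prime * math.sqrt(b_i) * 60.0 / interval_s  # net cpm, ideal observer
    mdcr_surveyor = mdcr / math.sqrt(p)                  # actual surveyor
    return mdcr_surveyor / cpm_per_pci_g

print(scan_mdc(10000.0, 1.0))
```

Note that the scan MDC scales with the square root of the background rate, which is why detector size and scan height matter so much in the database of configurations described above.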

  7. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel
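The simulator's final step, dividing the model population into "discoverable" and "discovered" subsets, can be illustrated with a toy Monte Carlo: objects in the field of view count as discovered only if they pass per-exposure cuts in enough exposures. All thresholds here are illustrative placeholders, not parameters of the actual simulator:

```python
import numpy as np

rng = np.random.default_rng(3)
n_obj, n_exposures = 1000, 4
v_mag = rng.uniform(18.0, 23.0, n_obj)       # apparent magnitudes of model NEOs
limiting_mag = 21.5                          # per-exposure depth (placeholder)
fill_factor = 0.9                            # focal-plane active fraction
min_detections = 3                           # detections required for discovery

# Per-exposure cuts: bright enough, and landing on active silicon.
bright_enough = v_mag[:, None] < limiting_mag            # (n_obj, 1)
on_silicon = rng.random((n_obj, n_exposures)) < fill_factor
detections = (bright_enough & on_silicon).sum(axis=1)

discovered = detections >= min_detections
print(discovered.sum(), "of", n_obj, "model objects discovered")
```

Objects that are bright enough but fail the detection-count cut are the simulator's "discoverable but not discovered" class, the population used to inform survey design changes.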

  8. Inequality of Higher Education in China: An Empirical Test Based on the Perspective of Relative Deprivation

    ERIC Educational Resources Information Center

    Hou, Liming

    2014-01-01

    The primary goal of this paper is to examine what makes Chinese college students dissatisfied with entrance opportunities for higher education. Based on the author's survey data, we test two parameters which could be a potential cause of this dissatisfaction: 1) distributive inequality, which emphasizes the individual's dissatisfaction caused by…

  9. Advances on Empirical Mode Decomposition-based Time-Frequency Analysis Methods in Hydrocarbon Detection

    NASA Astrophysics Data System (ADS)

    Chen, H. X.; Xue, Y. J.; Cao, J.

    2015-12-01

    Empirical mode decomposition (EMD), which is a data-driven adaptive decomposition method and is not limited by time-frequency uncertainty spreading, is proved to be more suitable for seismic signals which are nonlinear and non-stationary. Compared with other Fourier-based and wavelet-based time-frequency methods, EMD-based time-frequency methods have higher temporal and spatial resolution and yield hydrocarbon interpretations with more statistical significance. Empirical mode decomposition algorithm has now evolved from EMD to Ensemble EMD (EEMD) to Complete Ensemble EMD (CEEMD). Even though EMD-based time-frequency methods offer many promising features for analyzing and processing geophysical data, there are some limitations or defects in EMD-based time-frequency methods. This presentation will present a comparative study on hydrocarbon detection using seven EMD-based time-frequency analysis methods, which include: (1) first, EMD combined with Hilbert transform (HT) as a time-frequency analysis method is used for hydrocarbon detection; (2) second, Normalized Hilbert transform (NHT) and HU Methods respectively combined with HT as improved time-frequency analysis methods are applied for hydrocarbon detection; (3) third, EMD combined with Teager-Kaiser energy (EMD/TK) is investigated for hydrocarbon detection; (4) fourth, EMD combined with wavelet transform (EMDWave) as a seismic attenuation estimation method is comparatively studied; and (5) fifth, EEMD- and CEEMD-based time-frequency analysis methods used as highlight volumes technology are studied. The differences between these methods in hydrocarbon detection will be discussed. The question of getting a meaningful instantaneous frequency by HT and mode-mixing issues in EMD will be analysed. The work was supported by NSFC under grant Nos. 41430323, 41404102 and 41274128.
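The HT step shared by the methods listed above extracts an instantaneous frequency from each IMF via the analytic signal. A minimal sketch for a single synthetic "IMF" (a pure 30 Hz tone), with the analytic signal built by the standard FFT construction rather than a library call:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
f0 = 30.0
imf = np.cos(2 * np.pi * f0 * t)   # stand-in for one IMF from EMD

def analytic_signal(x):
    """Analytic signal via FFT: zero the negative frequencies, double the
    positive ones (even-length case)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    h[1:len(x) // 2] = 2.0
    h[len(x) // 2] = 1.0
    return np.fft.ifft(X * h)

z = analytic_signal(imf)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # Hz
print(inst_freq[100:900].mean())                # ~30 Hz away from the edges
```

For a pure tone the instantaneous frequency is flat at f0; the "meaningful instantaneous frequency" question raised above concerns what this quantity means when the input is not a clean mono-component IMF.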

  10. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U. S. organizations, and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in U. S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  11. A survey of machine readable data bases

    NASA Technical Reports Server (NTRS)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  12. Implementation of an empirically based drug and violence prevention and intervention program in public school settings.

    PubMed

    Cunningham, P B; Henggeler, S W

    2001-06-01

    Describes the implementation of a collaborative preventive intervention project (Healthy Schools) designed to reduce levels of bullying and related antisocial behaviors in children attending two urban middle schools serving primarily African American students. These schools have high rates of juvenile violence, as reflected by suspensions and expulsions for behavioral problems. Using a quasi-experimental design, empirically based drug and violence prevention programs, Bullying Prevention and Project ALERT, are being implemented at each middle school. In addition, an intensive evidence-based intervention, multisystemic therapy, is being used to target students at high risk of expulsion and court referral. Hence, the proposed project integrates both universal approaches to prevention and a model that focuses on indicated cases. Targeted outcomes, by which the effectiveness of this comprehensive school-based program will be measured, are reduced youth violence, reduced drug use, and improved psychosocial functioning of participating youth. PMID:11393922

  13. Comparing results from two continental geochemical surveys to world soil composition and deriving Predicted Empirical Global Soil (PEGS2) reference values

    NASA Astrophysics Data System (ADS)

    de Caritat, Patrice; Reimann, Clemens; Bastrakov, E.; Bowbridge, D.; Boyle, P.; Briggs, S.; Brown, D.; Brown, M.; Brownlie, K.; Burrows, P.; Burton, G.; Byass, J.; de Caritat, P.; Chanthapanya, N.; Cooper, M.; Cranfield, L.; Curtis, S.; Denaro, T.; Dhnaram, C.; Dhu, T.; Diprose, G.; Fabris, A.; Fairclough, M.; Fanning, S.; Fidler, R.; Fitzell, M.; Flitcroft, P.; Fricke, C.; Fulton, D.; Furlonger, J.; Gordon, G.; Green, A.; Green, G.; Greenfield, J.; Harley, J.; Heawood, S.; Hegvold, T.; Henderson, K.; House, E.; Husain, Z.; Krsteska, B.; Lam, J.; Langford, R.; Lavigne, T.; Linehan, B.; Livingstone, M.; Lukss, A.; Maier, R.; Makuei, A.; McCabe, L.; McDonald, P.; McIlroy, D.; McIntyre, D.; Morris, P.; O'Connell, G.; Pappas, B.; Parsons, J.; Petrick, C.; Poignand, B.; Roberts, R.; Ryle, J.; Seymon, A.; Sherry, K.; Skinner, J.; Smith, M.; Strickland, C.; Sutton, S.; Swindell, R.; Tait, H.; Tang, J.; Thomson, A.; Thun, C.; Uppill, B.; Wall, K.; Watkins, J.; Watson, T.; Webber, L.; Whiting, A.; Wilford, J.; Wilson, T.; Wygralak, A.; Albanese, S.; Andersson, M.; Arnoldussen, A.; Baritz, R.; Batista, M. J.; Bel-lan, A.; Birke, M.; Cicchella, C.; Demetriades, A.; Dinelli, E.; De Vivo, B.; De Vos, W.; Duris, M.; Dusza-Dobek, A.; Eggen, O. A.; Eklund, M.; Ernstsen, V.; Filzmoser, P.; Finne, T. E.; Flight, D.; Forrester, S.; Fuchs, M.; Fugedi, U.; Gilucis, A.; Gosar, M.; Gregorauskiene, V.; Gulan, A.; Halamić, J.; Haslinger, E.; Hayoz, P.; Hobiger, G.; Hoffmann, R.; Hoogewerff, J.; Hrvatovic, H.; Husnjak, S.; Janik, L.; Johnson, C. C.; Jordan, G.; Kirby, J.; Kivisilla, J.; Klos, V.; Krone, F.; Kwecko, P.; Kuti, L.; Ladenberger, A.; Lima, A.; Locutura, J.; Lucivjansky, P.; Mackovych, D.; Malyuk, B. I.; Maquil, R.; McLaughlin, M.; Meuli, R. G.; Miosic, N.; Mol, G.; Négrel, P.; O'Connor, P.; Oorts, K.; Ottesen, R. T.; Pasieczna, A.; Petersell, V.; Pfleiderer, S.; Poňavič, M.; Prazeres, C.; Rauch, U.; Reimann, C.; Salpeteur, I.; Schedl, A.; Scheib, A.; Schoeters, I.; Sefcik, P.; Sellersjö, E.; Skopljak, F.; Slaninka, I.; Šorša, A.; Srvkota, R.; Stafilov, T.; Tarvainen, T.; Trendavilov, V.; Valera, P.; Verougstraete, V.; Vidojević, D.; Zissimos, A. M.; Zomeni, Z.

    2012-02-01

    Analytical data for 10 major oxides (Al2O3, CaO, Fe2O3, K2O, MgO, MnO, Na2O, P2O5, SiO2 and TiO2), 16 total trace elements (As, Ba, Ce, Co, Cr, Ga, Nb, Ni, Pb, Rb, Sr, Th, V, Y, Zn and Zr), 14 aqua regia extracted elements (Ag, As, Bi, Cd, Ce, Co, Cs, Cu, Fe, La, Li, Mn, Mo and Pb), Loss On Ignition (LOI) and pH from 3526 soil samples from two continents (Australia and Europe) are presented and compared to (1) the composition of the upper continental crust, (2) published world soil average values, and (3) data from other continental-scale soil surveys. It can be demonstrated that average upper continental crust values do not provide reliable estimates for natural concentrations of elements in soils. For many elements there exist substantial differences between published world soil averages and the median concentrations observed on two continents. Direct comparison with other continental datasets is hampered by the fact that often mean, instead of the statistically more robust median, is reported. Using a database of the worldwide distribution of lithological units, it can be demonstrated that lithology is a poor predictor of soil chemistry. Climate-related processes such as glaciation and weathering are strong modifiers of the geochemical signature inherited from bedrock during pedogenesis. To overcome existing shortcomings of predicted global or world soil geochemical reference values, we propose Preliminary Empirical Global Soil reference values based on analytical results of a representative number of soil samples from two continents (PEGS2).
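The abstract's point that the mean is a poor substitute for the median can be illustrated on a synthetic lognormal sample, a distribution typical of trace-element concentrations in soil (the element name and distribution parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic right-skewed "Zn concentration" data, lognormal in ppm.
zn_ppm = rng.lognormal(mean=3.5, sigma=1.0, size=5000)

print(np.mean(zn_ppm), np.median(zn_ppm))
# The mean sits well above the median for right-skewed data, so the two
# statistics are not interchangeable as "average" soil compositions -- the
# reason the text argues for reporting the statistically more robust median.
```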

  14. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show the superior performance of my proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.
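The dip-separation step can be illustrated with a generic f-k fan partition standing in for the adaptive EMD-based dip filter of the abstract. Because the band masks partition the spectrum, the separated components reconstruct the input exactly; a denoising pipeline would then threshold each band before recombining them:

```python
import numpy as np

nt, nx = 128, 64
rng = np.random.default_rng(1)
data = rng.standard_normal((nt, nx))        # stand-in for a seismic section

D = np.fft.fft2(data)
f = np.fft.fftfreq(nt)[:, None]             # temporal frequency axis
k = np.fft.fftfreq(nx)[None, :]             # spatial wavenumber axis
slope = np.arctan2(k, np.abs(f) + 1e-12)    # dip-angle proxy per (f, k) pair

# Partition the (f, k) plane into 3 dip bands (the user-specified band count
# the abstract mentions, here 3 instead of 5 or 6 for brevity).
edges = np.linspace(slope.min(), slope.max(), 4)
bands = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (slope >= lo) & (slope < hi)
    bands.append(np.real(np.fft.ifft2(D * mask)))
# The '<' excludes the top edge; fold the max-slope points into the last band.
bands[-1] += np.real(np.fft.ifft2(D * (slope == edges[-1])))

recon = sum(bands)
print(np.abs(recon - data).max())           # ~0: the bands partition the data
```

Replacing the fan masks with a data-adaptive dip decomposition, and inserting a thresholding step per band, recovers the three-step strategy described above.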

  15. Determination of knock characteristics in spark ignition engines: an approach based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Li, Ning; Yang, Jianguo; Zhou, Rui; Liang, Caiping

    2016-04-01

    Knock is one of the major constraints to improve the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determine the knock characteristics in SI engines. By adding a uniformly distributed and finite white Gaussian noise, the EEMD can preserve signal continuity in different scales and therefore alleviates the mode-mixing problem occurring in the classic empirical mode decomposition (EMD). The feasibilities of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from combustion chamber and the vibration signal measured from cylinder head are investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and vibration signal, even in initial stage of knock. Finally, by comparing the application results with those obtained by short-time Fourier transform (STFT), Wigner-Ville distribution (WVD) and discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated.
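The core EEMD idea is that each realization decomposes signal plus an independent white-noise perturbation, and averaging over the ensemble cancels the added noise while its presence across all scales suppresses mode mixing. The sketch below shows only the cancellation property (the EMD sifting itself is omitted); the signal and parameters are illustrative, not the engine data of the study:

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 2000)
# Two-tone stand-in for a combustion-chamber pressure trace.
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

n_ensemble, noise_std = 200, 0.2
ensemble = [signal + noise_std * rng.standard_normal(t.size)
            for _ in range(n_ensemble)]
ensemble_mean = np.mean(ensemble, axis=0)

# Residual noise in the ensemble mean shrinks like noise_std / sqrt(n_ensemble).
print((ensemble_mean - signal).std())
```

In full EEMD the same averaging is applied IMF-by-IMF, so the finite, uniformly distributed added noise leaves the extracted modes while preserving continuity across scales.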

  16. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-04-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show the superior performance of my proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  17. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

    We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on the Web portals where the user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time-series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent's action on a post its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent's action. The simulations are performed for the case of constant flux of agents and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure, that are comparable with the ones in the empirical system of popular posts. In view of purely emotion-driven agent actions, this type of comparison provides a quantitative measure for the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative
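The mechanism described above (arousal drives an agent to act on a post, and the action transfers the agent's valence to that post) can be sketched as a minimal agent-based loop. All update rules and parameters below are simplified stand-ins, not the data-inferred rules of the paper:

```python
import numpy as np

rng = np.random.default_rng(11)
n_agents, n_posts, steps = 50, 10, 200
arousal = rng.random(n_agents)            # per-agent arousal in [0, 1)
valence = rng.uniform(-1, 1, n_agents)    # per-agent valence in (-1, 1)
post_valence = np.zeros(n_posts)          # emotional content of each post

for _ in range(steps):
    # Agents act only when sufficiently aroused (threshold is a placeholder).
    for i in np.flatnonzero(arousal > 0.5):
        p = rng.integers(n_posts)
        # Action transfers the agent's current valence to the post.
        post_valence[p] = 0.9 * post_valence[p] + 0.1 * valence[i]
    # Arousal relaxes toward a noisy baseline between steps.
    arousal = 0.8 * arousal + 0.2 * rng.random(n_agents)

print(post_valence.round(2))
```

Even this toy loop exhibits the paper's basic coupling: post emotional content is an arousal-gated, exponentially weighted history of the valences of the agents who acted on it.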

  18. Empirical calibrations of optical absorption-line indices based on the stellar library MILES

    NASA Astrophysics Data System (ADS)

    Johansson, Jonas; Thomas, Daniel; Maraston, Claudia

    2010-07-01

    Stellar population models of absorption-line indices are an important tool for the analysis of stellar population spectra. They are most accurately modelled through empirical calibrations of absorption-line indices with stellar parameters such as effective temperature, metallicity and surface gravity, the so-called fitting functions. Here we present new empirical fitting functions for the 25 optical Lick absorption-line indices based on the new stellar library Medium resolution INT Library of Empirical Spectra (MILES). The major improvements with respect to the Lick/IDS library are the better sampling of stellar parameter space, a generally higher signal-to-noise ratio and a careful flux calibration. In fact, we find that errors on individual index measurements in MILES are considerably smaller than in Lick/IDS. Instead, we find the rms of the residuals between the final fitting functions and the data to be dominated by errors in the stellar parameters. We provide fitting functions for both Lick/IDS and MILES spectral resolutions and compare our results with other fitting functions in the literature. A FORTRAN 90 code is available online in order to simplify the implementation in stellar population models. We further calculate the offsets in index measurements between the Lick/IDS system and a flux-calibrated system. For this purpose, we use the three libraries MILES, ELODIE and STELIB. We find that offsets are negligible in some cases, most notably for the widely used indices Hβ, Mgb, Fe5270 and Fe5335. In a number of cases, however, the difference between the flux-calibrated library and Lick/IDS is significant, with the offsets depending on index strength. Interestingly, there is no general agreement between the three libraries for a large number of indices, which hampers the derivation of a universal offset between the Lick/IDS and flux-calibrated systems.
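
    Fitting functions of this kind express an index as a low-order polynomial in the stellar parameters, fitted by least squares. A minimal sketch on synthetic data follows; the functional form and coefficients are illustrative, not the MILES calibrations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "index measurements": a linear function of theta = 5040/Teff,
# [Fe/H] and log g plus noise. The coefficients are illustrative only.
n = 200
theta = rng.uniform(0.8, 1.6, n)     # 5040 / Teff
feh = rng.uniform(-2.0, 0.4, n)      # metallicity [Fe/H]
logg = rng.uniform(0.5, 5.0, n)      # surface gravity
index = (2.0 + 3.0 * theta + 0.8 * feh - 0.2 * logg
         + 0.05 * rng.standard_normal(n))

# Design matrix with linear terms; higher-order and cross terms are added
# as extra columns in the same way.
X = np.column_stack([np.ones(n), theta, feh, logg])
coef, *_ = np.linalg.lstsq(X, index, rcond=None)
print(np.round(coef, 2))
```

    The recovered coefficients approximate the generating values, and the rms of the fit residuals plays the role of the calibration error discussed in the abstract.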

  19. Polarizable Empirical Force Field for Hexopyranose Monosaccharides Based on the Classical Drude Oscillator

    PubMed Central

    2015-01-01

    A polarizable empirical force field based on the classical Drude oscillator is presented for the hexopyranose form of selected monosaccharides. Parameter optimization targeted quantum mechanical (QM) dipole moments, solute–water interaction energies, vibrational frequencies, and conformational energies. Validation of the model was based on experimental data on crystals, densities of aqueous-sugar solutions, diffusion constants of glucose, and rotational preferences of the exocyclic hydroxymethyl group of d-glucose and d-galactose in aqueous solution as well as additional QM data. Notably, the final model involves a single electrostatic model for all sixteen diastereomers of the monosaccharides, indicating the transferability of the polarizable model. The presented parameters are anticipated to lay the foundation for a comprehensive polarizable force field for saccharides that will be compatible with the polarizable Drude parameters for lipids and proteins, allowing for simulations of glycolipids and glycoproteins. PMID:24564643

  20. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task since they play a crucial role in e-commerce nowadays. This letter reports the empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena that characterize the selection mechanism of web users and outline the relevance of these phenomena to the information recommendation problem.

  1. Polarizable Empirical Force Field for Acyclic Poly-Alcohols Based on the Classical Drude Oscillator

    PubMed Central

    He, Xibing; Lopes, Pedro E. M.; MacKerell, Alexander D.

    2014-01-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated in the case of this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, results in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol, as evidenced by the calculated heat of vaporization being in excellent agreement with experiment. Computed condensed-phase data, including crystal lattice parameters and the volumes and densities of aqueous solutions, are in better agreement with experimental data as compared to the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules. PMID:23703219

  2. The WASP and NGTS ground-based transit surveys

    NASA Astrophysics Data System (ADS)

    Wheatley, P. J.

    2015-10-01

    I will review the current status of ground-based exoplanet transit surveys, using the Wide Angle Search for Planets (WASP) and the Next Generation Transit Survey (NGTS) as specific examples. I will describe the methods employed by these surveys and show how planets from Neptune to Jupiter-size are detected and confirmed around bright stars. I will also give an overview of the remarkably wide range of exoplanet characterization that is made possible with large-telescope follow up of these bright transiting systems. This characterization includes bulk composition and spin-orbit alignment, as well as atmospheric properties such as thermal structure, composition and dynamics. Finally, I will outline how ground-based photometric studies of transiting planets will evolve with the advent of new space-based surveys such as TESS and PLATO.

  3. Developing Empirically Based, Culturally Grounded Drug Prevention Interventions for Indigenous Youth Populations

    PubMed Central

    Okamoto, Scott K.; Helm, Susana; Pel, Suzanne; McClain, Latoya L.; Hill, Amber P.; Hayashida, Janai K. P.

    2012-01-01

    This article describes the relevance of a culturally grounded approach toward drug prevention development for indigenous youth populations. This approach builds drug prevention from the “ground up” (ie, from the values, beliefs, and worldviews of the youth that are the intended consumers of the program), and is contrasted with efforts that focus on adapting existing drug prevention interventions to fit the norms of different youth ethnocultural groups. The development of an empirically based drug prevention program focused on rural Native Hawaiian youth is described as a case example of culturally grounded drug prevention development for indigenous youth, and the impact of this effort on the validity of the intervention and on community engagement and investment in the development of the program is discussed. Finally, implications of this approach for behavioral health services and the development of an indigenous prevention science are discussed. PMID:23188485

  4. A Human ECG Identification System Based on Ensemble Empirical Mode Decomposition

    PubMed Central

    Zhao, Zhidong; Yang, Lei; Chen, Diandian; Luo, Yi

    2013-01-01

    In this paper, a human electrocardiogram (ECG) identification system based on ensemble empirical mode decomposition (EEMD) is designed. A robust preprocessing method comprising noise elimination, heartbeat normalization and quality measurement is proposed to eliminate the effects of noise and heart rate variability. The system is independent of the heart rate. The ECG signal is decomposed into a number of intrinsic mode functions (IMFs) and Welch spectral analysis is used to extract the significant heartbeat signal features. Principal component analysis is used to reduce the dimensionality of the feature space, and the K-nearest neighbors (K-NN) method is applied as the classifier tool. The proposed human ECG identification system was tested on standard MIT-BIH ECG databases: the ST change database, the long-term ST database, and the PTB database. The system achieved an identification accuracy of 95% for 90 subjects, demonstrating the effectiveness of the proposed method in terms of accuracy and robustness. PMID:23698274
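
    The identification stage (spectral features followed by nearest-neighbour matching) can be illustrated with a minimal numpy/scipy sketch; the EEMD decomposition and PCA steps are omitted, and the synthetic "subjects" and all parameter values are made up for illustration.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)

def features(sig, fs=360):
    # Welch power spectral density as a compact spectral descriptor of a beat
    _, psd = welch(sig, fs=fs, nperseg=128)
    return psd / psd.sum()          # normalize away amplitude differences

def make_beat(f0):
    # Stand-in "heartbeat" signal: a tone at f0 Hz plus noise
    t = np.arange(512) / 360.0
    return np.sin(2 * np.pi * f0 * t) + 0.1 * rng.standard_normal(512)

# Enrollment: one spectral template per synthetic subject
train = {sid: features(make_beat(f0)) for sid, f0 in [("A", 8.0), ("B", 15.0)]}

def identify(sig):
    # 1-nearest-neighbour classification (K-NN with K=1) in feature space
    f = features(sig)
    return min(train, key=lambda s: np.linalg.norm(train[s] - f))

print(identify(make_beat(8.0)))     # prints A
```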

  5. Painting by numbers: nanoparticle-based colorants in the post-empirical age.

    PubMed

    Klupp Taylor, Robin N; Seifrt, Frantisek; Zhuromskyy, Oleksandr; Peschel, Ulf; Leugering, Günter; Peukert, Wolfgang

    2011-06-17

    The visual appearance of the artificial world is largely governed by films or composites containing particles with at least one dimension smaller than a micron. Over the past century and a half, the optical properties of such materials have been scrutinized, and a broad range of colorant products, based mostly on empirical microstructural improvements, has been developed. With the advent of advanced synthetic approaches capable of tailoring particle shape, size and composition on the nanoscale, the question of what is the optimum particle for a certain optical property can no longer be answered solely by experimentation. Instead, new and improved computational approaches are required to invert the structure-function relationship. This progress report reviews the development of our understanding of this relationship and highlights recent examples of how theoretical design is taking an increasingly important role in the search for enhanced or multifunctional colorants. PMID:21538592

  6. Distributed optical fiber-based theoretical and empirical methods monitoring hydraulic engineering subjected to seepage velocity

    NASA Astrophysics Data System (ADS)

    Su, Huaizhi; Tian, Shiguang; Cui, Shusheng; Yang, Meng; Wen, Zhiping; Xie, Wei

    2016-09-01

    In order to systematically investigate the general principles and methods of monitoring seepage velocity in hydraulic engineering, theoretical analysis and physical experiments were carried out based on distributed fiber-optic temperature sensing (DTS) technology. For the coupling analyses between the seepage field and the temperature field in embankment dam or dike engineering, a simplified model was constructed to describe the coupling relationship of the two fields. Different arrangement schemes of optical fiber and measuring approaches of temperature were applied to the model. The idea of inversion analysis was further used. A theoretical method of monitoring seepage velocity in hydraulic engineering was finally proposed. A new concept, namely the effective thermal conductivity, was proposed, referring to the thermal conductivity coefficient in the transient hot-wire method. The influence of heat conduction and seepage is well reflected by this new concept, which was shown to be a promising basis for developing an empirical method for monitoring seepage velocity in hydraulic engineering.

  7. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis, describing the expected remaining lifetime of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors, such as age and gender, that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26936529

  8. Survey of projection-based immersive displays

    NASA Astrophysics Data System (ADS)

    Wright, Dan

    2000-05-01

    Projection-based immersive displays are rapidly becoming the visualization system of choice for applications requiring the comprehension of complex datasets and the collaborative sharing of insights. The wide variety of display configurations can be grouped into five categories: benches, flat-screen walls, curved-screen theaters, concave-screen domes and spatially-immersive rooms. Each has its strengths and weaknesses, with the appropriateness of each dependent on one's application and budget. The paper outlines the components common to all projection-based displays and describes the characteristics of each particular category. Key image metrics, implementation considerations and immersive display trends are also considered.

  9. Project Surveys Community Based Organizations High Schools

    ERIC Educational Resources Information Center

    Black Issues in Higher Education, 2004

    2004-01-01

    A new report aims to promote awareness at the state level of the importance of Community Based Organizations (CBO) high schools, highlighting the promising lessons these schools hold for improving the educational outcomes of youth at risk of school failure or dropping out. This document briefly analyzes the report, "CBO High Schools: Their Value…

  10. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, with respect to participants, processes and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interviews and the analysis of both CET-4 teaching and testing…

  11. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    ERIC Educational Resources Information Center

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  12. Determinants of Obesity and Associated Population Attributability, South Africa: Empirical Evidence from a National Panel Survey, 2008-2012

    PubMed Central

    Sartorius, Benn; Veerman, Lennert J.; Manyema, Mercy; Chola, Lumbwe; Hofman, Karen

    2015-01-01

    Background Obesity is a major risk factor for emerging non-communicable diseases (NCDS) in middle income countries including South Africa (SA). Understanding the multiple and complex determinants of obesity and their true population attributable impact is critical for informing and developing effective prevention efforts using scientifically based evidence. This study identified contextualised high impact factors associated with obesity in South Africa. Methods Analysis of three national cross sectional (repeated panel) surveys, using multilevel logistic regression and population attributable fraction estimation, allowed for identification of contextualised high impact factors associated with obesity (BMI>30 kg/m2) among adults (15 years+). Results Obesity prevalence increased significantly from 23.5% in 2008 to 27.2% in 2012, with a significantly (p-value<0.001) higher prevalence among females (37.9% in 2012) compared to males (13.3% in 2012). Living in formal urban areas, white ethnicity, being married, not exercising and/or being in a higher socio-economic category were significantly associated with male obesity. Females living in formal or informal urban areas, in higher crime areas, of African/White ethnicity, married, not exercising, in a higher socio-economic category and/or living in households with proportionately higher spending on food (and unhealthy food options) were significantly more likely to be obese. The identified determinants appeared to account for 75% and 43% of male and female obesity, respectively. White males had the highest relative gain in obesity from 2008 to 2012. Conclusions The rising prevalence of obesity in South Africa is significant, and over the past 5 years the rising prevalence of Type-2 diabetes has mirrored this pattern, especially among females. Targeting young adolescent girls should be a priority. Addressing the determinants of obesity will require a multifaceted strategy and action at both the individual and population levels. With rising costs in the
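
    Population attributable impact in such studies is commonly summarized by the population attributable fraction (PAF). Levin's classic formula gives a minimal sketch; the prevalence and relative-risk values below are illustrative, not the study's estimates.

```python
# Levin's formula for the population attributable fraction (PAF): the share
# of cases in the population attributable to an exposure, given exposure
# prevalence p and relative risk rr.
def paf(p, rr):
    return p * (rr - 1.0) / (1.0 + p * (rr - 1.0))

# Illustrative values only: 40% exposure prevalence, relative risk 2.5
print(round(paf(0.4, 2.5), 3))  # 0.375
```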

  13. Survey of Online Access to Social Science Data Bases.

    ERIC Educational Resources Information Center

    Donati, Robert

    Until very recently there was little computer access to comprehensive bibliographic data bases in the social sciences. Now online searching of several directly relevant files is made possible through services such as the Lockheed DIALOG system. These data bases are briefly surveyed, with emphasis on content, structure, and strategy appropriate for…

  14. Using Web-Based Surveys to Conduct Counseling Research.

    ERIC Educational Resources Information Center

    Granello, Darcy Haag; Wheaton, Joe E.

    In spite of the increased use of the Internet for data collection, there is little published research about the process of data collection online. That is, discipline specific studies publish the results of their web-based surveys in discipline-specific journals, but little information is available on the process of Internet-based data collection.…

  15. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to make a critical analysis of whether these policies can be considered as organizational-ethical instruments that support healthcare institutions in taking their institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of results of former Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, the content, and the impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirically based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered as a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice. PMID:24420744

  16. The Utility of Teacher and Student Surveys in Principal Evaluations: An Empirical Investigation. REL 2015-047

    ERIC Educational Resources Information Center

    Liu, Keke; Stuit, David; Springer, Jeff; Lindsay, Jim; Wan, Yinmei

    2014-01-01

    This study examined whether adding student and teacher survey measures to existing principal evaluation measures increases the overall power of the principal evaluation model to explain variation in student achievement across schools. The study was conducted using data from 2011-12 on 39 elementary and secondary schools within a midsize urban…

  17. The "Public Opinion Survey of Human Attributes-Stuttering" (POSHA-S): Summary Framework and Empirical Comparisons

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.

    2011-01-01

    Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…

  18. Gender Wage Gaps by College Major in Taiwan: Empirical Evidence from the 1997-2003 Manpower Utilization Survey

    ERIC Educational Resources Information Center

    Lin, Eric S.

    2010-01-01

    In this article, we examine the effect of incorporating the fields of study on the explained and unexplained components of the standard Oaxaca decomposition for the gender wage gaps in Taiwan using 1997-2003 Manpower Utilization Survey data. Using several existing and recently developed measures, we examine the gender wage gap by college major to…
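
    The Oaxaca decomposition splits a mean (log-)wage gap into an explained part due to endowment differences and an unexplained part due to coefficient differences. A minimal numpy sketch on synthetic data follows; the sample sizes and coefficients are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)

def ols(X, y):
    # Ordinary least squares via numpy's least-squares solver
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic log-wage data: an intercept plus one covariate (say, schooling)
n = 500
Xm = np.column_stack([np.ones(n), rng.normal(13.0, 2.0, n)])  # men
Xf = np.column_stack([np.ones(n), rng.normal(12.0, 2.0, n)])  # women
ym = Xm @ np.array([1.0, 0.08]) + 0.05 * rng.standard_normal(n)
yf = Xf @ np.array([0.9, 0.07]) + 0.05 * rng.standard_normal(n)

bm, bf = ols(Xm, ym), ols(Xf, yf)
gap = ym.mean() - yf.mean()
explained = (Xm.mean(axis=0) - Xf.mean(axis=0)) @ bm  # endowment differences
unexplained = Xf.mean(axis=0) @ (bm - bf)             # coefficient differences
# With an intercept in both regressions the identity is exact:
# gap == explained + unexplained
```

    Adding field-of-study dummies to the design matrices shifts part of the unexplained component into the explained one, which is the comparison the article studies.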

  19. Empirical mode decomposition based background removal and de-noising in polarization interference imaging spectrometer.

    PubMed

    Zhang, Chunmin; Ren, Wenyi; Mu, Tingkui; Fu, Lili; Jia, Chenling

    2013-02-11

    Based on empirical mode decomposition (EMD), background removal and de-noising procedures for data taken by the polarization interference imaging spectrometer (PIIS) are implemented. Through numerical simulation, it is shown that the data processing methods are effective. The assumption that the noise mostly exists in the first intrinsic mode function is verified, and the parameters in the EMD thresholding de-noising methods are determined. For comparison, wavelet- and windowed-Fourier-transform-based thresholding de-noising methods are introduced. The de-noised results are evaluated by the SNR, spectral resolution and peak value of the de-noised spectra. All the methods are used to suppress the effects of Gaussian and Poisson noise. The de-noising efficiency is higher for spectra contaminated by Gaussian noise. The interferogram obtained by the PIIS is processed by the proposed methods. Both the interferogram without background and the noise-free spectrum are obtained effectively. The adaptive and robust EMD-based methods are effective for background removal and de-noising in the PIIS. PMID:23481716
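
    A minimal sketch of EMD thresholding de-noising, assuming the intrinsic mode functions have already been computed by an EMD routine and that the noise concentrates in the first IMF; the universal soft threshold used below is a standard choice, not necessarily the paper's exact rule.

```python
import numpy as np

def soft_threshold(x, t):
    # Standard soft-thresholding operator used in EMD/wavelet de-noising
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def emd_denoise(imfs, k=1):
    # Threshold the first k IMFs (where the noise is assumed to live), then
    # rebuild the signal by summing all IMFs. The noise level sigma is
    # estimated from the median absolute deviation of the first IMF, and the
    # universal threshold sigma * sqrt(2 ln N) is applied.
    imfs = [np.asarray(m, dtype=float) for m in imfs]
    sigma = np.median(np.abs(imfs[0])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(imfs[0])))
    cleaned = [soft_threshold(m, thr) if i < k else m
               for i, m in enumerate(imfs)]
    return np.sum(cleaned, axis=0)
```

    With `imfs` produced by any EMD implementation, `emd_denoise(imfs)` returns the reconstruction with the noisy first mode suppressed.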

  20. Target detection for low cost uncooled MWIR cameras based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2014-03-01

    In this work, a novel method for detecting low intensity fast moving objects with low cost Medium Wavelength Infrared (MWIR) cameras is proposed. The method is based on background subtraction in a video sequence obtained with a low density Focal Plane Array (FPA) of the newly available uncooled lead selenide (PbSe) detectors. Thermal instability, along with the lack of specific electronics and mechanical devices for canceling the effect of distortion, makes background image identification very difficult. As a result, the identification of targets is performed in low signal to noise ratio (SNR) conditions, which may considerably restrict the sensitivity of the detection algorithm. These problems are addressed in this work by means of a new technique based on the empirical mode decomposition, which accomplishes drift estimation and target detection. Given that background estimation is the most important stage for detection, a preliminary denoising step enabling a better drift estimation is designed. Comparisons are conducted against a denoising technique based on the wavelet transform and also with traditional drift estimation methods such as Kalman filtering and running average. The results reported by the simulations show that the proposed scheme has superior performance.

  1. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not provide it publicly. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768

  2. Open-circuit sensitivity model based on empirical parameters for a capacitive-type MEMS acoustic sensor

    NASA Astrophysics Data System (ADS)

    Lee, Jaewoo; Jeon, J. H.; Je, C. H.; Lee, S. Q.; Yang, W. S.; Lee, S.-G.

    2016-03-01

    An empirically based open-circuit sensitivity model for a capacitive-type MEMS acoustic sensor is presented. To intuitively evaluate the characteristics of the open-circuit sensitivity, the empirically based model is proposed and analysed by using a lumped spring-mass model and a pad test sample without a parallel plate capacitor for the parasitic capacitance. The model is composed of three different parameter groups: empirical, theoretical, and mixed data. The empirical residual stress, extracted from the measured pull-in voltage of 16.7 V and the measured surface topology of the diaphragm, was +13 MPa, resulting in an effective spring constant of 110.9 N/m. The parasitic capacitance for two probing pads including the substrate part was 0.25 pF. Furthermore, to verify the proposed model, the modelled open-circuit sensitivity was compared with the measured value. The MEMS acoustic sensor had an open-circuit sensitivity of -43.0 dBV/Pa at 1 kHz with a bias of 10 V, while the modelled open-circuit sensitivity was -42.9 dBV/Pa, showing good agreement in the range from 100 Hz to 18 kHz. This validates the empirically based open-circuit sensitivity model for designing capacitive-type MEMS acoustic sensors.

  3. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators that can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in higher education institutions in Thailand. The main purpose of this study was to develop empirical indicators of a…

  4. Evaluation of Physically and Empirically Based Models for the Estimation of Green Roof Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.

    2010-12-01

    Green roofs and other urban green spaces can provide a variety of valuable benefits including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, air pollution mitigation, etc. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is a necessary tool for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications, and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical, yet poorly studied component of urban green space performance prediction and cite that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup situated on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the 500 m2 green roof of the building, with a roof membrane, drainage layer, 10 cm media depth, and planted with a variety of Sedum species. Soil moisture sensors and qualitative runoff measurements are also recorded in the lysimeter, while a weather station situated on the rooftop records climatologic data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods absent of precipitation and drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for the estimation of ET from green roofs. Results have

  5. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort study including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
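
    Propensity score-based matching of this kind pairs each exposed episode with the nearest unexposed episode on the estimated propensity score. A greedy 1:1 caliper matcher can be sketched as follows; the caliper value and the toy scores are illustrative assumptions, not the study's data.

```python
import numpy as np

def match_nearest(ps_treated, ps_control, caliper=0.05):
    # Greedy 1:1 nearest-neighbour matching on the propensity score within a
    # caliper; each control is used at most once. Returns (treated, control)
    # index pairs; treated units with no control inside the caliper are left
    # unmatched.
    ps_control = np.asarray(ps_control, dtype=float)
    used, pairs = set(), []
    for i, p in enumerate(ps_treated):
        d = np.abs(ps_control - p)
        for j in np.argsort(d):
            if int(j) not in used and d[j] <= caliper:
                pairs.append((i, int(j)))
                used.add(int(j))
                break
    return pairs

print(match_nearest([0.2, 0.8], [0.19, 0.5, 0.81]))  # [(0, 0), (1, 2)]
```

    Outcome contrasts (e.g. odds ratios for 14- and 30-day mortality) are then computed within the matched pairs.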

  6. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

    An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  7. Feasibility of an empirically based program for parents of preschoolers with autism spectrum disorder.

    PubMed

    Dababnah, Sarah; Parish, Susan L

    2016-01-01

This article reports on the feasibility of implementing an existing empirically based program, The Incredible Years, tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the intervention. Quantitative assessments revealed that program fidelity was generally maintained, with the exception of program-specific videos. Qualitative data from individual post-intervention interviews indicated that parents benefited most from child emotion regulation strategies, play-based child behavior skills, parent stress management, social support, and visual resources. More work is needed to further refine the program to address parent self-care, partner relationships, and the diverse behavioral and communication challenges of children across the autism spectrum. Furthermore, parent access and retention could potentially be increased by providing in-home childcare vouchers and a range of times and locations in which to offer the program. The findings suggest The Incredible Years is a feasible intervention for parents seeking additional support for child- and family-related challenges, and they offer guidance to communities currently using The Incredible Years or other related parenting programs with families of children with autism spectrum disorder. PMID:25717131

  8. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation.

    PubMed

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-01-01

Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method for reconstructing prompt, informed strain/stress responses at the hot spots of structures based on strain measurements at remote locations. The structural responses measured by a usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Then, two numerical examples (a two-span beam and a 19,956-degree-of-freedom simplified airfoil) are used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam of the kind commonly used in air vehicles and spacecraft. The experiments collect the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions, and a detailed error analysis is also provided. PMID:27537889

  9. A hybrid filtering method based on a novel empirical mode decomposition for friction signals

    NASA Astrophysics Data System (ADS)

    Li, Chengwei; Zhan, Liwei

    2015-12-01

During a measurement, the measured signal usually contains noise. To remove the noise and preserve the important features of the signal, we introduce a hybrid filtering method that uses a new intrinsic mode function (NIMF) and a modified Hausdorff distance. The NIMF is defined as the difference between the noisy signal and each intrinsic mode function (IMF), which is obtained by empirical mode decomposition (EMD), ensemble EMD, complementary ensemble EMD, or complete ensemble EMD with adaptive noise (CEEMDAN). Relevant mode selection is based on the similarity between the first NIMF and the rest of the NIMFs. With this filtering method, EMD and its improved versions are used to filter the simulation and friction signals. The friction signal between an airplane tire and the runway is recorded during a simulated airplane touchdown and features spikes of various amplitudes and noise. The filtering effectiveness of the four hybrid filtering methods is compared and discussed. The results show that the filtering method based on CEEMDAN outperforms the other signal filtering methods.
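A minimal sketch of the NIMF idea, assuming the IMFs have already been obtained from an EMD-family decomposition (e.g., via a package such as PyEMD; stand-in components are used here so the example is self-contained). The mode-selection rule below is a simplified correlation criterion standing in for the paper's modified-Hausdorff-based similarity analysis; the modified Hausdorff distances themselves are also computed for reference.

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff distance between two 1-D sample sets."""
    d = np.abs(a[:, None] - b[None, :])
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

def nimf_denoise(signal, imfs, corr_thresh=0.1):
    """NIMF-based filtering sketch.

    NIMF_k = signal - IMF_k. NIMF_1 (the signal minus its noisiest mode)
    serves as a rough reference; IMFs that correlate with it are kept.
    This correlation rule is a simplified stand-in for the paper's
    similarity-based relevant-mode selection.
    """
    nimf1 = signal - imfs[0]
    keep = [imf for imf in imfs[1:]
            if abs(np.corrcoef(imf, nimf1)[0, 1]) > corr_thresh]
    return sum(keep)

# Toy demo with stand-in "IMFs" (a real pipeline would obtain them from
# EMD / ensemble EMD / CEEMDAN):
t = np.linspace(0, 1, 400)
clean = np.sin(2 * np.pi * 3 * t) + 1.5 * t
noise = 0.3 * np.random.default_rng(1).standard_normal(t.size)
signal = clean + noise
imfs = [noise, np.sin(2 * np.pi * 3 * t), 1.5 * t]

mhds = [modified_hausdorff(signal - imfs[0], signal - imf) for imf in imfs[1:]]
print("modified Hausdorff distances to NIMF_1:", np.round(mhds, 3))

denoised = nimf_denoise(signal, imfs)
err_before = np.mean((signal - clean) ** 2)
err_after = np.mean((denoised - clean) ** 2)
print(f"MSE before {err_before:.4f}, after {err_after:.4f}")
```

The same scaffold applies unchanged whichever EMD variant produces the IMFs, which is how the abstract's four hybrid methods can be compared on one signal.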

  10. Empirical mode decomposition-based motion artifact correction method for functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Gu, Yue; Han, Junxia; Liang, Zhenhu; Yan, Jiaqing; Li, Zheng; Li, Xiaoli

    2016-01-01

    Functional near-infrared spectroscopy (fNIRS) is a promising technique for monitoring brain activity. However, it is sensitive to motion artifacts. Many methods have been developed for motion correction, such as spline interpolation, wavelet filtering, and kurtosis-based wavelet filtering. We propose a motion correction method based on empirical mode decomposition (EMD), which is applied to segments of data identified as having motion artifacts. The EMD method is adaptive, data-driven, and well suited for nonstationary data. To test the performance of the proposed EMD method and to compare it with other motion correction methods, we used simulated hemodynamic responses added to real resting-state fNIRS data. The EMD method reduced mean squared error in 79% of channels and increased signal-to-noise ratio in 78% of channels. Moreover, it produced the highest Pearson's correlation coefficient between the recovered signal and the original signal, significantly better than the comparison methods (p<0.01, paired t-test). These results indicate that the proposed EMD method is a first choice method for motion artifact correction in fNIRS.

  11. An empirically based steady state friction law and implications for fault stability

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Nielsen, S.; Violay, M.; Di Toro, G.

    2016-04-01

Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces on slip and time. The relevance of the RSFL for earthquake mechanics is that a few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on a review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady-state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of substantial velocity weakening in the 1-20 cm/s range, resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of frictional fault stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.
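For context, the classical Dieterich-Ruina rate-and-state formulation that such laws build on (standard textbook notation, not the paper's specific parameterization) can be written as:

```latex
% Rate-and-state friction with the Dieterich (aging) state evolution law:
\mu = \mu_0 + a\,\ln\frac{V}{V_0} + b\,\ln\frac{V_0\,\theta}{D_c},
\qquad
\frac{\mathrm{d}\theta}{\mathrm{d}t} = 1 - \frac{V\theta}{D_c}.

% At steady state (\dot\theta = 0, hence \theta = D_c/V):
\mu_{\mathrm{ss}}(V) = \mu_0 + (a-b)\,\ln\frac{V}{V_0}.

% Spring-slider stability: slip is unstable (stick-slip) when the loading
% stiffness k falls below the critical stiffness
% k_c = \frac{(b-a)\,\sigma_n}{D_c},
% which is positive only for velocity weakening (a - b < 0).
```

In these terms, a sharper velocity-weakening slope in the 1-20 cm/s range raises k_c there, which is what produces the peak of potential instability the abstract describes.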

  12. Neglected Value of Small Population-based Surveys: A Comparison with Demographic and Health Survey Data

    PubMed Central

    Langston, Anne C.; Sarriot, Eric G.

    2015-01-01

We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729

  13. Neglected value of small population-based surveys: a comparison with demographic and health survey data.

    PubMed

    Langston, Anne C; Prosnitz, Debra M; Sarriot, Eric G

    2015-03-01

    We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729

  14. Evaluating the compatibility of physics-based deterministic synthetic ground motion with empirical GMPE

    NASA Astrophysics Data System (ADS)

    Baumann, C.; Dalguer, L. A.

    2012-12-01

Recent development of deterministic physics-based numerical simulations of earthquakes has contributed to substantial advances in our understanding of different aspects of the earthquake mechanism and near-source ground motion. These models have greater potential for identifying and predicting the variability of near-source ground motions dominated by source and/or geological effects. These advances have led to increased interest in using suites of physics-based models for reliable prediction of the ground motion of future earthquakes for seismic hazard assessment and risk mitigation, particularly in areas with few recorded ground motions. But before using synthetic ground motion, it is important to evaluate the reliability of deterministic synthetic ground motions, particularly their upper frequency limit. Current engineering practice usually uses ground motion quantities estimated from empirical Ground Motion Prediction Equations (GMPEs), such as peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), and spectral ordinates, as input to assess building response for the seismic safety of future and existing structures. It is therefore natural to verify the compatibility of synthetic ground motions with current empirical GMPEs. In this study we attempt to do so for a suite of deterministic ground motions generated by earthquake dynamic rupture models. We focus mainly on determining the upper frequency limit at which the synthetic ground motions are compatible with GMPEs. For that purpose we have generated a suite of dynamic rupture models in a layered 1D velocity structure. The simulations include 360 dynamic rupture models with moment magnitudes in the range 5.5-7, for three styles of faulting (reverse, normal, and strike slip), for both buried faults and surface-rupturing faults. Normal stress and frictional strength are depth and non-depth dependent. Initial stress distribution follows

  15. PowerPoint-Based Lectures in Business Education: An Empirical Investigation of Student-Perceived Novelty and Effectiveness

    ERIC Educational Resources Information Center

    Burke, Lisa A.; James, Karen E.

    2008-01-01

    The use of PowerPoint (PPT)-based lectures in business classes is prevalent, yet it remains empirically understudied in business education research. The authors investigate whether students in the contemporary business classroom view PPT as a novel stimulus and whether these perceptions of novelty are related to students' self-assessment of…

  16. Empirical Differences in Omission Tendency and Reading Ability in PISA: An Application of Tree-Based Item Response Models

    ERIC Educational Resources Information Center

    Okumura, Taichi

    2014-01-01

    This study examined the empirical differences between the tendency to omit items and reading ability by applying tree-based item response (IRTree) models to the Japanese data of the Programme for International Student Assessment (PISA) held in 2009. For this purpose, existing IRTree models were expanded to contain predictors and to handle…

  17. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    ERIC Educational Resources Information Center

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  18. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  19. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  20. Acute traumatic brain injury: is current management evidence based? An empirical analysis of systematic reviews.

    PubMed

    Lei, Jin; Gao, Guoyi; Jiang, Jiyao

    2013-04-01

Traumatic brain injury (TBI) is a major health and socioeconomic problem worldwide, with high rates of death and long-term disability. Previous studies have summarized evidence from large-scale randomized trials, finding no intervention with convincing efficacy for acute TBI management. The present empirical study set out to assess another crucial component of the evidence base, the systematic review, which contributes substantially to evidence-based health care, in terms of clinical issues, methodological aspects, and implications for practice and research. A total of 44 systematic reviews pertaining to therapeutic interventions for acute TBI were identified through electronic database searching, clinical guideline retrieval, and expert consultation, of which 21 were published in the Cochrane Library and 23 in peer-reviewed journals. Their methodological quality was generally satisfactory, with a median Overview Quality Assessment Questionnaire score of 5.5 (interquartile range 2-7). Cochrane reviews are of better quality than regular journal reviews. Twenty-nine high-quality reviews provided no conclusive evidence for the 22 interventions investigated, except for an adverse effect of corticosteroids. Less than one-third of the component trials reported adequate allocation concealment. Other methodological flaws in design, such as ignoring heterogeneity in the TBI population, also contributed to the failure of past clinical research. Based on these findings, evidence from both systematic reviews and clinical trials does not fully support current management of acute TBI. Translating laboratory success into clinical effect remains a unique challenge. Accordingly, it may be time to rethink how future practice and clinical research in TBI are conducted. PMID:23151044

  1. Polarizable Empirical Force Field for Aromatic Compounds Based on the Classical Drude Oscillator

    PubMed Central

    Lopes, Pedro E. M.; Lamoureux, Guillaume; Roux, Benoit; MacKerell, Alexander D.

    2008-01-01

The polarizable empirical CHARMM force field based on the classical Drude oscillator has been extended to the aromatic compounds benzene and toluene. Parameters were optimized for benzene and then transferred directly to toluene, with parameters for the methyl moiety of toluene taken from the previously published work on the alkanes. Optimization of all parameters was performed against an extensive set of quantum mechanical and experimental data. Ab initio data were used in determining the electrostatic parameters, in the vibrational analysis, and in optimizing the relative magnitudes of the Lennard-Jones parameters. The absolute values of the Lennard-Jones parameters were determined by comparing computed and experimental heats of vaporization, molecular volumes, free energies of hydration, and dielectric constants. The newly developed parameter set was extensively tested against additional experimental data, such as vibrational spectra in the condensed phase, diffusion constants, heat capacities at constant pressure, and isothermal compressibilities, including data as a function of temperature. Moreover, the structures of liquid benzene, liquid toluene, and solutions of each in water were studied. In the case of benzene, the computed and experimental total distribution functions were compared, with the developed model shown to be in excellent agreement with experiment. PMID:17388420

  2. Empirical Study of User Preferences Based on Rating Data of Movies

    PubMed Central

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings. PMID:26735847

  3. Ship classification using nonlinear features of radiated sound: an approach based on empirical mode decomposition.

    PubMed

    Bao, Fei; Li, Chen; Wang, Xinlong; Wang, Qingfu; Du, Shuanping

    2010-07-01

Classification of ship-radiated underwater sound is one of the most important and challenging subjects in underwater acoustical signal processing. An approach to ship classification is proposed in this work based on analysis of ship-radiated acoustical noise in subspaces of intrinsic mode functions attained via ensemble empirical mode decomposition. It is shown that detection and acquisition of stable and reliable nonlinear features become practically feasible by nonlinear analysis of the time series of individual decomposed components, each of which is simple enough and well represents an oscillatory mode of ship dynamics. Surrogate and nonlinear predictability analyses are conducted to probe and measure the nonlinearity and regularity. The results of the two methods, which verify each other, substantiate that ship-radiated noises contain components with deterministic nonlinear features that serve well for efficient classification of ships. The approach perhaps opens an alternative avenue toward object classification and identification. It may also offer a new view of signals as complex as ship-radiated sound. PMID:20649216

  4. Percentile-based Empirical Distribution Function Estimates for Performance Evaluation of Healthcare Providers

    PubMed Central

    Paddock, Susan M.; Louis, Thomas A.

    2010-01-01

Hierarchical models are widely used to characterize the performance of individual healthcare providers. However, little attention has been devoted to system-wide performance evaluations, the goals of which include identifying extreme (e.g., top 10%) provider performance and developing statistical benchmarks to define high-quality care. Obtaining optimal estimates of these quantities requires estimating the empirical distribution function (EDF) of provider-specific parameters that generate the dataset under consideration. However, the difficulty of obtaining uncertainty bounds for a square-error loss minimizing EDF estimate has hindered its use in system-wide performance evaluations. We therefore develop and study a percentile-based EDF estimate for univariate provider-specific parameters. We compute order statistics of samples drawn from the posterior distribution of provider-specific parameters to obtain relevant uncertainty assessments of an EDF estimate and its features, such as thresholds and percentiles. We apply our method to data from the Medicare End Stage Renal Disease (ESRD) Program, a health insurance program for people with irreversible kidney failure. We highlight the risk of misclassifying providers as exceptionally good or poor performers when uncertainty in statistical benchmark estimates is ignored. Given the high stakes of performance evaluations, statistical benchmarks should be accompanied by precision estimates. PMID:21918583
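The percentile-based EDF construction can be sketched as follows: draw provider-specific parameters from their joint posterior, sort within each draw to obtain order statistics, then summarize each order statistic across draws. The hierarchical model itself is mocked with synthetic draws here; only the order-statistic summarization reflects the method described.

```python
import numpy as np

rng = np.random.default_rng(2)
J, S = 50, 1000  # providers, posterior draws

# Stand-in posterior: provider effects theta_j with per-draw uncertainty
# (in practice these draws come from a fitted hierarchical model).
theta_true = rng.normal(0, 1, J)
draws = theta_true + rng.normal(0, 0.3, (S, J))

# Order statistics per draw -> percentile-based EDF estimate and bounds.
order = np.sort(draws, axis=1)             # (S, J): sorted provider effects
edf_est = order.mean(axis=0)               # posterior mean of each order stat
lo, hi = np.percentile(order, [2.5, 97.5], axis=0)  # 95% uncertainty bounds

# Example feature: the threshold defining the top 10% of providers,
# reported WITH its uncertainty interval rather than as a point value.
k = int(np.ceil(0.9 * J)) - 1
print(f"top-10% threshold: {edf_est[k]:.2f} [{lo[k]:.2f}, {hi[k]:.2f}]")
```

Reporting the interval around the threshold is what guards against the misclassification risk the abstract highlights: a provider whose estimate lies inside the interval cannot confidently be labeled a top or bottom performer.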

  5. Pseudo-empirical Likelihood-Based Method Using Calibration for Longitudinal Data with Drop-Out

    PubMed Central

    Chen, Baojiang; Zhou, Xiao-Hua; Chan, Kwun Chuen Gary

    2014-01-01

In observational studies, interest mainly lies in estimation of the population-level relationship between the explanatory variables and dependent variables, and the estimation is often undertaken using a sample of longitudinal data. In some situations, the longitudinal data sample features biases and loss of estimation efficiency due to non-random drop-out. However, inclusion of population-level information can increase estimation efficiency. In this paper we propose an empirical likelihood-based method to incorporate population-level information in a longitudinal study with drop-out. The population-level information is incorporated via constraints on functions of the parameters, and non-random drop-out bias is corrected by using a weighted generalized estimating equations method. We provide a three-step estimation procedure that makes computation easier. Some commonly used methods are compared in simulation studies, which demonstrate that our proposed method can correct the non-random drop-out bias and increase the estimation efficiency, especially for small sample size or when the missing proportion is high. In some situations, the efficiency improvement is substantial. Finally, we apply this method to an Alzheimer’s disease study. PMID:25587200

  6. Empirical Study of User Preferences Based on Rating Data of Movies.

    PubMed

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings. PMID:26735847

  7. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    NASA Astrophysics Data System (ADS)

    Le, Chengfeng; Lehrter, John C.; Hu, Chuanmin; Obenour, Daniel R.

    2016-03-01

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L-1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and volume were related to Moderate Resolution Imaging Spectroradiometer-derived monthly estimates of river plume area (km2) and average, inner shelf chlorophyll a concentration (Chl a, mg m-3). River plume area in June was negatively related with midsummer hypoxic area (km2) and volume (km3), while July inner shelf Chl a was positively related to hypoxic area and volume. Multiple regression models using river plume area and Chl a as independent variables accounted for most of the variability in hypoxic area (R2 = 0.92) or volume (R2 = 0.89). These models explain more variation in hypoxic area than models using Mississippi River nutrient loads as independent variables. The results here also support a hypothesis that confinement of the river plume to the inner shelf is an important mechanism controlling hypoxia area and volume in this region.
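The model form described (midsummer hypoxic area regressed on June plume area and July inner-shelf Chl a) can be illustrated with ordinary least squares on synthetic numbers; the coefficients and R² below are illustrative, not the paper's results.

```python
import numpy as np

# Illustrative fit of the paper's model form on synthetic annual data:
# hypoxic_area ~ b0 + b1 * (June plume area) + b2 * (July inner-shelf Chl a)
rng = np.random.default_rng(3)
n = 15                                   # ~annual midsummer observations
plume = rng.uniform(5e3, 3e4, n)         # km^2
chla = rng.uniform(2, 12, n)             # mg m^-3
# Generating model mirrors the reported signs: plume negative, Chl a positive.
hypoxic = 2.0e4 - 0.4 * plume + 900 * chla + rng.normal(0, 1500, n)

X = np.column_stack([np.ones(n), plume, chla])
beta, *_ = np.linalg.lstsq(X, hypoxic, rcond=None)
pred = X @ beta
r2 = 1 - ((hypoxic - pred) ** 2).sum() / ((hypoxic - hypoxic.mean()) ** 2).sum()
print(f"b1 (plume) = {beta[1]:.3f}, b2 (Chl a) = {beta[2]:.1f}, R^2 = {r2:.2f}")
```

The two-predictor structure is the point: plume area captures physical confinement while Chl a captures biological oxygen demand, which is why the combination outperforms nutrient-load-only regressions in the study.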

  8. Empirical prediction of Indian summer monsoon rainfall with different lead periods based on global SST anomalies

    NASA Astrophysics Data System (ADS)

    Pai, D. S.; Rajeevan, M.

    2006-02-01

The main objective of this study was to develop empirical models with different seasonal lead times for the long-range prediction of seasonal (June to September) Indian summer monsoon rainfall (ISMR). For this purpose, 13 predictors having significant and stable relationships with ISMR were derived by correlation analysis of global grid-point seasonal sea-surface temperature (SST) anomalies and the tendency in the SST anomalies. The time lags of the seasonal SST anomalies were varied from 1 season to 4 years behind the reference monsoon season. The basic SST data set used was the monthly NOAA Extended Reconstructed Global SST (ERSST) data on a 2° × 2° spatial grid for the period 1951-2003. The time lags of the 13 predictors derived from various areas of all three tropical ocean basins (Indian, Pacific, and Atlantic) varied from 1 season to 3 years. Based on these inter-correlated predictors, 3 predictor subsets A, B, and C were formed with prediction lead times of 0, 1, and 2 seasons, respectively, from the beginning of the monsoon season. The selected principal components (PCs) of these predictor sets were used as the input parameters for models A, B, and C, respectively. The model development period was 1955-1984. The correct model size was derived using the all-possible-regressions procedure and Mallows' Cp statistic.
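An all-possible-regressions search scored with Mallows' Cp, of the kind used to fix the model size, can be sketched as follows (synthetic predictors standing in for the SST-based PCs; the data and predictor count are invented for illustration).

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
n, k = 30, 5                         # years, candidate predictors (mock PCs)
X = rng.standard_normal((n, k))
# Only predictors 0 and 2 actually drive the response in this toy setup.
y = 1.0 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(0, 0.5, n)

def sse(cols):
    """Residual sum of squares for an OLS fit on the given columns."""
    A = np.column_stack([np.ones(n), X[:, list(cols)]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return r @ r

# Error variance estimate from the full model (standard Cp convention).
s2 = sse(tuple(range(k))) / (n - k - 1)

# All-possible-regressions: score every nonempty subset with Mallows' Cp
# (Cp = SSE_p / s^2 - n + 2p, with p parameters including the intercept);
# a good model has Cp close to p.
best = None
for size in range(1, k + 1):
    for cols in combinations(range(k), size):
        cp = sse(cols) / s2 - n + 2 * (size + 1)
        if best is None or cp < best[0]:
            best = (cp, cols)
print(f"Cp-selected predictors: {best[1]} (Cp = {best[0]:.2f})")
```

Exhaustive search over 2^k - 1 subsets is feasible only for a handful of candidates, which fits the study's setting of a few selected PCs per predictor subset.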

  9. The mature minor: some critical psychological reflections on the empirical bases.

    PubMed

    Partridge, Brian C

    2013-06-01

    Moral and legal notions engaged in clinical ethics should not only possess analytic clarity but a sound basis in empirical findings. The latter condition brings into question the expansion of the mature minor exception. The mature minor exception in the healthcare law of the United States has served to enable those under the legal age to consent to medical treatment. Although originally developed primarily for minors in emergency or quasi-emergency need for health care, it was expanded especially from the 1970s in order to cover unemancipated minors older than 14 years. This expansion initially appeared plausible, given psychological data that showed the intellectual capacity of minors over 14 to recognize the causal connection between their choices and the consequences of their choices. However, subsequent psychological studies have shown that minors generally fail to have realistic affective and evaluative appreciations of the consequences of their decisions, because they tend to over-emphasize short-term benefits and underestimate long-term risks. Also, unlike most decisionmakers over 21, the decisions of minors are more often marked by the lack of adequate impulse control, all of which is reflected in the far higher involvement of adolescents in acts of violence, intentional injury, and serious automobile accidents. These effects are more evident in circumstances that elicit elevated affective responses. The advent of brain imaging has allowed the actual visualization of qualitative differences between how minors versus persons over the age of 21 generally assess risks and benefits and make decisions. In the case of most under the age of 21, subcortical systems fail adequately to be checked by the prefrontal systems that are involved in adult executive decisions. 
The neuroanatomical and psychological model developed by Casey, Jones, and Somerville offers an empirical insight into the qualitative differences in the neuroanatomical and neuropsychological bases

  10. Selective Survey of Online Access to Social Science Data Bases

    ERIC Educational Resources Information Center

    Donati, Robert

    1977-01-01

    Data bases in the social sciences are briefly surveyed with emphasis on content, coverage, and currency. Coverage of certain topics is quantitatively compared among several social science and more general files. Techniques for online thesaurus utilization and a systematic application of the same search strategies across multiple files are…

  11. Place Based Assistance Tools: Networking and Resident Surveys.

    ERIC Educational Resources Information Center

    Department of Housing and Urban Development, Washington, DC.

    "Place-based assistance" is not a new concept. Asking what people want and finding ways to give it to them sounds simplistic, but it can result in "win-win" solutions in which everyone involved benefits. This document is a guide to using networking and surveys of residents to determine community needs. Some case studies show networking and surveys…

  12. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  13. Space-based infrared surveys of small bodies

    NASA Astrophysics Data System (ADS)

    Mommert, M.

    2014-07-01

    Most small bodies in the Solar System are too small and too distant to be spatially resolved, precluding a direct diameter derivation. Furthermore, measurements of the optical brightness alone only allow a rough estimate of the diameter, since the surface albedo is usually unknown and can have values between about 3 % and 60 % or more. The degeneracy can be resolved by considering the thermal emission of these objects, which is less prone to albedo effects and mainly a function of the diameter. Hence, the combination of optical and thermal-infrared observational data provides a means to independently derive an object's diameter and albedo. This technique is used in asteroid thermal models or more sophisticated thermophysical models (see, e.g., [1]). Infrared observations require cryogenic detectors and/or telescopes, depending on the actual wavelength range observed. Observations from the ground are additionally compromised by the variable transparency of Earth's atmosphere in major portions of the infrared wavelength ranges. Hence, space-based infrared telescopes, providing stable conditions and significantly better sensitivities than ground-based telescopes, are now used routinely to exploit this wavelength range. Two observation strategies are used with space-based infrared observatories: Space-based Infrared All-Sky Surveys. Asteroid surveys in the thermal infrared are less prone to albedo-related discovery bias compared to surveys with optical telescopes, providing a more complete picture of small body populations. The first space-based infrared survey of Solar System small bodies was performed with the Infrared Astronomical Satellite (IRAS) for 10 months in 1983. In the course of the 'IRAS Minor Planet Survey' [2], 2228 asteroids (3 new discoveries) and more than 25 comets (6 new discoveries) were observed. More recent space-based infrared all-sky asteroid surveys were performed by Akari (launched 2006) and the Wide-field Infrared Survey Explorer (WISE
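The albedo-diameter degeneracy described above can be made concrete with the standard relation between absolute magnitude H, geometric albedo p_V, and diameter, D = 1329 km × 10^(-H/5) / sqrt(p_V). The H and albedo values below are illustrative only:

```python
import math

def diameter_km(H, p_v):
    """Standard conversion: D = 1329 km / sqrt(p_V) * 10^(-H/5)."""
    return 1329.0 / math.sqrt(p_v) * 10 ** (-H / 5.0)

# The same optical brightness (H = 18) implies very different sizes
# depending on the (usually unknown) surface albedo:
d_dark = diameter_km(18.0, 0.03)    # dark, comet-like surface
d_bright = diameter_km(18.0, 0.60)  # bright surface
print(round(d_dark, 2), round(d_bright, 2), "km")
```

The dark and bright assumptions differ in implied diameter by a factor of sqrt(0.60/0.03) ≈ 4.5, which is why thermal-infrared data are needed to break the degeneracy.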

  14. Survey-based Indices for Nursing Home Quality Incentive Reimbursement

    PubMed Central

    Willemain, Thomas R.

    1983-01-01

    Incentive payments are a theoretically appealing complement to nursing home quality assurance systems that rely on regulatory enforcement. However, the practical aspects of incentive program design are not yet well understood. After reviewing the rationale for incentive approaches and recent State and Federal initiatives, the article considers a basic program design issue: creating an index of nursing home quality. It focuses on indices constructed from routine licensure and certification survey results because State initiatives have relied heavily on these readily accessible data. It also suggests a procedure for creating a survey-based index and discusses a sampling of implementation issues. PMID:10309858

  15. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
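As a minimal illustration of the design-based approach, the sketch below estimates mean density from a simple random sample of quadrats using the standard SRS estimator with a finite-population correction. The mollusk counts are invented for illustration:

```python
import random
import statistics

# Hypothetical sampling frame: mollusk counts per quadrat in N sampling units
# (sparse and clumped, as is typical of freshwater mollusk populations)
random.seed(1)
N = 400
population = [random.choice([0, 0, 0, 1, 2, 5, 12]) for _ in range(N)]

# Design-based inference: draw a simple random sample of n units
n = 40
sample = random.sample(population, n)

# Estimated mean density and its standard error under SRS,
# with the finite-population correction (1 - n/N)
ybar = statistics.fmean(sample)
s2 = statistics.variance(sample)
se = ((1 - n / N) * s2 / n) ** 0.5
print(f"mean density ≈ {ybar:.2f} per quadrat, SE ≈ {se:.2f}")
```

Here the validity of the estimate rests on the randomization alone; a model-based analysis would instead posit a distribution for the counts (e.g. negative binomial for clumped populations) and could, as the abstract notes, gain precision for sparse populations.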

  16. Selecting Great Lakes streams for lampricide treatment based on larval sea lamprey surveys

    USGS Publications Warehouse

    Christie, Gavin C.; Adams, Jean V.; Steeves, Todd B.; Slade, Jeffrey W.; Cuddy, Douglas W.; Fodale, Michael F.; Young, Robert J.; Kuc, Miroslaw; Jones, Michael L.

    2003-01-01

    The Empiric Stream Treatment Ranking (ESTR) system is a data-driven, model-based, decision tool for selecting Great Lakes streams for treatment with lampricide, based on estimates from larval sea lamprey (Petromyzon marinus) surveys conducted throughout the basin. The 2000 ESTR system was described and applied to larval assessment surveys conducted from 1996 to 1999. A comparative analysis of stream survey and selection data was conducted and improvements to the stream selection process were recommended. Streams were selected for treatment based on treatment cost, predicted treatment effectiveness, and the projected number of juvenile sea lampreys produced. On average, lampricide treatments were applied annually to 49 streams with 1,075 ha of larval habitat, killing 15 million larval and 514,000 juvenile sea lampreys at a total cost of $5.3 million, and marginal and mean costs of $85 and $10 per juvenile killed. The numbers of juvenile sea lampreys killed for given treatment costs showed a pattern of diminishing returns with increasing investment. Of the streams selected for treatment, those with > 14 ha of larval habitat targeted 73% of the juvenile sea lampreys for 60% of the treatment cost. Suggested improvements to the ESTR system were to improve accuracy and precision of model estimates, account for uncertainty in estimates, include all potentially productive streams in the process (not just those surveyed in the current year), consider the value of all larvae killed during treatment (not just those predicted to metamorphose the following year), use lake-specific estimates of damage, and establish formal suppression targets.
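The diminishing-returns pattern reported above (marginal cost well above mean cost) falls out of any selection that ranks streams by cost-effectiveness and spends down a budget. The stream costs and kill counts below are invented for illustration, not ESTR data:

```python
# Hypothetical streams: (treatment cost in $k, juveniles killed in thousands)
streams = [(120, 60), (90, 30), (200, 50), (60, 10), (150, 15), (80, 5)]

# Rank by cost per juvenile killed, most cost-effective first
ranked = sorted(streams, key=lambda s: s[0] / s[1])

budget = 500
chosen, spent = [], 0
for cost, kills in ranked:
    if spent + cost <= budget:
        chosen.append((cost, kills))
        spent += cost

total_cost = sum(c for c, _ in chosen)
total_kills = sum(k for _, k in chosen)
mean_cost = total_cost / total_kills           # $ per juvenile, averaged
marginal_cost = chosen[-1][0] / chosen[-1][1]  # $ per juvenile, last stream added
print(f"mean ${mean_cost:.2f}, marginal ${marginal_cost:.2f} per juvenile")
```

Because streams are added in order of increasing cost per kill, the last stream funded is always the least efficient one selected, so the marginal cost necessarily exceeds the mean cost, mirroring the $85 vs. $10 gap in the abstract.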

  17. A novel signal compression method based on optimal ensemble empirical mode decomposition for bearing vibration signals

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Tse, Peter W.

    2013-01-01

    Today, remote machine condition monitoring is popular due to the continuous advancement in wireless communication. Bearings are the most frequently and easily failed components in many rotating machines. To accurately identify the type of bearing fault, large amounts of vibration data need to be collected. However, the volume of transmitted data cannot be too high because the bandwidth of wireless communication is limited. To solve this problem, the data are usually compressed before transmission to a remote maintenance center. This paper proposes a novel signal compression method that can substantially reduce the amount of data that need to be transmitted without sacrificing the accuracy of fault identification. The proposed signal compression method is based on ensemble empirical mode decomposition (EEMD), which is an effective method for adaptively decomposing the vibration signal into different bands of signal components, termed intrinsic mode functions (IMFs). An optimization method was designed to automatically select appropriate EEMD parameters for the analyzed signal, and in particular to select the appropriate level of the added white noise in the EEMD method. An index termed the relative root-mean-square error was used to evaluate the decomposition performances under different noise levels to find the optimal level. After applying the optimal EEMD method to a vibration signal, the IMF relating to the bearing fault can be extracted from the original vibration signal. Compressing this signal component yields a much smaller proportion of data samples to be retained for transmission and further reconstruction. The proposed compression method was also compared with the popular wavelet compression method. 
Experimental results demonstrate that the optimization method can automatically find appropriate EEMD parameters for the analyzed signals, and the IMF-based compression method provides a higher compression ratio, while retaining the bearing defect
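A toy sketch of the ensemble idea behind EEMD and the relative root-mean-square error index: here a moving-average detrender stands in for a full sifting pass (it is not EMD, and this is not the authors' algorithm), but it shows how the white noise added to each ensemble member cancels in the average, at roughly the 1/sqrt(trials) rate that motivates the trials/noise-level trade-off:

```python
import numpy as np

def highpass(x, w=9):
    # Stand-in for one sifting pass: subtract a moving-average trend
    return x - np.convolve(x, np.ones(w) / w, mode="same")

def ensemble_decompose(x, noise_std, trials, rng):
    # EEMD ensemble step: decompose noise-perturbed copies, then average
    return np.mean(
        [highpass(x + rng.normal(0.0, noise_std, x.size)) for _ in range(trials)],
        axis=0,
    )

def rel_rmse(est, ref):
    # Relative root-mean-square error between an estimate and a reference
    return np.sqrt(np.mean((est - ref) ** 2)) / np.sqrt(np.mean(ref ** 2))

rng = np.random.default_rng(42)
t = np.linspace(0.0, 1.0, 1024)
# Bearing-like signal: periodic high-frequency bursts on a slow trend
x = np.sin(2 * np.pi * 97 * t) * (np.sin(2 * np.pi * 5 * t) > 0.95) \
    + 0.5 * np.sin(2 * np.pi * 3 * t)

ref = highpass(x)  # noiseless decomposition as the reference
errs = [rel_rmse(ensemble_decompose(x, 0.2, T, rng), ref) for T in (1, 10, 100)]
print(errs)  # shrinks roughly like 1/sqrt(trials)
```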

  18. Survey on Existing Science Gateways - Based on DEGREE

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Claus, Steffen

    2010-05-01

    Science Gateways gather community-developed tools, applications, and data, usually combined and presented in a graphical user interface that is customized to the needs of the target user community. Science Gateways serve as a single point of entry for the users and are usually represented by fat clients or web portals. Part of the DEGREE project (Dissemination and Exploitation of Grids in Earth Science) was a state-of-the-art survey of portal usage in Earth Science (ES) applications. This survey considered a list of 29 portals, including 17 ES portals and 12 generic developments coming from outside the ES domain. The survey identified three common usage types of ES portals: data dissemination (e.g. observational data), collaboration, and usage of Grid-based resources (e.g. for processing of ES datasets). Based on these three usage types, key requirements could be extracted. These requirements were furthermore used for a feature comparison with existing portal developments coming from outside the ES domain. This presentation gives an overview of the results of the survey (including a feature comparison of ES and non-ES portals). Furthermore, three portals are discussed in detail, one for each usage type (data dissemination, collaboration, Grid-based).

  19. Implementing community-based provider participation in research: an empirical study

    PubMed Central

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  20. Questionnaire surveys: methodological and epistemological problems for field-based ethnopharmacologists.

    PubMed

    Edwards, Sarah; Nebel, Sabine; Heinrich, Michael

    2005-08-22

    The classical scientific approach is empirical. One of the favoured means of gathering quantitative data in the health and social sciences, including ethnopharmacology and medical ethnobotany, is by use of questionnaires. However, while there are numerous published articles discussing the importance of questionnaire content, the fact that questionnaires themselves may be inappropriate in a number of cultural contexts, even where literacy is not a factor, is usually ignored. In this paper, the authors will address the main issues posed by the use of questionnaire surveys, using case studies based on their own personal experiences as ethnopharmacologists 'in the field'. The pros and cons of qualitative and quantitative research and the use of alternative means to elicit quantitative data will be discussed. PMID:16009518

  1. How much does participatory flood management contribute to stakeholders' social capacity building? Empirical findings based on a triangulation of three evaluation approaches

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Menzel, S.; Home, R.

    2013-06-01

    Recent literature suggests that dialogic forms of risk communication are more effective in building stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of unequivocal empirical evidence on the relevance of these effects. This is mainly due to the methodological limitations of the existing evaluation approaches. In our paper we aim at eliciting the contribution of participatory river revitalisation projects to stakeholders' social capacity building by triangulating the findings of three evaluation studies that were based on different approaches: a field-experimental, a qualitative long-term ex-post, and a cross-sectional household survey approach. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvement should be more explicitly designed as a tool for long-term social learning.

  2. Assessment of diffuse trace metal inputs into surface waters - Combining empirical estimates with process based simulations

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Steinz, André; Schmidt, Jürgen

    2015-04-01

    As a result of mining activities since the 13th century, surface waters of the German Mulde catchment suffer from deleterious dissolved and sediment-attached lead (Pb) and zinc (Zn) inputs. The leaching rate of trace metals with drainage water is a significant criterion for assessing trace metal concentrations of soils and associated risks of ground water pollution. However, the vertical transport rates of trace metals in soils are difficult to quantify. Monitoring is restricted to small lysimeter plots, which limits the transferability of results. Additionally, the solid-liquid transfer conditions in soils are highly variable, primarily due to the fluctuating retention time of percolating soil water. In contrast, lateral sediment-attached trace metal inputs are mostly associated with soil erosion and resulting sediment inputs into surface waters. Since soil erosion by water is related to rare single events, monitoring and empirical estimates have obvious shortcomings. This gap in knowledge can only be closed by process-based model calculations. Concerning these calculations, it has to be considered that Pb and Zn are predominantly attached to the fine-grained soil particles (<0.063 mm). The selective nature of soil erosion causes a preferential transport of these fine particles, while less contaminated larger particles remain on site. Consequently, trace metals are enriched in the eroded sediment compared to the original soil. This paper aims to introduce both a new method that allows the assessment of trace metal leaching rates from contaminated top soils under standardised transfer conditions and a process-based modelling approach for sediment-attached trace metal inputs into surface waters. Pb and Zn leaching rates amount to 20 Mg ha-1 yr-1 and 114 Mg ha-1 yr-1, respectively. Deviations from observed dissolved trace metal yields at the Bad Düben gauging station are caused by plant uptake and subsoil retention. 
Sediment-attached Pb and Zn input rates amount to 114 Mg ha-1 yr

  3. THE COS-HALOS SURVEY: AN EMPIRICAL DESCRIPTION OF METAL-LINE ABSORPTION IN THE LOW-REDSHIFT CIRCUMGALACTIC MEDIUM

    SciTech Connect

    Werk, Jessica K.; Prochaska, J. Xavier; Tripp, Todd M.; O'Meara, John M.; Peeples, Molly S.

    2013-02-15

    We present the equivalent width and column density measurements for low and intermediate ionization states of the circumgalactic medium (CGM) surrounding 44 low-z, L ≈ L* galaxies drawn from the COS-Halos survey. These measurements are derived from far-UV transitions observed in HST/COS and Keck/HIRES spectra of background quasars within an impact parameter R < 160 kpc to the targeted galaxies. The data show significant metal-line absorption for 33 of the 44 galaxies, including quiescent systems, revealing the common occurrence of a cool (T ≈ 10^4-10^5 K), metal-enriched CGM. The detection rates and column densities derived for these metal lines decrease with increasing impact parameter, a trend we interpret as a declining metal surface density profile for the CGM. A comparison of the relative column densities of adjacent ionization states indicates that the gas is predominantly ionized. The large surface density in metals demands a large reservoir of metals and gas in the cool CGM (very conservatively, M_CGM^cool > 10^9 M_Sun), which likely traces a distinct density and/or temperature regime from the highly ionized CGM traced by O^+5 absorption. The large dispersion in absorption strengths (including non-detections) suggests that the cool CGM traces a wide range of densities or a mix of local ionizing conditions. Lastly, the kinematics inferred from the metal-line profiles are consistent with the cool CGM being bound to the dark matter halos hosting the galaxies; this gas may serve as fuel for future star formation. Future work will leverage this data set to provide estimates on the mass, metallicity, dynamics, and origin of the cool CGM in low-z, L* galaxies.
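The equivalent-width-to-column-density step in work like this is often first approximated with the optically thin (linear curve-of-growth) relation N = 1.13e20 × W / (f λ²), with W and λ in Angstroms. The sketch below uses placeholder line parameters, not COS-Halos measurements, and an assumed oscillator strength f = 1.0 (real f-values come from atomic line lists):

```python
def column_density_thin(w_angstrom, f_osc, wavelength_angstrom):
    """Optically thin approximation: N [cm^-2] = 1.13e20 * W / (f * lambda^2),
    with W and lambda in Angstroms. Valid only on the linear part of the
    curve of growth; saturated lines yield lower limits instead."""
    return 1.13e20 * w_angstrom / (f_osc * wavelength_angstrom ** 2)

# Placeholder transition: W = 0.2 A at 1260 A with assumed f = 1.0
n = column_density_thin(0.2, 1.0, 1260.0)
print(f"N ≈ {n:.2e} cm^-2")
```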

  4. Competence-based demands made of senior physicians: an empirical study to evaluate leadership competencies.

    PubMed

    Lehr, Bosco; Ostermann, Herwig; Schubert, Harald

    2011-01-01

    As a result of increased economising in German hospitals, the organisation and deployment of senior medical staff are changing. New demands are made of senior hospital management. Leadership competencies in the training and development of physicians are of prime importance to the successful performance of managerial responsibilities. The present study investigates the actual and targeted demands of leadership made of senior medical staff in terms of how these demands are perceived. To this end, the demands of leadership were surveyed using a competence-based questionnaire and investigated with a view to professional development potential, using the senior management of psychiatric hospitals in Germany as an example. In all, the results show high ratings in personal performance, the greatest significance being attributed to value-oriented competence in the actual assessment of demands on leadership. Besides gender-specific differences in the actual assessments of single fields of competence, the greatest differences between the targeted and the actual demands are, overall, in the competencies of self-management and communication. Competence-based core areas in leadership can be identified for the professional development of physicians, and an adaptive mode of procedure deduced. PMID:22176981

  5. Simulation of Long Lived Tracers Using an Improved Empirically Based Two-Dimensional Model Transport Algorithm

    NASA Technical Reports Server (NTRS)

    Fleming, E. L.; Jackman, C. H.; Stolarski, R. S.; Considine, D. B.

    1998-01-01

    We have developed a new empirically-based transport algorithm for use in our GSFC two-dimensional transport and chemistry model. The new algorithm contains planetary wave statistics, and parameterizations to account for the effects due to gravity waves and equatorial Kelvin waves. As such, this scheme utilizes significantly more information compared to our previous algorithm which was based only on zonal mean temperatures and heating rates. The new model transport captures much of the qualitative structure and seasonal variability observed in long lived tracers, such as: isolation of the tropics and the southern hemisphere winter polar vortex; the well mixed surf-zone region of the winter sub-tropics and mid-latitudes; the latitudinal and seasonal variations of total ozone; and the seasonal variations of mesospheric H2O. The model also indicates a double peaked structure in methane associated with the semiannual oscillation in the tropical upper stratosphere. This feature is similar in phase but is significantly weaker in amplitude compared to the observations. The model simulations of carbon-14 and strontium-90 are in good agreement with observations, both in simulating the peak in mixing ratio at 20-25 km, and the decrease with altitude in mixing ratio above 25 km. We also find mostly good agreement between modeled and observed age of air determined from SF6 outside of the northern hemisphere polar vortex. However, observations inside the vortex reveal significantly older air compared to the model. This is consistent with the model deficiencies in simulating CH4 in the northern hemisphere winter high latitudes and illustrates the limitations of the current climatological zonal mean model formulation. The propagation of seasonal signals in water vapor and CO2 in the lower stratosphere showed general agreement in phase, and the model qualitatively captured the observed amplitude decrease in CO2 from the tropics to midlatitudes. However, the simulated seasonal

  6. Model Selection for Equating Testlet-Based Tests in the NEAT Design: An Empirical Study

    ERIC Educational Resources Information Center

    He, Wei; Li, Feifei; Wolfe, Edward W.; Mao, Xia

    2012-01-01

    For tests solely composed of testlets, the local item independence assumption tends to be violated. This study used empirical data from a large-scale state assessment program to investigate the effects of using different models on equating results under the non-equivalent group anchor-test (NEAT) design. Specifically, the…

  7. Universal Design for Instruction in Postsecondary Education: A Systematic Review of Empirically Based Articles

    ERIC Educational Resources Information Center

    Roberts, Kelly D.; Park, Hye Jin; Brown, Steven; Cook, Bryan

    2011-01-01

    Universal Design for Instruction (UDI) in postsecondary education is a relatively new concept/framework that has generated significant support. The purpose of this literature review was to examine existing empirical research, including qualitative, quantitative, and mixed methods, on the use of UDI (and related terms) in postsecondary education.…

  8. Comparisons of experiment with cellulose models based on electronic structure and empirical force field theories

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Studies of cellobiose conformations with HF/6-31G* and B3LYP/6-31+G* quantum theory [1] gave a reference for studies with the much faster empirical methods such as MM3, MM4, CHARMM and AMBER. The quantum studies also enable a substantial reduction in the number of exo-cyclic group orientations that...

  9. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

    There has been increasing interest in using web-based surveys, rather than paper-based surveys, to collect data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection, especially when computer-assisted…

  10. Teacher-Reported Use of Empirically Validated and Standards-Based Instructional Approaches in Secondary Mathematics

    ERIC Educational Resources Information Center

    Gagnon, Joseph Calvin; Maccini, Paula

    2007-01-01

    A random sample of 167 secondary special and general educators who taught math to students with emotional and behavioral disorders (EBD) and learning disabilities (LD) responded to a mail survey. The survey examined teacher perceptions of (a) definition of math; (b) familiarity with course topics; (c) effectiveness of methods courses; (d)…

  11. HIV testing in national population-based surveys: experience from the Demographic and Health Surveys.

    PubMed Central

    Mishra, Vinod; Vaessen, Martin; Boerma, J. Ties; Arnold, Fred; Way, Ann; Barrere, Bernard; Cross, Anne; Hong, Rathavuth; Sangha, Jasbir

    2006-01-01

    OBJECTIVES: To describe the methods used in the Demographic and Health Surveys (DHS) to collect nationally representative data on the prevalence of human immunodeficiency virus (HIV) and assess the value of such data to country HIV surveillance systems. METHODS: During 2001-04, national samples of adult women and men in Burkina Faso, Cameroon, Dominican Republic, Ghana, Mali, Kenya, United Republic of Tanzania and Zambia were tested for HIV. Dried blood spot samples were collected for HIV testing, following internationally accepted ethical standards. The results for each country are presented by age, sex, and urban versus rural residence. To estimate the effects of non-response, HIV prevalence among non-responding males and females was predicted from multivariate statistical models fitted to those who were tested, using a common set of predictor variables. RESULTS: Rates of HIV testing varied from 70% among Kenyan men to 92% among women in Burkina Faso and Cameroon. Despite large differences in HIV prevalence between the surveys (1-16%), fairly consistent patterns of HIV infection were observed by age, sex and urban versus rural residence, with considerably higher rates in urban areas and in women, especially at younger ages. Analysis of non-response bias indicates that although predicted HIV prevalence tended to be higher in non-tested males and females than in those tested, the overall effects of non-response on the observed national estimates of HIV prevalence are insignificant. CONCLUSIONS: Population-based surveys can provide reliable, direct estimates of national and regional HIV seroprevalence among men and women irrespective of pregnancy status. Survey data greatly enhance surveillance systems and the accuracy of national estimates in generalized epidemics. PMID:16878227

  12. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey

    PubMed Central

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L.; Bryant, Richard A.; Li, Meng; Qian, Meng; Brown, Adam D.

    2015-01-01

    Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient History Questionnaire-9 (PHQ-9). These findings revealed that among human rights advocates that completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work. PMID:26700305

  13. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey.

    PubMed

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L; Bryant, Richard A; Li, Meng; Qian, Meng; Brown, Adam D

    2015-01-01

    Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient History Questionnaire-9 (PHQ-9). These findings revealed that among human rights advocates that completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work. PMID:26700305

  14. Empirical model of equatorial electrojet based on ground-based magnetometer data during solar minimum in fall

    NASA Astrophysics Data System (ADS)

    Hamid, Nurul Shazana Abdul; Liu, Huixin; Uozumi, Teiji; Yoshikawa, Akimasa

    2015-12-01

    In this study, we constructed an empirical model of the equatorial electrojet (EEJ), including local time and longitudinal dependence, based on simultaneous data from 12 magnetometer stations located in six longitude sectors. The analysis was carried out using the equatorial electrojet index, EUEL, calculated from the geomagnetic northward H component. The magnetic EEJ strength is calculated as the difference between the normalized EUEL index of the magnetic dip equator station and the normalized EUEL index of the off-dip equator station located beyond the EEJ band. The analysis showed that this current is always strongest in the South American sector, regardless of local time (LT), and weakest in the Indian sector between 0900 and 1000 LT, shifting to the African sector during 1100 to 1400 LT. These longitudinal variations of the EEJ roughly follow variations of the inverse main field strength along the dip equator, except for the Indian and Southeast Asian sectors. The results showed that the EEJ component derived from the model exhibits a pattern similar to the EEJ measured from ground data during noontime, mainly before 1300 LT.
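    The EEJ strength defined above (normalized dip-equator EUEL minus normalized off-dip EUEL) can be sketched in a few lines. The station values and normalization factors below are illustrative assumptions, not the paper's actual data or normalization scheme:

```python
import numpy as np

def eej_strength(euel_dip, euel_off, norm_dip=1.0, norm_off=1.0):
    """EEJ strength as the difference between the normalized EUEL index of
    the dip-equator station and that of an off-dip station beyond the EEJ
    band (the off-dip station removes the non-EEJ part of the H variation).

    norm_dip / norm_off are station normalization factors; their actual
    definition in the paper is not reproduced here.
    """
    return np.asarray(euel_dip, float) / norm_dip - np.asarray(euel_off, float) / norm_off

# toy hourly EUEL values (nT) for a dip-equator and an off-dip station
dip = [20.0, 45.0, 80.0, 60.0]
off = [5.0, 10.0, 15.0, 12.0]
eej = eej_strength(dip, off)  # per-hour EEJ contribution
```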

  15. [DGRW-update: neurology--from empirical strategies towards evidence based interventions].

    PubMed

    Schupp, W

    2011-12-01

    Stroke, multiple sclerosis (MS), traumatic brain injuries (TBI) and neuropathies are the most important diseases in neurological rehabilitation financed by the German Pension Insurance. The primary goal is vocational (re)integration. Driven by multiple findings of neuroscience research, the traditional holistic approach with mainly empirically derived strategies was developed further and improved by new evidence-based interventions. This process had been, and continues to be, necessary to meet the health-economic pressure for ever shorter and more efficient rehab measures. Evidence-based interventions refer to symptom-oriented measures, to team-management concepts, as well as to education and psychosocial interventions. Drug therapy and/or neurophysiological measures can be added to increase neuroregeneration and neuroplasticity. Evidence-based aftercare concepts support the sustainability and steadiness of rehab results. Mirror therapy, robot-assisted training, mental training, task-specific training, and above all constraint-induced movement therapy (CIMT) can restore motor arm and hand functions. Treadmill training and robot-assisted training improve stance and gait. Botulinum toxin injections in combination with physical and redressing methods are superior in managing spasticity. Guideline-oriented management of associated pain syndromes (myofascial, neuropathic, complex-regional=dystrophic) improves primary outcome and quality of life. Drug therapy with so-called co-analgesics and physical therapy play an important role in pain management. Swallowing disorders lead to higher mortality and morbidity in the acute phase; stepwise diagnostics (screening, endoscopy, radiology) and specific swallowing therapy can reduce these risks and frequently restore normal eating and drinking. In our modern industrial societies, communicative and cognitive disturbances are more impairing than the above-mentioned disorders. Speech and language therapy (SLT) is dominant in

  16. Upscaling Empirically Based Conceptualisations to Model Tropical Dominant Hydrological Processes for Historical Land Use Change

    NASA Astrophysics Data System (ADS)

    Toohey, R.; Boll, J.; Brooks, E.; Jones, J.

    2009-12-01

    Surface runoff and percolation to ground water are two hydrological processes of concern to the Atlantic slope of Costa Rica because of their impacts on flooding and drinking water contamination. As per legislation, the Costa Rican Government funds land use management from the farm to the regional scale to improve or conserve hydrological ecosystem services. In this study, we examined how land use (e.g., forest, coffee, sugar cane, and pasture) affects hydrological response at the point, plot (1 m²), and field scale (1-6 ha) to empirically conceptualize the dominant hydrological processes in each land use. Using our field data, we upscaled these conceptual processes into a physically based distributed hydrological model at the field, watershed (130 km²), and regional (1500 km²) scales. At the point and plot scales, the presence of macropores and large roots promoted greater vertical percolation and subsurface connectivity in the forest and coffee field sites. The lack of macropores and large roots, plus the addition of management artifacts (e.g., surface compaction and a plough layer), altered the dominant hydrological processes by increasing lateral flow and surface runoff in the pasture and sugar cane field sites. Macropores and topography were major influences on runoff generation at the field scale. Also at the field scale, antecedent moisture conditions suggest a threshold behavior as a temporal control on surface runoff generation. However, in this tropical climate with very intense rainstorms, annual surface runoff was less than 10% of annual precipitation at the field scale. Significant differences in soil and hydrological characteristics observed at the point and plot scales appear to have less significance when upscaled to the field scale. At the point and plot scales, percolation acted as the dominant hydrological process in this tropical environment.
However, at the field scale for sugar cane and pasture sites, saturation-excess runoff increased as

  17. Avian survey and field guide for Osan Air Base, Korea.

    SciTech Connect

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and from the survey results, determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with the EGS Criterion 13-3.h.

  18. The EMPIRE Survey: Systematic Variations in the Dense Gas Fraction and Star Formation Efficiency from Full-disk Mapping of M51

    NASA Astrophysics Data System (ADS)

    Bigiel, Frank; Leroy, Adam K.; Jiménez-Donaire, Maria J.; Pety, Jérôme; Usero, Antonio; Cormier, Diane; Bolatto, Alberto; Garcia-Burillo, Santiago; Colombo, Dario; González-García, Manuel; Hughes, Annie; Kepley, Amanda A.; Kramer, Carsten; Sandstrom, Karin; Schinnerer, Eva; Schruba, Andreas; Schuster, Karl; Tomicic, Neven; Zschaechner, Laura

    2016-05-01

    We present the first results from the EMPIRE survey, an IRAM large program that is mapping tracers of high-density molecular gas across the disks of nine nearby star-forming galaxies. Here, we present new maps of the 3 mm transitions of HCN, HCO+, and HNC across the whole disk of our pilot target, M51. As expected, dense gas correlates with tracers of recent star formation, filling the “luminosity gap” between Galactic cores and whole galaxies. In detail, we show that both the fraction of gas that is dense, f_dense (traced by HCN/CO), and the rate at which dense gas forms stars, SFE_dense (traced by IR/HCN), depend on environment in the galaxy. The sense of the dependence is that high-surface-density, high molecular gas fraction regions of the galaxy show high dense gas fractions and low dense gas star formation efficiencies. This agrees with recent results for individual pointings by Usero et al. but using unbiased whole-galaxy maps. It also agrees qualitatively with the behavior observed when contrasting our own solar neighborhood with the central regions of the Milky Way. The sense of the trends can be explained if the dense gas fraction tracks interstellar pressure but star formation occurs only in regions of high density contrast.

  19. Population-based absolute risk estimation with survey data.

    PubMed

    Kovalchik, Stephanie A; Pfeiffer, Ruth M

    2014-04-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing-event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
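    As a concrete illustration of absolute risk with competing events, the sketch below computes the absolute risk of one cause under piecewise-constant (piecewise exponential) cause-specific hazards. The interval breaks, hazard values and relative risks are hypothetical, and the paper's survey weighting and influence-based variance estimation are not reproduced:

```python
import numpy as np

def absolute_risk(breaks, hazards, cause=0, rr=None):
    """Absolute risk of event `cause` over the whole time grid when each
    cause-specific hazard is piecewise constant on the given intervals.

    breaks:  right endpoints of the intervals, e.g. [5, 10] (years; 0 implied)
    hazards: shape (n_causes, n_intervals) baseline hazards per unit time
    rr:      optional per-cause relative risks for one individual
    """
    hazards = np.asarray(hazards, float)
    if rr is not None:
        hazards = hazards * np.asarray(rr, float)[:, None]
    widths = np.diff(np.concatenate([[0.0], breaks]))
    total = hazards.sum(axis=0)  # all-cause hazard in each interval
    # overall survival at the start of each interval
    surv_start = np.exp(-np.concatenate([[0.0], np.cumsum(total * widths)[:-1]]))
    # exact integral of lambda_cause * S(t) over each interval
    risk = hazards[cause] / total * surv_start * (1.0 - np.exp(-total * widths))
    return risk.sum()

# hypothetical hazards for two competing causes over 0-5 and 5-10 years
r_cause0 = absolute_risk([5.0, 10.0], [[0.01, 0.02], [0.005, 0.005]], cause=0)
```

A convenient sanity check on this competing-risks bookkeeping: the cause-specific absolute risks plus the overall survival probability at the end of follow-up must sum to one.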

  20. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  1. Road Rage: Prevalence Pattern and Web Based Survey Feasibility

    PubMed Central

    Verma, Rohit; Balhara, Yatan Pal Singh; Ul-Hasan, Shiraz

    2014-01-01

    Introduction. Incidents of road rage are on the rise in India, but the literature on the subject is lacking. There is an increasing realization that web based interventions can effectively deliver public health related messages. Objective. The aim was to quantitatively evaluate risk factors among motor vehicle drivers using an internet based survey. Methods. Facebook users were evaluated using the Life Orientation Test-Revised (LOT-R) and the Driving Anger Scale (DAS). Results. An adequate response rate of 65.9% and satisfactory reliability with sizable correlation were obtained for both scales. Age was found to be positively correlated to LOT-R scores (r = 0.21; P = 0.02) and negatively correlated to DAS scores (r = −0.19; P = 0.03). Years of education were correlated to LOT-R scores (r = 0.26; P = 0.005) but not DAS scores (r = −0.14; P = 0.11). LOT-R scores did not correlate with DAS scores. Conclusion. There is a high prevalence of anger among drivers in India, particularly among younger males. A short web survey formatted in easy to use question language can make conducting an online survey feasible. PMID:24864226

  2. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  3. Rapid Mapping Method Based on Free Blocks of Surveys

    NASA Astrophysics Data System (ADS)

    Yu, Xianwen; Wang, Huiqing; Wang, Jinling

    2016-06-01

    While producing large-scale maps (larger than 1:2000) of cities or towns, obstruction from buildings makes measuring mapping control points a difficult and heavy task. In order to avoid measuring mapping control points and shorten the time of fieldwork, this paper proposes a quick mapping method. This method adjusts many free blocks of surveys together and transforms the points from all free blocks into the same coordinate system. The entire surveying area is divided into many free blocks, and connection points are set on the boundaries between free blocks. An independent coordinate system for every free block is established via completely free station technology, and the coordinates of the connection points, detail points and control points in every free block in the corresponding independent coordinate system are obtained based on poly-directional open traverses. Error equations are established based on the connection points, which are adjusted together to obtain the transformation parameters. All points are transformed from the independent coordinate systems to a transitional coordinate system via the transformation parameters. Several control points are then measured by GPS in a geodetic coordinate system. All the points can then be transformed from the transitional coordinate system to the geodetic coordinate system. In this paper, the implementation process and mathematical formulas of the new method are presented in detail, and a formula to estimate the precision of surveys is given. An example demonstrates that the precision of the new method can meet large-scale mapping needs.
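    The core step described above, solving for the parameters that carry one free block's independent coordinates into a common system from shared connection points, can be sketched as a 4-parameter (Helmert/similarity) least-squares fit. The point coordinates below are invented for illustration, and the paper's joint adjustment of all blocks at once is not reproduced:

```python
import numpy as np

def helmert_2d(src, dst):
    """Estimate a 4-parameter similarity (Helmert) transform mapping points
    in a free block's independent coordinate system (src) onto a common
    system (dst): x' = a*x - b*y + tx, y' = b*x + a*y + ty.
    Solved by linear least squares over the shared connection points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2, 0], A[0::2, 1], A[0::2, 2] = src[:, 0], -src[:, 1], 1.0  # x' rows
    A[1::2, 0], A[1::2, 1], A[1::2, 3] = src[:, 1], src[:, 0], 1.0   # y' rows
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params  # a, b, tx, ty

def apply_helmert(params, pts):
    """Apply an estimated transform to points from the same free block."""
    a, b, tx, ty = params
    pts = np.asarray(pts, float)
    return np.column_stack([a * pts[:, 0] - b * pts[:, 1] + tx,
                            b * pts[:, 0] + a * pts[:, 1] + ty])
```

With two or more connection points known in both systems, `helmert_2d` recovers the block's rotation, scale and translation, and `apply_helmert` then carries all of that block's detail points into the common system.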

  4. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationships among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility plays a mediating role in the relationship between organizational learning capability and business model innovation; and interactions among strategic flexibility, explorative learning and exploitative learning play significant roles in both radical and incremental business model innovation.

  5. A survey of artificial immune system based intrusion detection.

    PubMed

    Yang, Hua; Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549

  6. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549

  7. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    ERIC Educational Resources Information Center

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  8. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools, that is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may make it practical to model many homes expediently, and thus to implement software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical-data-based software accuracy test suite.

  9. Using a Process-Based Numerical Model and Simple Empirical Relationships to Evaluate CO2 Fluxes from Agricultural Soils.

    NASA Astrophysics Data System (ADS)

    Buchner, J.; Simunek, J.; Dane, J. H.; King, A. P.; Lee, J.; Rolston, D. E.; Hopmans, J. W.

    2007-12-01

    Carbon dioxide emissions from an agricultural field in the Sacramento Valley, California, were evaluated using the process-based SOILCO2 module of the HYDRUS-1D software package and a simple empirical model. CO2 fluxes, meteorological variables, soil temperatures, and water contents were measured during the years 2004-2006 at multiple locations in an agricultural field, half of which had been subjected to standard tillage and the other half to minimum tillage. Furrow irrigation was applied on a regular basis. While HYDRUS-1D simulates dynamic interactions between soil water contents, temperatures, soil CO2 concentrations, and soil respiration by numerically solving partial differential equations for water flow (Richards) and for heat and CO2 transport (convection-dispersion), the empirical model is based on simple reduction functions, closely resembling the CO2 production function of SOILCO2. It is assumed in this function that overall CO2 production in the soil profile is the sum of soil and plant respiration, optimal values of which are affected by time, depth, water contents, temperatures, soil salinity, and CO2 concentrations in the soil profile. The effect of these environmental factors is introduced using various reduction functions that multiply the optimal soil CO2 production. While the SOILCO2 module assumes that CO2 is produced in the soil profile and then transported, depending mainly on water contents, toward the soil surface, the empirical model relates CO2 emissions directly to various environmental factors. It was shown that both the numerical model and the simple reduction functions could predict the CO2 fluxes across the soil surface reasonably well. Regression coefficients between measured CO2 emissions and those predicted by the numerical and simple empirical models are compared.
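    The multiplicative structure described above, an optimal CO2 production rate scaled by dimensionless reduction functions, can be sketched as follows. The Q10-type temperature response and the linear water-content ramp are hypothetical stand-ins, not the actual SOILCO2 reduction functions:

```python
def co2_production(p_opt, reductions):
    """Actual CO2 production: the optimal rate multiplied by one
    dimensionless reduction factor per environmental variable."""
    p = p_opt
    for f in reductions:
        p *= f
    return p

def f_temperature(T, T_opt=25.0, Q10=2.0):
    """Hypothetical Q10-type response relative to an optimal temperature:
    production halves for every 10 degC below T_opt (with Q10 = 2)."""
    return Q10 ** ((T - T_opt) / 10.0)

def f_water(theta, theta_wilt=0.08, theta_opt=0.30):
    """Hypothetical linear ramp: zero respiration at the wilting point,
    unreduced production at an optimal volumetric water content."""
    return min(1.0, max(0.0, (theta - theta_wilt) / (theta_opt - theta_wilt)))

# e.g. optimal production of 10 units, soil at 20 degC and theta = 0.20
p = co2_production(10.0, [f_temperature(20.0), f_water(0.20)])
```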

  10. Prospects for Gaia and other space-based surveys .

    NASA Astrophysics Data System (ADS)

    Bailer-Jones, Coryn A. L.

    Gaia is a fully-approved all-sky astrometric and photometric survey due for launch in 2011. It will measure accurate parallaxes and proper motions for everything brighter than G=20 (ca. 10^9 stars). Its primary objective is to study the composition, origin and evolution of our Galaxy from the 3D structure, 3D velocities, abundances and ages of its stars. In some respects it can be considered as a cosmological survey at redshift zero. Several other upcoming space-based surveys, in particular JWST and Herschel, will study star and galaxy formation in the early (high-redshift) universe. In this paper I briefly describe these missions, as well as SIM and Jasmine, and explain why they need to observe from space. I then discuss some Galactic science contributions of Gaia concerning dark matter, the search for substructure, stellar populations and the mass-luminosity relation. The Gaia data are complex and require the development of novel analysis methods; here I summarize the principle of the astrometric processing. In the last two sections I outline how the Gaia data can be exploited in connection with other observational and theoretical work in order to build up a more comprehensive picture of galactic evolution.

  11. Does community-based conservation shape favorable attitudes among locals? An empirical study from Nepal.

    PubMed

    Mehta, J N; Heinen, J T

    2001-08-01

    Like many developing countries, Nepal has adopted a community-based conservation (CBC) approach in recent years to manage its protected areas mainly in response to poor park-people relations. Among other things, under this approach the government has created new "people-oriented" conservation areas, formed and devolved legal authority to grassroots-level institutions to manage local resources, fostered infrastructure development, promoted tourism, and provided income-generating trainings to local people. Of interest to policy-makers and resource managers in Nepal and worldwide is whether this approach to conservation leads to improved attitudes on the part of local people. It is also important to know if personal costs and benefits associated with various intervention programs, and socioeconomic and demographic characteristics influence these attitudes. We explore these questions by looking at the experiences in Annapurna and Makalu-Barun Conservation Areas, Nepal, which have largely adopted a CBC approach in policy formulation, planning, and management. The research was conducted during 1996 and 1997; the data collection methods included random household questionnaire surveys, informal interviews, and review of official records and published literature. The results indicated that the majority of local people held favorable attitudes toward these conservation areas. Logistic regression results revealed that participation in training, benefit from tourism, wildlife depredation issue, ethnicity, gender, and education level were the significant predictors of local attitudes in one or the other conservation area. We conclude that the CBC approach has potential to shape favorable local attitudes and that these attitudes will be mediated by some personal attributes. PMID:11443381

  12. Smartphone-Based, Self-Administered Intervention System for Alcohol Use Disorders: Theory and Empirical Evidence Basis

    PubMed Central

    Dulin, Patrick L.; Gonzalez, Vivian M.; King, Diane K.; Giroux, Danielle; Bacon, Samantha

    2013-01-01

    Advances in mobile technology provide an opportunity to deliver in-the-moment interventions to individuals with alcohol use disorders, yet availability of effective “apps” that deliver evidence-based interventions is scarce. We developed an immediately available, portable, smartphone-based intervention system whose purpose is to provide stand-alone, self-administered assessment and intervention. In this paper, we describe how theory and empirical evidence, combined with smartphone functionality contributed to the construction of a user-friendly, engaging alcohol intervention. With translation in mind, we discuss how we selected appropriate intervention components including assessments, feedback and tools, that work together to produce the hypothesized outcomes. PMID:24347811

  13. Data Management for Ground-Based Science Surveys at CASU

    NASA Astrophysics Data System (ADS)

    Irwin, Mike

    2015-12-01

    In this talk I will review the data management facilities at CASU for handling large scale ground-based imaging and spectroscopic surveys. The overarching principle for all science data processing at CASU is to provide an end-to-end system that attempts to deliver fully calibrated optimally extracted data products ready for science use. The talk will outline our progress in achieving this and how end users visualize the state-of-play of the data processing and interact with the final products via our internal data repository.

  14. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modeling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, yielding a clear improvement of the reliability, skill and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach which is not able to take into account hydrological
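    The empirical approach described above, modeling the error of perfect forecasts by streamflow quantile class and applying it to dress a new forecast, can be roughly sketched as follows. The multiplicative error model, the class breaks, and the omission of the lead-time dimension are simplifying assumptions for illustration:

```python
import numpy as np

def build_error_classes(past_fc, past_obs, n_classes=3):
    """Group past multiplicative errors (obs / forecast) by the quantile
    class of the forecast value, in the spirit of the empirical approach.
    Class breaks and the multiplicative error model are assumptions."""
    past_fc = np.asarray(past_fc, float)
    past_obs = np.asarray(past_obs, float)
    breaks = np.quantile(past_fc, np.linspace(0, 1, n_classes + 1)[1:-1])
    cls = np.searchsorted(breaks, past_fc)
    return breaks, {c: past_obs[cls == c] / past_fc[cls == c] for c in range(n_classes)}

def dress_forecast(fc, breaks, errors):
    """Turn one deterministic forecast into an ensemble by applying the
    stored multiplicative errors of its streamflow quantile class."""
    c = int(np.searchsorted(breaks, fc))
    return fc * errors[c]
```

Given an archive of hindcasts and observations, `build_error_classes` is run once, and `dress_forecast` then spreads each new deterministic value into an ensemble whose spread reflects past model errors at similar flow levels.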

  15. Measuring microscopic evolution processes of complex networks based on empirical data

    NASA Astrophysics Data System (ADS)

    Chi, Liping

    2015-04-01

    Aiming at understanding the microscopic mechanisms of complex systems in the real world, we perform measurements that characterize the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree remains almost unchanged over the whole time range. At each time step, the external links attached to a new node number about c = 1.1 and the internal links added between existing nodes number approximately m = 8. The Scientific Collaboration data are a cumulative record of all authors from 1893 up to the considered year. There is no deletion of nodes or links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents γ_data of the degree distribution p(k) ∼ k^(-γ) of these two empirical datasets are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide insight into capturing the microscopic dynamical processes that govern the network topology.
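    The comparison above relies on estimating the exponent γ from a degree sample. One standard way to do this (not necessarily the authors' procedure) is the discrete maximum-likelihood approximation of Clauset, Shalizi and Newman:

```python
import numpy as np

def powerlaw_exponent(degrees, k_min=1):
    """Estimate the exponent gamma of p(k) ~ k^(-gamma) from a degree
    sample via the discrete maximum-likelihood approximation
    gamma ~= 1 + n / sum(ln(k_i / (k_min - 0.5))) over all k_i >= k_min.
    The approximation is best when k_min is not too small."""
    k = np.asarray([d for d in degrees if d >= k_min], float)
    return 1.0 + len(k) / np.log(k / (k_min - 0.5)).sum()
```

Heavier-tailed degree samples yield smaller estimated exponents, so comparing γ_data against γ_theory reduces to comparing two such estimates on the same k_min.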

  16. Surveying converter lining erosion state based on laser measurement technique

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Shi, Tielin; Yang, Shuzi

    1998-08-01

    It is very important to survey the erosion state of a steelmaking converter lining in real time, so as to optimize the technological process, extend converter durability and reduce steelmaking production costs. This paper gives one practical method based on laser measurement. It presents the basic principle of the measurement method, the composition of the measurement system and research on key technological problems. The method is based on laser range finding to net points on the surface of the surveyed converter lining, combined with angle finding for the laser beams. The angle signals are also used to help realize the automatic scanning function. The laser signals are modulated and encoded. In the meantime, wavelet analysis and other filtering algorithms are adopted to denoise noisy data and extract useful information. The main ideas of algorithms such as net-point measuring path planning and optimal placement of the measurement device are also given, in order to improve the measurement precision and the real-time performance of the system.
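
The core geometry of range-plus-angle surveying can be sketched in a few lines: each laser reading (distance, beam azimuth, beam elevation) becomes a Cartesian net point, and erosion is the shortfall of the measured surface against a reference lining profile. This is a geometry sketch only, with the device at the origin; the paper's actual scanning model and coordinate conventions may differ.

```python
import math

def net_point(distance, azimuth_deg, elevation_deg):
    """Convert one laser range + beam-angle reading into Cartesian
    coordinates of a lining surface point (device at the origin)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z

def erosion_depth(measured_range, reference_range):
    """Erosion at a net point: how far the lining has receded
    relative to the reference (unworn) profile along the beam."""
    return measured_range - reference_range
```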

  17. Sensor Systems Based on FPGAs and Their Applications: A Survey

    PubMed Central

    de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah

    2012-01-01

    In this manuscript, we present a survey of designs and implementations of research sensor nodes that rely on FPGAs, either based upon standalone platforms or as a combination of microcontroller and FPGA. Several current challenges in sensor networks are distinguished and linked to the features of modern FPGAs. As it turns out, low-power optimized FPGAs are able to enhance the computation of several types of algorithms in terms of speed and power consumption in comparison to microcontrollers of commercial sensor nodes. We show that architectures based on the combination of microcontrollers and FPGA can play a key role in the future of sensor networks, in fields where processing capabilities such as strong cryptography, self-testing and data compression, among others, are paramount.

  18. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    PubMed Central

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191

  19. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons

    PubMed Central

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D.

    2016-01-01

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
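
Method 1 above, the classical Monte Carlo p-value, can be sketched as follows. The test statistic here is a deliberately simple stand-in for symmetry about zero (scaled |mean|), not the density-based empirical likelihood statistic implemented in vxdbel; the point is only the resampling-under-the-null mechanics.

```python
import random

def symmetry_stat(sample):
    """Toy statistic for symmetry about zero: |mean| / (sd / sqrt(n)).
    A stand-in only, not the empirical likelihood-ratio statistic."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return abs(mean) / (var ** 0.5 / n ** 0.5)

def mc_pvalue(sample, n_mc=2000, rng=random.Random(0)):
    """Monte Carlo p-value: compare the observed statistic with the
    statistics of samples drawn under a symmetric (Gaussian) null."""
    obs = symmetry_stat(sample)
    hits = 0
    for _ in range(n_mc):
        null = [rng.gauss(0.0, 1.0) for _ in range(len(sample))]
        hits += symmetry_stat(null) >= obs
    return (hits + 1) / (n_mc + 1)
```

The interpolation and hybrid methods in the abstract replace some of these Monte Carlo draws with tabulated critical values, trading simulation time for table lookups.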

  20. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  1. Conceptual and Empirical Bases of Readability Formulas. Technical Report No. 392.

    ERIC Educational Resources Information Center

    Anderson, Richard C.; Davison, Alice

    The problems arising from treating word and sentence complexity as the direct causes of difficulty in comprehension are surveyed in this paper from the perspective of readability formulas. The basic choices and assumptions made in the development and use of readability formulas are discussed in relation to the larger question of text…

  2. University-Based Evaluation Training Programs in the United States 1980-2008: An Empirical Examination

    ERIC Educational Resources Information Center

    LaVelle, John M.; Donaldson, Stewart I.

    2010-01-01

    Evaluation practice has grown in leaps and bounds in recent years. In contrast, the most recent survey data suggest that there has been a sharp decline in the number and strength of preservice evaluation training programs in the United States. In an effort to further understand this curious trend, an alternative methodology was used to examine the…

  3. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

    Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies of DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system on a real-world test bed, which consists of an Apache HTTP server with GPAC, an MP4Client (GPAC) with an open HEVC-based DASH client, and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.

  4. Comparisons of ground motions from the 1999 Chi-Chi, earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Boore, D.M.

    2001-01-01

    This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted from the empirically based equations for periods less than about 1 sec by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions for longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: Long-duration wave trains are present on the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those of motions from eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast; the sediments underlying the eastern stations are probably shallower and have higher velocity than those under the western stations.

  5. Empirical rainfall thresholds and copula based IDF curves for shallow landslides and flash floods

    NASA Astrophysics Data System (ADS)

    Bezak, Nejc; Šraj, Mojca; Brilly, Mitja; Mikoš, Matjaž

    2015-04-01

    Large mass movements, like deep-seated landslides or large debris flows, and flash floods can endanger human lives and cause huge environmental and economic damage in hazard areas. The main objective of the study was to investigate the characteristics of selected extreme rainfall events which triggered landslides and caused flash floods in Slovenia in the last 25 years. Seven extreme events, which occurred in Slovenia (Europe) in the last 25 years (1990-2014) and caused 17 casualties and about 500 million Euros of economic loss, were analysed in this study. Post-event analyses showed that the rainfall characteristics triggering flash floods and landslides are different: landslides were triggered by longer-duration rainfall events (up to one or a few weeks), whereas flash floods were caused by short-duration events (a few hours to one or two days). The sensitivity analysis results indicate that the inter-event time variable, which is defined as the minimum duration of the period without rain between two consecutive rainfall events, and the sample definition methodology can have a significant influence on the position of rainfall events in the intensity-duration space, on the constructed intensity-duration-frequency (IDF) curves and on the relationship between the empirical rainfall threshold curves and the IDF curves constructed using a copula approach. The empirical rainfall threshold curves (ID curves) were also evaluated for the selected extreme events. The results indicate that a combination of several empirical rainfall thresholds with an appropriately high density of the rainfall measuring network can be used as part of an early warning system for the initiation of landslides and debris flows. However, different rainfall threshold curves should be used for lowland and mountainous areas in Slovenia. Furthermore, the intensity-duration-frequency (IDF) relationship was constructed using the Frank copula functions for 16 pluviographic meteorological stations in Slovenia using the high resolution
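
Empirical intensity-duration (ID) threshold curves of the kind evaluated in this study are commonly written as a power law, I = α·D^β with β negative, so that long events trigger warnings at much lower intensities than short ones. The sketch below classifies events against such a curve; the α and β values are purely illustrative, not the Slovenian thresholds derived in the paper.

```python
def exceeds_threshold(intensity_mm_h, duration_h, alpha=5.0, beta=-0.6):
    """True if a rainfall event lies above the power-law ID threshold
    I = alpha * D**beta. alpha and beta here are illustrative only."""
    return intensity_mm_h >= alpha * duration_h ** beta

def flag_events(events, **kw):
    """events: list of (mean intensity in mm/h, duration in h) pairs;
    returns a warning flag per event."""
    return [exceeds_threshold(i, d, **kw) for i, d in events]
```

With separate (α, β) pairs for lowland and mountainous stations, the same two functions would implement the regionally differentiated warning logic the abstract recommends.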

  6. Investigation of an empirical probability measure based test for multivariate normality

    SciTech Connect

    Booker, J.M.; Johnson, M.E.; Beckman, R.J.

    1984-01-01

    Foutz (1980) derived a goodness-of-fit test for a hypothesis specifying a continuous, p-variate distribution. The test statistic is both distribution-free and independent of p. In adapting the Foutz test for multivariate normality, we consider using χ² and rescaled beta variates in constructing statistically equivalent blocks. The Foutz test is compared to other multivariate normality tests developed by Hawkins (1981) and Malkovich and Afifi (1973). The set of alternative distributions tested includes Pearson type II and type VII, Johnson translations, Plackett, and distributions arising from Khintchine's theorem. Univariate alternatives from the general class developed by Johnson et al. (1980) were also used. An empirical study confirms that the test statistic is independent of p even when parameters are estimated. In general, the Foutz test is less conservative under the null hypothesis but has poorer power under most alternatives than the other tests.

  7. Cyclone optimization based on a new empirical model for pressure drop

    SciTech Connect

    Ramachandran, G.; Leith, D.; Dirgo, J.; Feldman, H.

    1991-01-01

    An empirical model for predicting pressure drop across a cyclone, developed by Dirgo, is presented. The model was developed through a statistical analysis of pressure drop data for 98 cyclone designs. The model is shown to perform better than the pressure drop models of Shepherd and Lapple, Alexander, First, Stairmand, and Barth. This model is used with the efficiency model of Iozia and Leith to develop an optimization curve which predicts the minimum pressure drop and the dimension ratios of the optimized cyclone for a given aerodynamic cut diameter, d₅₀. The effect of variation in cyclone height, cyclone diameter, and flow on the optimization is determined. The optimization results are used to develop a design procedure for optimized cyclones.

  8. Empirical mode decomposition-based facial pose estimation inside video sequences

    NASA Astrophysics Data System (ADS)

    Qing, Chunmei; Jiang, Jianmin; Yang, Zhijing

    2010-03-01

    We describe a new pose-estimation algorithm that integrates the strengths of both empirical mode decomposition (EMD) and mutual information. While mutual information is exploited to measure the similarity between facial images in order to estimate poses, EMD is exploited to decompose input facial images into a number of intrinsic mode function (IMF) components, which redistribute the effects of noise, expression changes, and illumination variations such that, when the input facial image is described by the selected IMF components, all the negative effects can be minimized. Extensive experiments were carried out in comparison with existing representative techniques, and the results show that the proposed algorithm achieves better pose-estimation performance with robustness to noise corruption, illumination variation, and facial expressions.

  9. Inferring causal molecular networks: empirical assessment through a community-based effort.

    PubMed

    Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-04-01

    It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense. PMID:26901648

  10. Population forecasts and confidence intervals for Sweden: a comparison of model-based and empirical approaches.

    PubMed

    Cohen, J E

    1986-02-01

    This paper compares several methods of generating confidence intervals for forecasts of population size. Two rest on a demographic model for age-structured populations with stochastic fluctuations in vital rates. Two rest on empirical analyses of past forecasts of population sizes of Sweden at five-year intervals from 1780 to 1980 inclusive. Confidence intervals produced by the different methods vary substantially. The relative sizes differ in the various historical periods. The narrowest intervals offer a lower bound on uncertainty about the future. Procedures for estimating a range of confidence intervals are tentatively recommended. A major lesson is that finitely many observations of the past and incomplete theoretical understanding of the present and future can justify at best a range of confidence intervals for population projections. Uncertainty attaches not only to the point forecasts of future population, but also to the estimates of those forecasts' uncertainty. PMID:3484356
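
The empirical approach contrasted with the model-based one above amounts to bracketing a point forecast with quantiles of historical forecast errors. A minimal sketch, assuming errors are stored as relative deviations (observed/forecast − 1); the quantile indexing is a simple plug-in choice, not the paper's exact procedure.

```python
def empirical_interval(point_forecast, past_rel_errors, coverage=0.8):
    """Bracket a point forecast with empirical quantiles of past
    relative forecast errors (observed / forecast - 1)."""
    errs = sorted(past_rel_errors)
    n = len(errs)
    lo_i = int(n * (1 - coverage) / 2)
    hi_i = min(n - 1, int(n * (1 + coverage) / 2))
    return (point_forecast * (1 + errs[lo_i]),
            point_forecast * (1 + errs[hi_i]))
```

As the abstract stresses, the resulting interval is itself uncertain: with few past forecasts the error quantiles are poorly estimated, which is why a range of intervals, rather than a single one, is recommended.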

  11. Gold price analysis based on ensemble empirical model decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has received growing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Model Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology which combines the two methods and applies it to gold price analysis. This involves three steps: first, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Second, the IMFs are further processed, with unimportant ones re-grouped, and a new set of data called Virtual Intrinsic Mode Functions (VIMFs) is reconstructed. Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail, according to their trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also examine in depth how these factors affect the gold price. At the same time, regression analysis has been conducted to verify our analysis. Results from the empirical studies in the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.

  12. Accounting protesting and warm glow bidding in Contingent Valuation surveys considering the management of environmental goods--an empirical case study assessing the value of protecting a Natura 2000 wetland area in Greece.

    PubMed

    Grammatikopoulou, Ioanna; Olsen, Søren Bøye

    2013-11-30

    Based on a Contingent Valuation survey aiming to reveal the willingness to pay (WTP) for conservation of a wetland area in Greece, we show how protest and warm glow motives can be taken into account when modeling WTP. In a sample of more than 300 respondents, we find that 54% of the positive bids are rooted to some extent in warm glow reasoning while 29% of the zero bids can be classified as expressions of protest rather than preferences. In previous studies, warm glow bidders are only rarely identified while protesters are typically identified and excluded from further analysis. We test for selection bias associated with simple removal of both protesters and warm glow bidders in our data. Our findings show that removal of warm glow bidders does not significantly distort WTP whereas we find strong evidence of selection bias associated with removal of protesters. We show how to correct for such selection bias by using a sample selection model. In our empirical sample, using the typical approach of removing protesters from the analysis, the value of protecting the wetland is significantly underestimated by as much as 46% unless correcting for selection bias. PMID:24091158

  13. Ticking the boxes: a survey of workplace-based assessments

    PubMed Central

    Gilberthorpe, Thomas; Sarfo, Maame Duku; Lawrence-Smith, Geoff

    2016-01-01

    Aims and method To survey the quality of workplace-based assessments (WPBAs) through retrospective analysis of completed WPBA forms against training targets derived from the Royal College of Psychiatrists' Portfolio Online. Results Almost a third of assessments analysed showed no divergence in assessment scores across the varied assessment domains and there was poor correlation between domain scores and the nature of comments provided by assessors. Of the assessments that suggested action points only half were considered to be sufficiently ‘specific’ and ‘achievable’ to be useful for trainees' learning. Clinical implications WPBA is not currently being utilised to its full potential as a formative assessment tool and more widespread audit is needed to establish whether this is a local or a national issue. PMID:27087994

  15. Simulations of the Structure and Properties of Large Icosahedral Boron Clusters Based on a Novel Semi-Empirical Hamiltonian

    NASA Astrophysics Data System (ADS)

    Tandy, Paul; Yu, Ming; Jayanthi, C. S.; Wu, Shi-Yu; Condensed Matter Theory Group Team

    2013-03-01

    The successful development of a parameterized semi-empirical Hamiltonian (SCED-LCAO) for boron, based on an LCAO framework using an sp3 basis set, will be discussed. The semi-empirical Hamiltonian contains the environment-dependency and electron-screening effects of a many-body Hamiltonian and allows for charge self-consistency. We have optimized the parameters of the SCED-LCAO Hamiltonian for boron by fitting the properties (e.g., the binding energy, bond length, etc.) of boron sheets, small clusters and boron alpha to first-principles DFT calculations. Although extended phases of boron alpha and beta have been studied, large clusters of boron with icosahedral structures, such as those cut from boron alpha, are difficult if not impossible to simulate with ab initio methods. We will demonstrate the effectiveness of the SCED-LCAO Hamiltonian in studying icosahedral boron clusters containing up to 800 atoms and will report on some novel boron clusters and on computational speed. Support has been provided by the Dillion Fellowship.

  16. Development of the Knowledge-based & Empirical Combined Scoring Algorithm (KECSA) to Score Protein-Ligand Interactions

    PubMed Central

    Zheng, Zheng

    2013-01-01

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the PDB. Forty-nine types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training set and the enthalpy/entropy treatment of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets to compare the reproducibility of KECSA with respect to two empirical score functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the RMSE. LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed which would make KECSA more sensitive to subtle changes in ligand structure. PMID:23560465
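
The first step of any knowledge-based potential of this kind is the inverse-Boltzmann construction, converting observed versus reference-state pair frequencies into energies, E_ij = −kT·ln(p_obs/p_ref). The sketch below shows only this generic textbook step, not KECSA's specific reference-state definition or its mapping onto LJ parameters; the dictionary layout and pseudocount are illustrative.

```python
import math

def statistical_potential(obs_counts, ref_counts, kT=0.593):
    """Generic inverse-Boltzmann step used by knowledge-based scoring:
    E_ij = -kT * ln(p_obs / p_ref), with kT in kcal/mol near 298 K.
    A pseudocount of 1 guards against unseen pairs in the reference."""
    n_obs = sum(obs_counts.values())
    n_ref = sum(ref_counts.values())
    energies = {}
    for pair, count in obs_counts.items():
        p_obs = count / n_obs
        p_ref = ref_counts.get(pair, 1) / n_ref
        energies[pair] = -kT * math.log(p_obs / p_ref)
    return energies
```

Pairs observed more often than the reference state predicts come out with negative (favorable) energies; under-represented pairs come out positive.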

  17. A Compound fault diagnosis for rolling bearings method based on blind source separation and ensemble empirical mode decomposition.

    PubMed

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong confusion noise, which makes it difficult to separate weak fault signals from them through conventional means such as FFT-based envelope detection, wavelet transform or empirical mode decomposition used individually. In order to improve the compound-fault diagnosis of rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix of ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in compound-fault separation, which works not only for the outer race defect, but also for the roller defect and the unbalance fault of the experimental system. PMID:25289644

  18. Spectral analysis of Hall-effect thruster plasma oscillations based on the empirical mode decomposition

    SciTech Connect

    Kurzyna, J.; Mazouffre, S.; Lazurenko, A.; Albarede, L.; Bonhomme, G.; Makowski, K.; Dudeck, M.; Peradzynski, Z.

    2005-12-15

    Hall-effect thruster plasma oscillations recorded by means of probes located at the channel exit are analyzed using the empirical mode decomposition (EMD) method. This self-adaptive technique makes it possible to decompose a nonstationary signal into a set of intrinsic modes, and acts as a very efficient filter that separates the contributions of different underlying physical mechanisms. Applying the Hilbert transform to the whole set of modes allows peculiar events to be identified and assigned a range of instantaneous frequency and power. In addition to 25 kHz breathing-type oscillations, which are unambiguously identified, the EMD approach confirms the existence of oscillations with instantaneous frequencies in the range of 100-500 kHz, typical of ion transit-time oscillations. Modeling of high-frequency modes (ν ≈ 10 MHz) resulting from EMD of measured wave forms supports the idea that high-frequency plasma oscillations originate from electron-density perturbations propagating azimuthally with the electron drift velocity.
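
The Hilbert step described above, extracting an instantaneous frequency from each intrinsic mode, is straightforward with the analytic signal. A minimal sketch using SciPy on a synthetic 25 kHz tone standing in for the breathing mode (the EMD sifting itself is omitted):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(mode, fs):
    """Instantaneous frequency of one intrinsic mode via the Hilbert
    transform: f(t) = (1 / 2*pi) * d(phase)/dt."""
    analytic = hilbert(mode)                 # analytic signal x + i*H[x]
    phase = np.unwrap(np.angle(analytic))    # continuous instantaneous phase
    return np.diff(phase) * fs / (2.0 * np.pi)

# a 25 kHz tone sampled at 1 MHz, mimicking breathing-type oscillations
fs = 1.0e6
t = np.arange(0.0, 2e-3, 1.0 / fs)
mode = np.sin(2 * np.pi * 25e3 * t)
f_inst = instantaneous_frequency(mode, fs)
```

On real probe data each EMD mode would be passed through the same function, and excursions of f_inst flag the transient events the abstract mentions. Note that the estimate is unreliable near the signal edges, so robust summaries (e.g. the median) are preferable.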

  19. Quantitative hydrogen analysis in minerals based on a semi-empirical approach

    NASA Astrophysics Data System (ADS)

    Kristiansson, P.; Borysiuk, M.; Ros, L.; Skogby, H.; Abdel, N.; Elfman, M.; Nilsson, E. J. C.; Pallon, J.

    2013-07-01

    Hydrogen normally occurs as hydroxyl ions related to defects at specific crystallographic sites in mineral structures, and is normally characterized by infrared spectroscopy (FTIR). For quantification purposes the FTIR technique has proven to be less precise, since calibrations against independent methods are needed. Hydrogen analysis by the NMP technique can solve many of these problems, owing to its low detection limit, high lateral resolution, insignificant matrix effects and ability to discriminate surface-adsorbed water. The technique has been shown to work both on thin samples and on thicker geological samples. To avoid disturbance from surface contamination, the hydrogen is analyzed inside semi-thick geological samples. The technique used is an elastic recoil technique in which both the incident projectile (proton) and the recoiled hydrogen are detected in coincidence in a segmented detector. Both the traditional annular system, with the detector divided into two halves, and the new double-sided silicon strip detector (DSSSD) have been used. In this work we present an upgraded version of the technique, studying two sets of mineral standards combined with pre-sample charge normalization. To reduce data-processing time we suggest a very simple semi-empirical approach for data evaluation. The advantages and drawbacks of the approach are discussed and a possible extension of the model is suggested.

  20. Inferring causal molecular networks: empirical assessment through a community-based effort

    PubMed Central

    Hill, Steven M.; Heiser, Laura M.; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K.; Carlin, Daniel E.; Zhang, Yang; Sokolov, Artem; Paull, Evan O.; Wong, Chris K.; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V.; Favorov, Alexander V.; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W.; Long, Byron L.; Noren, David P.; Bisberg, Alexander J.; Mills, Gordon B.; Gray, Joe W.; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A.; Fertig, Elana J.; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M.; Spellman, Paul T.; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-01-01

    Inferring molecular networks is a central challenge in computational biology. However, it has remained unclear whether causal, rather than merely correlational, relationships can be effectively inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge that focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results constitute the most comprehensive assessment of causal network inference in a mammalian setting carried out to date and suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess the causal validity of inferred molecular networks. PMID:26901648

  1. An Empirical Model of Saturn's Current Sheet Based on Global MHD Modeling of Saturn's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Hansen, K. C.; Nickerson, J. S.; Gombosi, T. I.

    2014-12-01

    Cassini observations imply that during southern summer Saturn's magnetospheric current sheet is displaced northward above the rotational equator and should be similarly displaced southward during northern summer [C.S. Arridge et al., Warping of Saturn's magnetospheric and magnetotail current sheets, Journal of Geophysical Research, Vol. 113, August 2008]. Arridge et al. show that Cassini data from the noon, midnight and dawn local time sectors clearly indicate this bending and they present an azimuthally independent model to describe this bowl shaped geometry. We have used our global MHD model, BATS-R-US/SWMF, to study Saturn's magnetospheric current sheet under different solar wind dynamic pressures and solar zenith angle conditions. We find that under typical conditions the current sheet does bend upward and take on a basic shape similar to the Arridge model in the noon, midnight, and dawn sectors. However, the MHD model results show significant variations from the Arridge model including the degree of bending, variations away from a simple bowl shape, non-uniformity across local time sectors, drastic deviations in the dusk sector, and a dependence on the solar wind dynamic pressure. We will present a detailed description of our 3D MHD model results and the characteristics of the current sheet in the model. We will point out variations from the Arridge model. In addition, we will present a new empirical model of Saturn's current sheet that attempts to characterize the dependences on the local time sector and the solar wind dynamic pressure.
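
    The bowl-shaped geometry discussed above is commonly written, following Arridge et al. (2008), as z_cs = [rho - R_H tanh(rho/R_H)] tan(theta_sun). As a rough illustration only (the function name, parameter defaults and the ~29 R_S hinging distance are assumptions drawn from the literature, not from this abstract), a minimal sketch:

```python
import math

def current_sheet_height(rho, theta_sun, hinge_rs=29.0):
    """Height of the bowl-shaped current sheet above the rotational
    equator (all distances in Saturn radii, R_S).

    rho       -- cylindrical radial distance from Saturn
    theta_sun -- tilt of the solar wind flow relative to the rotational
                 equator (radians); positive during southern summer
    hinge_rs  -- hinging distance R_H (~29 R_S in Arridge et al. 2008)
    """
    return (rho - hinge_rs * math.tanh(rho / hinge_rs)) * math.tan(theta_sun)
```

    Near the planet the tanh term cancels rho, so the sheet stays in the rotational equator; far from the planet it asymptotes to the displaced plane (rho - R_H) tan(theta_sun).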

  2. Consequences of asymmetric competition between resident and invasive defoliators: a novel empirically based modelling approach.

    PubMed

    Ammunét, Tea; Klemola, Tero; Parvinen, Kalle

    2014-03-01

    Invasive species can have profound effects on a resident community via indirect interactions among community members. While long periodic cycles in population dynamics can make experimental observation of the indirect effects difficult, modelling the possible effects on an evolutionary time scale may provide much-needed information on the potential threats the invasive species poses to the ecosystem. Using empirical data from a recent invasion in northernmost Fennoscandia, we applied adaptive dynamics theory and modelled the long-term consequences of the winter moth's invasion of the resident community. Specifically, we investigated the outcome of the observed short-term asymmetric preferences of generalist predators and specialist parasitoids on the long-term population dynamics of the invasive winter moth and the resident autumnal moth sharing these natural enemies. Our results indicate that coexistence after the invasion is possible. However, the outcome of the indirect interaction on the population dynamics of the moth species was variable, and the dynamics might not be persistent on an evolutionary time scale. In addition, the indirect interactions between the two moth species via shared natural enemies were able to cause asynchrony in the population cycles, corresponding to field observations from previous sympatric outbreak areas. Therefore, the invasion may cause drastic changes in the resident community, for example by prolonging outbreak periods of birch-feeding moths, increasing the average population densities of the moths or, alternatively, leading to extinction of the resident moth species or to equilibrium densities of the two, formerly cyclic, herbivores. PMID:24380810

  3. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises comprise multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  4. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches.

    PubMed

    Sadrawi, Muammar; Sun, Wei-Zen; Ma, Matthew Huei-Ming; Dai, Chun-Yi; Abbod, Maysam F; Shieh, Jiann-Shing

    2016-01-01

    Good quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI) and detrended fluctuation analysis (DFA) are collated, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR pattern of 951 asystole patients was analyzed for quality of CPR delivered. There was no significant difference observed in the CPR-related IMF peak-to-peak interval analysis between patients younger and older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, a difference was noted for the CI (p < 0.05). The results show that the patient group younger than 60 years has a higher survival rate, with high complexity of the CPR-IMF amplitude differences. PMID:27529068
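
    Among the nonlinear measures listed, sample entropy is straightforward to sketch. The following is a textbook-style illustration in plain Python, not the study's implementation; the embedding dimension m and absolute tolerance r are hypothetical defaults (r is often chosen as a fraction of the signal's standard deviation):

```python
import math

def sample_entropy(signal, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal.

    Counts pairs of template vectors of length m (and m + 1) whose
    Chebyshev distance is within the tolerance r; self-matches are
    excluded.  Higher values indicate a less regular signal.
    """
    n = len(signal)

    def count_matches(length):
        templates = [signal[i:i + length] for i in range(n - length + 1)]
        matches = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    matches += 1
        return matches

    b = count_matches(m)      # matches at embedding dimension m
    a = count_matches(m + 1)  # matches at embedding dimension m + 1
    if a == 0 or b == 0:
        return float("inf")   # no matches: entropy undefined/infinite
    return -math.log(a / b)
```

    A strictly periodic signal yields a low value, while an irregular signal of the same amplitude yields a higher one, which is the contrast the CPR-IMF analysis exploits.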

  5. Development of Items for a Pedagogical Content Knowledge Test Based on Empirical Analysis of Pupils' Errors

    NASA Astrophysics Data System (ADS)

    Jüttner, Melanie; Neuhaus, Birgit J.

    2012-05-01

    In view of the lack of instruments for measuring biology teachers' pedagogical content knowledge (PCK), this article reports on a study about the development of PCK items for measuring teachers' knowledge of pupils' errors and ways of dealing with them. This study investigated drawings by 9th and 10th grade German pupils (n = 461) in an achievement test about the knee-jerk reflex in biology, which were analysed using inductive qualitative content analysis. The empirical data were used to develop the items of the PCK test. The validity of the items was assessed with think-aloud interviews of German secondary school teachers (n = 5). Once the items were finalized, reliability was tested on the results of German secondary school biology teachers (n = 65) who took the PCK test. The results indicated that these items are satisfactorily reliable (Cronbach's alpha values ranged from 0.60 to 0.65). We suggest using a larger sample and including American biology teachers in further studies. The findings of this study about teachers' professional knowledge from the PCK test could provide new information about the influence of teachers' knowledge on their pupils' understanding of biology and their possible errors in learning biology.
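
    The reliability figure quoted (Cronbach's alpha of 0.60 to 0.65) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with hypothetical item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items -- list of k lists, each holding one item's scores across
             the same set of respondents
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total test score per respondent
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

    Perfectly parallel items give alpha = 1; values around 0.6, as reported here, indicate moderate internal consistency.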

  6. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches

    PubMed Central

    Ma, Matthew Huei-Ming

    2016-01-01

    Good quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI) and detrended fluctuation analysis (DFA) are collated, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR pattern of 951 asystole patients was analyzed for quality of CPR delivered. There was no significant difference observed in the CPR-related IMF peak-to-peak interval analysis between patients younger and older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, a difference was noted for the CI (p < 0.05). The results show that the patient group younger than 60 years has a higher survival rate, with high complexity of the CPR-IMF amplitude differences. PMID:27529068

  7. Improving the empirical model for plasma nitrided AISI 316L corrosion resistance based on Mössbauer spectroscopy

    NASA Astrophysics Data System (ADS)

    Campos, M.; de Souza, S. D.; de Souza, S.; Olzon-Dionysio, M.

    2011-11-01

    Traditional plasma nitriding treatments using temperatures ranging from approximately 650 to 730 K can improve wear, corrosion resistance and surface hardness of stainless steels. The nitrided layer consists of several iron nitrides: the cubic γ′ phase (Fe4N), the hexagonal ε phase (Fe2-3N) and a nitrogen-supersaturated solid phase γN. An empirical model is proposed to explain the corrosion resistance of AISI 316L and ASTM F138 nitrided samples based on Mössbauer spectroscopy results: the larger the ratio between the ε and γ′ phase fractions of the sample, the better its corrosion resistance. In this work, this model is examined using new results for AISI 316L samples, nitrided under the same previous conditions of gas composition and temperature, but at different pressure, for 3, 4 and 5 h. The sample nitrided for 4 h, whose ε/γ′ value is maximum (= 0.73), shows a slightly better response than the other two samples, nitrided for 5 and 3 h (ε/γ′ = 0.72 and 0.59, respectively). Moreover, these samples show very similar behavior; therefore, this set of samples was not suitable for testing the empirical model. However, the comparison between the present potentiodynamic polarization curves and those obtained previously at 4 and 4.5 torr indicates that the corrosion resistance of the sample presenting only the γN phase was the worst of them all. The empirical model thus does not yet seem able to explain the corrosion response, and it should be improved to include the γN phase.

  8. Empirical evidence for identical band gaps in substituted C{sub 60} and C{sub 70} based fullerenes

    SciTech Connect

    Mattias Andersson, L.; Tanaka, Hideyuki

    2014-01-27

    Optical absorptance data, and a strong correlation between solar cell open circuit voltages and the ionization potentials of a wide range of differently substituted fullerene acceptors, are presented as empirical evidence for identical, or at least very similar, band gaps in all substituted C{sub 60} and C{sub 70} based fullerenes. Both the number and kind of substituents in this study are sufficiently varied to imply generality. While the band gaps of the fullerenes remain the same for all the different substitutions, their ionization potentials vary greatly in a span of more than 0.4 eV. The merits and drawbacks of using these results together with photoelectron based techniques to determine relative fullerene energy levels for, e.g., organic solar cell applications compared to more direct electrochemical methods are also discussed.

  9. An all-atom structure-based potential for proteins: bridging minimal models with all-atom empirical forcefields.

    PubMed

    Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N

    2009-05-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Gō) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function. PMID:18837035

  10. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    PubMed

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for successful imaging of defects in a B-scan. PMID:25488024
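
    The reported SNR gains are ratios expressed in decibels. As a hedged sketch (the windowing scheme and function name are illustrative assumptions, not the paper's procedure), the SNR of an A-scan can be estimated as the defect-echo peak over the RMS of a flaw-free noise region:

```python
import math

def snr_db(signal, echo_window, noise_window):
    """SNR (dB) of an ultrasonic A-scan: peak amplitude inside the
    defect-echo window over the RMS of a flaw-free noise window.

    echo_window, noise_window -- (start, stop) sample-index pairs
    """
    peak = max(abs(s) for s in signal[echo_window[0]:echo_window[1]])
    noise = signal[noise_window[0]:noise_window[1]]
    rms = math.sqrt(sum(s * s for s in noise) / len(noise))
    return 20.0 * math.log10(peak / rms)
```

    An improvement figure like the 15 dB quoted above would then be snr_db applied to the EEMD-reconstructed signal minus snr_db applied to the raw signal, over the same windows.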

  11. Changing Healthcare Providers’ Behavior during Pediatric Inductions with an Empirically-based Intervention

    PubMed Central

    Martin, Sarah R.; Chorney, Jill MacLaren; Tan, Edwin T.; Fortier, Michelle A.; Blount, Ronald L.; Wald, Samuel H.; Shapiro, Nina L.; Strom, Suzanne L.; Patel, Swati; Kain, Zeev N.

    2011-01-01

    Background Each year over 4 million children experience significant levels of preoperative anxiety, which has been linked to poor recovery outcomes. Healthcare providers (HCPs) and parents represent key resources for children to help them manage their preoperative anxiety. The present study reports on the development and preliminary feasibility testing of a new intervention designed to change HCP and parent perioperative behaviors that have previously been reported to be associated with children's coping and stress behaviors before surgery. Methods An empirically-derived intervention, Provider-Tailored Intervention for Perioperative Stress, was developed to train HCPs to increase behaviors that promote children's coping and decrease behaviors that may exacerbate children's distress. Rates of HCP behaviors were coded and compared between pre-intervention and post-intervention. Additionally, rates of parents' behaviors were compared between those who interacted with HCPs before training and those who interacted with HCPs post-intervention. Results Effect sizes indicated that HCPs who underwent training demonstrated increases in rates of desired behaviors (range: 0.22 to 1.49) and decreases in rates of undesired behaviors (range: 0.15 to 2.15). Additionally, parents, who were indirectly trained, also demonstrated changes in their rates of desired (range: 0.30 to 0.60) and undesired behaviors (range: 0.16 to 0.61). Conclusions The intervention successfully modified HCP and parent behaviors. It represents a potentially new clinical way to decrease anxiety in children. A recently funded National Institute of Child Health and Human Development multi-site randomized controlled trial examining the efficacy of this intervention in reducing children's preoperative anxiety and improving their postoperative recovery is about to start. PMID:21606826

  12. Empirical modeling the topside ion density around 600 km based on ROCSAT-1 satellite observations

    NASA Astrophysics Data System (ADS)

    Liu, Libo; Le, Huijun; Chen, Yiding; Wan, Weixing; Huang, He

    The ROCSAT-1 satellite operated in a circular orbit at an altitude of 600 km with an inclination of 35 degrees from 1999 to 2004. The Ionospheric Plasma and Electrodynamics Instrument (IPEI) on board the satellite includes an Ion Trap (IT), which was mainly used to measure the total ion concentration. An empirical model of ion density was constructed using the data obtained from the IT with a temporal resolution of 1 s, for the solar proxy P10.7 ranging from 100 to 240 sfu under relatively quiet geomagnetic conditions (Ap ≤ 22). The model describes the ion density variations as functions of local time, day of year, solar activity level, longitude, and height within the altitude range of 560-660 km. An outstanding merit of the model is that it takes the altitude variation of the electron density into account. The model reproduces the ROCSAT-1 ion density accurately, with a root mean square error (RMSE) of 0.141, performing better than the International Reference Ionosphere 2007 (IRI2007), with an RMSE of 5.986. Furthermore, we use it to predict ion densities observed at similar altitudes by other satellites, such as the Japanese HINOTORI satellite, to further validate the model. The comparisons show that the relative error of 94.5% of the data lies within ±5%, and that of more than 99% of the data within ±15%. The model provides a way to describe the temporal and spatial variation of topside ion density. It can capture the variations of ion density under given conditions (including height) without altitude normalization before modeling. With this model, it is easier to understand the climatological features of topside ion density. Acknowledgements: Ionosonde data are provided by BNOSE of IGGCAS. This research was supported by the projects of the Chinese Academy of Sciences (KZZD-EW-01-3), the National Key Basic Research Program of China (2012CB825604), and the National Natural Science Foundation of China (41231065, 41174137).

  13. Combining Empirical Relationships with Data Based Mechanistic Modeling to Inform Solute Tracer Investigations across Stream Orders

    NASA Astrophysics Data System (ADS)

    Herrington, C.; Gonzalez-Pinzon, R.; Covino, T. P.; Mortensen, J.

    2015-12-01

    Solute transport studies in streams and rivers often begin with the introduction of conservative and reactive tracers into the water column. Information on the transport of these substances is then captured within tracer breakthrough curves (BTCs) and used to estimate, for instance, travel times and dissolved nutrient and carbon dynamics. Traditionally, these investigations have been limited to systems with small discharges (< 200 L/s) and short reach lengths (< 500 m), partly due to the need for a priori information on the reach's hydraulic characteristics (e.g., channel geometry, resistance and dispersion coefficients) to predict arrival times, times to peak concentration of the solute and mean travel times. Current techniques for acquiring these channel characteristics through preliminary tracer injections become cost prohibitive at higher stream orders, and the use of semi-continuous water quality sensors for collecting real-time information may be affected by erroneous readings masked by high turbidity (e.g., nitrate signals with SUNA instruments or fluorescence measures) and/or high total dissolved solids (e.g., making the use of salt tracers such as NaCl prohibitively expensive) in larger systems. Additionally, a successful time-of-travel study is valuable for only a single discharge and river stage. We have developed a method to predict tracer BTCs to inform sampling frequencies at small and large stream orders using empirical relationships developed from multiple tracer injections spanning several orders of magnitude in discharge and reach length. This method was successfully tested in 1st to 8th order systems along the Middle Rio Grande River Basin in New Mexico, USA.
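
    Empirical relationships spanning several orders of magnitude in discharge and reach length are typically fit as power laws. A minimal sketch (a hypothetical helper, not the authors' actual regression) that recovers y = a * x**b by linear least squares in log-log space:

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x**b by least squares in log-log space.

    Returns (a, b); all x and y values must be positive.
    """
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    # slope of log(y) vs log(x) is the power-law exponent b
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b
```

    For example, times-to-peak observed in a few preliminary injections could be fit against discharge, and the fit evaluated at a new discharge to choose sensor sampling windows in advance.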

  14. An Empirical Study of Neural Network-Based Audience Response Technology in a Human Anatomy Course for Pharmacy Students.

    PubMed

    Fernández-Alemán, José Luis; López-González, Laura; González-Sequeros, Ofelia; Jayne, Chrisina; López-Jiménez, Juan José; Carrillo-de-Gea, Juan Manuel; Toval, Ambrosio

    2016-04-01

    This paper presents an empirical study of a formative neural network-based assessment approach by using mobile technology to provide pharmacy students with intelligent diagnostic feedback. An unsupervised learning algorithm was integrated with an audience response system called SIDRA in order to generate states that collect some commonality in responses to questions and add diagnostic feedback for guided learning. A total of 89 pharmacy students enrolled on a Human Anatomy course were taught using two different teaching methods. Forty-four students employed intelligent SIDRA (i-SIDRA), whereas 45 students received the same training but without using i-SIDRA. A statistically significant difference was found between the experimental group (i-SIDRA) and the control group (traditional learning methodology), with t(87) = 6.598, p < 0.001. In four MCQ tests, the difference between the number of correct answers in the first attempt and in the last attempt was also studied. A global effect size of 0.644 was achieved in the meta-analysis carried out. The students expressed satisfaction with the content provided by i-SIDRA and the methodology used during the process of learning anatomy (M = 4.59). The new empirical contribution presented in this paper allows instructors to perform post hoc analyses of each particular student's progress to ensure appropriate training. PMID:26815339

  15. ECG-based heartbeat classification for arrhythmia detection: A survey.

    PubMed

    Luz, Eduardo José da S; Schwartz, William Robson; Cámara-Chávez, Guillermo; Menotti, David

    2016-04-01

    An electrocardiogram (ECG) measures the electric activity of the heart and has been widely used for detecting heart diseases due to its simplicity and non-invasive nature. By analyzing the electrical signal of each heartbeat, i.e., the combination of action impulse waveforms produced by different specialized cardiac tissues found in the heart, it is possible to detect some of its abnormalities. Over the last decades, several works have been developed to produce automatic ECG-based heartbeat classification methods. In this work, we survey the current state-of-the-art methods of ECG-based automated heartbeat classification for abnormality detection by presenting the ECG signal preprocessing, the heartbeat segmentation techniques, the feature description methods and the learning algorithms used. In addition, we describe some of the databases used for evaluation of methods, as indicated by a well-known standard developed by the Association for the Advancement of Medical Instrumentation (AAMI) and described in ANSI/AAMI EC57:1998/(R)2008 (ANSI/AAMI, 2008). Finally, we discuss limitations and drawbacks of the methods in the literature, presenting concluding remarks and future challenges, and we also propose an evaluation process workflow to guide authors in future works. PMID:26775139

  16. Holding-based network of nations based on listed energy companies: An empirical study on two-mode affiliation network of two sets of actors

    NASA Astrophysics Data System (ADS)

    Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili

    2016-05-01

    Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed from the two-mode affiliation network of two sets of actors, using data on worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between the two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features at the node level, edge level and entire network level, and explain the meanings of the different values of the topological features in combination with the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between nations and regions from a new perspective.
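
    The construction described, deriving a one-mode network from a two-mode affiliation network, can be sketched as follows; the company and shareholder names are hypothetical, and edges are weighted by the number of shared shareholders:

```python
def project_one_mode(affiliation):
    """Project a two-mode affiliation mapping onto the first node set.

    affiliation -- dict mapping each company to the set of its
                   shareholders (the second node set)
    Returns a dict of weighted one-mode edges:
    (company_a, company_b) -> number of shareholders in common.
    """
    edges = {}
    companies = sorted(affiliation)
    for i, a in enumerate(companies):
        for b in companies[i + 1:]:
            shared = affiliation[a] & affiliation[b]
            if shared:  # only link companies with at least one common holder
                edges[(a, b)] = len(shared)
    return edges
```

    Topological features (degree, edge weight distributions, and so on) can then be computed on the resulting one-mode graph, as the study does for the company-level holding network.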

  17. Effects of Personalization and Invitation Email Length on Web-Based Survey Response Rates

    ERIC Educational Resources Information Center

    Trespalacios, Jesús H.; Perkins, Ross A.

    2016-01-01

    Individual strategies to increase response rate and survey completion have been extensively researched. Recently, efforts have been made to investigate a combination of interventions to yield better response rates for web-based surveys. This study examined the effects of four different survey invitation conditions on response rate. From a large…

  18. Pinnacles and Pitfalls: Researcher Experiences from a Web-Based Survey of Secondary Teachers

    ERIC Educational Resources Information Center

    Watson, Julie; Anderson, Neil

    2005-01-01

    This article examines the experience of conducting a web-based survey with secondary teachers in Queensland schools. The survey was designed to collect data concerning teachers' attitudes and understanding about students with learning difficulties in their classes. Rather than discuss survey findings, however, the present article focuses on…

  19. Multiscale Detrended Cross-Correlation Analysis of Traffic Time Series Based on Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-04-01

In this paper, we propose multiscale detrended cross-correlation analysis (MSDCCA) to detect long-range power-law cross-correlation between signals in the presence of nonstationarity. To improve performance and robustness, we further introduce empirical mode decomposition (EMD) to eliminate noise effects, yielding a combined method we call MS-EDXA, and systematically investigate the multiscale cross-correlation structure of real traffic signals. We apply the MSDCCA and MS-EDXA methods to study cross-correlations in three situations: velocity and volume on one lane, velocities at the present and the next moment, and velocities on adjacent lanes, and compare their spectra. Where the difference between the MSDCCA and MS-EDXA spectra becomes negligible there is a crossover, which marks the turning point of the difference. The crossover results from the competition between noise effects in the original signals and the intrinsic fluctuation of the traffic signals, and it divides the plot of the spectra into two regions. In all three cases, the MS-EDXA method raises the average of the local scaling exponents, lowers their standard deviation, and yields relatively stable, persistent scaling of the cross-correlated behavior, making the analysis more precise and more robust once the noise is removed. Applying the MS-EDXA method also avoids the inaccurate characterization of the multiscale cross-correlation structure at short scales, including the spectrum minimum, the range of spectrum fluctuation, and the general trend, all of which are caused by noise in the original signals.
We conclude that traffic velocity and volume are long-range cross-correlated, consistent with their actual evolution, while velocities at the present and the next moment and velocities on adjacent lanes show strong cross-correlations both in temporal and…
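The detrended cross-correlation analysis at the core of MSDCCA can be sketched with plain NumPy. This is a minimal single-scale-loop illustration under our own naming and parameter choices, not the authors' implementation (a full MS-EDXA run would first denoise via EMD and sweep many scales):

```python
import numpy as np

def dcca_exponent(x, y, scales):
    """Estimate the detrended cross-correlation scaling exponent
    of two equal-length series (a minimal DCCA sketch)."""
    x = np.cumsum(x - np.mean(x))   # integrated profiles
    y = np.cumsum(y - np.mean(y))
    F = []
    for s in scales:
        n_boxes = len(x) // s
        t = np.arange(s)
        cov = []
        for i in range(n_boxes):
            xs = x[i * s:(i + 1) * s]
            ys = y[i * s:(i + 1) * s]
            # detrend each box with a linear least-squares fit
            rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
            ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
            cov.append(np.mean(rx * ry))
        F.append(np.sqrt(np.abs(np.mean(cov))))
    # scaling exponent = slope of log F(s) versus log s
    h, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return h
```

For two series sharing a common white-noise component, the estimated exponent comes out near 0.5, the uncorrelated-increments baseline against which persistent (h > 0.5) traffic behavior is judged.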

  20. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence

    ERIC Educational Resources Information Center

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the…

  1. Meta-Analysis of Group Learning Activities: Empirically Based Teaching Recommendations

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Foels, Rob

    2012-01-01

    Teaching researchers commonly employ group-based collaborative learning approaches in Teaching of Psychology teaching activities. However, the authors know relatively little about the effectiveness of group-based activities in relation to known psychological processes associated with group dynamics. Therefore, the authors conducted a meta-analytic…

  2. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    ERIC Educational Resources Information Center

    Henggeler, Scott W.; Sheidow, Ashli J.

    2012-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief…

  3. Formula-Based Public School Funding System in Victoria: An Empirical Analysis of Equity

    ERIC Educational Resources Information Center

    Bandaranayake, Bandara

    2013-01-01

This article explores the formula-based school funding system in the state of Victoria, Australia, where state funds are directly allocated to schools based on a range of equity measures. The impact of Victoria's funding system for education in terms of alleviating inequality and disadvantage is contentious, to say the least. It is difficult…

  4. [Empiric treatment of pyelonephritis].

    PubMed

    Iarovoĭ, S K; Shimanovskiĭ, N L; Kareva, E N

    2011-01-01

The article analyses the most typical clinical situations in the empirical treatment of pyelonephritis, including cases complicated by severe comorbid diseases: decompensated diabetes mellitus, chronic renal failure, and HIV infection. The choice of antibacterial medicines for empirical treatment of pyelonephritis is based on the results of the latest studies of antibiotic resistance among pyelonephritis pathogens, as well as on specific features of the pharmacokinetics and pharmacodynamics of antibacterial drugs. PMID:21815461

  5. Web-based dynamic Delphi: a new survey instrument

    NASA Astrophysics Data System (ADS)

    Yao, JingTao; Liu, Wei-Ning

    2006-04-01

We present a mathematical model for a dynamic Delphi survey method which takes advantage of Web technology. A comparative study of the performance of the conventional Delphi method and the dynamic Delphi instrument is conducted. It is suggested that a dynamic Delphi survey may form a consensus quickly. However, the result may not be robust due to judgment-leakage issues.

  6. The Empirical Interests of Developmental Psychology.

    ERIC Educational Resources Information Center

    McGraw, Kenneth O.

    1991-01-01

    Presents a classification system for developmental psychology that was constructed by means of a survey of articles published in the journals "Child Development" and "Developmental Psychology" in 1969 and 1987. The survey found that eight empirical questions encompassed the full range of empirical interests expressed by developmental…

  7. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    PubMed

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-01-01

This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. Clicks are generated through the user's blinking, with a detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple, quick, yet effective computational tool aimed at reducing artifacts from head movements as well as detecting blinking signals for mouse control. A Kalman filter is used as the state estimator for mouse position control and jitter removal. The average detection rate obtained was 94.9%. The experimental setup and some obtained results are presented. PMID:23948873
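The Kalman-filter jitter-removal step mentioned in the abstract can be sketched for one cursor axis as a textbook constant-velocity filter. This is a generic formulation, not the authors' implementation; the sampling interval and noise parameters below are illustrative assumptions:

```python
import numpy as np

def kalman_1d(zs, dt=0.02, q=1.0, r=4.0):
    """Constant-velocity Kalman filter for one cursor axis.
    zs: noisy position measurements; returns smoothed positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                   # we observe position only
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])       # process-noise covariance
    R = np.array([[r]])                          # measurement-noise variance
    x = np.array([[zs[0]], [0.0]])               # initial state
    P = np.eye(2) * 10.0                         # initial state covariance
    out = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)
```

On a slowly moving target with additive gyroscope-style noise, the filtered trajectory tracks the true position with visibly less jitter than the raw measurements.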


  9. Empirically Supported Treatments in Psychotherapy: Towards an Evidence-Based or Evidence-Biased Psychology in Clinical Settings?

    PubMed Central

    Castelnuovo, Gianluca

    2010-01-01

The field of research and practice in psychotherapy has been deeply influenced by two different approaches: the empirically supported treatments (ESTs) movement, linked with the evidence-based medicine (EBM) perspective, and the “Common Factors” approach, typically connected with the “Dodo Bird Verdict”. Regarding the first perspective, a list of ESTs in the mental health field has been maintained since 1998, and criteria for “well-established” and “probably efficacious” treatments have arisen. The development of these paradigms was motivated by the emergence of a “managerial” approach and related remuneration systems for mental health providers and insurance companies. In this article ESTs are presented, also underlining some possible criticisms. Finally, complementary approaches that could add different evidence to psychotherapy research, in comparison with the traditional EBM approach, are presented. PMID:21833197

  10. Analysis of Vibration and Noise of Construction Machinery Based on Ensemble Empirical Mode Decomposition and Spectral Correlation Analysis Method

    NASA Astrophysics Data System (ADS)

    Chen, Yuebiao; Zhou, Yiqi; Yu, Gang; Lu, Dan

In order to analyze the effect of engine vibration on cab noise of construction machinery across multiple frequency bands, a new method based on ensemble empirical mode decomposition (EEMD) and spectral correlation analysis is proposed. Firstly, the intrinsic mode functions (IMFs) of the vibration and noise signals are obtained by the EEMD method, and the IMFs occupying the same frequency bands are selected. Secondly, the spectral correlation coefficients between the selected IMFs are calculated, identifying the main frequency bands in which engine vibration has a significant impact on cab noise. Thirdly, the dominant frequencies are picked out and analyzed by spectral analysis. The study shows that the main frequency bands and dominant frequencies in which engine vibration has a serious impact on cab noise can be identified effectively by the proposed method, which provides effective guidance for noise reduction of construction machinery.
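The per-band comparison at the heart of the method can be illustrated as a correlation between amplitude spectra. This is a simplified sketch under our own naming: a full implementation would first decompose each signal into IMFs via EEMD and then compare matching IMF pairs rather than raw signals:

```python
import numpy as np

def spectral_correlation(a, b):
    """Pearson correlation between the amplitude spectra of two
    equal-length signals (stand-in for the per-IMF comparison)."""
    A = np.abs(np.fft.rfft(a))
    B = np.abs(np.fft.rfft(b))
    return np.corrcoef(A, B)[0, 1]
```

Two signals dominated by the same tone score near 1, while signals dominated by different tones score much lower, which is exactly how the method flags the bands where vibration drives cab noise.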

  11. Re-reading nursing and re-writing practice: towards an empirically based reformulation of the nursing mandate.

    PubMed

    Allen, Davina

    2004-12-01

    This article examines field studies of nursing work published in the English language between 1993 and 2003 as the first step towards an empirically based reformulation of the nursing mandate. A decade of ethnographic research reveals that, contrary to contemporary theories which promote an image of nursing work centred on individualised unmediated caring relationships, in real-life practice the core nursing contribution is that of the healthcare mediator. Eight bundles of activity that comprise this intermediary role are described utilising evidence from the literature. The mismatch between nursing's culture and ideals and the structure and constraints of the work setting is a chronic source of practitioner dissatisfaction. It is argued that the profession has little to gain by pursuing an agenda of holistic patient care centred on emotional intimacy and that an alternative occupational mandate focused on the healthcare mediator function might make for more humane health services and a more viable professional future. PMID:15601415

  12. Children's Experiences of Completing a Computer-Based Violence Survey: Finnish Child Victim Survey Revisited.

    PubMed

    Fagerlund, Monica; Ellonen, Noora

    2016-07-01

    The involvement of children as research subjects requires special considerations with regard to research practices and ethics. This is especially true concerning sensitive research topics such as sexual victimization. Prior research suggests that reflecting these experiences in a survey can cause negative feelings in child participants, although posing only a minimal to moderate risk. Analyzing only predefined, often negative feelings related to answering a sexual victimization survey has dominated the existing literature. In this article children's free-text comments about answering a victimization survey and experiences of sexual victimization are analyzed together to evaluate the effects of research participation in relation to this sensitive issue. Altogether 11,364 children, aged 11-12 and 15-16, participated in the Finnish Child Victim Survey in 2013. Of these, 69% (7,852) reflected on their feelings about answering the survey. Results indicate that both clearly negative and positive feelings are more prevalent among victimized children compared to their nonvictimized peers. Characteristics unique to sexual victimization as well as differences related to gender and age are also discussed. The study contributes to the important yet contradictory field of studying the effects of research participation on children. PMID:27472509

  13. Perceptions and Experiences of Research Participants on Gender-Based Violence Community Based Survey: Implications for Ethical Guidelines

    PubMed Central

    Sikweyiya, Yandisa; Jewkes, Rachel

    2012-01-01

Objective To explore how survey respondents perceived their experiences and the impact of participating in a survey, and to assess adverse consequences resulting from participation. Design Qualitative study involving purposefully selected participants who had taken part in a household-based survey. Methods This qualitative study was nested within a survey that investigated the prevalence of gender-based violence perpetration and victimization among adult men and women in South Africa. Thirteen in-depth interviews were conducted with male survey respondents and 10 with female respondents. Results A majority of informants, with no gender differences, perceived the survey interview as a rare opportunity to share their adverse or personal experiences in a 'safe' space. Gender differences were noted in the perceived risks of survey participation. Some women remained fearful after completing the survey that, should a breach of confidentiality or full disclosure of the survey content occur, they might be victimized by partners as punishment for participating without the men's approval. A number of informants discussed their survey participation with others. Among women with a history of interpersonal violence or currently in abusive relationships, however, full disclosure of the survey content was done with fear; the partners' responses were negative, and a few women reported receiving threatening remarks, but none reported being assaulted. In contrast, no man reported an adverse reaction by others. Informants with major life adversities reported that the survey had made them relive those experiences, causing them sadness and pain at the time. No informant perceived the survey as emotionally harmful or needed professional support because of the survey questions. Rather, the vast majority perceived benefit from survey participation. Conclusion Whilst no informant felt answering the survey questions had caused them emotional or physical harm, some were distressed and anxious, albeit…

  14. Accurate Young's modulus measurement based on Rayleigh wave velocity and empirical Poisson's ratio

    NASA Astrophysics Data System (ADS)

    Li, Mingxia; Feng, Zhihua

    2016-07-01

This paper presents a method for Young's modulus measurement based on Rayleigh wave speed. The error in Poisson's ratio has a weak influence on the measurement of Young's modulus based on Rayleigh wave speed, and Poisson's ratio varies minimally within a given material; thus, we can accurately estimate Young's modulus from the surface wave speed and a rough Poisson's ratio. We numerically analysed three methods using Rayleigh, longitudinal, and transversal wave speeds, respectively, and the error in Poisson's ratio shows the least influence on the result in the method based on Rayleigh wave speed. An experiment was performed and proved the feasibility of this method. The device for measuring speed can be small, and no sample pretreatment is needed. Hence, developing a portable instrument based on this method is possible. This method makes a good compromise between usability and precision.
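The weak sensitivity to Poisson's ratio can be checked numerically using Viktorov's well-known approximation for the Rayleigh-to-shear speed ratio, c_R ≈ c_s(0.87 + 1.12ν)/(1 + ν), together with E = 2G(1 + ν) and G = ρc_s². This is a generic textbook relation, not necessarily the authors' exact formulation, and the steel-like inputs below are illustrative:

```python
def youngs_modulus_from_rayleigh(c_r, rho, nu):
    """Estimate Young's modulus E (Pa) from Rayleigh wave speed c_r (m/s),
    density rho (kg/m^3), and an assumed Poisson's ratio nu,
    via Viktorov's approximation c_R ~ c_s*(0.87 + 1.12*nu)/(1 + nu)."""
    c_s = c_r * (1.0 + nu) / (0.87 + 1.12 * nu)   # shear wave speed
    G = rho * c_s ** 2                             # shear modulus
    return 2.0 * G * (1.0 + nu)                    # E = 2G(1 + nu)
```

With c_r ≈ 2960 m/s and ρ ≈ 7850 kg/m³ (steel-like values), sweeping ν from 0.25 to 0.33 changes the estimated E by only a few percent, which is the paper's point: a rough Poisson's ratio suffices.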

  15. Performance-based management and quality of work: an empirical assessment.

    PubMed

    Falzon, Pierre; Nascimento, Adelaide; Gaudart, Corinne; Piney, Cécile; Dujarier, Marie-Anne; Germe, Jean-François

    2012-01-01

In France, in the private sector as in the public sector, performance-based management tends to become the norm. Performance-based management is supposed to improve service quality, productivity and efficiency, and the transparency of allotted means and achieved results, and to better focus the activity of employees and of the whole organization. This text reports a study conducted for the French Ministry of Budget by a team of researchers in ergonomics, sociology and management science, in order to assess the impact of performance-based management on employees, on teams and on work organization. About 100 interviews were conducted with employees of all categories, and 6 working groups were set up in order to discuss and validate or amend our first analyses. Results concern several aspects: workload and work intensification, indicators and performance management, and the transformation of jobs induced by performance management. PMID:22317310

  16. Building performance-based accountability with limited empirical evidence: performance measurement for public health preparedness.

    PubMed

    Shelton, Shoshana R; Nelson, Christopher D; McLees, Anita W; Mumford, Karen; Thomas, Craig

    2013-08-01

    Efforts to respond to performance-based accountability mandates for public health emergency preparedness have been hindered by a weak evidence base linking preparedness activities with response outcomes. We describe an approach to measure development that was successfully implemented in the Centers for Disease Control and Prevention Public Health Emergency Preparedness Cooperative Agreement. The approach leverages insights from process mapping and experts to guide measure selection, and provides mechanisms for reducing performance-irrelevant variation in measurement data. Also, issues are identified that need to be addressed to advance the science of measurement in public health emergency preparedness. PMID:24229520

  17. Introducing Evidence-Based Principles to Guide Collaborative Approaches to Evaluation: Results of an Empirical Process

    ERIC Educational Resources Information Center

    Shulha, Lyn M.; Whitmore, Elizabeth; Cousins, J. Bradley; Gilbert, Nathalie; al Hudib, Hind

    2016-01-01

    This article introduces a set of evidence-based principles to guide evaluation practice in contexts where evaluation knowledge is collaboratively produced by evaluators and stakeholders. The data from this study evolved in four phases: two pilot phases exploring the desirability of developing a set of principles; an online questionnaire survey…

  18. Teaching Standards-Based Group Work Competencies to Social Work Students: An Empirical Examination

    ERIC Educational Resources Information Center

    Macgowan, Mark J.; Vakharia, Sheila P.

    2012-01-01

    Objectives: Accreditation standards and challenges in group work education require competency-based approaches in teaching social work with groups. The Association for the Advancement of Social Work with Groups developed Standards for Social Work Practice with Groups, which serve as foundation competencies for professional practice. However, there…

  19. Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools

    ERIC Educational Resources Information Center

    Samdal, Oddrun; Rowling, Louise

    2011-01-01

    Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…

  20. Homogeneity in Community-Based Rape Prevention Programs: Empirical Evidence of Institutional Isomorphism

    ERIC Educational Resources Information Center

    Townsend, Stephanie M.; Campbell, Rebecca

    2007-01-01

    This study examined the practices of 24 community-based rape prevention programs. Although these programs were geographically dispersed throughout one state, they were remarkably similar in their approach to rape prevention programming. DiMaggio and Powell's (1991) theory of institutional isomorphism was used to explain the underlying causes of…

  1. Journeys into Inquiry-Based Elementary Science: Literacy Practices, Questioning, and Empirical Study

    ERIC Educational Resources Information Center

    Howes, Elaine V.; Lim, Miyoun; Campos, Jaclyn

    2009-01-01

Teaching literacy in inquiry-based science-teaching settings has recently become a focus of research in science education. Because professional scientists' uses of reading, writing, and speaking are foundational to their work, as well as to nonscientists' comprehension of it, it follows that literacy practices should also be central to science…

  2. Young Readers' Narratives Based on a Picture Book: Model Readers and Empirical Readers

    ERIC Educational Resources Information Center

    Hoel, Trude

    2015-01-01

The article presents parts of a research project whose aim is to investigate six- to seven-year-old children's language use in storytelling. The children's oral texts are based on the wordless picture book "Frog, Where Are You?", which has been, and still remains, a frequent tool for collecting narratives from children. The Frog story…

  3. Perceptions of the Effectiveness of System Dynamics-Based Interactive Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Qudrat-Ullah, Hassan

    2010-01-01

    The use of simulations in general and of system dynamics simulation based interactive learning environments (SDILEs) in particular is well recognized as an effective way of improving users' decision making and learning in complex, dynamic tasks. However, the effectiveness of SDILEs in classrooms has rarely been evaluated. This article describes…

  4. An Adaptive E-Learning System Based on Students' Learning Styles: An Empirical Study

    ERIC Educational Resources Information Center

    Drissi, Samia; Amirat, Abdelkrim

    2016-01-01

Personalized e-learning implementation is recognized as one of the most interesting research areas in distance web-based education. Since the learning style of each learner is different, e-learning must be fitted to the different needs of learners. This paper presents an approach to integrating learning styles into adaptive e-learning hypermedia.…

  5. An Empirical Typology of Residential Care/Assisted Living Based on a Four-State Study

    ERIC Educational Resources Information Center

    Park, Nan Sook; Zimmerman, Sheryl; Sloane, Philip D.; Gruber-Baldini, Ann L.; Eckert, J. Kevin

    2006-01-01

    Purpose: Residential care/assisted living describes diverse facilities providing non-nursing home care to a heterogeneous group of primarily elderly residents. This article derives typologies of assisted living based on theoretically and practically grounded evidence. Design and Methods: We obtained data from the Collaborative Studies of Long-Term…

  6. Preparation, Practice, and Performance: An Empirical Examination of the Impact of Standards-Based Instruction on Secondary Students' Math and Science Achievement

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2009-01-01

    For almost two decades proponents of educational reform have advocated the use of standards-based education in maths and science classrooms for improving teacher practices, increasing student learning, and raising the quality of maths and science instruction. This study empirically examined the impact of specific standards-based teacher…

  7. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    PubMed Central

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho-kwang

    2014-01-01

Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions in 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). The cation radii dependence of the phase transition pressure (T → cT) shows an opposite trend in which the compounds with larger ambient radii cations have a higher transition pressure. PMID:25417655

  8. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  9. Lake Superior Zooplankton Biomass Predictions from LOPC Tow Surveys Compare Well with a Probability Based Net Survey

    EPA Science Inventory

    We conducted a probability-based sampling of Lake Superior in 2006 and compared the zooplankton biomass estimate with laser optical plankton counter (LOPC) predictions. The net survey consisted of 52 sites stratified across three depth zones (0-30, 30-150, >150 m). The LOPC tow...

  10. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    PubMed Central

    Liu, Chung-Feng; Tsai, Yung-Chieh; Jang, Fong-Lin

    2013-01-01

    The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a web-based infertile PHR system and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors will influence the behavioral intentions (BI) of infertile patients to use the PHR. From ninety participants from a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHR and improve the quality of the physician-patient relationship to increase patients’ intention of using PHR. PMID:24142185

  11. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    PubMed Central

    Henggeler, Scott W.; Sheidow, Ashli J.

    2011-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief strategic family therapy. In addition to summarizing the theoretical and clinical bases of these treatments, their results in efficacy and effectiveness trials are examined with particular emphasis on any demonstrated capacity to achieve favorable outcomes when implemented by real world practitioners in community practice settings. Special attention is also devoted to research on purported mechanisms of change as well as the long-term sustainability of outcomes achieved by these treatment models. Importantly, we note that the developers of each of the models have developed quality assurance systems to support treatment fidelity and youth and family outcomes; and the developers have formed purveyor organizations to facilitate the large scale transport of their respective treatments to community settings nationally and internationally. PMID:22283380

  12. An empirical evaluation of exemplar based image inpainting algorithms for natural scene image completion

    NASA Astrophysics Data System (ADS)

    Sangeetha, K.; Sengottuvelan, P.

    2013-03-01

Image inpainting is the process of filling in missing regions of an image so as to preserve its overall continuity; it is the manipulation and modification of an image in a form that is not easily detected. Digital image inpainting is a relatively new area of research, but numerous different approaches to the inpainting problem have been proposed since the concept was first introduced. This paper analyzes and compares two recent exemplar-based inpainting algorithms, by Zhaolin Lu et al. and Hao Guo et al. A number of examples on real images are demonstrated to evaluate the results of the algorithms using the peak signal-to-noise ratio (PSNR).
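The PSNR metric used for the comparison is standard: PSNR = 10·log10(MAX²/MSE), where MAX is the peak pixel value. A minimal NumPy version for 8-bit images (our own helper, not code from the paper):

```python
import numpy as np

def psnr(original, restored, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((original.astype(float) - restored.astype(float)) ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

A restored image that differs from the original by one grey level everywhere gives MSE = 1 and hence PSNR = 20·log10(255) ≈ 48.13 dB; higher values indicate a more faithful inpainting.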

  13. Using Information and Communications Technology in a National Population-Based Survey: The Kenya AIDS Indicator Survey 2012

    PubMed Central

    Ojwang’, James K.; Lee, Veronica C.; Waruru, Anthony; Ssempijja, Victor; Ng’ang’a, John G.; Wakhutu, Brian E.; Kandege, Nicholas O.; Koske, Danson K.; Kamiru, Samuel M.; Omondi, Kenneth O.; Kakinyi, Mutua; Kim, Andrea A.; Oluoch, Tom

    2016-01-01

Background With improvements in technology, electronic data capture (EDC) for large surveys is feasible. EDC offers benefits over traditional paper-based data collection, including more accurate data, greater completeness of data, and a decreased data-cleaning burden. Methods The second Kenya AIDS Indicator Survey (KAIS 2012) was a population-based survey of persons aged 18 months to 64 years. A software application was designed to capture the interview, specimen collection, and home-based testing and counseling data. The application included: interview translations for local languages; options for single, multiple, and fill-in responses; and automated participant eligibility determination. Data quality checks were programmed to automate skip patterns and prohibit outlier responses. A data sharing architecture was developed to transmit the data in real time from the field to a central server over a virtual private network. Results KAIS 2012 was conducted between October 2012 and February 2013. Overall, 68,202 records for the interviews, specimen collection, and home-based testing and counseling were entered into the application. Challenges arose during implementation, including poor connectivity and a systems malfunction that created duplicate records, which prevented timely data transmission to the central server. Data cleaning was minimal given the data quality control measures. Conclusions KAIS 2012 demonstrated the feasibility of using EDC in a population-based survey. The benefits of EDC were apparent in data quality and the minimal time needed for data cleaning. Several important lessons were learned, such as the time and monetary investment required before survey implementation, the importance of continuous application testing, and contingency plans for data transmission due to connectivity challenges. PMID:24732816

  14. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    DOE PAGES Beta

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; et al

    2014-11-24

    Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions in 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows an opposite trend in which the compounds with larger ambient radii cations have a higher transition pressure.

  15. An Empirical Comparison of Tree-Based Methods for Propensity Score Estimation

    PubMed Central

    Watkins, Stephanie; Jonsson-Funk, Michele; Brookhart, M Alan; Rosenberg, Steven A; O'Shea, T Michael; Daniels, Julie

    2013-01-01

    Objective To illustrate the use of ensemble tree-based methods (random forest classification [RFC] and bagging) for propensity score estimation and to compare these methods with logistic regression, in the context of evaluating the effect of physical and occupational therapy on preschool motor ability among very low birth weight (VLBW) children. Data Source We used secondary data from the Early Childhood Longitudinal Study Birth Cohort (ECLS-B) between 2001 and 2006. Study Design We estimated the predicted probability of treatment using tree-based methods and logistic regression (LR). We then modeled the exposure-outcome relation using weighted LR models while considering covariate balance and precision for each propensity score estimation method. Principal Findings Among approximately 500 VLBW children, therapy receipt was associated with moderately improved preschool motor ability. Overall, ensemble methods produced the best covariate balance (Mean Squared Difference: 0.03–0.07) and the most precise effect estimates compared to LR (Mean Squared Difference: 0.11). The overall magnitude of the effect estimates was similar between RFC and LR estimation methods. Conclusion Propensity score estimation using RFC and bagging produced better covariate balance with increased precision compared to LR. Ensemble methods are a useful alternative to logistic regression to control confounding in observational studies. PMID:23701015
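
    The estimate-then-weight pipeline compared in this study can be sketched with scikit-learn on synthetic data standing in for the ECLS-B covariates; `RandomForestClassifier` stands in for RFC, and the clipping bounds in the weight function are an illustrative choice, not from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))                               # baseline covariates
p_true = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
treated = rng.binomial(1, p_true)                         # therapy receipt

# Propensity scores from a tree-based ensemble and from logistic regression
ps_rfc = (RandomForestClassifier(n_estimators=200, random_state=0)
          .fit(X, treated).predict_proba(X)[:, 1])
ps_lr = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

def iptw(ps, treated):
    """Inverse-probability-of-treatment weights; scores are clipped away
    from 0/1 so that the weights stay finite."""
    ps = np.clip(ps, 0.01, 0.99)
    return np.where(treated == 1, 1 / ps, 1 / (1 - ps))

w_rfc, w_lr = iptw(ps_rfc, treated), iptw(ps_lr, treated)
```

    The weighted outcome model (a weighted LR in the study) would then be fit once with `w_rfc` and once with `w_lr`, and covariate balance compared between the two estimation methods.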

  16. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    SciTech Connect

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho -kwang

    2014-11-24

    Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions in 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows an opposite trend in which the compounds with larger ambient radii cations have a higher transition pressure.

  17. Empirically based guidelines for moderate drinking: 1-year results from three studies with problem drinkers.

    PubMed Central

    Sanchez-Craig, M; Wilkinson, D A; Davila, R

    1995-01-01

    OBJECTIVES. The study was conducted to refine guidelines on moderate drinking for problem drinkers, persons whose alcohol use is hazardous or harmful. Information on levels of alcohol intake unlikely to cause problems is useful for health professionals, educators, and policymakers. METHODS. Based on their reports of alcohol-related problems, participants in three studies assessing interventions to reduce heavy drinking (114 men, 91 women) were categorized as "problem-free" or "problem" drinkers at follow-up. Drinking measures were examined to identify patterns separating these outcome categories. RESULTS. Analyses using 95% confidence intervals for means on drinking measures showed that guidelines should be sex-specific. Based on analyses of positive and negative predictive value, sensitivity, and specificity, it is recommended that men consume no more than 4 standard drinks in any day and 16 drinks in any week, and that women consume no more than 3 drinks in any day and 12 drinks in any week. CONCLUSIONS. These guidelines are consistent with those from several official bodies and should be useful for advising problem drinkers when moderation is a valid treatment goal. Their applicability to the general population is unevaluated. PMID:7762717

  18. An Empirical Determination of the Intergalactic Background Light from UV to FIR Wavelengths Using FIR Deep Galaxy Surveys and the Gamma-Ray Opacity of the Universe

    NASA Astrophysics Data System (ADS)

    Stecker, Floyd W.; Scully, Sean T.; Malkan, Matthew A.

    2016-08-01

    We have previously calculated the intergalactic background light (IBL) as a function of redshift from the Lyman limit in the far-ultraviolet to a wavelength of 5 μm in the near-infrared range, based purely on data from deep galaxy surveys. Here, we use similar methods to determine the mid- and far-infrared IBL from 5 to 850 μm. Our approach enables us to constrain the range of photon densities by determining the uncertainties in observationally determined luminosity densities and spectral gradients. By also including the effect of the 2.7 K cosmic background photons, we determine upper and lower limits on the opacity of the universe to γ-rays up to PeV energies within a 68% confidence band. Our direct results on the IBL are consistent with those from complementary γ-ray analyses using observations from the Fermi γ-ray space telescope and the H.E.S.S. air Čerenkov telescope. Thus, we find no evidence of previously suggested processes for the modification of γ-ray spectra other than that of absorption by pair production alone.

  19. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study †

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
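
    The recognition task can be sketched with a first-order Markov chain per strategy, a simplified stand-in for a full PFA (which would also carry per-state emission structure). The two toy "strategies" and the binary action alphabet below are invented for illustration.

```python
import numpy as np

def fit_markov(traces, n_sym, alpha=1.0):
    """Estimate Laplace-smoothed transition probabilities from symbol traces."""
    counts = np.full((n_sym, n_sym), alpha)
    for t in traces:
        for a, b in zip(t[:-1], t[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def log_lik(trace, P):
    """Log-likelihood of a trace under transition matrix P."""
    return sum(np.log(P[a, b]) for a, b in zip(trace[:-1], trace[1:]))

# Two toy strategies: one tends to repeat its last action, one alternates
rng = np.random.default_rng(1)
repeat = [list(np.repeat(rng.integers(0, 2, 5), 4)) for _ in range(20)]
alternate = [list(np.tile([0, 1], 10)) for _ in range(20)]

models = {"repeat": fit_markov(repeat, 2), "alternate": fit_markov(alternate, 2)}

def recognise(trace):
    """Behavioral Recognition: pick the strategy with the highest likelihood."""
    return max(models, key=lambda m: log_lik(trace, models[m]))
```

    Behavioral Cloning corresponds to using a fitted model generatively (sampling the next action from its transition row) rather than only for scoring.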

  20. Psychological First Aid: A Consensus-Derived, Empirically Supported, Competency-Based Training Model

    PubMed Central

    Everly, George S.; Brown, Lisa M.; Wendelboe, Aaron M.; Abd Hamid, Nor Hashidah; Tallchief, Vicki L.; Links, Jonathan M.

    2014-01-01

    Surges in demand for professional mental health services occasioned by disasters represent a major public health challenge. To build response capacity, numerous psychological first aid (PFA) training models for professional and lay audiences have been developed that, although often concurring on broad intervention aims, have not systematically addressed pedagogical elements necessary for optimal learning or teaching. We describe a competency-based model of PFA training developed under the auspices of the Centers for Disease Control and Prevention and the Association of Schools of Public Health. We explain the approach used for developing and refining the competency set and summarize the observable knowledge, skills, and attitudes underlying the 6 core competency domains. We discuss the strategies for model dissemination, validation, and adoption in professional and lay communities. PMID:23865656

  1. An Empirical Pixel-Based CTE Correction for ACS/WFC

    NASA Astrophysics Data System (ADS)

    Anderson, Jay

    2010-07-01

    This presentation summarizes a paper that has been recently published in PASP, Anderson & Bedin (2010). The paper describes our pixel-based approach to correcting ACS data for imperfect CTE (charge-transfer efficiency). We developed the approach by characterizing the size and profiles of trails behind warm pixels in dark exposures. We found an algorithm that simulates the way imperfect CTE impacts the readout process. To correct images for imperfect CTE, we use a forward-modeling procedure to determine the likely original distribution of charge, given the distribution that was read out. We applied this CTE-reconstruction algorithm to science images and found that the fluxes, positions and shapes of stars were restored to high fidelity. The ACS team is currently working to make this correction available to the public; they are also running tests to determine whether and how best to implement it in the pipeline.
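
    The forward-modeling inversion can be illustrated with a toy one-dimensional readout model. The trail shape and trap fraction below are invented for illustration and are not the ACS/WFC parameters; the point is the fixed-point iteration that finds the column whose simulated readout matches the observation.

```python
import numpy as np

def readout(col, trap_frac=0.2):
    """Toy forward model: each pixel leaves a fraction of its charge in a
    short exponential trail behind it (simulating charge traps)."""
    trail = trap_frac * np.array([0.5, 0.3, 0.2])
    out = col * (1 - trap_frac)
    for k, f in enumerate(trail, start=1):
        out[k:] += f * col[:-k]
    return out

def correct(observed, n_iter=30):
    """Iteratively estimate the original column: nudge the estimate by the
    mismatch between the observation and its simulated readout."""
    est = observed.copy()
    for _ in range(n_iter):
        est += observed - readout(est)
    return est

col = np.zeros(50); col[10] = 100.0     # a single bright "warm pixel"
obs = readout(col)                      # trailed, as read out
rec = correct(obs)                      # reconstructed original
```

    Because the toy model conserves flux (the trail fractions sum to the trapped fraction), the reconstruction recovers both the pixel value and the total flux, mirroring the paper's finding that the trails contain essentially all of the lost flux.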

  2. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
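
    For the binomial-measure case the joint partition function with two moment orders, and its mass exponent, can be sketched directly. The two cascade parameters (0.3 and 0.4) are arbitrary illustrative choices; for these deterministic measures the scaling is exact, so the fitted exponent matches the analytic value.

```python
import numpy as np

def binomial_measure(p, levels):
    """Deterministic binomial multifractal measure on 2**levels cells."""
    m = np.array([1.0])
    for _ in range(levels):
        m = np.concatenate([p * m, (1 - p) * m])
    return m

def joint_tau(mu, nu, q1, q2):
    """Mass exponent tau(q1, q2) from the scaling of the joint partition
    function chi(q1, q2; s) = sum over boxes of mu_box**q1 * nu_box**q2."""
    n = len(mu)
    log_size, log_chi = [], []
    s = 1
    while s <= n // 4:
        mb = mu.reshape(-1, s).sum(axis=1)     # coarse-grain into boxes of s cells
        nb = nu.reshape(-1, s).sum(axis=1)
        log_chi.append(np.log(np.sum(mb**q1 * nb**q2)))
        log_size.append(np.log(s / n))
        s *= 2
    slope, _ = np.polyfit(log_size, log_chi, 1)
    return slope

mu = binomial_measure(0.3, 12)
nu = binomial_measure(0.4, 12)
```

    For two binomial cascades built on the same partition, tau(q1, q2) = -log2(p1**q1 * p2**q2 + (1-p1)**q1 * (1-p2)**q2), which the numerical fit reproduces.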

  3. A prediction procedure for propeller aircraft flyover noise based on empirical data

    NASA Astrophysics Data System (ADS)

    Smith, M. H.

    1981-04-01

    Forty-eight different flyover noise certification tests are analyzed using multiple linear regression methods. A prediction model is presented based on this analysis, and the results compared with the test data and two other prediction methods. The aircraft analyzed include 30 single engine aircraft, 16 twin engine piston aircraft, and two twin engine turboprops. The importance of helical tip Mach number is verified and the relationship of several other aircraft, engine, and propeller parameters is developed. The model shows good agreement with the test data and is at least as accurate as the other prediction methods. It has the advantage of being somewhat easier to use since it is in the form of a single equation.
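
    A single-equation regression model of this kind can be sketched with ordinary least squares on synthetic data. The predictors, coefficients, and units below are hypothetical stand-ins, not the fitted values from the paper; they only illustrate regressing a noise level on log-transformed aircraft parameters such as helical tip Mach number.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 48                                       # one row per certification test
mach = rng.uniform(0.6, 0.95, n)             # helical tip Mach number
power = rng.uniform(50, 300, n)              # engine power (hypothetical unit)
# Synthetic "measured" flyover noise level, dB, with scatter
level = 40 * np.log10(mach) + 10 * np.log10(power) + 70 + rng.normal(0, 1, n)

# Multiple linear regression on log-transformed predictors: a single equation
A = np.column_stack([np.log10(mach), np.log10(power), np.ones(n)])
coef, *_ = np.linalg.lstsq(A, level, rcond=None)
pred = A @ coef
```

    The fitted `coef` plays the role of the paper's single prediction equation: new aircraft parameters are plugged into the same linear form to predict the flyover level.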

  4. Empirical estimation of consistency parameter in intertemporal choice based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki; Oono, Hidemi; Radford, Mark H. B.

    2007-07-01

    Impulsivity and inconsistency in intertemporal choice have been attracting attention in econophysics and neuroeconomics. Although loss of self-control by substance abusers is strongly related to their inconsistency in intertemporal choice, researchers in neuroeconomics and psychopharmacology have usually studied impulsivity in intertemporal choice using a discount rate (e.g. hyperbolic k), with little effort being expended on parameterizing a subject's inconsistency in intertemporal choice. Recent studies using Tsallis’ statistics-based econophysics have found a discount function (i.e. q-exponential discount function), which may continuously parameterize a subject's consistency in intertemporal choice. In order to examine the usefulness of the consistency parameter (0⩽q⩽1) in the q-exponential discounting function in behavioral studies, we experimentally estimated the consistency parameter q in Tsallis’ statistics-based discounting function by assessing the points of subjective equality (indifference points) at seven delays (1 week-25 years) in humans (N=24). We observed that most (N=19) subjects’ intertemporal choice was completely inconsistent (q=0, i.e. hyperbolic discounting), the mean consistency (0⩽q⩽1) was smaller than 0.5, and only one subject had a completely consistent intertemporal choice (q=1, i.e. exponential discounting). There was no significant correlation between impulsivity and inconsistency parameters. Our results indicate that individual differences in consistency in intertemporal choice can be parameterized by introducing a q-exponential discount function and most people discount delayed rewards hyperbolically, rather than exponentially (i.e. mean q is smaller than 0.5). Further, impulsivity and inconsistency in intertemporal choice can be considered as separate behavioral tendencies. The usefulness of the consistency parameter q in psychopharmacological studies of addictive behavior was demonstrated in the present study.
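
    The q-exponential discount function is compact enough to state directly; the discount rate k below is an arbitrary example value. At q=0 it reduces to hyperbolic discounting, and as q→1 it recovers exponential discounting, so q interpolates between the two endpoints observed in the study.

```python
import numpy as np

def q_discount(delay, k, q):
    """Tsallis q-exponential discount function:
    F(D) = 1 / (1 + (1-q)*k*D)**(1/(1-q)),
    with the q -> 1 limit F(D) = exp(-k*D)."""
    if np.isclose(q, 1.0):
        return np.exp(-k * delay)
    return (1 + (1 - q) * k * delay) ** (-1 / (1 - q))

D = np.linspace(0, 25, 6)   # delays, e.g. in years
```

    Fitting k and q to a subject's indifference points then yields the impulsivity (k) and consistency (q) parameters as separate quantities, which is the separation the study exploits.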

  5. Emotional competencies in geriatric nursing: empirical evidence from a computer based large scale assessment calibration study.

    PubMed

    Kaspar, Roman; Hartig, Johannes

    2016-03-01

    The care of older people was described as involving substantial emotion-related affordances. Scholars in vocational training and nursing disagree whether emotion-related skills could be conceptualized and assessed as a professional competence. Studies on emotion work and empathy regularly neglect the multidimensionality of these phenomena and their relation to the care process, and are rarely conclusive with respect to nursing behavior in practice. To test the status of emotion-related skills as a facet of client-directed geriatric nursing competence, 402 final-year nursing students from 24 German schools responded to a 62-item computer-based test. 14 items were developed to represent emotion-related affordances. Multi-dimensional IRT modeling was employed to assess a potential subdomain structure. Emotion-related test items did not form a separate subdomain, and were found to be discriminating across the whole competence continuum. Tasks concerning emotion work and empathy are reliable indicators for various levels of client-directed nursing competence. Claims for a distinct emotion-related competence in geriatric nursing, however, appear excessive with a process-oriented perspective. PMID:26108300

  6. An empirical comparison of different LDA methods in fMRI-based brain states decoding.

    PubMed

    Xia, Maogeng; Song, Sutao; Yao, Li; Long, Zhiying

    2015-01-01

    Decoding brain states from response patterns with multivariate pattern recognition techniques is a popular method for detecting multivoxel patterns of brain activation. These patterns are informative with respect to a subject's perceptual or cognitive states. Linear discriminant analysis (LDA) cannot be directly applied to fMRI data analysis because of the "few samples and large features" nature of functional magnetic resonance imaging (fMRI) data. Although several improved LDA methods have been used in fMRI-based decoding, little is known regarding the relative performance of different LDA classifiers on fMRI data. In this study, we compared five LDA classifiers using both simulated data with varied noise levels and real fMRI data. The compared LDA classifiers include LDA combined with PCA (LDA-PCA), LDA with three types of regularizations (identity matrix, diagonal matrix and scaled identity matrix) and LDA with optimal-shrinkage covariance estimator using Ledoit and Wolf lemma (LDA-LW). The results indicated that LDA-LW was the most robust to noises. Moreover, LDA-LW and LDA with scaled identity matrix showed better stability and classification accuracy than the other methods. LDA-LW demonstrated the best overall performance. PMID:26405876
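
    The LDA-LW variant can be sketched with scikit-learn, whose `shrinkage='auto'` option (with the `lsqr` solver) uses the Ledoit-Wolf optimal-shrinkage covariance estimator. The "fMRI-like" data below are synthetic, built only to reproduce the few-samples, many-features regime in which plain LDA is ill-posed.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n, p = 40, 200                        # few samples, many features (fMRI-like)
y = np.repeat([0, 1], n // 2)         # two brain states
X = rng.normal(size=(n, p))
X[y == 1, :5] += 2.0                  # signal carried by the first 5 "voxels"

# Plain LDA cannot invert the covariance here (p >> n); Ledoit-Wolf
# shrinkage regularizes the estimate automatically.
lda_lw = LinearDiscriminantAnalysis(solver='lsqr', shrinkage='auto').fit(X, y)
acc = lda_lw.score(X, y)
```

    In practice the accuracy would be assessed with cross-validation rather than on the training set; the training-set score here only shows that the shrinkage estimator yields a usable classifier where the unregularized one fails.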

  7. Empirically based recommendations to support parents facing the dilemma of paediatric cadaver organ donation.

    PubMed

    Bellali, T; Papazoglou, I; Papadatou, D

    2007-08-01

    The aim of the study was to describe the challenges donor and non-donor parents encounter before, during, and after the organ donation decision, and to identify parents' needs and expectations from health care professionals. A further aim was to propose evidence-based recommendations for effectively introducing the option of donation, and supporting families through the grieving process. This study was undertaken as part of a larger research project investigating the experiences of Greek parents who consented or declined organ and tissue donation, using a qualitative methodology for data collection and analysis. The experiences of 22 Greek bereaved parents of 14 underage brain dead children were studied through semi-structured interviews. Parents' decision-making process was described as challenging and fraught with difficulties both before and after the donation period. Identified challenges were clustered into: (a) personal challenges, (b) conditions of organ request, and (c) interpersonal challenges. Parents' main concern following donation was the lack of information about transplantation outcomes. Findings led to a list of recommendations for nurses and other health professionals for approaching and supporting parents in making choices about paediatric organ donation that are appropriate to them, and for facilitating their adjustment to the sudden death of their underage child. PMID:17475498

  8. Turbulent inflow conditions for large-eddy simulation based on low-order empirical model

    NASA Astrophysics Data System (ADS)

    Perret, Laurent; Delville, Joël; Manceau, Rémi; Bonnet, Jean-Paul

    2008-07-01

    Generation of turbulent inflow boundary conditions is performed by interfacing an experimental database acquired by particle image velocimetry to a computational code. The proposed method ensures that the velocity fields introduced as inlet conditions in the computational code present correct one- and two-point spatial statistics and a realistic temporal dynamics. This approach is based on the use of the proper orthogonal decomposition (POD) to interpolate and extrapolate the experimental data onto the numerical mesh and to model both the temporal dynamics and the spatial organization of the flow in the inlet section. Realistic representation of the flow is achieved by extracting and modeling independently its coherent and incoherent parts. A low-order dynamical model is derived from the experimental database in order to provide the temporal evolution of the most energetic structures. The incoherent motion is modeled by employing time series of Gaussian random numbers to mimic the temporal evolution of higher order POD modes. Validation of the proposed method is provided by performing a large-eddy simulation of a turbulent plane mixing layer, which is compared to experimental results.
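
    The POD step at the core of the method can be sketched as an SVD of a snapshot matrix. The synthetic "flow" below is two analytic space-time modes plus noise, standing in for the PIV database; the low-order model keeps only the most energetic modes, as the inflow generator does for the coherent part of the motion.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 64)      # spatial coordinate
t = np.linspace(0, 10, 200)            # snapshot times
# Snapshot matrix (space x time): two coherent modes plus measurement noise
U_snap = (np.outer(np.sin(x), np.sin(2 * t))
          + 0.5 * np.outer(np.sin(2 * x), np.cos(5 * t))
          + 0.01 * rng.normal(size=(64, 200)))

# POD = SVD of the snapshot matrix: spatial modes, energies, temporal coefficients
phi, sigma, coeffs = np.linalg.svd(U_snap, full_matrices=False)

def reconstruct(r):
    """Low-order representation keeping the r most energetic POD modes."""
    return phi[:, :r] @ np.diag(sigma[:r]) @ coeffs[:r]

err1 = np.linalg.norm(U_snap - reconstruct(1)) / np.linalg.norm(U_snap)
err2 = np.linalg.norm(U_snap - reconstruct(2)) / np.linalg.norm(U_snap)
```

    In the paper's setting, the temporal coefficients of the leading modes are advanced by a low-order dynamical model, while higher-order modes are mimicked by Gaussian random time series for the incoherent part.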

  9. Empirical modeling of the fine particle fraction for carrier-based pulmonary delivery formulations

    PubMed Central

    Pacławski, Adam; Szlęk, Jakub; Lau, Raymond; Jachowicz, Renata; Mendyk, Aleksander

    2015-01-01

    In vitro study of the deposition of drug particles is commonly used during development of formulations for pulmonary delivery. The assay is demanding, complex, and depends on: properties of the drug and carrier particles, including size, surface characteristics, and shape; interactions between the drug and carrier particles and assay conditions, including flow rate, type of inhaler, and impactor. The aerodynamic properties of an aerosol are measured in vitro using impactors and in most cases are presented as the fine particle fraction, which is a mass percentage of drug particles with an aerodynamic diameter below 5 μm. In the present study, a model in the form of a mathematical equation was developed for prediction of the fine particle fraction. The feature selection was performed using the R-environment package “fscaret”. The input vector was reduced from a total of 135 independent variables to 28. During the modeling stage, techniques like artificial neural networks, genetic programming, rule-based systems, and fuzzy logic systems were used. The 10-fold cross-validation technique was used to assess the generalization ability of the models created. The model obtained had good predictive ability, which was confirmed by a root-mean-square error and normalized root-mean-square error of 4.9 and 11%, respectively. Moreover, validation of the model using external experimental data was performed, and resulted in a root-mean-square error and normalized root-mean-square error of 3.8 and 8.6%, respectively. PMID:25653522

  10. New insights into Hoogsteen base pairs in DNA duplexes from a structure-based survey.

    PubMed

    Zhou, Huiqing; Hintze, Bradley J; Kimsey, Isaac J; Sathyamoorthy, Bharathwaj; Yang, Shan; Richardson, Jane S; Al-Hashimi, Hashim M

    2015-04-20

    Hoogsteen (HG) base pairs (bps) provide an alternative pairing geometry to Watson-Crick (WC) bps and can play unique functional roles in duplex DNA. Here, we use structural features unique to HG bps (syn purine base, HG hydrogen bonds and constricted C1'-C1' distance across the bp) to search for HG bps in X-ray structures of DNA duplexes in the Protein Data Bank. The survey identifies 106 A•T and 34 G•C HG bps in DNA duplexes, many of which are undocumented in the literature. It also uncovers HG-like bps with syn purines lacking HG hydrogen bonds or constricted C1'-C1' distances that are analogous to conformations that have been proposed to populate the WC-to-HG transition pathway. The survey reveals HG preferences similar to those observed for transient HG bps in solution by nuclear magnetic resonance, including stronger preferences for A•T versus G•C bps, TA versus GG steps, and also suggests enrichment at terminal ends with a preference for 5'-purine. HG bps induce small local perturbations in neighboring bps and, surprisingly, a small but significant degree of DNA bending (∼14°) directed toward the major groove. The survey provides insights into the preferences and structural consequences of HG bps in duplex DNA. PMID:25813047

  11. [Research on ECG de-noising method based on ensemble empirical mode decomposition and wavelet transform using improved threshold function].

    PubMed

    Ye, Linlin; Yang, Dan; Wang, Xu

    2014-06-01

    A de-noising method for electrocardiogram (ECG) signals based on ensemble empirical mode decomposition (EEMD) and wavelet threshold de-noising theory is proposed in this paper. We decomposed noised ECG signals with the proposed method using the EEMD and calculated a series of intrinsic mode functions (IMFs). Then we selected IMFs and reconstructed them to realize the de-noising for ECG. The processed ECG signals were filtered again with wavelet transform using an improved threshold function. In the experiments, the MIT-BIH ECG database was used for evaluating the performance of the proposed method, comparing it with de-noising methods based on EEMD alone and on wavelet transform with the improved threshold function alone, in terms of signal-to-noise ratio (SNR) and mean square error (MSE). The results showed that the ECG waveforms de-noised with the proposed method were smooth and the amplitudes of ECG features did not attenuate. In conclusion, the method discussed in this paper can realize ECG de-noising while keeping the characteristics of the original ECG signal. PMID:25219236
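
    The paper does not spell out its improved threshold function here, so the sketch below uses one common "semi-soft" compromise between hard and soft thresholding to show the idea: coefficients below the threshold are zeroed, while large coefficients are shrunk by less than pure soft thresholding, reducing its constant bias. The `alpha` parameter is an illustrative choice.

```python
import numpy as np

def improved_threshold(w, thr, alpha=0.5):
    """Semi-soft threshold: zero below thr; shrink larger coefficients by
    only alpha*thr instead of the full thr used by soft thresholding.
    (Illustrative form, not the exact function from the paper.)"""
    out = np.zeros_like(w)
    keep = np.abs(w) > thr
    out[keep] = np.sign(w[keep]) * (np.abs(w[keep]) - alpha * thr)
    return out

coeffs = np.array([-3.0, -0.2, 0.1, 0.5, 2.0])   # toy wavelet coefficients
den = improved_threshold(coeffs, thr=0.6)
```

    In the full method, such a function is applied to the wavelet coefficients of the EEMD-reconstructed signal before the inverse transform.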

  12. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    PubMed

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383

  13. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations

    PubMed Central

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J. Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383
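
    A minimal sketch of EOF-based merging on synthetic data: the two "products" below are noisy copies of a common seasonal LE signal (grid size, noise levels, and the seasonal form are invented). Stacking the products and reconstructing from the leading EOF mode keeps the variability the products share while suppressing product-specific noise, which is the intuition behind the fusion.

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_site = 365, 22
truth = 100 + 30 * np.sin(2 * np.pi * np.arange(n_time) / 365)[:, None] \
        * np.ones((1, n_site))
prod_a = truth + rng.normal(0, 10, truth.shape)   # e.g. a MOD16-like product
prod_b = truth + rng.normal(0, 10, truth.shape)   # e.g. a PT-JPL-like estimate

# EOF analysis of the stacked anomaly field; keep the leading shared mode
stacked = np.hstack([prod_a, prod_b])
mean = stacked.mean(axis=0)
U, S, Vt = np.linalg.svd(stacked - mean, full_matrices=False)
fused_full = mean + np.outer(U[:, 0] * S[0], Vt[0])
fused = 0.5 * (fused_full[:, :n_site] + fused_full[:, n_site:])

rmse_a = np.sqrt(np.mean((prod_a - truth) ** 2))
rmse_f = np.sqrt(np.mean((fused - truth) ** 2))
```

    The paper's modified EOF scheme is more elaborate (and is validated against EC observations), but the same mechanism, shared modes retained and independent errors averaged out, drives the improved consistency it reports.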

  14. Chemical and physical influences on aerosol activation in liquid clouds: an empirical study based on observations from the Jungfraujoch, Switzerland

    NASA Astrophysics Data System (ADS)

    Hoyle, C. R.; Webster, C. S.; Rieder, H. E.; Hammer, E.; Gysel, M.; Bukowiecki, N.; Weingartner, E.; Steinbacher, M.; Baltensperger, U.

    2015-06-01

    A simple empirical model to predict the number of aerosols which activate to form cloud droplets in a warm, free tropospheric cloud has been established, based on data from four summertime Cloud and Aerosol Characterisation Experiments (CLACE) campaigns at the Jungfraujoch (JFJ). It is shown that 76% of the observed variance in droplet numbers can be represented by a model accounting only for the number of potential CCN (defined as number of particles larger than 90 nm in diameter), while the mean errors in the model representation may be reduced by the addition of further explanatory variables, such as the mixing ratios of O3, CO and the height of the measurements above cloud base. The model has similar ability to represent the observed droplet numbers in each of the individual years, as well as for the two predominant local wind directions at the JFJ (north west and south east). Given the central European location of the JFJ, with air masses in summer being representative of the free troposphere with regular boundary layer in-mixing via convection, we expect that this model is applicable to warm, free tropospheric clouds over the European continent.

  15. On the use of Empirical Data to Downscale Non-scientific Scepticism About Results From Complex Physical Based Models

    NASA Astrophysics Data System (ADS)

    Germer, S.; Bens, O.; Hüttl, R. F.

    2008-12-01

    The scepticism of non-scientific local stakeholders about results from complex physically based models is a major problem concerning the development and implementation of local climate change adaptation measures. This scepticism originates from the high complexity of such models. Local stakeholders perceive complex models as black-box models, as it is impossible to grasp all underlying assumptions and mathematically formulated processes at a glance. The use of physically based models is, however, indispensable to study complex underlying processes and to predict future environmental changes. The increase of climate change adaptation efforts following the release of the latest IPCC report indicates that the communication of facts about what has already changed is an appropriate tool to trigger climate change adaptation. Therefore we suggest increasing the practice of empirical data analysis in addition to modelling efforts. The analysis of time series can generate results that are easier to comprehend for non-scientific stakeholders. Temporal trends and seasonal patterns of selected hydrological parameters (precipitation, evapotranspiration, groundwater levels and river discharge) can be identified, and the dependence of trends and seasonal patterns on land use, topography and soil type can be highlighted. A discussion of lag times between the hydrological parameters can increase the awareness of local stakeholders of delayed environmental responses.

  16. A Cutting Pattern Recognition Method for Shearers Based on Improved Ensemble Empirical Mode Decomposition and a Probabilistic Neural Network

    PubMed Central

    Xu, Jing; Wang, Zhongbin; Tan, Chao; Si, Lei; Liu, Xinhua

    2015-01-01

    In order to guarantee the stable operation of shearers and promote construction of an automatic coal mining working face, an online cutting pattern recognition method with high accuracy and speed based on Improved Ensemble Empirical Mode Decomposition (IEEMD) and Probabilistic Neural Network (PNN) is proposed. An industrial microphone is installed on the shearer and the cutting sound is collected as the recognition criterion to overcome the disadvantages of giant size, contact measurement and low identification rate of traditional detectors. To avoid end-point effects and get rid of undesirable intrinsic mode function (IMF) components in the initial signal, IEEMD is conducted on the sound. The end-point continuation based on the practical storage data is performed first to overcome the end-point effect. Next the average correlation coefficient, which is calculated by the correlation of the first IMF with others, is introduced to select essential IMFs. Then the energy and standard deviation of the remainder IMFs are extracted as features and PNN is applied to classify the cutting patterns. Finally, a simulation example, with an accuracy of 92.67%, and an industrial application prove the efficiency and correctness of the proposed method. PMID:26528985

  17. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
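    An explicit-scheme Perona-Malik-type diffusion of the kind whose parameters this study explores can be sketched as follows. This is a minimal numpy version with periodic boundaries; the exponential diffusivity, `kappa`, `dt`, and iteration count are illustrative choices, not the paper's exact configuration:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Explicit nonlinear diffusion: diffusivity g(|grad u|) shuts
    smoothing off where gradients (edges) are large. dt <= 0.25
    keeps the 2-D explicit scheme stable."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)  # Perona-Malik diffusivity
    for _ in range(n_iter):
        # forward differences to the four neighbours (periodic via roll)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # update: flux-weighted sum of neighbour differences
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

On a noisy flat region this smooths aggressively (all differences are small, so g ≈ 1), while across a strong edge g ≈ 0 and the edge location is preserved, which is exactly the property the abstract asks of a denoiser.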

  18. A Cutting Pattern Recognition Method for Shearers Based on Improved Ensemble Empirical Mode Decomposition and a Probabilistic Neural Network.

    PubMed

    Xu, Jing; Wang, Zhongbin; Tan, Chao; Si, Lei; Liu, Xinhua

    2015-01-01

    In order to guarantee the stable operation of shearers and promote construction of an automatic coal mining working face, an online cutting pattern recognition method with high accuracy and speed based on Improved Ensemble Empirical Mode Decomposition (IEEMD) and Probabilistic Neural Network (PNN) is proposed. An industrial microphone is installed on the shearer and the cutting sound is collected as the recognition criterion to overcome the disadvantages of large size, contact measurement and low identification rate of traditional detectors. To avoid end-point effects and get rid of undesirable intrinsic mode function (IMF) components in the initial signal, IEEMD is conducted on the sound. The end-point continuation based on the practical storage data is performed first to overcome the end-point effect. Next, the average correlation coefficient, which is calculated from the correlation of the first IMF with the others, is introduced to select essential IMFs. Then the energy and standard deviation of the remaining IMFs are extracted as features and PNN is applied to classify the cutting patterns. Finally, a simulation example, with an accuracy of 92.67%, and an industrial application prove the efficiency and correctness of the proposed method. PMID:26528985

  19. A UNIFIED EMPIRICAL MODEL FOR INFRARED GALAXY COUNTS BASED ON THE OBSERVED PHYSICAL EVOLUTION OF DISTANT GALAXIES

    SciTech Connect

    Bethermin, Matthieu; Daddi, Emanuele; Sargent, Mark T.; Elbaz, David; Mullaney, James; Pannella, Maurilio; Hezaveh, Yashar; Le Borgne, Damien; Buat, Veronique; Charmandaris, Vassilis; Lagache, Guilaine; Scott, Douglas

    2012-10-01

    We reproduce the mid-infrared to radio galaxy counts with a new empirical model based on our current understanding of the evolution of main-sequence (MS) and starburst (SB) galaxies. We rely on a simple spectral energy distribution (SED) library based on Herschel observations: a single SED for the MS and another one for SB, getting warmer with redshift. Our model is able to reproduce recent measurements of galaxy counts performed with Herschel, including counts per redshift slice. This agreement demonstrates the power of our 2-Star-Formation Modes (2SFM) decomposition in describing the statistical properties of infrared sources and their evolution with cosmic time. We discuss the relative contribution of MS and SB galaxies to the number counts at various wavelengths and flux densities. We also show that MS galaxies are responsible for a bump in the 1.4 GHz radio counts around 50 μJy. Material of the model (predictions, SED library, mock catalogs, etc.) is available online.

  20. Empirical study on neural network based predictive techniques for automatic number plate recognition

    NASA Astrophysics Data System (ADS)

    Shashidhara, M. S.; Indrakumar, S. S.

    2011-10-01

    The objective of this study is to provide an easy, accurate and effective technology for Bangalore city traffic control, based on image processing and laser beam technology. The core concept is an automatic number plate recognition system: when a vehicle breaks the traffic rules at a signal, its number plate is recognized, the owner's details are fetched automatically from the RTO office database, and a notice with penalty information is sent to the vehicle owner by email and SMS. In this paper, we use cameras with zoom options and laser beams to obtain accurate pictures, and then apply image processing techniques such as edge detection to locate the vehicle, identify the position of the number plate, and read plates of several kinds: plain number plates, plates with additional information, and plates in different fonts. The database of the vehicle registration office is accessed to identify the name, address and other information associated with the vehicle number, and the database is updated to record the violation and penalty. A feed-forward artificial neural network is used for OCR. This step is particularly important for glyphs that are visually similar, such as '8' and '9', and results in training sets of between 25,000 and 40,000 samples. Over-training of the neural network is prevented by Bayesian regularization. The network output is set to 0.05 when the input is not the desired glyph, and 0.95 for a correct input.

  1. An empirical approach to predicting long term behavior of metal particle based recording media

    NASA Technical Reports Server (NTRS)

    Hadad, Allan S.

    1991-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O, resulting in acicular, polycrystalline, body centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed in order for iron particles to be considered as a viable recording medium for long term archival (i.e., 25+ years) information storage. The primary means of establishing stability is through passivation, or controlled oxidation, of the iron particle's surface. Since iron particles used for magnetic recording are small, additional oxidation has a direct impact on performance, especially where archival storage of recorded information for long periods of time is important. Further stabilization chemistry/processes had to be developed to guarantee that iron particles could be considered as a viable long term recording medium. In an effort to retard the diffusion of iron ions through the oxide layer, other elements such as silicon, aluminum, and chromium have been added to the base iron to promote denser scale formation, to alleviate some of the non-stoichiometric behavior of the oxide, or both. The presence of water vapor has been shown to disrupt the passive layer, subsequently increasing the oxidation rate of the iron. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity on silicon-containing iron particles between 50-120 °C and 3-89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to predictive capability, is discussed.

  2. An empirical approach to selecting community-based alcohol interventions: combining research evidence, rural community views and professional opinion

    PubMed Central

    2012-01-01

    Background Given limited research evidence for community-based alcohol interventions, this study examines the intervention preferences of rural communities and alcohol professionals, and factors that influence their choices. Method Community preferences were identified by a survey of randomly selected individuals across 20 regional Australian communities. The preferences of alcohol professionals were identified by a survey of randomly selected members of the Australasian Professional Society on Alcohol and Other Drugs. To identify preferred interventions and the extent of support for them, a budget allocation exercise was embedded in both surveys, asking respondents to allocate a given budget to different interventions. Tobit regression models were estimated to identify the characteristics that explain differences in intervention preferences. Results Community respondents selected school programs most often (88.0%) and allocated it the largest proportion of funds, followed by promotion of safer drinking (71.3%), community programs (61.4%) and police enforcement of alcohol laws (60.4%). Professionals selected GP training most often (61.0%) and allocated it the largest proportion of funds, followed by school programs (36.6%), community programs (33.8%) and promotion of safer drinking (31.7%). Community views were susceptible to response bias. There were no significant predictors of professionals' preferences. Conclusions In the absence of sufficient research evidence for effective community-based alcohol interventions, rural communities and professionals both strongly support school programs, promotion of safer drinking and community programs. Rural communities also supported police enforcement of alcohol laws and professionals supported GP training. The impact of a combination of these strategies needs to be rigorously evaluated. PMID:22233608

  3. GIS-based analysis and modelling with empirical and remotely-sensed data on coastline advance and retreat

    NASA Astrophysics Data System (ADS)

    Ahmad, Sajid Rashid

    With the understanding that far more research remains to be done on the development and use of innovative and functional geospatial techniques and procedures to investigate coastline changes, this thesis focussed on the integration of remote sensing, geographical information systems (GIS) and modelling techniques to provide meaningful insights on the spatial and temporal dynamics of coastline changes. One of the unique strengths of this research was the parameterization of the GIS with long-term empirical and remote sensing data. Annual empirical data from 1941-2007 were analyzed by the GIS, and then modelled with statistical techniques. Data were also extracted from Landsat TM and ETM+ images. The band ratio method was used to extract the coastlines. Topographic maps were also used to extract digital map data. All data incorporated into ArcGIS 9.2 were analyzed with various modules, including Spatial Analyst, 3D Analyst, and Triangulated Irregular Networks. The Digital Shoreline Analysis System was used to analyze and predict rates of coastline change. GIS results showed the spatial locations along the coast that will either advance or retreat over time. The linear regression results highlighted temporal changes which are likely to occur along the coastline. Box-Jenkins modelling procedures were utilized to determine statistical models which best described the time series (1941-2007) of coastline change data. After several iterations and goodness-of-fit tests, second-order spatial cyclic autoregressive models, first-order autoregressive models and autoregressive moving average models were identified as being appropriate for describing the deterministic and random processes operating in Guyana's coastal system. The models highlighted not only cyclical patterns in advance and retreat of the coastline, but also the existence of short and long-term memory processes. 
Long-term memory processes could be associated with mudshoal propagation and stabilization while short
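    The simplest member of the Box-Jenkins model family named in this record, a first-order autoregressive model, can be sketched with a Yule-Walker-style estimate. This is a generic numpy-only illustration of fitting x_t = phi * x_{t-1} + e_t to a (demeaned) annual series, not the thesis's actual models or data:

```python
import numpy as np

def fit_ar1(series):
    """Estimate the AR(1) coefficient phi and residual standard
    deviation from a time series (Yule-Walker / lag-1 regression)."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()  # work with deviations from the mean
    # lag-1 regression of x_t on x_{t-1} gives phi
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi * x[:-1]
    return phi, resid.std()
```

A phi near +1 would indicate the long-memory persistence the abstract associates with mudshoal propagation, while phi near 0 indicates little year-to-year memory; model order and diagnostics (ACF/PACF, goodness-of-fit) follow the usual Box-Jenkins iteration the record describes.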

  4. Lessons Learned: Cultural and linguistic enhancement of surveys through community-based participatory research

    PubMed Central

    Formea, Christine M.; Mohamed, Ahmed A.; Hassan, Abdullahi; Osman, Ahmed; Weis, Jennifer A.; Sia, Irene G.; Wieland, Mark L.

    2014-01-01

    Background Surveys are frequently implemented in community-based participatory research (CBPR), but adaptation and translation of surveys can be logistically and methodologically challenging when working with immigrant and refugee populations. Objective To describe a process of participatory survey adaptation and translation. Methods Within an established CBPR partnership, a survey about diabetes was adapted for health literacy and local relevance and then translated through a process of forward translation, group deliberation, and back translation. Lessons Learned The group deliberation process was the most time-intensive and important component of the process. The process enhanced community ownership of the larger project while maximizing local applicability of the product. Conclusions A participatory process of survey adaptation and translation resulted in significant revisions to approximate semantic, cultural, and conceptual equivalence with the original surveys. This approach is likely to enhance community acceptance of the survey instrument during the implementation phase. PMID:25435559

  5. Teaching Margery and Julian in Anthology-Based Survey Courses

    ERIC Educational Resources Information Center

    Petersen, Zina

    2006-01-01

    Recognizing that many of us teach the medieval English women mystics Margery Kempe and Julian of Norwich in survey courses, this essay attempts to put these writers in context for teachers who may have only a passing familiarity with the period. Focusing on passages of their writings found in the Longman and Norton anthologies of British…

  6. Environmental surveys in Alaska based upon ERTS data

    NASA Technical Reports Server (NTRS)

    Miller, J. M.

    1974-01-01

    Alaskan applications of ERTS data are summarized. Areas discussed are: (1) land use; (2) archaeology; (3) vegetation mapping; (4) ice reporting and mapping; (5) permafrost; (6) mineral and oil exploration; (7) geological surveys; (8) seismology; (9) geological faults and structures; (10) hydrology and water resources; (11) glaciology; (12) water circulation in Cook Inlet; and (13) fish and mammal populations.

  7. Fall 1994 wildlife and vegetation survey, Norton Air Force Base, California

    SciTech Connect

    Not Available

    1994-12-15

    The fall 1994 wildlife and vegetation surveys were completed October 3-7, 1994, at Norton Air Force Base (AFB), California. Two biologists from CDM Federal Programs, the U.S. Environmental Protection Agency (EPA) regional biologist and the Oak Ridge National Laboratory (ORNL) lead biologist conducted the surveys. A habitat assessment of three Installation Restoration Project (IRP) sites at Norton Air Force Base was also completed during the fall survey period. The IRP sites include: Landfill No. 2 (Site 2); the Industrial Wastewater Treatment Plant (IWTP) area; and Former Fire Training Area No. 1 (Site 5). The assessments were designed to qualitatively characterize the sites of concern, identify potential ecological receptors, and provide information for Remedial Design/Remedial Action activities. A Reference Area (Santa Ana River Wash) and the base urban areas were also characterized. The reference area assessment was performed to provide a baseline for comparison with the IRP site habitats. The fall 1994 survey is the second of up to four surveys that may be completed. In order to develop a complete understanding of all plant and animal species using the base, these surveys were planned to be conducted over four seasons. Species composition can vary widely during the course of a year in Southern California, and therefore, seasonal surveys will provide the most complete and reliable data to address changes in habitat structure and wildlife use of the site. Subsequent surveys will focus on seasonal wildlife observations and a spring vegetation survey.

  8. Autonomous and Remote-Controlled Airborne and Ground-Based Robotic Platforms for Adaptive Geophysical Surveying

    NASA Astrophysics Data System (ADS)

    Spritzer, J. M.; Phelps, G. A.

    2011-12-01

    Low-cost autonomous and remote-controlled robotic platforms have opened the door to precision-guided geophysical surveying. Over the past two years, the U.S. Geological Survey, Senseta, NASA Ames Research Center, and Carnegie Mellon University Silicon Valley have developed and deployed small autonomous and remotely controlled vehicles for geophysical investigations. The purpose of this line of investigation is to 1) increase the analytical capability, resolution, and repeatability, and 2) decrease the time, and potentially the cost and man-power, necessary to conduct near-surface geophysical surveys. Current technology has advanced to the point where vehicles can perform geophysical surveys autonomously, freeing the geoscientist to process and analyze the incoming data in near-real time. This has enabled geoscientists to monitor survey parameters; process, analyze and interpret the incoming data; and test geophysical models in the same field session. This new approach, termed adaptive surveying, provides the geoscientist with choices of how the remainder of the survey should be conducted. Autonomous vehicles follow pre-programmed survey paths, which makes it easy to repeat surveys on the same path over large areas without the operator fatigue and error that plague man-powered surveys. While initial deployments of autonomous systems required a larger field crew than a man-powered survey, operational experience will decrease costs and man-power requirements over time. Using a low-cost, commercially available chassis as the base for autonomous surveying robotic systems promises higher precision and efficiency than human-powered techniques. An experimental survey successfully demonstrated the adaptive techniques described. A magnetic sensor was mounted on a small rover, which autonomously drove a prescribed course designed to provide an overview of the study area. 
Magnetic data was relayed to the base station periodically, processed and gridded. A

  9. Quantifying Stream Habitat: Relative Effort Versus Quality of Competing Remote Sensing & Ground-Based Survey Techniques

    NASA Astrophysics Data System (ADS)

    Bangen, S. G.; Wheaton, J. M.; Bouwes, N.

    2010-12-01

    Numerous field and analytical methods exist to assist in the quantification of the quantity and quality of in-stream habitat for salmonids. These methods range from field sketches or ‘tape and stick’ ground-based surveys, through to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although some investigators have assessed the quality of specific individual survey methods, the inter-comparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been elucidated. In this study, we seek to quantify the relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from different ground-based and remotely sensed surveys of varying degrees of sophistication, as well as enumerate the effort and cost in completing the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Three different traditional (‘tape and stick’) survey techniques were used, including a variant using map-grade GPS. Complete topographic/bathymetric surveys were attempted at each site using separate rtkGPS, total station, ground-based LiDAR, boat-based echo-sounding (w/ ADCP), traditional airborne LiDAR, and imagery-based spectral methods. Separate, georectified aerial imagery surveys were acquired using a tethered blimp, a drone UAV, and a traditional fixed-wing aircraft. Preliminary results from the surveys highlight that no single technique works across the full range of conditions where stream habitat surveys are needed. The results are helpful for understanding the strengths and weaknesses of each approach in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete quantification of habitat conditions in rivers.

  10. Intelligence in Bali--A Case Study on Estimating Mean IQ for a Population Using Various Corrections Based on Theory and Empirical Findings

    ERIC Educational Resources Information Center

    Rindermann, Heiner; te Nijenhuis, Jan

    2012-01-01

    A high-quality estimate of the mean IQ of a country requires giving a well-validated test to a nationally representative sample, which usually is not feasible in developing countries. So, we used a convenience sample and four corrections based on theory and empirical findings to arrive at a good-quality estimate of the mean IQ in Bali. Our study…

  11. The dappled nature of causes of psychiatric illness: replacing the organic-functional/hardware-software dichotomy with empirically based pluralism.

    PubMed

    Kendler, K S

    2012-04-01

    Our tendency to see the world of psychiatric illness in dichotomous and opposing terms has three major sources: the philosophy of Descartes, the state of neuropathology in late nineteenth century Europe (when disorders were divided into those with and without demonstrable pathology and labeled, respectively, organic and functional), and the influential concept of computer functionalism wherein the computer is viewed as a model for the human mind-brain system (brain=hardware, mind=software). These mutually re-enforcing dichotomies, which have had a pernicious influence on our field, make a clear prediction about how 'difference-makers' (aka causal risk factors) for psychiatric disorders should be distributed in nature. In particular, are psychiatric disorders like our laptops, which when they dysfunction, can be cleanly divided into those with software versus hardware problems? I propose 11 categories of difference-makers for psychiatric illness from molecular genetics through culture and review their distribution in schizophrenia, major depression and alcohol dependence. In no case do these distributions resemble that predicted by the organic-functional/hardware-software dichotomy. Instead, the causes of psychiatric illness are dappled, distributed widely across multiple categories. We should abandon Cartesian and computer-functionalism-based dichotomies as scientifically inadequate and an impediment to our ability to integrate the diverse information about psychiatric illness our research has produced. Empirically based pluralism provides a rigorous but dappled view of the etiology of psychiatric illness. Critically, it is based not on how we wish the world to be but how the difference-makers for psychiatric illness are in fact distributed. PMID:22230881

  12. The dappled nature of causes of psychiatric illness: replacing the organic–functional/hardware–software dichotomy with empirically based pluralism

    PubMed Central

    Kendler, KS

    2012-01-01

    Our tendency to see the world of psychiatric illness in dichotomous and opposing terms has three major sources: the philosophy of Descartes, the state of neuropathology in late nineteenth century Europe (when disorders were divided into those with and without demonstrable pathology and labeled, respectively, organic and functional), and the influential concept of computer functionalism wherein the computer is viewed as a model for the human mind–brain system (brain = hardware, mind = software). These mutually re-enforcing dichotomies, which have had a pernicious influence on our field, make a clear prediction about how ‘difference-makers’ (aka causal risk factors) for psychiatric disorders should be distributed in nature. In particular, are psychiatric disorders like our laptops, which when they dysfunction, can be cleanly divided into those with software versus hardware problems? I propose 11 categories of difference-makers for psychiatric illness from molecular genetics through culture and review their distribution in schizophrenia, major depression and alcohol dependence. In no case do these distributions resemble that predicted by the organic–functional/hardware–software dichotomy. Instead, the causes of psychiatric illness are dappled, distributed widely across multiple categories. We should abandon Cartesian and computer-functionalism-based dichotomies as scientifically inadequate and an impediment to our ability to integrate the diverse information about psychiatric illness our research has produced. Empirically based pluralism provides a rigorous but dappled view of the etiology of psychiatric illness. Critically, it is based not on how we wish the world to be but how the difference-makers for psychiatric illness are in fact distributed. PMID:22230881

  13. In silico structure-based screening of versatile P-glycoprotein inhibitors using polynomial empirical scoring functions.

    PubMed

    Shityakov, Sergey; Förster, Carola

    2014-01-01

    P-glycoprotein (P-gp) is an ATP (adenosine triphosphate)-binding cassette transporter that causes multidrug resistance of various chemotherapeutic substances by active efflux from mammalian cells. P-gp plays a pivotal role in limiting drug absorption and distribution in different organs, including the intestines and brain. Thus, the prediction of P-gp-drug interactions is of vital importance in assessing drug pharmacokinetic and pharmacodynamic properties. To find the strongest P-gp blockers, we performed an in silico structure-based screening of a P-gp inhibitor library (1,300 molecules) by the gradient optimization method, using polynomial empirical scoring (POLSCORE) functions. We report a strong correlation (r² = 0.80, F = 16.27, n = 6, P < 0.0157) of inhibition constants (Ki(exp) or pKi(exp); the experimental Ki or the negative decimal logarithm of Ki(exp)) converted from experimental IC50 (half maximal inhibitory concentration) values with POLSCORE-predicted constants (Ki(POLSCORE) or pKi(POLSCORE)), using a linear regression fitting technique. The hydrophobic interactions between P-gp and selected drug substances were detected as the main forces responsible for the inhibition effect. The results showed that this scoring technique might be useful in the virtual screening and filtering of databases of drug-like compounds at the early stage of drug development processes. PMID:24711707
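    The linear-regression comparison of experimental and predicted pKi values that yields the quoted r² can be sketched as follows. The data below are synthetic and the helper name is illustrative; only the regression-plus-r² procedure mirrors the abstract:

```python
import numpy as np

def fit_line(pki_exp, pki_pred):
    """Least-squares line pKi_pred ≈ a * pKi_exp + b and the
    coefficient of determination r^2 of that fit."""
    x = np.asarray(pki_exp, dtype=float)
    y = np.asarray(pki_pred, dtype=float)
    a, b = np.polyfit(x, y, 1)          # slope, intercept
    fitted = a * x + b
    ss_res = np.sum((y - fitted) ** 2)  # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)
    return a, b, 1 - ss_res / ss_tot
```

With only n = 6 points, as in the study, r² should be read alongside the F statistic and P value the authors quote rather than on its own.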

  14. Empirically-Based Crop Insurance for China: A Pilot Study in the Down-middle Yangtze River Area of China

    NASA Astrophysics Data System (ADS)

    Wang, Erda; Yu, Yang; Little, Bertis B.; Chen, Zhongxin; Ren, Jianqiang

    Factors that caused slow growth in crop insurance participation and its ultimate failure in China were multi-faceted, including high agricultural production risk, low participation rate, inadequate public awareness, high loss ratio, and insufficient and interrupted government financial support. Thus, a clear and present need for data-driven analyses and empirically-based risk management exists in China. In the present investigation, agricultural production data for two crops (corn, rice) in five counties in Jiangxi Province and Hunan Province were used to design a pilot crop insurance program in China. The program (1) provides 75% coverage, (2) reduces the farmer's premium rate by 55% compared to the catastrophic coverage most recently offered, and (3) uses the currently approved governmental premium subsidy level. Thus, a safety net for Chinese farmers that helps maintain agricultural production at a level of self-sufficiency, and that costs less than half as much as current plans, requires only one change to the program: ≥80% of producers must participate in an area.

  15. Empirically Based Profiles of the Early Literacy Skills of Children With Language Impairment in Early Childhood Special Education.

    PubMed

    Justice, Laura; Logan, Jessica; Kaderavek, Joan; Schmitt, Mary Beth; Tompkins, Virginia; Bartlett, Christopher

    2015-01-01

    The purpose of this study was to empirically determine whether specific profiles characterize preschool-aged children with language impairment (LI) with respect to their early literacy skills (print awareness, name-writing ability, phonological awareness, alphabet knowledge); the primary interest was to determine if one or more profiles suggested vulnerability for future reading problems. Participants were 218 children enrolled in early childhood special education classrooms, 95% of whom received speech-language services. Children were administered an assessment of early literacy skills in the fall of the academic year. Based on results of latent profile analysis, four distinct literacy profiles were identified, with the single largest profile (55% of children) representing children with generally poor literacy skills across all areas examined. Children in the two low-risk categories had higher oral language skills than those in the high-risk and moderate-risk profiles. Across three of the four early literacy measures, children with language as their primary disability had higher scores than those with LI concomitant with other disabilities. These findings indicate that there are specific profiles of early literacy skills among children with LI, with about one half of children exhibiting a profile indicating potential susceptibility for future reading problems. PMID:24232733

  16. An improved empirical model of electron and ion fluxes at geosynchronous orbit based on upstream solar wind conditions

    DOE PAGES Beta

    Denton, M. H.; Henderson, M. G.; Jordanova, V. K.; Thomsen, M. F.; Borovsky, J. E.; Woodroffe, J.; Hartley, D. P.; Pitchford, D.

    2016-07-27

    In this study, a new empirical model of the electron fluxes and ion fluxes at geosynchronous orbit (GEO) is introduced, based on observations by Los Alamos National Laboratory (LANL) satellites. The model provides flux predictions in the energy range ~1 eV to ~40 keV, as a function of local time, energy, and the strength of the solar wind electric field (the negative product of the solar wind speed and the z component of the magnetic field). Given appropriate upstream solar wind measurements, the model provides a forecast of the fluxes at GEO with a ~1 h lead time. Model predictions are tested against in-sample observations from LANL satellites and also against out-of-sample observations from the Compact Environmental Anomaly Sensor II detector on the AMC-12 satellite. The model does not reproduce all structure seen in the observations. However, for the intervals studied here (quiet and storm times) the normalized root-mean-square deviation is < ~0.3. It is intended that the model will improve forecasting of the spacecraft environment at GEO and also provide improved boundary/input conditions for physical models of the magnetosphere.

  17. Multi-fault diagnosis for rolling element bearings based on ensemble empirical mode decomposition and optimized support vector machines

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyuan; Zhou, Jianzhong

    2013-12-01

    This study presents a novel procedure based on ensemble empirical mode decomposition (EEMD) and an optimized support vector machine (SVM) for multi-fault diagnosis of rolling element bearings. The vibration signal is adaptively decomposed into a number of intrinsic mode functions (IMFs) by EEMD. Two types of features, the EEMD energy entropy and the singular values of the matrix whose rows are IMFs, are extracted. EEMD energy entropy is used to specify whether the bearing has faults or not. If the bearing has faults, singular values are input to a multi-class SVM optimized by inter-cluster distance in the feature space (ICDSVM) to specify the fault type. The proposed method was tested on a system with an electric motor which has two rolling bearings, with 8 normal working conditions and 48 fault working conditions. Five groups of experiments were done to evaluate the effectiveness of the proposed method. The results show that the proposed method outperforms the other methods, both those discussed in this paper and those reported in the literature.
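    The two feature types named in this record can be sketched in numpy, assuming the EEMD decomposition into IMFs has already been performed (the helper names are illustrative, not the authors' code):

```python
import numpy as np

def eemd_energy_entropy(imfs):
    """Energy entropy over IMFs: H = -sum p_i log p_i, where
    p_i is the fraction of total signal energy in IMF i."""
    energies = np.array([np.sum(np.asarray(imf) ** 2) for imf in imfs])
    p = energies / energies.sum()
    return -np.sum(p * np.log(p))

def singular_value_features(imfs):
    """Singular values of the matrix whose rows are the IMFs,
    used here as the fault-type feature vector for the SVM."""
    return np.linalg.svd(np.vstack(imfs), compute_uv=False)
```

A healthy bearing tends to spread energy more evenly across IMFs (higher entropy) than a faulted one whose energy concentrates in a few resonance bands, which is why the entropy alone can flag the fault/no-fault decision before the SVM classifies the fault type.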

  18. Combined magnetic and kinetic control of advanced tokamak steady state scenarios based on semi-empirical modelling

    NASA Astrophysics Data System (ADS)

    Moreau, D.; Artaud, J. F.; Ferron, J. R.; Holcomb, C. T.; Humphreys, D. A.; Liu, F.; Luce, T. C.; Park, J. M.; Prater, R.; Turco, F.; Walker, M. L.

    2015-06-01

    This paper shows that semi-empirical data-driven models based on a two-time-scale approximation for the magnetic and kinetic control of advanced tokamak (AT) scenarios can be advantageously identified from simulated rather than real data, and used for control design. The method is applied to the combined control of the safety factor profile, q(x), and normalized pressure parameter, βN, using DIII-D parameters and actuators (on-axis co-current neutral beam injection (NBI) power, off-axis co-current NBI power, electron cyclotron current drive power, and ohmic coil). The approximate plasma response model was identified from simulated open-loop data obtained using a rapidly converging plasma transport code, METIS, which includes an MHD equilibrium and current diffusion solver, and combines plasma transport nonlinearity with 0D scaling laws and 1.5D ordinary differential equations. The paper discusses the results of closed-loop METIS simulations, using the near-optimal ARTAEMIS control algorithm (Moreau D et al 2013 Nucl. Fusion 53 063020) for steady state AT operation. With feedforward plus feedback control, the steady state target q-profile and βN are satisfactorily tracked with a time scale of about 10 s, despite large disturbances applied to the feedforward powers and plasma parameters. The robustness of the control algorithm with respect to disturbances of the H&CD actuators and of plasma parameters such as the H-factor, plasma density and effective charge, is also shown.

  19. In silico structure-based screening of versatile P-glycoprotein inhibitors using polynomial empirical scoring functions

    PubMed Central

    Shityakov, Sergey; Förster, Carola

    2014-01-01

    P-glycoprotein (P-gp) is an ATP (adenosine triphosphate)-binding cassette transporter that causes multidrug resistance of various chemotherapeutic substances by active efflux from mammalian cells. P-gp plays a pivotal role in limiting drug absorption and distribution in different organs, including the intestines and brain. Thus, the prediction of P-gp–drug interactions is of vital importance in assessing drug pharmacokinetic and pharmacodynamic properties. To find the strongest P-gp blockers, we performed an in silico structure-based screening of a P-gp inhibitor library (1,300 molecules) by the gradient optimization method, using polynomial empirical scoring (POLSCORE) functions. We report a strong correlation (r2=0.80, F=16.27, n=6, P<0.0157) of inhibition constants (Kiexp or pKiexp; experimental Ki or negative decimal logarithm of Kiexp) converted from experimental IC50 (half maximal inhibitory concentration) values with POLSCORE-predicted constants (KiPOLSCORE or pKiPOLSCORE), using a linear regression fitting technique. The hydrophobic interactions between P-gp and the selected drug substances were identified as the main forces responsible for the inhibition effect. The results show that this scoring technique may be useful in the virtual screening and filtering of databases of drug-like compounds at an early stage of the drug development process. PMID:24711707
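The reported correlation amounts to an ordinary linear regression of POLSCORE-predicted against experimental pKi values; a minimal sketch, with the six pKi values below hypothetical rather than the study's data:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y against x."""
    slope, intercept = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (slope * x + intercept)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical pKi values for six inhibitors (the study regressed n = 6)
pKi_exp = np.array([4.2, 5.0, 5.8, 6.1, 6.9, 7.5])
pKi_polscore = np.array([4.5, 4.9, 5.5, 6.4, 6.6, 7.8])
print(r_squared(pKi_exp, pKi_polscore))
```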

  20. Discussion on climate oscillations: CMIP5 general circulation models versus a semi-empirical harmonic model based on astronomical cycles

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola

    2013-11-01

    Power spectra of global surface temperature (GST) records (available since 1850) reveal major periodicities at about 9.1, 10-11, 19-22 and 59-62 years. Equivalent oscillations are found in numerous multisecular paleoclimatic records. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC Fifth Assessment Report (AR5, 2013), are analyzed and found unable to reconstruct this variability. In particular, from 2000 to 2013.5 a GST plateau is observed while the GCMs predicted a warming rate of about 2 °C/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations more accurately fits the GST records at multiple time scales. For example, a quasi 60-year natural oscillation simultaneously explains the 1850-1880, 1910-1940 and 1970-2000 warming periods, the 1880-1910 and 1940-1970 cooling periods and the post-2000 GST plateau. This hypothesis implies that about 50% of the ~ 0.5 °C global surface warming observed from 1970 to 2000 was due to natural oscillations of the climate system, not to anthropogenic forcing as modeled by the CMIP3 and CMIP5 GCMs. Consequently, the climate sensitivity to CO2 doubling should be reduced by half, for example from the 2.0-4.5 °C range (as claimed by the IPCC, 2007) to 1.0-2.3 °C with a likely median of ~ 1.5 °C instead of ~ 3.0 °C. Also, modern paleoclimatic temperature reconstructions showing a larger preindustrial variability than the hockey-stick shaped temperature reconstructions developed in the early 2000s imply a weaker anthropogenic effect and a stronger solar contribution to climatic changes. The observed natural oscillations could be driven by astronomical forcings. The ~ 9.1 year oscillation appears to be a combination of long soli-lunar tidal oscillations, while quasi 10-11, 20 and 60 year oscillations are typically found among major solar and heliospheric oscillations driven mostly by Jupiter and Saturn movements. Solar models based
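A semi-empirical harmonic model of the kind described, with the oscillation periods fixed a priori, reduces to an ordinary least-squares fit. The sketch below assumes periods of 9.1, 10.4, 20 and 60 years and fits a synthetic, illustrative series rather than real GST data:

```python
import numpy as np

def fit_harmonics(t, y, periods):
    """Least-squares fit of y(t) to a linear trend plus sinusoids with
    fixed periods: y ~ a0 + a1*t + sum_k [b_k*cos + c_k*sin](2*pi*t/P_k)."""
    cols = [np.ones_like(t), t]
    for P in periods:
        cols.append(np.cos(2 * np.pi * t / P))
        cols.append(np.sin(2 * np.pi * t / P))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs, A @ coeffs

# Synthetic series: a trend plus a 60-year cycle (illustrative, not GST data)
t = np.arange(0.0, 164.0)                      # years since 1850
y = 0.005 * t + 0.1 * np.sin(2 * np.pi * t / 60.0)
coeffs, fitted = fit_harmonics(t, y, periods=[9.1, 10.4, 20.0, 60.0])
print(np.max(np.abs(fitted - y)))
```

Because the periods are fixed, the problem is linear in the amplitude and phase coefficients, which is what makes such harmonic models easy to calibrate against a temperature record.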

  1. Measuring coverage in MNCH: evaluation of community-based treatment of childhood illnesses through household surveys.

    PubMed

    Hazel, Elizabeth; Requejo, Jennifer; David, Julia; Bryce, Jennifer

    2013-01-01

    Community case management (CCM) is a strategy for training and supporting workers at the community level to provide treatment for the three major childhood diseases--diarrhea, fever (indicative of malaria), and pneumonia--as a complement to facility-based care. Many low- and middle-income countries are now implementing CCM and need to evaluate whether adoption of the strategy is associated with increases in treatment coverage. In this review, we assess the extent to which large-scale, national household surveys can serve as sources of baseline data for evaluating trends in community-based treatment coverage for childhood illnesses. Our examination of the questionnaires used in Demographic and Health Surveys (DHS) and Multiple Indicator Cluster Surveys (MICS) conducted between 2005 and 2010 in five sub-Saharan African countries shows that questions on care seeking that included a locally adapted option for a community-based provider were present in all the DHS surveys and in some MICS surveys. Most of the surveys also assessed whether appropriate treatments were available, but only one survey collected information on the place of treatment for all three illnesses. This absence of baseline data on treatment source in household surveys will limit efforts to evaluate the effects of the introduction of CCM strategies in the study countries. We recommend alternative analysis plans for assessing CCM programs using household survey data that depend on baseline data availability and on the timing of CCM policy implementation. PMID:23667329

  2. Survey of health department-based environmental epidemiology programs.

    PubMed Central

    Lapham, S C; Castle, S P

    1984-01-01

    A survey of state epidemiologists in all 50 states and New York City was conducted between October 1982 and January 1983 to determine which states had existing programs in environmental epidemiology. We identified 29 environmental epidemiology programs with at least one full-time state-funded staff member. The most common areas of responsibility included investigations of indoor air pollution (96 per cent), exposures to toxic or hazardous substances (93 per cent), and pesticide exposures (93 per cent). PMID:6465403

  3. South African maize production scenarios for 2055 using a combined empirical and process-based model approach

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Wilcove, D.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2011-12-01

    In South Africa, a semi-arid country with a diverse agricultural sector, climate change is projected to negatively impact staple crop production. Our study examines future impacts to maize, South Africa's most widely grown staple crop. Working at finer spatial resolution than previous studies, we combine the process-based DSSAT4.5 and the empirical MAXENT models to study future maize suitability. Climate scenarios were based on 9 GCMs run under SRES A2 and B1 emissions scenarios down-scaled (using self-organizing maps) to 5838 locations. Soil properties were derived from textural and compositional data linked to 26,422 landforms. DSSAT was run with typical dryland planting parameters and mean projected CO2 values. MAXENT was trained using aircraft-observed distributions and monthly climatologies derived from downscaled daily records, with future rainfall increased by 10% to simulate CO2-related water-use efficiency gains. We assessed model accuracy based on correlations between model output and a satellite-derived yield proxy (integrated NDVI), and the overlap of modeled and observed maize field distributions. DSSAT yields were linearly correlated to mean integrated NDVI (R2 = 0.38), while MAXENT's relationship was logistic. Binary suitability maps based on thresholding model outputs were slightly more accurate for MAXENT (88%) than for DSSAT (87%) when compared to the current maize field distribution. We created 18 suitability maps for each model (9 GCMs × 2 SRES) using projected changes relative to historical suitability thresholds. Future maps largely agreed in eastern South Africa, but disagreed strongly in the semi-arid west. Using a 95% confidence criterion (17 models agree), MAXENT showed a 241,305 km2 suitability loss relative to its modeled historical suitability, while DSSAT showed a potential loss of only 112,446 km2.
Even the smaller potential loss highlighted by DSSAT is uncertain, given that DSSAT's mean (across all 18 climate scenarios) projected yield

  4. Stellar population synthesis models between 2.5 and 5 μm based on the empirical IRTF stellar library

    NASA Astrophysics Data System (ADS)

    Röck, B.; Vazdekis, A.; Peletier, R. F.; Knapen, J. H.; Falcón-Barroso, J.

    2015-05-01

    We present the first single-burst stellar population models in the infrared wavelength range between 2.5 and 5 μm which are exclusively based on empirical stellar spectra. Our models take as input 180 spectra from the stellar IRTF (Infrared Telescope Facility) library. Our final single-burst stellar population models are calculated based on two different sets of isochrones and various types of initial mass functions of different slopes, ages larger than 1 Gyr and metallicities between [Fe/H] = -0.70 and 0.26. They are made available online to the scientific community on the MILES web page. We analyse the behaviour of the Spitzer [3.6]-[4.5] colour calculated from our single stellar population models and find only slight dependences on both metallicity and age. When comparing to the colours of observed early-type galaxies, we find a good agreement for older, more massive galaxies that resemble a single-burst population. Younger, less massive and more metal-poor galaxies show redder colours with respect to our models. This mismatch can be explained by a more extended star formation history of these galaxies which includes a metal-poor or/and young population. Moreover, the colours derived from our models agree very well with most other models available in this wavelength range. We confirm that the mass-to-light ratio determined in the Spitzer [3.6] μm band changes much less as a function of both age and metallicity than in the optical bands.

  5. Predicting Student Performance in Web-Based Distance Education Courses Based on Survey Instruments Measuring Personality Traits and Technical Skills

    ERIC Educational Resources Information Center

    Hall, Michael

    2008-01-01

    Two common web-based surveys, "Is Online Learning Right for Me?" and "What Technical Skills Do I Need?", were combined into a single survey instrument and given to 228 on-campus and 83 distance education students. The students were enrolled in four different classes (business, computer information services, criminal justice, and…

  6. Survey of Speech-Language Pathology Services in School-Based Settings National Study Final Report.

    ERIC Educational Resources Information Center

    Peters-Johnson, Cassandra

    1998-01-01

    This paper reports on a 1995 survey of 1,718 school-based speech-language pathologists. Survey questions addressed topics such as caseload characteristics, service delivery, bilingual/bicultural services, supervision, support personnel, shortages of speech-language pathologists, substitutes, and demographic characteristics and activities. (DB)

  7. Measuring Model-Based High School Science Instruction: Development and Application of a Student Survey

    ERIC Educational Resources Information Center

    Fulmer, Gavin W.; Liang, Ling L.

    2013-01-01

    This study tested a student survey to detect differences in instruction between teachers in a modeling-based science program and comparison group teachers. The Instructional Activities Survey measured teachers' frequency of modeling, inquiry, and lecture instruction. Factor analysis and Rasch modeling identified three subscales, Modeling and…

  8. The Mood of American Youth. Based on a 1983 Survey of American Youth.

    ERIC Educational Resources Information Center

    National Association of Secondary School Principals, Reston, VA.

    This 1984 report by the National Association of Secondary School Principals is based on a survey of 1,500 students from grades 7 through 12 selected by means of a probability sample by National Family Opinions, Inc. The report compares today's students with those surveyed in a similar study in 1974. Chapter 1, "Students and Their Schools,"…

  9. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    ERIC Educational Resources Information Center

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that the advantages of Web-based surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)…

  10. Motion Trajectories for Wide-area Surveying with a Rover-based Distributed Spectrometer

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Anderson, Gary; Wilson, Edmond

    2006-01-01

    A mobile ground survey application that employs remote sensing as a primary means of area coverage is highlighted. It is distinguished from mobile robotic area coverage problems that employ contact or proximity-based sensing. The focus is on a specific concept for performing mobile surveys in search of biogenic gases on planetary surfaces using a distributed spectrometer -- a rover-based instrument designed for wide measurement coverage of promising search areas. Navigation algorithms for executing circular and spiral survey trajectories are presented for wide-area distributed spectroscopy and evaluated based on area covered and distance traveled.

  11. Assessing changes to South African maize production areas in 2055 using empirical and process-based crop models

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2010-12-01

    Rising temperatures and altered precipitation patterns associated with climate change pose a significant threat to crop production, particularly in developing countries. In South Africa, a semi-arid country with a diverse agricultural sector, anthropogenic climate change is likely to affect staple crops and decrease food security. Here, we focus on maize production, South Africa’s most widely grown crop and one with high socio-economic value. We build on previous coarser-scaled studies by working at a finer spatial resolution and by employing two different modeling approaches: the process-based DSSAT Cropping System Model (CSM, version 4.5), and an empirical distribution model (Maxent). For climate projections, we use an ensemble of 10 general circulation models (GCMs) run under both high and low CO2 emissions scenarios (SRES A2 and B1). The models were down-scaled to historical climate records for 5838 quinary-scale catchments covering South Africa (mean area = 164.8 km2), using a technique based on self-organizing maps (SOMs) that generates precipitation patterns more consistent with observed gradients than those produced by the parent GCMs. Soil hydrological and mechanical properties were derived from textural and compositional data linked to a map of 26422 land forms (mean area = 46 km2), while organic carbon from 3377 soil profiles was mapped using regression kriging with 8 spatial predictors. CSM was run using typical management parameters for the several major dryland maize production regions, and with projected CO2 values. The Maxent distribution model was trained using maize locations identified using annual phenology derived from satellite images coupled with airborne crop sampling observations. Temperature and precipitation projections were based on GCM output, with an additional 10% increase in precipitation to simulate higher water-use efficiency under future CO2 concentrations. The two modeling approaches provide spatially explicit projections of

  12. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
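As a minimal illustration of the metamodeling idea, the sketch below fits a Gaussian radial-basis-function interpolator (one simple kriging-style surrogate) to samples of a stand-in "expensive" analysis function; all names and parameter values are illustrative assumptions, not from the survey:

```python
import numpy as np

def fit_rbf_surrogate(X, y, length_scale=0.3, nugget=1e-6):
    """Gaussian radial-basis-function interpolator: a simple kriging-style
    metamodel that stands in for an expensive analysis code."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2 * length_scale ** 2)) + nugget * np.eye(len(X))
    w = np.linalg.solve(K, y)           # kernel weights for the samples

    def predict(Xq):
        d2q = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2q / (2 * length_scale ** 2)) @ w

    return predict

# Cheap stand-in for an "expensive" computer analysis code
def expensive_code(x):
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(40, 2))        # design of experiments
surrogate = fit_rbf_surrogate(X, expensive_code(X))
Xq = rng.uniform(0.0, 1.0, size=(5, 2))        # new query points
print(np.max(np.abs(surrogate(Xq) - expensive_code(Xq))))
```

Once fitted, the surrogate can be evaluated thousands of times inside an optimizer at negligible cost, which is the central motivation the paper describes.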

  13. Modeling magnetic fields from a DC power cable buried beneath San Francisco Bay based on empirical measurements

    DOE PAGESBeta

    Kavet, Robert; Wyman, Megan T.; Klimley, A. Peter; Carretero, Luis

    2016-02-25

    Here, the Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable’s path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable’s contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured

  14. Modeling Magnetic Fields from a DC Power Cable Buried Beneath San Francisco Bay Based on Empirical Measurements.

    PubMed

    Kavet, Robert; Wyman, Megan T; Klimley, A Peter

    2016-01-01

    The Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable's path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable's contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured values. The modeling

  15. Modeling Magnetic Fields from a DC Power Cable Buried Beneath San Francisco Bay Based on Empirical Measurements

    PubMed Central

    Kavet, Robert; Wyman, Megan T.; Klimley, A. Peter

    2016-01-01

    The Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable’s path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable’s contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured values. The
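The "basic formulas" referred to in these abstracts are, in essence, the field of a long straight current-carrying conductor summed vectorially with the ambient geomagnetic field. The simplified sketch below assumes a horizontal geomagnetic field, an idealized infinite-wire geometry, an illustrative burial depth, and a 1000 A current (400 MW across the ±200 kV pole pair); none of these values are the study's fitted parameters:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def total_field_profile(x_m, current_a, depth_m, b_geo_nT):
    """Field magnitude along a survey line crossing a buried DC cable at
    x = 0, modeling the cable as an infinite straight wire and summing its
    field vectorially with a horizontal ambient geomagnetic field."""
    r = np.hypot(x_m, depth_m)                      # distance to the cable
    b_wire_nT = MU0 * current_a / (2 * np.pi * r) * 1e9
    bx = b_wire_nT * depth_m / r                    # azimuthal wire field,
    bz = -b_wire_nT * x_m / r                       # projected on x and z
    return np.hypot(b_geo_nT + bx, bz)

x = np.linspace(-50.0, 50.0, 501)                   # metres along the line
profile = total_field_profile(x, current_a=1000.0, depth_m=3.0,
                              b_geo_nT=48000.0)
anomaly_nT = profile.max() - profile.min()          # max minus min at crossing
print(anomaly_nT)
```

Regressing a measured crossing profile against this functional form is what lets otherwise unobservable cable characteristics (e.g. effective current and depth) be inferred.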

  16. A hybrid model for PM₂.₅ forecasting based on ensemble empirical mode decomposition and a general regression neural network.

    PubMed

    Zhou, Qingping; Jiang, Haiyan; Wang, Jianzhou; Zhou, Jianling

    2014-10-15

    Exposure to high concentrations of fine particulate matter (PM₂.₅) can cause serious health problems because PM₂.₅ contains microscopic solid or liquid droplets that are sufficiently small to be ingested deep into human lungs. Thus, daily prediction of PM₂.₅ levels is notably important for regulatory plans that inform the public and restrict social activities in advance when harmful episodes are foreseen. A hybrid EEMD-GRNN (ensemble empirical mode decomposition-general regression neural network) model based on data preprocessing and analysis is proposed in this paper for one-day-ahead prediction of PM₂.₅ concentrations. The EEMD part is utilized to decompose the original PM₂.₅ data into several intrinsic mode functions (IMFs), while the GRNN part is used for the prediction of each IMF. The hybrid EEMD-GRNN model is trained using input variables obtained from a principal component regression (PCR) model to remove redundancy. These input variables accurately and succinctly reflect the relationships between PM₂.₅ and both air quality and meteorological data. The model is trained with data from January 1 to November 1, 2013 and is validated with data from November 2 to November 21, 2013 in Xi'an, China. The experimental results show that the developed hybrid EEMD-GRNN model outperforms a single GRNN model without EEMD, a multiple linear regression (MLR) model, a PCR model, and a traditional autoregressive integrated moving average (ARIMA) model. The hybrid model, with fast and accurate results, can be used to develop rapid air quality warning systems. PMID:25089688
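The GRNN component is essentially a Nadaraya-Watson kernel smoother: each prediction is a Gaussian-weighted average of training targets. A minimal sketch on toy data (the features, targets, and bandwidth below are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.15):
    """General regression neural network: predictions are Gaussian-kernel
    weighted averages of the training targets (Nadaraya-Watson form)."""
    d2 = np.sum((X_query[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    w = np.exp(-d2 / (2 * sigma ** 2))          # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)        # summation / output layer

# Toy stand-in for one IMF's supervised history (features -> next value)
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = X[:, 0] ** 2 + 0.5 * X[:, 1]
pred = grnn_predict(X, y, X[:5])
print(pred)
```

In the hybrid scheme, one such smoother would be trained per IMF and the per-IMF predictions summed to reconstruct the one-day-ahead PM₂.₅ forecast.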

  17. Development of An Empirical Water Quality Model for Stormwater Based on Watershed Land Use in Puget Sound

    SciTech Connect

    Cullinan, Valerie I.; May, Christopher W.; Brandenberger, Jill M.; Judd, Chaeli; Johnston, Robert K.

    2007-03-29

    The Sinclair and Dyes Inlet watershed is located on the west side of Puget Sound in Kitsap County, Washington, U.S.A. (Figure 1). The Puget Sound Naval Shipyard (PSNS), the U.S. Environmental Protection Agency (USEPA), the Washington State Department of Ecology (WA-DOE), Kitsap County, the City of Bremerton, the City of Bainbridge Island, the City of Port Orchard, and the Suquamish Tribe have joined in a cooperative effort to evaluate water-quality conditions in the Sinclair-Dyes Inlet watershed and correct identified problems. A major focus of this project, known as Project ENVVEST, is to develop Water Clean-up (TMDL) Plans for constituents listed on the 303(d) list within the Sinclair and Dyes Inlet watershed. Segments within the watershed were listed on the State of Washington's 1998 303(d) list because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue (WA-DOE 2003). Stormwater loading was identified by ENVVEST as one potential source of sediment contamination, which lacked sufficient data for a contaminant mass balance calculation for the watershed. This paper summarizes the development of an empirical model for estimating contaminant concentrations in all streams discharging into Sinclair and Dyes Inlets based on watershed land use, 18 storm events, and wet/dry season baseflow conditions between November 2002 and May 2005. Stream pollutant concentrations, along with estimates for outfalls and surface runoff, will be used in estimating the loading and ultimately in establishing a Water Cleanup Plan (TMDL) for the Sinclair-Dyes Inlet watershed.

  18. Simulation of Long Lived Tracers Using an Improved Empirically-Based Two-Dimensional Model Transport Algorithm

    NASA Technical Reports Server (NTRS)

    Fleming, Eric L.; Jackman, Charles H.; Stolarski, Richard S.; Considine, David B.

    1998-01-01

    We have developed a new empirically-based transport algorithm for use in our GSFC two-dimensional transport and chemistry assessment model. The new algorithm contains planetary wave statistics, and parameterizations to account for the effects due to gravity waves and equatorial Kelvin waves. We will present an overview of the new algorithm, and show various model-data comparisons of long-lived tracers as part of the model validation. We will also show how the new algorithm gives substantially better agreement with observations compared to our previous model transport. The new model captures much of the qualitative structure and seasonal variability observed in methane, water vapor, and total ozone. These include: isolation of the tropics and the winter polar vortex, the well-mixed surf-zone region of the winter sub-tropics and mid-latitudes, and the propagation of seasonal signals in the tropical lower stratosphere. Model simulations of carbon-14 and strontium-90 compare fairly well with observations in reproducing the peak in mixing ratio at 20-25 km, and the decrease with altitude in mixing ratio above 25 km. We also ran time-dependent simulations of SF6 from which the model mean age-of-air values were derived. The oldest air (5.5 to 6 years) occurred in the high-latitude upper stratosphere during fall and early winter of both hemispheres, and in the southern hemisphere lower stratosphere during late winter and early spring. The latitudinal gradient of the mean ages also compares well with ER-2 aircraft observations in the lower stratosphere.

  19. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

    A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status of and prospects for research with three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS), and the Two Micron All Sky Survey (2MASS). These surveys will permit studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  20. Cryptosporidiosis in Indonesia: a hospital-based study and a community-based survey.

    PubMed

    Katsumata, T; Hosea, D; Wasito, E B; Kohno, S; Hara, K; Soeparto, P; Ranuh, I G

    1998-10-01

    Hospital-based and community-based studies were conducted to understand the prevalence and mode of transmission of Cryptosporidium parvum infection in Surabaya, Indonesia. In both studies, people with and without diarrhea were examined for oocysts. The community-based survey included a questionnaire administered to the community and stool examination of cats. The questionnaire covered demographic information, health status, and hygienic indicators. In the hospital, C. parvum oocysts were found in 26 (2.8%) of 917 patients with diarrhea and 15 (1.4%) of 1,043 control patients. Susceptibility was highest in children under two years of age. The prevalence was higher during the rainy season. The community-based study again showed that C. parvum oocysts were frequently detected in diarrhea samples (8.2%), exclusively during the rainy season. Thirteen (2.4%) of 532 cats passed C. parvum oocysts. A multiple logistic regression model indicated that contact with cats, rain, flood, and crowded living conditions are significant risk factors for Cryptosporidium infection. PMID:9790442
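The risk-factor analysis in this abstract rests on multiple logistic regression, where the exponential of each fitted coefficient is an adjusted odds ratio. A minimal sketch on synthetic data follows; the exposure variable, the coefficient values, and the plain gradient-ascent fitter are illustrative assumptions, not the study's actual data or software:

```python
import numpy as np

def fit_logistic(X, y, iters=3000, lr=1.0):
    """Plain gradient-ascent logistic regression (intercept first).
    exp(beta[k]) for k >= 1 is the adjusted odds ratio of predictor k."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        beta += lr * Xb.T @ (y - p) / len(y)   # average log-likelihood gradient
    return beta

# Synthetic illustration: one binary risk factor (say, contact with cats)
# with a true log-odds ratio of 1.2 on top of a baseline log-odds of -2.0.
rng = np.random.default_rng(1)
exposure = rng.integers(0, 2, 2000)
true_logit = -2.0 + 1.2 * exposure
y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

beta = fit_logistic(exposure.reshape(-1, 1).astype(float), y)
odds_ratio = np.exp(beta[1])   # recovers a value near exp(1.2), up to sampling noise
```

In the study itself, the other covariates (rain, flood, crowded living conditions) would enter X as additional columns, each yielding its own adjusted odds ratio.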

  1. Empirically Based Profiles of the Early Literacy Skills of Children with Language Impairment in Early Childhood Special Education

    ERIC Educational Resources Information Center

    Justice, Laura; Logan, Jessica; Kaderavek, Joan; Schmitt, Mary Beth; Tompkins, Virginia; Bartlett, Christopher

    2015-01-01

    The purpose of this study was to empirically determine whether specific profiles characterize preschool-aged children with language impairment (LI) with respect to their early literacy skills (print awareness, name-writing ability, phonological awareness, alphabet knowledge); the primary interest was to determine if one or more profiles suggested…

  2. Use of Evidence-Based Practice Resources and Empirically Supported Treatments for Posttraumatic Stress Disorder among University Counseling Center Psychologists

    ERIC Educational Resources Information Center

    Juel, Morgen Joray

    2012-01-01

    In the present study, an attempt was made to determine the degree to which psychologists at college and university counseling centers (UCCs) utilized empirically supported treatments with their posttraumatic stress disorder (PTSD) clients. In addition, an attempt was made to determine how frequently UCC psychologists utilized a number of…

  3. SU-E-T-05: A 2D EPID Transit Dosimetry Model Based On An Empirical Quadratic Formalism

    SciTech Connect

    Tan, Y; Metwaly, M; Glegg, M; Baggarley, S; Elliott, A

    2014-06-01

    Purpose: To describe a 2D electronic portal imaging device (EPID) transit dosimetry model, based on an empirical quadratic formalism, that can predict either EPID or in-phantom dose distributions for comparison with the EPID-captured image or the treatment planning system (TPS) dose, respectively. Methods: A quadratic equation can be used to relate the reduction in intensity of an exit beam to the equivalent path length of the attenuator. The calibration involved deriving coefficients from a set of dose planes measured for homogeneous phantoms with known thicknesses under reference conditions. In this study, calibration dose planes were measured with EPID and ionisation chamber (IC) in water for the same reference beam (6 MV, 100 MU, 20×20 cm²) and set of thicknesses (0–30 cm). Since the same calibration conditions were used, the EPID and IC measurements can be related through the quadratic equation. Consequently, EPID transit dose can be predicted from TPS-exported dose planes, and in-phantom dose can be predicted using the EPID distribution captured during treatment as an input. The model was tested with 4 open fields, 6 wedge fields, and 7 IMRT fields on homogeneous and heterogeneous phantoms. Comparisons were done using 2D absolute gamma (3%/3 mm) and results were validated against measurements with a commercial 2D array device. Results: The gamma pass rates for comparisons between EPID measured and predicted doses ranged from 93.6% to 100.0% for all fields and phantoms tested. Results from this study agreed with 2D array measurements to within 3.1%. Meanwhile, in-phantom comparisons between TPS-computed and predicted doses ranged from 91.6% to 100.0%. Validation with the 2D array device was not possible for in-phantom comparisons. Conclusion: A 2D EPID transit dosimetry model for treatment verification was described and proven to be accurate. The model has the advantage of being generic and allows comparisons at the EPID plane as well as at multiple planes in-phantom.
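The core of such a formalism, relating exit-beam attenuation to equivalent path length through a fitted low-order polynomial that is then applied pixel-by-pixel, can be sketched as follows. The calibration numbers, the log-domain quadratic fit, and the function names are hypothetical illustrations, not the paper's actual coefficients:

```python
import numpy as np

# Hypothetical calibration: phantom thicknesses (cm) vs. measured ratio of
# exit-beam intensity to the open-beam intensity under reference conditions.
thickness_cm = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
transmission = np.array([1.00, 0.78, 0.61, 0.48, 0.38, 0.30, 0.24])

# Empirical quadratic fit of log-transmission against equivalent path length.
coeffs = np.polyfit(thickness_cm, np.log(transmission), 2)

def predict_transmission(path_length_cm):
    """Predicted exit/entrance intensity ratio for a water-equivalent path."""
    return np.exp(np.polyval(coeffs, path_length_cm))

def predict_epid_dose(tps_dose_plane, path_length_map):
    """Scale a TPS-exported dose plane pixel-by-pixel by the modelled
    transmission to approximate what the EPID would record in transit."""
    return tps_dose_plane * predict_transmission(path_length_map)
```

The inverse direction (inferring in-phantom dose from a captured EPID image) follows by solving the fitted quadratic for the path length, which is what lets one calibration serve both comparison modes.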

  4. Empirical antifungal therapy with an echinocandin in critically-ill patients: prospective evaluation of a pragmatic Candida score-based strategy in one medical ICU

    PubMed Central

    2014-01-01

    Over the same period, our predefined criteria for empirical therapy were overruled in 55 cases. None of them developed IC thereafter. Finally, our decision rule allowed early recognition of proven/probable IC with sensitivity, specificity, and positive and negative predictive values of 69.2%, 82.1%, 69.2%, and 82.1%, respectively. Conclusion: Implementation of pragmatic guidelines for empirical AFT based on CS and fungal colonization assessment could be useful in selecting patients who really benefit from an echinocandin. PMID:25015848

  5. Current indications for renal biopsy: a questionnaire-based survey.

    PubMed

    Fuiano, G; Mazza, G; Comi, N; Caglioti, A; De Nicola, L; Iodice, C; Andreucci, M; Andreucci, V E

    2000-03-01

    Indications for renal biopsy are still ill defined. We recently sent a detailed questionnaire to 360 nephrologists in different areas of the world with the aim of providing information on this critical issue by evaluating the replies. The questionnaire was organized in four sections that included questions on renal biopsy indications in patients with normal renal function, renal insufficiency, and a transplanted kidney. In addition, the questions included methods applied to each renal biopsy procedure and to specimen processing. We received 166 replies: North Europe (50), South Europe (47), North America (31), Australia and New Zealand (24), and other countries (14). In patients with normal renal function, primary indications for renal biopsy were microhematuria associated with proteinuria, particularly greater than 1 g/d of protein. In chronic renal insufficiency, kidney dimension was the major parameter considered before renal biopsy, whereas the presence of diabetes or serological abnormalities was not considered critical. In the course of acute renal failure (ARF) of unknown origin, 20% of the respondents would perform renal biopsy in the early stages, 26% after 1 week of nonrecovery, and 40% after 4 weeks. In a transplanted kidney, the majority of nephrologists would perform a renal biopsy in the case of graft failure after surgery, ARF after initial good function, slow progressive deterioration of renal function, and onset of nephrotic proteinuria. The last section provided comprehensive information on the technical aspects of renal biopsy. This survey represents the first attempt to provide a reliable consensus that can be used in developing guidelines on the use of kidney biopsy. PMID:10692270

  6. Land-based lidar mapping: a new surveying technique to shed light on rapid topographic change

    USGS Publications Warehouse

    Collins, Brian D.; Kayen, Robert

    2006-01-01

    The rate of natural change in such dynamic environments as rivers and coastlines can sometimes overwhelm the monitoring capacity of conventional surveying methods. In response to this limitation, U.S. Geological Survey (USGS) scientists are pioneering new applications of light detection and ranging (lidar), a laser-based scanning technology that promises to greatly increase our ability to track rapid topographic changes and manage their impact on affected communities.

  7. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2016-06-01

    Different global and local color histogram methods for content-based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research: different ways of extracting local histograms to obtain spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and a fuzzy linking method to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining the color histogram. In this paper, the performance of CBIR based on different global and local color histograms is surveyed in three color spaces (RGB, HSV, L*a*b*) and with three distance measures (Euclidean, Quadratic, and Histogram intersection), in order to choose an appropriate method for future research.
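As a hedged sketch of the basic machinery being compared, the following computes a global color histogram and two of the measures named above (histogram intersection and Euclidean distance); the bin count and function names are illustrative choices, not the survey's specific configuration:

```python
import numpy as np

def color_histogram(image, bins=8):
    """Global color histogram over bins**3 cells of an (H, W, 3) uint8 image,
    normalised to sum to 1."""
    q = (image.astype(np.uint32) * bins) // 256            # per-channel bin index
    codes = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    hist = np.bincount(codes.ravel().astype(np.int64), minlength=bins ** 3)
    return hist.astype(float) / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical normalised histograms."""
    return float(np.minimum(h1, h2).sum())

def euclidean_distance(h1, h2):
    """Straight L2 distance between two histogram vectors."""
    return float(np.linalg.norm(h1 - h2))
```

A local variant would compute the same histogram per image block and concatenate the results, trading histogram size for the spatial correspondence the abstract mentions.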

  8. A survey on evolutionary algorithm based hybrid intelligence in bioinformatics.

    PubMed

    Li, Shan; Kang, Liying; Zhao, Xing-Ming

    2014-01-01

    With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for the bioinformatists to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, the hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, the hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction to the applications of hybrid intelligent methods, in particular those based on evolutionary algorithms, in bioinformatics. We focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks. PMID:24729969

  9. A Survey on Evolutionary Algorithm Based Hybrid Intelligence in Bioinformatics

    PubMed Central

    Li, Shan; Zhao, Xing-Ming

    2014-01-01

    With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for the bioinformatists to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, the hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, the hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction to the applications of hybrid intelligent methods, in particular those based on evolutionary algorithms, in bioinformatics. We focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks. PMID:24729969

  10. An empirical survey on the influence of machining parameters on tool wear in diamond turning of large single crystal silicon optics

    SciTech Connect

    Blaedel, K L; Carr, J W; Davis, P J; Goodman, W; Haack, J K; Krulewich, D; McClellan, M; Syn, C K; Zimmermann, M.

    1999-07-01

    The research described in this paper is a continuation of the collaborative efforts by Lawrence Livermore National Laboratory (LLNL), Schafer Corporation and TRW to develop a process for single point diamond turning (SPDT) of large single crystal silicon (SCSi) optical substrates on the Large Optic Diamond Turning Machine (LODTM). The principal challenge to obtaining long track lengths in SCSi has been to identify a set of machining parameters which yield a process that provides both low and predictable tool wear. Identifying such a process for SCSi has proven to be a formidable task because multiple crystallographic orientations with a range of hardness values are encountered when machining conical and annular optical substrates. The LODTM cutting program can compensate for tool wear if it is predictable. However, if the tool wear is not predictable then the figured area of the optical substrate may have unacceptably high error that can not be removed by post-polishing. The emphasis of this survey was limited to elucidating the influence of cutting parameters on the tool wear. We present two preliminary models that can be used to predict tool wear over the parameter space investigated. During the past two and one-half years a series of three evolutionary investigations were performed. The first investigation, the Parameter Assessment Study (PAS), was designed to survey fundamental machining parameters and assess their influence on tool wear [1]. The results of the PAS were used as a point-of-departure for designing the second investigation, the Parameter Selection Study (PSS). The goal of the PSS was to explore the trends identified in the PAS in more detail, to determine if the experimental results obtained in the PAS could be repeated on a different diamond turning machine (DTM), and to select a more optimal set of machining parameters that could be used in subsequent investigations such as the Fluid Down-Select Study (FDS). The goal of the FDS was to compare

  11. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Appendix C to Part 1240 — Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153). Highways, National Highway Traffic Safety Administration and Federal Highway Administration, Department of Transportation: Guidelines, Safety Incentive Grants for Use of Seat...

  12. An Internet-Based Summer Student Survey. AIR 2001 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Rosenthal, Dan; Gottesman, Robert

    A five-question Web-based summer school survey was designed to measure student motivation for taking summer classes. Web technology has several advantages, such as no printing, electronic notification, high student access potential, and automatic collection of data in digital form. Notification was sent to all enrolled students (N=19,837) based on…

  13. Developing a Computer Information Systems Curriculum Based on an Industry Needs Survey.

    ERIC Educational Resources Information Center

    Ghafarian, Ahmad; Sisk, Kathy A.

    This paper details experiences in developing an undergraduate Computer Information Systems (CIS) curriculum at a small liberal arts school. The development of the program was based on the study of needs assessment. Findings were based on the analysis of four sources of data: the results of an industry needs survey, data from a needs assessment…

  14. Trajectory-Based Visual Localization in Underwater Surveying Missions

    PubMed Central

    Burguera, Antoni; Bonin-Font, Francisco; Oliver, Gabriel

    2015-01-01

    We present a new vision-based localization system applied to an autonomous underwater vehicle (AUV) with limited sensing and computation capabilities. Traditional EKF-SLAM approaches are usually expensive in terms of execution time; the approach presented in this paper strengthens this method by adopting a trajectory-based schema that reduces the computational requirements. The pose of the vehicle is estimated using an extended Kalman filter (EKF), which predicts the vehicle motion by means of a visual odometer and corrects these predictions using the data associations (loop closures) between the current frame and the previous ones. One of the most important steps in this procedure is the image registration method, as it reinforces the data association and, thus, makes it possible to close loops reliably. Since the use of standard EKFs entails linearization errors that can distort the vehicle pose estimates, the approach has also been tested using an iterated extended Kalman filter (IEKF). Experiments have been conducted using a real underwater vehicle in controlled scenarios and in shallow sea waters, showing excellent performance with very small errors, both in the vehicle pose and in the overall trajectory estimates. PMID:25594602

  15. Trajectory-based visual localization in underwater surveying missions.

    PubMed

    Burguera, Antoni; Bonin-Font, Francisco; Oliver, Gabriel

    2015-01-01

    We present a new vision-based localization system applied to an autonomous underwater vehicle (AUV) with limited sensing and computation capabilities. Traditional EKF-SLAM approaches are usually expensive in terms of execution time; the approach presented in this paper strengthens this method by adopting a trajectory-based schema that reduces the computational requirements. The pose of the vehicle is estimated using an extended Kalman filter (EKF), which predicts the vehicle motion by means of a visual odometer and corrects these predictions using the data associations (loop closures) between the current frame and the previous ones. One of the most important steps in this procedure is the image registration method, as it reinforces the data association and, thus, makes it possible to close loops reliably. Since the use of standard EKFs entails linearization errors that can distort the vehicle pose estimates, the approach has also been tested using an iterated extended Kalman filter (IEKF). Experiments have been conducted using a real underwater vehicle in controlled scenarios and in shallow sea waters, showing excellent performance with very small errors, both in the vehicle pose and in the overall trajectory estimates. PMID:25594602
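The predict/correct cycle described in these records can be sketched for a planar pose. The state layout, the noise matrices, and the simplifying choice of a full-pose measurement (standing in for a registered loop closure) are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def ekf_predict(x, P, odom, Q):
    """Propagate a planar pose (x, y, theta) by a visual-odometry increment
    (dx, dy, dtheta) expressed in the vehicle frame."""
    dx, dy, dth = odom
    c, s = np.cos(x[2]), np.sin(x[2])
    x_new = x + np.array([c * dx - s * dy, s * dx + c * dy, dth])
    F = np.array([[1.0, 0.0, -s * dx - c * dy],     # motion Jacobian
                  [0.0, 1.0,  c * dx - s * dy],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the pose with a direct pose observation, e.g. one derived
    from registering the current frame against an earlier one."""
    H = np.eye(3)                       # measurement = full pose, for simplicity
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

The IEKF variant mentioned in the abstract repeats the update step, re-linearizing the measurement model around the refreshed estimate until it stabilizes, which mitigates the linearization errors of the plain EKF.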

  16. NVO, surveys and archiving ground-based data

    NASA Astrophysics Data System (ADS)

    Huchra, John P.

    2002-12-01

    The era of extremely large, public databases in astronomy is upon us. Such databases will open (are opening!) the field to new research and new researchers. However, it is important to ensure that the resources are available to properly archive ground-based astronomical data and to include the necessary quality checks and calibrations. An NVO without a proper archive will have limited usefulness. This also implies that, with limited resources, not all data can or should be archived. NASA already has a very good handle on US space-based astronomical data. Agencies and organizations that operate astronomical facilities, particularly ground-based observatories, need to plan and budget for these activities now. We should not underestimate the effort required to produce high-quality data products that will be useful to the broader community.

  17. Academic Choice Behavior of High School Students: Economic Rationale and Empirical Evidence

    ERIC Educational Resources Information Center

    Zietz, J.; Joshi, P.

    2005-01-01

    This study examines the determinants of US students' choice of alternative programs of study in high school. An explicit theoretical framework grounded in optimizing behavior is derived. The empirical work is based on the National Longitudinal Survey of Youth 1997. The set of variables include student and family characteristics, peer behavior, and…

  18. Assessing Pre-Service Teachers' Training in Empirically-Validated Behavioral Instruction Practices

    ERIC Educational Resources Information Center

    Begeny, John C.; Martens, Brian K.

    2006-01-01

    Although many behaviorally-based instructional practices have been shown empirically to promote student achievement, it is unknown to what extent teachers receive adequate training in these methods. This study surveyed master's-level elementary, secondary, and special education students about their coursework and applied training in 25 behavioral…

  19. Radiological survey at the Puget Sound Naval Shipyard and Naval Submarine Base, Bangor

    SciTech Connect

    Fowler, T.W.; Cox, C.

    1998-07-01

    This report presents results of a radiological survey conducted in September 1996 by the National Air and Radiation Environmental Laboratory (NAREL) to assess levels of environmental radioactivity around Puget Sound Naval Shipyard (PSNS) at Bremerton, WA, and the Naval Submarine Base (NSBB) near Bangor, WA. The purpose of the survey was to assess whether the construction, maintenance, overhaul, refueling, or operation of nuclear-powered warships has created elevated levels of radioactivity that could expose nearby populations or contaminate the environment. During this survey 227 samples were collected: 126 at the PSNS study site; 73 at the NSBB site; 21 from background locations; and 7 near the outfall of the Bremerton sewage treatment plant. Samples included drinking water, harbor water, sediment, sediment cores, and biota. All samples were analyzed for gross alpha and beta activities and gamma-emitting radionuclides, and some samples were also analyzed for radium-226 and isotopes of uranium, plutonium, and thorium. In addition to sample collection and analysis, radiation surveys were performed using portable survey instruments to detect gamma radiation. Based on this radiological survey, practices regarding nuclear-powered warship operations at PSNS and NSBB have resulted in no increases in radioactivity causing significant population exposure or contamination of the environment.

  20. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  1. Sleepwalking in Parkinson's disease: a questionnaire-based survey.

    PubMed

    Oberholzer, Michael; Poryazova, Rositsa; Bassetti, Claudio L

    2011-07-01

    Sleepwalking (SW) corresponds to a complex sleep-associated behavior that includes locomotion, mental confusion, and amnesia. SW is present in about 10% of children and 2-3% of adults. In a retrospective series of 165 patients with Parkinson's disease (PD), we found adult-onset ("de novo") SW in six (4%) of them. The aim of this study was to assess prospectively and systematically the frequency and characteristics of SW in PD patients. A questionnaire including items on sleep quality, sleep disorders (specifically also SW and REM sleep behavior disorder, RBD), and PD characteristics and severity was sent to the members of the national PD patients' organization in Switzerland. In the study, 36/417 patients (9%) reported SW, of whom 22 (5%) had adult-onset SW. Patients with SW had significantly longer disease duration (p = 0.035), they more often reported hallucinations (p = 0.004) and nightmares (p = 0.003), and they had higher scores suggestive of RBD in a validated questionnaire (p = 0.001). Patients with SW were also sleepier (trend toward a higher Epworth Sleepiness Scale score, p = 0.055). Our data suggest that SW in PD patients is (1) more common than in the general population and (2) associated with RBD, nightmares, and hallucinations. Further studies including polysomnographic recordings are needed to confirm the results of this questionnaire-based analysis, to understand the relationship between SW and other nighttime wandering behaviors in PD, and to clarify the underlying mechanisms. PMID:21293874

  2. Questionnaire-based survey of parturition in the queen.

    PubMed

    Musters, J; de Gier, J; Kooistra, H S; Okkens, A C

    2011-06-01

    The lack of scientific data concerning whether parturition in the queen proceeds normally or not may prevent veterinarians and cat owners from recognizing parturition problems in time. A questionnaire-based study of parturition in 197 queens was performed to determine several parameters of parturition and their influence on its progress. The mean length of gestation was 65.3 days (range 57 to 72 days) and it decreased with increasing litter size (P = 0.02). The median litter size was 4.5 kittens (range 1 to 9), with more males (53%) than females (46%) (P = 0.05). Sixty-nine percent of the kittens were born in anterior presentation and 31% in posterior presentation, indicating that either can be considered normal in the cat. Males were born in posterior position (34%) more often than females (26%) (P = 0.03). The mean birth weight was 98 g (range of 35 to 167 g) and decreased with increasing litter size (P < 0.01). Mean birth weight was higher in males and kittens born in posterior presentation (P < 0.01). Forty-four (5%) of the 887 kittens were stillborn. This was not correlated with the presentation at expulsion but stillborn kittens were more often female (P = 0.02) and weighed less than those born alive (P = 0.04). The median interkitten time was 30 min (range 2 to 343 min) and 95% were born within 100 min after expulsion of the preceding kitten. The interkitten time as a measure of the progress of parturition was not influenced by the kitten's gender, presentation at expulsion, birth weight, or stillbirth, or by the parity of the queen. The results of this study can be used to develop reference values for parturition parameters in the queen, both to determine whether a given parturition is abnormal and as the basis for a parturition protocol. PMID:21295830

  3. Empirically Driven Software Engineering Research

    NASA Astrophysics Data System (ADS)

    Rombach, Dieter

    Software engineering is a design discipline. As such, its engineering methods are based on cognitive rather than physical laws, and their effectiveness depends highly on context. Empirical methods can be used to observe the effects of software engineering methods in vivo and in vitro, to identify improvement potentials, and to validate new research results. This paper summarizes both the current body of knowledge and open challenges with respect to empirical methods in software engineering, as well as empirically derived evidence regarding typical software engineering methods. Finally, future challenges with respect to education, research, and technology transfer are outlined.

  4. Exoplanets - New Results from Space and Ground-based Surveys

    NASA Astrophysics Data System (ADS)

    Udry, Stephane

    The exploration of the outer solar system and in particular of the giant planets and their environments is an on-going process with the Cassini spacecraft currently around Saturn, the Juno mission to Jupiter preparing to depart and two large future space missions planned to launch in the 2020-2025 time frame for the Jupiter system and its satellites (Europa and Ganymede) on the one hand, and the Saturnian system and Titan on the other hand [1,2]. Titan, Saturn's largest satellite, is the only other object in our Solar system to possess an extensive nitrogen atmosphere, host to an active organic chemistry, based on the interaction of N2 with methane (CH4). Following the Voyager flyby in 1980, Titan has been intensely studied from the ground-based large telescopes (such as the Keck or the VLT) and by artificial satellites (such as the Infrared Space Observatory and the Hubble Space Telescope) for the past three decades. Prior to Cassini-Huygens, Titan's atmospheric composition was thus known to us from the Voyager missions and also through the explorations by the ISO. Our perception of Titan had thus greatly been enhanced accordingly, but many questions remained as to the nature of the haze surrounding the satellite and the composition of the surface. The recent revelations by the Cassini-Huygens mission have managed to surprise us with many discoveries [3-8] and have yet to reveal more of the interesting aspects of the satellite. The Cassini-Huygens mission to the Saturnian system has been an extraordinary success for the planetary community since the Saturn-Orbit-Insertion (SOI) in July 2004 and again the very successful probe descent and landing of Huygens on January 14, 2005. One of its main targets was Titan. 
Titan was revealed to be a complex world more like the Earth than any other: it has a dense mostly nitrogen atmosphere and active climate and meteorological cycles where the working fluid, methane, behaves under Titan conditions the way that water does on

  5. Short-term memory of TiO2-based electrochemical capacitors: empirical analysis with adoption of a sliding threshold

    NASA Astrophysics Data System (ADS)

    Lim, Hyungkwang; Kim, Inho; Kim, Jin-Sang; Hwang, Cheol Seong; Jeong, Doo Seok

    2013-09-01

    Chemical synapses are important components of the large-scale neural network in the hippocampus of the mammalian brain, and changes in their weight are thought to underlie learning and memory. Thus, the realization of artificial chemical synapses is of crucial importance in achieving artificial neural networks that emulate the brain's functionalities to some extent. This kind of research is often referred to as neuromorphic engineering. In this study, we report short-term memory behaviours of electrochemical capacitors (ECs) utilizing a TiO2 mixed ionic-electronic conductor and various reactive electrode materials, e.g. Ti, Ni, and Cr. Experiments showed that the potentiation behaviours did not represent unlimited growth of synaptic weight. Instead, they exhibited limited synaptic weight growth that can be understood by means of an empirical equation similar to the Bienenstock-Cooper-Munro rule, employing a sliding threshold. The observed potentiation behaviours were analysed using the empirical equation, and the differences between the different ECs were parameterized.
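The limited-growth behaviour attributed to a sliding threshold can be sketched with a toy BCM-style update; the update rule, the constants, and the choice of a response proportional to the current weight are illustrative assumptions, not the paper's fitted equation:

```python
import numpy as np

def simulate_potentiation(pulses=200, eta=0.05, tau=20.0, w0=0.1):
    """Weight evolution under dw = eta * a * (a - theta), where the
    threshold theta slides toward the recent squared response a**2."""
    w, theta = w0, 0.0
    history = []
    for _ in range(pulses):
        a = w                                # response ~ current weight
        w += eta * a * (a - theta)           # BCM-like weight change
        theta += (a ** 2 - theta) / tau      # sliding threshold
        history.append(w)
    return np.array(history)
```

Because theta chases the squared response, the drive a - theta shrinks as the weight grows, so potentiation saturates instead of growing without bound, which is the qualitative limited-growth behaviour the abstract reports.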

  6. A Household-Based Distribution-Sensitive Human Development Index: An Empirical Application to Mexico, Nicaragua and Peru

    ERIC Educational Resources Information Center

    Lopez-Calva, Luis F.; Ortiz-Juarez, Eduardo

    2012-01-01

    In measuring human development, one of the main concerns relates to the inclusion of a measure that penalizes inequalities in the distribution of achievements across the population. Using indicators from nationally representative household surveys and census data, this paper proposes a straightforward methodology to estimate a household-based…

  7. A Comparative Investigation of TPB and Altruism Frameworks for an Empirically Based Communication Approach to Enhance Paper Recycling

    ERIC Educational Resources Information Center

    Chaisamrej, Rungrat; Zimmerman, Rick S.

    2014-01-01

    This research compared the ability of the theory of planned behavior (TPB) and the altruism framework (AM) to predict paper-recycling behavior. It was comprised of formative research and a major survey. Data collected from 628 undergraduate students in Thailand were analyzed using structural equation modeling. Results showed that TPB was superior…

  8. An in-Depth Survey of Visible Light Communication Based Positioning Systems.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    While visible light communication (VLC) has become a strong candidate for the wireless technology of the 21st century due to its inherent advantages, VLC-based positioning also has a great chance of becoming a standard approach to positioning. Within the last few years, many studies on VLC-based positioning have been published, but few survey works exist in this field. In this paper, an in-depth survey of VLC-based positioning systems is provided. More than 100 papers, ranging from pioneering papers to the state of the art in the field, were collected and classified based on the positioning algorithms, the types of receivers, and the multiplexing techniques. In addition, current issues and research trends in VLC-based positioning are discussed. PMID:27187395

  9. An in-Depth Survey of Visible Light Communication Based Positioning Systems

    PubMed Central

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    While visible light communication (VLC) has become a strong candidate for the wireless technology of the 21st century due to its inherent advantages, VLC-based positioning also has a great chance of becoming a standard approach to positioning. Within the last few years, many studies on VLC-based positioning have been published, but few survey works exist in this field. In this paper, an in-depth survey of VLC-based positioning systems is provided. More than 100 papers, ranging from pioneering papers to the state of the art in the field, were collected and classified based on the positioning algorithms, the types of receivers, and the multiplexing techniques. In addition, current issues and research trends in VLC-based positioning are discussed. PMID:27187395

  10. Evaluation of Midwater Trawl Selectivity and its Influence on Acoustic-Based Fish Population Surveys

    NASA Astrophysics Data System (ADS)

    Williams, Kresimir

    Trawls are used extensively during fisheries abundance surveys to derive estimates of fish density and, in the case of acoustic-based surveys, to identify acoustically sampled fish populations. However, trawls are selective in what fish they retain, resulting in biased estimates of density, species, and size compositions. Selectivity of the midwater trawl used in acoustic-based surveys of walleye pollock (Theragra chalcogramma) was evaluated using multiple methods. The effects of trawl selectivity on the acoustic-based survey abundance estimates and the stock assessment were evaluated for the Gulf of Alaska walleye pollock population. Selectivity was quantified using recapture, or pocket, nets attached to the outside of the trawl. Pocket net catches were modeled using a hierarchical Bayesian model to provide uncertainty in selectivity parameter estimates. Significant under-sampling of juvenile pollock by the midwater trawl was found, with lengths at 50% retention ranging from 14--26 cm over three experiments. Escapement was found to be light dependent, with more fish escaping in dark conditions. The highest escapement rates were observed in the aft of the trawl near the codend, through the bottom panel of the trawl. The behavioral mechanisms involved in the process of herding and escapement were evaluated using stereo-cameras, a DIDSON high frequency imaging sonar, and pocket nets. Fish maintained greater distances from the trawl panel during daylight, suggesting trawl modifications such as increased visibility of netting materials may evoke stronger herding responses and increased retention of fish. Selectivity and catchability of pollock by the midwater trawl were also investigated using acoustic density as an independent estimate of fish abundance to compare with trawl catches. A modeling framework was developed to evaluate potential explanatory factors for selectivity and catchability. Selectivity estimates were dependent on which vessel was used for the survey

  11. Algorithms for personalized therapy of type 2 diabetes: results of a web-based international survey

    PubMed Central

    Gallo, Marco; Mannucci, Edoardo; De Cosmo, Salvatore; Gentile, Sandro; Candido, Riccardo; De Micheli, Alberto; Di Benedetto, Antonino; Esposito, Katherine; Genovese, Stefano; Medea, Gerardo; Ceriello, Antonio

    2015-01-01

    Objective In recent years increasing interest in the issue of treatment personalization for type 2 diabetes (T2DM) has emerged. This international web-based survey aimed to evaluate opinions of physicians about tailored therapeutic algorithms developed by the Italian Association of Diabetologists (AMD) and available online, and to get suggestions for future developments. Another aim of this initiative was to assess whether the online advertising and the survey would have increased the global visibility of the AMD algorithms. Research design and methods The web-based survey, which comprised five questions, has been available from the homepage of the web-version of the journal Diabetes Care throughout the month of December 2013, and on the AMD website between December 2013 and September 2014. Participation was totally free and responders were anonymous. Results Overall, 452 physicians (M=58.4%) participated in the survey. Diabetologists accounted for 76.8% of responders. The results of the survey show wide agreement (>90%) by participants on the utility of the algorithms proposed, even if they do not cover all possible needs of patients with T2DM for a personalized therapeutic approach. In the online survey period and in the months after its conclusion, a relevant and durable increase in the number of unique users who visited the websites was registered, compared to the period preceding the survey. Conclusions Patients with T2DM are heterogeneous, and there is interest toward accessible and easy-to-use personalized therapeutic algorithms. Responders' opinions probably reflect the peculiar organization of diabetes care in each country. PMID:26301097

  12. State Poverty-Based Education Funding: A Survey of Current Programs and Options for Improvement.

    ERIC Educational Resources Information Center

    Carey, Kevin

    This paper describes the current status of state poverty-based education funding programs, discussing how to implement or improve them. Researchers surveyed education finance officials in the 49 states with multiple school districts. Results indicate that 38 states currently distribute some education funds on the basis of poverty. A total of 75…

  13. Special Educators' Guide to Exemplary Curricula: Results of a National Field-Based Survey (1983).

    ERIC Educational Resources Information Center

    Ash, Paul M., Comp.

    A listing of 178 curriculum guides for exemplary special education programs is presented, based on a national 1983 survey of over 500 programs. For each program, information is presented on the title of the guide, the source and source address, development date, price, number of pages, recommended exceptionality area(s), and recommended level(s).…

  14. School Nutrition Directors are Receptive to Web-Based Training Opportunities: A National Survey

    ERIC Educational Resources Information Center

    Zoellner, Jamie; Carr, Deborah H.

    2009-01-01

    Purpose/Objective: The purpose of this study was to investigate school nutrition directors' (SNDs) previous experience with web-based training (WBT), interest in utilizing WBT within 14 functional areas, and logistical issues (time, price, educational credits, etc.) of developing and delivering WBT learning modules. Methods: A survey was developed…

  15. Development of a National Survey to Assess Student Learning Outcomes of Community-Based Research

    ERIC Educational Resources Information Center

    Lichtenstein, Gary; Thorme, Trisha; Cutforth, Nick; Tombari, Martin L.

    2011-01-01

    With the goal of codifying student learning outcomes of community-based research (CBR), the authors created a conceptually valid and statistically reliable CBR Student Learning Outcomes Survey. The project began with individual interviews and focus groups with 70 undergraduates and faculty at six colleges and universities nationwide discussing…

  16. Radiological survey of Charleston Naval Base and Shipyard and the Charleston Naval Weapons Station. Final report

    SciTech Connect

    Smith, J.M.

    1987-07-01

    This report presents results of the survey conducted by the Eastern Environmental Radiation Facility (EERF) personnel to assess levels of environmental radioactivity resulting from maintenance and operation of nuclear-powered warships at the Charleston Naval Base and Shipyard and the Charleston Naval Weapons Station, near Charleston, SC.

  17. A Community-Based Activities Survey: Systematically Determining the Impact on and of Faculty

    ERIC Educational Resources Information Center

    Perry, Lane; Farmer, Betty; Onder, David; Tanner, Benjamin; Burton, Carol

    2015-01-01

    As a descriptive case study from Western Carolina University (WCU), this article describes the development of a measuring, monitoring, and tracking system (the WCU Community-based Activities Survey) for faculty engagement in, adoption of, and impact through community engagement practices both internal and external to their courses. This paper will…

  18. HIV/AIDS Misconceptions among Latinos: Findings from a Population-Based Survey of California Adults

    ERIC Educational Resources Information Center

    Ritieni, Assunta; Moskowitz, Joel; Tholandi, Maya

    2008-01-01

    Misconceptions about HIV/AIDS among Latino adults (N=454) in California were examined using data from a population-based telephone survey conducted in 2000. Common misconceptions concerning modes of HIV transmission included transmission via mosquito or animal bite (64.1%), public facilities (48.3%), or kissing someone on the cheek (24.8%). A…

  19. What Attributes Determine Severity of Function in Autism? A Web-Based Survey of Stakeholders

    ERIC Educational Resources Information Center

    Di Rezze, Briano; Rosenbaum, Peter; Zwaigenbaum, Lonnie

    2012-01-01

    Service providers and researchers in autism spectrum disorders (ASD) are challenged to categorize clinical variation in function. Classification systems for children with cerebral palsy have enabled clinicians and families to describe levels of function. A web-based survey engaged international ASD stakeholders to advise on considerations of…

  20. Medical Students' Experiences with Addicted Patients: A Web-Based Survey

    ERIC Educational Resources Information Center

    Midmer, Deana; Kahan, Meldon; Wilson, Lynn

    2008-01-01

    Project CREATE was an initiative to strengthen undergraduate medical education in addictions. As part of a needs assessment, forty-six medical students at Ontario's five medical schools completed a bi-weekly, interactive web-based survey about addiction-related learning events. In all, 704 unique events were recorded, for an average of 16.7…

  1. Design and Development of a Process for Web-based Survey Research.

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Bainbridge, Joyce

    2000-01-01

    Describes the development of a Web-based questionnaire for a survey about children's literature used by Alberta elementary teachers. Advantages included fast access to the instrument, protection against missing data, direct uploading of data, and avoidance of postage costs. Disadvantages included limited participation due to computer anxiety (even…

  2. Sexually Transmitted Diseases and Risk Behaviors among California Farmworkers: Results from a Population-Based Survey

    ERIC Educational Resources Information Center

    Brammeier, Monique; Chow, Joan M.; Samuel, Michael C.; Organista, Kurt C.; Miller, Jamie; Bolan, Gail

    2008-01-01

    Context: The prevalence of sexually transmitted diseases and associated risk behaviors among California farmworkers is not well described. Purpose: To estimate the prevalence of sexually transmitted diseases (STDs) and associated risk behaviors among California farmworkers. Methods: Cross-sectional analysis of population-based survey data from 6…

  3. Our Environment, Our Health: A Community-Based Participatory Environmental Health Survey in Richmond, California

    ERIC Educational Resources Information Center

    Cohen, Alison; Lopez, Andrea; Malloy, Nile; Morello-Frosch, Rachel

    2012-01-01

    This study presents a health survey conducted by a community-based participatory research partnership between academic researchers and community organizers to consider environmental health and environmental justice issues in four neighborhoods of Richmond, California, a low-income community of color living along the fence line of a major oil…

  4. A 16-year examination of domestic violence among Asians and Asian Americans in the empirical knowledge base: a content analysis.

    PubMed

    Yick, Alice G; Oomen-Early, Jody

    2008-08-01

    Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most atheoretical and focusing on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, with most guided by logical positivism using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings. PMID:18259048

  5. Empirical model of global thermospheric temperature and composition based on data from the OGO-6 quadrupole mass spectrometer

    NASA Technical Reports Server (NTRS)

    Hedin, A. E.; Mayr, H. G.; Reber, C. A.; Spencer, N. W.; Carignan, G. R.

    1972-01-01

    An empirical global model for magnetically quiet conditions has been derived from longitudinally averaged N2, O, and He densities by means of an expansion in spherical harmonics. The data were obtained by the OGO-6 neutral mass spectrometer and cover the altitude range 400 to 600 km for the period 27 June 1969 to 13 May 1971. The accuracy of the analytical description is of the order of the experimental error for He and O, and about three times the experimental error for N2, thus providing a reasonable overall representation of the satellite observations. Two model schemes are used: one representing densities extrapolated to 450 km and one representing densities extrapolated to 120 km with exospheric temperatures inferred from N2 densities. Using the best-fit model parameters, the global thermospheric structure is presented in the form of a number of contour plots.
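As a rough illustration of the expansion-in-harmonics approach described above, the sketch below fits a low-order basis of Legendre and longitude terms to synthetic density samples by least squares. The basis choice, coefficient values, sample counts, and noise level are all assumptions for illustration, not the OGO-6 model.

```python
import numpy as np

# Hedged sketch: least-squares fit of a low-order spherical-harmonic-style
# basis to synthetic "density" samples. All numbers here are invented.
rng = np.random.default_rng(0)
lat = rng.uniform(-np.pi / 2, np.pi / 2, 500)   # latitude samples (rad)
lon = rng.uniform(0, 2 * np.pi, 500)            # longitude samples (rad)

# Basis: P0, P1(sin lat), P2(sin lat), and the first longitude harmonics.
s = np.sin(lat)
basis = np.column_stack([
    np.ones_like(s),
    s,
    0.5 * (3 * s**2 - 1),
    np.cos(lat) * np.cos(lon),
    np.cos(lat) * np.sin(lon),
])

true_coeffs = np.array([1.0, 0.3, -0.2, 0.05, 0.02])   # hypothetical
density = basis @ true_coeffs + rng.normal(0, 0.01, 500)

coeffs, *_ = np.linalg.lstsq(basis, density, rcond=None)
assert np.allclose(coeffs, true_coeffs, atol=0.05)
```

The fitted coefficients then define a smooth global field that can be evaluated anywhere, which is what allows the paper's contour-plot presentation of thermospheric structure.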

  6. The Problems with Access to Compulsory Education in China and the Effects of the Policy of Direct Subsidies to Students: An Empirical Study Based on a Small Sample

    ERIC Educational Resources Information Center

    Yanqing, Ding

    2012-01-01

    After a brief review of the achievements and the problems in compulsory education enrollment in the thirty years since the reform and opening up, this study analyzes the current compulsory education enrollment and dropout rates in China's least-developed regions and the factors affecting school enrollment based on survey data from a small sample…

  7. Radiological survey of the Charleston Naval Base and Shipyard and the Charleston Naval Weapons Station

    SciTech Connect

    Smith, J.M.

    1987-07-01

    This report presents results of the survey conducted by Eastern Environmental Radiation Facility (EERF) personnel to assess levels of environmental radioactivity resulting from maintenance and operation of nuclear-powered warships at the Charleston Naval Base and Shipyard and the Charleston Naval Weapons Station, near Charleston, South Carolina. The purpose of the survey was to determine if operations related to nuclear powered warship activities resulted in release of radionuclides which may contribute to significant population exposure or contamination of the environment. 4 refs., 5 figs., 3 tabs.

  8. Experience base for Radioactive Waste Thermal Processing Systems: A preliminary survey

    SciTech Connect

    Mayberry, J.; Geimer, R.; Gillins, R.; Steverson, E.M.; Dalton, D. ); Anderson, G.L. )

    1992-04-01

    In the process of considering thermal technologies for potential treatment of the Idaho National Engineering Laboratory mixed transuranic-contaminated wastes, a preliminary survey of the experience base available from Radioactive Waste Thermal Processing Systems is reported. A list of known commercial radioactive waste facilities in the United States and some international thermal treatment facilities is provided. The survey focuses on US Department of Energy thermal treatment facilities. A brief facility description and a preliminary summary of facility status and problems experienced are provided for a selected subset of the DOE facilities.

  9. Patient-based surveying: a cost-effective approach for reaching large markets.

    PubMed

    Byer, S

    1995-01-01

    Member-based surveying is an important tool for managed care companies to discern newer and better ways to keep their current members satisfied, to develop products that will attract new members, and to gauge shifts in health consumer opinion. This article discusses a consumer-friendly and cost-effective method to survey members and the general public that has produced a very positive response for a modest investment. The response rate will likely improve over time as the method gains broader acceptance. PMID:10151597

  10. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    NASA Technical Reports Server (NTRS)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    1992-01-01

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  11. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    NASA Astrophysics Data System (ADS)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  12. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    SciTech Connect

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along more tightly spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating the underlying geology; (2) Use ground

  13. Institution-Specific Victimization Surveys: Addressing Legal and Practical Disincentives to Gender-Based Violence Reporting on College Campuses.

    PubMed

    Cantalupo, Nancy Chi

    2014-03-12

    This review brings together both the legal literature and original empirical research regarding the advisability of amending the Jeanne Clery Disclosure of Campus Security Policy and Campus Crime Statistics Act or creating new Department of Education regulations to mandate that all higher education institutions survey their students approximately every 5 years about students' experiences with sexual violence. Legal research conducted regarding the three relevant federal legal regimes show inconsistent incentives for schools to encourage victim reporting and proactively address sexual violence on campus. Moreover, the original research carried out for this article shows that the experience of institutions that have voluntarily conducted such surveys suggests many benefits not only for students, prospective students, parents, and the general public but also for schools themselves. These experiences confirm the practical viability of a mandated survey by the Department of Education. PMID:24626456

  14. 1995 Area 1 bird survey/Zone 1, Operable Unit 2, Robins Air Force Base, Georgia

    SciTech Connect

    Wade, M.C.

    1995-08-01

    Robins Air Force Base is located in Warner Robins, Georgia, approximately 90 miles southeast of Atlanta, Georgia. As part of the Baseline Investigation (CDM Federal 1994), a two-day bird survey was conducted by M. C. Wade (Oak Ridge National Laboratory) and B.A. Beatty (CDM Federal Programs) in May 1995. The subject area of investigation includes the sludge lagoon, Landfill No. 4, and the wetland area east of the landfill and west of Hannah Road (including two ponds). This is known as Area 1. The Area 1 wetlands include bottomland hardwood forest, stream, and pond habitats. The objectives of this survey were to document bird species using the Area 1 wetlands and to see if the change in hydrology (due to the installation of the Sewage Treatment Plant effluent diversion and stormwater runon control systems) has resulted in changes at Area 1 since the previous survey of May 1992 (CDM Federal 1994).

  15. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program applying the Rasch partial credit model to simulate 1000 patients’ true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
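A minimal illustration of the CAT idea in the abstract: administer the item whose difficulty is closest to the current ability estimate, then update the estimate after each response. This sketch uses a dichotomous Rasch model rather than the partial-credit model the study uses, and the item pool, learning rate, and 15-item stopping rule are invented.

```python
import math
import random

# Hedged CAT sketch under a dichotomous Rasch model (simplification of the
# partial-credit model). Item difficulties and parameters are synthetic.
def prob(theta, b):
    """Rasch probability of a correct/positive response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, difficulties, used):
    # Rasch item information is p*(1-p): maximal when difficulty ~ theta.
    candidates = (i for i in range(len(difficulties)) if i not in used)
    return max(candidates,
               key=lambda i: prob(theta, difficulties[i])
                             * (1 - prob(theta, difficulties[i])))

def update_theta(theta, b, response, lr=0.5):
    # One gradient step on the log-likelihood (a crude MLE stand-in).
    return theta + lr * (response - prob(theta, b))

random.seed(0)
difficulties = [random.gauss(0, 1) for _ in range(70)]   # 70-item pool
true_theta, theta, used = 1.0, 0.0, set()
for _ in range(15):                                      # ask 15 items, not 70
    i = next_item(theta, difficulties, used)
    used.add(i)
    response = 1 if random.random() < prob(true_theta, difficulties[i]) else 0
    theta = update_theta(theta, difficulties[i], response)
assert len(used) == 15
```

The efficiency claim in the abstract corresponds to the loop bound: the adaptive rule stops well short of the full 70-item pool while each administered item is maximally informative at the current estimate.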

  16. Creation of an Empirical Energy-Balance Based Snow Module Simulating Both Snowmelt and Snow Accumulation for Mountain Hydrology

    NASA Astrophysics Data System (ADS)

    Riboust, P.; Le Moine, N.; Thirel, G.; Ribstein, P.

    2015-12-01

    In Nordic and mountainous regions, hydrological processes are more complex than for regular rainfall-driven watersheds. Snow accumulates in winter, acting as a reservoir, and melts during late spring and summer. To take these additional natural processes into account, snow modules have been created to help rainfall-runoff models simulate river discharge. Many empirical degree-day snow models have been designed to simulate snowmelt and river discharge when coupled to a rainfall-runoff model, but few of them correctly simulate the amount of snow water equivalent (SWE) at the point scale. Correctly simulating not only the amount of snowmelt but also the water content of the snowpack has several potential advantages: it improves the model's reliability and performance for short-term and long-term prediction and for spatial regionalization, and it makes it possible to perform data assimilation using observed snow measurements. The objective of our study is to create a new simple empirical snow module, with a structure allowing the use of snow data for calibration or assimilation. We used a model structure close to the snow model defined by M.T. Walter (2005), where each of the processes of the energy balance is parameterized using only temperature and precipitation data. The conductive fluxes into the snowpack have been modeled using analytical solutions to the heat equation with phase change. This model is intermediate between the degree-day and the physical energy-balance approaches. It has the advantages of using only temperature and precipitation, which are widely available data, and of taking energy-balance processes into account without being computationally intensive. Another advantage is that all state variables of the model should be comparable with observable measurements. For the moment, the snow module has been parameterized at the point scale and has been tested over Switzerland and the US, using MeteoSwiss and SNOTEL USGS
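The trade-off the abstract describes can be seen in the simplest member of the family: a temperature-index (degree-day) module that tracks SWE from temperature and precipitation alone. This is far cruder than the hybrid energy-balance model in the abstract; the threshold temperature and melt factor below are illustrative values, not calibrated parameters.

```python
# Hedged sketch: a minimal degree-day snow module tracking snow water
# equivalent (SWE). t_thresh (rain/snow threshold, deg C) and ddf
# (degree-day melt factor, mm/deg C/day) are illustrative only.
def snow_module(precip, temp, t_thresh=0.0, ddf=3.0):
    """Return final SWE (mm) and daily melt series from daily precip/temp."""
    swe, melt_series = 0.0, []
    for p, t in zip(precip, temp):
        if t <= t_thresh:
            swe += p                                   # snowfall accumulates
            melt = 0.0
        else:
            melt = min(swe, ddf * (t - t_thresh))      # degree-day melt, capped
            swe -= melt
        melt_series.append(melt)
    return swe, melt_series

# Two cold days of snowfall, then two warm melt days.
swe, melt = snow_module(precip=[10, 5, 0, 0], temp=[-2, -1, 2, 5])
assert swe == 15 - sum(melt)   # mass balance: accumulation minus melt
```

Because the state variable here is SWE itself, a module of this shape can in principle be calibrated or updated against observed snow measurements, which is the design goal the abstract emphasizes.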

  17. NASS: an empirical approach to spike sorting with overlap resolution based on a hybrid noise-assisted methodology.

    PubMed

    Adamos, Dimitrios A; Laskaris, Nikolaos A; Kosmidis, Efstratios K; Theophilidis, George

    2010-06-30

    Background noise and spike overlap pose problems in contemporary spike-sorting strategies. We attempted to resolve both issues by introducing a hybrid scheme that combines the robust representation of spike waveforms, to facilitate the reliable identification of contributing neurons, with efficient data learning, to enable the precise decomposition of coactivations. The isometric feature mapping (ISOMAP) technique reveals the intrinsic data structure, helps with recognising the involved neurons and, simultaneously, identifies the overlaps. Exemplar activation patterns are first estimated for all detected neurons and consecutively used to build a synthetic database in which spike overlaps are systematically varied and realistic noise is added. An Extreme Learning Machine (ELM) is then trained with the ISOMAP representation of this database and learns to associate the synthesised waveforms with the corresponding source neurons. The trained ELM is finally applied to the actual overlaps from the experimental data, and this completes the entire spike-sorting process. Our approach is better characterised as a semi-supervised, noise-assisted strategy of an empirical nature. The user's engagement is restricted to recognising the number of active neurons from low-dimensional point-diagrams and to deciding about the complexity of overlaps. Efficiency is inherited from the incorporation of well-established algorithms. Moreover, robustness is guaranteed by adaptation to the actual noise properties of a given data set. The validity of our work has been verified via extensive experimentation, using realistically simulated data, under different levels of noise. PMID:20434486

  18. Theoretical modeling of stream potholes based upon empirical observations from the Orange River, Republic of South Africa

    NASA Astrophysics Data System (ADS)

    Springer, Gregory S.; Tooth, Stephen; Wohl, Ellen E.

    2006-12-01

    Potholes carved into streambeds can be important components of channel incision, but they have received little quantitative attention. Here, empirical evidence is presented from three sites along the Orange River, Republic of South Africa, demonstrating that the pothole dimensions of radius and depth are strongly correlated and well described by a simple power law. Where radius is the dependent variable, the exponent of the power law describes the rate of increase in radius with increasing depth. Erosion within potholes is complexly related to erosion on the adjacent bed. Erosion efficiencies within small, hemispherical potholes must be high if the potholes are to survive in the face of bed translation (incision). As potholes deepen, however, the necessary efficiencies decline rapidly. Increasing concavity associated with growth imposes stricter constraints; comparatively deep potholes must erode orders of magnitude larger volumes of substrate than shallower potholes in the face of bed retreat. Hemispherical potholes are eventually converted to cylindrical potholes, the geometries of which favor enlargement while they are small. Geometric models constructed using the power law show unambiguously that more substrate is eroded by volume from cylindrical pothole walls during growth than from cylindrical pothole floors. Grinders thus play a secondary role to suspended sediment entrained within the vortices that occur in potholes. Continued growth leads to coalescence with other potholes or destruction through block detachment depending on local geology. The combination of geology and erosion mechanisms may determine whether a strath or inner channel develops as a consequence of the process.
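
    A power law of this form, r = a·d^b, can be recovered from radius-depth data by linear regression in log-log space, since ln r = ln a + b·ln d. The measurements below are invented for illustration, not the Orange River data:

```python
import numpy as np

# Hypothetical pothole depths (m) following an exact power law r = a * d**b.
depth = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 4.0])
radius = 0.8 * depth ** 0.6

# Fit ln(r) = ln(a) + b * ln(d); the slope is the power-law exponent.
b_hat, log_a = np.polyfit(np.log(depth), np.log(radius), 1)
a_hat = np.exp(log_a)
```

With real field data the fit would of course carry scatter; the log-log slope is then the exponent the abstract describes.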

  19. Sci—Thur AM: YIS - 09: Validation of a General Empirically-Based Beam Model for kV X-ray Sources

    SciTech Connect

    Poirier, Y.; Sommerville, M.; Johnstone, C.D.; Gräfe, J.; Nygren, I.; Jacso, F.; Khan, R.; Villareal-Barajas, J.E.; Tambasco, M.

    2014-08-15

    Purpose: To present an empirically-based beam model for computing dose deposited by kilovoltage (kV) x-rays and validate it for radiographic, CT, CBCT, superficial, and orthovoltage kV sources. Method and Materials: We modeled a wide variety of imaging (radiographic, CT, CBCT) and therapeutic (superficial, orthovoltage) kV x-ray sources. The model characterizes spatial variations of the fluence and spectrum independently. The spectrum is derived by matching measured values of the half value layer (HVL) and nominal peak potential (kVp) to computationally-derived spectra while the fluence is derived from in-air relative dose measurements. This model relies only on empirical values and requires no knowledge of proprietary source specifications or other theoretical aspects of the kV x-ray source. To validate the model, we compared measured doses to values computed using our previously validated in-house kV dose computation software, kVDoseCalc. The dose was measured in homogeneous and anthropomorphic phantoms using ionization chambers and LiF thermoluminescent detectors (TLDs), respectively. Results: The maximum difference between measured and computed dose measurements was within 2.6%, 3.6%, 2.0%, 4.8%, and 4.0% for the modeled radiographic, CT, CBCT, superficial, and the orthovoltage sources, respectively. In the anthropomorphic phantom, the computed CBCT dose generally agreed with TLD measurements, with an average difference and standard deviation ranging from 2.4 ± 6.0% to 5.7 ± 10.3% depending on the imaging technique. Most (42/62) measured TLD doses were within 10% of computed values. Conclusions: The proposed model can be used to accurately characterize a wide variety of kV x-ray sources using only empirical values.

  20. Sources of Error in Substance Use Prevalence Surveys

    PubMed Central

    Johnson, Timothy P.

    2014-01-01

    Population-based estimates of substance use patterns have been regularly reported now for several decades. Concerns with the quality of the survey methodologies employed to produce those estimates date back almost as far. Those concerns have led to a considerable body of research specifically focused on understanding the nature and consequences of survey-based errors in substance use epidemiology. This paper reviews and summarizes that empirical research by organizing it within a total survey error model framework that considers multiple types of representation and measurement errors. Gaps in our knowledge of error sources in substance use surveys and areas needing future research are also identified.

  1. Development of an empirical model of a variable speed vapor injection compressor used in a Modelica-based dynamic model of a residential air source heat pump

    NASA Astrophysics Data System (ADS)

    Dechesne, Bertrand; Bertagnolio, Stephane; Lemort, Vincent

    2015-08-01

    This paper presents a steady-state model of a variable-speed vapour-injection scroll compressor. Two compressors were investigated. The developed empirical model is based on five dimensionless polynomials that were fitted using experimental data from a 2.7 kW scroll compressor. A second set of data was used to show the model's predictions for another device with a different swept volume. In both cases, the suction and injection mass flow rates were predicted with coefficients of determination of 99.9% and 94.3%, respectively, and the consumed power with 98.4% and 95.6%. A Modelica-based dynamic model is then presented, and the steady-state validation of the main component models is performed. Finally, the control of the cycle using two PID controllers is presented and discussed.
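
    As a sketch of the reported fit quality, the block below fits a quadratic to hypothetical compressor data and computes the coefficient of determination R²; the curve, its coefficients, and the residual term are assumptions for illustration, not the paper's five dimensionless polynomials:

```python
import numpy as np

# Hypothetical suction mass flow (kg/s) versus pressure ratio, with a small
# non-polynomial residual standing in for measurement scatter.
pressure_ratio = np.linspace(1.5, 4.0, 20)
mass_flow = (0.05 - 0.004 * pressure_ratio + 0.0002 * pressure_ratio**2
             + 1e-5 * np.sin(10 * pressure_ratio))

coeffs = np.polyfit(pressure_ratio, mass_flow, 2)    # quadratic empirical model
pred = np.polyval(coeffs, pressure_ratio)

ss_res = np.sum((mass_flow - pred) ** 2)             # residual sum of squares
ss_tot = np.sum((mass_flow - mass_flow.mean()) ** 2) # total sum of squares
r2 = 1.0 - ss_res / ss_tot                           # coefficient of determination
```

An R² of 99.9% as reported means the fitted polynomials leave only 0.1% of the measured variance unexplained.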

  2. Investigation of the empirical stellar library

    NASA Astrophysics Data System (ADS)

    Guo, Y. X.; Luo, A. L.

    During the large-sample survey of LAMOST, a massive set of stellar spectra has been obtained. Analysis of the stars' physical parameters, chemical compositions and kinematics can deepen our understanding of the structure and evolution of the Milky Way. Based on an investigation of the published libraries of stellar spectra (such as those available from CDS), here I give an overview of the current status of empirical stellar libraries. I classify the available data according to specific criteria, such as spectral coverage and resolution. After integrating these spectra, we will construct our own library of observed stellar spectra for LAMOST, which will serve as a reference for the classification and automatic parameter analysis of stars, as well as for studies of galaxy evolution.

  3. Empirical derivation of the reference region for computing diagnostic sensitive ¹⁸fluorodeoxyglucose ratios in Alzheimer's disease based on the ADNI sample.

    PubMed

    Rasmussen, Jerod M; Lakatos, Anita; van Erp, Theo G M; Kruggel, Frithjof; Keator, David B; Fallon, James T; Macciardi, Fabio; Potkin, Steven G

    2012-03-01

    Careful selection of the reference region for non-quantitative positron emission tomography (PET) analyses is critically important for Region of Interest (ROI) data analyses. We introduce an empirical method of deriving the most suitable reference region for computing neurodegeneration sensitive (18)fluorodeoxyglucose (FDG) PET ratios based on the dataset collected by the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. Candidate reference regions are selected based on a heat map of the difference in coefficients of variation (COVs) of FDG ratios over time for each of the Automatic Anatomical Labeling (AAL) atlas regions normalized by all other AAL regions. Visual inspection of the heat map suggests that the portion of the cerebellum and vermis superior to the horizontal fissure is the most sensitive reference region. Analyses of FDG ratio data show increases in significance on the order of ten-fold when using the superior portion of the cerebellum as compared with the traditionally used full cerebellum. The approach to reference region selection in this paper can be generalized to other radiopharmaceuticals and radioligands as well as to other disorders where brain changes over time are hypothesized and longitudinal data is available. Based on the empirical evidence presented in this study, we demonstrate the usefulness of the COV heat map method and conclude that intensity normalization based on the superior portion of the cerebellum may be most sensitive to measuring change when performing longitudinal analyses of FDG-PET ratios as well as group comparisons in Alzheimer's disease. This article is part of a Special Issue entitled: Imaging Brain Aging and Neurodegenerative disease. PMID:21958592
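
    The COV heat-map idea can be sketched as follows: normalize every region's uptake by each candidate reference region in turn, and score the candidate by the resulting coefficient of variation across repeat scans; the most stable candidate yields the lowest COV. The region/scan values below are simulated, not ADNI data:

```python
import numpy as np

# Simulated uptake values: 200 repeat scans x 5 brain regions, where
# region 0 is deliberately the most stable over time.
rng = np.random.default_rng(1)
n_scans, n_regions = 200, 5
uptake = rng.normal(10.0, 1.0, size=(n_scans, n_regions))
uptake[:, 0] = rng.normal(10.0, 0.05, size=n_scans)

# Score each candidate reference region by the mean coefficient of variation
# (std/mean) of all regions' ratios after normalizing by that candidate.
cov = np.zeros(n_regions)
for ref in range(n_regions):
    ratios = uptake / uptake[:, [ref]]
    cov[ref] = np.mean(ratios.std(axis=0) / ratios.mean(axis=0))

best_ref = int(np.argmin(cov))   # lowest COV -> most suitable reference region
```

In the paper the same comparison is run over all AAL region pairs and rendered as a heat map; here a single argmin over five simulated regions conveys the selection rule.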

  4. "Suntelligence" Survey

    MedlinePlus

    ... to the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure ... be able to view a ranking of major cities' suntelligence based on residents' responses to this survey. ...

  5. How Good Is Crude MDL for Solving the Bias-Variance Dilemma? An Empirical Investigation Based on Bayesian Networks

    PubMed Central

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way so as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike’s Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which need not necessarily be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we also should not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size. PMID:24671204

  6. How good is crude MDL for solving the bias-variance dilemma? An empirical investigation based on Bayesian networks.

    PubMed

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way so as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which need not necessarily be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we also should not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size. PMID:24671204
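
    Crude MDL scores a model by its data fit plus a complexity penalty, roughly −log-likelihood + (k/2)·ln n for k free parameters and n samples (the BIC-style form). The sketch below applies that score to polynomial regression as a simple stand-in for the paper's Bayesian networks, on simulated linear data:

```python
import numpy as np

# Simulated data: a linear trend with Gaussian noise.
rng = np.random.default_rng(2)
n = 200
x = np.linspace(0.0, 1.0, n)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=n)

def crude_mdl(degree):
    """BIC-style crude MDL: -log-likelihood + (k/2) * ln(n)."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma2 = resid.var()
    loglik = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)  # Gaussian fit
    k = degree + 2            # polynomial coefficients plus the noise variance
    return -loglik + 0.5 * k * np.log(n)

scores = {d: crude_mdl(d) for d in (1, 3, 7)}
best_degree = min(scores, key=scores.get)   # the balanced model wins
```

The higher-degree fits gain a little likelihood but pay more in the complexity term, so the score selects the simple model that matches the data-generating process.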

  7. Semivolatile Organic Compounds in Homes: Strategies for Efficient and Systematic Exposure Measurement Based on Empirical and Theoretical Factors

    PubMed Central

    2014-01-01

    Residential exposure can dominate total exposure for commercial chemicals of health concern; however, despite the importance of consumer exposures, methods for estimating household exposures remain limited. We collected house dust and indoor air samples in 49 California homes and analyzed for 76 semivolatile organic compounds (SVOCs)—phthalates, polybrominated diphenyl ethers (PBDEs), polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and pesticides. Sixty chemicals were detected in either dust or air and here we report 58 SVOCs detected in dust for the first time. In dust, phthalates (bis(2-ethylhexyl) phthalate, benzyl butyl phthalate, di-n-butyl phthalate) and flame retardants (PBDE 99, PBDE 47) were detected at the highest concentrations relative to other chemicals at the 95th percentile, while phthalates were highest at the median. Because SVOCs are found in both gas and condensed phases and redistribute from their original source over time, partitioning models can clarify their fate indoors. We use empirical data to validate air-dust partitioning models and use these results, combined with experience in SVOC exposure assessment, to recommend residential exposure measurement strategies. We can predict dust concentrations reasonably well from measured air concentrations (R2 = 0.80). Partitioning models and knowledge of chemical Koa elucidate exposure pathways and suggest priorities for chemical regulation. These findings also inform study design by allowing researchers to select sampling approaches optimized for their chemicals of interest and study goals. While surface wipes are commonly used in epidemiology studies because of ease of implementation, passive air sampling may be more standardized between homes and also relatively simple to deploy. Validation of passive air sampling methods for SVOCs is a priority. PMID:25488487

  8. Economic burden of multidrug-resistant bacteria in nursing homes in Germany: a cost analysis based on empirical data

    PubMed Central

    Huebner, Claudia; Roggelin, Marcus; Flessa, Steffen

    2016-01-01

    Objectives Infections and colonisations with multidrug-resistant organisms (MDROs) increasingly affect different types of healthcare facilities worldwide. So far, little is known about the additional costs attributable to MDROs outside hospitals. The aim of this study was to analyse the economic burden of multidrug-resistant bacteria in nursing homes in Germany. Setting The cost analysis is performed from a microeconomic perspective of the healthcare facilities. The study took place in six long-term care facilities in north-eastern Germany. Participants Data of 71 residents with a positive MDRO status were included. Primary and secondary outcome measures The study analysed MDRO surveillance data from 2011 to 2013. It was supplemented by an empirical analysis to determine the burden on staff capacity and materials consumption. Results 11 793 days with a positive multidrug-resistant pathogen diagnosis could be included in the analysis. On average, 11.8 (SD±6.3) MDRO cases occurred per nursing home. Mean duration per case was 163.3 days (SD±97.1). The annual MDRO-related costs varied between €2449.72 and €153 263.74 per nursing home, with an average of €12 682.23 per case. Main cost drivers were staff capacity (€43.95 per day and €7177.04 per case) and isolation materials (€24.70 per day and €4033.51 per case). Conclusions The importance of MDROs in nursing homes could be confirmed. MDRO-related cost data in this specific healthcare sector were collected for the first time. Knowledge about the burden of MDROs will make it possible to assess the efficiency of hygiene intervention measures in nursing homes in the future. PMID:26908511

  9. Development of EMC-based empirical model for estimating spatial distribution of pollutant loads and its application in rural areas of Korea.

    PubMed

    Yi, Qitao; Li, Hui; Lee, Jin-Woo; Kim, Youngchul

    2015-09-01

    An integrated approach to easily calculate pollutant loads from agricultural watersheds is suggested and verified in this research. The basic concept of this empirical tool was the assumption that variations in event mean concentrations (EMCs) of pollutants from a given agricultural watershed during rainstorms are attributable only to the rainfall pattern. Fifty-one sets of EMC values were obtained from nine different watersheds located in the rural areas of Korea, and these data were used to develop predictive tools for the EMCs in rainfall runoff. Statistical tests of these formulas show that they are fairly good at predicting actual EMC values for some parameters, and useful for calculating pollutant loads over any rainfall event time span, such as daily, weekly, monthly, or yearly. The model was further checked for its field applicability in a reservoir receiving stormwater after a cleanup of the sediments, covering 17 consecutive rainfall events from 1 July to 15 August 2007. Overall, the predicted values matched the observed values, indicating the feasibility of this empirical tool as a simple and useful solution for evaluating the temporal distribution of nonpoint source pollution loads from small rural watersheds of Korea. PMID:26354686
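
    The basic EMC bookkeeping behind such models is simple: an event's pollutant load is its event mean concentration times its runoff volume, and loads sum over any time span (daily, monthly, yearly). A minimal sketch with illustrative numbers, not data from the study:

```python
# Each storm event contributes load = EMC x runoff volume.
events = [
    {"emc_mg_per_L": 12.0, "runoff_m3": 500.0},
    {"emc_mg_per_L": 8.5,  "runoff_m3": 1200.0},
    {"emc_mg_per_L": 20.0, "runoff_m3": 300.0},
]

def event_load_kg(emc_mg_per_L, runoff_m3):
    # mg/L * m3 * (1000 L/m3) = mg; divide by 1e6 to get kg.
    return emc_mg_per_L * runoff_m3 * 1000.0 / 1e6

total_kg = sum(event_load_kg(e["emc_mg_per_L"], e["runoff_m3"]) for e in events)
```

The contribution of the predictive formulas in the paper is to supply the per-event EMC from the rainfall pattern when it has not been measured directly.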

  10. Comparing Web-based with Mail Survey Administration of the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) Clinician and Group Survey

    PubMed Central

    Bergeson, Steven C; Gray, Janiece; Ehrmantraut, Lynn A; Laibson, Tracy; Hays, Ron D

    2013-01-01

    Context The CAHPS® survey instruments are widely used to assess patient experiences with care but there is limited information about web-based data collection with them. Objective To compare web-based data collection with standard mail survey mode of collection of CAHPS® Clinician and Group survey data. Design, setting, and patients We randomized mode of data collection (web versus mail) of the CAHPS® Clinician and Group Survey to patients who had visited one of six clinics over a four-month period in Minnesota. A total of 410 patients responded to the web-based survey (14% response rate) and 982 patients responded to the mail survey (33% response rate). Main outcome measures Responses to CAHPS® survey dimensions and individual question responses, response rates, and participant characteristics. Results There were no significant differences in CAHPS® survey composites and individual question responses by mode, except for those addressing access. Those responding via the web reported less positive experiences with access to an appointment for urgent care as soon as needed, getting an appointment for routine care as soon as needed, getting answers to medical questions as soon as needed, and follow-up on test results (t’s=−3.64, −7.15, −2.58, −2.23; p’s=0.0003, <0.0001, 0.01, 0.03, respectively). Web respondents had more positive experiences about office wait time for the most recent visit (t = 2.32, p=0.021). Those who participated in the study tended to be older than those that did not (χ2=247.51, df=8, p<0.0001 for mail; χ2=4.56, df=8, p<0.0001 for the web). Females were significantly more likely than males to respond to the survey overall (24% vs. 18%, χ2=6.45, 1 df, p=0.011) and relatively more likely than males to respond to web (15% vs. 13%, χ2=1.32, 1 df, p=0.25) than mail (34% vs. 30%, χ2=5.42, 1 df, p=0.02). Mail respondents were more likely than web respondents to be male (28% versus 18%, χ2=16.27, 1 df, p<0.0001) and older (27% of

  11. A survey of individuals in US-based pharmaceutical industry HEOR departments: attitudes on policy topics.

    PubMed

    Neumann, Peter J; Saret, Cayla J

    2013-10-01

    We surveyed US-based leaders in health economics and outcomes research (HEOR) departments in drug and device companies to examine their views on the state of the field. We created a questionnaire that was emailed to 123 US-based senior HEOR professionals at 54 companies. Of the 123 recipients, 74 (60%) completed the survey. Most respondents (92%) expected their company's HEOR use to increase, and 80% reported that their organization's senior management viewed HEOR work as critical. Approximately 62% agreed that Academy of Managed Care Pharmacy (AMCP) dossiers are useful to US health plans, and 55% stated that Food and Drug Administration Modernization Act (FDAMA) Section 114 is useful. Approximately 49% believed the US government should use cost-effectiveness analysis in coverage and reimbursement decisions, but only 31% expected this to occur within 3 years. The findings suggest strong support for the function at senior management levels and optimism about the field. PMID:24138650

  12. The evidence-based practice readiness survey: a structural equation modeling approach for a Greek sample.

    PubMed

    Patelarou, Athina E; Dafermos, Vasilis; Brokalaki, Hero; Melas, Christos D; Koukia, Evmorfia

    2015-06-01

    The present study reports on the translation, cultural adaptation and validation of the Evidence-Based Practice Readiness Survey into the Greek language. A back-translation strategy for cross-cultural research was used to translate the questionnaire into Greek. The psychometric measurements that were performed included reliability coefficients and exploratory factor analysis using a Varimax Rotation and Principal Components Method. In a further step, confirmatory analysis of the principal components was conducted. The internal consistency of the Greek Evidence-Based Practice Readiness Survey version, as assessed by the Cronbach's alpha coefficient, showed satisfactory results; the value of alpha was found to be 0.85. The exploratory and confirmatory factor analyses demonstrated a four-factor structure of the tool. PMID:26057651
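
    The Cronbach's alpha reported above is computed as α = k/(k−1)·(1 − Σ item variances / variance of the total score). The sketch below uses simulated item responses driven by one shared latent trait, not the Greek sample:

```python
import numpy as np

# Simulated questionnaire: 100 respondents x 6 items sharing a latent trait,
# so the scale is internally consistent and alpha should come out high.
rng = np.random.default_rng(3)
n_resp, k = 100, 6
trait = rng.normal(size=(n_resp, 1))
items = trait + 0.5 * rng.normal(size=(n_resp, k))

sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
total_var = items.sum(axis=1).var(ddof=1)        # variance of the total score
alpha = (k / (k - 1)) * (1.0 - sum_item_var / total_var)
```

When items covary strongly (as here), the total-score variance dwarfs the summed item variances and alpha approaches 1; uncorrelated items would drive it toward 0.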

  13. Law of Empires.

    ERIC Educational Resources Information Center

    Martz, Carlton

    2001-01-01

    This issue of "Bill of Rights in Action" explores issues raised by empires and imperial law. The first article, "Clash of Empires: The Fight for North America," looks at the clash of empires and the fight for North America during the 18th century. The second article, "When Roman Law Ruled the Western World," examines Roman Law, which helped hold…

  14. A Survey of Partition-Based Techniques for Copy-Move Forgery Detection

    PubMed Central

    Nathalie Diane, Wandji Nanda; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques. PMID:25152931

  15. Natural language processing-based COTS software and related technologies survey.

    SciTech Connect

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  16. Involvement with child protective services: is this a useful question in population-based surveys?

    PubMed

    Hamilton, Hayley A; Boak, Angela; Mann, Robert E

    2013-09-01

    Direct questions on child maltreatment in population-based surveys are often limited by ethical and methodological issues. This restricts the ability of researchers to examine an important aspect of early adversity and its relationship to health and behavior. An alternative to excluding issues of maltreatment entirely from population-based surveys is to include questions on child and family involvement with child protective services (CPS). A school-based adolescent survey that included a question on child and family involvement with CPS yielded results that were generally consistent with other studies relating child maltreatment to health and behavioral outcomes such as psychological distress symptoms, delinquency, aspects of bullying, and health service utilization. Such findings suggest that questions on involvement with CPS may be a reasonable proxy for child maltreatment. Despite the lack of information on the reason for involvement or specific categories of maltreatment, CPS involvement questions highlight the shared familial experience that surrounds CPS involvement and serve as a general reflection of an adverse experience that can be utilized by researchers interested in early experiences. PMID:23838213

  17. Empirical examination of the indicator ‘pediatric gastroenteritis hospitalization rate’ based on administrative hospital data in Italy

    PubMed Central

    2014-01-01

    Background Awareness of the importance of strengthening investments in child health and monitoring the quality of services in the pediatric field is increasing. The Pediatric Quality Indicators developed by the US Agency for Healthcare Research and Quality (AHRQ) use hospital administrative data to identify admissions that could be avoided through high-quality outpatient care. Building on this approach, the purpose of this study is to perform an empirical examination of the ‘pediatric gastroenteritis admission rate’ indicator in Italy, under the assumption that lower admission rates are associated with better management at the primary care level and with overall better quality of care for children. Methods Following the AHRQ process for evaluating quality indicators, we examined age exclusion/inclusion criteria, selection of diagnostic codes, hospitalization type, and methodological issues for the ‘pediatric gastroenteritis admission rate’. The regional variability of hospitalizations was analyzed for Italian children aged 0–17 years discharged between January 1, 2009 and December 31, 2011. We considered hospitalizations for the following diagnoses: non-bacterial gastroenteritis, bacterial gastroenteritis and dehydration (along with a secondary diagnosis of gastroenteritis). The data source was the hospital discharge records database. All rates were stratified by age. Results In the study period, there were 61,130 pediatric hospitalizations for non-bacterial gastroenteritis, 5,940 for bacterial gastroenteritis, and 38,820 for dehydration. In the <1-year group, the relative risk of hospitalization for non-bacterial gastroenteritis was 24 times higher than in adolescents; it then dropped to 14.5 in 1- to 4-year-olds and to 3.2 in 5- to 9-year-olds. At the national level, the percentage of admissions for bacterial gastroenteritis was small compared with non-bacterial, while including admissions for dehydration revealed a significant variability in diagnostic

  18. A design of strategic alliance based on value chain of surveying and mapping enterprises in China

    NASA Astrophysics Data System (ADS)

    Duan, Hong; Huang, Xianfeng

    2007-06-01

    In this paper, we use value chain and strategic alliance theories to analyze the surveying and mapping industry and its enterprises in China. The value chain of surveying and mapping enterprises is highly interconnected but fragmented by administrative interference, and the enterprises are typically small in scale. Accordingly, we argue that establishing a non-equity-holding strategic alliance based on the value chain is a viable approach: it lets enterprises share superior resources across the different sectors of the whole value chain while avoiding conflict with the interests of the related administrative departments, so that surveying and mapping enterprises develop both individually and collectively. We then present a method for constructing the strategic alliance model by partitioning the value chain and exploiting the advantages of companies in different value chain sectors. Finally, we analyze the internal dynamics of the strategic alliance through game theory and show that it is a suitable way to advance the development of surveying and mapping enterprises.

  19. A Database Selection Expert System Based on Reference Librarian's Database Selection Strategy: A Usability and Empirical Evaluation.

    ERIC Educational Resources Information Center

    Ma, Wei

    2002-01-01

    Describes the development of a prototype Web-based database selection expert system at the University of Illinois at Urbana-Champaign that is based on reference librarians' database selection strategy which allows users to simultaneously search all available databases to identify those most relevant to their search using free-text keywords or…

  20. A Case for Increasing Empirical Attention to Head Start's Home-Based Program: An Exploration of Routine Collaborative Goal Setting

    ERIC Educational Resources Information Center

    Manz, Patricia H.; Lehtinen, Jaana; Bracaliello, Catherine

    2013-01-01

    Collaborative goal setting among home visitors and family members is a mandate for Head Start's home-based program. Yet, a dearth of research is available for advancing evidence-based practices for setting and monitoring home visiting goals or for understanding how family characteristics or program features are associated with them. With the…

  1. An Empirical Determination of the Intergalactic Background Light Using Near-Infrared Deep Galaxy Survey Data Out to 5 Micrometers and the Gamma-Ray Opacity of the Universe

    NASA Technical Reports Server (NTRS)

    Scully, Sean T.; Malkan, Matthew A.; Stecker, Floyd W.

    2014-01-01

    We extend our previous model-independent determination of the intergalactic background light, based purely on galaxy survey data, out to a wavelength of 5 micrometers. Our approach enables us to constrain the range of photon densities, based on the uncertainties from observationally determined luminosity densities and colors. We further determine a 68% confidence upper and lower limit on the opacity of the universe to gamma-rays up to energies of 1.6/(1 + z) teraelectron volts. A comparison of our lower-limit redshift-dependent opacity curves to the opacity limits derived from the results of both ground-based air Cherenkov telescope and Fermi-LAT observations of PKS 1424+240 allows us to place a new upper limit on the redshift of this source, independent of IBL modeling.

  2. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  3. Consultants’ Perceptions of School Counselors’ Ability to Implement an Empirically-Based Intervention for Adolescent Social Anxiety Disorder

    PubMed Central

    Warner, Carrie Masia; Brice, Chad; Esseling, Petra G.; Stewart, Catherine E.; Mufson, Laura; Herzig, Kathleen

    2013-01-01

    Social anxiety is highly prevalent but often goes untreated. Although school-based CBT programs are efficacious when delivered by specialized psychologists, it is unclear whether school counselors can implement these interventions effectively, which is essential to promote sustainable school programs. We present an initial consultation strategy to support school counselor implementation of group CBT for social anxiety and an evaluation of counselors’ treatment fidelity. Counselors were highly adherent to the treatment, but competence varied depending on the measure used. Counselors and consultants demonstrated good agreement on adherence, but relatively modest correspondence in competence ratings. We discuss future directions for school-based implementation efforts informed by these initial findings. PMID:23716144

  4. Cost savings of anti-TNF therapy using a test-based strategy versus an empirical dose escalation in Crohn's disease patients who lose response to infliximab

    PubMed Central

    Roblin, Xavier; Attar, Alain; Lamure, Michel; Savarieau, Bernard; Brunel, Pierre; Duru, Gérard; Peyrin-Biroulet, Laurent

    2015-01-01

    Background The use of pharmacokinetics is associated with cost savings in anti-tumor necrosis factor (anti-TNF) therapy, but the long-term cost savings in a large cohort of Crohn's disease (CD) patients are unknown. Aim The goal of this study was to compare the cost of anti-TNF therapy in two cohorts of CD patients losing response to infliximab, one using a test-based strategy and one an empirical dose escalation. Methods We used a mathematical model to describe the trajectories of CD patients based on a discrete event system. This design allowed us to track over a given period a double cohort of patients who moved randomly and asynchronously from one state to another, while keeping all the information on their entire trajectory. Both cohorts were modeled using state diagram parameters where transition probabilities from one state to another are derived from literature data. Costs were estimated based on the French health care system. Results Cost savings among the 10,000 CD patients using a test-based strategy were €131,300,293 at 5 years. At 5 years the mean cost saving was €13,130 per patient. The direct cost of the test had no impact on the results until the cost per test reached €2,000. Conclusions A test-based strategy leads to major cost savings related to anti-TNF therapy in CD. PMID:27123185
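
The two-arm cost comparison described above can be sketched as a toy cohort simulation. Every number below is an illustrative placeholder, not a value from the study or the literature it cites:

```python
import random

# Illustrative sketch of a two-strategy cohort cost comparison;
# all costs and probabilities are invented placeholders.
COST_STANDARD_DOSE = 20_000   # hypothetical yearly anti-TNF cost, standard dose (EUR)
COST_ESCALATED_DOSE = 40_000  # hypothetical yearly cost after dose escalation
COST_TEST = 500               # hypothetical pharmacokinetic test cost per year
P_FUTILE_ESCALATION = 0.4     # hypothetical share of patients for whom the
                              # test shows dose escalation would not help

def yearly_cost(strategy, rng):
    """Drug + test cost for one patient-year after loss of response."""
    if strategy == "empirical":
        return COST_ESCALATED_DOSE                 # always escalate the dose
    # test-based: escalate only when drug levels suggest it can help
    if rng.random() < P_FUTILE_ESCALATION:
        return COST_TEST + COST_STANDARD_DOSE      # e.g. switch drug at standard dose
    return COST_TEST + COST_ESCALATED_DOSE

def cohort_cost(strategy, n=10_000, years=5, seed=1):
    rng = random.Random(seed)
    return sum(yearly_cost(strategy, rng) for _ in range(n * years))

savings = cohort_cost("empirical") - cohort_cost("test-based")
print(f"5-year cost difference for 10,000 patients: EUR {savings:,}")
```

With these placeholder numbers, avoiding a fraction of futile dose escalations more than offsets the price of the assay, which is why such models stay cost-saving until the per-test price grows large.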

  5. Galactic Planet Population and Astrophysics with a Space-Based Microlensing Survey

    NASA Astrophysics Data System (ADS)

    Blandford, Roger

    Microlensing surveys have detected hundreds of events in the direction of the Galactic bulge, providing valuable information on the population of faint stars otherwise undetectable in our Galaxy. Within this sample of events approximately 20 planets have been uncovered, with about half of them being either unbound or at large separation (greater than 100 AU) from their host star. The majority of the current microlensing planetary detections have come from rare, high-magnification events which are alerted and immediately followed up with high-cadence ground-based observations from a network of collaborations around the world. These results have established that microlensing is an effective technique that can be used to distinguish planets from their host star, in particular planets at large separation, a regime in which the radial velocity and transiting methods for planet detection lose sensitivity. The next step in studies of Galactic microlensing involves the development of space-based surveys, which provide better angular resolution and the ability to resolve more faint stars. In anticipation of proposed space-based microlensing surveys such as the WFIRST satellite, this research will develop theoretical tools to understand and interpret future large samples of Galactic microlensing observations. We will study how to optimize a space-based microlensing survey to obtain the maximal scientific output for the costs available. As part of this theoretical research we plan to develop a fast and efficient numerical code that can be distributed to the larger community, incorporating modern aspects of Galactic astrophysics into microlensing theory. The simulations will include effects that will become important for space-based surveys, such as the finite size of main sequence source stars and understanding the microlensing signals from multiple planets.
A major output of our analysis will be a quantification of the planetary detection efficiency over the entire range of planet

  6. Historic Building Information Modelling - Adding intelligence to laser and image based surveys of European classical architecture

    NASA Astrophysics Data System (ADS)

    Murphy, Maurice; McGovern, Eugene; Pavia, Sara

    2013-02-01

    Historic Building Information Modelling (HBIM) is a novel prototype library of parametric objects, based on historic architectural data and a system of cross platform programmes for mapping parametric objects onto point cloud and image survey data. The HBIM process begins with remote collection of survey data using a terrestrial laser scanner combined with digital photo modelling. The next stage involves the design and construction of a parametric library of objects, which are based on the manuscripts ranging from Vitruvius to 18th century architectural pattern books. In building parametric objects, the problem of file format and exchange of data has been overcome within the BIM ArchiCAD software platform by using geometric descriptive language (GDL). The plotting of parametric objects onto the laser scan surveys as building components to create or form the entire building is the final stage in the reverse engineering process. The final HBIM product is the creation of full 3D models including detail behind the object's surface concerning its methods of construction and material make-up. The resultant HBIM can automatically create cut sections, details and schedules in addition to the orthographic projections and 3D models (wire frame or textured) for both the analysis and conservation of historic objects, structures and environments.

  7. Historic Building Information Modelling - Adding Intelligence to Laser and Image Based Surveys

    NASA Astrophysics Data System (ADS)

    Murphy, M.; McGovern, E.; Pavia, S.

    2011-09-01

    Historic Building Information Modelling (HBIM) is a novel prototype library of parametric objects based on historic data and a system of cross platform programmes for mapping parametric objects onto a point cloud and image survey data. The HBIM process begins with remote collection of survey data using a terrestrial laser scanner combined with digital photo modelling. The next stage involves the design and construction of a parametric library of objects, which are based on the manuscripts ranging from Vitruvius to 18th century architectural pattern books. In building parametric objects, the problem of file format and exchange of data has been overcome within the BIM ArchiCAD software platform by using geometric descriptive language (GDL). The plotting of parametric objects onto the laser scan surveys as building components to create or form the entire building is the final stage in the reverse engineering process. The final HBIM product is the creation of full 3D models including detail behind the object's surface concerning its methods of construction and material make-up. The resultant HBIM can automatically create cut sections, details and schedules in addition to the orthographic projections and 3D models (wire frame or textured).

  8. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year ____ : ____ %. In accordance with the provisions of 23 CFR 1240.12(c)(2), I hereby certify as follows: 1. The seat... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to...

  9. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year ____ : ____ %. In accordance with the provisions of 23 CFR 1240.12(c)(2), I hereby certify as follows: 1. The seat... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to...

  10. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year ____ : ____ %. In accordance with the provisions of 23 CFR 1240.12(c)(2), I hereby certify as follows: 1. The seat... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to...

  11. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year ____ : ____ %. In accordance with the provisions of 23 CFR 1240.12(c)(2), I hereby certify as follows: 1. The seat... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to...

  12. Wastewater characterization survey, Thule Air Base, Greenland. Final report, 6-22 July 1992

    SciTech Connect

    McCoy, R.P.

    1993-03-01

    A wastewater characterization survey was conducted at Thule AB, Greenland, from 6-22 July 1992 by personnel from the Water Quality Function of Armstrong Laboratory. Extensive sampling of the outfall feeding into North Star Bay and industrial sites within the base cantonment area was performed. Low levels of organic contamination were found in the wastewater. In addition, metals and other pollutants were found in concentrations typical of domestic wastewater, indicating little contamination from industrial activities. Daily flow averaged approximately 100,000 gallons per day (375 cubic meters per day). The average chemical oxygen demand was found to be 130 milligrams per liter. The biochemical oxygen demand (BOD), though not determined experimentally during this survey, can be expected to fall within the range of 75-100 mg/l. Keywords: Wastewater characterization, Chemical oxygen demand, Pollutants, Acute marine toxicity, Chronic marine toxicity.

  13. National surveys of radiofrequency field strengths from radio base stations in Africa

    PubMed Central

    Joyner, Ken H.; Van Wyk, Marthinus J.; Rowley, Jack T.

    2014-01-01

    The authors analysed almost 260 000 measurement points from surveys of radiofrequency (RF) field strengths near radio base stations in seven African countries over two time frames from 2001 to 2003 and 2006 to 2012. The results of the national surveys were compared, chronological trends investigated and potential exposures compared by technology and with frequency modulation (FM) radio. The key findings from these data are that irrespective of country, year and mobile technology, RF fields at ground level were only a small fraction of the international human RF exposure recommendations. Importantly, there has been no significant increase in typical measured levels since the introduction of 3G services. The mean levels in these African countries are similar to the reported levels for countries of Asia, Europe and North America using similar mobile technologies. The median level for the FM services in South Africa was comparable to the individual but generally lower than the combined mobile services. PMID:24044904
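
As a rough illustration of how a measurement is expressed as "a small fraction of the recommendations", the sketch below applies the ICNIRP (1998) general-public reference level for electric field strength, E_ref = 1.375·√f V/m in the 400-2000 MHz range; the measured field values are invented, not taken from the surveys:

```python
import math

# Sketch: express a measured field strength as a fraction of the ICNIRP
# (1998) general-public reference level, E_ref = 1.375 * sqrt(f) V/m,
# valid for 400-2000 MHz. The measured values below are invented examples.
def icnirp_public_e_ref(f_mhz):
    """Reference electric field strength (V/m) for the general public."""
    if not 400 <= f_mhz <= 2000:
        raise ValueError("this formula applies only between 400 and 2000 MHz")
    return 1.375 * math.sqrt(f_mhz)

measurements = {"GSM900": (0.5, 900.0), "GSM1800": (0.3, 1800.0)}  # (V/m, MHz)
for service, (e_measured, f) in measurements.items():
    fraction = e_measured / icnirp_public_e_ref(f)
    print(f"{service}: {100 * fraction:.2f}% of the reference level")
```

Even a field of 0.5 V/m at 900 MHz comes out at only about 1% of the reference level, consistent with the "small fraction" the survey reports.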

  14. A Parameter Identification Method for Helicopter Noise Source Identification and Physics-Based Semi-Empirical Modeling

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric, II; Schmitz, Fredric H.

    2010-01-01

    A new physics-based parameter identification method for rotor harmonic noise sources is developed using an acoustic inverse simulation technique. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. This new method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor Blade-Vortex Interaction (BVI) noise, allowing accurate estimates of BVI noise to be made for operating conditions based on a small number of measurements taken at different operating conditions.

  15. PM3 semi-empirical IR spectra simulations for metal complexes of Schiff bases of sulfa drugs

    NASA Astrophysics Data System (ADS)

    Topacli, C.; Topacli, A.

    2003-06-01

    The molecular structures and infrared spectra of Co, Ni, Cu and Zn complexes of two Schiff base ligands, viz. N-(o-vanillinidene)sulfanilamide (oVSaH) and N-(o-vanillinidene)sulfamerazine (oVSmrzH), are studied in detail by the PM3 method. It has been shown that the proposed structures for the compounds, derived from microanalytical, magnetic and various spectral data, were consistent with the IR spectra simulated by the PM3 method. Coordination effects on ν(CN) and ν(C-O) modes in the Schiff base ligands are in close agreement with the observed results.

  16. Survey Says

    ERIC Educational Resources Information Center

    McCarthy, Susan K.

    2005-01-01

    Survey Says is a lesson plan designed to teach college students how to access Internet resources for valid data related to the sexual health of young people. Discussion questions based on the most recent available data from two national surveys, the Youth Risk Behavior Surveillance-United States, 2003 (CDC, 2004) and the National Survey of…

  17. The National Nursing Assistant Survey: Improving the Evidence Base for Policy Initiatives to Strengthen the Certified Nursing Assistant Workforce

    ERIC Educational Resources Information Center

    Squillace, Marie R.; Remsburg, Robin E.; Harris-Kojetin, Lauren D.; Bercovitz, Anita; Rosenoff, Emily; Han, Beth

    2009-01-01

    Purpose: This study introduces the first National Nursing Assistant Survey (NNAS), a major advance in the data available about certified nursing assistants (CNAs) and a rich resource for evidence-based policy, practice, and applied research initiatives. We highlight potential uses of this new survey using select population estimates as examples of…

  18. A Usability Survey of a Contents-Based Video Retrieval System by Combining Digital Video and an Electronic Bulletin Board

    ERIC Educational Resources Information Center

    Haga, Hirohide; Kaneda, Shigeo

    2005-01-01

    This article describes the survey of the usability of a novel content-based video retrieval system. This system combines video streaming and an electronic bulletin board system (BBS). Comments submitted to the BBS are used to index video data. Following the development of the prototype system an experimental survey with ten subjects was performed.…

  19. Response to Intervention: Empirically Based Special Service Decisions From Single-Case Designs of Increasing and Decreasing Intensity

    ERIC Educational Resources Information Center

    Barnett, David W.; Daly, Edward, J.; Jones, Kevin M.; Lentz, Edward F.

    2004-01-01

    There have been several proposals to the effect that special service decisions could be based, at least in part, on the construct of response to intervention and not necessarily on traditional child measures. Single-case designs that focus on intervention response and intensity are reviewed as a potential evaluation framework for interdisciplinary…

  20. A Cross-Cultural Usability Study on the Internationalization of User Interfaces Based on an Empirical Five Factor Model

    ERIC Educational Resources Information Center

    Chakraborty, Joyram

    2009-01-01

    With the internationalization of e-commerce, it is no longer viable to design one user interface for all environments. Web-based applications and services can be accessed from all over the globe. To account for this globalization process, software developers need to understand that simply accounting for language translation of their websites for…

  1. Analyzing Interactions by an IIS-Map-Based Method in Face-to-Face Collaborative Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Yang, Kaicheng; Huang, Ronghuai

    2012-01-01

    This study proposes a new method named the IIS-map-based method for analyzing interactions in face-to-face collaborative learning settings. This analysis method is conducted in three steps: firstly, drawing an initial IIS-map according to collaborative tasks; secondly, coding and segmenting information flows into information items of IIS; thirdly,…

  2. Using Empirical Data to Clarify the Meaning of Various Prescriptions for Designing a Web-Based Course

    ERIC Educational Resources Information Center

    Boulet, Marie-Michele

    2004-01-01

    Design prescriptions to create web-based courses and sites that are dynamic, easy-to-use, interactive and data-driven, emerge from a "how to do it" approach. Unfortunately, the theory behind these methods, prescriptions, procedures or tools, is rarely provided and the important terms, such as "easy-to-use", to which these prescriptions refer are…

  3. Examining the Potential of Web-Based Multimedia to Support Complex Fine Motor Skill Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Papastergiou, Marina; Pollatou, Elisana; Theofylaktou, Ioannis; Karadimou, Konstantina

    2014-01-01

    Research on the utilization of the Web for complex fine motor skill learning that involves whole body movements is still scarce. The aim of this study was to evaluate the impact of the introduction of a multimedia web-based learning environment, which was targeted at a rhythmic gymnastics routine consisting of eight fine motor skills, into an…

  4. Dynamic Interaction: A Measurement Development and Empirical Evaluation of Knowledge Based Systems and Web 2.0 Decision Support Mashups

    ERIC Educational Resources Information Center

    Beemer, Brandon Alan

    2010-01-01

    The research presented in this dissertation focuses on the organizational and consumer need for knowledge based support in unstructured domains, by developing a measurement scale for dynamic interaction. Addressing this need is approached and evaluated from two different perspectives. The first approach is the development of Knowledge Based…

  5. Comparison of Expert-Based and Empirical Evaluation Methodologies in the Case of a CBL Environment: The ''Orestis'' Experience

    ERIC Educational Resources Information Center

    Karoulis, Athanasis; Demetriadis, Stavros; Pombortsis, Andreas

    2006-01-01

    This paper compares several interface evaluation methods applied in the case of a computer based learning (CBL) environment, during a longitudinal study performed in three European countries, Greece, Germany, and Holland, and within the framework of an EC funded Leonardo da Vinci program. The paper firstly considers the particularities of the CBL…

  6. What Do Stroke Patients Look for in Game-Based Rehabilitation: A Survey Study.

    PubMed

    Hung, Ya-Xuan; Huang, Pei-Chen; Chen, Kuan-Ta; Chu, Woei-Chyn

    2016-03-01

    Stroke is one of the most common causes of physical disability, and early, intensive, and repetitive rehabilitation exercises are crucial to the recovery of stroke survivors. Unfortunately, research shows that only one third of stroke patients actually perform recommended exercises at home, because of the repetitive and mundane nature of conventional rehabilitation exercises. Thus, motivating stroke survivors to engage in monotonous rehabilitation is a significant issue in the therapy process. Game-based rehabilitation systems have the potential to encourage patients to continue rehabilitation exercises at home. However, these systems are still rarely adopted in patients' homes. Discovering and eliminating the obstacles in promoting game-based rehabilitation at home is therefore essential. For this purpose, we conducted a study to collect and analyze the opinions and expectations of stroke patients and clinical therapists. The study is composed of 2 parts: Rehab-preference survey - interviews with both patients and therapists to understand the current practices, challenges, and expectations on game-based rehabilitation systems; and Rehab-compatibility survey - a gaming experiment with therapists to determine which commercial games are compatible with rehabilitation. The study is conducted with 30 outpatients with stroke and 19 occupational therapists from 2 rehabilitation centers in Taiwan. Our surveys show that game-based rehabilitation systems can make rehabilitation exercises more appealing and provide personalized motivation for various stroke patients. Patients prefer to perform rehabilitation exercises with more diverse and fun games, and need cost-effective rehabilitation systems, which are often built on commodity hardware. Our study also sheds light on incorporating existing design-for-fun games into rehabilitation systems. We envision the results are helpful in developing a platform which enables rehab-compatible (i.e., existing, appropriately

  7. Subjectivity of LiDAR-Based Offset Measurements: Results from a Public Online Survey

    NASA Astrophysics Data System (ADS)

    Salisbury, J. B.; Arrowsmith, R.; Rockwell, T. K.; Haddad, D. E.; Zielke, O.; Madden, C.

    2012-12-01

    Geomorphic features (e.g., stream channels) that are offset in an earthquake can be measured to determine slip at that location. Analysis of these and other offset features can provide useful information for generating fault slip distributions. Remote analyses of active fault zones using high-resolution LiDAR data have recently been pursued in several studies, but there is a lack of consistency between users both for data analysis and results reporting. Individual investigators typically make offset measurements in a particular study area with their own protocols for measurement, assessing uncertainty, and quality rating, yet there is no coherent understanding of the reliability and repeatability of the measurements from observer to observer. We invited the participation of colleagues, interested geoscience communities, and the general public to measure ten geomorphic offsets from active faults in western North America using remote measurement methods that span a range of complexity (e.g., paper image and scale, the Google Earth ruler tool, and a MATLAB GUI for calculating backslip required to properly restore tectonic deformation) to explore the subjectivity involved with measuring geomorphic offsets. We provided a semi-quantitative quality-rating rubric for a description of offset quality, but there was a general lack of quality rating/offset uncertainty reporting. Survey responses (including mapped fault traces and piercing lines) were anonymously submitted along with user experience information. We received 11 paper-, 28 Google Earth-, and 16 MATLAB-based survey responses, though not all individuals measured every feature provided. For all survey methods, the majority of responses are in close agreement. However, large discrepancies arise where users interpret landforms differently, specifically the pre-earthquake morphologies and total offset accumulation of geomorphic features. Experienced users make more consistent measurements, whereas beginners less

  8. Development and validation of a web-based questionnaire for surveying the health and working conditions of high-performance marine craft populations

    PubMed Central

    de Alwis, Manudul Pahansen; Lo Martire, Riccardo; Äng, Björn O; Garme, Karl

    2016-01-01

    Background High-performance marine craft crews are susceptible to various adverse health conditions caused by multiple interactive factors. However, there are limited epidemiological data available for assessment of working conditions at sea. Although questionnaire surveys are widely used for identifying exposures, outcomes and associated risks with high accuracy levels, until now, no validated epidemiological tool exists for surveying occupational health and performance in these populations. Aim To develop and validate a web-based questionnaire for epidemiological assessment of occupational and individual risk exposure pertinent to the musculoskeletal health conditions and performance in high-performance marine craft populations. Method A questionnaire for investigating the association between work-related exposure, performance and health was initially developed by a consensus panel under four subdomains, viz. demography, lifestyle, work exposure and health and systematically validated by expert raters for content relevance and simplicity in three consecutive stages, each iteratively followed by a consensus panel revision. The item content validity index (I-CVI) was determined as the proportion of experts giving a rating of 3 or 4. The scale content validity index (S-CVI/Ave) was computed by averaging the I-CVIs for the assessment of the questionnaire as a tool. Finally, the questionnaire was pilot tested. Results The S-CVI/Ave increased from 0.89 to 0.96 for relevance and from 0.76 to 0.94 for simplicity, resulting in 36 items in the final questionnaire. The pilot test confirmed the feasibility of the questionnaire. Conclusions The present study shows that the web-based questionnaire fulfils previously published validity acceptance criteria and is therefore considered valid and feasible for the empirical surveying of epidemiological aspects among high-performance marine craft crews and similar populations. PMID:27324717
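
The two validity indices named in the abstract are simple proportions; a minimal sketch, with an invented rating matrix (one row per questionnaire item, one column per expert rater, on the usual 1-4 relevance scale):

```python
# Sketch of the content-validity indices described in the abstract.
# The rating matrix is invented for illustration only.
ratings = [
    [4, 3, 4, 4],  # item 1: ratings from 4 expert raters
    [3, 4, 2, 4],  # item 2
    [4, 4, 4, 3],  # item 3
]

def i_cvi(item_ratings):
    """I-CVI: proportion of experts giving the item a rating of 3 or 4."""
    return sum(r >= 3 for r in item_ratings) / len(item_ratings)

def s_cvi_ave(matrix):
    """S-CVI/Ave: the mean of the per-item I-CVIs."""
    return sum(i_cvi(row) for row in matrix) / len(matrix)

print([i_cvi(row) for row in ratings])   # -> [1.0, 0.75, 1.0]
print(round(s_cvi_ave(ratings), 2))      # -> 0.92
```

The S-CVI/Ave values of 0.96 (relevance) and 0.94 (simplicity) reported above are exactly this kind of average over all 36 retained items.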

  9. The Inter-annual Variability of Controlling Parameter of Catchment Water Balance and Its Semi-empirical Formula Based on the Budyko Hypotheses

    NASA Astrophysics Data System (ADS)

    Ning, T.; Liu, W.; Han, X.

    2015-12-01

    The long-term average of the controlling parameter of catchment water balance has been widely reported; however, its inter-annual variability has rarely been quantified. Besides precipitation (P) and potential evaporation (ET0), the surface condition and the seasonality of climate have great impacts on the inter-annual variability of catchment water balance, which can be reflected by the parameter w (in terms of Fu's equation). Two watersheds on the Loess Plateau were thus chosen to quantify these relationships. To diminish the impacts of catchment water storage on water balance, the annual water balance was first estimated for each water year from 1981 to 2012. Then, the annual maximum vegetation coverage (M), based on NDVI, and the variation coefficient (σ) of the daily wetness index (P/ET0) were used to represent the surface condition and the seasonal variation of the coupled water and energy supply, respectively, and their relationships with w were examined. Results showed that w correlated well with M and σ, and a semi-empirical formula was developed to calculate the key parameter w on the annual scale: w = 1 + 5.99 × M^1.01 × exp(-0.072σ), with R^2 = 0.60. The equation was further validated in other watersheds on the Loess Plateau and proved superior in estimating actual evaporation (ET). Finally, Fu's equation and the semi-empirical formula for w were combined to quantify the contributions of changes in climate (P, ET0 and σ) and surface condition (M) to ET variations. Results showed that σ and M accounted for 5.8% and -3.2% of the ET decrease for the period 1981-1995, respectively; during 1996-2012, the contribution of σ to ET changes decreased while that of M increased by 18.9%, indicating that the impacts of surface condition on catchment water balance were strengthened.
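
A minimal sketch of how the fitted formula for w plugs into Fu's form of the Budyko curve, ET/P = 1 + ET0/P − (1 + (ET0/P)^w)^(1/w); the P, ET0, M and σ inputs below are illustrative, not data from the study:

```python
import math

# Fu's form of the Budyko curve plus the abstract's fitted annual-scale
# formula for w. Input values are hypothetical.
def fu_actual_evaporation(P, ET0, w):
    """Actual ET (mm) from Fu's equation: ET/P = 1 + phi - (1 + phi^w)^(1/w)."""
    phi = ET0 / P                       # aridity index
    return P * (1.0 + phi - (1.0 + phi ** w) ** (1.0 / w))

def w_annual(M, sigma):
    """w = 1 + 5.99 * M^1.01 * exp(-0.072 * sigma), R^2 = 0.60 (abstract)."""
    return 1.0 + 5.99 * M ** 1.01 * math.exp(-0.072 * sigma)

P, ET0 = 500.0, 1000.0                  # hypothetical annual totals, mm
w = w_annual(M=0.4, sigma=2.0)          # hypothetical coverage and seasonality
ET = fu_actual_evaporation(P, ET0, w)
print(f"w = {w:.2f}, ET = {ET:.0f} mm (bounded above by P = {P:.0f} mm)")
```

Note that the resulting ET always stays below both the water supply P and the energy demand ET0, which is the defining property of the Budyko framework.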

  10. An empirical study of ensemble-based semi-supervised learning approaches for imbalanced splice site datasets

    PubMed Central

    2015-01-01

    Background Recent biochemical advances have led to inexpensive, time-efficient production of massive volumes of raw genomic data. Traditional machine learning approaches to genome annotation typically rely on large amounts of labeled data. The process of labeling data can be expensive, as it requires domain knowledge and expert involvement. Semi-supervised learning approaches that can make use of unlabeled data, in addition to small amounts of labeled data, can help reduce the costs associated with labeling. In this context, we focus on the problem of predicting splice sites in a genome using semi-supervised learning approaches. This is a challenging problem, due to the highly imbalanced distribution of the data, i.e., small number of splice sites as compared to the number of non-splice sites. To address this challenge, we propose to use ensembles of semi-supervised classifiers, specifically self-training and co-training classifiers. Results Our experiments on five highly imbalanced splice site datasets, with positive to negative ratios of 1-to-99, showed that the ensemble-based semi-supervised approaches represent a good choice, even when the amount of labeled data consists of less than 1% of all training data. In particular, we found that ensembles of co-training and self-training classifiers that dynamically balance the set of labeled instances during the semi-supervised iterations show improvements over the corresponding supervised ensemble baselines. Conclusions In the presence of limited amounts of labeled data, ensemble-based semi-supervised approaches can successfully leverage the unlabeled data to enhance supervised ensembles learned from highly imbalanced data distributions. Given that such distributions are common for many biological sequence classification problems, our work can be seen as a stepping stone towards more sophisticated ensemble-based approaches to biological sequence annotation in a semi-supervised framework. PMID:26356316
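
A single-classifier self-training loop, the basic ingredient of the ensembles above, can be sketched on synthetic data. This is deliberately simplified (a nearest-centroid model on 1-D points, with no co-training and no dynamic class balancing), and every number is invented:

```python
import random

# Minimal self-training sketch: start from a tiny labeled seed, then
# repeatedly add the most confidently predicted unlabeled points to the
# training set and retrain. Synthetic, well-separated 1-D data.
random.seed(0)
pos = [random.gauss(2.0, 0.5) for _ in range(100)]    # class 1
neg = [random.gauss(-2.0, 0.5) for _ in range(100)]   # class 0
labeled = [(x, 1) for x in pos[:3]] + [(x, 0) for x in neg[:3]]
pool = pos[3:] + neg[3:]                              # labels hidden
random.shuffle(pool)

def centroids(data):
    ones = [x for x, y in data if y == 1]
    zeros = [x for x, y in data if y == 0]
    return sum(zeros) / len(zeros), sum(ones) / len(ones)

def margin(x, c0, c1):
    """Positive when x is closer to the class-1 centroid."""
    return abs(x - c0) - abs(x - c1)

for _ in range(15):                                   # self-training iterations
    c0, c1 = centroids(labeled)
    pool.sort(key=lambda x: abs(margin(x, c0, c1)))   # least confident first
    confident, pool = pool[-10:], pool[:-10]          # take the 10 most confident
    labeled += [(x, int(margin(x, c0, c1) > 0)) for x in confident]

c0, c1 = centroids(labeled)
correct = sum(margin(x, c0, c1) > 0 for x in pos) + \
          sum(margin(x, c0, c1) <= 0 for x in neg)
acc = correct / 200
print(f"accuracy after self-training: {acc:.2f}")
```

The paper's approach wraps loops like this into ensembles (self-training and co-training) and rebalances the labeled set at each iteration, which matters when the classes are as imbalanced as 1-to-99 rather than balanced as here.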

  11. Cleaning sky survey data bases using Hough transform and renewal string approaches

    NASA Astrophysics Data System (ADS)

    Storkey, A. J.; Hambly, N. C.; Williams, C. K. I.; Mann, R. G.

    2004-01-01

    Large astronomical data bases obtained from sky surveys such as the SuperCOSMOS Sky Survey (SSS) invariably suffer from spurious records coming from the artefactual effects of the telescope, satellites and junk objects in orbit around the Earth and physical defects on the photographic plate or CCD. Though relatively small in number, these spurious records present a significant problem in many situations, where they can become a large proportion of the records potentially of interest to a given astronomer. Accurate and robust techniques are needed for locating and flagging such spurious objects, and we are undertaking a programme investigating the use of machine learning techniques in this context. In this paper we focus on the four most common causes of unwanted records in the SSS: satellite or aeroplane tracks, scratches, fibres and other linear phenomena introduced to the plate, circular haloes around bright stars due to internal reflections within the telescope and diffraction spikes near to bright stars. Appropriate techniques are developed for the detection of each of these. The methods are applied to the SSS data to develop a data set of spurious object detections, along with confidence measures, which can allow these unwanted data to be removed from consideration. These methods are general and can be adapted to other astronomical survey data.
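
The Hough-transform step for the linear artefacts (tracks, scratches, fibres) can be illustrated in a few lines: each detection votes for every candidate line (ρ, θ) passing through it, so collinear detections pile their votes into one accumulator cell. The toy track and unrelated "star" positions below are synthetic:

```python
import math

# Toy Hough transform for linear-artefact detection: a line is
# parameterized as rho = x*cos(theta) + y*sin(theta).
detections = [(i, i) for i in range(20)]        # a fake satellite track at 45 deg
detections += [(3, 15), (12, 4), (7, 18)]       # unrelated point sources

thetas = [math.radians(t) for t in range(0, 180, 5)]
votes = {}                                      # (rho_bin, theta_index) -> count
for x, y in detections:
    for ti, theta in enumerate(thetas):
        rho = x * math.cos(theta) + y * math.sin(theta)
        key = (round(rho), ti)                  # crude 1-unit rho binning
        votes[key] = votes.get(key, 0) + 1

(rho_bin, ti), peak = max(votes.items(), key=lambda kv: kv[1])
print(f"peak: rho = {rho_bin}, theta = {math.degrees(thetas[ti]):.0f} deg, "
      f"{peak} of {len(detections)} detections")
```

All 20 track points fall into a single accumulator cell while the isolated sources scatter, which is why a simple peak threshold separates tracks from real objects.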

  12. Population-based sexual behavior surveys in China: Liuzhou compared with other prefectural cities

    PubMed Central

    Yingying, Huang; Abler, Laurie; Suiming, Pan; Henderson, Gail E.; Xin, Wang; Xingliang, Yao; Parish, William L.

    2013-01-01

    Sexual behaviors in China are rapidly changing; simultaneously, STI/HIV prevalence is increasing in the general population. To investigate these major shifts, we examined sexual behaviors and self-reported sexually transmitted infections (STI) in one prefectural city in southern China, Liuzhou, and compared it to other prefectural cities throughout China. We used adults age 18-39 from two sets of population-based surveys that paralleled each other in both content and method. The first set was the Liuzhou survey conducted in 2008 (n=398). The second set consisted of two national surveys collected in 2006 and 2010 (n=2186). Liuzhou respondents reported more active social and sexual behaviors than their national counterparts, including more socializing, dancing, drinking excessively, sexual activity among never married men and women, purchasing commercial sex among men, one-night stands among men, multiple sexual partnerships and self-reported STI among both men and women. Women in Liuzhou reported greater sexual risk behavior than their national counterparts, although overall they reported less than their male counterparts; they were also more likely to have had an abortion than women in other prefectural cities. Our findings provide a comprehensive overview of the sexual context of Liuzhou among the general population, which may help explain the greater STI/HIV prevalence in Liuzhou. PMID:24174289

  13. CSTACK: A Web-Based Stacking Analysis Tool for Deep/Wide Chandra Surveys

    NASA Astrophysics Data System (ADS)

    Miyaji, Takamitsu; Griffiths, R. E.; C-COSMOS Team

    2008-03-01

    Stacking analysis is a powerful tool to probe the average X-ray properties of X-ray faint objects as a class, each of which is fainter than the detection limit as an individual source. This is especially the case for deep/wide surveys with Chandra, with its superb spatial resolution and the existence of survey data on fields with extensive multiwavelength coverage. We present an easy-to-use web-based tool (http://saturn.phys.cmu.edu/cstack), which enables users to perform a stacking analysis on a number of Chandra survey fields. Currently supported are C-COSMOS, the Extended Chandra Deep Field South (proprietary access, password protected), and the Chandra Deep Fields South and North (guest access: user=password=guest). For an input list of positions (e.g. galaxies selected from an optical catalog), the WWW tool returns stacked Chandra images in soft and hard bands and statistical analysis results including bootstrap histograms. We present running examples on the C-COSMOS data. The next version will also include the use of off-axis dependent aperture sizes, automatic exclusion of resolved sources, and histograms of stacks at random positions.
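
    The core of a stacking analysis can be sketched as averaging cutouts centred on the input positions. This toy version (a synthetic noise-free image and an illustrative cutout size, not the CSTACK implementation) shows how sources too faint to detect individually emerge in the stack.

```python
import numpy as np

def stack_cutouts(image, positions, half=2):
    """Average square cutouts centered on each input (row, col) position."""
    size = 2 * half + 1
    stack = np.zeros((size, size))
    for (r, c) in positions:
        stack += image[r - half:r + half + 1, c - half:c + half + 1]
    return stack / len(positions)

# Synthetic field: each "source" is 1 count at its pixel, far below a
# hypothetical detection threshold of several counts -- the stack recovers it.
img = np.zeros((50, 50))
positions = [(10, 12), (20, 33), (41, 7), (35, 35)]
for r, c in positions:
    img[r, c] = 1.0
stacked = stack_cutouts(img, positions)
```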

  14. Population-based sexual behavior surveys in China: Liuzhou compared with other prefectural cities.

    PubMed

    Huang, Yingying; Abler, Laurie; Pan, Suiming; Henderson, Gail E; Wang, Xin; Yao, Xingliang; Parish, William L

    2014-02-01

    Sexual behaviors in China are rapidly changing; simultaneously, sexually transmitted infections (STI)/HIV prevalence is increasing in the general population. To investigate these major shifts, we examined sexual behaviors and self-reported STI in one prefectural city in southern China, Liuzhou, and compared it to other prefectural cities throughout China. We used adults age 18-39 from two sets of population-based surveys that paralleled each other in both content and method. The first set was the Liuzhou survey conducted in 2008 (n = 398). The second set consisted of two national surveys collected in 2006 and 2010 (n = 2,186). Liuzhou respondents reported more active social and sexual behaviors than their national counterparts, including more socializing, dancing, drinking excessively, sexual activity among never married men and women, purchasing commercial sex among men, one-night stands among men, multiple sexual partnerships and self-reported STI among both men and women. Women in Liuzhou reported greater sexual risk behavior than their national counterparts, although overall they reported less than their male counterparts; they were also more likely to have had an abortion than women in other prefectural cities. Our findings provide a comprehensive overview of the sexual context of Liuzhou among the general population, which may help explain the greater STI/HIV prevalence in Liuzhou. PMID:24174289

  15. A commentary on evidenced-based parenting programs: redressing misconceptions of the empirical support for Triple P

    PubMed Central

    2012-01-01

    A meta-analytic review of the Triple P-Positive Parenting program by Wilson et al., recently published in BMC Medicine, claimed to demonstrate that although Triple P is widely disseminated and adopted, the evidence attesting to the effectiveness of the program is not as convincing as it may appear. Although this review addresses the important issue of evaluation and reporting methods within evidence-based interventions, we contend that the Wilson et al. review contains a number of significant conceptual, methodological and interpretational inadequacies that render the key conclusions of their review problematic. PMID:23173559

  16. Study of ESC [Empire State College] Graduates.

    ERIC Educational Resources Information Center

    Palola, Ernest G.; Ogden, Katharine

    This paper reports the results of a survey that examined the success or failure of Empire State College's first 29 graduates in obtaining admission to graduate school. Of the 13 applicants, 7 were accepted, 5 were waiting to hear, and 1 was rejected. Difficulties encountered by the applicants are discussed, as are the intentions of those not…

  17. ESC [Empire State College] Student Library Use.

    ERIC Educational Resources Information Center

    Bradley, A. Paul, Jr.; Palola, Ernest G.

    This document reports results of a spring 1973 survey of library use by students at two of the four regional centers at Empire State College (ESC), the new, nontraditional college without a campus of the State University of New York. Because ESC is mandated not to duplicate the resources of the State, students must use existing public and academic…

  18. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    PubMed Central

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new product and the market in pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of literature review, experts’ opinion acquisition, statistical analysis and the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined, and a framework is constructed based on them; then, according to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  19. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    PubMed

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new product and the market in pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of literature review, experts' opinion acquisition, statistical analysis and the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined, and a framework is constructed based on them; then, according to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  20. Identifying significant covariates for anti-HIV treatment response: mechanism-based differential equation models and empirical semiparametric regression models.

    PubMed

    Huang, Yangxin; Liang, Hua; Wu, Hulin

    2008-10-15

    In this paper, the mechanism-based ordinary differential equation (ODE) model and the flexible semiparametric regression model are employed to identify the significant covariates for antiretroviral response in AIDS clinical trials. We consider the treatment effect as a function of three factors (or covariates) including pharmacokinetics, drug adherence and susceptibility. Both clinical and simulated data examples are given to illustrate these two different kinds of modeling approaches. We found that the ODE model is more powerful for modeling the mechanism-based nonlinear relationship between treatment effects and virological response biomarkers. The ODE model is also better at identifying the significant factors for virological response, although it is slightly liberal and tends to include more factors (or covariates) in the model. The semiparametric mixed-effects regression model is very flexible in fitting the virological response data, but it is too liberal to identify the correct factors for the virological response and may sometimes miss them. The ODE model is also biologically justifiable and good for predictions and simulations for various biological scenarios. The limitations of the ODE models include the high cost of computation and the requirement of biological assumptions that sometimes may not be easy to validate. The methodologies reviewed in this paper are also generally applicable to studies of other viruses such as hepatitis B virus or hepatitis C virus. PMID:18407583
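
    As a hedged illustration of the mechanism-based ODE approach, a single-compartment viral-load equation with a drug-efficacy covariate can be integrated numerically. The model form, parameters, and names below are simplified placeholders for illustration, not the authors' actual clinical model.

```python
def rk4(f, v0, t0, t1, n):
    """Classic fourth-order Runge-Kutta integrator for dv/dt = f(t, v)."""
    h = (t1 - t0) / n
    t, v = t0, v0
    for _ in range(n):
        k1 = f(t, v)
        k2 = f(t + h / 2, v + h * k1 / 2)
        k3 = f(t + h / 2, v + h * k2 / 2)
        k4 = f(t + h, v + h * k3)
        v += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return v

def viral_load(eff, p=100.0, c=2.0, v0=50.0, days=30):
    """dV/dt = p*(1 - eff) - c*V: viral production suppressed by a drug
    efficacy 'eff', itself a function of covariates such as adherence,
    susceptibility and pharmacokinetics. Steady state is p*(1 - eff)/c."""
    return rk4(lambda t, v: p * (1.0 - eff) - c * v, v0, 0.0, float(days), 3000)
```

    Higher efficacy drives the load to a lower set point, which is the kind of covariate-to-response relationship the ODE model encodes mechanistically.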

  1. Teaching earth science in the field: GPS-based educational trails as a practically relevant, empirical verified approach

    NASA Astrophysics Data System (ADS)

    Kisser, Thomas

    2015-04-01

    GPS devices are common in daily life and are used increasingly often in geography classes. At present, the specialist literature is merely descriptive and thematically reduced to the function of orientation. The questions of whether GPS devices are a suitable tool for teaching earth science content, and whether the lasting learning success differs from that of normal lessons held in a classroom, have not yet been answered. Neurobiological and educational-psychology findings support the idea that students completing a GPS-based educational trail learn more successfully than students in a "normal" class: a successful contextualization of modern geomedia stimulates motivation. Geocaches are also suitable for didactical structuring. The order of "Geopoints" is chosen so that the structure of the landscape is displayed adequately. The students feel affectively addressed by the real-life encounters and experience their environment consciously. The concept presented here, the "GPS-based educational trail", differs from a normal geocache, which is merely a hide-and-seek game; here, the main focus lies on field work and earth science, and the GPS devices are used for orientation between the Geopoints. In order to obtain two groups with characteristics as different as possible, in terms of developmental psychology and age-related development of cognitive and methodical competences, classes from grade 5 (11 years old) and grade 11 (17 years old) were chosen. The different cognitive states of development require different didactical approaches. For grade 11, "rearrangements of fluvial topography" is a possible topic: using the example of anthropogenic rearrangements of the Rheinaue wetlands near Karlsruhe, the interdependency between humans and the environment can be shown. The "Nördlinger Ries" between the Swabian and the Franconian Jura was chosen for grade 5. The typical elements of the Swabian Jura (karst formation, hydrogeology

  2. COOKING APPLIANCE USE IN CALIFORNIA HOMES DATA COLLECTED FROM A WEB-BASED SURVEY

    SciTech Connect

    Klug, Victoria; Lobscheid, Agnes; Singer, Brett

    2011-08-01

    Cooking of food and use of natural gas cooking burners generate pollutants that can have substantial impacts on residential indoor air quality. The extent of these impacts depends on cooking frequency, duration and specific food preparation activities in addition to the extent to which exhaust fans or other ventilation measures (e.g. windows) are used during cooking. With the intent of improving our understanding of indoor air quality impacts of cooking-related pollutants, we created, posted and advertised a web-based survey about cooking activities in residences. The survey included questions similar to those in California's Residential Appliance Saturation Survey (RASS), relating to home, household and cooking appliance characteristics and weekly patterns of meals cooked. Other questions targeted the following information not captured in the RASS: (1) oven vs. cooktop use, the number of cooktop burners used and the duration of burner use when cooking occurs, (2) specific cooking activities, (3) the use of range hood or window to increase ventilation during cooking, and (4) occupancy during cooking. Specific cooking activity questions were asked about the prior 24 hours with the assumption that most people are able to recollect activities over this time period. We examined inter-relationships among cooking activities and patterns and relationships of cooking activities to household demographics. We did not seek to obtain a sample of respondents that is demographically representative of the California population but rather to inexpensively gather information from homes spanning ranges of relevant characteristics including the number of residents and presence or absence of children. This report presents the survey, the responses obtained, and limited analysis of the results.

  3. Agreement on the Level Selection in Laminoplasty among Experienced Surgeons: A Survey-Based Study

    PubMed Central

    Cho, Jae Hwan; Suk, Kyung-Soo; Park, Jong-Beom; Ha, Jung-Ki; Hwang, Chang Ju; Lee, Choon Sung

    2016-01-01

    Study Design Survey-based study. Purpose To assess the degree of agreement between experienced spine surgeons in the level selection of laminoplasty (LP) for selected cervical myeloradiculopathy cases. Overview of Literature Although cervical LP is a widely used surgical technique for multi-level spinal cord compression, there is as yet no consensus about how many segments, or which segments, should be opened to achieve satisfactory decompression. Methods Thorough clinical and radiographic data (plain X-ray, computed tomography, and magnetic resonance imaging) of 30 patients who had cervical myelopathy were prepared. The data were provided to three independent spine surgeons, each with over 10 years of operative experience in their own practices. They were questioned about the most preferable surgical method and suitable decompression levels. A second survey was carried out after 6 months with the same cases. If the level difference between respondents was half a level or less, agreement was considered acceptable. The intraobserver and interobserver agreements in level selection were assessed by kappa statistics. Results The three respondents selected LP as an option for 6, 8, and 22 cases in the first survey and 10, 21, and 24 cases in the second survey. The reasons for selecting LP were levels of ossification of the posterior longitudinal ligament (p=0.004), segmental kyphotic deformity (p=0.036) and mean compression score (p=0.041). Intraobserver agreement showed variable results. Interobserver agreement was poor to fair by perfect matching (kappa=0.111–0.304) and fair to moderate by acceptable matching (kappa=0.308–0.625). Conclusions The degree of agreement in level selection for LP was not high, even though experienced surgeons would choose the opening segments on the basis of the same criteria. These results suggest that more specific guidelines for determining levels for LP are required to decrease unnecessary wide decompression according to
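
    The kappa statistic used to quantify intra- and interobserver agreement can be computed as below. This is a generic Cohen's kappa sketch with invented ratings, not the study's data; a categorical label stands in for each surgeon's level choice per case.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters: (po - pe) / (1 - pe),
    where po is observed agreement and pe is agreement expected by chance."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (po - pe) / (1 - pe)  # undefined if pe == 1 (degenerate ratings)

# Two surgeons' upper-level choices for five cases (illustrative labels).
s1 = ["C3", "C3", "C4", "C4", "C5"]
s2 = ["C3", "C4", "C4", "C4", "C5"]
k = cohens_kappa(s1, s2)
```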

  4. Oral cancer in Myanmar: a preliminary survey based on hospital-based cancer registries.

    PubMed

    Oo, Htun Naing; Myint, Yi Yi; Maung, Chan Nyein; Oo, Phyu Sin; Cheng, Jun; Maruyama, Satoshi; Yamazaki, Manabu; Yagi, Minoru; Sawair, Faleh A; Saku, Takashi

    2011-01-01

    The occurrence of oral cancer is not clearly known in Myanmar, where betel quid chewing habits are widely spread. Since betel quid chewing has been considered to be one of the important causative factors for oral cancer, the circumstantial situation for oral cancer should be investigated in this country. We surveyed oral cancer cases as well as whole body cancers from two cancer registries from Yangon and Mandalay cities, both of which have representative referral hospitals in Myanmar, and we showed that oral cancer stood at the 6th position in males and 10th in females, contributing to 3.5% of whole body cancers. There was a male predominance with a ratio of 2.1:1. Their most frequent site was the tongue, followed by the palate, which was different from that in other countries with betel quid chewing habits. About 90% of male and 44% of female patients had habitual backgrounds of chewing and smoking for more than 15 years. The results revealed for the first time reliable oral cancer frequencies in Myanmar, suggesting that longstanding chewing and smoking habits are etiological backgrounds for oral cancer patients. PMID:20819123

  5. A Review of Theoretical and Empirical Advancements

    ERIC Educational Resources Information Center

    Wang, Mo; Henkens, Kene; van Solinge, Hanna

    2011-01-01

    In this article, we review both theoretical and empirical advancements in retirement adjustment research. After reviewing and integrating current theories about retirement adjustment, we propose a resource-based dynamic perspective to apply to the understanding of retirement adjustment. We then review empirical findings that are associated with…

  6. Wastewater characterization survey, Edwards Air Force Base, California. Final report, 17-28 February 1992

    SciTech Connect

    McCoy, R.P.

    1992-08-01

    A wastewater characterization survey was conducted at Edwards Air Force Base from 17-28 February 1992 by personnel from the Water Quality Function of Armstrong Laboratory. Extensive sampling of the treatment plant influent wastewater and sludge beds was performed as well as sampling at nine other sites in the base cantonment area. Some sampling of an Imhoff tank on North Base, five evaporation ponds and the lakebed was also conducted. Low levels of organic contamination were found in the influent and industrial sites downstream of Site 7. Site 7 is a manhole located in an identified Installation Restoration Program (IRP) site. Corrective actions were recommended to prevent organic soil contaminants from intruding into this site prior to the operation of a planned tertiary treatment plant. Organic and inorganic contaminants discharged at other industrial sites were found to be in low concentrations and indicated that good shop practices were followed in minimizing contamination of the wastewater with industrial chemicals.

  7. The Coverage Problem in Video-Based Wireless Sensor Networks: A Survey

    PubMed Central

    Costa, Daniel G.; Guedes, Luiz Affonso

    2010-01-01

    Wireless sensor networks typically consist of a great number of tiny low-cost electronic devices with limited sensing and computing capabilities which cooperatively communicate to collect some kind of information from an area of interest. When the wireless nodes of such networks are equipped with a low-power camera, visual data can be retrieved, enabling a new set of novel applications. The nature of video-based wireless sensor networks demands new algorithms and solutions, since traditional wireless sensor network approaches are neither feasible nor efficient for that specialized communication scenario. The coverage problem is a crucial issue in wireless sensor networks, requiring specific solutions when video-based sensors are employed. In this paper, we survey the state of the art on this particular issue, regarding strategies, algorithms and general computational solutions. Open research areas are also discussed, pointing to promising directions for investigating coverage in video-based wireless sensor networks. PMID:22163651
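
    A minimal sketch of the directional-coverage test underlying many of the surveyed strategies: unlike omnidirectional sensors, a camera node covers a target only if it lies inside both the sensing radius and the angular field of view. The sector model and all parameters here are illustrative.

```python
import math

def covers(sensor, target, radius, half_angle):
    """A directional (camera) sensor (x, y, heading) covers a target if the
    target is within its sensing radius and inside its angular field of view."""
    (sx, sy, heading) = sensor
    dx, dy = target[0] - sx, target[1] - sy
    if math.hypot(dx, dy) > radius:
        return False
    # Signed angular difference wrapped into [-pi, pi].
    diff = (math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= half_angle

def coverage_fraction(sensors, targets, radius, half_angle):
    """Fraction of targets covered by at least one sensor."""
    return sum(any(covers(s, t, radius, half_angle) for s in sensors)
               for t in targets) / len(targets)

# One east-facing camera at the origin with a 90-degree field of view.
sensors = [(0.0, 0.0, 0.0)]
targets = [(5, 0), (0, 5), (-5, 0), (20, 0)]
frac = coverage_fraction(sensors, targets, 10.0, math.pi / 4)
```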

  8. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  9. Is cell culture a risky business? Risk analysis based on scientist survey data.

    PubMed

    Shannon, Mark; Capes-Davis, Amanda; Eggington, Elaine; Georghiou, Ronnie; Huschtscha, Lily I; Moy, Elsa; Power, Melinda; Reddel, Roger R; Arthur, Jonathan W

    2016-02-01

    Cell culture is a technique that requires vigilance from the researcher. Common cell culture problems, including contamination with microorganisms or cells from other cultures, can place the reliability and reproducibility of cell culture work at risk. Here we use survey data, contributed by research scientists based in Australia and New Zealand, to assess common cell culture risks and how these risks are managed in practice. Respondents show that sharing of cell lines between laboratories continues to be widespread. Arrangements for mycoplasma and authentication testing are increasingly in place, although scientists are often uncertain how to perform authentication testing. Additional risks are identified for preparation of frozen stocks, storage and shipping. PMID:26365214

  10. Input formats and specifications of the National Geodetic Survey data base. Volume 3: Gravity control data (revised September 1985)

    NASA Astrophysics Data System (ADS)

    Dewhurst, W. T.

    1985-09-01

    The user's guide to the formats and specifications used within the National Geodetic Survey (NGS) data base is commonly referred to as the Blue Book, and is comprised of three volumes. Gravity control (GRAV) data are discussed.

  11. Empirical Study on Relationship Capital in Supply Chain-Based on Analysis of Enterprises in Hunan Province

    NASA Astrophysics Data System (ADS)

    Shan, Lu; Qiang-Bin, Ou-Yang

    Based on the existing theories and studies, this thesis proposes a theoretical model describing the relationship between relationship capital in the supply chain and its influencing factors; exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are then carried out on 188 sample data. Through evaluation of the goodness of fit of the structural model, as well as hypothesis testing, it turns out that there are four influencing factors for relationship capital in the supply chain, namely the capability and the reputation of cooperating companies in the supply chain, investment in specific assets, and transfer cost, each of which is positively correlated with relationship capital. A decision-making basis is thus provided for the practice of relationship capital in the supply chain.

  12. Plasma-surface interactions during Si etching in Cl- and Br-based plasmas: An empirical and atomistic study

    NASA Astrophysics Data System (ADS)

    Tsuda, Hirotaka; Nagaoka, Tatsuya; Miyata, Hiroki; Takao, Yoshinori; Eriguchi, Koji; Ono, Kouichi

    2009-10-01

    Nanometer-scale control of Si etching in Cl2- and HBr-containing plasmas is indispensable in the fabrication of gate electrodes and shallow trench isolation. There are profile anomalies of sidewalls such as tapering, bowing, footing (or corner rounding), and notching, which largely affect the critical dimension. There are also anomalies of bottom surfaces such as microtrenching and roughness (or residues), which affect the bottom uniformity and lead to recess and damage in gate fabrication. An atomic-scale cellular model (ASCeM) based on the Monte Carlo method has been developed to simulate plasma-surface interactions and the profile evolution during etching, including passivation layer formation as well as ion reflection and penetration on feature surfaces. We have also studied atomistic plasma-surface interactions by classical molecular dynamics (MD) simulation, where an improved Stillinger-Weber interatomic potential was newly developed. The numerical results were compared with etching experiments and with surface diagnostics including in situ Fourier-transform infrared reflection absorption spectroscopy (FTIR-RAS), to reveal the origin of profile anomalies on feature surfaces during etching, and thus to achieve precise control of etched profiles.

  13. An Empirical Evaluation of Lightweight Random Walk Based Routing Protocol in Duty Cycle Aware Wireless Sensor Networks

    PubMed Central

    Fatima, Mehwish

    2014-01-01

    Energy efficiency is an important design paradigm in Wireless Sensor Networks (WSNs), and energy consumption in dynamic environments is even more critical. Duty cycling of sensor nodes is used to address the energy consumption problem. However, along with its advantages, duty cycle awareness introduces complexities such as synchronization and latency. Due to their inherent characteristics, many traditional routing protocols show low performance in densely deployed, duty-cycle-aware WSNs when sensor nodes have high mobility. In this paper we first present a three-message-exchange Lightweight Random Walk Routing (LRWR) protocol and then evaluate its performance in WSNs for routing low-data-rate packets. Through NS-2 based simulations, we examine the LRWR protocol by comparing it with DYMO, a widely used WSN protocol, in both static and dynamic environments with varying duty cycles, assuming the standard IEEE 802.15.4 in the lower layers. Results for the three metrics, that is, reliability, end-to-end delay, and energy consumption, show that the LRWR protocol outperforms DYMO in scalability, mobility, and robustness, making this protocol a suitable choice in low duty cycle and dense WSNs. PMID:24696667
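
    The stateless core of random-walk routing can be sketched as follows: each hop forwards the packet to a uniformly random neighbor until the sink is reached. The topology, seed, and hop bound are illustrative; LRWR's three-message exchange and duty-cycle handling are not modeled here.

```python
import random

def random_walk_route(neighbors, source, sink, max_hops=10000, seed=1):
    """Forward a packet to a uniformly random neighbor each hop until the
    sink is reached (or the hop budget runs out)."""
    rng = random.Random(seed)
    path, node = [source], source
    while node != sink and len(path) <= max_hops:
        node = rng.choice(neighbors[node])
        path.append(node)
    return path

# A small topology as an adjacency list; awake/asleep duty cycling could be
# modeled by pruning the neighbor lists per hop, omitted for brevity.
topo = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
path = random_walk_route(topo, source=0, sink=4)
```

    No routing tables are maintained, which is what keeps the per-node state and message overhead low in dense deployments.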

  14. Towards a 3d Based Platform for Cultural Heritage Site Survey and Virtual Exploration

    NASA Astrophysics Data System (ADS)

    Seinturier, J.; Riedinger, C.; Mahiddine, A.; Peloso, D.; Boï, J.-M.; Merad, D.; Drap, P.

    2013-07-01

    This paper presents a 3D platform that supports both cultural heritage site survey and virtual exploration of the site. It provides a single, easy-to-use framework for merging multi-scale 3D measurements based on photogrammetry, documentation produced by experts, and the knowledge of the domains involved, leaving the experts free to extract and choose the relevant information to produce the final survey. Taking into account the interpretation of the real world during the process of archaeological surveys is in fact the main goal of a survey. New advances in photogrammetry and the capability to produce dense 3D point clouds do not by themselves solve the problem of surveys: new opportunities for 3D representation are now available, and we must use them and find new ways to link geometry and knowledge. The new platform can efficiently manage and process large 3D data sets (point sets, meshes) thanks to the implementation of space-partitioning methods from the state of the art, such as octrees and kd-trees, and can thus interact with dense point clouds (thousands to millions of points) in real time. The semantization of raw 3D data relies on geometric algorithms such as geodetic path computation, surface extraction from dense point clouds, and geometric primitive optimization. The platform provides an interface that enables experts to describe geometric representations of objects of interest, such as ashlar blocks, stratigraphic units or generic items (contour, lines, … ), directly on the 3D representation of the site and without explicit links to the underlying algorithms. The platform provides two ways of describing a geometric representation. If oriented photographs are available, the expert can draw geometry on a photograph and the system computes its 3D representation by projection onto the underlying mesh or point cloud. If photographs are not available, or if the expert wants to use only the 3D representation, then he can simply draw object shapes on it. When 3D
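
    As an illustration of the kd-tree space partitioning that such platforms rely on for real-time interaction with dense point clouds, here is a minimal 3-D kd-tree with a branch-and-bound nearest-neighbour query. This is a generic sketch, not the platform's implementation.

```python
import math

def build_kdtree(points, depth=0):
    """Recursively split points on alternating axes (a 3-D kd-tree)."""
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"pt": points[mid], "axis": axis,
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, q, best=None):
    """Branch-and-bound nearest-neighbor query: descend the near side first,
    then visit the far side only if it could still hold a closer point."""
    if node is None:
        return best
    if best is None or math.dist(q, node["pt"]) < math.dist(q, best):
        best = node["pt"]
    delta = q[node["axis"]] - node["pt"][node["axis"]]
    near, far = ((node["left"], node["right"]) if delta < 0
                 else (node["right"], node["left"]))
    best = nearest(near, q, best)
    if abs(delta) < math.dist(q, best):  # splitting plane closer than best hit
        best = nearest(far, q, best)
    return best

# A small deterministic point cloud and a sample query.
pts = [((i * 7) % 10, (i * 3) % 8, (i * 5) % 9) for i in range(50)]
tree = build_kdtree(pts)
closest = nearest(tree, (2.2, 3.1, 4.9))
```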

  15. A comparison of three empirically based, spatially explicit predictive models of residential soil Pb concentrations in Baltimore, Maryland, USA: understanding the variability within cities.

    PubMed

    Schwarz, Kirsten; Weathers, Kathleen C; Pickett, Steward T A; Lathrop, Richard G; Pouyat, Richard V; Cadenasso, Mary L

    2013-08-01

    In many older US cities, lead (Pb) contamination of residential soil is widespread; however, contamination is not uniform. Empirically based, spatially explicit models can assist city agencies in addressing this important public health concern by identifying areas predicted to exceed public health targets for soil Pb contamination. Sampling of 61 residential properties in Baltimore City using field portable X-ray fluorescence revealed that 53 % had soil Pb that exceeded the USEPA reportable limit of 400 ppm. These data were used as the input to three different spatially explicit models: a traditional general linear model (GLM), and two machine learning techniques: classification and regression trees (CART) and Random Forests (RF). The GLM revealed that housing age, distance to road, distance to building, and the interactions between variables explained 38 % of the variation in the data. The CART model confirmed the importance of these variables, with housing age, distance to building, and distance to major road networks determining the terminal nodes of the CART model. Using the same three predictor variables, the RF model explained 42 % of the variation in the data. The overall accuracy, which is a measure of agreement between the model and an independent dataset, was 90 % for the GLM, 83 % for the CART model, and 72 % for the RF model. A range of spatially explicit models that can be adapted to changing soil Pb guidelines allows managers to select the most appropriate model based on public health targets. PMID:23775390
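
    The CART-style splitting compared in the study can be illustrated with a single-predictor threshold search that minimizes Gini impurity. The toy housing-age data below are invented for illustration and are not the Baltimore measurements.

```python
def gini(labels):
    """Gini impurity of a binary label list (0/1)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(x, y):
    """Exhaustive search for the threshold on one predictor (e.g. housing
    age) minimizing the weighted Gini impurity of the two child nodes --
    the elementary step that CART repeats recursively."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(x)):
        left = [lab for v, lab in zip(x, y) if v <= t]
        right = [lab for v, lab in zip(x, y) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Toy data: older houses tend to exceed the 400 ppm soil Pb limit.
age = [10, 25, 40, 55, 62, 70, 80, 95]
exceeds = [0, 0, 0, 0, 1, 1, 1, 1]
t, score = best_split(age, exceeds)
```

    A random forest then averages many such trees grown on bootstrap samples, which is the RF variant compared against the GLM and CART models in the abstract.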

  16. Searching transients in large-scale surveys. A method based on the Abbe value

    NASA Astrophysics Data System (ADS)

    Mowlavi, N.

    2014-08-01

    Aims: A new method is presented to identify transient candidates in large-scale surveys based on the variability pattern in their light curves. Methods: The method is based on the Abbe value, Ab, which estimates the smoothness of a light curve, and on a newly introduced value called the excess Abbe, denoted excessAb, which estimates the regularity of the light curve variability pattern over the duration of the observations. Results: Based on simulated light curves, transients are shown to occupy a specific region in the excessAb versus Ab diagram, distinct from sources presenting pulsating-like features in their light curves or having featureless light curves. The method is tested on real light curves taken from the EROS-2 and OGLE-II surveys in a 0.50° × 0.17° field of the sky in the Large Magellanic Cloud centered at RA(J2000) = 5h25m56.5s and Dec(J2000) = -69d29m43.3s. The method identifies 43 EROS-2 transient candidates out of a total of 1300 variable stars, and 19 more OGLE-II candidates, 10 of which do not have any EROS-2 variable star match and would need further confirmation to assess their reliability. The efficiency of the method is further tested by comparing the list of transient candidates with known Be stars in the literature. It is shown that all Be stars known in the studied field of view with detectable bursts or outbursts are successfully extracted by the method. In addition, four new transient candidates displaying bursts and/or outbursts are found in the field, of which at least two are good new Be candidates. Conclusions: The new method proves to be a potentially powerful tool to extract transient candidates from large-scale multi-epoch surveys. The smaller the photometric measurement uncertainties, the cleaner the list of detected transient candidates. In addition, the excessAb versus Ab diagram is shown to be a good diagnostic tool to check the data quality of multi-epoch photometric surveys. A trend of instrumental and/or data reduction origin
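
    The Abbe value has a standard definition: half the von Neumann ratio, i.e. the mean squared successive difference divided by twice the variance. Assuming the paper uses this normalisation (the excess Abbe additionally compares Ab computed on sub-intervals against the global value), it can be computed as:

```python
def abbe(x):
    """Abbe value of a time series: Ab = n / (2(n-1)) *
    sum((x[i+1]-x[i])^2) / sum((x[i]-mean)^2).
    ~1 for white noise, -> 0 for a smooth trend (e.g. a transient's
    slow rise and decline), > 1 for rapid alternation."""
    n = len(x)
    mean = sum(x) / n
    spread = sum((xi - mean) ** 2 for xi in x)
    succ = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1))
    return n * succ / (2 * (n - 1) * spread)
```

A transient light curve (flat, then a long burst) has a small global Ab but noise-like Ab on its quiescent sub-intervals, which is the asymmetry the excess Abbe is designed to capture.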

  17. Estimating hydraulic conductivity of fractured rocks from high-pressure packer tests with an Izbash's law-based empirical model

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Feng; Hu, Shao-Hua; Hu, Ran; Zhou, Chuang-Bing

    2015-04-01

    The high-pressure packer test (HPPT) is an enhanced constant-head packer test for characterizing the permeability of fractured rocks under high-pressure groundwater flow conditions. The interpretation of HPPT data, however, remains difficult due to the transition of flow conditions in the conducting structures and the hydraulic fracturing-induced permeability enhancement in the tested rocks. In this study, a number of HPPTs were performed in the sedimentary and intrusive rocks located at 450 m depth in central Hainan Island. The obtained Q-P curves were divided into a laminar flow phase (I), a non-Darcy flow phase (II), and a hydraulic fracturing phase (III). The critical Reynolds number for the deviation of flow from linearity into phase II was 25-66. The flow of phase III occurred in sparsely to moderately fractured rocks, and was absent at test intervals of perfect or poor intactness. The threshold fluid pressure between phases II and III was correlated with RQD and the confining stress. An Izbash's law-based analytical model was employed to calculate the hydraulic conductivity of the tested rocks under the different flow conditions. It was demonstrated that the estimated hydraulic conductivity values in phases I and II are essentially the same and only weakly dependent on the injection fluid pressure, but the conductivity becomes strongly pressure dependent as a result of hydraulic fracturing in phase III. The hydraulic conductivity along a borehole is markedly enhanced at highly fractured zones or contact zones, but within a rock unit of weak heterogeneity it decreases with increasing depth.
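
    Izbash's law relates the hydraulic gradient i to the flow velocity v as a power law, i = λ·v^m, with m = 1 for Darcy (laminar) flow and m approaching 2 for fully turbulent flow. A generic least-squares fit in log space is sketched below; this illustrates the law itself only, not the paper's full analytical model, which also accounts for borehole geometry and the phase structure of the Q-P curves.

```python
import math

def fit_izbash(v, i):
    """Fit Izbash's law i = lam * v**m by linear least squares on
    log(i) = log(lam) + m*log(v). Returns (lam, m)."""
    xs = [math.log(vi) for vi in v]
    ys = [math.log(ii) for ii in i]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    lam = math.exp(ybar - m * xbar)
    return lam, m

# Synthetic gradient/velocity data generated with lam = 0.8, m = 1.6
# (hypothetical values; m > 1 marks the non-Darcy regime of phase II).
v = [0.01, 0.02, 0.05, 0.1, 0.2]
i = [0.8 * vi ** 1.6 for vi in v]
lam, m = fit_izbash(v, i)
```

In practice, a fitted m near 1 indicates phase I (Darcy) flow, while m drifting toward 2 signals the onset of inertial losses in the conducting fractures.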

  18. The effect of vegetation type and fire on permafrost thaw: An empirical test of a process based model

    NASA Astrophysics Data System (ADS)

    Thierry, Aaron; Estop-Aragones, Cristian; Fisher, James; Hartley, Iain; Murton, Julian; Phoenix, Gareth; Street, Lorna; Williams, Mathew

    2015-04-01

    As conditions become more favourable for plant growth in the high latitudes, most models predict that these areas will take up more carbon during the 21st century. However, vast stores of carbon are frozen in boreal and arctic permafrost, and warming may result in some of this carbon being released to the atmosphere. The recent inclusion of permafrost thaw in large-scale model simulations has suggested that the permafrost feedback could substantially reduce the predicted global net uptake of carbon by terrestrial ecosystems, with major implications for the rate of climate change. However, large uncertainties remain in predicting rates of permafrost thaw and in determining the impacts of thaw in contrasting ecosystems, with many of the key processes missing from carbon-climate models. The role that different plant communities play in insulating soils and protecting permafrost is poorly quantified: key groups such as mosses are absent from many models, even though they are thought to play a central role in determining permafrost resilience. In order to test the importance of these ecological processes, we use a newly acquired dataset from sites in the Canadian arctic to develop, parameterise, and evaluate a detailed process-based model of vegetation-soil-permafrost interactions that includes an insulating moss understory. We tested the sensitivity of modelled active layer depth (ALD) to a series of factors linked to fire disturbance, which is common in boreal permafrost areas. We show how simulations of ALD respond to removal of (i) vascular vegetation, (ii) moss cover, and (iii) organic soil layers, and compare the model responses to observed patterns from Canada. We also describe the sensitivity of the modelled ALD to changes in temperature and precipitation. Four parameters, linked to the conductivity of organic soils and mosses, controlled most of the sensitivity in the modelled ALD.
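
    The authors' process model is not reproduced here, but the qualitative effect of an insulating moss layer on thaw depth can be illustrated with the textbook Stefan equation, which predicts thaw depth from thermal conductivity and the seasonal thawing index. All parameter values below are hypothetical.

```python
import math

RHO_W = 1000.0   # density of water, kg m^-3
L_FUS = 3.34e5   # latent heat of fusion of water, J kg^-1

def stefan_thaw_depth(k, thaw_index_degdays, theta):
    """Stefan-equation thaw depth z = sqrt(2 k I / Lv), in metres.
    k: effective thermal conductivity (W m^-1 K^-1),
    thaw_index_degdays: seasonal thawing index (degree-days above 0 C),
    theta: volumetric water/ice content of the soil.
    A first-order textbook model, not the authors' process model."""
    I = thaw_index_degdays * 86400.0   # degree-days -> degree-seconds
    Lv = RHO_W * L_FUS * theta         # volumetric latent heat, J m^-3
    return math.sqrt(2.0 * k * I / Lv)

# Illustrative conductivities: a bare mineral surface vs. a surface
# whose effective conductivity is lowered by a dry moss mat.
ald_bare = stefan_thaw_depth(k=1.2, thaw_index_degdays=800, theta=0.4)
ald_moss = stefan_thaw_depth(k=0.4, thaw_index_degdays=800, theta=0.4)
```

Because thaw depth scales with the square root of conductivity, burning off a moss layer (raising effective k) deepens the active layer, which is consistent with the fire-disturbance sensitivity the abstract describes.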

  19. Reliability of Nationwide Prevalence Estimates of Dementia: A Critical Appraisal Based on Brazilian Surveys

    PubMed Central

    2015-01-01

    Background The nationwide dementia prevalence is usually calculated by applying the results of local surveys to countries’ populations. To evaluate the reliability of such estimations in developing countries, we chose Brazil as an example. We carried out a systematic review of dementia surveys, ascertained their risk of bias, and present the best estimate of occurrence of dementia in Brazil. Methods and Findings We carried out an electronic search of PubMed, Latin-American databases, and a Brazilian thesis database for surveys focusing on dementia prevalence in Brazil. The systematic review was registered at PROSPERO (CRD42014008815). Among the 35 studies found, 15 analyzed population-based random samples. However, most of them utilized inadequate criteria for diagnostics. Six studies without these limitations were further analyzed to assess the risk of selection, attrition, outcome and population bias as well as several statistical issues. All the studies presented moderate or high risk of bias in at least two domains due to the following features: high non-response, inaccurate cut-offs, and doubtful accuracy of the examiners. Two studies had limited external validity due to high rates of illiteracy or low income. The three studies with adequate generalizability and the lowest risk of bias presented a prevalence of dementia between 7.1% and 8.3% among subjects aged 65 years and older. However, after adjustment for accuracy of screening, the best available evidence points towards a figure between 15.2% and 16.3%. Conclusions The risk of bias may strongly limit the generalizability of dementia prevalence estimates in developing countries. Extrapolations that have already been made for Brazil and Latin America were based on a prevalence that should have been adjusted for screening accuracy or not used at all due to severe bias. Similar evaluations regarding other developing countries are needed in order to verify the scope of these limitations. PMID:26131563
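
    The abstract does not name its adjustment method, but the standard way to correct an apparent prevalence for imperfect screening accuracy is the Rogan-Gladen estimator. The sketch below uses illustrative sensitivity and specificity values chosen only to reproduce the roughly twofold adjustment reported, not the surveys' actual accuracy figures.

```python
def rogan_gladen(apparent, sensitivity, specificity):
    """Rogan-Gladen correction: recover the true prevalence p from the
    apparent (test-positive) prevalence, given the screening test's
    sensitivity and specificity. Inverts
    apparent = sensitivity*p + (1 - specificity)*(1 - p)."""
    return (apparent + specificity - 1) / (sensitivity + specificity - 1)

# Illustrative values only: an apparent prevalence of 8% screened with
# sensitivity 0.50 and specificity 1.00 implies a true prevalence of 16%,
# the order of the adjustment described in the abstract.
true_prev = rogan_gladen(0.08, 0.50, 1.00)
```

The estimator makes clear why the adjustment matters: with a low-sensitivity screen, a large share of true cases are simply never flagged, so the raw survey figure understates the burden of dementia.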

  20. Random sample community-based health surveys: does the effort to reach participants matter?

    PubMed Central

    Messiah, Antoine; Castro, Grettel; Rodríguez de la Vega, Pura; Acuna, Juan M

    2014-01-01

    Objectives Conducting health surveys with community-based random samples is essential to capture an otherwise unreachable population, but these surveys can be biased if the effort to reach participants is insufficient. This study determines the desirable amount of effort to minimise such bias. Design A household-based health survey with random sampling and face-to-face interviews. Up to 11 visits, organised by canvassing rounds, were made to obtain an interview. Setting Single-family homes in an underserved and understudied population in North Miami-Dade County, Florida, USA. Participants Of a probabilistic sample of 2200 household addresses, 30 corresponded to empty lots, 74 were abandoned houses, 625 households declined to participate and 265 could not be reached and interviewed within 11 attempts. Analyses were performed on the 1206 remaining households. Primary outcome Each household was asked if any of their members had been told by a doctor that they had high blood pressure, heart disease including heart attack, cancer, diabetes, anxiety/depression, obesity or asthma. Responses to these questions were analysed by the number of visit attempts needed to obtain the interview. Results Return per visit fell below 10% after four attempts, below 5% after six attempts and below 2% after eight attempts. As the effort increased, household size decreased, while household income and the percentage of interviewees active and employed increased; the reported proportions of the seven health conditions decreased, four of them significantly: heart disease 20.4–9.2%, high blood pressure 63.5–58.1%, anxiety/depression 24.4–9.2% and obesity 21.8–12.6%. Beyond the fifth attempt, however, cumulative percentages varied by less than 1% and precision varied by less than 0.1%. Conclusions In spite of the early and steep drop, sustaining at least five attempts to reach participants is necessary to reduce selection bias. PMID:25510887
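
    The stabilisation of estimates with increasing visit attempts can be illustrated with per-attempt tallies. The numbers below are invented (only the 1206-household total matches the study), but they reproduce the pattern the abstract reports: easy-to-reach households report more illness, and the cumulative estimate settles after about five attempts.

```python
# Hypothetical tallies: interviews completed at each visit attempt
# (1..11), and how many of those households reported heart disease.
interviews    = [500, 300, 150, 100, 60, 40, 25, 15, 8, 5, 3]
heart_disease = [100,  45,  15,   8,  4,  2,  1,  1, 0, 0, 0]

def cumulative_prevalence(cases, totals):
    """Prevalence you would have estimated had you stopped canvassing
    after attempt k, for each k = 1..len(totals)."""
    out, c, t = [], 0, 0
    for ci, ti in zip(cases, totals):
        c += ci
        t += ti
        out.append(c / t)
    return out

prev = cumulative_prevalence(heart_disease, interviews)
# prev[0] is the (inflated) one-attempt estimate; by prev[4] (five
# attempts) the estimate has essentially converged.
```

This is the quantitative basis for the paper's recommendation: stopping after one or two rounds bakes the early bias into the estimate, while attempts beyond the fifth change it by less than a percentage point.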