Science.gov

Sample records for based empirical survey

  1. Risk-adjusted capitation based on the Diagnostic Cost Group Model: an empirical evaluation with health survey information.

    PubMed Central

    Lamers, L M

    1999-01-01

    OBJECTIVE: To evaluate the predictive accuracy of the Diagnostic Cost Group (DCG) model using health survey information. DATA SOURCES/STUDY SETTING: Longitudinal data collected for a sample of members of a Dutch sickness fund. In the Netherlands the sickness funds provide compulsory health insurance coverage for the 60 percent of the population in the lowest income brackets. STUDY DESIGN: A demographic model and DCG capitation models are estimated by means of ordinary least squares, with an individual's annual healthcare expenditures in 1994 as the dependent variable. For subgroups based on health survey information, costs predicted by the models are compared with actual costs. Using stepwise regression procedures a subset of relevant survey variables that could improve the predictive accuracy of the three-year DCG model was identified. Capitation models were extended with these variables. DATA COLLECTION/EXTRACTION METHODS: For the empirical analysis, panel data of sickness fund members were used that contained demographic information, annual healthcare expenditures, and diagnostic information from hospitalizations for each member. In 1993, a mailed health survey was conducted among a random sample of 15,000 persons in the panel data set, with a 70 percent response rate. PRINCIPAL FINDINGS: The predictive accuracy of the demographic model improves when it is extended with diagnostic information from prior hospitalizations (DCGs). A subset of survey variables further improves the predictive accuracy of the DCG capitation models. The predictable profits and losses based on survey information for the DCG models are smaller than for the demographic model. Most persons with predictable losses based on health survey information were not hospitalized in the preceding year. CONCLUSIONS: The use of diagnostic information from prior hospitalizations is a promising option for improving the demographic capitation payment formula. This study suggests that diagnostic

  2. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  3. Defining Empirically Based Practice.

    ERIC Educational Resources Information Center

    Siegel, Deborah H.

    1984-01-01

    Provides a definition of empirically based practice, both conceptually and operationally. Describes a study of how research and practice were integrated in the graduate social work program at the School of Social Service Administration, University of Chicago. (JAC)

  4. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  5. An empirical Bayes approach to analyzing recurring animal surveys

    USGS Publications Warehouse

    Johnson, D.H.

    1989-01-01

    Recurring estimates of the size of animal populations are often required by biologists or wildlife managers. Because of cost or other constraints, estimates frequently lack the accuracy desired but cannot readily be improved by additional sampling. This report proposes a statistical method employing empirical Bayes (EB) estimators as alternatives to those customarily used to estimate population size, and evaluates them by a subsampling experiment on waterfowl surveys. EB estimates, especially a simple limited-translation version, were more accurate and provided shorter confidence intervals with greater coverage probabilities than customary estimates.
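
    A minimal sketch of the limited-translation flavor of empirical Bayes shrinkage this report describes, assuming a simple normal-normal setup. The counts, standard errors, and the one-standard-error cap are illustrative assumptions, not Johnson's actual estimator.

    ```python
    import numpy as np

    def eb_limited_translation(y, se, max_shift=1.0):
        """Shrink per-survey estimates y (with standard errors se) toward the
        grand mean, capping how far any estimate may move (limited translation)."""
        grand_mean = np.average(y, weights=1.0 / se**2)
        # Method-of-moments estimate of the between-survey variance component.
        tau2 = max(np.var(y, ddof=1) - np.mean(se**2), 0.0)
        shrink = se**2 / (se**2 + tau2) if tau2 > 0 else np.ones_like(y)
        eb = y + shrink * (grand_mean - y)            # ordinary EB shrinkage
        shift = np.clip(eb - y, -max_shift * se, max_shift * se)
        return y + shift                              # limited-translation version

    counts = np.array([120., 95., 150., 80.])   # hypothetical survey estimates
    ses = np.array([15., 12., 20., 10.])        # hypothetical standard errors
    print(eb_limited_translation(counts, ses))
    ```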

  6. First Results From The Empire Nearby Galaxy Dense Gas Survey

    NASA Astrophysics Data System (ADS)

    Bigiel, Frank

    2016-09-01

I will present first results from our EMPIRE survey, a large program (~500 hr) at the IRAM 30m telescope to map high critical density gas and shock tracers (e.g., HCN, HCO+, HNC, N2H+) as well as the optically thin 1-0 lines of 13CO and C18O for the first time systematically across nine prominent, nearby disk galaxies. "How is star formation regulated across disk galaxies?" is the central question framing our science. Specifically, and building on a large suite of available ancillary data from the radio to the UV, we study, among other things, dense gas fractions and star formation efficiencies and how they vary with environment within and among nearby disk galaxies. Of particular interest is how our measurements compare to studies in the Milky Way, which predict a fairly constant star formation efficiency of the dense gas. Already in our first case study, focusing on the prominent nearby spiral galaxy M51, we find significant variations of this quantity across the disk. In my talk, I will present results from a first series of studies, about to be submitted, addressing these questions with our EMPIRE and complementary, high-resolution ALMA data. In addition, I will present details of the survey and report on ongoing projects and future directions. I will place our work in context with other work, including studies of dense gas tracers in other galaxies and in particular the Milky Way.

  7. Developing Empirically Based Models of Practice.

    ERIC Educational Resources Information Center

    Blythe, Betty J.; Briar, Scott

    1985-01-01

    Over the last decade emphasis has shifted from theoretically based models of practice to empirically based models whose elements are derived from clinical research. These models are defined and a developing model of practice through the use of single-case methodology is examined. Potential impediments to this new role are identified. (Author/BL)

  8. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  9. Differential Weighting: A Survey of Methods and Empirical Studies.

    ERIC Educational Resources Information Center

    Stanley, Julian C.; Wang, Marilyn D.

    The literature on a priori and empirical weighting of test items and test-item options is reviewed. While multiple regression is the best known technique for deriving fixed empirical weights for component variables (such as tests and test items), other methods allow one to derive weights which equalize the effective weights of the component…

  10. Prosocial Motivation and Blood Donations: A Survey of the Empirical Literature.

    PubMed

    Goette, Lorenz; Stutzer, Alois; Frey, Beat M

    2010-06-01

Recent shortages in the supply of blood donations have renewed interest in how blood donations can be increased temporarily. We survey the evidence on the role of financial and other incentives in eliciting blood donations among donors who are normally willing to donate pro bono. We present the predictions from different empirical/psychological-based theories, some predicting that incentives are effective and others predicting that incentives may undermine prosocial motivation. The evidence suggests that incentives work relatively well in settings in which donors are relatively anonymous, but it also indicates that when image concerns become important, incentives may be counterproductive, as donors do not want to be seen as greedy.

  11. Watershed-based survey designs

    USGS Publications Warehouse

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
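
    The probabilistic design elements named here (unequal probability weighting, stratification) can be illustrated with a toy site-selection sketch. The frame, strata, and size measure below are hypothetical, and the sequential weighted draw is a simple stand-in for formal unequal-probability survey designs such as GRTS, not the EPA's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical frame: one row per watershed polygon, with a stratum label
    # and an auxiliary size measure (e.g., catchment area) used for weighting.
    n_frame = 500
    strata = rng.choice(["coastal", "piedmont", "mountain"], size=n_frame)
    area = rng.lognormal(mean=2.0, sigma=0.7, size=n_frame)

    def select_sites(strata, size_measure, n_per_stratum):
        """Draw a stratified sample with inclusion probability roughly
        proportional to the size measure within each stratum (sequential
        weighted draw as a simple approximation)."""
        selected = []
        for s in np.unique(strata):
            idx = np.flatnonzero(strata == s)
            p = size_measure[idx] / size_measure[idx].sum()
            selected.extend(rng.choice(idx, size=n_per_stratum, replace=False, p=p))
        return np.array(selected)

    sites = select_sites(strata, area, n_per_stratum=10)
    print(len(sites), "sites selected")
    ```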

  12. Watershed-based survey designs.

    PubMed

    Detenbeck, Naomi E; Cincotta, Dan; Denver, Judith M; Greenlee, Susan K; Olsen, Anthony R; Pitchford, Ann M

    2005-04-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream-downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.

  13. Issues and Controversies that Surround Recent Texts on Empirically Supported and Empirically Based Treatments

    ERIC Educational Resources Information Center

    Paul, Howard A.

    2004-01-01

Since the 1993 APA task force of the Society of Clinical Psychology developed guidelines to apply data-based psychology to the identification of effective psychotherapy, there has been an increasing number of texts focusing on empirically based psychotherapy and empirically supported treatments. This manuscript examines recent key texts and…

  14. Enhanced FMAM based on empirical kernel map.

    PubMed

    Wang, Min; Chen, Songcan

    2005-05-01

The existing morphological auto-associative memory models based on morphological operations, typically including the morphological auto-associative memories (auto-MAM) proposed by Ritter et al. and our fuzzy morphological auto-associative memories (auto-FMAM), have many attractive advantages such as unlimited storage capacity, one-shot recall speed, and good noise tolerance to single erosive or dilative noise. However, they suffer from extreme vulnerability to noise that mixes erosion and dilation, resulting in great degradation of recall performance. To overcome this shortcoming, we focus on FMAM and propose an enhanced FMAM (EFMAM) based on the empirical kernel map. Although it is simple, EFMAM significantly improves on auto-FMAM with respect to recognition accuracy under hybrid noise and computational effort. Experiments conducted on thumbnail-sized faces (28 x 23 and 14 x 11) scaled from the ORL database show average accuracies of 92%, 90%, and 88% with 40 classes under 10%, 20%, and 30% randomly generated hybrid noise, respectively, far higher than auto-FMAM (67%, 46%, 31%) under the same noise levels.
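
    The empirical kernel map the abstract names re-expresses each pattern through its kernel similarities to a fixed set of training patterns. A minimal sketch of that map with an RBF kernel; the kernel choice, gamma, and array sizes are assumptions, and this shows only the map itself, not the FMAM memory built on top of it.

    ```python
    import numpy as np

    def rbf_kernel(a, b, gamma=0.5):
        """Gaussian (RBF) kernel between rows of a and rows of b."""
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def empirical_kernel_map(x, anchors, gamma=0.5):
        """Map each pattern x_i to the vector (k(x_i, z_1), ..., k(x_i, z_n))
        of kernel evaluations against a fixed set of anchor patterns z_j."""
        return rbf_kernel(x, anchors, gamma)

    rng = np.random.default_rng(0)
    patterns = rng.normal(size=(8, 644))   # e.g., flattened 28 x 23 thumbnails
    features = empirical_kernel_map(patterns, patterns)  # anchors = training set
    print(features.shape)  # (8, 8): each pattern re-expressed in kernel space
    ```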

  15. Empirical Validation and Application of the Computing Attitudes Survey

    ERIC Educational Resources Information Center

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  16. Emotional Risks to Respondents in Survey Research: Some Empirical Evidence

    PubMed Central

    Labott, Susan M.; Johnson, Timothy P.; Fendrich, Michael; Feeny, Norah C.

    2014-01-01

    Some survey research has documented distress in respondents with pre-existing emotional vulnerabilities, suggesting the possibility of harm. In this study, respondents were interviewed about a personally distressing event; mood, stress, and emotional reactions were assessed. Two days later, respondents participated in interventions to either enhance or alleviate the effects of the initial interview. Results indicated that distressing interviews increased stress and negative mood, although no adverse events occurred. Between the interviews, moods returned to baseline. Respondents who again discussed a distressing event reported moods more negative than those who discussed a neutral or a positive event. This study provides evidence that, among nonvulnerable survey respondents, interviews on distressing topics can result in negative moods and stress, but they do not harm respondents. PMID:24169422

  17. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  18. A survey on hematology-oncology pediatric AIEOP centers: prophylaxis, empirical therapy and nursing prevention procedures of infectious complications.

    PubMed

    Livadiotti, Susanna; Milano, Giuseppe Maria; Serra, Annalisa; Folgori, Laura; Jenkner, Alessandro; Castagnola, Elio; Cesaro, Simone; Rossi, Mario R; Barone, Angelica; Zanazzo, Giulio; Nesi, Francesca; Licciardello, Maria; De Santis, Raffaella; Ziino, Ottavio; Cellini, Monica; Porta, Fulvio; Caselli, Desiree; Pontrelli, Giuseppe

    2012-01-01

A nationwide questionnaire-based survey was designed to evaluate the management and prophylaxis of febrile neutropenia in pediatric patients admitted to hematology-oncology and hematopoietic stem cell transplant units. Of the 34 participating centers, 40% and 63%, respectively, continue to prescribe antibacterial and antimycotic prophylaxis in low-risk subjects, and 78% and 94% in transplant patients. Approximately half of the centers prescribe a combination antibiotic regimen as first-line therapy in low-risk patients, and up to 81% in high-risk patients. When initial empirical therapy fails after seven days, 63% of the centers add empirical antimycotic therapy in low- and 81% in high-risk patients. Overall management varies significantly across centers. Preventive nursing procedures are in accordance with international guidelines. This survey is the first to focus on prescribing practices in children with cancer and could help to implement practice guidelines.

  19. Web-Based Surveys: Not Your Basic Survey Anymore

    ERIC Educational Resources Information Center

    Bertot, John Carlo

    2009-01-01

    Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…

  20. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

El Barmi, Hammou; McKeague, Ian W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142

  1. Empirically Based Comprehensive Treatment Program for Parasuicide.

    ERIC Educational Resources Information Center

    Clum, George A.; And Others

    1979-01-01

    Suggests secondary parasuicide prevention is the most viable path for future research. Aggressive case findings and primary prevention approaches have failed to reduce suicide attempt rates. A secondary prevention model, based on factors predictive of parasuicide, was developed. Stress reduction and cognitive restructuring were primary goals of…

  2. Survey-Based Measurement of Public Management and Policy Networks

    ERIC Educational Resources Information Center

    Henry, Adam Douglas; Lubell, Mark; McCoy, Michael

    2012-01-01

    Networks have become a central concept in the policy and public management literature; however, theoretical development is hindered by a lack of attention to the empirical properties of network measurement methods. This paper compares three survey-based methods for measuring organizational networks: the roster, the free-recall name generator, and…

  3. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings.

  4. Image-Based Empirical Modeling of the Plasmasphere

    NASA Technical Reports Server (NTRS)

    Adrian, Mark L.; Gallagher, D. L.

    2008-01-01

    A new suite of empirical models of plasmaspheric plasma based on remote, global images from the IMAGE EUV instrument is proposed for development. The purpose of these empirical models is to establish the statistical properties of the plasmasphere as a function of conditions. This suite of models will mark the first time the plasmaspheric plume is included in an empirical model. Development of these empirical plasmaspheric models will support synoptic studies (such as for wave propagation and growth, energetic particle loss through collisions and dust transport as influenced by charging) and serves as a benchmark against which physical models can be tested. The ability to know that a specific global density distribution occurs in response to specific magnetospheric and solar wind factors is a huge advantage over all previous in-situ based empirical models. The consequence of creating these new plasmaspheric models will be to provide much higher fidelity and much richer quantitative descriptions of the statistical properties of plasmaspheric plasma in the inner magnetosphere, whether that plasma is in the main body of the plasmasphere, nearby during recovery or in the plasmaspheric plume. Model products to be presented include statistical probabilities for being in the plasmasphere, near thermal He+ density boundaries and the complexity of its spatial structure.

  5. WATERSHED BASED SURVEY DESIGNS

    EPA Science Inventory

    The development of watershed-based design and assessment tools will help to serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional condition to meet Section 305(b), identification of impaired water bodies or wate...

  6. Empirical Likelihood-Based Confidence Interval of ROC Curves.

    PubMed

    Su, Haiyan; Qin, Yongsong; Liang, Hua

    2009-11-01

    In this article we propose an empirical likelihood-based confidence interval for receiver operating characteristic curves which are based on a continuous-scale test. The approach is easily understood, simply implemented, and computationally efficient. The results from our simulation studies indicate that the finite-sample numerical performance slightly outperforms the most promising methods published recently. Two real datasets are analyzed by using the proposed method and the existing bootstrap-based method.

  7. Responses to Commentaries on Advances in Empirically Based Assessment.

    ERIC Educational Resources Information Center

    McConaughy, Stephanie H.

    1993-01-01

    Author of article (this issue) describing research program to advance assessment of children's behavioral and emotional problems; presenting conceptual framework for multiaxial empirically based assessment; and summarizing research efforts to develop cross-informant scales for scoring parent, teacher, and self-reports responds to commentaries on…

  8. Empirical Data Sets for Agent Based Modeling of Crowd Scenarios

    DTIC Science & Technology

    2009-08-06

Crowd research involves large numbers of heterogeneous, individual actors, interdependence, and language barriers. Empirical testing is difficult, and simulations require models based on real data; otherwise they are fiction.

  9. Unsupervised self-organized mapping: a versatile empirical tool for object selection, classification and redshift estimation in large surveys

    NASA Astrophysics Data System (ADS)

    Geach, James E.

    2012-01-01

We present an application of unsupervised machine learning - the self-organized map (SOM) - as a tool for visualizing, exploring and mining the catalogues of large astronomical surveys. Self-organization culminates in a low-resolution representation of the 'topology' of a parameter volume, and this can be exploited in various ways pertinent to astronomy. Using data from the Cosmological Evolution Survey (COSMOS), we demonstrate two key astronomical applications of the SOM: (i) object classification and selection, using galaxies with active galactic nuclei as an example, and (ii) photometric redshift estimation, illustrating how SOMs can be used as totally empirical predictive tools. With a training set of ~3800 galaxies with z_spec ≤ 1, we achieve photometric redshift accuracies competitive with other (mainly template fitting) techniques that use a similar number of photometric bands [σ(Δz) = 0.03 with a ~2 per cent outlier rate when using u* band to 8 μm photometry]. We also test the SOM as a photo-z tool using the PHoto-z Accuracy Testing (PHAT) synthetic catalogue of Hildebrandt et al., which compares several different photo-z codes using a common input/training set. We find that the SOM can deliver accuracies that are competitive with many of the established template fitting and empirical methods. This technique is not without clear limitations, which are discussed, but we suggest it could be a powerful tool in the era of extremely large ('petabyte') databases where efficient data mining is a paramount concern.
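
    A hedged sketch of SOM-based photometric redshift estimation in the spirit of this paper: train a SOM on photometry, then label each cell with the mean spectroscopic redshift of the training galaxies it wins. MiniSom is a third-party library standing in for whatever SOM implementation the authors used, and the synthetic data below is purely illustrative.

    ```python
    import numpy as np
    from minisom import MiniSom  # pip install minisom

    rng = np.random.default_rng(1)
    n_train, n_bands = 3800, 10                 # ~3800 galaxies, several bands
    colors = rng.normal(size=(n_train, n_bands))
    z_spec = np.clip(0.5 + 0.1 * colors[:, 0]
                     + 0.05 * rng.normal(size=n_train), 0, 1)

    som = MiniSom(20, 20, n_bands, sigma=2.0, learning_rate=0.5, random_seed=1)
    som.train_random(colors, num_iteration=10000)

    # Attach the mean z_spec of the training galaxies that map to each cell.
    cell_z = {}
    for c, z in zip(colors, z_spec):
        cell_z.setdefault(som.winner(c), []).append(z)
    cell_z = {k: np.mean(v) for k, v in cell_z.items()}

    def photo_z(galaxy_colors):
        """Predict z for a galaxy from its winning cell's mean z_spec."""
        return cell_z.get(som.winner(galaxy_colors), np.nan)

    print(photo_z(colors[0]), "vs spec-z", z_spec[0])
    ```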

  10. Space Based Dark Energy Surveys

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    2016-03-01

Dark energy, the name given to the cause of the accelerating expansion of the Universe, is one of the most tantalizing mysteries in modern physics. Current cosmological models hold that dark energy is currently the dominant component of the Universe, but the exact nature of DE remains poorly understood. There are ambitious ground-based surveys underway that seek to understand DE, and NASA is participating in the development of significantly more ambitious space-based surveys planned for the next decade. NASA has provided mission-enabling technology to the European Space Agency's (ESA) Euclid mission in exchange for US scientists to participate in the Euclid mission. NASA is also developing the Wide Field Infrared Survey Telescope-Astrophysics Focused Telescope Asset (WFIRST-AFTA) mission for possible launch in 2024. WFIRST was the highest-ranked space mission in the Astro2010 Decadal Survey, and the current design uses a 2.4m space telescope to go beyond what was then envisioned. Understanding DE is one of the primary science goals of WFIRST-AFTA. This talk will review the state of DE, the relevant activities of the Cosmic Structure Interest Group (CoSSIG) of the PhysPAG, and detail the status and complementarity between Euclid, WFIRST, and other ambitious ground-based efforts.

  11. Nonparametric Bayes Factors Based On Empirical Likelihood Ratios

    PubMed Central

    Vexler, Albert; Deng, Wei; Wilding, Gregory E.

    2012-01-01

    Bayes methodology provides posterior distribution functions based on parametric likelihoods adjusted for prior distributions. A distribution-free alternative to the parametric likelihood is use of empirical likelihood (EL) techniques, well known in the context of nonparametric testing of statistical hypotheses. Empirical likelihoods have been shown to exhibit many of the properties of conventional parametric likelihoods. In this article, we propose and examine Bayes factors (BF) methods that are derived via the EL ratio approach. Following Kass & Wasserman [10], we consider Bayes factors type decision rules in the context of standard statistical testing techniques. We show that the asymptotic properties of the proposed procedure are similar to the classical BF’s asymptotic operating characteristics. Although we focus on hypothesis testing, the proposed approach also yields confidence interval estimators of unknown parameters. Monte Carlo simulations were conducted to evaluate the theoretical results as well as to demonstrate the power of the proposed test. PMID:23180904

  12. What Can Student Perception Surveys Tell Us about Teaching? Empirically Testing the Underlying Structure of the Tripod Student Perception Survey

    ERIC Educational Resources Information Center

    Wallace, Tanner LeBaron; Kelcey, Benjamin; Ruzek, Erik

    2016-01-01

    We conducted a theory-based analysis of the underlying structure of the Tripod student perception survey instrument using the Measures of Effective Teaching (MET) database (N = 1,049 middle school math class sections; N = 25,423 students). Multilevel item factor analyses suggested that an alternative bifactor structure best fit the Tripod items,…

  13. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data.

    PubMed

    Shannon, Graeme; Lewis, Jesse S; Gerber, Brian D

    2014-01-01

Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; explicitly recognizing that given a species occupies an area the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10-120 cameras) and occasions (20-120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common species with

  14. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  15. Video watermarking with empirical PCA-based decoding.

    PubMed

    Khalilian, Hanieh; Bajic, Ivan V

    2013-12-01

A new method for video watermarking is presented in this paper. In the proposed method, data are embedded in the LL subband of wavelet coefficients, and decoding is performed based on the comparison among the elements of the first principal component resulting from empirical principal component analysis (PCA). The locations for data embedding are selected such that they offer the most robust PCA-based decoding. Data are inserted in the LL subband in an adaptive manner based on the energy of high frequency subbands and visual saliency. Extensive testing was performed under various types of attacks, such as spatial attacks (uniform and Gaussian noise and median filtering), compression attacks (MPEG-2, H.263, and H.264), and temporal attacks (frame repetition, frame averaging, frame swapping, and frame rate conversion). The results show that the proposed method offers improved performance compared with several methods from the literature, especially under additive noise and compression attacks.

  16. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-10-01

An empirically based, open source, optoelectronic model is constructed to accurately simulate organic photovoltaic (OPV) devices. Bulk heterojunction OPV devices based on a new low band gap dithienothiophene-diketopyrrolopyrrole donor polymer (P(TBT-DPP)) are blended with PC70BM and processed under various conditions, with efficiencies up to 4.7%. The mobilities of electrons and holes, bimolecular recombination coefficients, exciton quenching efficiencies in donor and acceptor domains and optical constants of these devices are measured and input into the simulator to yield photocurrent with less than 7% error. The results from this model not only show carrier activity in the active layer but also elucidate new routes of device optimization by varying donor-acceptor composition as a function of position. Sets of high and low performance devices are investigated and compared side-by-side.

  17. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining the torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, which in this case is a ratchet wrench push and pull operation. What is presented here is a comparison of the computed or predicted results of the model with the actual measured values for the composite motion.
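
    A sketch of the least-squares reduction this abstract describes, assuming the second-degree terms in position and velocity enter one regression with a design matrix of [1, position, position², velocity, velocity²]. The measurements are synthetic and the design-matrix layout is one plausible reading of the text, not the authors' exact table of equations.

    ```python
    import numpy as np

    # Hypothetical isolated-joint measurements: angular position (deg),
    # angular velocity (deg/s), and the maximum torque observed (N*m).
    rng = np.random.default_rng(7)
    pos = rng.uniform(0, 120, size=200)
    vel = rng.uniform(-180, 180, size=200)
    torque = (40 - 0.1 * (pos - 60) ** 2 / 60 - 0.05 * vel
              + rng.normal(0, 1.5, 200))

    # Second-degree polynomial in each variable, fit by ordinary least squares.
    X = np.column_stack([np.ones_like(pos), pos, pos**2, vel, vel**2])
    coef, *_ = np.linalg.lstsq(X, torque, rcond=None)

    def predicted_torque(p, v):
        """Evaluate the fitted torque surface at position p, velocity v."""
        return coef @ np.array([1.0, p, p**2, v, v**2])

    print(predicted_torque(60.0, 0.0))   # torque near mid-range, static
    ```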

  18. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  19. Empirical Likelihood-Based ANOVA for Trimmed Means

    PubMed Central

    Velina, Mara; Valeinis, Janis; Greco, Luca; Luta, George

    2016-01-01

    In this paper, we introduce an alternative to Yuen’s test for the comparison of several population trimmed means. This nonparametric ANOVA type test is based on the empirical likelihood (EL) approach and extends the results for one population trimmed mean from Qin and Tsao (2002). The results of our simulation study indicate that for skewed distributions, with and without variance heterogeneity, Yuen’s test performs better than the new EL ANOVA test for trimmed means with respect to control over the probability of a type I error. This finding is in contrast with our simulation results for the comparison of means, where the EL ANOVA test for means performs better than Welch’s heteroscedastic F test. The analysis of a real data example illustrates the use of Yuen’s test and the new EL ANOVA test for trimmed means for different trimming levels. Based on the results of our study, we recommend the use of Yuen’s test for situations involving the comparison of population trimmed means between groups of interest. PMID:27690063

  20. 'The Healthy Migrant Effect' for Mental Health in England: Propensity-score Matched Analysis Using the EMPIRIC Survey.

    PubMed

    Dhadda, Amrit; Greene, Giles

    2017-04-07

Evidence has demonstrated that immigrants have a mental health advantage over the indigenous population of developed countries. However, much of the evidence base demonstrating this mental health advantage is susceptible to confounding and inadequate adjustment across immigrant and non-immigrant groups, preventing a rigorous assessment of a 'healthy migrant effect'. This study compared the risk of common mental disorders in the immigrant and non-immigrant populations of ethnic minority groups in England. A propensity-score matched analysis was carried out to adequately balance immigrant and non-immigrant groups for known confounders using the EMPIRIC national survey of Black-Caribbean, Indian, Pakistani and Bangladeshi groups. The mental health of participants was assessed using the validated Revised Clinical Interview Schedule tool. Immigrant participants were significantly less likely to have a common mental disorder than non-immigrant participants; OR = 0.47 (95% CI 0.40, 0.56). The results from this study demonstrate that a mental health advantage exists in ethnic minority immigrants compared to non-immigrants when balancing the two groups for confounding factors. This may be due to immigrants possessing certain personality traits, such as "psychological hardiness", that the migration process may select for.

  1. Assessing formal teaching of ethics in physiology: an empirical survey, patterns, and recommendations.

    PubMed

    Goswami, Nandu; Batzel, Jerry Joseph; Hinghofer-Szalkay, Helmut

    2012-09-01

    Ethics should be an important component of physiological education. In this report, we examined to what extent teaching of ethics is formally being incorporated into the physiology curriculum. We carried out an e-mail survey in which we asked the e-mail recipients whether their institution offered a course or lecture on ethics as part of the physiology teaching process at their institution, using the following query: "We are now doing an online survey in which we would like to know whether you offer a course or a lecture on ethics as part of your physiology teaching curriculum." The response rate was 53.3%: we received 104 responses of a total of 195 sent out. Our responses came from 45 countries. While all of our responders confirmed that there was a need for ethics during medical education and scientific training, the degree of inclusion of formal ethics in the physiology curriculum varied widely. Our survey showed that, in most cases (69%), including at our Medical University of Graz, ethics in physiology is not incorporated into the physiology curriculum. Given this result, we suggest specific topics related to ethics and ethical considerations that could be integrated into the physiology curriculum. We present here a template example of a lecture "Teaching Ethics in Physiology" (structure, content, examples, and references), which was based on guidelines and case reports provided by experts in this area (e.g., Benos DJ. Ethics revisited. Adv Physiol Educ 25: 189-190, 2001). This lecture, which we are presently using in Graz, could be used as a base that could lead to greater awareness of important ethical issues in students at an early point in the educational process.

  2. Arduino based radiation survey meter

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Muzakkir, Amir; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee

    2016-01-01

This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce counts per second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsieverts per hour (μSv hr⁻¹). The conversion factor (CF) for converting CPM to μSv hr⁻¹ determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 µSv/hr.
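
    The CPS-to-dose-rate step reduces to simple arithmetic once a conversion factor is fixed. A worked sketch of that conversion (in Python rather than Arduino C); the CF value is purely hypothetical, since real values come from the tube's data sheet or the calibration procedure the authors describe.

    ```python
    # Worked version of the CPS -> dose-rate conversion the record describes.
    # CF below is a hypothetical placeholder, not the LND7121's actual factor.
    CF_USVH_PER_CPM = 0.0057   # hypothetical uSv/h per CPM, for illustration

    def dose_rate_usv_per_h(counts: int, seconds: float) -> float:
        """Convert a pulse count over a timing window to uSv/h."""
        cps = counts / seconds
        cpm = cps * 60.0
        return cpm * CF_USVH_PER_CPM

    # e.g., 85 pulses counted in a 1-second Timer1 window:
    print(round(dose_rate_usv_per_h(85, 1.0), 2), "uSv/h")
    ```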

  3. Arduino based radiation survey meter

    SciTech Connect

Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee; Muzakkir, Amir

    2016-01-22

This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce counts per second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsieverts per hour (μSv hr⁻¹). The conversion factor (CF) for converting CPM to μSv hr⁻¹ determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurements are found to be linear for dose rates below 3500 µSv/hr.

  4. Short memory or long memory: an empirical survey of daily rainfall data

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.

    2012-10-01

A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts, proposed by Geweke and Porter-Hudak, to analyze nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test with an F-statistic for the break date was applied; break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirm that there is long memory in the data-generating process (DGP). Therefore this evidence points to true long memory rather than long memory induced by structural breaks.
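
    A compact sketch of the Geweke and Porter-Hudak log-periodogram estimator used here: regress the log periodogram at low Fourier frequencies on log(4 sin²(λ/2)) and read d off the slope. The bandwidth m = n^0.5 is a conventional choice, not necessarily the one the authors used.

    ```python
    import numpy as np

    def gph_estimate(x, bandwidth_power=0.5):
        """Geweke & Porter-Hudak log-periodogram estimate of the fractional
        differencing parameter d, using the lowest m ~ n**bandwidth_power
        Fourier frequencies."""
        n = len(x)
        m = int(n ** bandwidth_power)
        freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
        # Periodogram at the first m Fourier frequencies.
        fft = np.fft.fft(x - np.mean(x))
        periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2.0 * np.pi * n)
        reg_x = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
        reg_y = np.log(periodogram)
        slope = np.polyfit(reg_x, reg_y, 1)[0]
        return -slope   # d > 0 suggests long memory

    rng = np.random.default_rng(3)
    white = rng.normal(size=2000)           # short-memory benchmark
    print(round(gph_estimate(white), 3))    # should hover near 0
    ```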

  5. Structural break or long memory: an empirical survey on daily rainfall data sets across Malaysia

    NASA Astrophysics Data System (ADS)

    Yusof, F.; Kane, I. L.; Yusop, Z.

    2013-04-01

A short memory process that encounters occasional structural breaks in mean can show a slower rate of decay in the autocorrelation function and other properties of fractionally integrated I(d) processes. In this paper we employed a procedure for estimating the fractional differencing parameter in semiparametric contexts, proposed by Geweke and Porter-Hudak (1983), to analyse nine daily rainfall data sets across Malaysia. The results indicate that all the data sets exhibit long memory. Furthermore, an empirical fluctuation process using the ordinary least squares (OLS)-based cumulative sum (CUSUM) test for the break date was applied. Break dates were detected in all data sets. The data sets were partitioned according to their respective break dates, and a further test for long memory was applied to all subseries. Results show that all subseries follow the same pattern as the original series. The estimates of the fractional parameters d1 and d2 on the subseries obtained by splitting the original series at the break date confirm that there is long memory in the data generating process (DGP). Therefore this evidence points to true long memory rather than long memory induced by structural breaks.

  6. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-04-01

We develop an empirically based optoelectronic model to accurately simulate the photocurrent in organic photovoltaic (OPV) devices with novel materials, including bulk heterojunction OPV devices based on a new low band gap dithienothiophene-DPP donor polymer, P(TBT-DPP), blended with PC70BM at various donor-acceptor weight ratios and solvent compositions. Our devices exhibit power conversion efficiencies ranging from 1.8% to 4.7% at AM 1.5G. Electron and hole mobilities are determined using space-charge limited current measurements. Bimolecular recombination coefficients are both analytically calculated using slowest-carrier limited Langevin recombination and measured using an electro-optical pump-probe technique. Exciton quenching efficiencies in the donor and acceptor domains are determined from photoluminescence spectroscopy. In addition, dielectric and optical constants are experimentally determined. The photocurrent and its bias dependence simulated by our optoelectronic model, which takes these physically measured parameters into account, show less than 7% error with respect to the experimental photocurrent, whether the experimentally or the semi-analytically determined recombination coefficient is used. Free carrier generation and recombination rates are modeled as a function of position in the active layer at various applied biases. These results show that while free carrier generation is maximized in the center of the device, free carrier recombination is most dominant near the electrodes, even in high performance devices. Such knowledge of carrier activity is essential for the optimization of the active layer by enhancing light trapping and minimizing recombination. Our simulation program is intended to be freely distributed for use in laboratories fabricating OPV devices.

  7. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

Background: The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcomed broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion: The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked. Summary: The "empirical turn" in bioethics signals a need for

  8. Methods for Evaluating Respondent Attrition in Web-Based Surveys

    PubMed Central

    Sabo, Roy T; Krist, Alex H; Day, Teresa; Cyrus, John; Woolf, Steven H

    2016-01-01

Background: Electronic surveys are convenient, cost effective, and increasingly popular tools for collecting information. While the online platform allows researchers to recruit and enroll more participants, there is an increased risk of participant dropout in Web-based research. Often, these dropout trends are simply reported, adjusted for, or ignored altogether. Objective: To propose a conceptual framework that analyzes respondent attrition and demonstrates the utility of these methods with existing survey data. Methods: First, we suggest visualization of attrition trends using bar charts and survival curves. Next, we propose a generalized linear mixed model (GLMM) to detect or confirm significant attrition points. Finally, we suggest applications of existing statistical methods to investigate the effect of internal survey characteristics and patient characteristics on dropout. In order to apply this framework, we conducted a case study using a seventeen-item Informed Decision-Making (IDM) module addressing how and why patients make decisions about cancer screening. Results: Using the framework, we were able to find significant attrition points at Questions 4, 6, 7, and 9, and were also able to identify participant responses and characteristics associated with dropout at these points and overall. Conclusions: When these methods were applied to survey data, significant attrition trends were revealed, both visually and empirically, that can inspire researchers to investigate the factors associated with survey dropout, address whether survey completion is associated with health outcomes, and compare attrition patterns between groups. The framework can be used to extract information beyond simple responses, can be useful during survey development, and can help determine the external validity of survey results. PMID:27876687
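
    The retention bookkeeping behind the proposed bar charts and survival curves can be sketched directly. The response matrix below is synthetic, the 17 items simply echo the IDM module's length, and the 5% flag threshold is an arbitrary illustration, not the paper's GLMM test.

    ```python
    import numpy as np

    # Hypothetical response matrix: rows are respondents, columns survey items;
    # NaN marks a question left unanswered (monotone dropout assumed here).
    rng = np.random.default_rng(5)
    n_resp, n_items = 300, 17
    drop_at = rng.choice(np.arange(4, n_items + 1), size=n_resp,
                         p=np.r_[0.15, np.full(12, 0.02), 0.61])
    answers = np.where(np.arange(n_items) < drop_at[:, None],
                       rng.integers(1, 6, (n_resp, n_items)).astype(float),
                       np.nan)

    # Survival-style retention curve: share still responding at each item.
    retention = np.mean(~np.isnan(answers), axis=0)
    drops = -np.diff(np.r_[1.0, retention])   # attrition at each item
    for i, (r, d) in enumerate(zip(retention, drops), start=1):
        flag = "  <-- candidate attrition point" if d > 0.05 else ""
        print(f"Q{i:2d}: retained {r:5.1%}{flag}")
    ```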

  9. MAIS: An Empirically-Based Intelligent CBI System.

    ERIC Educational Resources Information Center

    Christensen, Dean L.; Tennyson, Robert D.

The goal of the programmatic research for the Minnesota Adaptive Instructional System (MAIS), an intelligent computer-assisted instruction system, is to empirically investigate generalizable instructional variables and conditions that improve learning through the use of adaptive instructional strategies. Research has been initiated in the…

  10. A Survey of Graduate Training in Empirically Supported and Manualized Treatments: A Preliminary Report

    ERIC Educational Resources Information Center

    Karekla, Maria; Lundgren, Jennifer D.; Forsyth, John P.

    2004-01-01

    The promotion and dissemination of empirically supported (ESTs) and manualized therapies are important, albeit controversial, developments within clinical science and practice. To date, studies evaluating training opportunities and attitudes about such treatments at the graduate, predoctoral internship, and postdoctoral levels have focused on the…

  11. Exoplanet Demographics with a Space-Based Microlensing Survey

    NASA Astrophysics Data System (ADS)

    Gaudi, B. Scott

    2012-05-01

    Measurements of the frequency of exoplanets over a broad range of planet and host star properties provide fundamental empirical constraints on theories of planet formation and evolution. Because of its unique sensitivity to low-mass, long-period, and free-floating planets, microlensing is an essential complement to our arsenal of planet detection methods. I motivate microlensing surveys for exoplanets, and in particular describe how they can be used to test the currently-favored paradigm for planet formation, as well as inform our understanding of the frequency and potential habitability of low-mass planets located in the habitable zones of their host stars. I explain why a space-based mission is necessary to realize the full potential of microlensing, and outline the expected returns of such surveys. When combined with the results from complementary surveys such as Kepler, a space-based microlensing survey will yield a nearly complete picture of the demographics of planetary systems throughout the Galaxy.

  12. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests, so knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  13. Comparison of empirical, semi-empirical and physically based models of soil hydraulic functions derived for bi-modal soils

    NASA Astrophysics Data System (ADS)

    Kutílek, M.; Jendele, L.; Krejča, M.

    2009-02-01

The accelerated flow in soil pores is responsible for rapid transport of pollutants from the soil surface to deeper layers and eventually to groundwater. The term preferential flow is used for this type of transport. Our study was aimed at the preferential flow realized in the structural porous domain of bi-modal soils. We compared equations describing the soil water retention function h(θ) and the unsaturated hydraulic conductivity K(h) (or K(θ)), modified for bi-modal soils, where θ is the soil water content and h is the pressure head. An analytical description of a curve passing through an experimental data set of a soil hydraulic function is typical of an empirical equation, characterized by fitting parameters only. If the measured data are described by an equation derived from a physical model without fitting parameters, we speak of a physically based model. Several transitional subtypes exist between empirical and physically based models; they are denoted semi-empirical or semi-physical. We tested 3 models of the soil water retention function and 3 models of unsaturated conductivity using experimental data sets for sand, silt, silt loam and loam. All of the soils used are typified by the bi-modality of their porous systems. Model efficiency was estimated by the RMSE (root mean square error) and the RSE (relative square error). The semi-empirical equation of the soil water retention function had the lowest values of RMSE and RSE and was qualified as "optimal" for the formal description of the shape of the water retention function. With this equation, the fit of the modelled data to experiments was the closest. The fitting parameters smoothed the difference between the model and the physical reality of the soil porous media. The physical equation based upon the model of the pore size distribution did not allow exact fitting of the modelled data to the experimental data due to the rigidity and simplicity of the physical model when compared to the real soil

  14. WATERSHED-BASED SURVEY DESIGNS

    EPA Science Inventory

    Water-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Sectio...

  15. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    PubMed

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-01-11

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters (including social groups, relationships, and communication variables, also from survey data) are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.
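
    A minimal sketch of the kind of agent-based opinion model described above: household agents hold an opinion about water reuse, adjust it through discussion with network neighbours, and adopt once a threshold is crossed. The update rule, threshold, and network density are assumptions for illustration, not the paper's survey-calibrated parameters.

        import random

        class Household:
            def __init__(self, opinion, threshold=0.7):
                self.opinion = opinion        # in [0, 1]; 1 = fully favourable
                self.threshold = threshold    # adoption cutoff (assumed)
                self.adopted = False

        def step(agents, neighbours, influence=0.1):
            # Each agent discusses with one random neighbour and moves its
            # opinion a fraction of the way toward the neighbour's opinion.
            for i, a in enumerate(agents):
                if neighbours[i]:
                    peer = agents[random.choice(neighbours[i])]
                    a.opinion += influence * (peer.opinion - a.opinion)
                a.adopted = a.adopted or a.opinion >= a.threshold

        random.seed(1)
        agents = [Household(random.random()) for _ in range(100)]
        neighbours = [[j for j in range(100) if j != i and random.random() < 0.05]
                      for i in range(100)]
        for _ in range(50):
            step(agents, neighbours)
        adoption_rate = sum(a.adopted for a in agents) / len(agents)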

  16. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  17. Cloud Based Processing of Large Photometric Surveys

    NASA Astrophysics Data System (ADS)

    Farivar, R.; Brunner, R. J.; Santucci, R.; Campbell, R.

    2013-10-01

    Astronomy, as is the case with many scientific domains, has entered the realm of being a data rich science. Nowhere is this reflected more clearly than in the growth of large area surveys, such as the recently completed Sloan Digital Sky Survey (SDSS) or the Dark Energy Survey, which will soon obtain PB of imaging data. The data processing on these large surveys is a major challenge. In this paper, we demonstrate a new approach to this common problem. We propose the use of cloud-based technologies (e.g., Hadoop MapReduce) to run a data analysis program (e.g., SExtractor) across a cluster. Using the intermediate key/value pair design of Hadoop, our framework matches objects across different SExtractor invocations to create a unified catalog from all SDSS processed data. We conclude by presenting our experimental results on a 432 core cluster and discuss the lessons we have learned in completing this challenge.
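
    The key/value matching idea can be sketched as follows: a mapper emits a coarse sky-position bin as the join key for every SExtractor detection, and a reducer merges detections that share a key into one unified-catalog row. In Hadoop Streaming the two functions would run as separate scripts over stdin; the record layout and the 0.01-degree bin key below are assumptions for illustration.

        from collections import defaultdict

        # mapper: one detection per input line -> (spatial-bin key, payload)
        def mapper(lines):
            for line in lines:
                run_id, ra, dec, mag = line.split()
                key = f"{float(ra):.2f}:{float(dec):.2f}"  # coarse sky bin as join key
                yield key, (run_id, float(mag))

        # reducer: detections sharing a key become one unified-catalog row
        def reducer(pairs):
            groups = defaultdict(list)
            for key, payload in pairs:
                groups[key].append(payload)
            for key, dets in groups.items():
                mags = [m for _, m in dets]
                yield key, len(dets), sum(mags) / len(mags)  # mean magnitude

        detections = ["r1 150.001 2.301 18.2", "r2 150.004 2.299 18.3"]
        catalog = list(reducer(mapper(detections)))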

  18. Organizational Communication--An Analysis Based on Empirical Data.

    ERIC Educational Resources Information Center

    Brenner, Marshall H.; Sigband, Norman B.

    This two-phase study examined communication style and practice in a major aerospace firm. First, sixty-five high-level company managers were interviewed to ascertain their views on communication problems and practices within the firm. Second, a questionnaire survey was distributed to 700 supervisors and non-supervisory personnel, soliciting…

  19. Empirical Mode Decomposition Based Features for Diagnosis and Prognostics of Systems

    DTIC Science & Technology

    2008-04-01

    [OCR fragments of the report's reference list and title page; recoverable details: a Journal of Vibration and Acoustics article (July 2001, ASME) on bearing fault diagnosis, their effectiveness and flexibilities; Staszewski, W. J., "Structural..."; "Empirical Mode Decomposition Based Features for Diagnosis and Prognostics of Systems," by Hiralal Khatri, Kenneth Ranney, Kwok Tom, and Romeo...; Army Research Laboratory, Adelphi, MD 20783-1197, ARL-TR-4301, April 2008.]

  20. Landfill modelling in LCA - a contribution based on empirical data.

    PubMed

    Obersteiner, Gudrun; Binner, Erwin; Mostbauer, Peter; Salhofer, Stefan

    2007-01-01

    Landfills at various stages of development, depending on their age and location, can be found throughout Europe. Facility types range from uncontrolled dumpsites to highly engineered facilities with leachate and gas management. In addition, some landfills are designed to receive untreated waste, while others can receive incineration residues (MSWI) or residues after mechanical biological treatment (MBT). The dimension, type and duration of the emissions from landfills depend on the quality of the disposed waste, the technical design, and the location of the landfill. Environmental impacts are produced by the leachate (heavy metals, organic loading), by emissions into the air (CH4, hydrocarbons, halogenated hydrocarbons) and by the energy or fuel requirements for the operation of the landfill (SO2 and NOx from the production of electricity from fossil fuels). Including landfilling in a life-cycle assessment (LCA) entails several methodological questions (multi-input process, site-specific influence, time dependency). Additionally, no experience is available with regard to the mid-term behaviour (decades) of the relatively new types of landfill (MBT landfills, landfills for residues from MSWI). The present paper focuses on two main issues concerning the modelling of landfills in LCA. Firstly, it is an acknowledged fact that emissions from landfills may prevail for a very long time, often thousands of years or longer; the choice of time frame in the LCA of landfilling may therefore clearly affect the results. Secondly, the reliability of results obtained through a life-cycle assessment depends on the availability and quality of Life Cycle Inventory (LCI) data; the choice of the general approach, a multi-input inventory tool versus empirical results, may therefore also influence the results. In this paper the different approaches concerning time horizon and LCI will be introduced and discussed. In the application of empirical results, the presence of

  1. On Consumer Self-Direction of Attendant Care Services: An Empirical Analysis of Survey Responses.

    ERIC Educational Resources Information Center

    Asher, Cheryl C.; And Others

    1991-01-01

    The concept of attendant care--provision of personal services to severely disabled individuals--is presented. Data from a survey of about 340 out of 718 consumers of attendant care indicate the existence of a mix of consumer-oriented programs. Consumer preference for a particular program design appeared to be governed by experience. (SLD)

  2. A Survey and Empirical Study of Virtual Reference Service in Academic Libraries

    ERIC Educational Resources Information Center

    Mu, Xiangming; Dimitroff, Alexandra; Jordan, Jeanette; Burclaff, Natalie

    2011-01-01

    Virtual Reference Services (VRS) have high user satisfaction. The main problem is their low usage. We surveyed 100 academic library web sites to understand how VRS are presented. We then conducted a usability study to further test an active VRS model regarding its effectiveness.

  3. An Empirical Taxonomy of Youths' Fears: Cluster Analysis of the American Fear Survey Schedule

    ERIC Educational Resources Information Center

    Burnham, Joy J.; Schaefer, Barbara A.; Giesen, Judy

    2006-01-01

    Fear profiles among children and adolescents were explored using the Fear Survey Schedule for Children-American version (FSSC-AM; J.J. Burnham, 1995, 2005). Eight cluster profiles were identified via multistage Euclidean grouping and supported by homogeneity coefficients and replication. Four clusters reflected overall level of fears (i.e., very…

  4. Attachment-Based Family Therapy: A Review of the Empirical Support.

    PubMed

    Diamond, Guy; Russon, Jody; Levy, Suzanne

    2016-09-01

    Attachment-based family therapy (ABFT) is an empirically supported treatment designed to capitalize on the innate, biological desire for meaningful and secure relationships. The therapy is grounded in attachment theory and provides an interpersonal, process-oriented, trauma-focused approach to treating adolescent depression, suicidality, and trauma. Although a process-oriented therapy, ABFT offers a clear structure and road map to help therapists quickly address attachment ruptures that lie at the core of family conflict. Several clinical trials and process studies have demonstrated empirical support for the model and its proposed mechanism of change. This article provides an overview of the clinical model and the existing empirical support for ABFT.

  5. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    NASA Astrophysics Data System (ADS)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

    Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and the characterization of laser-irradiated tissue. However, an accurate and simple analytical equation for estimating diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling diffuse reflectance from tissue.
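
    A reduced sketch of the lookup-table-plus-surface-fit idea: tabulate reflectance over absorption and reduced scattering coefficients, fit a simple polynomial surface by least squares, and check the residual variance. The closed-form "lookup table" below stands in for a Monte Carlo run, and the polynomial basis is an assumption, not the authors' formula.

        import numpy as np

        mu_a = np.linspace(0.01, 1.0, 20)    # absorption coefficient grid (1/mm)
        mu_s = np.linspace(5.0, 30.0, 20)    # reduced scattering grid (1/mm)
        A, S = np.meshgrid(mu_a, mu_s)
        R = S / (S + 15.0 * A)               # placeholder for MC-computed reflectance

        # Fit R ~ c0 + c1*mu_a + c2*mu_s' + c3*mu_a*mu_s' by least squares.
        X = np.column_stack([np.ones(A.size), A.ravel(), S.ravel(), (A * S).ravel()])
        coef, *_ = np.linalg.lstsq(X, R.ravel(), rcond=None)
        residual = R.ravel() - X @ coef
        rel_var = residual.var() / R.ravel().var()   # analogue of the <1% variance check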

  6. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    ERIC Educational Resources Information Center

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  7. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  8. Obesity, weight status and employability: empirical evidence from a French national survey.

    PubMed

    Paraponaris, Alain; Saliba, Bérengère; Ventelou, Bruno

    2005-07-01

    We investigate the relationship between employability and obesity, particularly how obesity and overweight are associated with the percentage of working years spent unemployed and the ability to regain employment. Data for adults who responded to the 2003 Decennial Health Survey collected by the French National Institute of Statistics and Economic Studies revealed that the percentage of time spent unemployed during working years is significantly higher for each kg/m2 deviation from the mean body mass index (BMI) attained at age 20 and that the probability of regaining employment after a period of unemployment is much lower.

  9. Topological phase transition of single-crystal Bi based on empirical tight-binding calculations

    NASA Astrophysics Data System (ADS)

    Ohtsubo, Yoshiyuki; Kimura, Shin-ichi

    2016-12-01

    The topological order of single-crystal Bi and its surface states on the (111) surface are studied in detail based on empirical tight-binding (TB) calculations. New TB parameters are presented that are used to calculate the surface states of semi-infinite single-crystal Bi(111), which agree with the experimental angle-resolved photoelectron spectroscopy results. The influence of the crystal lattice distortion is surveyed and it is revealed that a topological phase transition is driven by in-plane expansion with topologically non-trivial bulk bands. In contrast with the semi-infinite system, the surface-state dispersions on finite-thickness slabs are non-trivial irrespective of the bulk topological order. The role of the interaction between the top and bottom surfaces in the slab is systematically studied, and it is revealed that a very thick slab is required to properly obtain the bulk topological order of Bi from the (111) surface state: above 150 biatomic layers in this case.

  10. An Empirical Analysis of Knowledge Based Hypertext Navigation

    PubMed Central

    Snell, J.R.; Boyle, C.

    1990-01-01

    Our purpose is to investigate the effectiveness of knowledge-based navigation in a dermatology hypertext network. The chosen domain is a set of dermatology class notes implemented in HyperCard and SINS. The study measured the time, number of moves, and success rates for subjects finding solutions to ten questions by navigating within the dermatology hypertext network. Our results indicate that knowledge-based navigation can assist the user in finding information of interest in fewer node visits (moves) than traditional button-based browsing or keyword searching. The time necessary to find an item of interest was lower for the traditional methods. There was no difference in success rates between the two test groups.

  11. Guidelines for Establishing Coastal Survey Base Lines.

    DTIC Science & Technology

    1981-11-01

    [OCR fragments of the report text and references: "...(1954) and Czerniak (1972b), that contribute to the value of a monument include marking the monument with its station along the base line and its date..."; "...foot of the original location and within ±0.05 foot of the original elevation (Czerniak, 1972a), which is normally accurate enough. With the exception of..."; "...Memorandum C&GSTM-4, Environmental Science Services Administration, U.S. Coast and Geodetic Survey, Rockville, Md., 1968"; "Czerniak, M.T., 'Review of...'"]

  12. Capability deprivation of people with Alzheimer's disease: An empirical analysis using a national survey.

    PubMed

    Tellez, Juan; Krishnakumar, Jaya; Bungener, Martine; Le Galès, Catherine

    2016-02-01

    How can one assess the quality of life of older people--particularly those with Alzheimer's disease--from the point of view of their opportunities to do valued things in life? This paper is an attempt to answer this question using the capability approach as a theoretical framework. We use data collected on 8841 individuals aged over 60 living in France (the 2008 Disability and Health Household Survey) and propose a latent variable modelling framework to analyse their capabilities in two fundamental dimensions: freedom to perform self-care activities and freedom to participate in the life of the household. Our results show that living as a couple, having children, being mobile and having access to local shops, health facilities and public services enhance both capabilities. Age, household size and male gender (for one of the two capabilities) act as impediments, while the number of impairments reduces both capabilities. We find that people with Alzheimer's disease have a lower level and a smaller range of capabilities (freedom) when compared to those without, even when the latter have several impairments. Hence they need special attention in policy-making.

  13. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test a performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  14. Empirically Based School Interventions Targeted at Academic and Mental Health Functioning

    ERIC Educational Resources Information Center

    Hoagwood, Kimberly E.; Olin, S. Serene; Kerker, Bonnie D.; Kratochwill, Thomas R.; Crowe, Maura; Saka, Noa

    2007-01-01

    This review examines empirically based studies of school-based mental health interventions. The review identified 64 out of more than 2,000 articles published between 1990 and 2006 that met methodologically rigorous criteria for inclusion. Of these 64 articles, only 24 examined both mental health "and" educational outcomes. The majority of…

  15. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold-change criterion is problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold-change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially
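
    A very reduced sketch of the resampling-plus-empirical-Bayes idea: a moderated t-statistic whose variance is shrunk toward a pooled value, with its null distribution generated by permuting group labels. The fifty-fifty shrinkage weight and all data are assumptions for illustration, not the authors' estimator.

        import numpy as np

        rng = np.random.default_rng(0)

        def moderated_t(x, y, s0):
            # Per-gene two-group statistic with variance shrunk toward s0.
            diff = x.mean(1) - y.mean(1)
            s2 = (x.var(1, ddof=1) + y.var(1, ddof=1)) / 2
            s2_shrunk = 0.5 * s2 + 0.5 * s0   # empirical-Bayes-style shrinkage (assumed weights)
            return diff / np.sqrt(s2_shrunk * (1 / x.shape[1] + 1 / y.shape[1]))

        genes, n = 500, 6
        x, y = rng.normal(size=(genes, n)), rng.normal(size=(genes, n))
        s0 = np.median((x.var(1, ddof=1) + y.var(1, ddof=1)) / 2)
        obs = moderated_t(x, y, s0)

        data = np.hstack([x, y])
        null = []
        for _ in range(200):                  # permute group labels for the null
            cols = rng.permutation(2 * n)
            null.append(moderated_t(data[:, cols[:n]], data[:, cols[n:]], s0))
        null = np.abs(np.concatenate(null))
        pvals = [(null >= abs(t)).mean() for t in obs]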

  16. Toward an Empirically-Based Parametric Explosion Spectral Model

    DTIC Science & Technology

    2011-09-01

    [Figure-caption and text fragments: Figure 6 analyzes the Vp/Vs ratio from the USGS database (Wood, 2007) at Pahute Mesa and Yucca Flat as a function of depth; values from Leonard and Johnson (1987) and Ferguson (1988) are shown for Pahute Mesa and Yucca Flat, respectively. Based on the distribution, constant Vp/Vs ratios of 1.671 and 1.871 are estimated at Pahute Mesa and Yucca Flat, respectively, in order to obtain the shear modulus and shear...]

  17. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1996-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique.

  18. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1995-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique. To test the efficacy of PBR, we conducted two runs of a controlled experiment in the environment of the NASA GSFC Software Engineering Laboratory (SEL), using developers from that environment. The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques: PBR and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. Where there is a statistically significant distinction, PBR performs better than the subjects' usual technique. However, PBR appears to be more effective on the generic documents than on the NASA documents.

  19. Towards an Empirically Based Parametric Explosion Spectral Model

    SciTech Connect

    Ford, S R; Walter, W R; Ruppert, S; Matzel, E; Hauk, T; Gok, R

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  20. Toward an Empirically-based Parametric Explosion Spectral Model

    NASA Astrophysics Data System (ADS)

    Ford, S. R.; Walter, W. R.; Ruppert, S.; Matzel, E.; Hauk, T. F.; Gok, R.

    2010-12-01

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never occurred. We develop a parametric model of the nuclear explosion seismic source spectrum derived from regional phases (Pn, Pg, and Lg) that is compatible with earthquake-based geometrical spreading and attenuation. Earthquake spectra are fit with a generalized version of the Brune spectrum, which is a three-parameter model that describes the long-period level, corner-frequency, and spectral slope at high-frequencies. These parameters are then correlated with near-source geology and containment conditions. There is a correlation of high gas-porosity (low strength) with increased spectral slope. However, there are trade-offs between the slope and corner-frequency, which we try to independently constrain using Mueller-Murphy relations and coda-ratio techniques. The relationship between the parametric equation and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source, and aid in the prediction of observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing.
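
    The three-parameter spectral form described above can be written down directly; the sketch below evaluates a generalized Brune-type spectrum with long-period level omega0, corner frequency fc, and high-frequency slope p (p = 2 recovers the classic Brune model). Parameter values are illustrative, not fitted values from the study.

        import numpy as np

        def source_spectrum(f, omega0, fc, p):
            # Generalized Brune form: flat at low f, falling as f^-p above fc.
            return omega0 / (1.0 + (f / fc) ** p)

        f = np.logspace(-1, 2, 200)   # frequency band, Hz
        spec = source_spectrum(f, omega0=1e14, fc=2.0, p=2.4)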

  1. Empirical wind retrieval model based on SAR spectrum measurements

    NASA Astrophysics Data System (ADS)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    ambiguity from polarimetric SAR. A criterion based on the sign of the complex correlation coefficient between the VV and VH signals is applied to select the wind direction. An additional quality control is applied to the wind speed value retrieved with the spectral method; here, we use the direction obtained with the spectral method together with the backscattered signal for a CMOD wind speed estimate. The algorithm described above may be refined by the use of more extensive SAR data and wind measurements. In the present preliminary work, the first results of processing SAR images combined with in situ data are presented. Our results are compared with those obtained using the previously developed CMOD and C-2PO (VH polarization) models and statistical wind retrieval approaches [1]. Acknowledgments. This work is supported by the Russian Foundation for Basic Research (grant 13-05-00852-a). [1] M. Portabella, A. Stoffelen, J. A. Johannessen, "Toward an optimal inversion method for synthetic aperture radar wind retrieval," Journal of Geophysical Research, V. 107, N C8, 2002.

  2. Task-Based Language Teaching: An Empirical Study of Task Transfer

    ERIC Educational Resources Information Center

    Benson, Susan D.

    2016-01-01

    Since the 1980s, task-based language teaching (TBLT) has enjoyed considerable interest from researchers of second language acquisition (SLA), resulting in a growing body of empirical evidence to support how and to what extent this approach can promote language learning. Although transferability and generalizability are critical assumptions for…

  3. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  4. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  5. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures for estimating test reliability for a certification exam that used both adaptive (via an MST model) and non-adaptive designs were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  6. Satellite-based empirical models linking river plume dynamics with hypoxic area andvolume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg/L) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  7. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  8. Deriving Empirically-Based Design Guidelines for Advanced Learning Technologies that Foster Disciplinary Comprehension

    ERIC Educational Resources Information Center

    Poitras, Eric; Trevors, Gregory

    2012-01-01

    Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…

  9. An Empirically-Based Statewide System for Identifying Quality Pre-Kindergarten Programs

    ERIC Educational Resources Information Center

    Williams, Jeffrey M.; Landry, Susan H.; Anthony, Jason L.; Swank, Paul R.; Crawford, April D.

    2012-01-01

    This study presents an empirically-based statewide system that links information about pre-kindergarten programs with children's school readiness scores to certify pre-kindergarten classrooms as promoting school readiness. Over 8,000 children from 1,255 pre-kindergarten classrooms were followed longitudinally for one year. Pre-kindergarten quality…

  10. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  11. A modified cadastral survey system based on GPS/PDA

    NASA Astrophysics Data System (ADS)

    Wang, Huiqing; Wang, Qing; Wu, Xiangyang

    2009-12-01

    Because the traditional cadastral survey methods suffer from complex working procedures, long field surveys and low efficiency, a modified system based on GPS (Global Positioning System)/PDA (Personal Digital Assistant) combined with TS (Total Station) is proposed. The system emphasizes the design of a free TS station setup for detail surveying where GPS is unavailable, so that control surveying and detail surveying can be processed simultaneously. The system also applies a PDA-based digital drafting method instead of cartographic sketching, realizing a fully digitalized cadastral survey. Its application in Beijing shows that the modified cadastral survey system based on GPS/PDA performs with high efficiency, and its accuracy can meet the requirement of 1:500 large-scale cadastral surveys.

  12. Self-adaptive image denoising based on bidimensional empirical mode decomposition (BEMD).

    PubMed

    Guo, Song; Luan, Fangjun; Song, Xiaoyu; Li, Changyou

    2014-01-01

    To better analyze images contaminated with Gaussian white noise, it is necessary to remove the noise before image processing. In this paper, we propose a self-adaptive image denoising method based on bidimensional empirical mode decomposition (BEMD). Firstly, a normal probability plot confirms that the 2D-IMFs of Gaussian white noise images decomposed by BEMD follow the normal distribution. Secondly, an energy estimation equation for the ith 2D-IMF (i = 2, 3, 4, …) is proposed, by analogy with that of the ith IMF (i = 2, 3, 4, …) obtained by empirical mode decomposition (EMD). Thirdly, the self-adaptive threshold of each 2D-IMF is calculated. Finally, the algorithm of the self-adaptive image denoising method based on BEMD is described. As a practical application, the method is used to denoise magnetic resonance images (MRI) of the brain, and the results show better denoising performance compared with other methods.

  13. An Empirical Study for Impacts of Measurement Errors on EHR based Association Studies

    PubMed Central

    Duan, Rui; Cao, Ming; Wu, Yonghui; Huang, Jing; Denny, Joshua C; Xu, Hua; Chen, Yong

    2016-01-01

    Over the last decade, Electronic Health Record (EHR) systems have been increasingly implemented at US hospitals. Despite their great potential, the complex and uneven nature of clinical documentation and data quality brings additional challenges for analyzing EHR data. A critical challenge is the information bias due to measurement errors in the outcome and covariates. We conducted empirical studies to quantify the impacts of this information bias on association studies. Specifically, we designed our simulation studies based on the characteristics of the Electronic Medical Records and Genomics (eMERGE) Network. Through simulation studies, we quantified the loss of power due to misclassifications in case ascertainment and measurement errors in covariate status extraction, with respect to different levels of misclassification rates, disease prevalence, and covariate frequencies. These empirical findings can help investigators better understand the potential power loss due to misclassification and measurement errors under a variety of conditions in EHR-based association studies. PMID:28269935
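
    In the spirit of the simulations described above, the toy sketch below estimates how statistical power decays when a binary EHR-derived outcome is misclassified at a given rate. The effect size, prevalence, and sample size are arbitrary illustrations, not eMERGE values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        def power(n, or_true, flip_rate, reps=200):
            hits = 0
            beta = np.log(or_true)
            for _ in range(reps):
                x = rng.binomial(1, 0.3, n)                # binary covariate
                p = 1 / (1 + np.exp(-(-1.0 + beta * x)))   # true outcome model
                y = rng.binomial(1, p)
                flip = rng.random(n) < flip_rate           # misclassify outcome
                y_obs = np.where(flip, 1 - y, y)
                table = [[np.sum((x == 1) & (y_obs == 1)), np.sum((x == 1) & (y_obs == 0))],
                         [np.sum((x == 0) & (y_obs == 1)), np.sum((x == 0) & (y_obs == 0))]]
                hits += stats.chi2_contingency(table)[1] < 0.05
            return hits / reps

        print(power(2000, 1.5, 0.0), power(2000, 1.5, 0.1))  # power without/with error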

  14. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection.

    PubMed

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R; Sokurenko, Evgeni V

    2017-01-01

    Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using a 20%, 10%, and 30% allowed resistance threshold. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients' urine within 25-35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care.

  15. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection

    PubMed Central

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R.

    2017-01-01

    Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using a 20%, 10%, and 30% allowed resistance threshold. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients’ urine within 25–35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care. PMID:28350870

  16. Center for Army Leadership’s Response to Empirically Based Leadership

    DTIC Science & Technology

    2013-02-01

    Douglas MacArthur Military Leadership Writing award for his article, “Empirically Based Leadership: Integrating the Science of Psychology in...experience. The paper states that integrating relevant empiricism into the process is required to construct a more complete model of leadership; however...review was conducted of psychology literature among other bodies of knowledge. The expert review used a Delphi technique to obtain independent

  17. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

    To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multisentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven-year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multi-sentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  18. Increasing Response Rates to Web-Based Surveys

    ERIC Educational Resources Information Center

    Monroe, Martha C.; Adams, Damian C.

    2012-01-01

    We review a popular method for collecting data--Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well,…

  19. Increasing Your Productivity with Web-Based Surveys

    ERIC Educational Resources Information Center

    Wissmann, Mary; Stone, Brittney; Schuster, Ellen

    2012-01-01

    Web-based survey tools such as Survey Monkey can be used in many ways to increase the efficiency and effectiveness of Extension professionals. This article describes how Survey Monkey has been used at the state and county levels to collect community and internal staff information for the purposes of program planning, administration, evaluation and…

  20. An empirical mass-loss law for Population II giants from the Spitzer-IRAC survey of Galactic globular clusters

    NASA Astrophysics Data System (ADS)

    Origlia, L.; Ferraro, F. R.; Fabbri, S.; Fusi Pecci, F.; Dalessandro, E.; Rich, R. M.; Valenti, E.

    2014-04-01

    Aims: The main aim of the present work is to derive an empirical mass-loss (ML) law for Population II stars in first and second ascent red giant branches. Methods: We used the Spitzer InfraRed Array Camera (IRAC) photometry obtained in the 3.6-8 μm range of a carefully chosen sample of 15 Galactic globular clusters spanning the entire metallicity range and sampling the vast zoology of horizontal branch (HB) morphologies. We complemented the IRAC photometry with near-infrared data to build suitable color-magnitude and color-color diagrams and identify mass-losing giant stars. Results: We find that while the majority of stars show colors typical of cool giants, some stars show an excess of mid-infrared light that is larger than expected from their photospheric emission and that is plausibly due to dust formation in mass flowing from them. For these stars, we estimate dust and total (gas + dust) ML rates and timescales. We finally calibrate an empirical ML law for Population II red and asymptotic giant branch stars with varying metallicity. We find that at a given red giant branch luminosity only a fraction of the stars are losing mass. From this, we conclude that ML is episodic and is active only a fraction of the time, which we define as the duty cycle. The fraction of mass-losing stars increases by increasing the stellar luminosity and metallicity. The ML rate, as estimated from reasonable assumptions for the gas-to-dust ratio and expansion velocity, depends on metallicity and slowly increases with decreasing metallicity. In contrast, the duty cycle increases with increasing metallicity, with the net result that total ML increases moderately with increasing metallicity, about 0.1 M⊙ every dex in [Fe/H]. For Population II asymptotic giant branch stars, we estimate a total ML of ≤0.1 M⊙, nearly constant with varying metallicity. This work is based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory

  1. Generalised Linear Models Incorporating Population Level Information: An Empirical Likelihood Based Approach

    PubMed Central

    Chaudhuri, Sanjay; Handcock, Mark S.; Rendall, Michael S.

    2011-01-01

    In many situations information from a sample of individuals can be supplemented by population level information on the relationship between a dependent variable and explanatory variables. Inclusion of the population level information can reduce bias and increase the efficiency of the parameter estimates. Population level information can be incorporated via constraints on functions of the model parameters. In general the constraints are nonlinear, making the task of maximum likelihood estimation harder. In this paper we develop an alternative approach exploiting the notion of an empirical likelihood. It is shown that within the framework of generalised linear models, the population level information corresponds to linear constraints, which are comparatively easy to handle. We provide a two-step algorithm that produces parameter estimates using only unconstrained estimation. We also provide computable expressions for the standard errors. We give an application to demographic hazard modelling by combining panel survey data with birth registration data to estimate annual birth probabilities by parity. PMID:22740776
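
    The core empirical-likelihood step with a single linear (population-mean) constraint can be sketched in a few lines: the weights w_i = 1/(n(1 + lambda*g_i)) follow from profiling the empirical likelihood, and the scalar Lagrange multiplier is found by root-finding. This one-dimensional toy, with invented data, stands in for the paper's two-step GLM algorithm.

        import numpy as np
        from scipy.optimize import brentq

        rng = np.random.default_rng(0)
        x = rng.normal(1.0, 1.0, 200)   # sample data (illustrative)
        mu_pop = 0.9                    # known population-level mean (illustrative)

        g = x - mu_pop                  # estimating function g_i
        # Solve sum(g_i / (1 + lambda*g_i)) = 0 for the Lagrange multiplier;
        # the bracket keeps all weights positive.
        lam = brentq(lambda l: np.sum(g / (1 + l * g)),
                     -1 / g.max() + 1e-9, -1 / g.min() - 1e-9)
        w = 1 / (len(x) * (1 + lam * g))   # EL weights; they sum to 1
        weighted_mean = np.sum(w * x)      # equals mu_pop up to solver tolerance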

  2. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education.

  3. Band structure calculation of GaSe-based nanostructures using empirical pseudopotential method

    NASA Astrophysics Data System (ADS)

    Osadchy, A. V.; Volotovskiy, S. G.; Obraztsova, E. D.; Savin, V. V.; Golovashkin, D. L.

    2016-08-01

    In this paper we present the results of band structure computer simulations of GaSe-based nanostructures using the empirical pseudopotential method. Calculations were performed using specially developed software that supports cluster computing. This method significantly reduces the demands on computing resources compared with traditional approaches based on ab initio techniques, while providing adequate, comparable results. The use of cluster computing makes it possible to obtain information for structures that require explicitly accounting for a significant number of atoms, such as quantum dots and quantum pillars.
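
    As a much-reduced illustration of the empirical pseudopotential method, the sketch below diagonalizes a one-dimensional plane-wave Hamiltonian H[G,G'] = |k+G|^2/2 + V(G-G') with a single empirical form factor. A real GaSe calculation would use 3D reciprocal-lattice vectors and a fitted set of form factors; the cosine potential here is an assumption for illustration only.

        import numpy as np

        a = 1.0                                   # lattice constant (dimensionless units)
        G = 2 * np.pi / a * np.arange(-5, 6)      # truncated reciprocal-lattice basis
        V0 = 0.5                                  # empirical form factor for |G| = 2*pi/a

        def bands(k, nbands=4):
            # Kinetic term on the diagonal, form factor coupling G-vectors
            # that differ by one reciprocal-lattice vector.
            H = np.diag(0.5 * (k + G) ** 2).astype(float)
            for i in range(len(G)):
                for j in range(len(G)):
                    if abs(abs(G[i] - G[j]) - 2 * np.pi / a) < 1e-9:
                        H[i, j] = V0
            return np.linalg.eigvalsh(H)[:nbands]

        kpath = np.linspace(-np.pi / a, np.pi / a, 51)
        structure = np.array([bands(k) for k in kpath])  # bands with gaps at the zone edge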

  4. An Empirical Study of Plan-Based Representations of Pascal and Fortran Code.

    DTIC Science & Technology

    1987-06-01

    [OCR fragments of the report cover: Computing Laboratory Report No. CCL-0687-0, "An Empirical Study of Plan-Based Representations of Pascal and Fortran Code," Scott P. Robertson, Chiung-Chen Yu...82173; Office of Naval Research Contract No. N00014-86-K-0876, Work Unit No. NR 4424203-01; approved for public release, distribution unlimited.] Researchers have argued recently that programmers utilize a plan-based representation when composing or comprehending program code. In a series of studies we

  5. Development of Empirically Based Time-to-death Curves for Combat Casualty Deaths in Iraq and Afghanistan

    DTIC Science & Technology

    2015-01-01

    [Title-page fragments: Naval Health Research Center, "Development of Empirically Based Time-to-Death Curves for Combat Casualty Deaths in Iraq and Afghanistan," Edwin..., doi:10.1177/1548512914531353, dms.sagepub.com.] ...casualties with life-threatening injuries. The curves developed from that research were based on a small dataset (n = 160, with 26 deaths and 134

  6. Fault Diagnosis of Rotating Machinery Based on an Adaptive Ensemble Empirical Mode Decomposition

    PubMed Central

    Lei, Yaguo; Li, Naipeng; Lin, Jing; Wang, Sizhe

    2013-01-01

    The vibration based signal processing technique is one of the principal tools for diagnosing faults of rotating machinery. Empirical mode decomposition (EMD), as a time-frequency analysis technique, has been widely used to process vibration signals of rotating machinery. But it has the shortcoming of mode mixing in decomposing signals. To overcome this shortcoming, ensemble empirical mode decomposition (EEMD) was proposed accordingly. EEMD is able to reduce the mode mixing to some extent. The performance of EEMD, however, depends on the parameters adopted in the EEMD algorithms. In most of the studies on EEMD, the parameters were selected artificially and subjectively. To solve the problem, a new adaptive ensemble empirical mode decomposition method is proposed in this paper. In the method, the sifting number is adaptively selected, and the amplitude of the added noise changes with the signal frequency components during the decomposition process. The simulation, the experimental and the application results demonstrate that the adaptive EEMD provides the improved results compared with the original EEMD in diagnosing rotating machinery. PMID:24351666

  7. Fault diagnosis of rotating machinery based on an adaptive ensemble empirical mode decomposition.

    PubMed

    Lei, Yaguo; Li, Naipeng; Lin, Jing; Wang, Sizhe

    2013-12-09

    The vibration based signal processing technique is one of the principal tools for diagnosing faults of rotating machinery. Empirical mode decomposition (EMD), as a time-frequency analysis technique, has been widely used to process vibration signals of rotating machinery. But it has the shortcoming of mode mixing in decomposing signals. To overcome this shortcoming, ensemble empirical mode decomposition (EEMD) was proposed accordingly. EEMD is able to reduce the mode mixing to some extent. The performance of EEMD, however, depends on the parameters adopted in the EEMD algorithms. In most of the studies on EEMD, the parameters were selected artificially and subjectively. To solve the problem, a new adaptive ensemble empirical mode decomposition method is proposed in this paper. In the method, the sifting number is adaptively selected, and the amplitude of the added noise changes with the signal frequency components during the decomposition process. The simulation, the experimental and the application results demonstrate that the adaptive EEMD provides the improved results compared with the original EEMD in diagnosing rotating machinery.
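
    For reference, plain ensemble EMD (the baseline that the adaptive variant above improves on) can be sketched as averaging the IMFs of many noise-perturbed copies of the signal. The sketch assumes the third-party PyEMD package (pip name: EMD-signal) for the underlying EMD; the fixed ensemble size and noise amplitude are exactly the hand-picked parameters that the paper's adaptive method replaces.

        import numpy as np
        from PyEMD import EMD   # assumed dependency: the PyEMD package

        def eemd(signal, ensemble=50, noise_std=0.2, seed=0):
            rng = np.random.default_rng(seed)
            scale = noise_std * signal.std()
            trials = []
            for _ in range(ensemble):
                noisy = signal + rng.normal(0.0, scale, signal.size)
                trials.append(EMD().emd(noisy))      # IMFs of one noisy realization
            n = min(t.shape[0] for t in trials)      # align IMF counts across trials
            return np.mean([t[:n] for t in trials], axis=0)

        t = np.linspace(0, 1, 512)
        s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
        imfs = eemd(s)   # ensemble-averaged IMFs, fast component first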

  8. Survey says? A primer on web-based survey design and distribution.

    PubMed

    Oppenheimer, Adam J; Pannucci, Christopher J; Kasten, Steven J; Haase, Steven C

    2011-07-01

    The Internet has changed the way in which we gather and interpret information. Although books were once the exclusive bearers of data, knowledge is now only a keystroke away. The Internet has also facilitated the synthesis of new knowledge. Specifically, it has become a tool through which medical research is conducted. A review of the literature reveals that in the past year, over 100 medical publications have been based on Web-based survey data alone. Because of emerging Internet technologies, Web-based surveys can now be launched with little computer knowledge. They may also be self-administered, eliminating personnel requirements. Ultimately, an investigator may build, implement, and analyze survey results with speed and efficiency, obviating the need for mass mailings and data processing. All of these qualities have rendered telephone and mail-based surveys virtually obsolete. Despite these capabilities, Web-based survey techniques are not without their limitations, namely, recall and response biases. When used properly, however, Web-based surveys can greatly simplify the research process. This article discusses the implications of Web-based surveys and provides guidelines for their effective design and distribution.

  9. Web-Based Surveys Facilitate Undergraduate Research and Knowledge

    ERIC Educational Resources Information Center

    Grimes, Paul, Ed.; Steele, Scott R.

    2008-01-01

    The author presents Web-based surveying as a valuable tool for achieving quality undergraduate research in upper-level economics courses. Web-based surveys can be employed in efforts to integrate undergraduate research into the curriculum without overburdening students or faculty. The author discusses the value of undergraduate research, notes…

  10. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states reporting investments explicitly dedicated…

  11. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

    Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Secondly, we developed a physics-based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300 K to 1000 K while propagating parametric uncertainty, and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics-based models and found that the uncertainty in the kinetics predicted by the physics-based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum-level model. We found that the species change occurring within the bed during decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
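
    A minimal sketch of the empirical-model step described above: fit Arrhenius-type kinetics, ln k = ln A - Ea/(R*T), to scattered rate data and read parameter uncertainties off the fit covariance. The synthetic data below are placeholders, not the compiled UH3 measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # gas constant, J mol^-1 K^-1

        def arrhenius(T, lnA, Ea):
            return lnA - Ea / (R * T)   # ln k as a function of temperature

        # Placeholder "compiled literature data" with synthetic scatter.
        T = np.array([500.0, 550.0, 600.0, 650.0, 700.0])
        ln_k = (np.array([-9.8, -8.1, -6.9, -5.8, -5.0])
                + np.random.default_rng(1).normal(0, 0.2, 5))

        popt, pcov = curve_fit(arrhenius, T, ln_k, p0=(5.0, 8e4))
        lnA, Ea = popt
        lnA_sd, Ea_sd = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties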

  12. Comparing results from two continental geochemical surveys to world soil composition and deriving Predicted Empirical Global Soil (PEGS2) reference values

    NASA Astrophysics Data System (ADS)

    de Caritat, Patrice; Reimann, Clemens; Bastrakov, E.; Bowbridge, D.; Boyle, P.; Briggs, S.; Brown, D.; Brown, M.; Brownlie, K.; Burrows, P.; Burton, G.; Byass, J.; de Caritat, P.; Chanthapanya, N.; Cooper, M.; Cranfield, L.; Curtis, S.; Denaro, T.; Dhnaram, C.; Dhu, T.; Diprose, G.; Fabris, A.; Fairclough, M.; Fanning, S.; Fidler, R.; Fitzell, M.; Flitcroft, P.; Fricke, C.; Fulton, D.; Furlonger, J.; Gordon, G.; Green, A.; Green, G.; Greenfield, J.; Harley, J.; Heawood, S.; Hegvold, T.; Henderson, K.; House, E.; Husain, Z.; Krsteska, B.; Lam, J.; Langford, R.; Lavigne, T.; Linehan, B.; Livingstone, M.; Lukss, A.; Maier, R.; Makuei, A.; McCabe, L.; McDonald, P.; McIlroy, D.; McIntyre, D.; Morris, P.; O'Connell, G.; Pappas, B.; Parsons, J.; Petrick, C.; Poignand, B.; Roberts, R.; Ryle, J.; Seymon, A.; Sherry, K.; Skinner, J.; Smith, M.; Strickland, C.; Sutton, S.; Swindell, R.; Tait, H.; Tang, J.; Thomson, A.; Thun, C.; Uppill, B.; Wall, K.; Watkins, J.; Watson, T.; Webber, L.; Whiting, A.; Wilford, J.; Wilson, T.; Wygralak, A.; Albanese, S.; Andersson, M.; Arnoldussen, A.; Baritz, R.; Batista, M. J.; Bel-lan, A.; Birke, M.; Cicchella, C.; Demetriades, A.; Dinelli, E.; De Vivo, B.; De Vos, W.; Duris, M.; Dusza-Dobek, A.; Eggen, O. A.; Eklund, M.; Ernstsen, V.; Filzmoser, P.; Finne, T. E.; Flight, D.; Forrester, S.; Fuchs, M.; Fugedi, U.; Gilucis, A.; Gosar, M.; Gregorauskiene, V.; Gulan, A.; Halamić, J.; Haslinger, E.; Hayoz, P.; Hobiger, G.; Hoffmann, R.; Hoogewerff, J.; Hrvatovic, H.; Husnjak, S.; Janik, L.; Johnson, C. C.; Jordan, G.; Kirby, J.; Kivisilla, J.; Klos, V.; Krone, F.; Kwecko, P.; Kuti, L.; Ladenberger, A.; Lima, A.; Locutura, J.; Lucivjansky, P.; Mackovych, D.; Malyuk, B. I.; Maquil, R.; McLaughlin, M.; Meuli, R. G.; Miosic, N.; Mol, G.; Négrel, P.; O'Connor, P.; Oorts, K.; Ottesen, R. T.; Pasieczna, A.; Petersell, V.; Pfleiderer, S.; Poňavič, M.; Prazeres, C.; Rauch, U.; Reimann, C.; Salpeteur, I.; Schedl, A.; Scheib, A.; Schoeters, I.; Sefcik, P.; Sellersjö, E.; Skopljak, F.; Slaninka, I.; Šorša, A.; Srvkota, R.; Stafilov, T.; Tarvainen, T.; Trendavilov, V.; Valera, P.; Verougstraete, V.; Vidojević, D.; Zissimos, A. M.; Zomeni, Z.

    2012-02-01

    Analytical data for 10 major oxides (Al2O3, CaO, Fe2O3, K2O, MgO, MnO, Na2O, P2O5, SiO2 and TiO2), 16 total trace elements (As, Ba, Ce, Co, Cr, Ga, Nb, Ni, Pb, Rb, Sr, Th, V, Y, Zn and Zr), 14 aqua regia extracted elements (Ag, As, Bi, Cd, Ce, Co, Cs, Cu, Fe, La, Li, Mn, Mo and Pb), Loss On Ignition (LOI) and pH from 3526 soil samples from two continents (Australia and Europe) are presented and compared to (1) the composition of the upper continental crust, (2) published world soil average values, and (3) data from other continental-scale soil surveys. It can be demonstrated that average upper continental crust values do not provide reliable estimates for natural concentrations of elements in soils. For many elements there exist substantial differences between published world soil averages and the median concentrations observed on two continents. Direct comparison with other continental datasets is hampered by the fact that the mean, rather than the statistically more robust median, is often reported. Using a database of the worldwide distribution of lithological units, it can be demonstrated that lithology is a poor predictor of soil chemistry. Climate-related processes such as glaciation and weathering are strong modifiers of the geochemical signature inherited from bedrock during pedogenesis. To overcome existing shortcomings of predicted global or world soil geochemical reference values, we propose Predicted Empirical Global Soil (PEGS2) reference values based on analytical results of a representative number of soil samples from two continents.
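
    The abstract's point about medians motivates a minimal sketch of how median-based reference values can be derived; the column names and values below are hypothetical stand-ins, not the PEGS2 data.

        # Sketch: deriving median-based reference values, since the abstract
        # notes the median is more robust than the mean for skewed geochemical
        # distributions. Column names and values here are hypothetical.
        import pandas as pd

        df = pd.DataFrame({
            "continent": ["Australia"] * 3 + ["Europe"] * 3,
            "As_ppm": [1.2, 2.5, 40.0, 3.1, 2.8, 2.2],  # outlier-prone element
            "Zn_ppm": [30.0, 45.0, 38.0, 52.0, 61.0, 48.0],
        })

        # Per-continent medians, then a combined reference value across both
        per_continent = df.groupby("continent").median(numeric_only=True)
        reference = per_continent.median()
        print(per_continent)
        print("PEGS2-style reference values:\n", reference)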

  13. An ISAR imaging algorithm for the space satellite based on empirical mode decomposition theory

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Dong, Chun-zhu

    2014-11-01

    Currently, high resolution imaging of space satellites is a popular topic in the field of radar technology. In contrast with regular targets, a satellite target moves along its trajectory while its solar panel substrate simultaneously changes direction toward the sun to obtain energy. To address this imaging problem, a signal separation and imaging approach based on empirical mode decomposition (EMD) theory is proposed. The approach can separate the signals of the two parts of the satellite target, the main body and the solar panel substrate, and image the target. Simulation experiments demonstrate the validity of the proposed method.
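
    A minimal sketch of the underlying separation step, reduced to a 1-D toy problem: EMD splits a composite signal into intrinsic mode functions, from which a fast component (standing in for the solar panel return) and a slow component (the main body) can be regrouped. It assumes the PyEMD package (pip install EMD-signal); the signals are synthetic, not radar echoes.

        # Sketch: separating a two-component signal with EMD, as a 1-D
        # stand-in for splitting main-body and solar-panel returns.
        import numpy as np
        from PyEMD import EMD

        t = np.linspace(0, 1, 1000)
        body = np.sin(2 * np.pi * 5 * t)          # slow component (main body)
        panel = 0.5 * np.sin(2 * np.pi * 40 * t)  # fast component (panel)
        signal = body + panel

        imfs = EMD().emd(signal)  # array of intrinsic mode functions
        # The first IMF carries the highest-frequency content; later IMFs
        # (and the residue) are progressively slower.
        panel_est = imfs[0]
        body_est = imfs[1:].sum(axis=0)
        print(f"{len(imfs)} IMFs; correlation with true body component:",
              np.corrcoef(body_est, body)[0, 1].round(3))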

  14. Evaluating Process Quality Based on Change Request Data - An Empirical Study of the Eclipse Project

    NASA Astrophysics Data System (ADS)

    Schackmann, Holger; Schaefer, Henning; Lichter, Horst

    The data routinely collected in change request management systems contain valuable information for monitoring process quality. However, these data are currently utilized in a very limited way. This paper presents an empirical study of the process quality in the product portfolio of the Eclipse project. It is based on a systematic approach for the evaluation of process quality characteristics using change request data. Results of the study offer insights into the development process of Eclipse. Moreover, the study allows assessing the applicability and limitations of the proposed approach for the evaluation of process quality.

  15. Systematic Review of Empirically Evaluated School-Based Gambling Education Programs.

    PubMed

    Keen, Brittany; Blaszczynski, Alex; Anjoul, Fadi

    2017-03-01

    Adolescent problem gambling prevalence rates are reportedly five times higher than in the adult population. Several school-based gambling education programs have been developed in an attempt to reduce problem gambling among adolescents; however, few have been empirically evaluated. The aim of this review was to report the outcome of studies empirically evaluating gambling education programs across international jurisdictions. A systematic review was conducted following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement, searching five academic databases: PubMed, Scopus, Medline, PsycINFO, and ERIC. A total of 20 papers and 19 studies were included after screening and exclusion criteria were applied. All studies reported intervention effects on cognitive outcomes such as knowledge, perceptions, and beliefs. Only nine of the studies attempted to measure intervention effects on behavioural outcomes, and only five of those reported significant changes in gambling behaviour. Of these five, methodological inadequacies were common, including brief follow-up periods, lack of control comparison in post hoc analyses, and inconsistencies and misclassifications in the measurement of gambling behaviour, including problem gambling. Based on this review, recommendations are offered for the future development and evaluation of school-based gambling education programs relating to both methodological and content design and delivery considerations.

  16. Inequality of Higher Education in China: An Empirical Test Based on the Perspective of Relative Deprivation

    ERIC Educational Resources Information Center

    Hou, Liming

    2014-01-01

    The primary goal of this paper is to examine what makes Chinese college students dissatisfied with entrance opportunities for higher education. Based on the author's survey data, we test two factors that could be potential causes of this dissatisfaction: 1) distributive inequality, which emphasizes the individual's dissatisfaction caused by…

  17. Methodologies for Crawler Based Web Surveys.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  18. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to achieve better compression performance for seismic data than other well-known sparsity-promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show the superior performance of the proposed approach over traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  19. Semi-empirical versus process-based sea-level projections for the twenty-first century

    NASA Astrophysics Data System (ADS)

    Orlić, Mirko; Pasarić, Zoran

    2013-08-01

    Two dynamical methods are presently used to project sea-level changes during the next century. The process-based method relies on coupled atmosphere-ocean models to estimate the effects of thermal expansion and on sea-level models combined with certain empirical relationships to determine the influence of land-ice mass changes. The semi-empirical method uses various physically motivated relationships between temperature and sea level, with parameters determined from the data, to project total sea level. However, semi-empirical projections far exceed process-based projections. Here, we test the robustness of semi-empirical projections to the underlying assumptions about the inertial and equilibrium responses of sea level to temperature forcing and the impacts of groundwater depletion and dam retention during the twentieth century. Our results show that these projections are sensitive to the dynamics considered and the terrestrial-water corrections applied. For B1, which is a moderate climate-change scenario, the lowest semi-empirical projection of sea-level rise over the twenty-first century equals 62 ± 14 cm. The average value is substantially smaller than previously published semi-empirical projections and is therefore closer to the corresponding process-based values. The standard deviation is larger than the uncertainties of process-based estimates.
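
    For readers unfamiliar with the semi-empirical approach, the sketch below integrates a Rahmstorf-type relationship, dH/dt = a(T - T0), over a toy warming scenario. The parameter values and the temperature path are illustrative assumptions, not those used in the study.

        # Sketch: a Rahmstorf-type semi-empirical projection, dH/dt = a*(T - T0),
        # integrated over a simple warming scenario. Parameters and the
        # temperature path are illustrative placeholders, not the paper's.
        import numpy as np

        a = 0.34   # cm per year per K (illustrative)
        T0 = -0.5  # equilibrium temperature anomaly in K (illustrative)

        years = np.arange(2000, 2101)
        T = 0.02 * (years - 2000)  # toy scenario: 2 K warming per century

        # Forward-Euler integration of the sea-level rate equation
        H = np.zeros(years.size)
        for i in range(1, years.size):
            H[i] = H[i - 1] + a * (T[i - 1] - T0)  # dt = 1 year

        print(f"Projected rise 2000-2100: {H[-1]:.0f} cm")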

  20. Survey of Commercially Available Computer-Readable Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Schneider, John H., Ed.; And Others

    This document contains the results of a survey of 94 U.S. organizations and 36 organizations in other countries that were thought to prepare machine-readable data bases. Of those surveyed, 55 organizations (40 in the U.S., 15 in other countries) provided completed camera-ready forms describing 81 commercially available, machine-readable data bases…

  1. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    PubMed

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to ensuring effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the low-pass filtering that has conventionally been applied confirmed the effectiveness of the technique in tissue artifact removal.
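
    The sketch below illustrates the shape of such an EMD-based artifact removal scheme. The paper's selection rule combines mutual information and power criteria; here a simpler correlation-with-signal threshold stands in for it, loudly labeled as a simplification, and the respiration trace is synthetic. Assumes the PyEMD package (pip install EMD-signal).

        # Sketch: EMD-based artifact removal in the spirit of the paper, with a
        # simplified IMF selection rule (correlation with the measured signal
        # instead of the paper's mutual information and power criteria).
        import numpy as np
        from PyEMD import EMD

        fs = 25.0
        t = np.arange(0, 60, 1 / fs)
        respiration = np.sin(2 * np.pi * 0.3 * t)     # ~18 breaths per minute
        artifact = 0.4 * np.sin(2 * np.pi * 3.0 * t)  # tissue-motion band
        noise = 0.1 * np.random.default_rng(1).standard_normal(t.size)
        noisy = respiration + artifact + noise

        imfs = EMD().emd(noisy)

        # Keep IMFs that correlate strongly with the measured signal: the
        # slow, high-power respiratory modes pass, the fast artifact modes
        # do not, and the kept modes are summed to reconstruct respiration.
        keep = [imf for imf in imfs
                if abs(np.corrcoef(imf, noisy)[0, 1]) > 0.5]
        reconstructed = np.sum(keep, axis=0)
        print(f"kept {len(keep)} of {len(imfs)} IMFs")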

  2. Polarizable Empirical Force Field for Hexopyranose Monosaccharides Based on the Classical Drude Oscillator

    PubMed Central

    2015-01-01

    A polarizable empirical force field based on the classical Drude oscillator is presented for the hexopyranose form of selected monosaccharides. Parameter optimization targeted quantum mechanical (QM) dipole moments, solute–water interaction energies, vibrational frequencies, and conformational energies. Validation of the model was based on experimental data on crystals, densities of aqueous sugar solutions, diffusion constants of glucose, and rotational preferences of the exocyclic hydroxymethyl group of d-glucose and d-galactose in aqueous solution, as well as additional QM data. Notably, the final model involves a single electrostatic model for all sixteen diastereomers of the monosaccharides, indicating the transferability of the polarizable model. The presented parameters are anticipated to lay the foundation for a comprehensive polarizable force field for saccharides that will be compatible with the polarizable Drude parameters for lipids and proteins, allowing for simulations of glycolipids and glycoproteins. PMID:24564643

  3. A survey of machine readable data bases

    NASA Technical Reports Server (NTRS)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  4. Polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator.

    PubMed

    He, Xibing; Lopes, Pedro E M; Mackerell, Alexander D

    2013-10-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated in the case of this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, resulting in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol, as evidenced by the calculated heat of vaporization being in excellent agreement with experiment. Computed condensed-phase data, including crystal lattice parameters and volumes and densities of aqueous solutions, are in better agreement with experimental data as compared to the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules.

  5. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically significant performance differences. PMID:26007714

  6. Network-based empirical Bayes methods for linear models with applications to genomic data.

    PubMed

    Li, Caiyan; Wei, Zhi; Li, Hongzhe

    2010-03-01

    Empirical Bayes methods are widely used in the analysis of microarray gene expression data in order to identify the differentially expressed genes or genes that are associated with other general phenotypes. Available methods often assume that genes are independent. However, genes are expected to function interactively and to form molecular modules to affect the phenotypes. In order to account for regulatory dependency among genes, we propose in this paper a network-based empirical Bayes method for analyzing genomic data in the framework of linear models, where the dependency of genes is modeled by a discrete Markov random field defined on a predefined biological network. This method provides a statistical framework for integrating the known biological network information into the analysis of genomic data. We present an iterated conditional mode algorithm for parameter estimation and for estimating the posterior probabilities using Gibbs sampling. We demonstrate the application of the proposed methods using simulations and analysis of a human brain aging microarray gene expression data set.

  7. An empirical model for the plasma environment along Titan's orbit based on Cassini plasma observations

    NASA Astrophysics Data System (ADS)

    Smith, H. Todd; Rymer, Abigail M.

    2014-07-01

    Prior to Cassini's arrival at Saturn, the nitrogen-rich dense atmosphere of Titan was considered a significant, if not dominant, source of heavy ions in Saturn's magnetosphere. While nitrogen was detected in Saturn's magnetosphere based on Cassini observations, Enceladus instead of Titan appears to be the primary source. However, it is difficult to imagine that Titan's dense atmosphere is not a source of nitrogen. In this paper, we apply Rymer et al.'s (2009) Titan plasma environment categorization model to the plasma environment along Titan's orbit when Titan is not present. We next categorize the Titan encounters that have occurred since Rymer et al. (2009). We also produce an empirical model for applying the probabilistic occurrence of each plasma environment as a function of Saturn local time (SLT). Finally, we summarize the electron energy spectra in order to allow one to calculate more accurate electron-impact interaction rates for each plasma environment category. The combination of this full categorization versus SLT and the empirical model for the electron spectrum is critical for understanding the magnetospheric plasma and will allow for more accurate modeling of the Titan plasma torus.

  8. Determinants of Obesity and Associated Population Attributability, South Africa: Empirical Evidence from a National Panel Survey, 2008-2012

    PubMed Central

    Sartorius, Benn; Veerman, Lennert J.; Manyema, Mercy; Chola, Lumbwe; Hofman, Karen

    2015-01-01

    Background: Obesity is a major risk factor for emerging non-communicable diseases (NCDs) in middle income countries including South Africa (SA). Understanding the multiple and complex determinants of obesity and their true population attributable impact is critical for informing and developing effective prevention efforts based on scientific evidence. This study identified contextualised high impact factors associated with obesity in South Africa. Methods: Analysis of three national cross sectional (repeated panel) surveys, using multilevel logistic regression and population attributable fraction estimation, allowed for identification of contextualised high impact factors associated with obesity (BMI > 30 kg/m2) among adults (15 years and older). Results: Obesity prevalence increased significantly from 23.5% in 2008 to 27.2% in 2012, with a significantly (p < 0.001) higher prevalence among females (37.9% in 2012) compared to males (13.3% in 2012). Living in formal urban areas, white ethnicity, being married, not exercising, and/or being in a higher socio-economic category were significantly associated with male obesity. Females living in formal or informal urban areas, in higher crime areas, of African/White ethnicity, married, not exercising, in a higher socio-economic category, and/or living in households with proportionately higher spending on food (and unhealthy food options) were significantly more likely to be obese. The identified determinants appeared to account for 75% and 43% of male and female obesity, respectively. White males had the highest relative gain in obesity from 2008 to 2012. Conclusions: The rising prevalence of obesity in South Africa is significant, and over the past 5 years the rising prevalence of Type-2 diabetes has mirrored this pattern, especially among females. Targeting young adolescent girls should be a priority. Addressing the determinants of obesity will involve a multifaceted strategy and requires action at both individual and population levels. With rising costs in the
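
    The population attributable fractions quoted above (75% and 43%) are the kind of quantity given by Levin's formula; the sketch below computes it for hypothetical determinants with assumed prevalences and relative risks, not the study's estimates.

        # Sketch: Levin's population attributable fraction,
        # PAF = p(RR - 1) / (1 + p(RR - 1)), the quantity behind statements
        # like "determinants account for 75% of male obesity".
        def paf(prevalence, relative_risk):
            """Fraction of cases removed if the exposure were eliminated."""
            excess = prevalence * (relative_risk - 1.0)
            return excess / (1.0 + excess)

        # Toy determinants: (name, exposure prevalence, relative risk)
        determinants = [("no exercise", 0.55, 1.8), ("urban formal", 0.40, 1.5)]
        for name, p, rr in determinants:
            print(f"{name}: PAF = {paf(p, rr):.1%}")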

  9. The children of divorce parenting intervention: outcome evaluation of an empirically based program.

    PubMed

    Wolchik, S A; West, S G; Westover, S; Sandler, I N; Martin, A; Lustig, J; Tein, J Y; Fisher, J

    1993-06-01

    Examined efficacy of an empirically based intervention using 70 divorced mothers who participated in a 12-session program or a wait-list condition. The program targeted five putative mediators: quality of the mother-child relationship, discipline, negative divorce events, contact with fathers, and support from nonparental adults. Posttest comparisons showed higher quality mother-child relationships and discipline, fewer negative divorce events, and better mental health outcomes for program participants than controls. More positive program effects occurred for mothers' than children's reports of variables and for families with poorest initial levels of functioning. Analyses indicated that improvement in the mother-child relationship partially mediated the effects of the program on mental health.

  10. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis that describes the expected remaining lifetime of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors, such as age and gender, that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Confidence Interval Estimation for Sensitivity to the Early Diseased Stage Based on Empirical Likelihood.

    PubMed

    Dong, Tuochuan; Tian, Lili

    2015-01-01

    Many disease processes can be divided into three stages: the non-diseased stage, the early diseased stage, and the fully diseased stage. To assess the accuracy of diagnostic tests for such diseases, various summary indexes have been proposed, such as the volume under the surface (VUS), the partial volume under the surface (PVUS), and the sensitivity to the early diseased stage given the specificity and the sensitivity to the fully diseased stage (P2). This paper focuses on confidence interval estimation for P2 based on empirical likelihood. Simulation studies are carried out to assess the performance of the new methods compared to the existing parametric and nonparametric ones. A real dataset from the Alzheimer's Disease Neuroimaging Initiative (ADNI) is analyzed.

  12. On the pathophysiology of migraine--links for "empirically based treatment" with neurofeedback.

    PubMed

    Kropp, Peter; Siniatchkin, Michael; Gerber, Wolf-Dieter

    2002-09-01

    Psychophysiological data support the concept that migraine is the result of cortical hypersensitivity, hyperactivity, and a lack of habituation. There is evidence that this is a brain-stem-related information processing dysfunction. This cortical activity reflects a periodicity between two migraine attacks, and it may be due to endogenous or exogenous factors. In the few days preceding the next attack, slow cortical potentials are highest and the habituation delay experimentally recorded during contingent negative variation is at a maximum. These striking features of slow cortical potentials are predictors of the next attack. The pronounced negativity can be fed back to the patient. The data support the hypothesis that a change in amplitudes of slow cortical potentials is caused by altered habituation during the recording session. This kind of neurofeedback can be characterized as "empirically based" because it improves habituation and proves to be clinically effective.

  13. A Human ECG Identification System Based on Ensemble Empirical Mode Decomposition

    PubMed Central

    Zhao, Zhidong; Yang, Lei; Chen, Diandian; Luo, Yi

    2013-01-01

    In this paper, a human electrocardiogram (ECG) identification system based on ensemble empirical mode decomposition (EEMD) is designed. A robust preprocessing method comprising noise elimination, heartbeat normalization and quality measurement is proposed to eliminate the effects of noise and heart rate variability. The system is independent of the heart rate. The ECG signal is decomposed into a number of intrinsic mode functions (IMFs) and Welch spectral analysis is used to extract the significant heartbeat signal features. Principal component analysis is used to reduce the dimensionality of the feature space, and the K-nearest neighbors (K-NN) method is applied as the classifier tool. The proposed human ECG identification system was tested on standard MIT-BIH ECG databases: the ST change database, the long-term ST database, and the PTB database. The system achieved an identification accuracy of 95% for 90 subjects, demonstrating the effectiveness of the proposed method in terms of accuracy and robustness. PMID:23698274
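
    A minimal sketch of the pipeline shape described above: EEMD, Welch spectral features, PCA, then k-NN classification. Synthetic quasi-periodic signals stand in for the MIT-BIH records, and the parameters (trials, IMF count, neighbors) are arbitrary illustrative choices. Assumes PyEMD (pip install EMD-signal), SciPy, and scikit-learn.

        # Sketch of the pipeline: EEMD -> Welch spectra -> PCA -> k-NN.
        import numpy as np
        from PyEMD import EEMD
        from scipy.signal import welch
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        fs = 250.0
        t = np.arange(0, 4, 1 / fs)

        def fake_ecg(beat_hz, shape):  # toy stand-in for a subject's ECG
            return (np.sin(2 * np.pi * beat_hz * t) ** shape
                    + 0.05 * rng.standard_normal(t.size))

        eemd = EEMD(trials=20)
        X, y = [], []
        for label, (hz, shape) in enumerate([(1.0, 3), (1.3, 5)]):  # 2 "subjects"
            for _ in range(6):
                imfs = eemd.eemd(fake_ecg(hz, shape))
                # Welch power spectra of the dominant IMFs as the feature vector
                feats = np.concatenate(
                    [welch(imf, fs, nperseg=256)[1] for imf in imfs[:3]])
                X.append(feats)
                y.append(label)

        X = PCA(n_components=5).fit_transform(np.array(X))
        clf = KNeighborsClassifier(n_neighbors=3).fit(X[:-2], y[:-2])
        print("held-out predictions:", clf.predict(X[-2:]), "true:", y[-2:])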

  14. Distributed optical fiber-based theoretical and empirical methods monitoring hydraulic engineering subjected to seepage velocity

    NASA Astrophysics Data System (ADS)

    Su, Huaizhi; Tian, Shiguang; Cui, Shusheng; Yang, Meng; Wen, Zhiping; Xie, Wei

    2016-09-01

    In order to systematically investigate the general principles and methods of monitoring seepage velocity in hydraulic engineering, theoretical analyses and physical experiments were carried out based on distributed fiber-optic temperature sensing (DTS) technology. For the coupling analyses between the seepage field and the temperature field in embankment dam or dike engineering, a simplified model was constructed to describe the coupling relationship of the two fields. Different arrangement schemes of optical fiber and temperature measurement approaches were applied to the model. An inversion analysis approach was further used. A theoretical method of monitoring seepage velocity in hydraulic engineering was finally proposed. A new concept, the effective thermal conductivity, was proposed by analogy with the thermal conductivity coefficient in the transient hot-wire method. The influence of heat conduction and seepage is well reflected by this new concept, which proved to be a promising basis for developing an empirical method of monitoring seepage velocity in hydraulic engineering.

  15. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes it difficult to characterize and evaluate the approach. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions of EMD appear to perform poorly and are very time consuming. So in this paper, an extension of the PDE-based approach to the 2-D space is extensively described. This approach has been applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results have been provided in the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.

  16. Gender Wage Gaps by College Major in Taiwan: Empirical Evidence from the 1997-2003 Manpower Utilization Survey

    ERIC Educational Resources Information Center

    Lin, Eric S.

    2010-01-01

    In this article, we examine the effect of incorporating the fields of study on the explained and unexplained components of the standard Oaxaca decomposition for the gender wage gaps in Taiwan using 1997-2003 Manpower Utilization Survey data. Using several existing and recently developed measures, we inspect the gender wage gap by college major to…
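
    For readers unfamiliar with the method, the sketch below performs a standard two-fold Oaxaca-Blinder decomposition on synthetic data, splitting a mean log-wage gap into an explained (endowments) part and an unexplained (coefficients) part; in the article the covariates would additionally include college major indicators.

        # Sketch: two-fold Oaxaca-Blinder decomposition on synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 2000
        # Groups A and B: columns are [intercept, education, experience]
        Xa = np.column_stack([np.ones(n), rng.normal(14, 2, n), rng.normal(10, 4, n)])
        Xb = np.column_stack([np.ones(n), rng.normal(13, 2, n), rng.normal(9, 4, n)])
        ya = Xa @ np.array([1.0, 0.08, 0.02]) + rng.normal(0, 0.3, n)
        yb = Xb @ np.array([0.9, 0.07, 0.02]) + rng.normal(0, 0.3, n)

        # Group-specific OLS wage equations
        beta_a, *_ = np.linalg.lstsq(Xa, ya, rcond=None)
        beta_b, *_ = np.linalg.lstsq(Xb, yb, rcond=None)

        gap = ya.mean() - yb.mean()
        explained = (Xa.mean(0) - Xb.mean(0)) @ beta_a  # endowment differences
        unexplained = Xb.mean(0) @ (beta_a - beta_b)    # coefficient differences
        print(f"gap={gap:.3f} explained={explained:.3f} "
              f"unexplained={unexplained:.3f}")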

  17. The "Public Opinion Survey of Human Attributes-Stuttering" (POSHA-S): Summary Framework and Empirical Comparisons

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.

    2011-01-01

    Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…

  18. The WASP and NGTS ground-based transit surveys

    NASA Astrophysics Data System (ADS)

    Wheatley, P. J.

    2015-10-01

    I will review the current status of ground-based exoplanet transit surveys, using the Wide Angle Search for Planets (WASP) and the Next Generation Transit Survey (NGTS) as specific examples. I will describe the methods employed by these surveys and show how planets from Neptune to Jupiter-size are detected and confirmed around bright stars. I will also give an overview of the remarkably wide range of exoplanet characterization that is made possible with large-telescope follow-up of these bright transiting systems. This characterization includes bulk composition and spin-orbit alignment, as well as atmospheric properties such as thermal structure, composition and dynamics. Finally, I will outline how ground-based photometric studies of transiting planets will evolve with the advent of new space-based surveys such as TESS and PLATO.
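
    The detection step such surveys rely on can be sketched with a box least squares period search; the example below injects a transit into a synthetic light curve and recovers its period. It assumes the astropy package; the cadence, depth, and period are arbitrary toy values, not WASP/NGTS parameters.

        # Sketch: recovering a transit period with a box least squares search,
        # the basic operation behind transit surveys. Light curve is synthetic.
        import numpy as np
        from astropy.timeseries import BoxLeastSquares

        rng = np.random.default_rng(42)
        t = np.arange(0.0, 30.0, 0.01)  # 30 days of observations, in days
        flux = 1.0 + 1e-3 * rng.standard_normal(t.size)
        period, duration, depth = 3.7, 0.12, 0.01
        in_transit = ((t % period) / duration) < 1.0
        flux[in_transit] -= depth       # inject box-shaped transits

        bls = BoxLeastSquares(t, flux)
        result = bls.autopower(duration)  # scan a grid of trial periods
        best = np.argmax(result.power)
        print(f"recovered period: {result.period[best]:.3f} d (true {period} d)")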

  19. Web-based application for Data INterpolation Empirical Orthogonal Functions (DINEOF) analysis

    NASA Astrophysics Data System (ADS)

    Tomazic, Igor; Alvera-Azcarate, Aida; Barth, Alexander; Beckers, Jean-Marie

    2014-05-01

    DINEOF (Data INterpolating Empirical Orthogonal Functions) is a powerful tool based on EOF decomposition, developed at the University of Liege/GHER for the reconstruction of missing data in satellite datasets, as well as for the reduction of noise and detection of outliers. DINEOF is openly available as a series of Fortran routines to be compiled by the user, and as binaries (that can be run directly without any compilation) for both Windows and Linux platforms. In order to facilitate the use of DINEOF and increase the number of interested users, we developed a web-based interface for DINEOF with the necessary parameters available to run a high-quality DINEOF analysis. This includes choosing a variable within a selected dataset, defining a domain, time range, and filtering criteria based on available variables in the dataset (e.g. quality flag, satellite zenith angle …), and defining the necessary DINEOF parameters. Results, including reconstructed data and calculated EOF modes, will be disseminated in NetCDF format using OpenDAP and a WMS server, allowing easy visualisation and analysis. First, we will include several satellite datasets of sea surface temperature and chlorophyll concentration obtained from the MyOcean data centre and already remapped to a regular grid (L3C). Later, based on users' requests, we plan to extend the number of datasets available for reconstruction.
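
    The core of DINEOF can be sketched in a few lines: fill the gaps with an initial guess, then iterate a truncated-SVD reconstruction, updating only the missing entries, until convergence. The toy matrix and the fixed mode count below are simplifying assumptions; DINEOF itself selects the optimal number of modes by cross-validation.

        # Sketch of the core DINEOF iteration on a toy space-by-time matrix.
        import numpy as np

        rng = np.random.default_rng(1)
        space, time, n_modes = 50, 40, 2

        # Low-rank "geophysical" field plus noise, with 30% of values missing
        truth = (np.outer(rng.normal(size=space), rng.normal(size=time))
                 + np.outer(rng.normal(size=space), rng.normal(size=time)))
        data = truth + 0.05 * rng.standard_normal(truth.shape)
        mask = rng.random(truth.shape) < 0.3
        field = np.where(mask, np.nan, data)

        filled = np.where(mask, np.nanmean(field), field)  # initial guess
        for _ in range(50):
            U, s, Vt = np.linalg.svd(filled, full_matrices=False)
            recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
            new = np.where(mask, recon, field)  # update only the gaps
            delta = np.max(np.abs(new - filled))
            filled = new
            if delta < 1e-6:
                break

        rms = np.sqrt(np.mean((filled[mask] - truth[mask]) ** 2))
        print("RMS error at gaps:", rms.round(3))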

  20. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  1. School-based survey participation: oral health and BMI survey of Ohio third graders.

    PubMed

    Detty, Amber M R

    2013-09-01

    During the 2009-2010 school year, the Ohio Department of Health conducted a statewide oral health and body mass index (BMI) screening survey among 3rd grade children. This marked the fifth school-based survey regarding the oral health of Ohio children since 1987. At 50 %, the participation rate of the 2009-2010 oral health and BMI survey was at the lowest level ever experienced. This study aimed to identify the factors associated with participation rates in a school-based survey. A stratified, random sample of 377 schools was drawn from the list of 1,742 Ohio public elementary schools with third grade students. All third grade children in the sampled schools with parent or guardian consent received an oral health screening and height/weight measurement by trained health professionals. Participation rates at the school level were then combined with data on school characteristics and survey implementation. Predictors of school form return, participation, and refusal rates were assessed by generalized linear modeling (GLM). High student mobility and larger school size were associated with lower form return (p = 0.000 and p = 0.001, respectively) and lower participation rates (p = 0.000 and p = 0.005, respectively). Surveying in the fall or spring (as opposed to winter) significantly decreased form return (p = 0.001 and p = 0.016, respectively) and participation rates (p = 0.008 and p = 0.002, respectively), while being surveyed by internal staff (versus external screeners) significantly increased form return (p = 0.003) and participation rates (p = 0.001). Efforts to increase participation should focus more on schools with higher student mobility and larger size. Additionally, participation could be improved by using internal staff and surveying during winter.

  2. An Empirical Model of Solar Indices and Hemispheric Power based on DMSP/SSUSI Data

    NASA Astrophysics Data System (ADS)

    Shaikh, D.; Jones, J.

    2014-09-01

    Aurorae are produced by the collision of charged energetic particles, typically electrons, with the Earth's neutral atmosphere, particularly in the high latitude regions. These particles originate predominantly from the solar wind, traverse the Earth's magnetosphere, and precipitate into the Earth's atmosphere, resulting in the emission of radiation in various frequency ranges. During this process, energetic electrons deposit their kinetic energy (tens of keV) in the upper atmosphere. The rate of electron kinetic energy deposited over the northern or southern region is called the electron hemispheric power (Hpe), measured in gigawatts (GW). Since the origin and dynamics of these precipitating charged particles are intimately connected to the kinetic and magnetic activity taking place in our Sun, they can be used as a proxy to determine many physical processes that drive space weather on Earth. In this paper, we examine correlations that can possibly exist between the Hpe component and various other geomagnetic parameters such as Kp, Ap, solar flux, and sunspot number. For this purpose, we evaluate a year (2012) of data from the Special Sensor Ultraviolet Spectrographic Imager (SSUSI) of the Defense Meteorological Satellite Program (DMSP) Flight 18 satellite. We find substantially strong correlations between Hpe and Kp, Ap, the sunspot number (SSN), and the solar flux density. The practical application of our empirical model is multifold. (i) We can determine/forecast the Kp index directly from the electron flux density and use it to drive a variety of space weather models that heavily rely on Kp input. (ii) The Kp and Ap forecasts from our empirical correlation model could be complementary to traditional ground-based magnetometer data.

  3. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to make a critical analysis of whether these policies can be considered organizational-ethical instruments that support healthcare institutions in taking institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of the results of former Belgian empirical studies on written institutional ethics policies on euthanasia, in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, content, and impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice.

  4. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.

  5. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators which can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development of higher education institutions in Thailand. The main purpose of this study was to develop empirical indicators of a…

  6. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort study including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
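
    A minimal sketch of the propensity score-based matching step described above, on synthetic data: a logistic model estimates each patient's probability of receiving inadequate empirical therapy, treated patients are matched to the nearest-propensity controls, and the matched risk difference is computed. Variable names and effect sizes are assumptions, not the cohort's.

        # Sketch: propensity score matching on synthetic data (not cohort data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 800
        X = rng.normal(size=(n, 3))  # e.g., age, Pitt score, Charlson index
        treated = rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
        mortality = (rng.random(n) < 0.15 + 0.1 * treated
                     + 0.05 * X[:, 1]).astype(int)

        # Propensity of receiving inadequate empirical therapy
        ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

        # 1:1 nearest-neighbor matching on the propensity (with replacement)
        t_idx = np.where(treated)[0]
        c_idx = np.where(~treated)[0]
        matches = c_idx[np.abs(ps[c_idx][None, :]
                               - ps[t_idx][:, None]).argmin(axis=1)]

        effect = mortality[t_idx].mean() - mortality[matches].mean()
        print(f"matched risk difference: {effect:.3f}")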

  7. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

    An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice, with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project-based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  8. Evaluation of Physically and Empirically Based Models for the Estimation of Green Roof Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.

    2010-12-01

    Green roofs and other urban green spaces can provide a variety of valuable benefits including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, and air pollution mitigation. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is a necessary tool for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications, and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical, yet poorly studied component of urban green space performance prediction and cite that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup situated on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the 500 m2 green roof of the building, with a roof membrane, drainage layer, and 10 cm media depth, and planted with a variety of Sedum species. Soil moisture sensors and qualitative runoff measurements are also recorded in the lysimeter, while a weather station situated on the rooftop records climatologic data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods without precipitation and drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for the estimation of ET from green roofs. Results have
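
    As an example of the empirically based PET models such a comparison draws on, the sketch below implements the Hargreaves-Samani equation; the abstract does not say which models the study used, so this is only a representative choice, and the inputs are illustrative.

        # Sketch: the Hargreaves-Samani (1985) equation, one widely used
        # empirically based PET model of the kind compared against lysimeter
        # AET. Ra must be extraterrestrial radiation expressed in mm/day of
        # evaporation equivalent; all input values below are illustrative.
        def hargreaves_pet(t_mean, t_max, t_min, ra):
            """Potential ET in mm/day."""
            return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

        # Midsummer day in New York (illustrative inputs)
        print(f"PET = {hargreaves_pet(26.0, 31.0, 21.0, 16.5):.2f} mm/day")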

  9. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) signal is a weak, nonlinear, and non-stationary signal that reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise, such as high/low frequency noise, powerline interference, and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, though not perfect, method for processing nonlinear and non-stationary signals like the ECG. Combining EMD with other algorithms is a good way to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.

  10. An empirically based steady state friction law and implications for fault stability.

    PubMed

    Spagnuolo, E; Nielsen, S; Violay, M; Di Toro, G

    2016-04-16

    Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1-20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.
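
    The classic steady-state law the abstract builds on can be written down in a few lines; the sketch below evaluates the Dieterich-Ruina form and the critical stiffness it implies. This is the textbook RSFL, not the proposed MFL, whose parameters are not given in the abstract; all values are illustrative.

        # Sketch: Dieterich-Ruina steady-state friction and critical stiffness.
        # Illustrative parameter values, not the paper's MFL calibration.
        import numpy as np

        mu0, a, b = 0.6, 0.010, 0.015       # reference friction, rate-state constants
        V0, Dc, sigma_n = 1e-6, 1e-5, 10e6  # m/s, m, Pa (illustrative)

        def mu_ss(V):
            """Steady-state friction coefficient at slip rate V (m/s)."""
            return mu0 + (a - b) * np.log(V / V0)

        # Velocity weakening (a - b < 0) permits instability when the loading
        # stiffness k falls below the critical stiffness k_c:
        k_c = (b - a) * sigma_n / Dc
        for v in np.logspace(-7, 0, 8):
            print(f"V = {v:8.1e} m/s  mu_ss = {mu_ss(v):.3f}")
        print(f"critical stiffness k_c = {k_c:.3g} Pa/m")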

  11. PDE-based nonlinear diffusion techniques for denoising scientific and industrial images: an empirical study

    NASA Astrophysics Data System (ADS)

    Weeratunga, Sisira K.; Kamath, Chandrika

    2002-05-01

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, we focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. We complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. We explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. We also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. Our empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
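
    A minimal implementation of the Perona-Malik scheme the study builds on is sketched below: explicit time stepping with an edge-stopping diffusivity. The image, kappa, and iteration count are illustrative choices, not the study's settings.

        # Sketch: explicit Perona-Malik nonlinear diffusion on a noisy image.
        import numpy as np

        def perona_malik(img, n_iter=50, kappa=0.2, dt=0.2):
            u = img.astype(float).copy()
            for _ in range(n_iter):
                # One-sided differences to the four neighbors (periodic edges)
                dn = np.roll(u, -1, 0) - u
                ds = np.roll(u, 1, 0) - u
                de = np.roll(u, -1, 1) - u
                dw = np.roll(u, 1, 1) - u
                # Edge-stopping diffusivity g(d) = exp(-(d / kappa)^2)
                g = lambda d: np.exp(-(d / kappa) ** 2)
                u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
            return u

        # Noisy step edge: diffusion smooths the flats but preserves the jump
        rng = np.random.default_rng(0)
        img = np.zeros((64, 64))
        img[:, 32:] = 1.0
        noisy = img + 0.1 * rng.standard_normal(img.shape)
        denoised = perona_malik(noisy)
        print("noise std before/after:",
              noisy[:, :30].std().round(3), denoised[:, :30].std().round(3))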

  12. Feasibility of an empirically based program for parents of preschoolers with autism spectrum disorder.

    PubMed

    Dababnah, Sarah; Parish, Susan L

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, The Incredible Years, tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the intervention. Quantitative assessments of the program revealed fidelity was generally maintained, with the exception of program-specific videos. Qualitative data from individual post-intervention interviews reported parents benefited most from child emotion regulation strategies, play-based child behavior skills, parent stress management, social support, and visual resources. More work is needed to further refine the program to address parent self-care, partner relationships, and the diverse behavioral and communication challenges of children across the autism spectrum. Furthermore, parent access and retention could potentially be increased by providing in-home childcare vouchers and a range of times and locations in which to offer the program. The findings suggest The Incredible Years is a feasible intervention for parents seeking additional support for child- and family-related challenges and offers guidance to those communities currently using The Incredible Years or other related parenting programs with families of children with autism spectrum disorder.

  13. Phospholipid-based nonlamellar mesophases for delivery systems: bridging the gap between empirical and rational design.

    PubMed

    Martiel, Isabelle; Sagalowicz, Laurent; Mezzenga, Raffaele

    2014-07-01

    Phospholipids are ubiquitous cell membrane components and relatively well-accepted ingredients due to their natural origin. Phosphatidylcholine (PC) in particular offers a promising alternative to monoglycerides for lyotropic liquid crystalline (LLC) delivery system applications in the food, cosmetics and pharmaceutical industries, provided its strong tendency to form zero-mean curvature lamellar mesophases in water can be overcome. Higher negative curvatures are usually reached through the addition of a third lipid component, forming a ternary diagram phospholipid/water/oil. The initial part of this work summarizes the potential advantages and the challenges of phospholipid-based delivery system applications. In the next part, various ternary PC/water/oil systems are discussed, with a special emphasis on the PC/water/cyclohexane and PC/water/α-tocopherol systems. We report that R-(+)-limonene has a quantitatively similar effect to cyclohexane. The last part is devoted to the theoretical interpretation of the observed phase behaviors. A fruitful parallel is drawn with PC polymer-like reverse micelles, leading to a thermodynamic description in terms of interfacial bending energy. Investigations at the molecular level are reviewed to help in bridging the empirical and theoretical approaches. Predictive rules are finally derived from this wide-ranging overview, thereby opening the way to a future rational design of PC-based LLC delivery systems.

  14. Predicting Protein Secondary Structure Using Consensus Data Mining (CDM) Based on Empirical Statistics and Evolutionary Information.

    PubMed

    Kandoi, Gaurav; Leelananda, Sumudu P; Jernigan, Robert L; Sen, Taner Z

    2017-01-01

    Predicting the secondary structure of a protein from its sequence still remains a challenging problem. Prediction accuracies remain around 80%, even across very diverse methods. Using evolutionary information and machine learning algorithms in particular has had the most impact. In this chapter, we will first define secondary structures, then we will review the Consensus Data Mining (CDM) technique based on the robust GOR algorithm and Fragment Database Mining (FDM) approach. GOR V is an empirical method utilizing a sliding window approach to model the secondary structural elements of a protein by making use of generalized evolutionary information. FDM uses data mining from experimental structure fragments, and is able to successfully predict the secondary structure of a protein by combining experimentally determined structural fragments based on sequence similarities of the fragments. The CDM method combines predictions from GOR V and FDM in a hierarchical manner to produce consensus predictions for secondary structure: where matching sequence fragments are not available, CDM falls back on GOR V to make the prediction. The online server of CDM is available at http://gor.bb.iastate.edu/cdm/.
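
    The hierarchical consensus step lends itself to a few lines of code. The sketch below assumes hypothetical fdm_predict and gor5_predict callables returning per-residue states; it illustrates the fallback logic only, not the CDM server's implementation.

        def cdm_predict(sequence, fdm_predict, gor5_predict):
            """Hierarchical consensus in the spirit of CDM. Both callables
            (hypothetical here) return one character per residue from
            {'H', 'E', 'C'}, with '-' marking positions where no matching
            structural fragment was found. FDM output is preferred; GOR V
            fills the gaps."""
            fdm = fdm_predict(sequence)
            gor = gor5_predict(sequence)
            return ''.join(f if f != '-' else g for f, g in zip(fdm, gor))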

  15. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation

    PubMed Central

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-01-01

    Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method for prompt, informed reconstruction of strain/stress responses at the hot spots of structures based on strain measurements at remote locations. The structural responses measured by the usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Then, two numerical examples (a two-span beam and a 19,956-degree-of-freedom simplified airfoil) are used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam of a type commonly used in air vehicles and spacecraft. The experiments collected the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions and a detailed error analysis is also provided. PMID:27537889
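
    The decomposition step can be sketched compactly. Below is a minimal, illustrative sifting loop for extracting intrinsic mode functions with cubic-spline envelopes through local extrema; the stopping tolerance, iteration caps, and demo signal are assumptions, and a production EMD would also need careful treatment of envelope end effects.

        import numpy as np
        from scipy.signal import find_peaks
        from scipy.interpolate import CubicSpline

        def sift_imf(x, t, max_iter=100, tol=0.05):
            """Extract one intrinsic mode function (IMF) from x(t) by sifting."""
            h = x.copy()
            for _ in range(max_iter):
                peaks, _ = find_peaks(h)
                troughs, _ = find_peaks(-h)
                if len(peaks) < 3 or len(troughs) < 3:
                    break  # too few extrema to build envelopes
                upper = CubicSpline(t[peaks], h[peaks])(t)
                lower = CubicSpline(t[troughs], h[troughs])(t)
                mean = 0.5 * (upper + lower)
                if np.mean(mean ** 2) / np.mean(h ** 2) < tol:
                    break  # envelope mean negligible: accept h as an IMF
                h = h - mean
            return h

        def emd(x, t, n_imfs=4):
            """Peel off successive IMFs; whatever is left is the residue."""
            imfs, residue = [], x.copy()
            for _ in range(n_imfs):
                imf = sift_imf(residue, t)
                imfs.append(imf)
                residue = residue - imf
            return imfs, residue

        t = np.linspace(0.0, 1.0, 2000)
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
        imfs, res = emd(x, t, n_imfs=2)  # fast oscillation first, then slow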

  16. An empirically based steady state friction law and implications for fault stability

    PubMed Central

    Nielsen, S.; Violay, M.; Di Toro, G.

    2016-01-01

    Empirically based rate‐and‐state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1–20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest. PMID:27667875
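
    The abstract does not give the functional form of the proposed MFL, so the sketch below uses the classic Dieterich-Ruina steady-state law and the standard spring-slider critical stiffness as a stand-in to show how few parameters control the stability criterion; all parameter values are illustrative.

        import numpy as np

        # Classic Dieterich-Ruina steady state, standing in for the paper's
        # MFL (whose form the abstract does not give); values illustrative.
        mu0, a, b = 0.6, 0.010, 0.015       # b > a: velocity weakening
        V0, Dc, sigma_n = 1e-6, 1e-5, 10e6  # m/s, m, Pa

        def mu_ss(V):
            """Steady-state friction coefficient at slip rate V (m/s)."""
            return mu0 + (a - b) * np.log(V / V0)

        def k_critical():
            """Spring-slider critical stiffness (Pa/m): systems softer than
            k_c can slip unstably when friction is velocity weakening."""
            return (b - a) * sigma_n / Dc

        V = np.logspace(-7, 0.5, 200)  # ~0.1 um/s to ~3 m/s, the range above
        print(mu_ss(V).round(3)[:3], f"k_c = {k_critical():.3g} Pa/m")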

  17. An empirically based steady state friction law and implications for fault stability

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Nielsen, S.; Violay, M.; Di Toro, G.

    2016-04-01

    Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces with slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates (V < 1 cm/s), and their extrapolation to earthquake deformation conditions (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s < V < 3 m/s), describes the initiation of a substantial velocity weakening in the 1-20 cm/s range resulting in a critical stiffness increase that creates a peak of potential instability in that velocity regime. The MFL leads to a new definition of fault frictional stability with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.

  18. Vandenberg Air Force Base Emission Survey.

    DTIC Science & Technology

    1983-01-01

    meteorological conditions. These procedures are described in the following sections. The Toxic Hazard Corridor (THC) forecast is the method by which...areas downwind of planned vents and possible accidental spills of toxic chemicals. A THC forecast must be requested from Base Weather immediately...prior to the start of the operations for which the THC is needed. A THC must be computed for the following operations: (1) Transfer of toxic

  19. Surveying ourselves: examining the use of a web-based approach for a physician survey.

    PubMed

    Matteson, Kristen A; Anderson, Britta L; Pinto, Stephanie B; Lopes, Vrishali; Schulkin, Jay; Clark, Melissa A

    2011-12-01

    A survey was distributed, using a sequential mixed-mode approach, to a national sample of obstetrician-gynecologists. Differences between responses to the web-based mode and the on-paper mode were compared to determine if there were systematic differences between respondents. Only two differences between the two modes were identified. University-based physicians were more likely to complete the web-based mode than private practice physicians. Mail respondents reported a greater volume of endometrial ablations compared to online respondents. The web-based mode had better data quality than the paper-based mailed mode, in terms of fewer missing and inappropriate responses. Together, these findings suggest that, although a few differences were identified, the web-based survey mode attained adequate representativeness and improved data quality. Given the metrics examined for this study, exclusive use of web-based data collection may be appropriate for physician surveys, with a minimal reduction in sample coverage and without a reduction in data quality.

  20. The Potential for Empirically Based Estimates of Expected Progress for Students with Learning Disabilities: Legal and Conceptual Issues.

    ERIC Educational Resources Information Center

    Stone, C. Addison; Doane, J. Abram

    2001-01-01

    The purpose of this article is to spark discussion regarding the value and feasibility of empirically based procedures for goal setting and evaluation of educational services. Recent legal decisions and policy debates point to the need for clearer criteria in decisions regarding appropriate educational services. Possible roles for school…

  1. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  2. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    ERIC Educational Resources Information Center

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  3. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  4. Target-based fiber assignment for large survey spectrographs

    NASA Astrophysics Data System (ADS)

    Schaefer, Christoph E. R.; Makarem, Laleh; Kneib, Jean-Paul

    2016-07-01

    Next-generation massive spectroscopic survey projects must process very large numbers of targets, and the preparation of subsequent observations should be feasible in a reasonable amount of time. We present a fast algorithm for target assignment that scales as O(log(n)). Our proposed algorithm follows a target-based approach, which enables large numbers of targets to be assigned to their positioners quickly and with very high assignment efficiency. We also discuss additional optimization of the fiber positioning problem to take into account positioner collisions, and how to use the algorithm in an optimal survey strategy. We apply our target-based algorithm in the context of the MOONS project.
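
    A hedged sketch of the core idea: indexing positioners in a k-d tree gives O(log n) nearest-neighbour queries, and a greedy pass over targets yields an assignment. This ignores the positioner-collision constraints the authors treat separately, and all geometry here is synthetic.

        import numpy as np
        from scipy.spatial import cKDTree

        def assign_targets(targets, positioners, patrol_radius):
            """Greedy target-based assignment: each target queries a k-d tree
            of positioner centres (O(log n) per query) and takes the nearest
            still-free positioner within its patrol radius."""
            tree = cKDTree(positioners)
            taken, assignment = set(), {}
            for i, xy in enumerate(targets):
                dists, idxs = tree.query(xy, k=4, distance_upper_bound=patrol_radius)
                for d, j in zip(dists, idxs):
                    # missing neighbours come back with d = inf
                    if np.isfinite(d) and j not in taken:
                        assignment[i] = j
                        taken.add(j)
                        break
            return assignment

        rng = np.random.default_rng(0)
        pos = rng.uniform(0, 100, size=(500, 2))   # synthetic positioner grid
        tgt = rng.uniform(0, 100, size=(800, 2))   # synthetic target field
        print(len(assign_targets(tgt, pos, patrol_radius=5.0)), "targets assigned")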

  5. Empirical likelihood-based confidence intervals for length-biased data

    PubMed Central

    Ning, J.; Qin, J.; Asgharian, M.; Shen, Y.

    2013-01-01

    Logistic or other constraints often preclude the possibility of conducting incident cohort studies. A feasible alternative in such cases is to conduct a cross-sectional prevalent cohort study for which we recruit prevalent cases, i.e. subjects who have already experienced the initiating event, say the onset of a disease. When the interest lies in estimating the lifespan between the initiating event and a terminating event, say death for instance, such subjects may be followed prospectively until the terminating event or loss to follow-up, whichever happens first. It is well known that prevalent cases have, on average, longer lifespans. As such they do not constitute a representative random sample from the target population; they comprise a biased sample. If the initiating events are generated from a stationary Poisson process, the so-called stationarity assumption, this bias is called length bias. The current literature on length-biased sampling lacks a simple method for estimating the margin of errors of commonly used summary statistics. We fill this gap using the empirical likelihood-based confidence intervals by adapting this method to right-censored length-biased survival data. Both large and small sample behaviors of these confidence intervals are studied. We illustrate our method using a set of data on survival with dementia, collected as part of the Canadian Study of Health and Aging. PMID:23027662
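
    For intuition about the length-bias correction itself (not the paper's empirical-likelihood construction, and ignoring right censoring), note that under the stationarity assumption a subject is sampled with probability proportional to its lifespan, so weighting observations by 1/x undoes the bias; a simple bootstrap then gives a rough interval. The sketch below is this simpler stand-in.

        import numpy as np

        def lb_mean_ci(x, n_boot=2000, alpha=0.05, seed=1):
            """Length-bias-corrected mean with a percentile bootstrap CI.
            Sampling probability is proportional to lifetime x, so the
            harmonic-type estimator len(x) / sum(1/x) is consistent for the
            unbiased population mean. A simpler stand-in for the paper's
            empirical-likelihood intervals; censoring is ignored."""
            x = np.asarray(x, dtype=float)
            est = len(x) / np.sum(1.0 / x)
            rng = np.random.default_rng(seed)
            boots = [
                len(s) / np.sum(1.0 / s)
                for s in (rng.choice(x, size=len(x), replace=True)
                          for _ in range(n_boot))
            ]
            lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
            return est, (lo, hi)

        # Length-biased draws from an Exp(mean=2) population: the sampled
        # density is proportional to x * f(x), i.e. a Gamma(2, 2) draw.
        sample = np.random.default_rng(3).gamma(shape=2.0, scale=2.0, size=300)
        print(lb_mean_ci(sample))  # estimate should be near 2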

  6. Polarizable Empirical Force Field for Aromatic Compounds Based on the Classical Drude Oscillator

    PubMed Central

    Lopes, Pedro E. M.; Lamoureux, Guillaume; Roux, Benoit; MacKerell, Alexander D.

    2008-01-01

    The polarizable empirical CHARMM force field based on the classical Drude oscillator has been extended to the aromatic compounds benzene and toluene. Parameters were optimized for benzene and then transferred directly to toluene, with parameters for the methyl moiety of toluene taken from the previously published work on the alkanes. Optimization of all parameters was performed against an extensive set of quantum mechanical and experimental data. Ab initio data were used for the determination of the electrostatic parameters, for the vibrational analysis, and for the optimization of the relative magnitudes of the Lennard-Jones parameters. The absolute values of the Lennard-Jones parameters were determined by comparing computed and experimental heats of vaporization, molecular volumes, free energies of hydration and dielectric constants. The newly developed parameter set was extensively tested against additional experimental data such as vibrational spectra in the condensed phase, diffusion constants, heat capacities at constant pressure and isothermal compressibilities, including data as a function of temperature. Moreover, the structure of liquid benzene, liquid toluene and of solutions of each in water were studied. In the case of benzene, the computed and experimental total distribution functions were compared, with the developed model shown to be in excellent agreement with experiment. PMID:17388420

  7. Interdigitated silver-polymer-based antibacterial surface system activated by oligodynamic iontophoresis - an empirical characterization study.

    PubMed

    Shirwaiker, Rohan A; Wysk, Richard A; Kariyawasam, Subhashinie; Voigt, Robert C; Carrion, Hector; Nembhard, Harriet Black

    2014-02-01

    There is a pressing need to control the occurrences of nosocomial infections due to their detrimental effects on patient well-being and the rising treatment costs. To prevent the contact transmission of such infections via health-critical surfaces, a prophylactic surface system that consists of an interdigitated array of oppositely charged silver electrodes with polymer separations and utilizes oligodynamic iontophoresis has been recently developed. This paper presents a systematic study that empirically characterizes the effects of the surface system parameters on its antibacterial efficacy, and validates the system's effectiveness. In the first part of the study, a fractional factorial design of experiments (DOE) was conducted to identify the statistically significant system parameters. The data were used to develop a first-order response surface model to predict the system's antibacterial efficacy based on the input parameters. In the second part of the study, the effectiveness of the surface system was validated by evaluating it against four bacterial species responsible for several nosocomial infections - Staphylococcus aureus, Escherichia coli, Pseudomonas aeruginosa, and Enterococcus faecalis - alongside non-antibacterial polymer (acrylic) control surfaces. The system demonstrated statistically significant efficacy against all four bacteria. The results indicate that given a constant total effective surface area, the system designed with micro-scale features (minimum feature width: 20 μm) and activated by 15 μA direct current will provide the most effective antibacterial prophylaxis.
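
    The first-order response surface mentioned above is an ordinary least-squares fit to coded factor levels. The sketch below uses a hypothetical 2^3 factorial design with made-up factor names and responses, purely to show the mechanics.

        import numpy as np

        # Hypothetical coded 2^3 factorial design: three two-level factors
        # (say current, feature width, exposure time) with invented responses
        # (log bacterial reduction). Purely illustrative numbers.
        X = np.array([
            [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
            [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
        ], dtype=float)
        y = np.array([1.2, 2.9, 1.0, 2.5, 1.9, 3.8, 1.6, 3.4])

        # First-order response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x3
        A = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("intercept and main effects:", beta.round(3))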

  8. Empirical prediction of Indian summer monsoon rainfall with different lead periods based on global SST anomalies

    NASA Astrophysics Data System (ADS)

    Pai, D. S.; Rajeevan, M.

    2006-02-01

    The main objective of this study was to develop empirical models with different seasonal lead time periods for the long range prediction of seasonal (June to September) Indian summer monsoon rainfall (ISMR). For this purpose, 13 predictors having significant and stable relationships with ISMR were derived by the correlation analysis of global grid point seasonal Sea-Surface Temperature (SST) anomalies and the tendency in the SST anomalies. The time lags of the seasonal SST anomalies were varied from 1 season to 4 years behind the reference monsoon season. The basic SST data set used was the monthly NOAA Extended Reconstructed Global SST (ERSST) data at a 2° × 2° spatial grid for the period 1951–2003. The time lags of the 13 predictors derived from various areas of all three tropical ocean basins (Indian, Pacific and Atlantic Oceans) varied from 1 season to 3 years. Based on these inter-correlated predictors, three predictor subsets A, B and C were formed with prediction lead time periods of 0, 1 and 2 seasons, respectively, from the beginning of the monsoon season. The selected principal components (PCs) of these predictor sets were used as the input parameters for models A, B and C, respectively. The model development period was 1955–1984. The correct model size was derived using the all-possible-regressions procedure and Mallows' Cp statistic.
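
    Using selected principal components of inter-correlated predictors as regression inputs is principal component regression. A minimal sketch with synthetic stand-ins for the 13 SST-based predictors (not the study's data) follows.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-ins: 30 training years x 13 predictors and a
        # rainfall index; the real inputs would be the correlation-selected
        # SST predictor subsets A/B/C described above.
        rng = np.random.default_rng(42)
        X = rng.normal(size=(30, 13))
        y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=30)

        # Principal component regression: standardise, keep leading PCs, regress.
        pcr = make_pipeline(StandardScaler(), PCA(n_components=4), LinearRegression())
        pcr.fit(X, y)
        print("in-sample R^2:", round(pcr.score(X, y), 3))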

  9. Ship classification using nonlinear features of radiated sound: an approach based on empirical mode decomposition.

    PubMed

    Bao, Fei; Li, Chen; Wang, Xinlong; Wang, Qingfu; Du, Shuanping

    2010-07-01

    Classification for ship-radiated underwater sound is one of the most important and challenging subjects in underwater acoustical signal processing. An approach to ship classification is proposed in this work based on analysis of ship-radiated acoustical noise in subspaces of intrinsic mode functions attained via the ensemble empirical mode decomposition. It is shown that detection and acquisition of stable and reliable nonlinear features become practically feasible by nonlinear analysis of the time series of individual decomposed components, each of which is simple enough and well represents an oscillatory mode of ship dynamics. Surrogate and nonlinear predictability analyses are conducted to probe and measure the nonlinearity and regularity. The results of both methods, which verify each other, substantiate that ship-radiated noises contain components with deterministic nonlinear features well suited to efficient classification of ships. The approach may open an alternative avenue toward object classification and identification, and may also offer a new view of signals as complex as ship-radiated sound.

  10. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    NASA Astrophysics Data System (ADS)

    Le, Chengfeng; Lehrter, John C.; Hu, Chuanmin; Obenour, Daniel R.

    2016-03-01

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L-1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and volume were related to Moderate Resolution Imaging Spectroradiometer-derived monthly estimates of river plume area (km2) and average, inner shelf chlorophyll a concentration (Chl a, mg m-3). River plume area in June was negatively related with midsummer hypoxic area (km2) and volume (km3), while July inner shelf Chl a was positively related to hypoxic area and volume. Multiple regression models using river plume area and Chl a as independent variables accounted for most of the variability in hypoxic area (R2 = 0.92) or volume (R2 = 0.89). These models explain more variation in hypoxic area than models using Mississippi River nutrient loads as independent variables. The results here also support a hypothesis that confinement of the river plume to the inner shelf is an important mechanism controlling hypoxia area and volume in this region.

  11. Empirical Study of User Preferences Based on Rating Data of Movies.

    PubMed

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings.

  12. Empirical Study of User Preferences Based on Rating Data of Movies

    PubMed Central

    Zhao, YingSi; Shen, Bo

    2016-01-01

    User preference plays a prominent role in many fields, including electronic commerce, social opinion, and Internet search engines. Particularly in recommender systems, it directly influences the accuracy of the recommendation. Though many methods have been presented, most of these have only focused on how to improve the recommendation results. In this paper, we introduce an empirical study of user preferences based on a set of rating data about movies. We develop a simple statistical method to investigate the characteristics of user preferences. We find that the movies have potential characteristics of closure, which results in the formation of numerous cliques with a power-law size distribution. We also find that a user related to a small clique always has similar opinions on the movies in this clique. Then, we suggest a user preference model, which can eliminate the predictions that are considered to be impracticable. Numerical results show that the model can reflect user preference with remarkable accuracy when data elimination is allowed, and random factors in the rating data make prediction error inevitable. In further research, we will investigate many other rating data sets to examine the universality of our findings. PMID:26735847

  13. Empirical mode decomposition of digital mammograms for the statistical based characterization of architectural distortion.

    PubMed

    Zyout, Imad; Togneri, Roberto

    2015-01-01

    Among the different and common mammographic signs of early-stage breast cancer, architectural distortion is the most difficult to identify. In this paper, we propose a new multiscale statistical texture analysis to characterize the presence of architectural distortion by distinguishing between textural patterns of architectural distortion and normal breast parenchyma. The proposed approach first applies the bidimensional empirical mode decomposition algorithm to decompose each mammographic region of interest into a set of adaptive and data-driven two-dimensional intrinsic mode function (IMF) layers that capture details or high-frequency oscillations of the input image. Then, a model-based approach is applied to IMF histograms to acquire the first order statistics. The normalized entropy measure is also computed from each IMF and used as a complementary textural feature for the recognition of architectural distortion patterns. To evaluate the proposed characterization approach, we used a mammographic dataset of 187 true positive regions (i.e. depicting architectural distortion) and 887 true negative (normal parenchyma) regions, extracted from the DDSM database. Using the proposed multiscale textural features and the nonlinear support vector machine classifier, the best classification performance achieved, in terms of the area under the receiver operating characteristic curve (Az), was 0.88.

  14. Robust multitask learning with three-dimensional empirical mode decomposition-based features for hyperspectral classification

    NASA Astrophysics Data System (ADS)

    He, Zhi; Liu, Lin

    2016-11-01

    Empirical mode decomposition (EMD) and its variants have recently been applied for hyperspectral image (HSI) classification due to their ability to extract useful features from the original HSI. However, it remains a challenging task to effectively exploit the spectral-spatial information by the traditional vector or image-based methods. In this paper, a three-dimensional (3D) extension of EMD (3D-EMD) is proposed to naturally treat the HSI as a cube and decompose the HSI into varying oscillations (i.e. 3D intrinsic mode functions (3D-IMFs)). To achieve fast 3D-EMD implementation, 3D Delaunay triangulation (3D-DT) is utilized to determine the distances of extrema, while separable filters are adopted to generate the envelopes. Taking the extracted 3D-IMFs as features of different tasks, robust multitask learning (RMTL) is further proposed for HSI classification. In RMTL, pairs of low-rank and sparse structures are formulated by trace-norm and l1,2-norm to capture task relatedness and specificity, respectively. Moreover, the optimization problems of RMTL can be efficiently solved by the inexact augmented Lagrangian method (IALM). Compared with several state-of-the-art feature extraction and classification methods, the experimental results conducted on three benchmark data sets demonstrate the superiority of the proposed methods.

  15. The mature minor: some critical psychological reflections on the empirical bases.

    PubMed

    Partridge, Brian C

    2013-06-01

    Moral and legal notions engaged in clinical ethics should not only possess analytic clarity but a sound basis in empirical findings. The latter condition brings into question the expansion of the mature minor exception. The mature minor exception in the healthcare law of the United States has served to enable those under the legal age to consent to medical treatment. Although originally developed primarily for minors in emergency or quasi-emergency need for health care, it was expanded especially from the 1970s in order to cover unemancipated minors older than 14 years. This expansion initially appeared plausible, given psychological data that showed the intellectual capacity of minors over 14 to recognize the causal connection between their choices and the consequences of their choices. However, subsequent psychological studies have shown that minors generally fail to have realistic affective and evaluative appreciations of the consequences of their decisions, because they tend to over-emphasize short-term benefits and underestimate long-term risks. Also, unlike most decisionmakers over 21, the decisions of minors are more often marked by the lack of adequate impulse control, all of which is reflected in the far higher involvement of adolescents in acts of violence, intentional injury, and serious automobile accidents. These effects are more evident in circumstances that elicit elevated affective responses. The advent of brain imaging has allowed the actual visualization of qualitative differences between how minors versus persons over the age of 21 generally assess risks and benefits and make decisions. In the case of most under the age of 21, subcortical systems fail adequately to be checked by the prefrontal systems that are involved in adult executive decisions. The neuroanatomical and psychological model developed by Casey, Jones, and Summerville offers an empirical insight into the qualitative differences in the neuroanatomical and neuropsychological bases

  16. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They present evidence that an evaluation survey based on a theory that…

  17. A survey of the infrastructure for children's mental health services: implications for the implementation of empirically supported treatments (ESTs).

    PubMed

    Schoenwald, Sonja K; Chapman, Jason E; Kelleher, Kelly; Hoagwood, Kimberly Eaton; Landsverk, John; Stevens, Jack; Glisson, Charles; Rolls-Reutz, Jennifer

    2008-03-01

    A structured interview survey of directors of a large national sample (n = 200) of mental health service organizations treating children examined the governance, financing, staffing, services, and implementation practices of these organizations, and director ratings of factors important to the implementation of new treatments and services. Descriptive analyses showed that private organizations financing services with public (particularly Medicaid) funds are prevalent and that employment of professional staff, clinical supervision and training, productivity requirements, and outcomes monitoring are common. Results of random effects regression models (RRMs) evaluating associations between governance, financing, and organizational characteristics and the use of new treatments and services showed that for-profit organizations were more likely to implement such treatments, and that organizations with more licensed clinical staff and weekly clinical supervision in place were less likely to do so. Results of RRMs evaluating relations between director ratings of the importance to new treatment and service implementation of three factors (fit with existing implementation practices, infrastructure support, and organizational mission and support) suggest that these factors are of greater importance to public than to private organizations. Implications for EST implementation and future research are described.

  18. Hospital readiness for undertaking evidence-based practice: A survey.

    PubMed

    Nguyen, Thi Ngoc Minh; Wilson, Anne

    2016-12-01

    Despite the fact that evidence-based practice has increasing emphasis in health care, organizations are not always prepared for its implementation. Identifying organizational preparedness for implementing evidence-based practice is desirable prior to application. A cross-sectional survey was developed to explore nurses' perceptions of organizational support for evidence-based practice and was implemented via a self-enumerated survey completed by 234 nurses. Data were analyzed with descriptive and inferential statistics. Nurses reported that implementation of evidence-based practice is complex and fraught with challenges because of a lack of organizational support. A conceptual framework comprising three key factors (information resources, nursing leadership, and organizational infrastructure) was proposed to assist health authorities in the implementation of evidence-based practice. Suggestions of how organizations can be more supportive of research utilization in practice include establishing a library, journal clubs/mentoring programs, nurses' involvement in decision-making at the unit level, and a local nursing association.

  19. A novel signal compression method based on optimal ensemble empirical mode decomposition for bearing vibration signals

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Tse, Peter W.

    2013-01-01

    Today, remote machine condition monitoring is popular due to the continuous advancement in wireless communication. The bearing is the most frequently and easily failed component in many rotating machines. To accurately identify the type of bearing fault, large amounts of vibration data need to be collected. However, the volume of transmitted data cannot be too high because the bandwidth of wireless communication is limited. To solve this problem, the data are usually compressed before transmitting to a remote maintenance center. This paper proposes a novel signal compression method that can substantially reduce the amount of data that need to be transmitted without sacrificing the accuracy of fault identification. The proposed signal compression method is based on ensemble empirical mode decomposition (EEMD), which is an effective method for adaptively decomposing the vibration signal into different bands of signal components, termed intrinsic mode functions (IMFs). An optimization method was designed to automatically select appropriate EEMD parameters for the analyzed signal, and in particular to select the appropriate level of the added white noise in the EEMD method. An index termed the relative root-mean-square error was used to evaluate the decomposition performances under different noise levels to find the optimal level. After applying the optimal EEMD method to a vibration signal, the IMF relating to the bearing fault can be extracted from the original vibration signal. Compressing this signal component leaves a much smaller proportion of data samples to be retained for transmission and further reconstruction. The proposed compression method was also compared with the popular wavelet compression method. Experimental results demonstrate that the optimization can automatically find appropriate EEMD parameters for the analyzed signals, and the IMF-based compression method provides a higher compression ratio, while retaining the bearing defect
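
    The selection index can be sketched independently of any particular EEMD implementation. Below, decompose is a hypothetical placeholder for an EEMD routine (e.g., one wrapping a library such as PyEMD); scoring the highest-energy IMF against the raw signal is an illustrative reading of the relative root-mean-square error index, not the paper's exact definition.

        import numpy as np

        def relative_rmse(signal, component):
            """Relative root-mean-square error of a reconstruction/component
            against the raw signal (one plausible reading of the index)."""
            err = signal - component
            return np.sqrt(np.mean(err ** 2) / np.mean(signal ** 2))

        def pick_noise_level(signal, decompose, levels=(0.01, 0.05, 0.1, 0.2, 0.4)):
            """decompose(signal, noise_std) -> list of IMFs; a hypothetical
            placeholder for an EEMD routine. Scores the highest-energy IMF of
            each run against the raw signal and returns the best noise level."""
            scores = {}
            for lv in levels:
                imfs = decompose(signal, lv)
                dominant = max(imfs, key=lambda m: np.sum(m ** 2))
                scores[lv] = relative_rmse(signal, dominant)
            return min(scores, key=scores.get), scores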

  20. THE COS-HALOS SURVEY: AN EMPIRICAL DESCRIPTION OF METAL-LINE ABSORPTION IN THE LOW-REDSHIFT CIRCUMGALACTIC MEDIUM

    SciTech Connect

    Werk, Jessica K.; Prochaska, J. Xavier; Tripp, Todd M.; O'Meara, John M.; Peeples, Molly S.

    2013-02-15

    We present the equivalent width and column density measurements for low and intermediate ionization states of the circumgalactic medium (CGM) surrounding 44 low-z, L ≈ L* galaxies drawn from the COS-Halos survey. These measurements are derived from far-UV transitions observed in HST/COS and Keck/HIRES spectra of background quasars within an impact parameter R < 160 kpc to the targeted galaxies. The data show significant metal-line absorption for 33 of the 44 galaxies, including quiescent systems, revealing the common occurrence of a cool (T ≈ 10^4-10^5 K), metal-enriched CGM. The detection rates and column densities derived for these metal lines decrease with increasing impact parameter, a trend we interpret as a declining metal surface density profile for the CGM. A comparison of the relative column densities of adjacent ionization states indicates that the gas is predominantly ionized. The large surface density in metals demands a large reservoir of metals and gas in the cool CGM (very conservatively, M_CGM^cool > 10^9 M_Sun), which likely traces a distinct density and/or temperature regime from the highly ionized CGM traced by O^+5 absorption. The large dispersion in absorption strengths (including non-detections) suggests that the cool CGM traces a wide range of densities or a mix of local ionizing conditions. Lastly, the kinematics inferred from the metal-line profiles are consistent with the cool CGM being bound to the dark matter halos hosting the galaxies; this gas may serve as fuel for future star formation. Future work will leverage this data set to provide estimates on the mass, metallicity, dynamics, and origin of the cool CGM in low-z, L* galaxies.

  1. How much does participatory flood management contribute to stakeholders' social capacity building? Empirical findings based on a triangulation of three evaluation approaches

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Menzel, S.; Home, R.

    2013-06-01

    Recent literature suggests that dialogic forms of risk communication are more effective in building stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of univocal empirical evidence on the relevance of these effects. This is mainly due to the methodological limitations of the existing evaluation approaches. In our paper we aim to elicit the contribution of participatory river revitalisation projects to stakeholders' social capacity building by triangulating the findings of three evaluation studies that were based on different approaches: a field-experimental, a qualitative long-term ex-post, and a cross-sectional household survey approach. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvements should be more explicitly designed as tools for long-term social learning.

  2. School-Based Budgeting Survey Study of Pilot Schools.

    ERIC Educational Resources Information Center

    Robinson, Carol

    This report describes results of a survey on school-based budgeting (SBB) in the Albuquerque, New Mexico, public schools (APS). SBB began in the 1986-87 school year at 33 of the 116 APS schools and alternative schools, with 16 elementary, 11 middle, and 6 high schools, participating in the first year. A total of 131 responses were received from…

  3. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  4. Selecting Great Lakes streams for lampricide treatment based on larval sea lamprey surveys

    USGS Publications Warehouse

    Christie, Gavin C.; Adams, Jean V.; Steeves, Todd B.; Slade, Jeffrey W.; Cuddy, Douglas W.; Fodale, Michael F.; Young, Robert J.; Kuc, Miroslaw; Jones, Michael L.

    2003-01-01

    The Empiric Stream Treatment Ranking (ESTR) system is a data-driven, model-based, decision tool for selecting Great Lakes streams for treatment with lampricide, based on estimates from larval sea lamprey (Petromyzon marinus) surveys conducted throughout the basin. The 2000 ESTR system was described and applied to larval assessment surveys conducted from 1996 to 1999. A comparative analysis of stream survey and selection data was conducted and improvements to the stream selection process were recommended. Streams were selected for treatment based on treatment cost, predicted treatment effectiveness, and the projected number of juvenile sea lampreys produced. On average, lampricide treatments were applied annually to 49 streams with 1,075 ha of larval habitat, killing 15 million larval and 514,000 juvenile sea lampreys at a total cost of $5.3 million, and marginal and mean costs of $85 and $10 per juvenile killed. The numbers of juvenile sea lampreys killed for given treatment costs showed a pattern of diminishing returns with increasing investment. Of the streams selected for treatment, those with > 14 ha of larval habitat targeted 73% of the juvenile sea lampreys for 60% of the treatment cost. Suggested improvements to the ESTR system were to improve accuracy and precision of model estimates, account for uncertainty in estimates, include all potentially productive streams in the process (not just those surveyed in the current year), consider the value of all larvae killed during treatment (not just those predicted to metamorphose the following year), use lake-specific estimates of damage, and establish formal suppression targets.
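
    The diminishing-returns pattern can be illustrated with a toy budget-constrained selection: rank candidate streams by cost per juvenile killed and select greedily until the budget is exhausted. The stream data below are invented, and the real ESTR system ranks on model-based estimates rather than raw figures like these.

        # Invented stream data: (name, treatment cost in $, predicted juveniles).
        streams = [
            ("A", 120_000, 22_000), ("B", 60_000, 9_000), ("C", 250_000, 18_000),
            ("D", 40_000, 7_500), ("E", 90_000, 4_000),
        ]
        budget = 300_000

        # Rank by $ per juvenile killed (ascending) and select greedily.
        ranked = sorted(streams, key=lambda s: s[1] / s[2])
        selected, spent, killed = [], 0, 0
        for name, cost, juveniles in ranked:
            if spent + cost <= budget:
                selected.append(name)
                spent += cost
                killed += juveniles

        print(selected, f"spent ${spent:,}, ~{killed:,} juveniles targeted,"
              f" mean ${spent / killed:.2f} per juvenile")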

  5. Space-based infrared surveys of small bodies

    NASA Astrophysics Data System (ADS)

    Mommert, M.

    2014-07-01

    Most small bodies in the Solar System are too small and too distant to be spatially resolved, precluding a direct diameter derivation. Furthermore, measurements of the optical brightness alone only allow a rough estimate of the diameter, since the surface albedo is usually unknown and can have values between about 3 % and 60 % or more. The degeneracy can be resolved by considering the thermal emission of these objects, which is less prone to albedo effects and mainly a function of the diameter. Hence, the combination of optical and thermal-infrared observational data provides a means to independently derive an object's diameter and albedo. This technique is used in asteroid thermal models or more sophisticated thermophysical models (see, e.g., [1]). Infrared observations require cryogenic detectors and/or telescopes, depending on the actual wavelength range observed. Observations from the ground are additionally compromised by the variable transparency of Earth's atmosphere in major portions of the infrared wavelength ranges. Hence, space-based infrared telescopes, providing stable conditions and significantly better sensitivities than ground-based telescopes, are now used routinely to exploit this wavelength range. Two observation strategies are used with space-based infrared observatories: Space-based Infrared All-Sky Surveys. Asteroid surveys in the thermal infrared are less prone to albedo-related discovery bias compared to surveys with optical telescopes, providing a more complete picture of small body populations. The first space-based infrared survey of Solar System small bodies was performed with the Infrared Astronomical Satellite (IRAS) for 10 months in 1983. In the course of the 'IRAS Minor Planet Survey' [2], 2228 asteroids (3 new discoveries) and more than 25 comets (6 new discoveries) were observed. More recent space-based infrared all-sky asteroid surveys were performed by Akari (launched 2006) and the Wide-field Infrared Survey Explorer (WISE
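
    The optical/thermal degeneracy described above comes down to one standard relation between diameter D (km), geometric albedo p_V, and absolute magnitude H: D = (1329 / sqrt(p_V)) * 10^(-H/5). The short sketch below shows how strongly the unknown albedo leverages the inferred diameter.

        import math

        def diameter_km(H, p_V):
            """Standard asteroid size relation: D = (1329 / sqrt(p_V)) * 10**(-H/5),
            with H the absolute magnitude and p_V the geometric albedo."""
            return 1329.0 / math.sqrt(p_V) * 10 ** (-H / 5.0)

        H = 18.0  # an illustrative absolute magnitude
        for p_V in (0.03, 0.15, 0.60):  # dark, moderate, bright surfaces
            print(f"p_V = {p_V:.2f} -> D = {diameter_km(H, p_V):.2f} km")
        # The ~4.5x spread across plausible albedos is the degeneracy that
        # thermal-infrared data resolve.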

  6. Survey on Existing Science Gateways - Based on DEGREE

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Claus, Steffen

    2010-05-01

    Science Gateways gather community-developed specific tools, applications, and data that is usually combined and presented in a graphical user interface which is customized to the needs of the target user community. Science Gateways serve as a single point of entry for the users and are usually represented by fat clients or web portals. Part of the DEGREE project (Dissemination and Exploitation of Grids in Earth Science) was a state-of-the-art survey of portal usage in Earth Science (ES) applications. This survey considered a list of 29 portals, including 17 ES portals and 12 generic developments coming from outside of the ES domain. The survey identified three common usage types of ES portals, including data dissemination (e.g. observational data), collaboration as well as usage of Grid-based resources (e.g. for processing of ES datasets). Based on these three usage types, key requirements could be extracted. These requirements were furthermore used for a feature comparison with existing portal developments coming from outside of the ES domain. This presentation gives an overview of the results of the survey (including a feature comparison of ES and non-ES portals). Furthermore, three portals are discussed in detail, one for each usage type (data dissemination, collaboration, Grid-based).

  7. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.

  8. Simulation of Long Lived Tracers Using an Improved Empirically Based Two-Dimensional Model Transport Algorithm

    NASA Technical Reports Server (NTRS)

    Fleming, E. L.; Jackman, C. H.; Stolarski, R. S.; Considine, D. B.

    1998-01-01

    We have developed a new empirically-based transport algorithm for use in our GSFC two-dimensional transport and chemistry model. The new algorithm contains planetary wave statistics, and parameterizations to account for the effects due to gravity waves and equatorial Kelvin waves. As such, this scheme utilizes significantly more information compared to our previous algorithm which was based only on zonal mean temperatures and heating rates. The new model transport captures much of the qualitative structure and seasonal variability observed in long lived tracers, such as: isolation of the tropics and the southern hemisphere winter polar vortex; the well mixed surf-zone region of the winter sub-tropics and mid-latitudes; the latitudinal and seasonal variations of total ozone; and the seasonal variations of mesospheric H2O. The model also indicates a double peaked structure in methane associated with the semiannual oscillation in the tropical upper stratosphere. This feature is similar in phase but is significantly weaker in amplitude compared to the observations. The model simulations of carbon-14 and strontium-90 are in good agreement with observations, both in simulating the peak in mixing ratio at 20-25 km, and the decrease with altitude in mixing ratio above 25 km. We also find mostly good agreement between modeled and observed age of air determined from SF6 outside of the northern hemisphere polar vortex. However, observations inside the vortex reveal significantly older air compared to the model. This is consistent with the model deficiencies in simulating CH4 in the northern hemisphere winter high latitudes and illustrates the limitations of the current climatological zonal mean model formulation. The propagation of seasonal signals in water vapor and CO2 in the lower stratosphere showed general agreement in phase, and the model qualitatively captured the observed amplitude decrease in CO2 from the tropics to midlatitudes. However, the simulated seasonal

  9. Model Selection for Equating Testlet-Based Tests in the NEAT Design: An Empirical Study

    ERIC Educational Resources Information Center

    He, Wei; Li, Feifei; Wolfe, Edward W.; Mao, Xia

    2012-01-01

    For tests composed solely of testlets, the local item independence assumption tends to be violated. This study used empirical data from a large-scale state assessment program to investigate the effects of using different models on equating results under the non-equivalent group anchor-test (NEAT) design. Specifically, the…

  10. Comparisons of experiment with cellulose models based on electronic structure and empirical force field theories

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Studies of cellobiose conformations with HF/6-31G* and B3LYP/6-31+G* quantum theory [1] gave a reference for studies with the much faster empirical methods such as MM3, MM4, CHARMM and AMBER. The quantum studies also enable a substantial reduction in the number of exo-cyclic group orientations that...

  11. Why Culture Matters: An Empirically-Based Pre-Deployment Training Program

    DTIC Science & Technology

    2005-09-01

    Johnston 1995; Monaghan and Just 2000; Salzman 2001; Lichbach and Zuckerman 2002; Wedeen 2002). Anthropologist Clifford Geertz (2000) agrees that...analysis of culture should be more descriptive or narrative in nature, rather than empirical. Geertz (2000) also took a multi-theoretical approach and

  12. Understanding Transactional Distance in Web-Based Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Chandra, Aruna; DePaolo, Concetta A.; Simmons, Lakisha L.

    2016-01-01

    Transactional distance is an important pedagogical theory in distance education that calls for more empirical support. The purpose of this study was to verify the theory by operationalizing and examining the relationship of (1) dialogue, structure and learner autonomy to transactional distance, and (2) environmental factors and learner demographic…

  13. Numerical simulation of bubble departure in subcooled pool boiling based on non-empirical boiling and condensation model

    NASA Astrophysics Data System (ADS)

    Ose, Y.; Kunugi, T.

    2013-07-01

    In this study, in order to clarify the heat transfer characteristics of subcooled boiling phenomena and to discuss their mechanism, a non-empirical boiling and condensation model has been adopted for numerical simulation. This model consists of an improved phase-change model and a consideration of a relaxation time based on the quasi-thermal equilibrium hypothesis. Transient three-dimensional numerical simulations based on MARS (Multi-interface Advection and Reconstruction Solver) with the non-empirical boiling and condensation model have been conducted for the behavior of an isolated boiling bubble in a subcooled pool. The subcooled bubble behaviors were investigated, including the growth of the nucleate bubble on the heating surface and its condensation and extinction after departing from the heating surface. In this paper, the bubble departure from the heating surface is discussed in detail. The overall numerical results showed very good agreement with the experimental results.

  14. The Effect of Survey Mode on High School Risk Behavior Data: A Comparison between Web and Paper-Based Surveys

    ERIC Educational Resources Information Center

    Raghupathy, Shobana; Hahn-Smith, Stephen

    2013-01-01

    There has been increasing interest in using web-based surveys, rather than paper-based surveys, for collecting data on alcohol and other drug use in middle and high schools in the US. However, prior research has indicated that respondent confidentiality is an underlying concern with online data collection, especially when computer-assisted…

  15. Upscaling Empirically Based Conceptualisations to Model Tropical Dominant Hydrological Processes for Historical Land Use Change

    NASA Astrophysics Data System (ADS)

    Toohey, R.; Boll, J.; Brooks, E.; Jones, J.

    2009-12-01

    Surface runoff and percolation to ground water are two hydrological processes of concern to the Atlantic slope of Costa Rica because of their impacts on flooding and drinking water contamination. As per legislation, the Costa Rican Government funds land use management from the farm to the regional scale to improve or conserve hydrological ecosystem services. In this study, we examined how land use (e.g., forest, coffee, sugar cane, and pasture) affects hydrological response at the point, plot (1 m2), and the field scale (1-6ha) to empirically conceptualize the dominant hydrological processes in each land use. Using our field data, we upscaled these conceptual processes into a physically-based distributed hydrological model at the field, watershed (130 km2), and regional (1500 km2) scales. At the point and plot scales, the presence of macropores and large roots promoted greater vertical percolation and subsurface connectivity in the forest and coffee field sites. The lack of macropores and large roots, plus the addition of management artifacts (e.g., surface compaction and a plough layer), altered the dominant hydrological processes by increasing lateral flow and surface runoff in the pasture and sugar cane field sites. Macropores and topography were major influences on runoff generation at the field scale. Also at the field scale, antecedent moisture conditions suggest a threshold behavior as a temporal control on surface runoff generation. However, in this tropical climate with very intense rainstorms, annual surface runoff was less than 10% of annual precipitation at the field scale. Significant differences in soil and hydrological characteristics observed at the point and plot scales appear to have less significance when upscaled to the field scale. At the point and plot scales, percolation acted as the dominant hydrological process in this tropical environment. However, at the field scale for sugar cane and pasture sites, saturation-excess runoff increased as

  16. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey.

    PubMed

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L; Bryant, Richard A; Li, Meng; Qian, Meng; Brown, Adam D

    2015-01-01

    Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient Health Questionnaire-9 (PHQ-9). These findings revealed that among human rights advocates who completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work.

  17. Mental Health Functioning in the Human Rights Field: Findings from an International Internet-Based Survey

    PubMed Central

    Joscelyne, Amy; Knuckey, Sarah; Satterthwaite, Margaret L.; Bryant, Richard A.; Li, Meng; Qian, Meng; Brown, Adam D.

    2015-01-01

    Human rights advocates play a critical role in promoting respect for human rights world-wide, and engage in a broad range of strategies, including documentation of rights violations, monitoring, press work and report-writing, advocacy, and litigation. However, little is known about the impact of human rights work on the mental health of human rights advocates. This study examined the mental health profile of human rights advocates and risk factors associated with their psychological functioning. 346 individuals currently or previously working in the field of human rights completed an internet-based survey regarding trauma exposure, depression, posttraumatic stress disorder (PTSD), resilience and occupational burnout. PTSD was measured with the Posttraumatic Stress Disorder Checklist-Civilian Version (PCL-C) and depression was measured with the Patient Health Questionnaire-9 (PHQ-9). These findings revealed that among human rights advocates who completed the survey, 19.4% met criteria for PTSD, 18.8% met criteria for subthreshold PTSD, and 14.7% met criteria for depression. Multiple linear regressions revealed that after controlling for symptoms of depression, PTSD symptom severity was predicted by human rights-related trauma exposure, perfectionism and negative self-appraisals about human rights work. In addition, after controlling for symptoms of PTSD, depressive symptoms were predicted by perfectionism and lower levels of self-efficacy. Survey responses also suggested high levels of resilience: 43% of responders reported minimal symptoms of PTSD. Although survey responses suggest that many human rights workers are resilient, they also suggest that human rights work is associated with elevated rates of PTSD and depression. The field of human rights would benefit from further empirical research, as well as additional education and training programs in the workplace about enhancing resilience in the context of human rights work. PMID:26700305

  18. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  19. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  20. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Certification (Calendar Year 1998 Survey Based on Survey... 1240—Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153) State Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar...

  1. HIV testing in national population-based surveys: experience from the Demographic and Health Surveys.

    PubMed Central

    Mishra, Vinod; Vaessen, Martin; Boerma, J. Ties; Arnold, Fred; Way, Ann; Barrere, Bernard; Cross, Anne; Hong, Rathavuth; Sangha, Jasbir

    2006-01-01

    OBJECTIVES: To describe the methods used in the Demographic and Health Surveys (DHS) to collect nationally representative data on the prevalence of human immunodeficiency virus (HIV) and assess the value of such data to country HIV surveillance systems. METHODS: During 2001-04, national samples of adult women and men in Burkina Faso, Cameroon, Dominican Republic, Ghana, Mali, Kenya, United Republic of Tanzania and Zambia were tested for HIV. Dried blood spot samples were collected for HIV testing, following internationally accepted ethical standards. The results for each country are presented by age, sex, and urban versus rural residence. To estimate the effects of non-response, HIV prevalence among non-responding males and females was predicted using multivariate statistical models for those who were tested, with a common set of predictor variables. RESULTS: Rates of HIV testing varied from 70% among Kenyan men to 92% among women in Burkina Faso and Cameroon. Despite large differences in HIV prevalence between the surveys (1-16%), fairly consistent patterns of HIV infection were observed by age, sex and urban versus rural residence, with considerably higher rates in urban areas and in women, especially at younger ages. Analysis of non-response bias indicates that although predicted HIV prevalence tended to be higher in non-tested males and females than in those tested, the overall effects of non-response on the observed national estimates of HIV prevalence are insignificant. CONCLUSIONS: Population-based surveys can provide reliable, direct estimates of national and regional HIV seroprevalence among men and women irrespective of pregnancy status. Survey data greatly enhance surveillance systems and the accuracy of national estimates in generalized epidemics. PMID:16878227
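
    The non-response adjustment described above (fit a model on tested respondents, then predict prevalence among the untested) can be illustrated with a minimal Python sketch. The covariates, sample sizes, and logistic model below are invented; the actual DHS analysis uses richer predictor sets and survey weights.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 4000
      age = rng.integers(15, 50, n)
      urban = rng.integers(0, 2, n)
      tested = rng.random(n) < 0.85                # ~15% non-response, assumed
      p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 0.02 * age + 0.8 * urban)))
      hiv = rng.random(n) < p_true                 # synthetic serostatus

      # Fit on tested respondents only, then predict for the untested group.
      X = np.column_stack([age, urban])
      model = LogisticRegression().fit(X[tested], hiv[tested])
      pred_untested = model.predict_proba(X[~tested])[:, 1].mean()
      print(f"observed prevalence (tested):    {hiv[tested].mean():.3%}")
      print(f"predicted prevalence (untested): {pred_untested:.3%}")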

  2. The processing of rotor startup signals based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Gai, Guanghong

    2006-01-01

    In this paper, we applied the empirical mode decomposition method to analyse rotor startup signals, which are non-stationary and contain a lot of additional information beyond that available from stationary running signals. The methodology developed in this paper decomposes the original startup signals into intrinsic oscillation modes, or intrinsic mode functions (IMFs). Then, according to the characteristics of the rotor system, we obtained the rotating frequency components from the corresponding IMFs for plotting Bode diagrams. The method can obtain the critical speed precisely without complex hardware support. The low-frequency components were extracted from these IMFs in the vertical and horizontal directions. Utilising these components, we constructed a drift locus of the rotor revolution centre, which provides significant information for the fault diagnosis of rotating machinery. We also proved that the empirical mode decomposition method is more precise than a Fourier filter for the extraction of low-frequency components.
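
    A minimal Python sketch of the decomposition step described above, assuming the third-party PyEMD package (installed as EMD-signal) and a synthetic run-up signal; the sampling rate and the choice of which IMFs carry the low-frequency drift are illustrative assumptions.

      import numpy as np
      from PyEMD import EMD   # pip install EMD-signal (assumed available)

      fs = 2048                                    # sampling rate (Hz), assumed
      t = np.arange(0, 4.0, 1.0 / fs)
      # Synthetic run-up signal: a swept rotating-frequency tone plus noise.
      startup = np.sin(2 * np.pi * (5 + 10 * t) * t) + 0.3 * np.random.randn(t.size)

      imfs = EMD()(startup)                        # rows: IMFs, fast to slow

      # The slow drift used for the revolution-centre locus sits in the last
      # (lowest-frequency) IMFs; which ones to keep is signal-dependent.
      drift = imfs[-2:].sum(axis=0)
      print(f"{imfs.shape[0]} IMFs extracted; drift std = {drift.std():.3f}")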

  3. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationships among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has positive impacts on business model innovation performance; strategic flexibility plays a mediating role in the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning, and exploitative learning plays a significant role in both radical and incremental business model innovation.

  4. Epicentral Location of Regional Seismic Events Based on Empirical Green’s Functions from Ambient Noise

    DTIC Science & Technology

    2010-09-01

    located and characterized by the University of Utah Seismic Stations (UUSS) and by the Department of Earth and Atmospheric Sciences at Saint Louis... of sources at different depths; e.g., earthquakes within Earth's crust, volcanic explosions, meteoritic impacts, explosions, mine collapses, or... not require knowledge of Earth structure. It works for weak events where the detection of body wave phases may be problematic. The empirical...

  5. A Reliability Test of a Complex System Based on Empirical Likelihood

    PubMed Central

    Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic of the complex system and extract the limit distribution of the test statistic. Therefore, we can obtain the confidence interval for reliability and make statistical inferences. The simulation studies also demonstrate the theorem results. PMID:27760130

  6. Measurement of COPD Severity Using a Survey-Based Score

    PubMed Central

    Omachi, Theodore A.; Katz, Patricia P.; Yelin, Edward H.; Iribarren, Carlos; Blanc, Paul D.

    2010-01-01

    Background: A comprehensive survey-based COPD severity score has usefulness for epidemiologic and health outcomes research. We previously developed and validated the survey-based COPD Severity Score without using lung function or other physiologic measurements. In this study, we aimed to further validate the severity score in a different COPD cohort and using a combination of patient-reported and objective physiologic measurements. Methods: Using data from the Function, Living, Outcomes, and Work cohort study of COPD, we evaluated the concurrent and predictive validity of the COPD Severity Score among 1,202 subjects. The survey instrument is a 35-point score based on symptoms, medication and oxygen use, and prior hospitalization or intubation for COPD. Subjects were systematically assessed using a structured telephone survey, spirometry, and 6-min walk testing. Results: We found evidence to support concurrent validity of the score. Higher COPD Severity Score values were associated with poorer FEV1 (r = −0.38), FEV1% predicted (r = −0.40), Body mass, Obstruction, Dyspnea, Exercise Index (r = 0.57), and distance walked in 6 min (r = −0.43) (P < .0001 in all cases). Greater COPD severity was also related to poorer generic physical health status (r = −0.49) and disease-specific health-related quality of life (r = 0.57) (P < .0001). The score also demonstrated predictive validity. It was associated with a greater prospective risk of acute exacerbation of COPD defined as ED visits (hazard ratio [HR], 1.31; 95% CI, 1.24-1.39), hospitalizations (HR, 1.59; 95% CI, 1.44-1.75), and either measure of hospital-based care for COPD (HR, 1.34; 95% CI, 1.26-1.41) (P < .0001 in all cases). Conclusion: The COPD Severity Score is a valid survey-based measure of disease-specific severity, both in terms of concurrent and predictive validity. The score is a psychometrically sound instrument for use in epidemiologic and outcomes research in COPD. PMID:20040611

  7. Empirical model of the thermospheric mass density based on CHAMP satellite observations

    NASA Astrophysics Data System (ADS)

    Liu, Huixin; Hirano, Takashi; Watanabe, Shigeto

    2013-02-01

    The decadal observations from the CHAMP satellite have provided ample information on the Earth's upper thermosphere, reshaping our understanding of the vertical coupling in the atmosphere and near-Earth space. An empirical model of the thermospheric mass density is constructed from these high-resolution observations using a multivariable least-squares fitting method. It describes the density variation with latitude, longitude, height, local time, season, and solar and geomagnetic activity levels within the altitude range of 350-420 km. It represents prominent thermospheric structures well, such as the equatorial mass density anomaly (EMA) and the wave-4 longitudinal pattern. Furthermore, the empirical model reveals two distinct features. First, the EMA is found to have a clear altitude dependence, with its crests moving equatorward with increasing altitude. Second, the equinoctial asymmetry is found to depend strongly on the solar cycle, with its magnitude and phase strongly regulated by solar activity levels. The equinoctial density maxima occur significantly after the actual equinox dates toward solar minimum, which may signal growing influence from lower-atmosphere forcing. This empirical model provides an instructive tool for exploring thermospheric density structures and dynamics. It can also easily be incorporated into other models to give a more accurate description of the background thermosphere, for both scientific and practical purposes.
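
    The multivariable least-squares fitting named above can be sketched in Python as follows. The harmonic design matrix (diurnal and annual terms plus a linear F10.7 term) is a guess at the general approach, not the authors' actual parametrization, and the data are synthetic.

      import numpy as np

      def design_matrix(lt, doy, f107):
          """Diurnal and annual harmonics plus a linear F10.7 term."""
          w_lt, w_doy = 2 * np.pi / 24.0, 2 * np.pi / 365.25
          return np.column_stack([
              np.ones_like(lt),
              np.cos(w_lt * lt), np.sin(w_lt * lt),      # local-time variation
              np.cos(w_doy * doy), np.sin(w_doy * doy),  # seasonal variation
              f107,                                      # solar activity proxy
          ])

      rng = np.random.default_rng(0)
      lt = rng.uniform(0, 24, 500)                       # local time (h)
      doy = rng.uniform(1, 365, 500)                     # day of year
      f107 = rng.uniform(70, 200, 500)                   # F10.7 index
      log_rho = 1e-3 * f107 + 0.1 * np.cos(2 * np.pi * lt / 24) \
                + rng.normal(0, 0.02, 500)               # synthetic log-density

      coef, *_ = np.linalg.lstsq(design_matrix(lt, doy, f107), log_rho, rcond=None)
      print("fitted coefficients:", np.round(coef, 4))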

  8. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools. That is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may facilitate the possibility of modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical data-based software accuracy test suite.

  9. Avian survey and field guide for Osan Air Base, Korea.

    SciTech Connect

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter Wing BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and, from the survey results, to determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with EGS Criterion 13-3.h.

  10. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets processed in practical clinical applications. With the rapidly improving performance of graphics processors, better programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical imaging applications. The major purpose of this survey is to provide a comprehensive reference source for starters and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  11. Road rage: prevalence pattern and web based survey feasibility.

    PubMed

    Mina, Shaily; Verma, Rohit; Balhara, Yatan Pal Singh; Ul-Hasan, Shiraz

    2014-01-01

    Introduction. Incidents of road rage are on the rise in India, but the literature on this aspect is lacking. There is an increasing realization of the possibility of delivering public health related messages through effective web-based interventions. Objective. The aim was to quantitatively evaluate risk factors among motor vehicle drivers using an internet-based survey. Methods. Facebook users were evaluated using the Life Orientation Test-Revised (LOT-R) and the Driving Anger Scale (DAS). Results. An adequate response rate of 65.9% and satisfactory reliability with sizable correlation were obtained for both scales. Age was found to be positively correlated with LOT-R scores (r = 0.21; P = 0.02) and negatively correlated with DAS scores (r = -0.19; P = 0.03). Years of education were correlated with LOT-R scores (r = 0.26; P = 0.005) but not with DAS scores (r = -0.14; P = 0.11). LOT-R scores did not correlate with DAS scores. Conclusion. There is a high prevalence of anger among drivers in India, particularly among younger males. A short web survey formatted in easy-to-use question language makes the conduct of an online survey feasible.

  12. Does community-based conservation shape favorable attitudes among locals? an empirical study from nepal.

    PubMed

    Mehta, J N; Heinen, J T

    2001-08-01

    Like many developing countries, Nepal has adopted a community-based conservation (CBC) approach in recent years to manage its protected areas mainly in response to poor park-people relations. Among other things, under this approach the government has created new "people-oriented" conservation areas, formed and devolved legal authority to grassroots-level institutions to manage local resources, fostered infrastructure development, promoted tourism, and provided income-generating trainings to local people. Of interest to policy-makers and resource managers in Nepal and worldwide is whether this approach to conservation leads to improved attitudes on the part of local people. It is also important to know if personal costs and benefits associated with various intervention programs, and socioeconomic and demographic characteristics influence these attitudes. We explore these questions by looking at the experiences in Annapurna and Makalu-Barun Conservation Areas, Nepal, which have largely adopted a CBC approach in policy formulation, planning, and management. The research was conducted during 1996 and 1997; the data collection methods included random household questionnaire surveys, informal interviews, and review of official records and published literature. The results indicated that the majority of local people held favorable attitudes toward these conservation areas. Logistic regression results revealed that participation in training, benefit from tourism, wildlife depredation issue, ethnicity, gender, and education level were the significant predictors of local attitudes in one or the other conservation area. We conclude that the CBC approach has potential to shape favorable local attitudes and that these attitudes will be mediated by some personal attributes.

  13. Rapid Mapping Method Based on Free Blocks of Surveys

    NASA Astrophysics Data System (ADS)

    Yu, Xianwen; Wang, Huiqing; Wang, Jinling

    2016-06-01

    When producing large-scale maps (larger than 1:2000) of cities or towns, obstruction from buildings makes measuring mapping control points a difficult and heavy task. In order to avoid measuring the mapping control points and to shorten the time of fieldwork, a rapid mapping method is proposed in this paper. This method adjusts many free blocks of surveys together and transforms the points from all free blocks of surveys into the same coordinate system. The entire surveying area is divided into many free blocks, and connection points are set on the boundaries between free blocks. An independent coordinate system for every free block is established via completely free station technology, and the coordinates of the connection points, detail points, and control points in every free block are obtained in the corresponding independent coordinate systems based on poly-directional open traverses. Error equations are established based on the connection points, which are adjusted together to obtain the transformation parameters. All points are transformed from the independent coordinate systems to a transitional coordinate system via the transformation parameters. Several control points are then measured by GPS in a geodetic coordinate system. All the points can then be transformed from the transitional coordinate system to the geodetic coordinate system. In this paper, the implementation process and mathematical formulas of the new method are presented in detail, and a formula to estimate the precision of the surveys is given. An example demonstrates that the precision of the new method can meet large-scale mapping needs.
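
    The core of the block adjustment (estimating a transformation for each free block from its connection points) can be sketched with a two-dimensional similarity (Helmert) transform in Python. A real adjustment solves all blocks jointly with error equations; the least-squares fit and coordinates below are illustrative only.

      import numpy as np

      def fit_similarity(src, dst):
          """Least-squares a, b, tx, ty with x' = a*x - b*y + tx, y' = b*x + a*y + ty."""
          A, rhs = [], []
          for (x, y), (X, Y) in zip(src, dst):
              A.append([x, -y, 1.0, 0.0]); rhs.append(X)
              A.append([y,  x, 0.0, 1.0]); rhs.append(Y)
          params, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(rhs), rcond=None)
          return params                              # a, b, tx, ty

      # Connection points known in the block frame (src) and target frame (dst).
      src = [(0, 0), (100, 0), (100, 80), (0, 80)]
      dst = [(500.2, 300.1), (600.1, 302.0), (598.6, 382.1), (498.5, 380.0)]
      a, b, tx, ty = fit_similarity(src, dst)

      x, y = 50.0, 40.0                              # any detail point in the block
      print("transformed:", a * x - b * y + tx, b * x + a * y + ty)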

  14. Methodology of the National School-based Health Survey in Malaysia, 2012.

    PubMed

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings.

  15. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549
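
    As a toy illustration of one generation algorithm the survey covers, the following Python sketch implements a miniature negative-selection scheme: random detectors are censored against "self" samples and surviving detectors flag anomalies. The real-valued encoding and matching radii are assumptions chosen for brevity.

      import numpy as np

      rng = np.random.default_rng(4)
      self_set = rng.normal(0.5, 0.05, (200, 3))   # "self": normal behaviour vectors
      censor_r, detect_r = 0.2, 0.1                # censoring / matching radii

      # Keep only random detectors that do not match any self sample.
      candidates = rng.uniform(0, 1, (3000, 3))
      keep = np.array([np.linalg.norm(self_set - c, axis=1).min() > censor_r
                       for c in candidates])
      detectors = candidates[keep]

      def is_anomalous(x):
          """A sample is flagged if any surviving detector matches it."""
          return bool((np.linalg.norm(detectors - x, axis=1) < detect_r).any())

      print("normal sample flagged:   ", is_anomalous(np.array([0.52, 0.48, 0.50])))
      print("anomalous sample flagged:", is_anomalous(np.array([0.80, 0.20, 0.80])))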

  16. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelling of the empirical error of perfect forecasts, by streamflow sub-samples of quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples of quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts. The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
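
    The "empirical" dressing idea described above can be caricatured in a few lines of Python: archive the errors of past deterministic forecasts, bin them by streamflow quantile class, and add the stored error quantiles to a new forecast to form an ensemble. The bin counts, distributions, and units are invented and do not reflect EDF's operational setup.

      import numpy as np

      rng = np.random.default_rng(1)
      past_fc = rng.gamma(2.0, 50.0, 5000)             # archived forecasts (m3/s)
      past_err = rng.normal(0, 0.15 * past_fc)         # matching errors (obs - fc)

      # Bin past errors into three flow classes (low / mid / high).
      edges = np.quantile(past_fc, [0.0, 1 / 3, 2 / 3, 1.0])
      cls = np.digitize(past_fc, edges[1:-1])          # class index 0..2
      probs = np.linspace(0.05, 0.95, 19)
      err_q = [np.quantile(past_err[cls == c], probs) for c in range(3)]

      # Dress a new deterministic forecast with the stored error quantiles.
      new_fc = 120.0
      c = int(np.digitize(new_fc, edges[1:-1]))
      ensemble = new_fc + err_q[c]                     # 19-member dressed ensemble
      print(np.round(ensemble, 1))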

  17. Vision-based traffic surveys in urban environments

    NASA Astrophysics Data System (ADS)

    Chen, Zezhi; Ellis, Tim; Velastin, Sergio A.

    2016-09-01

    This paper presents a state-of-the-art vision-based vehicle detection and type classification system for performing traffic surveys from a roadside closed-circuit television camera. Vehicles are detected using background subtraction based on a Gaussian mixture model that can cope with vehicles that become stationary over a significant period of time. Vehicle silhouettes are described using a combination of shape and appearance features using an intensity-based pyramid histogram of orientation gradients (HOG). Classification is performed using a support vector machine, which is trained on a small set of hand-labeled silhouette exemplars. These exemplars are identified using a model-based preclassifier that utilizes calibrated images mapped by Google Earth to provide accurately surveyed scene geometry matched to visible image landmarks. Kalman filters track the vehicles to enable classification by majority voting over several consecutive frames. The system counts vehicles and separates them into four categories: car, van, bus, and motorcycle (including bicycles). Experiments with real-world data have been undertaken to evaluate system performance; a vehicle detection rate of 96.45% and a classification accuracy of 95.70% have been achieved on these data.
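
    A minimal sketch of the background-subtraction stage, using OpenCV's Gaussian mixture model (MOG2); the video path, morphology step, and area threshold are placeholders, and the paper's full pipeline (HOG features, SVM classification, Kalman tracking, Google Earth calibration) is not reproduced.

      import cv2

      cap = cv2.VideoCapture("traffic.mp4")            # hypothetical input clip
      mog2 = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)
      kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          fg = mog2.apply(frame)                       # per-frame foreground mask
          fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)  # suppress speckle
          contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          blobs = [c for c in contours if cv2.contourArea(c) > 400]
          print("candidate vehicle blobs:", len(blobs))
      cap.release()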

  18. Prospects for Gaia and other space-based surveys.

    NASA Astrophysics Data System (ADS)

    Bailer-Jones, Coryn A. L.

    Gaia is a fully-approved all-sky astrometric and photometric survey due for launch in 2011. It will measure accurate parallaxes and proper motions for everything brighter than G=20 (ca. 10^9 stars). Its primary objective is to study the composition, origin and evolution of our Galaxy from the 3D structure, 3D velocities, abundances and ages of its stars. In some respects it can be considered as a cosmological survey at redshift zero. Several other upcoming space-based surveys, in particular JWST and Herschel, will study star and galaxy formation in the early (high-redshift) universe. In this paper I briefly describe these missions, as well as SIM and Jasmine, and explain why they need to observe from space. I then discuss some Galactic science contributions of Gaia concerning dark matter, the search for substructure, stellar populations and the mass-luminosity relation. The Gaia data are complex and require the development of novel analysis methods; here I summarize the principle of the astrometric processing. In the last two sections I outline how the Gaia data can be exploited in connection with other observational and theoretical work in order to build up a more comprehensive picture of galactic evolution.

  19. Accounting protesting and warm glow bidding in Contingent Valuation surveys considering the management of environmental goods--an empirical case study assessing the value of protecting a Natura 2000 wetland area in Greece.

    PubMed

    Grammatikopoulou, Ioanna; Olsen, Søren Bøye

    2013-11-30

    Based on a Contingent Valuation survey aiming to reveal the willingness to pay (WTP) for conservation of a wetland area in Greece, we show how protest and warm glow motives can be taken into account when modeling WTP. In a sample of more than 300 respondents, we find that 54% of the positive bids are rooted to some extent in warm glow reasoning while 29% of the zero bids can be classified as expressions of protest rather than preferences. In previous studies, warm glow bidders are only rarely identified while protesters are typically identified and excluded from further analysis. We test for selection bias associated with simple removal of both protesters and warm glow bidders in our data. Our findings show that removal of warm glow bidders does not significantly distort WTP whereas we find strong evidence of selection bias associated with removal of protesters. We show how to correct for such selection bias by using a sample selection model. In our empirical sample, using the typical approach of removing protesters from the analysis, the value of protecting the wetland is significantly underestimated by as much as 46% unless correcting for selection bias.

  20. A Multi-wavelength Study of Nearby Galaxies Based on Molecular Line Surveys: MIPS Observations

    NASA Astrophysics Data System (ADS)

    Fazio, Giovanni; Wang, Zhong; Bush, Stephanie; Cox, Thomas J.; Keto, Eric; Pahre, Michael; Rosolowsky, Erik; Smith, Howard

    2008-03-01

    Dense molecular gas, warm dust, and hot ionized gas are different components of the multi-step transformation of cold gas into stars and star clusters. While empirical laws on star formation in galaxies have been established based on global measurements of these components, substantial galaxy-to-galaxy variations still exist and remain unexplained. To understand the mechanisms that induce and regulate star formation and thus galaxy evolution, we need to study processes on the local scales of typical star-forming regions and giant molecular clouds. In a set of pilot studies, we analyzed the Spitzer and Galex data of the nearby giant spirals M31, M33 and M99, and compared them with new interferometric CO maps of matching angular resolution. We found evidence that variations in local conditions, environmental effects, and viewing geometry may explain much of the large scatter in the empirical relationships. Based on the success of this initial investigation, we have collected high-resolution CO images of 63 late-type galaxies from several large surveys, and we are working on obtaining a complete set of Spitzer and Galex data for these galaxies. A companion Spitzer archival research program will re-examine the existing observations along with CO, HI, UV and optical data, focusing on correlations in spatially resolved, individual star-forming regions. Here we propose MIPS imaging of the 11 galaxies in our CO sample that have not already been observed by Spitzer. A GO proposal will request IRAC time for these galaxies, which are a significant addition to our study because they substantially increase the fraction of gas-rich late types in the full sample. Insight from this program will be applicable not only to nearby systems, but also to high-redshift galaxies for which only integrated quantities are measurable.

  1. Measuring microscopic evolution processes of complex networks based on empirical data

    NASA Astrophysics Data System (ADS)

    Chi, Liping

    2015-04-01

    Aiming at understanding the microscopic mechanisms of complex systems in the real world, we perform measurements that characterize the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree remains almost unchanged over the whole time range. At each time step the external links attached to a new node are about c = 1.1 and the internal links added between existing nodes are approximately m = 8. The Scientific Collaboration data are a cumulative record of all authors from 1893 up to the considered year. There is no deletion of nodes and links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents of the degree distribution p(k) ∼ k^(-γ) measured on these two empirical data sets, γ_data, are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide an insight into capturing the microscopic dynamical processes that govern the network topology.
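
    For readers who want to reproduce the degree-exponent measurement, the following Python sketch applies the standard maximum-likelihood estimator for a power-law tail to a synthetic degree sequence; the value of k_min and the data are illustrative assumptions, not the paper's.

      import numpy as np

      rng = np.random.default_rng(11)
      # Toy degree sequence drawn from a power law with gamma ~ 2.5 (k_min = 2)
      # by inverse-transform sampling.
      u = rng.uniform(size=10_000)
      degrees = np.floor(2 * (1 - u) ** (-1 / 1.5)).astype(int)

      k_min = 2
      k = degrees[degrees >= k_min]
      # Continuous-approximation MLE for the exponent (Clauset et al. 2009).
      gamma_hat = 1 + k.size / np.log(k / (k_min - 0.5)).sum()
      print(f"estimated gamma: {gamma_hat:.2f}")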

  2. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set

    PubMed Central

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-01-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P = 5.0 × 10^-8, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were P_sig = 3.24 × 10^-8 (AFR), 9.26 × 10^-8 (EUR), 1.83 × 10^-7 (AMR), 1.61 × 10^-7 (EAS) and 9.46 × 10^-8 (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded P_sig = 3.25 × 10^-8 (ALL) and 4.20 × 10^-8 (ΔAFR). Our results indicate that the current threshold (P = 5.0 × 10^-8) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples. PMID:27305981

  3. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set.

    PubMed

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-10-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P = 5.0 × 10^-8, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were P_sig = 3.24 × 10^-8 (AFR), 9.26 × 10^-8 (EUR), 1.83 × 10^-7 (AMR), 1.61 × 10^-7 (EAS) and 9.46 × 10^-8 (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded P_sig = 3.25 × 10^-8 (ALL) and 4.20 × 10^-8 (ΔAFR). Our results indicate that the current threshold (P = 5.0 × 10^-8) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples.
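
    The logic behind such empirical thresholds can be illustrated on a toy scale in Python: simulate many null studies, record each study's minimum p-value, and take the 5th percentile of that distribution as the genome-wide threshold. The simulation below uses independent tests, so it lacks the linkage-disequilibrium structure that makes the 1000 Genomes-based thresholds differ across populations.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(42)
      n_sims, n_tests = 1000, 50_000        # far smaller than a real genome
      min_p = np.empty(n_sims)
      for i in range(n_sims):
          z = rng.standard_normal(n_tests)  # independent null test statistics
          min_p[i] = (2 * norm.sf(np.abs(z))).min()

      # Family-wise error control at 5%: the 5th percentile of the null
      # minimum-p distribution; with independent tests this approaches the
      # Bonferroni value 0.05 / n_tests.
      p_sig = np.quantile(min_p, 0.05)
      print(f"empirical threshold: {p_sig:.2e}; Bonferroni: {0.05 / n_tests:.2e}")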

  4. Agent-Based Distributed Data Mining: A Survey

    NASA Astrophysics Data System (ADS)

    Moemeng, Chayapol; Gorodetsky, Vladimir; Zuo, Ziye; Yang, Yong; Zhang, Chengqi

    Distributed data mining originated from the need to mine over decentralised data sources. Data mining techniques operating in such a complex environment must cope with great dynamics, since changes in the system can affect its overall performance. Agent computing, whose aim is to deal with complex systems, has revealed opportunities to improve distributed data mining systems in a number of ways. This paper surveys the integration of multi-agent systems and distributed data mining, also known as agent-based distributed data mining, in terms of significance, system overview, existing systems, and research trends.

  5. Developing an Internet-based Survey to Collect Program Cost Data

    ERIC Educational Resources Information Center

    Caffray, Christine M.; Chatterji, Pinka

    2009-01-01

    This manuscript describes the development and testing of an Internet-based cost survey that was designed by the authors for the National Assembly on School-Based Health Care (NASBHC) to capture the costs of school-based health programs. The intent of the survey was twofold. First, the survey was designed to collect comprehensive data on costs in a…

  6. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

    Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality when compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, which is a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies of DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP server with GPAC, an MP4Client (GPAC) with an openHEVC-based DASH client, and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.

  7. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  8. University-Based Evaluation Training Programs in the United States 1980-2008: An Empirical Examination

    ERIC Educational Resources Information Center

    LaVelle, John M.; Donaldson, Stewart I.

    2010-01-01

    Evaluation practice has grown in leaps and bounds in recent years. In contrast, the most recent survey data suggest that there has been a sharp decline in the number and strength of preservice evaluation training programs in the United States. In an effort to further understand this curious trend, an alternative methodology was used to examine the…

  9. An automatic electroencephalography blinking artefact detection and removal method based on template matching and ensemble empirical mode decomposition.

    PubMed

    Bizopoulos, Paschalis A; Al-Ani, Tarik; Tsalikakis, Dimitrios G; Tzallas, Alexandros T; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2013-01-01

    Electrooculographic (EOG) artefacts are one of the most common causes of Electroencephalogram (EEG) distortion. In this paper, we propose a method for EOG Blinking Artefacts (BAs) detection and removal from EEG. Normalized Correlation Coefficient (NCC), based on a predetermined BA template library was used for detecting the BA. Ensemble Empirical Mode Decomposition (EEMD) was applied to the contaminated region and a statistical algorithm determined which Intrinsic Mode Functions (IMFs) correspond to the BA. The proposed method was applied in simulated EEG signals, which were contaminated with artificially created EOG BAs, increasing the Signal-to-Error Ratio (SER) of the EEG Contaminated Region (CR) by 35 dB on average.
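
    The detection step (a normalized correlation coefficient between a blink template and a sliding EEG window) can be sketched in Python as follows; the template shape, sampling rate, and 0.6 threshold are invented for illustration, and the EEMD-based removal stage is not shown.

      import numpy as np

      def ncc(x, y):
          """Normalized correlation coefficient of two equal-length windows."""
          x, y = x - x.mean(), y - y.mean()
          denom = np.sqrt((x ** 2).sum() * (y ** 2).sum())
          return float((x * y).sum() / denom) if denom else 0.0

      fs = 250                                        # sampling rate (Hz), assumed
      template = -np.hanning(fs // 2)                 # crude 0.5 s blink-like shape
      rng = np.random.default_rng(3)
      eeg = rng.normal(0, 1.0, 10 * fs)
      eeg[1000:1000 + template.size] += 8 * template  # implant one artefact

      hits = [i for i in range(0, eeg.size - template.size, fs // 10)
              if ncc(eeg[i:i + template.size], template) > 0.6]
      print("windows flagged as blink artefacts start at samples:", hits)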

  10. An Empirical Model-based MOE for Friction Reduction by Slot-Ejected Polymer Solutions in an Aqueous Environment

    DTIC Science & Technology

    2007-12-21

    Date: 21 December 2007. Prepared by: Dr. John G. Pierce, under Contract No. N00014-06-C-0535. Submitted to: Office of Naval Research. Title: An Empirical Model-based MOE for... Approved for public release; distribution...

  11. Empirical rainfall thresholds and copula based IDF curves for shallow landslides and flash floods

    NASA Astrophysics Data System (ADS)

    Bezak, Nejc; Šraj, Mojca; Brilly, Mitja; Mikoš, Matjaž

    2015-04-01

    Large mass movements, like deep-seated landslides or large debris flows, and flash floods can endanger human lives and cause huge environmental and economic damage in hazard areas. The main objective of the study was to investigate the characteristics of selected extreme rainfall events that triggered landslides and caused flash floods in Slovenia in the last 25 years. Seven extreme events, which occurred in Slovenia (Europe) in the period 1990-2014 and caused 17 casualties and about 500 million Euros of economic loss, were analysed in this study. Post-event analyses showed that the rainfall characteristics triggering flash floods and landslides differ: landslides were triggered by longer-duration (up to one or a few weeks) rainfall events, whereas flash floods were triggered by short-duration (a few hours to one or two days) rainfall events. The sensitivity analysis results indicate that the inter-event time variable, defined as the minimum duration of the period without rain between two consecutive rainfall events, and the sample definition methodology can have a significant influence on the position of rainfall events in the intensity-duration space, on the constructed intensity-duration-frequency (IDF) curves, and on the relationship between the empirical rainfall threshold curves and the IDF curves constructed using the copula approach. The empirical rainfall threshold curves (ID curves) were also evaluated for the selected extreme events. The results indicate that a combination of several empirical rainfall thresholds with an appropriately dense rainfall measuring network can be used as part of an early warning system for the initiation of landslides and debris flows. However, different rainfall threshold curves should be used for lowland and mountainous areas in Slovenia. Furthermore, the intensity-duration-frequency (IDF) relationship was constructed using the Frank copula functions for 16 pluviographic meteorological stations in Slovenia using the high resolution
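
    Empirical rainfall thresholds of this kind are commonly fitted as a lower envelope of the form I = a * D^b in log-log space. The Python sketch below fits a central regression to synthetic triggering events and shifts it to a 5% lower envelope; the event data and envelope rule are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(9)
      D = 10 ** rng.uniform(0, 2.3, 80)                      # durations, 1-200 h
      I = 20 * D ** -0.6 * 10 ** rng.normal(0.15, 0.12, 80)  # triggering events (mm/h)

      logD, logI = np.log10(D), np.log10(I)
      b, log_a = np.polyfit(logD, logI, 1)                   # central fit in log space
      resid = logI - (b * logD + log_a)
      log_a_env = log_a + np.quantile(resid, 0.05)           # shift to 5% lower envelope
      print(f"threshold: I = {10 ** log_a_env:.1f} * D^{b:.2f}")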

  12. Ab initio based empirical potential used to study the mechanical properties of molybdenum

    NASA Astrophysics Data System (ADS)

    Park, Hyoungki; Fellinger, Michael R.; Lenosky, Thomas J.; Tipton, William W.; Trinkle, Dallas R.; Rudin, Sven P.; Woodward, Christopher; Wilkins, John W.; Hennig, Richard G.

    2012-06-01

    Density-functional theory energies, forces, and elastic constants determine the parametrization of an empirical, modified embedded-atom method potential for molybdenum. The accuracy and transferability of the potential are verified by comparison to experimental and density-functional data for point defects, phonons, thermal expansion, surface and stacking fault energies, and ideal shear strength. Searching the energy landscape predicted by the potential using a genetic algorithm verifies that it reproduces not only the correct bcc ground state of molybdenum but also all low-energy metastable phases. The potential is also applicable to the study of plastic deformation and used to compute energies, core structures, and Peierls stresses of screw and edge dislocations.

  13. Stability evaluation of short-circuiting gas metal arc welding based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Wang, Kehong; Zhou, Zhilan; Zhou, Xiaoxiao; Fang, Jimi

    2017-03-01

    The arc of gas metal arc welding (GMAW) contains abundant information about its stability and droplet transition, which can be effectively characterized by extracting the arc electrical signals. In this study, ensemble empirical mode decomposition (EEMD) was used to evaluate the stability of electrical current signals. The welding electrical signals were first decomposed by EEMD, and then transformed to a Hilbert–Huang spectrum and a marginal spectrum. The marginal spectrum is an approximate distribution of amplitude with frequency of signals, and can be described by a marginal index. Analysis of various welding process parameters showed that the marginal index of current signals increased when the welding process was more stable, and vice versa. Thus EEMD combined with the marginal index can effectively uncover the stability and droplet transition of GMAW.
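
    A rough Python sketch of computing a Hilbert marginal spectrum from EEMD modes, the quantity behind the marginal index discussed above; it assumes the PyEMD package, and the index definition used here (amplitude accumulated over frequency bins, normalized by record length) is a stand-in for the authors' formulation.

      import numpy as np
      from scipy.signal import hilbert
      from PyEMD import EEMD   # pip install EMD-signal (assumed available)

      fs = 5000
      t = np.arange(0, 0.5, 1.0 / fs)
      current = np.sin(2 * np.pi * 100 * t) + 0.3 * np.random.randn(t.size)

      imfs = EEMD(trials=20)(current)
      freqs, amps = [], []
      for imf in imfs:
          analytic = hilbert(imf)
          phase = np.unwrap(np.angle(analytic))
          freqs.append(np.diff(phase) * fs / (2 * np.pi))    # instantaneous frequency
          amps.append(np.abs(analytic[:-1]))                 # instantaneous amplitude

      # Marginal spectrum: amplitude accumulated over time per frequency bin.
      hist, _ = np.histogram(np.concatenate(freqs), bins=100, range=(0, fs / 2),
                             weights=np.concatenate(amps))
      marginal_index = hist.sum() / t.size                   # crude stability indicator
      print(f"marginal index: {marginal_index:.3f}")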

  14. Bearing fault detection based on hybrid ensemble detector and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Georgoulas, George; Loutas, Theodore; Stylios, Chrysostomos D.; Kostopoulos, Vassilis

    2013-12-01

    Aiming at more efficient fault diagnosis, this research work presents an integrated anomaly detection approach for seeded bearing faults. Vibration signals from normal bearings and bearings with three different fault locations, as well as different fault sizes and loading conditions are examined. The Empirical Mode Decomposition and the Hilbert Huang transform are employed for the extraction of a compact feature set. Then, a hybrid ensemble detector is trained using data coming only from the normal bearings and it is successfully applied for the detection of any deviation from the normal condition. The results prove the potential use of the proposed scheme as a first stage of an alarm signalling system for the detection of bearing faults irrespective of their loading condition.
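
    A toy version of the detection stage might look like the following Python sketch: simple IMF energy features are extracted with PyEMD and a detector is trained on healthy records only. IsolationForest stands in for the paper's hybrid ensemble detector, an assumption made purely for brevity.

      import numpy as np
      from PyEMD import EMD                       # assumed available
      from sklearn.ensemble import IsolationForest

      def imf_energy_features(signal, n_modes=4):
          """Energies of the first few IMFs, zero-padded to a fixed length."""
          imfs = EMD()(signal)[:n_modes]
          e = [float((imf ** 2).sum()) for imf in imfs]
          return e + [0.0] * (n_modes - len(e))

      rng = np.random.default_rng(7)
      t = np.arange(0, 1.0, 1e-3)
      healthy = [np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(t.size)
                 for _ in range(20)]
      faulty = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 157 * t)

      X_train = np.array([imf_energy_features(s) for s in healthy])
      det = IsolationForest(random_state=0).fit(X_train)   # trained on normal only
      flagged = det.predict([imf_energy_features(faulty)])[0] == -1
      print("faulty sample flagged:", flagged)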

  15. Gold price analysis based on ensemble empirical model decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has received growing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Model Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology which combines the two methods and applies it to gold price analysis. This includes three steps: firstly, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Secondly, the IMFs are further processed, with unimportant ones re-grouped, and a new set of data called Virtual Intrinsic Mode Functions (VIMFs) is reconstructed. Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail, according to their change trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also conduct an in-depth analysis of how these factors affect the gold price. At the same time, regression analysis has been conducted to verify our analysis. Results from the empirical studies in the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.

  16. [Utilization of CAP Survey, Based on Questionnaire Results from Survey Participants].

    PubMed

    Hirano, Akiko; Ohno, Hiroie

    2015-08-01

    The survey provided by the College of American Pathologists (CAP) is one of the proficiency testing programs chosen in Japan, and the number of participating facilities has recently increased. CAP provides 754 programs, and more than 1,000 tests were offered in 2014. In Japan, materials selected from the CAP surveys are translated as the "CAP global inter-laboratory comparison program" under the instruction of the Japanese Society of Laboratory Medicine (JSLM), and 68 programs and 261 items are provided. The total number of participating facilities was 174. While recognition of the CAP survey as a proficiency test has increased, CAP itself and the other services it provides are not well known. Based on the results of a questionnaire survey conducted in 2014, the question "What are CAP and the CAP survey?" was analyzed, and the advantages of the CAP survey and how to utilize it were considered. The questionnaire, covering 53 questions about satisfaction level, intended use, and suggested improvements, was distributed to Japanese participants in 2014, and 80 replies were analyzed. The results showed that most CAP survey participants are satisfied and intend to use the CAP survey mainly for their quality control. Furthermore, because the survey involves several shipments a year and several specimens per mailing, participants can continuously monitor their systems throughout all testing phases, which helps improve laboratory performance. The Evaluation and Participant Summary (PSR) also effectively improves laboratory performance. CAP-accredited laboratories are required to participate in all survey programs covering the test menu they provide. They have therefore become accustomed to reviewing the evaluation and performing self-evaluation, with a high usage rate of the Evaluation and PSR of the CAP survey. The questionnaire proved that performing the CAP survey properly enhanced the laboratories' quality control, and this meets the

  17. Basewide environmental baseline survey, Reese Air Force Base, Texas

    SciTech Connect

    1996-11-26

    This Environmental Baseline Survey (EBS) has been prepared to document the environmental condition of real property at Reese Air Force Base (AFB), Texas, resulting from the storage, release, and disposal of hazardous substances and petroleum products and their derivatives over the installation's history. Although primarily a management tool, this EBS is also used by the Air Force to meet its obligations under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), 42 U.S. Code Section 9620(h), as amended by the Community Environmental Response Facilitation Act (CERFA) (Public Law 102-426). Table ES-1 lists all Category 1 uncontaminated property associated with Reese AFB, based on information obtained through a records search, interviews, and visual inspections at Reese AFB, and Figures ES-1a and ES-1b depict their locations.

  18. Sensor Systems Based on FPGAs and Their Applications: A Survey

    PubMed Central

    de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah

    2012-01-01

    In this manuscript, we present a survey of designs and implementations of research sensor nodes that rely on FPGAs, either based upon standalone platforms or as a combination of microcontroller and FPGA. Several current challenges in sensor networks are distinguished and linked to the features of modern FPGAs. As it turns out, low-power optimized FPGAs are able to enhance the computation of several types of algorithms in terms of speed and power consumption in comparison to microcontrollers of commercial sensor nodes. We show that architectures based on the combination of microcontrollers and FPGA can play a key role in the future of sensor networks, in fields where processing capabilities such as strong cryptography, self-testing and data compression, among others, are paramount.

  19. Fault location based on synchronized measurements: a comprehensive survey.

    PubMed

    Al-Mohammed, A H; Abido, M A

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research.

  20. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    PubMed Central

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191
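
    As a worked example of the two-end synchronized algorithms both records survey: with synchronized phasors at the sending (s) and receiving (r) ends of a line of length L and per-kilometre impedance Z, the fault voltage seen from either end must agree, Vs - d*Z*Is = Vr - (L - d)*Z*Ir, which is solved for the fault distance d below. All numerical values are invented but mutually consistent.

      # Two-end synchronized fault location on a 100 km line: solve
      # Vs - d*Z*Is = Vr - (L - d)*Z*Ir for the fault distance d.
      L = 100.0                                 # line length (km)
      Z = 0.05 + 0.35j                          # series impedance (ohm/km)
      Vs, Is = 62682.5 + 4902.5j, 400 - 150j    # sending-end phasors (V, A)
      Vr, Ir = 56566.5 - 6709.5j, -320 + 110j   # receiving-end phasors (V, A)

      d = (Vs - Vr + L * Z * Ir) / (Z * (Is + Ir))
      print(f"estimated fault distance: {d.real:.1f} km from the sending end")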

  1. RN's experiences of sex-based and sexual harassment--an empirical study.

    PubMed

    Madison, J

    1997-01-01

    A survey of 317 registered nurses enrolled in tertiary post-registration courses found that two thirds of the 197 respondents had encountered sexual harassment in the workplace. A quarter of these nurses identified medical officers, and 22.1% identified co-workers, as their harassers. This paper identifies the harassing behaviours the respondents experienced, their responses to the behaviour, and the effects the harassment had on them.

  2. Empirical Methods for Predicting Eutrophication in Impoundments. Report 1. Phase I. Data Base Development.

    DTIC Science & Technology

    1981-05-01

    estimating volume and area variations with elevation, required for volume-averaging of water quality data and for calculating material loadings... in the EPA National Eutrophication Survey Compendium... Distributions of Volume and Area Slope Parameters... White River System... the morphometric profiles have been tested by comparing reported volumes at any elevation with the integral of reported areas with respect to depth

  3. A Compound fault diagnosis for rolling bearings method based on blind source separation and ensemble empirical mode decomposition.

    PubMed

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong confusion noise, which makes it difficult to separate weak fault signals by conventional means such as FFT-based envelope detection, wavelet transform, or empirical mode decomposition used individually. In order to improve the diagnosis of compound faults in rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix of ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in separating compound faults, which works not only for the outer race defect, but also for the roller defect and the unbalance fault of the experimental system.
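
    A condensed Python sketch of the EEMD-plus-ICA pipeline described above, applied to a two-tone synthetic "compound fault" signal; it assumes the PyEMD and scikit-learn packages, and the cross-correlation selection rule (threshold 0.2) is a simplification of the paper's criterion.

      import numpy as np
      from PyEMD import EEMD                    # assumed available
      from sklearn.decomposition import FastICA

      fs = 1000
      t = np.arange(0, 1.0, 1.0 / fs)
      mixed = (np.sin(2 * np.pi * 37 * t) + np.sin(2 * np.pi * 140 * t)
               + 0.2 * np.random.randn(t.size))

      imfs = EEMD(trials=20)(mixed)             # decompose into IMFs

      # Keep IMFs that correlate with the measured signal (cross-correlation rule).
      corr = np.array([abs(np.corrcoef(imf, mixed)[0, 1]) for imf in imfs])
      selected = imfs[corr > 0.2]

      ica = FastICA(n_components=2, random_state=0)
      sources = ica.fit_transform(selected.T)   # columns: separated components
      print("separated source matrix shape:", sources.shape)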

  4. Development of the Knowledge-based & Empirical Combined Scoring Algorithm (KECSA) to Score Protein-Ligand Interactions

    PubMed Central

    Zheng, Zheng

    2013-01-01

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the PDB. Forty-nine types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training set and the enthalpy/entropy aspects of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets to compare the reproducibility of KECSA with respect to two empirical scoring functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the RMSE. LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed which would make KECSA more sensitive to subtle changes in ligand structure. PMID:23560465
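
    The record does not give KECSA's equations, so the following is only a generic illustration of the underlying idea of relating a statistical (inverse-Boltzmann) potential to an LJ form. The histograms, the reference state (KECSA's actual novelty) and the kT value are all inputs or assumptions supplied by the caller, not reconstructions of the authors' method.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lj(r, epsilon, sigma):
        """12-6 Lennard-Jones form used to parameterize a pair potential."""
        return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

    def statistical_to_lj(observed_hist, reference_hist, r_centers, kT=0.593):
        """Inverse-Boltzmann potential from pair-distance histograms, then an LJ fit.

        kT defaults to ~0.593 kcal/mol (298 K); r_centers must exclude r = 0.
        """
        g = observed_hist / np.maximum(reference_hist, 1e-12)
        u = -kT * np.log(np.maximum(g, 1e-12))          # statistical potential U(r)
        (eps, sig), _ = curve_fit(lj, r_centers, u, p0=(0.1, 3.5), maxfev=10000)
        return u, eps, sig
    ```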

  5. Relevant modes selection method based on Spearman correlation coefficient for laser signal denoising using empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Duan, Yabo; Song, Chengtian

    2016-12-01

    Empirical mode decomposition (EMD) is a recently proposed method for denoising nonlinear and nonstationary laser signals. A noisy signal is broken down using EMD into oscillatory components that are called intrinsic mode functions (IMFs). Thresholding-based denoising and correlation-based partial reconstruction of IMFs are the two main research directions for EMD-based denoising. Similar to other decomposition-based denoising approaches, EMD-based denoising methods require a reliable threshold to determine which IMFs are noise components and which are noise-free components. In this work, we propose a new approach in which each IMF is first denoised using EMD interval thresholding (EMD-IT), and then a robust thresholding process based on the Spearman correlation coefficient is used to select the relevant modes. The proposed method thus couples thresholding-based denoising with partial reconstruction of the relevant IMFs. Other traditional denoising methods, including correlation-based EMD partial reconstruction (EMD-Correlation) and discrete Fourier transform and wavelet-based methods, are investigated to provide a comparison with the proposed technique. Simulation and test results demonstrate the superior performance of the proposed method when compared with the other methods.
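
    The mode-selection step lends itself to a compact sketch: compute the Spearman correlation of each (already EMD-IT-denoised) IMF against the noisy signal and partially reconstruct from the modes above a threshold. The threshold value below is illustrative; the paper's rule for choosing it is not given in the record.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def select_relevant_imfs(imfs, noisy_signal, threshold=0.2):
        """Partial reconstruction from IMFs that pass a Spearman-correlation test."""
        rhos = [spearmanr(imf, noisy_signal).correlation for imf in imfs]
        relevant = [imf for imf, rho in zip(imfs, rhos) if abs(rho) >= threshold]
        return np.sum(relevant, axis=0), rhos   # denoised estimate, per-mode scores
    ```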

  6. Empirical application of empathy enhancing program based on movement concept for married couples in conflict

    PubMed Central

    Kim, Soo-Yeon; Kang, Hye-Won; Chung, Yong-Chul; Park, Seungha

    2013-01-01

    In the field of marital therapy, it is known that a couple movement program helps married couples faced with conflict situations rebuild their relationship and maintain family homeostasis. The purpose of this study was to configure and apply a kinesthetic empathy program and to assess its effectiveness for married couples in conflict. To achieve the research aims, a qualitative research method was used, with three couples (six people) participating in an expressive movement program for this study. The study used the focus-group interview method for collecting data, mixing semi-structured and unstructured questionnaires. The results were as follows. First, through the kinesthetic empathy enhancing program, participants could develop self-awareness and emotional attunement. Second, the results showed a relationship between intention and empathy: “knowing the spouse’s hidden intention” was a significant factor in understanding others. Third, the kinesthetic empathy program could complement general marriage counseling programs. The results of this study provide empirical evidence that a movement program functions as an empathy enhancer through the process of perceiving, feeling, thinking, and interacting with others. PMID:24278896

  7. Spectral analysis of Hall-effect thruster plasma oscillations based on the empirical mode decomposition

    SciTech Connect

    Kurzyna, J.; Mazouffre, S.; Lazurenko, A.; Albarede, L.; Bonhomme, G.; Makowski, K.; Dudeck, M.; Peradzynski, Z.

    2005-12-15

    Hall-effect thruster plasma oscillations recorded by means of probes located at the channel exit are analyzed using the empirical mode decomposition (EMD) method. This self-adaptive technique decomposes a nonstationary signal into a set of intrinsic modes, and acts as a very efficient filter that separates the contributions of different underlying physical mechanisms. Applying the Hilbert transform to the whole set of modes makes it possible to identify peculiar events and to assign them a range of instantaneous frequency and power. In addition to 25 kHz breathing-type oscillations, which are unambiguously identified, the EMD approach confirms the existence of oscillations with instantaneous frequencies in the range of 100-500 kHz, typical for ion transit-time oscillations. Modeling of high-frequency modes (ν ≈ 10 MHz) resulting from EMD of measured wave forms supports the idea that high-frequency plasma oscillations originate from electron-density perturbations propagating azimuthally with the electron drift velocity.
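
    The Hilbert step of this analysis (the EMD itself aside) is standard and easy to reproduce. A small sketch with scipy, where fs is whatever sampling rate the probe acquisition used:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_frequency(imf, fs):
        """Instantaneous frequency (Hz) of one intrinsic mode via the Hilbert transform."""
        analytic = hilbert(imf)                      # analytic signal: imf + i * H[imf]
        phase = np.unwrap(np.angle(analytic))        # unwrapped instantaneous phase
        return np.diff(phase) * fs / (2.0 * np.pi)   # phase derivative -> frequency

    def instantaneous_power(imf):
        """Squared envelope of the analytic signal, per sample."""
        return np.abs(hilbert(imf)) ** 2
    ```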

  8. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so, it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises comprise multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected, including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  9. Cardiopulmonary Resuscitation Pattern Evaluation Based on Ensemble Empirical Mode Decomposition Filter via Nonlinear Approaches

    PubMed Central

    Ma, Matthew Huei-Ming

    2016-01-01

    Good-quality cardiopulmonary resuscitation (CPR) is the mainstay of treatment for managing patients with out-of-hospital cardiac arrest (OHCA). Assessment of the quality of the CPR delivered is now possible through the electrocardiography (ECG) signal that can be collected by an automated external defibrillator (AED). This study evaluates a nonlinear approximation of the CPR given to asystole patients. The raw ECG signal is filtered using ensemble empirical mode decomposition (EEMD), and the CPR-related intrinsic mode functions (IMFs) are chosen for evaluation. In addition, sample entropy (SE), complexity index (CI), and detrended fluctuation analysis (DFA) are collated, and statistical analysis is performed using ANOVA. The primary outcome measure assessed is the patient survival rate after two hours. The CPR patterns of 951 asystole patients were analyzed for quality of CPR delivered. There was no significant difference observed in the CPR-related IMF peak-to-peak interval analysis for patients younger or older than 60 years of age, and similarly for the amplitude-difference evaluation with SE and DFA. However, a difference was noted for the CI (p < 0.05). The results show that patients younger than 60 years have a higher survival rate with high complexity of the CPR-IMF amplitude differences. PMID:27529068
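
    Of the three nonlinear measures named, sample entropy is the most self-contained, so a brief reference-style implementation follows. The m and r choices are conventional defaults rather than values reported in the record, and production code analyzing 951 records would use an optimized variant.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.2):
        """SampEn(m, r) of a 1-D series, with tolerance r = r_factor * std(x)."""
        x = np.asarray(x, dtype=float)
        r = r_factor * np.std(x)

        def match_pairs(length):
            # All templates of the given length, compared pairwise (self-matches excluded).
            t = np.array([x[i:i + length] for i in range(len(x) - length)])
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)   # Chebyshev distance
            return (np.sum(d <= r) - len(t)) / 2.0

        b, a = match_pairs(m), match_pairs(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf
    ```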

  10. Ticking the boxes: a survey of workplace-based assessments

    PubMed Central

    Gilberthorpe, Thomas; Sarfo, Maame Duku; Lawrence-Smith, Geoff

    2016-01-01

    Aims and method To survey the quality of workplace-based assessments (WPBAs) through retrospective analysis of completed WPBA forms against training targets derived from the Royal College of Psychiatrists' Portfolio Online. Results Almost a third of the assessments analysed showed no divergence in assessment scores across the varied assessment domains, and there was poor correlation between domain scores and the nature of comments provided by assessors. Of the assessments that suggested action points, only half were considered to be sufficiently ‘specific’ and ‘achievable’ to be useful for trainees' learning. Clinical implications WPBA is not currently being utilised to its full potential as a formative assessment tool, and more widespread audit is needed to establish whether this is a local or a national issue. PMID:27087994

  11. An all-atom structure-based potential for proteins: bridging minimal models with all-atom empirical forcefields.

    PubMed

    Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N

    2009-05-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Go) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a C(alpha) structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a C(alpha) model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function.

  12. An All-atom Structure-Based Potential for Proteins: Bridging Minimal Models with All-atom Empirical Forcefields

    PubMed Central

    Whitford, Paul C.; Noel, Jeffrey K.; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y.; Onuchic, José N.

    2012-01-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Gō) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature; (2) folding mechanisms are robust to variations of the energetic parameters; (3) protein folding free energy barriers can be manipulated through parametric modifications; (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model; and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Since this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function. PMID:18837035

  13. Racism, health status, and birth outcomes: results of a participatory community-based intervention and health survey.

    PubMed

    Carty, Denise C; Kruger, Daniel J; Turner, Tonya M; Campbell, Bettina; DeLoney, E Hill; Lewis, E Yvonne

    2011-02-01

    Many community-based participatory research (CBPR) partnerships address social determinants of health as a central consideration. However, research studies that explicitly address racism are scarce in the CBPR literature, and there is a dearth of available community-generated data to empirically examine how racism influences health disparities at the local level. In this paper, we provide results of a cross-sectional, population-based health survey conducted in the urban areas of Genesee and Saginaw Counties in Michigan to assess how a sustained community intervention to reduce racism and infant mortality influenced knowledge, beliefs, and experiences of racism and to explore how perceived racism is associated with self-rated health and birth outcomes. We used ANOVA and regression models to compare the responses of intervention participants and non-participants as well as African Americans and European Americans (N = 629). We found that intervention participants reported greater acknowledgment of the enduring and differential impact of racism in comparison to the non-intervention participants. Moreover, survey analyses revealed that racism was associated with health in the following ways: (1) experiences of racial discrimination predicted self-rated physical health, mental health, and smoking status; (2) perceived racism against one's racial group predicted lower self-rated physical health; and (3) emotional responses to racism-related experiences were marginally associated with lower birth-weight births in the study sample. Our study bolsters the published findings on perceived racism and health outcomes and highlights the usefulness of CBPR and community surveys to empirically investigate racism as a social determinant of health.

  14. An Empirical Study of Neural Network-Based Audience Response Technology in a Human Anatomy Course for Pharmacy Students.

    PubMed

    Fernández-Alemán, José Luis; López-González, Laura; González-Sequeros, Ofelia; Jayne, Chrisina; López-Jiménez, Juan José; Carrillo-de-Gea, Juan Manuel; Toval, Ambrosio

    2016-04-01

    This paper presents an empirical study of a formative neural network-based assessment approach that uses mobile technology to provide pharmacy students with intelligent diagnostic feedback. An unsupervised learning algorithm was integrated with an audience response system called SIDRA in order to generate states that collect some commonality in responses to questions and to add diagnostic feedback for guided learning. A total of 89 pharmacy students enrolled on a Human Anatomy course were taught using two different teaching methods. Forty-four students employed intelligent SIDRA (i-SIDRA), whereas 45 students received the same training but without using i-SIDRA. A statistically significant difference was found between the experimental group (i-SIDRA) and the control group (traditional learning methodology), with t(87) = 6.598, p < 0.001. In four MCQ tests, the difference between the number of correct answers in the first attempt and in the last attempt was also studied. A global effect size of 0.644 was achieved in the meta-analysis carried out. The students expressed satisfaction with the content provided by i-SIDRA and the methodology used during the process of learning anatomy (M = 4.59). The new empirical contribution presented in this paper allows instructors to perform post hoc analyses of each particular student's progress to ensure appropriate training.

  15. Towards a critical evaluation of an empirical and volume-based solvation function for ligand docking

    PubMed Central

    Muniz, Heloisa S.

    2017-01-01

    Molecular docking is an important tool for the discovery of new biologically active molecules, given that the receptor structure is known. The rapid growth in the number of proteins with known structure is providing an excellent environment for the development of new methods and the improvement of current ones. The evaluation of solvation energies stands out among the challenges for the modeling of receptor-ligand interactions, especially in the context of molecular docking, where a fast yet accurate evaluation ought to be achieved. Here we evaluated a variation of the desolvation energy model proposed by Stouten (Stouten P.F.W. et al, Molecular Simulation, 1993, 10: 97-120), or SV model. The SV model showed a linear correlation with experimentally determined solvation energies, as available in the database FreeSolv. However, when used in retrospective docking simulations using the benchmarks DUD, charge-matched DUD and DUD-Enhanced, the SV model resulted in poorer enrichments when compared to a pure force field model with no correction for solvation effects. The data provided here are consistent with other empirical solvation models employed in the context of molecular docking and indicate that a good model to account for solvent effects is still a goal to achieve. On the other hand, despite the inability to improve the enrichment of retrospective simulations, the SV solvation model showed an interesting ability to reduce the number of molecules with net charge -2 and -3 e among the top-scored molecules in a prospective test. PMID:28323889

  16. Changing Healthcare Providers’ Behavior during Pediatric Inductions with an Empirically-based Intervention

    PubMed Central

    Martin, Sarah R.; Chorney, Jill MacLaren; Tan, Edwin T.; Fortier, Michelle A.; Blount, Ronald L.; Wald, Samuel H.; Shapiro, Nina L.; Strom, Suzanne L.; Patel, Swati; Kain, Zeev N.

    2011-01-01

    Background Each year over 4 million children experience significant levels of preoperative anxiety, which has been linked to poor recovery outcomes. Healthcare providers (HCPs) and parents represent key resources for children to help them manage their preoperative anxiety. The present study reports on the development and preliminary feasibility testing of a new intervention designed to change HCP and parent perioperative behaviors that have previously been reported to be associated with children’s coping and stress behaviors before surgery. Methods An empirically derived intervention, Provider-Tailored Intervention for Perioperative Stress, was developed to train HCPs to increase behaviors that promote children’s coping and decrease behaviors that may exacerbate children’s distress. Rates of HCP behaviors were coded and compared between pre-intervention and post-intervention. Additionally, rates of parents’ behaviors were compared between those who interacted with HCPs before training and those interacting with HCPs post-intervention. Results Effect sizes indicated that HCPs who underwent training demonstrated increases in rates of desired behaviors (range: 0.22 to 1.49) and decreases in rates of undesired behaviors (range: 0.15 to 2.15). Additionally, parents, who were indirectly trained, also demonstrated changes in their rates of desired (range: 0.30 to 0.60) and undesired behaviors (range: 0.16 to 0.61). Conclusions The intervention successfully modified HCP and parent behaviors. It represents a potentially new clinical way to decrease anxiety in children. A multi-site randomized controlled trial, recently funded by the National Institute of Child Health and Human Development, is about to start and will examine the efficacy of this intervention in reducing children’s preoperative anxiety and improving their postoperative recovery. PMID:21606826

  17. Combining Empirical Relationships with Data Based Mechanistic Modeling to Inform Solute Tracer Investigations across Stream Orders

    NASA Astrophysics Data System (ADS)

    Herrington, C.; Gonzalez-Pinzon, R.; Covino, T. P.; Mortensen, J.

    2015-12-01

    Solute transport studies in streams and rivers often begin with the introduction of conservative and reactive tracers into the water column. Information on the transport of these substances is then captured within tracer breakthrough curves (BTCs) and used to estimate, for instance, travel times and dissolved nutrient and carbon dynamics. Traditionally, these investigations have been limited to systems with small discharges (< 200 L/s) and small reach lengths (< 500 m), partly due to the need for a priori information on the reach's hydraulic characteristics (e.g., channel geometry, resistance and dispersion coefficients) to predict arrival times, times to peak concentrations of the solute, and mean travel times. Current techniques to acquire these channel characteristics through preliminary tracer injections become cost prohibitive at higher stream orders, and the use of semi-continuous water quality sensors for collecting real-time information may be affected by erroneous readings that are masked by high turbidity (e.g., nitrate signals with SUNA instruments or fluorescence measures) and/or high total dissolved solids (e.g., making the use of salt tracers such as NaCl prohibitively expensive) in larger systems. Additionally, a successful time-of-travel study is valuable for only a single discharge and river stage. We have developed a method to predict tracer BTCs to inform sampling frequencies at small and large stream orders using empirical relationships developed from multiple tracer injections spanning several orders of magnitude in discharge and reach length. This method was successfully tested in 1st to 8th order systems along the Middle Rio Grande River Basin in New Mexico, USA.

  18. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    PubMed

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain sizes, with and without defects. The influence of probe frequency and the data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat-bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved as compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for successful imaging of defects in a B-scan.

  19. Holding-based network of nations based on listed energy companies: An empirical study on two-mode affiliation network of two sets of actors

    NASA Astrophysics Data System (ADS)

    Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili

    2016-05-01

    Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.
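
    The construction described here, from a two-mode shareholder-company affiliation network to a one-mode company network, corresponds to a weighted bipartite projection. A toy sketch with networkx follows; the node names and edges are invented for illustration, and the paper's further aggregation to nations is omitted.

    ```python
    import networkx as nx
    from networkx.algorithms import bipartite

    # Two-mode affiliation network: shareholders (set 0) linked to listed companies (set 1).
    B = nx.Graph()
    B.add_nodes_from(["ShareholderA", "ShareholderB", "ShareholderC"], bipartite=0)
    B.add_nodes_from(["Company1", "Company2"], bipartite=1)
    B.add_edges_from([("ShareholderA", "Company1"), ("ShareholderB", "Company1"),
                      ("ShareholderB", "Company2"), ("ShareholderC", "Company2")])

    # One-mode derivative network: companies connected when they share a shareholder,
    # with edge weight = number of common shareholders.
    companies = {n for n, d in B.nodes(data=True) if d["bipartite"] == 1}
    G = bipartite.weighted_projected_graph(B, companies)
    print(list(G.edges(data=True)))   # [('Company1', 'Company2', {'weight': 1})]
    ```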

  20. Data-based empirical model reduction as an approach to data mining

    NASA Astrophysics Data System (ADS)

    Ghil, M.

    2012-12-01

    Science is very much about finding order in chaos, patterns in oodles of data, signal in noise, and so on. One can see any scientific description as a model of the data, whether verbal, statistical or dynamical. In this talk, I will provide an approach to such descriptions that relies on constructing nonlinear, stochastically forced models, via empirical model reduction (EMR). EMR constructs a low-order nonlinear system of prognostic equations driven by stochastic forcing; it estimates both the dynamical operator and the properties of the driving noise directly from observations or from a high-order model's simulation. The multi-level EMR structure for modeling the stochastic forcing allows one to capture feedback between high- and low-frequency components of the variability, thus parameterizing the "fast scales," often referred to as the "noise," in terms of the memory of the "slow" scales, referred to as the "signal." EMR models have been shown to capture quite well features of the high-dimensional data sets involved, in the frequency domain as well as in the spatial domain. Illustrative examples will involve correctly capturing patterns in data sets that are either purely observational or generated by high-end models. They will be selected from intraseasonal variability of the mid-latitude atmosphere, seasonal-to-interannual variability of the sea surface temperature field, and air-sea interaction in the Southern Ocean. The work described in this talk is joint with M.D. Chekroun, D. Kondrashov, S. Kravtsov, and A.W. Robertson. Recent results on using a modified and improved form of EMR modeling for predictive purposes will be provided in a separate talk by D. Kondrashov, M. Chekroun and M. Ghil on "Data-Driven Model Reduction and Climate Prediction: Nonlinear Stochastic, Energy-Conserving Models With Memory Effects."
    [Figure: detailed budget of mean phase-space tendencies for the plane spanned by EOFs 1 and 4 of an intermediate-complexity model of mid-latitude flow.]
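
    As a loose illustration of the multi-level idea, each level regresses tendencies on the current state and passes its residual down to the next level. The real EMR includes quadratic terms and couples the levels into one stochastic model, so the linear sketch below, on toy data, is only the skeleton:

    ```python
    import numpy as np

    def fit_emr_level(state):
        """One EMR regression level: tendencies ~ linear map of the state (least squares)."""
        dx = np.diff(state, axis=0)                            # tendencies
        A, *_ = np.linalg.lstsq(state[:-1], dx, rcond=None)    # dynamical operator
        residual = dx - state[:-1] @ A                         # unresolved 'fast' part
        return A, residual

    # Main level models the observed (slow) variables; the next level models the
    # residual's own evolution, giving the stochastic forcing memory of the signal.
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.standard_normal((500, 3)), axis=0)       # toy 3-variable series
    A0, r0 = fit_emr_level(x)
    A1, r1 = fit_emr_level(r0)
    ```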

  1. Meta-Analysis of Group Learning Activities: Empirically Based Teaching Recommendations

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Foels, Rob

    2012-01-01

    Teaching researchers commonly employ group-based collaborative learning approaches in Teaching of Psychology teaching activities. However, the authors know relatively little about the effectiveness of group-based activities in relation to known psychological processes associated with group dynamics. Therefore, the authors conducted a meta-analytic…

  2. Formula-Based Public School Funding System in Victoria: An Empirical Analysis of Equity

    ERIC Educational Resources Information Center

    Bandaranayake, Bandara

    2013-01-01

    This article explores the formula-based school funding system in the state of Victoria, Australia, where state funds are directly allocated to schools based on a range of equity measures. The impact of Victoria's funding system for education in terms of alleviating inequality and disadvantage is contentious, to say the least. It is difficult to…

  3. Probabilistic Algorithms, Integration, and Empirical Evaluation for Disambiguating Multiple Selections in Frustum-Based Pointing

    DTIC Science & Technology

    2006-06-01

    …generated and is used for processing selections. Kolsch et al. [11] developed a real-time hand gesture recognition system that can act as the sole…

  4. An Empirical Analysis of the Antecedents of Web-Based Learning Continuance

    ERIC Educational Resources Information Center

    Chiu, Chao-Min; Sun, Szu-Yuan; Sun, Pei-Chen; Ju, Teresa L.

    2007-01-01

    Like any other product, service and Web-based application, the success of Web-based learning depends largely on learners' satisfaction and other factors that will eventually increase learners' intention to continue using it. This paper integrates the concept of subjective task value and fairness theory to construct a model for investigating the…

  5. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence

    ERIC Educational Resources Information Center

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the…

  6. Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.

    ERIC Educational Resources Information Center

    McCarthy, John C.; And Others

    1993-01-01

    Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…

  7. An Empirical Study of Instructor Adoption of Web-Based Learning Systems

    ERIC Educational Resources Information Center

    Wang, Wei-Tsong; Wang, Chun-Chieh

    2009-01-01

    For years, web-based learning systems have been widely employed in both educational and non-educational institutions. Although web-based learning systems are emerging as a useful tool for facilitating teaching and learning activities, the number of users is not increasing as fast as expected. This study develops an integrated model of instructor…

  8. [Surveying a zoological facility through satellite-based geodesy].

    PubMed

    Böer, M; Thien, W; Tölke, D

    2000-06-01

    In the course of a thesis submitted for a diploma degree at the Fachhochschule Oldenburg, the Serengeti Safaripark was surveyed in autumn and winter 1996/97, laying the planning foundations for the application for licences from the controlling authorities. Taking into consideration the special way of keeping animals in the Serengeti Safaripark (game ranching, spacious walk-through facilities), the intention was to employ satellite-based geodesy. This technology relies on special aerials receiving signals from the 24 satellites which circle the globe. These data are gathered and examined, and the examination produces the exact position of the aerial in a system of coordinates, which allows this point to be depicted on a map. This procedure was used both stationary (from a strictly defined point) and in movement (in a moving car). Additionally, conventional procedures were used where satellite-based geodesy reached its limits. Finally, a detailed map of the Serengeti Safaripark was created which shows the position and size of stables and enclosures as well as wooded and water areas and the sectors of the leisure park. Furthermore, the established areas of the enclosures, together with an already existing animal databank, have been incorporated into an information system with the help of which the stock of animals can be managed enclosure by enclosure.

  9. ECG-based heartbeat classification for arrhythmia detection: A survey.

    PubMed

    Luz, Eduardo José da S; Schwartz, William Robson; Cámara-Chávez, Guillermo; Menotti, David

    2016-04-01

    An electrocardiogram (ECG) measures the electric activity of the heart and has been widely used for detecting heart diseases due to its simplicity and non-invasive nature. By analyzing the electrical signal of each heartbeat, i.e., the combination of action impulse waveforms produced by different specialized cardiac tissues found in the heart, it is possible to detect some of its abnormalities. In the last decades, several works were developed to produce automatic ECG-based heartbeat classification methods. In this work, we survey the current state-of-the-art methods for ECG-based automated heartbeat abnormality classification by presenting the ECG signal preprocessing, the heartbeat segmentation techniques, the feature description methods and the learning algorithms used. In addition, we describe some of the databases used for the evaluation of methods, as indicated by a well-known standard developed by the Association for the Advancement of Medical Instrumentation (AAMI) and described in ANSI/AAMI EC57:1998/(R)2008 (ANSI/AAMI, 2008). Finally, we discuss limitations and drawbacks of the methods in the literature, presenting concluding remarks and future challenges, and we propose an evaluation process workflow to guide authors in future works.
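
    The four pipeline stages the survey names (preprocessing, segmentation, feature description, learning) can be sketched end to end. The filter band, window width and peak detector below are common illustrative choices, not the survey's recommendations:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks
    from sklearn.ensemble import RandomForestClassifier

    def preprocess(ecg, fs):
        """Band-pass filter to suppress baseline wander and high-frequency noise."""
        b, a = butter(3, [0.5 / (fs / 2), 40.0 / (fs / 2)], btype="band")
        return filtfilt(b, a, ecg)

    def segment_beats(ecg, fs):
        """Crude R-peak detection and fixed-width windows around each beat."""
        peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.std(ecg))
        half = int(0.3 * fs)
        return np.array([ecg[p - half:p + half] for p in peaks
                         if half <= p < len(ecg) - half])

    # Feature description here is just the raw window; published systems add RR
    # intervals, wavelet and morphology features. Labels follow the AAMI classes:
    # clf = RandomForestClassifier().fit(beat_windows, aami_labels)
    ```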

  10. A Survey on ROC-based Ordinal Regression

    NASA Astrophysics Data System (ADS)

    Waegeman, Willem; Baets, Bernard De

    Ordinal regression can be seen as a special case of preference learning, in which the class labels corresponding with data instances can take values from an ordered finite set. In such a setting, the classes usually have a linguistic interpretation attached by humans to subdivide the data into a number of preference bins. In this chapter, we give a general survey on ordinal regression from a machine learning point of view. In particular, we elaborate on some important connections with ROC analysis that have been introduced recently by the present authors. First, the important role of an underlying ranking function in ordinal regression models is discussed, as well as its impact on the performance evaluation of such models. Subsequently, we describe a new ROC-based performance measure that directly evaluates the underlying ranking function, and we place it in the more general context of ROC analysis as the volume under an r-dimensional ROC surface (VUS) for, in general, r classes. Furthermore, we also discuss the scalability of this measure and show that it can be computed very efficiently for large samples. Finally, we present a kernel-based learning algorithm that optimizes VUS as a specific case of structured support vector machines.
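
    VUS, as described here, is the probability that one randomly drawn example from each ordinal class is ranked in the correct class order by the underlying ranking function. A naive sketch follows; it enumerates all tuples and therefore scales exponentially with the number of classes, whereas the chapter's point is precisely that VUS can be computed far more efficiently for large samples.

    ```python
    from itertools import product

    def vus(scores_by_class):
        """Fraction of one-per-class score tuples ranked in the correct class order."""
        correct = total = 0
        for combo in product(*scores_by_class):   # one score per class, in class order
            correct += all(a < b for a, b in zip(combo, combo[1:]))
            total += 1
        return correct / total

    # Three ordinal classes; a higher score should mean a higher class.
    print(vus([[0.1, 0.2], [0.3, 0.5], [0.6, 0.9]]))   # 1.0: perfectly separated
    ```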

  11. Fault identification of rotor-bearing system based on ensemble empirical mode decomposition and self-zero space projection analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Fan; Zhu, Zhencai; Li, Wei; Zhou, Gongbo; Chen, Guoan

    2014-07-01

    Accurately identifying faults in rotor-bearing systems by analyzing vibration signals, which are nonlinear and nonstationary, is challenging. To address this issue, a new approach based on ensemble empirical mode decomposition (EEMD) and self-zero space projection analysis is proposed in this paper. This method seeks to identify faults appearing in a rotor-bearing system using simple algebraic calculations and projection analyses. First, EEMD is applied to decompose the collected vibration signals into a set of intrinsic mode functions (IMFs) that serve as features. Second, these extracted features under various mechanical health conditions are used to design a self-zero space matrix according to space projection analysis. Finally, so-called projection indicators are calculated to identify the rotor-bearing system's faults with simple decision logic. Experiments are implemented to test the reliability and effectiveness of the proposed approach. The results show that this approach can accurately identify faults in rotor-bearing systems.

  12. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    PubMed

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-08-14

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking, with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick, yet effective, computational tool aimed at artifact reduction from head movements as well as a method to detect blinking signals for mouse control. A Kalman filter is used as the state estimator for mouse position control and jitter removal. The detection rate obtained on average was 94.9%. The experimental setup and some obtained results are presented.
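
    The jitter-removal step maps onto a textbook constant-velocity Kalman filter per cursor axis. The noise parameters below are illustrative tuning values, not those of the Emotiv project:

    ```python
    import numpy as np

    class CursorKalman:
        """Constant-velocity Kalman filter for one cursor axis (jitter-removal sketch)."""
        def __init__(self, dt=0.02, q=50.0, r=4.0):
            self.F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition (pos, vel)
            self.H = np.array([[1.0, 0.0]])                  # we observe position only
            self.Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # process noise covariance
                                   [dt**2 / 2, dt]])
            self.R = np.array([[r]])                         # measurement noise covariance
            self.x = np.zeros((2, 1))
            self.P = np.eye(2)

        def update(self, z):
            # Predict.
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # Correct with the gyroscope-derived position measurement z.
            y = np.array([[z]]) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(2) - K @ self.H) @ self.P
            return float(self.x[0, 0])                       # smoothed cursor position
    ```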

  13. Re-reading nursing and re-writing practice: towards an empirically based reformulation of the nursing mandate.

    PubMed

    Allen, Davina

    2004-12-01

    This article examines field studies of nursing work published in the English language between 1993 and 2003 as the first step towards an empirically based reformulation of the nursing mandate. A decade of ethnographic research reveals that, contrary to contemporary theories which promote an image of nursing work centred on individualised unmediated caring relationships, in real-life practice the core nursing contribution is that of the healthcare mediator. Eight bundles of activity that comprise this intermediary role are described utilising evidence from the literature. The mismatch between nursing's culture and ideals and the structure and constraints of the work setting is a chronic source of practitioner dissatisfaction. It is argued that the profession has little to gain by pursuing an agenda of holistic patient care centred on emotional intimacy and that an alternative occupational mandate focused on the healthcare mediator function might make for more humane health services and a more viable professional future.

  14. Empirically Supported Treatments in Psychotherapy: Towards an Evidence-Based or Evidence-Biased Psychology in Clinical Settings?

    PubMed Central

    Castelnuovo, Gianluca

    2010-01-01

    The field of research and practice in psychotherapy has been deeply influenced by two different approaches: the empirically supported treatments (ESTs) movement, linked with the evidence-based medicine (EBM) perspective, and the “Common Factors” approach, typically connected with the “Dodo Bird Verdict”. Regarding the first perspective, a list of ESTs has been established in the mental health field since 1998, and criteria for “well-established” and “probably efficacious” treatments have arisen. The development of these kinds of paradigms was motivated by the emergence of a “managerial” approach and related systems for remuneration for mental health providers and for insurance companies. In this article, ESTs are presented, also underlining some possible criticisms. Finally, complementary approaches that could add different evidence to psychotherapy research in comparison with the traditional EBM approach are presented. PMID:21833197

  15. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals; however, most of them are not able to estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). The investigation was applied to both simulated data and in-vivo (1)H-MRS signals of the human brain. The results demonstrate the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
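
    A minimal sketch of the idea, assuming the third-party PyEMD package and treating the slowest IMFs as the macromolecule baseline. How many trailing modes (and whether the final residue) belong to the baseline is an assumption the paper would determine from the data:

    ```python
    import numpy as np
    from PyEMD import EEMD   # assumed third-party EEMD implementation

    def remove_baseline(spectrum, n_baseline_imfs=2):
        """Sketch: subtract the slowest IMFs as the macromolecule baseline estimate."""
        spectrum = np.asarray(spectrum, dtype=float)
        imfs = EEMD().eemd(spectrum)
        baseline = imfs[-n_baseline_imfs:].sum(axis=0)   # trailing IMFs vary slowest
        return spectrum - baseline, baseline
    ```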

  16. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    PubMed Central

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-01-01

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals obtained through a commercial 16-electrode wireless headset recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking, with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick, yet effective, computational tool aimed at artifact reduction from head movements as well as a method to detect blinking signals for mouse control. A Kalman filter is used as the state estimator for mouse position control and jitter removal. The detection rate obtained on average was 94.9%. The experimental setup and some obtained results are presented. PMID:23948873

  17. Effects of Personalization and Invitation Email Length on Web-Based Survey Response Rates

    ERIC Educational Resources Information Center

    Trespalacios, Jesús H.; Perkins, Ross A.

    2016-01-01

    Individual strategies to increase response rate and survey completion have been extensively researched. Recently, efforts have been made to investigate a combination of interventions to yield better response rates for web-based surveys. This study examined the effects of four different survey invitation conditions on response rate. From a large…

  18. Restoration of images degraded by signal-dependent noise based on energy minimization: an empirical study

    NASA Astrophysics Data System (ADS)

    Bajić, Buda; Lindblad, Joakim; Sladoje, Nataša

    2016-07-01

    Most energy minimization-based restoration methods are developed for signal-independent Gaussian noise. The assumption of a Gaussian noise distribution leads to a quadratic data fidelity term, which is appealing in optimization. When an image is acquired with a photon counting device, however, it contains signal-dependent Poisson or mixed Poisson-Gaussian noise. We quantify the loss in performance that occurs when a restoration method suited for Gaussian noise is utilized for mixed noise. Signal-dependent noise can be treated by methods based either on the classical maximum a posteriori (MAP) probability approach or on a variance stabilization approach (VST). We compare the performance of these approaches on a large set of images and observe that VST-based methods outperform those based on MAP in both quality of restoration and computational efficiency. We also quantify the improvement achieved by utilizing Huber regularization instead of classical total variation regularization. The conclusion from our study is a recommendation to utilize a VST-based approach combined with regularization by the Huber potential for restoration of images degraded by blur and signal-dependent noise. This combination provides a robust and flexible method with good performance and high speed.
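
    The VST route the study recommends can be sketched with the classical Anscombe transform for Poisson noise: stabilize the variance, run any Gaussian-noise restoration method (e.g., energy minimization with a Huber potential), then invert. The algebraic inverse below is the simple biased one; exact unbiased inverses exist in the literature.

    ```python
    import numpy as np

    def anscombe(x):
        """Variance-stabilizing transform: Poisson counts -> approx. unit-variance Gaussian."""
        return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

    def inverse_anscombe(y):
        """Simple algebraic inverse (slightly biased; unbiased variants exist)."""
        return (np.asarray(y, dtype=float) / 2.0) ** 2 - 3.0 / 8.0

    # VST pipeline sketch: denoise_gaussian stands in for any Gaussian-noise restorer.
    # restored = inverse_anscombe(denoise_gaussian(anscombe(noisy_image)))
    ```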

  19. Web-based surveys as an alternative to traditional mail methods.

    PubMed

    Fleming, Christopher M; Bowden, Mark

    2009-01-01

    Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternative survey administration modes: conventional mail and web-based. It is found that response rates and the socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.

  20. Web-based dynamic Delphi: a new survey instrument

    NASA Astrophysics Data System (ADS)

    Yao, JingTao; Liu, Wei-Ning

    2006-04-01

    We present a mathematical model for a dynamic Delphi survey method which takes advantage of Web technology. A comparative study of the performance of the conventional Delphi method and the dynamic Delphi instrument is conducted. It is suggested that a dynamic Delphi survey may form a consensus quickly. However, the result may not be robust due to judgement-leaking issues.

  1. Preparation, Practice, and Performance: An Empirical Examination of the Impact of Standards-Based Instruction on Secondary Students' Math and Science Achievement

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2009-01-01

    For almost two decades proponents of educational reform have advocated the use of standards-based education in maths and science classrooms for improving teacher practices, increasing student learning, and raising the quality of maths and science instruction. This study empirically examined the impact of specific standards-based teacher…

  2. Children's Experiences of Completing a Computer-Based Violence Survey: Finnish Child Victim Survey Revisited.

    PubMed

    Fagerlund, Monica; Ellonen, Noora

    2016-07-01

    The involvement of children as research subjects requires special considerations with regard to research practices and ethics. This is especially true concerning sensitive research topics such as sexual victimization. Prior research suggests that reflecting on these experiences in a survey can cause negative feelings in child participants, although it poses only a minimal to moderate risk. Analyzing only predefined, often negative feelings related to answering a sexual victimization survey has dominated the existing literature. In this article, children's free-text comments about answering a victimization survey and experiences of sexual victimization are analyzed together to evaluate the effects of research participation in relation to this sensitive issue. Altogether 11,364 children, aged 11-12 and 15-16, participated in the Finnish Child Victim Survey in 2013. Of these, 69% (7,852) reflected on their feelings about answering the survey. Results indicate that both clearly negative and positive feelings are more prevalent among victimized children compared to their nonvictimized peers. Characteristics unique to sexual victimization as well as differences related to gender and age are also discussed. The study contributes to the important yet contradictory field of studying the effects of research participation on children.

  3. Teaching Standards-Based Group Work Competencies to Social Work Students: An Empirical Examination

    ERIC Educational Resources Information Center

    Macgowan, Mark J.; Vakharia, Sheila P.

    2012-01-01

    Objectives: Accreditation standards and challenges in group work education require competency-based approaches in teaching social work with groups. The Association for the Advancement of Social Work with Groups developed Standards for Social Work Practice with Groups, which serve as foundation competencies for professional practice. However, there…

  4. An Empirical Typology of Residential Care/Assisted Living Based on a Four-State Study

    ERIC Educational Resources Information Center

    Park, Nan Sook; Zimmerman, Sheryl; Sloane, Philip D.; Gruber-Baldini, Ann L.; Eckert, J. Kevin

    2006-01-01

    Purpose: Residential care/assisted living describes diverse facilities providing non-nursing home care to a heterogeneous group of primarily elderly residents. This article derives typologies of assisted living based on theoretically and practically grounded evidence. Design and Methods: We obtained data from the Collaborative Studies of Long-Term…

  5. Homogeneity in Community-Based Rape Prevention Programs: Empirical Evidence of Institutional Isomorphism

    ERIC Educational Resources Information Center

    Townsend, Stephanie M.; Campbell, Rebecca

    2007-01-01

    This study examined the practices of 24 community-based rape prevention programs. Although these programs were geographically dispersed throughout one state, they were remarkably similar in their approach to rape prevention programming. DiMaggio and Powell's (1991) theory of institutional isomorphism was used to explain the underlying causes of…

  6. Empirical Investigation into Motives for Choosing Web-Based Distance Learning Programs

    ERIC Educational Resources Information Center

    Alkhattabi, Mona

    2016-01-01

    Today, in association with rapid social and economic changes, there is an increasing level of demand for distance and online learning programs. This study will focus on identifying the main motivational factors for choosing a web-based distance-learning program. Moreover, it will investigate how these factors relate to age, gender, marital status…

  7. Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools

    ERIC Educational Resources Information Center

    Samdal, Oddrun; Rowling, Louise

    2011-01-01

    Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…

  8. An Adaptive E-Learning System Based on Students' Learning Styles: An Empirical Study

    ERIC Educational Resources Information Center

    Drissi, Samia; Amirat, Abdelkrim

    2016-01-01

    Personalized e-learning implementation is recognized as one of the most interesting research areas in distance web-based education. Since the learning style of each learner is different, one must fit e-learning to the different needs of learners. This paper presents an approach to integrate learning styles into adaptive e-learning hypermedia.…

  9. Introducing Evidence-Based Principles to Guide Collaborative Approaches to Evaluation: Results of an Empirical Process

    ERIC Educational Resources Information Center

    Shulha, Lyn M.; Whitmore, Elizabeth; Cousins, J. Bradley; Gilbert, Nathalie; al Hudib, Hind

    2016-01-01

    This article introduces a set of evidence-based principles to guide evaluation practice in contexts where evaluation knowledge is collaboratively produced by evaluators and stakeholders. The data from this study evolved in four phases: two pilot phases exploring the desirability of developing a set of principles; an online questionnaire survey…

  10. Young Readers' Narratives Based on a Picture Book: Model Readers and Empirical Readers

    ERIC Educational Resources Information Center

    Hoel, Trude

    2015-01-01

    The article presents parts of a research project where the aim is to investigate six- to seven-year-old children's language use in storytelling. The children's oral texts are based on the wordless picture book "Frog, Where Are You?", which has been, and still remains, a frequent tool for collecting narratives from children. The Frog story…

  11. Web-based Educational Media: Issues and Empirical Test of Learning.

    ERIC Educational Resources Information Center

    Radhakrishnan, Senthil; Bailey, James E.

    This paper addresses issues and cost benefits of World Wide Web-based education systems. It presents the results of an effort to identify problems that arise when considering this media and suggests conceptual solutions to some of these problems. To evaluate these solutions, a prototype system was built and tested in an engineering classroom; the…

  12. 77 FR 73432 - Proposed Information Collection; Comment Request; Survey of Shore-Based and Boat-Based Non...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-10

    ... Shore-Based and Boat-Based Non-Commercial Fishing on St. Croix, U.S. Virgin Islands AGENCY: National... will be a survey of boat-based, non-commercial fishers on St. Croix to document levels of catch and...-month period, will include two data collection activities: (1) A survey of fishers at boat ramps and,...

  13. Empirical Characteristics of Family-Based Linkage to a Complex Trait: the ADIPOQ Region and Adiponectin Levels

    PubMed Central

    Hellwege, Jacklyn N.; Palmer, Nicholette D.; Brown, W. Mark; Ziegler, Julie T.; An, S. Sandy; Guo, Xiuqing; Chen, Y.-D. Ida; Taylor, Kent; Hawkins, Gregory A.; Ng, Maggie C.Y.; Speliotes, Elizabeth K.; Lorenzo, Carlos; Norris, Jill M.; Rotter, Jerome I.; Wagenknecht, Lynne E.; Langefeld, Carl D.; Bowden, Donald W.

    2014-01-01

    We previously identified a low frequency (1.1%) coding variant (G45R; rs200573126) in the adiponectin gene (ADIPOQ) which was the basis for a multipoint microsatellite linkage signal (LOD=8.2) for plasma adiponectin levels in Hispanic families. We have empirically evaluated the ability of targeted common-variant, exome chip, and genome-wide association study (GWAS) data to detect linkage and association to adiponectin protein levels at this locus. Simple two-point linkage and association analyses were performed in 88 Hispanic families (1150 individuals) using 10,958 SNPs on chromosome 3. Approaches were compared for their ability to map the functional variant, G45R, which was strongly linked (two-point LOD=20.98) and powerfully associated (p-value=8.1×10−50). Over 450 SNPs within a broad 61 Mb interval around rs200573126 showed nominal evidence of linkage (LOD>3) but only four other SNPs in this region were associated with p-values<1.0×10−4. When G45R was accounted for, the maximum LOD score across the interval dropped to 4.39 and the best p-value was 1.1×10−5. Linked and/or associated variants ranged in frequency (0.0018 to 0.50) and type (coding, non-coding) and had little detectable linkage disequilibrium with rs200573126 (r2<0.20). In addition, the two-point linkage approach empirically outperformed multipoint microsatellite and multipoint SNP analysis. In the absence of data for rs200573126, family-based linkage analysis using a moderately dense SNP dataset, including both common and low frequency variants, resulted in stronger evidence for an adiponectin locus than association data alone. Thus, linkage analysis can be a useful tool to facilitate identification of high impact genetic variants. PMID:25447270

  14. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    PubMed Central

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho-kwang

    2014-01-01

    Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). The cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: the compounds whose cations have larger ambient radii have a higher transition pressure. PMID:25417655

  15. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  16. Patients’ Acceptance towards a Web-Based Personal Health Record System: An Empirical Study in Taiwan

    PubMed Central

    Liu, Chung-Feng; Tsai, Yung-Chieh; Jang, Fong-Lin

    2013-01-01

    The health care sector has become increasingly interested in developing personal health record (PHR) systems as an Internet-based telehealthcare implementation to improve the quality and decrease the cost of care. However, the factors that influence patients’ intention to use PHR systems remain unclear. Based on physicians’ therapeutic expertise, we implemented a Web-based PHR system for infertility care and proposed an extended Technology Acceptance Model (TAM) that integrates the physician-patient relationship (PPR) construct into TAM’s original perceived ease of use (PEOU) and perceived usefulness (PU) constructs to explore which factors influence the behavioral intentions (BI) of infertile patients to use the PHR. Of ninety participants recruited at a medical center, 50 valid responses to a self-rating questionnaire were collected, yielding a response rate of 55.56%. The partial least squares (PLS) technique was used to assess the causal relationships that were hypothesized in the extended model. The results indicate that infertile patients expressed a moderately high intention to use the PHR system. The PPR and PU of patients had significant effects on their BI to use PHR, whereas the PEOU indirectly affected the patients’ BI through the PU. This investigation confirms that PPR can have a critical role in shaping patients’ perceptions of the use of healthcare information technologies. Hence, we suggest that hospitals should promote the potential usefulness of PHR and improve the quality of the physician-patient relationship to increase patients’ intention of using PHR. PMID:24142185

  17. Empirical evaluation of analytical models for parallel relational data-base queries. Master's thesis

    SciTech Connect

    Denham, M.C.

    1990-12-01

    This thesis documents the design and implementation of three parallel join algorithms used to verify the analytical models developed by Kearns. Kearns developed a set of analytical models for a variety of relational database queries. These models serve as tools for the design of parallel relational database systems. Each of Kearns' models is classified as either single step or multiple step. The single step models reflect queries that require only one operation, while the multiple step models reflect queries that require multiple operations. Three parallel join algorithms were implemented based upon Kearns' models. Two are based upon single step join models and one is based upon a multiple step join model. They are implemented on an Intel iPSC/1 parallel computer. The single step join algorithms include the parallel nested-loop join and the bucket (or hash) join. The multiple step algorithm that was implemented is a pipelined version of the bucket join. The results show that, within the constraints of the test cases run, all three models are accurate to within about 8.5%, and they should prove useful in the design of parallel relational database systems.
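
    As a sketch of the bucket (hash) join the thesis implements, the serial fragment below shows the build and probe phases; the parallel variants additionally hash-partition tuples across nodes on the join key, which is only summarized in a comment here. The relation contents and key names are invented for illustration.

```python
from collections import defaultdict

def hash_join(build_rel, probe_rel, build_key, probe_key):
    """Bucket (hash) join: build a hash table on one relation, then
    probe it with each tuple of the other. In the parallel variant,
    tuples are first hash-partitioned on the join key so each node
    joins one bucket locally."""
    buckets = defaultdict(list)
    for row in build_rel:                      # build phase
        buckets[row[build_key]].append(row)
    out = []
    for row in probe_rel:                      # probe phase
        for match in buckets.get(row[probe_key], ()):
            out.append({**match, **row})
    return out

# toy usage
emp = [{"dept": 1, "name": "ann"}, {"dept": 2, "name": "bob"}]
dept = [{"dept": 1, "dname": "hr"}, {"dept": 2, "dname": "it"}]
print(hash_join(dept, emp, "dept", "dept"))
```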

  18. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    PubMed Central

    Henggeler, Scott W.; Sheidow, Ashli J.

    2011-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief strategic family therapy. In addition to summarizing the theoretical and clinical bases of these treatments, their results in efficacy and effectiveness trials are examined with particular emphasis on any demonstrated capacity to achieve favorable outcomes when implemented by real world practitioners in community practice settings. Special attention is also devoted to research on purported mechanisms of change as well as the long-term sustainability of outcomes achieved by these treatment models. Importantly, the developers of each model have created quality assurance systems to support treatment fidelity and youth and family outcomes, and they have formed purveyor organizations to facilitate the large-scale transport of their respective treatments to community settings nationally and internationally. PMID:22283380

  19. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User’s Head Movement

    PubMed Central

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users’ head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768

  20. Empirical force field for cisplatin based on quantum dynamics data: case study of new parameterization scheme for coordination compounds.

    PubMed

    Yesylevskyy, S; Cardey, Bruno; Kraszewski, S; Foley, Sarah; Enescu, Mironel; da Silva, Antônio M; Dos Santos, Hélio F; Ramseyer, Christophe

    2015-10-01

    Parameterization of molecular complexes containing a metallic compound, such as cisplatin, is challenging due to the unconventional coordination nature of the bonds involving platinum atoms. In this work, we develop a new methodology for parameterizing such compounds based on quantum dynamics (QD) calculations. We show that the coordination bonds and angles are more flexible than in typical covalent compounds. The influence of explicit solvent is also shown to be crucial in determining the flexibility of cisplatin in quantum dynamics simulations. Two empirical topologies of cisplatin were produced by fitting its atomic fluctuations against QD in vacuum and QD with an explicit first solvation shell of water molecules, respectively. A third topology, built in the standard way from the static optimized structure, was used for comparison. The latter leads to an excessively rigid molecule and exhibits much smaller fluctuations of the bonds and angles than QD reveals. It is shown that accounting for the high flexibility of the cisplatin molecule is needed for an adequate description of its first hydration shell. MD simulations with the flexible QD-based topology also reveal a significant decrease of the barrier to passive diffusion of cisplatin across the model lipid bilayer. These results confirm that flexibility of organometallic compounds is an important feature to be considered in classical molecular dynamics topologies. The proposed methodology based on QD simulations provides a systematic way of building such topologies.
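
    The fluctuation-fitting idea can be illustrated with a one-line harmonic relation: for a coordinate b in a harmonic well at temperature T, equipartition gives k = kB·T / var(b). The sketch below applies that relation to synthetic samples; it is a generic illustration under that assumption, not the authors' actual parameterization pipeline, and the Pt-N numbers are hypothetical.

```python
import numpy as np

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def harmonic_force_constant(samples, temperature=300.0):
    """Estimate an equilibrium value and harmonic force constant from
    sampled bond lengths (or angles) via equipartition for the
    potential E = (1/2) k (b - b0)^2:  k = kB * T / var(b)."""
    samples = np.asarray(samples, dtype=float)
    b0 = samples.mean()                      # equilibrium bond length
    k = KB * temperature / samples.var()     # kcal/(mol*Angstrom^2)
    return b0, k

# toy usage: hypothetical Pt-N bond lengths from a QD trajectory
rng = np.random.default_rng(0)
bonds = rng.normal(2.05, 0.05, size=5000)    # Angstrom, made-up numbers
print(harmonic_force_constant(bonds))
```

    A topology fitted this way inherits the fluctuations seen in the dynamics, which is why it ends up softer than one derived from a single static optimized structure.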

  1. Lake Superior Zooplankton Biomass Predictions from LOPC Tow Surveys Compare Well with a Probability Based Net Survey

    EPA Science Inventory

    We conducted a probability-based sampling of Lake Superior in 2006 and compared the zooplankton biomass estimate with laser optical plankton counter (LOPC) predictions. The net survey consisted of 52 sites stratified across three depth zones (0-30, 30-150, >150 m). The LOPC tow...

  2. Synthesizing Results From Empirical Research on Computer-Based Scaffolding in STEM Education

    PubMed Central

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2016-01-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has synthesized the results of these studies. This review addresses that need by synthesizing the results of 144 experimental studies (333 outcomes) on the effects of computer-based scaffolding designed to assist the full range of STEM learners (primary through adult education) as they navigated ill-structured, problem-centered curricula. Results of our random-effects meta-analysis (a) indicate that computer-based scaffolding showed a consistently positive (ḡ = 0.46) effect on cognitive outcomes across various contexts of use, scaffolding characteristics, and levels of assessment and (b) shed light on many scaffolding debates, including the roles of customization (i.e., fading and adding) and context-specific support. Specifically, scaffolding’s influence on cognitive outcomes did not vary on the basis of context-specificity, presence or absence of scaffolding change, and logic by which scaffolding change is implemented. Scaffolding’s influence was greatest when measured at the principles level and among adult learners. Still, scaffolding’s effect was substantial and significantly greater than zero across all age groups and assessment levels. These results suggest that scaffolding is a highly effective intervention across these characteristics and can be designed in many different ways while remaining highly effective. PMID:28344365
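
    The pooled effect reported above (ḡ = 0.46) comes from a random-effects synthesis. A minimal DerSimonian-Laird sketch of such pooling follows; the study effect sizes and variances are invented for illustration and are not the review's data.

```python
import numpy as np

def random_effects_meta(g, v):
    """DerSimonian-Laird random-effects pooling of effect sizes.
    g: per-study effects (e.g., Hedges' g); v: their sampling variances."""
    g, v = np.asarray(g, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    g_fixed = np.sum(w * g) / np.sum(w)
    Q = np.sum(w * (g - g_fixed) ** 2)           # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(g) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    g_re = np.sum(w_star * g) / np.sum(w_star)   # pooled effect
    se = np.sqrt(1.0 / np.sum(w_star))           # its standard error
    return g_re, se, tau2

# toy usage with made-up study outcomes
print(random_effects_meta([0.3, 0.5, 0.6], [0.02, 0.03, 0.05]))
```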

  3. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    SciTech Connect

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; Liu, Haozhe; Zhao, Jinggeng; Li, Chunyu; Sinogeikin, Stanislav; Wu, Wei; Luo, Jianlin; Wang, Nanlin; Yang, Ke; Zhao, Yusheng; Mao, Ho-kwang

    2014-11-24

    Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: the compounds whose cations have larger ambient radii have a higher transition pressure.

  4. Conventional empirical law reverses in the phase transitions of 122-type iron-based superconductors

    DOE PAGES

    Yu, Zhenhai; Wang, Lin; Wang, Luhong; ...

    2014-11-24

    Phase transition of solid-state materials is a fundamental research topic in condensed matter physics, materials science and geophysics. It has been well accepted and widely proven that isostructural compounds containing different cations undergo the same pressure-induced phase transitions, but at progressively lower pressures as the cation radius increases. However, we discovered that this conventional law reverses in the structural transitions of 122-type iron-based superconductors. In this report, a combined low temperature and high pressure X-ray diffraction (XRD) measurement has identified the phase transition curves among the tetragonal (T), orthorhombic (O) and the collapsed-tetragonal (cT) phases in the structural phase diagram of the iron-based superconductor AFe2As2 (A = Ca, Sr, Eu, and Ba). As a result, the cation-radius dependence of the phase transition pressure (T → cT) shows the opposite trend: the compounds whose cations have larger ambient radii have a higher transition pressure.

  5. The influence of land urbanization on landslides: An empirical estimation based on Chinese provincial panel data.

    PubMed

    Li, Gerui; Lei, Yalin; Yao, Huajun; Wu, Sanmang; Ge, Jianping

    2017-04-10

    This study used panel data for 28 provinces and municipalities in China from 2003 to 2014 to investigate the relationship between land urbanization and landslides by building panel models for a national sample and for subsamples from the three regions of China, and examined the implications for landslide prevention measures. The results showed that 1) at the national level, the percentage of built-up area is negatively, and road density positively, associated with the number of landslides. 2) At the regional level, improvement of landslide prevention with increasing economic development appears only for built-up areas: the percentage of built-up area increases the number of landslides in the western region and decreases it in the central and eastern regions, with a larger decrease in the eastern region than in the central region. Road density increases the number of landslides in every region, to a degree that grows from west to east. 3) The effect of landslide prevention funding is weak. Although such funding decreases the number of landslides at the national level, the effect is small, and except in the central region it did not effectively decrease the number of landslides in the western and eastern regions. We propose a series of policy implications based on these results that may help to improve landslide prevention measures.

  6. A prediction procedure for propeller aircraft flyover noise based on empirical data

    NASA Astrophysics Data System (ADS)

    Smith, M. H.

    1981-04-01

    Forty-eight different flyover noise certification tests are analyzed using multiple linear regression methods. A prediction model based on this analysis is presented, and its results are compared with the test data and with two other prediction methods. The aircraft analyzed include 30 single engine aircraft, 16 twin engine piston aircraft, and two twin engine turboprops. The importance of helical tip Mach number is verified, and relationships with several other aircraft, engine, and propeller parameters are developed. The model shows good agreement with the test data and is at least as accurate as the other prediction methods. It has the advantage of being somewhat easier to use, since it takes the form of a single equation.

  7. Psychological First Aid: A Consensus-Derived, Empirically Supported, Competency-Based Training Model

    PubMed Central

    Everly, George S.; Brown, Lisa M.; Wendelboe, Aaron M.; Abd Hamid, Nor Hashidah; Tallchief, Vicki L.; Links, Jonathan M.

    2014-01-01

    Surges in demand for professional mental health services occasioned by disasters represent a major public health challenge. To build response capacity, numerous psychological first aid (PFA) training models for professional and lay audiences have been developed that, although often concurring on broad intervention aims, have not systematically addressed pedagogical elements necessary for optimal learning or teaching. We describe a competency-based model of PFA training developed under the auspices of the Centers for Disease Control and Prevention and the Association of Schools of Public Health. We explain the approach used for developing and refining the competency set and summarize the observable knowledge, skills, and attitudes underlying the 6 core competency domains. We discuss the strategies for model dissemination, validation, and adoption in professional and lay communities. PMID:23865656

  8. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations, but several important issues about these methods are not well understood and most methods consider only one moment order. We study joint multifractal analysis based on the partition function approach with two moment orders, which was originally invented to investigate fluid fields, and derive several important properties analytically. We apply the method numerically to binomial measures with multifractal cross correlations and to bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of the mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
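
    A minimal numerical version of the two-moment-order partition function might look as follows: for each box size s, the boxes' normalized sums of the two series are raised to moment orders q1 and q2 and summed, and the joint scaling exponent is the slope of log χ(s) against log s. The series, orders, and scales below are illustrative, and the paper's analytical derivations are not reproduced.

```python
import numpy as np

def joint_partition_exponent(x, y, q1, q2, scales):
    """Joint partition-function scaling exponent for two non-negative
    series: chi(s) = sum_i mu_i^q1 * nu_i^q2 over boxes of size s,
    with tau(q1, q2) the slope of log chi(s) vs log s."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    log_s, log_chi = [], []
    for s in scales:
        n = (len(x) // s) * s
        mu = x[:n].reshape(-1, s).sum(axis=1)
        nu = y[:n].reshape(-1, s).sum(axis=1)
        mu, nu = mu / mu.sum(), nu / nu.sum()    # box probabilities
        log_s.append(np.log(s))
        log_chi.append(np.log(np.sum(mu**q1 * nu**q2)))
    tau, _ = np.polyfit(log_s, log_chi, 1)       # scaling exponent
    return tau

# toy usage with two correlated positive series
rng = np.random.default_rng(1)
x = rng.random(4096)
y = x * rng.random(4096)
print(joint_partition_exponent(x, y, q1=2, q2=2, scales=[8, 16, 32, 64, 128]))
```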

  9. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study †

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
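
    A minimal sketch of PFA-based behavioral recognition: each strategy is represented by one PFA, and an observed trace is assigned to the strategy whose automaton gives it the highest likelihood. The two toy automata, their probabilities, and the strategy names below are invented for illustration; the paper's learning of PFAs from training traces is not shown.

```python
import numpy as np

def log_likelihood(trace, pfa):
    """Log-likelihood of a symbol trace under a PFA given as
    (initial distribution pi, per-symbol transition matrices T), where
    T[sym][i, j] = P(emit sym and move from state i to state j)."""
    pi, T = pfa
    alpha = np.asarray(pi, float)          # current state distribution
    ll = 0.0
    for sym in trace:
        alpha = alpha @ T[sym]             # propagate through the symbol
        p = alpha.sum()
        ll += np.log(p)
        alpha /= p                         # renormalize for stability
    return ll

def recognize(trace, models):
    """Behavioral recognition: pick the strategy whose PFA maximizes
    the likelihood of the observed trace."""
    return max(models, key=lambda name: log_likelihood(trace, models[name]))

# toy two-state PFAs over symbols 'a', 'b' (all numbers hypothetical)
pfa_patrol = (np.array([1.0, 0.0]),
              {'a': np.array([[0.6, 0.2], [0.1, 0.1]]),
               'b': np.array([[0.1, 0.1], [0.2, 0.6]])})
pfa_random = (np.array([0.5, 0.5]),
              {'a': np.array([[0.3, 0.2], [0.2, 0.3]]),
               'b': np.array([[0.3, 0.2], [0.2, 0.3]])})
print(recognize(['a', 'a', 'b', 'a'],
                {'patrol': pfa_patrol, 'random': pfa_random}))
```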

  10. Empirical estimation of consistency parameter in intertemporal choice based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki; Oono, Hidemi; Radford, Mark H. B.

    2007-07-01

    Impulsivity and inconsistency in intertemporal choice have been attracting attention in econophysics and neuroeconomics. Although loss of self-control by substance abusers is strongly related to their inconsistency in intertemporal choice, researchers in neuroeconomics and psychopharmacology have usually studied impulsivity in intertemporal choice using a discount rate (e.g. hyperbolic k), with little effort expended on parameterizing a subject's inconsistency in intertemporal choice. Recent studies using Tsallis’ statistics-based econophysics have found a discount function (the q-exponential discount function) which may continuously parameterize a subject's consistency in intertemporal choice. In order to examine the usefulness of the consistency parameter (0⩽q⩽1) in the q-exponential discounting function in behavioral studies, we experimentally estimated the consistency parameter q in Tsallis’ statistics-based discounting function by assessing the points of subjective equality (indifference points) at seven delays (1 week-25 years) in humans (N=24). We observed that most (N=19) subjects’ intertemporal choice was completely inconsistent (q = 0, i.e. hyperbolic discounting), the mean consistency (0⩽q⩽1) was smaller than 0.5, and only one subject had completely consistent intertemporal choice (q = 1, i.e. exponential discounting). There was no significant correlation between the impulsivity and inconsistency parameters. Our results indicate that individual differences in consistency in intertemporal choice can be parameterized by introducing a q-exponential discount function, and that most people discount delayed rewards hyperbolically rather than exponentially (i.e. mean q is smaller than 0.5). Further, impulsivity and inconsistency in intertemporal choice can be considered separate behavioral tendencies. The usefulness of the consistency parameter q in psychopharmacological studies of addictive behavior was demonstrated in the present study.
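
    The q-exponential discount function at the heart of the study has the form V(D) = 1 / [1 + (1 − q)·k·D]^(1/(1−q)), recovering exponential discounting as q → 1 and hyperbolic discounting at q = 0. It can be fitted to indifference points in a few lines; the delays and subjective values below are hypothetical, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exponential_value(delay, k, q):
    """Tsallis q-exponential discount function:
    V(D) = [1 + (1 - q) * k * D] ** (-1 / (1 - q)).
    q -> 1 approaches exponential discounting exp(-k D);
    q = 0 gives hyperbolic discounting 1 / (1 + k D)."""
    return (1.0 + (1.0 - q) * k * delay) ** (-1.0 / (1.0 - q))

# toy usage: fit k and q to hypothetical indifference points
delays = np.array([7, 30, 90, 365, 1825, 3650, 9125], float)   # days
values = np.array([0.95, 0.85, 0.70, 0.45, 0.20, 0.12, 0.06])  # fractions
(k, q), _ = curve_fit(q_exponential_value, delays, values,
                      p0=[0.01, 0.5], bounds=([1e-6, 0.0], [1.0, 0.999]))
print(f"k = {k:.4f}, q = {q:.3f}")
```

    The upper bound on q is kept just below 1 because the closed form above is singular at q = 1, where the exponential limit applies instead.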

  11. An empirical RBF model of the magnetosphere parameterized by interplanetary and ground-based drivers

    NASA Astrophysics Data System (ADS)

    Tsyganenko, N. A.; Andreeva, V. A.

    2016-11-01

    In our recent paper (Andreeva and Tsyganenko, 2016), a novel method was proposed to model the magnetosphere directly from spacecraft data, with no a priori knowledge nor ad hoc assumptions about the geometry of the magnetic field sources. The idea was to split the field into its toroidal and poloidal parts and then expand each part into a weighted sum of radial basis functions (RBF). In the present work we take the next step by developing a full-fledged model of the near magnetosphere, based on a multiyear set of space magnetometer data (1995-2015) and driven by ground-based and interplanetary input parameters. The model consolidates the largest amount of data used to date and provides the best merit parameters so far, in terms of both the overall RMS residual field and record-high correlation coefficients between the observed and model field components. By experimenting with different combinations of input parameters and their time-averaging intervals, we found the best results so far to be given by the ram pressure Pd, SYM-H, and the N-index by Newell et al. (2007). In addition, the IMF By has also been included as a model driver, with the goal of more accurately representing IMF penetration effects. The model faithfully reproduces both externally and internally induced variations in the global distribution of the geomagnetic field and electric currents. Stronger solar wind driving results in a deepening of the equatorial field depression and a dramatic increase of its dawn-dusk asymmetry. The Earth's dipole tilt causes a consistent deformation of the magnetotail current sheet and a significant north-south asymmetry of the polar cusp depressions on the dayside. Next steps to further develop the new approach are also discussed.
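
    The core numerical step, expanding a field in radial basis functions and fitting the weights by least squares, can be sketched for a scalar field as follows. The real model works with toroidal/poloidal vector parts and physical drivers, so this is only a structural illustration on synthetic data, with invented node counts and a Gaussian basis chosen for simplicity.

```python
import numpy as np

def fit_rbf_weights(centers, points, values, eps=1.0):
    """Least-squares fit of a scalar field as a weighted sum of Gaussian
    RBFs: f(r) = sum_k w_k * exp(-eps * |r - c_k|^2)."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-eps * d2)                        # design matrix
    w, *_ = np.linalg.lstsq(Phi, values, rcond=None)
    return w

def eval_rbf(centers, w, points, eps=1.0):
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-eps * d2) @ w

# toy usage: recover a smooth field from scattered sample points
rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(500, 3))            # "spacecraft" positions
field = np.exp(-(pts ** 2).sum(axis=1))            # hypothetical field
ctr = rng.uniform(-1, 1, size=(40, 3))             # RBF node locations
w = fit_rbf_weights(ctr, pts, field)
print(np.abs(eval_rbf(ctr, w, pts) - field).max()) # fit residual
```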

  12. Using Information and Communications Technology in a National Population-Based Survey: The Kenya AIDS Indicator Survey 2012

    PubMed Central

    Ojwang’, James K.; Lee, Veronica C.; Waruru, Anthony; Ssempijja, Victor; Ng’ang’a, John G.; Wakhutu, Brian E.; Kandege, Nicholas O.; Koske, Danson K.; Kamiru, Samuel M.; Omondi, Kenneth O.; Kakinyi, Mutua; Kim, Andrea A.; Oluoch, Tom

    2016-01-01

    Background With improvements in technology, electronic data capture (EDC) for large surveys is feasible. EDC offers benefits over traditional paper-based data collection, including more accurate data, greater completeness of data, and decreased data cleaning burden. Methods The second Kenya AIDS Indicator Survey (KAIS 2012) was a population-based survey of persons aged 18 months to 64 years. A software application was designed to capture the interview, specimen collection, and home-based testing and counseling data. The application included: interview translations for local languages; options for single, multiple, and fill-in responses; and automated participant eligibility determination. Data quality checks were programmed to automate skip patterns and prohibit outlier responses. A data sharing architecture was developed to transmit the data in real time from the field to a central server over a virtual private network. Results KAIS 2012 was conducted between October 2012 and February 2013. Overall, 68,202 records for the interviews, specimen collection, and home-based testing and counseling were entered into the application. Challenges arose during implementation, including poor connectivity and a systems malfunction that created duplicate records, which prevented timely data transmission to the central server. Data cleaning was minimal given the data quality control measures. Conclusions KAIS 2012 demonstrated the feasibility of using EDC in a population-based survey. The benefits of EDC were apparent in data quality and the minimal time needed for data cleaning. Several important lessons were learned, such as the time and monetary investment required before survey implementation, the importance of continuous application testing, and contingency plans for data transmission due to connectivity challenges. PMID:24732816

  13. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations

    PubMed Central

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J. Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383
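
    The EOF step underlying the fusion method is a singular value decomposition of the anomaly field; a minimal sketch on synthetic data follows. The paper's specific modification for merging MOD16 and PT-JPL estimates is not reproduced, and the array shapes are invented.

```python
import numpy as np

def eof_decompose(field, n_modes):
    """Empirical orthogonal function (EOF) analysis of a space-time
    field (rows = time, columns = grid cells) via SVD of anomalies."""
    anomaly = field - field.mean(axis=0)
    U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]     # principal components (time)
    eofs = Vt[:n_modes]                    # spatial patterns
    return pcs, eofs

# toy usage: reconstruct a field from its leading modes
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 200))            # 120 months x 200 cells
pcs, eofs = eof_decompose(X, n_modes=5)
X_hat = pcs @ eofs + X.mean(axis=0)        # truncated reconstruction
print(X_hat.shape)
```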

  14. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator etc. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
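
    A compact sketch of the isotropic nonlinear diffusion family studied here, using the familiar Perona-Malik edge-stopping diffusivity g(|∇u|) = 1/(1 + (|∇u|/κ)²) with an explicit time-stepping scheme; the parameter values are illustrative, and the paper's implicit schemes and wavelet comparisons are not shown.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Explicit Perona-Malik diffusion: flux to each of the four
    neighbors is weighted by an edge-stopping diffusivity, so flat
    regions are smoothed while strong edges are preserved."""
    u = img.astype(float).copy()
    g = lambda d: 1.0 / (1.0 + (d / kappa) ** 2)   # diffusivity
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u            # differences to the
        ds = np.roll(u, 1, axis=0) - u             # four neighbors
        de = np.roll(u, -1, axis=1) - u            # (periodic borders
        dw = np.roll(u, 1, axis=1) - u             # for simplicity)
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# toy usage: denoise a noisy step edge
rng = np.random.default_rng(4)
img = np.zeros((64, 64)); img[:, 32:] = 1.0
noisy = img + rng.normal(0, 0.1, img.shape)
print(np.abs(perona_malik(noisy) - img).mean())
```

    The explicit scheme is only stable for small time steps (dt ≤ 0.25 with four neighbors), which is one reason the paper also explores implicit discretizations.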

  15. A Cutting Pattern Recognition Method for Shearers Based on Improved Ensemble Empirical Mode Decomposition and a Probabilistic Neural Network.

    PubMed

    Xu, Jing; Wang, Zhongbin; Tan, Chao; Si, Lei; Liu, Xinhua

    2015-10-30

    In order to guarantee the stable operation of shearers and promote construction of an automatic coal mining working face, an online cutting pattern recognition method with high accuracy and speed based on Improved Ensemble Empirical Mode Decomposition (IEEMD) and a Probabilistic Neural Network (PNN) is proposed. An industrial microphone is installed on the shearer and the cutting sound is collected as the recognition criterion, overcoming the disadvantages of giant size, contact measurement and low identification rate of traditional detectors. To avoid end-point effects and remove undesirable intrinsic mode function (IMF) components from the initial signal, IEEMD is conducted on the sound. End-point continuation based on stored historical data is performed first to suppress the end-point effect. Next, the average correlation coefficient, calculated from the correlations of the first IMF with the others, is introduced to select the essential IMFs. Then the energy and standard deviation of the remaining IMFs are extracted as features and the PNN is applied to classify the cutting patterns. Finally, a simulation example, with an accuracy of 92.67%, and an industrial application prove the efficiency and correctness of the proposed method.
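
    Assuming IMFs have already been produced by some EEMD implementation, the correlation-based IMF screening and the energy/standard-deviation features described above can be sketched as follows. The threshold rule is a simplified stand-in for the paper's exact criterion, and the synthetic "IMFs" are placeholders.

```python
import numpy as np

def select_and_featurize(imfs, rel_threshold=0.5):
    """IMF screening and feature extraction sketch. IMFs whose absolute
    correlation with the first IMF falls below a fraction of the average
    correlation are discarded; the energy and standard deviation of the
    remaining IMFs form the feature vector for the classifier."""
    imfs = np.asarray(imfs, float)
    corr = np.array([abs(np.corrcoef(imfs[0], imf)[0, 1]) for imf in imfs])
    keep = imfs[corr >= rel_threshold * corr.mean()]
    energy = (keep ** 2).sum(axis=1)       # per-IMF energy
    std = keep.std(axis=1)                 # per-IMF standard deviation
    return np.concatenate([energy, std])

# toy usage with synthetic "IMFs"
rng = np.random.default_rng(5)
fake_imfs = rng.normal(size=(6, 1024))
print(select_and_featurize(fake_imfs).shape)
```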

  16. Identifying P phase arrival of weak events: The Akaike Information Criterion picking application based on the Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Li, Xibing; Shang, Xueyi; Morales-Esteban, A.; Wang, Zewei

    2017-03-01

    Seismic P phase arrival picking of weak events is a difficult problem in seismology. The algorithm proposed in this research is based on Empirical Mode Decomposition (EMD) and on the Akaike Information Criterion (AIC) picker; it has been called the EMD-AIC picker. EMD is a self-adaptive signal decomposition method that not only improves the Signal to Noise Ratio (SNR) but also retains P phase arrival information. P phase arrivals are then picked by applying the AIC picker to the selected main Intrinsic Mode Functions (IMFs). The performance of the EMD-AIC picker has been evaluated on 1938 micro-seismic signals from the Yongshaba mine (China). The P phases identified by this algorithm have been compared with manual pickings. The evaluation results confirm that the EMD-AIC pickings are highly accurate for the majority of the micro-seismograms. Moreover, the pickings are independent of the kind of noise. Finally, the results obtained by this algorithm have been compared to Discrete Wavelet Transform (DWT)-based AIC pickings. This comparison demonstrates that the EMD-AIC picking method has better picking accuracy than the DWT-AIC method, showing its reliability and potential.
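
    The AIC picker itself has a compact closed form (Maeda's formulation): AIC(k) = k·log var(x[1..k]) + (N−k−1)·log var(x[k+1..N]), minimized over candidate onset samples k. A sketch on a synthetic trace follows; in the method above it would be applied to the selected IMFs rather than the raw seismogram, and the trace below is invented.

```python
import numpy as np

def aic_pick(x):
    """Akaike Information Criterion phase picker (Maeda's formulation):
    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])).
    The global minimum of AIC marks the most likely onset sample."""
    x = np.asarray(x, float)
    N = len(x)
    aic = np.full(N, np.inf)
    for k in range(2, N - 2):
        v1, v2 = x[:k].var(), x[k:].var()
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (N - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# toy usage: noise followed by a stronger arrival at sample 600
rng = np.random.default_rng(6)
trace = np.concatenate([rng.normal(0, 1, 600), rng.normal(0, 5, 400)])
print(aic_pick(trace))   # should land near sample 600
```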

  17. A Cutting Pattern Recognition Method for Shearers Based on Improved Ensemble Empirical Mode Decomposition and a Probabilistic Neural Network

    PubMed Central

    Xu, Jing; Wang, Zhongbin; Tan, Chao; Si, Lei; Liu, Xinhua

    2015-01-01

    In order to guarantee the stable operation of shearers and promote construction of an automatic coal mining working face, an online cutting pattern recognition method with high accuracy and speed based on Improved Ensemble Empirical Mode Decomposition (IEEMD) and a Probabilistic Neural Network (PNN) is proposed. An industrial microphone is installed on the shearer and the cutting sound is collected as the recognition criterion, overcoming the disadvantages of giant size, contact measurement and low identification rate of traditional detectors. To avoid end-point effects and remove undesirable intrinsic mode function (IMF) components from the initial signal, IEEMD is conducted on the sound. End-point continuation based on stored historical data is performed first to suppress the end-point effect. Next, the average correlation coefficient, calculated from the correlations of the first IMF with the others, is introduced to select the essential IMFs. Then the energy and standard deviation of the remaining IMFs are extracted as features and the PNN is applied to classify the cutting patterns. Finally, a simulation example, with an accuracy of 92.67%, and an industrial application prove the efficiency and correctness of the proposed method. PMID:26528985

  18. Emotional competencies in geriatric nursing: empirical evidence from a computer based large scale assessment calibration study.

    PubMed

    Kaspar, Roman; Hartig, Johannes

    2016-03-01

    The care of older people has been described as involving substantial emotion-related affordances. Scholars in vocational training and nursing disagree on whether emotion-related skills can be conceptualized and assessed as a professional competence. Studies on emotion work and empathy regularly neglect the multidimensionality of these phenomena and their relation to the care process, and are rarely conclusive with respect to nursing behavior in practice. To test the status of emotion-related skills as a facet of client-directed geriatric nursing competence, 402 final-year nursing students from 24 German schools responded to a 62-item computer-based test, of which 14 items were developed to represent emotion-related affordances. Multidimensional IRT modeling was employed to assess a potential subdomain structure. Emotion-related test items did not form a separate subdomain and were found to be discriminating across the whole competence continuum. Tasks concerning emotion work and empathy are thus reliable indicators of various levels of client-directed nursing competence. Claims for a distinct emotion-related competence in geriatric nursing, however, appear excessive from a process-oriented perspective.

  19. Evidence-Based Guidelines for Empirical Therapy of Neutropenic Fever in Korea

    PubMed Central

    Kim, Sung-Han; Kim, Soo Young; Kim, Chung-Jong; Park, Wan Beom; Song, Young Goo; Choi, Jung-Hyun

    2011-01-01

    Neutrophils play an important role in immune function. Neutropenic patients are vulnerable to infection and, apart from fever, often show few inflammatory signs. Additionally, because infections can worsen rapidly, early evaluation and treatment are especially important in febrile neutropenic patients. In cases in which febrile neutropenia is anticipated due to anticancer chemotherapy, antibiotic prophylaxis can be used, based on the risk of infection. Antifungal prophylaxis may also be considered if long-term neutropenia or mucosal damage is expected. When fever is observed in patients suspected to have neutropenia, an adequate physical examination and blood and sputum cultures should be performed. Initial antibiotics should be chosen by considering the risk of complications following the infection; if the risk is low, oral antibiotics can be used. For initial intravenous antibiotics, monotherapy with a broad-spectrum antibiotic or combination therapy with two antibiotics is recommended. At 3-5 days after beginning the initial antibiotic therapy, the condition of the patient is assessed again to determine whether the fever has subsided or symptoms have worsened. If the patient's condition has improved, intravenous antibiotics can be replaced with oral antibiotics; if the condition has deteriorated, a change of antibiotics or addition of antifungal agents should be considered. If the causative microorganism is identified, initial antimicrobial or antifungal agents should be changed accordingly. When the cause is not detected, the initial agents should be continued until the neutrophil count recovers. PMID:21716917

  20. Temporal asymmetries in Interbank Market: an empirically grounded Agent-Based Model

    NASA Astrophysics Data System (ADS)

    Zlatic, Vinko; Popovic, Marko; Abraham, Hrvoje; Caldarelli, Guido; Iori, Giulia

    2014-03-01

    We analyse changes in the topology of the E-mid interbank market in the period from September 1st 1999 to September 1st 2009. We uncover a type of temporal irreversibility in the growth of the largest component of the interbank trading network which is not common to any of the usual network growth models. This asymmetry, also detected in the growth of the clustering and reciprocity coefficients, reveals that the trading mechanism is driven by different dynamics at the beginning and at the end of the day. We are able to recover the complexity of the system by means of a simple Agent Based Model in which the probability of matching between counterparties depends on a time-varying vertex fitness (or attractiveness) describing banks' liquidity needs. We show that temporal irreversibility is associated with heterogeneity in the banking system and emerges when the distribution of liquidity shocks across banks is broad. We acknowledge support from FET project FOC-II.
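
    The matching mechanism can be sketched in a few lines: the probability that two banks trade is taken proportional to the product of their fitness values, with fitness drawn here from a broad (lognormal) shock distribution. All parameters are illustrative and this is not the calibrated model from the study.

```python
import numpy as np

def simulate_day(fitness, n_trades, rng):
    """Fitness-driven matching sketch: each trade picks an ordered pair
    (lender i, borrower j) with probability proportional to f_i * f_j."""
    n = len(fitness)
    p = np.outer(fitness, fitness).astype(float)
    np.fill_diagonal(p, 0.0)          # no self-trading
    p /= p.sum()
    picks = rng.choice(n * n, size=n_trades, p=p.ravel())
    return [divmod(int(e), n) for e in picks]

# toy usage: broad liquidity shocks across 50 banks
rng = np.random.default_rng(7)
fitness = rng.lognormal(mean=0.0, sigma=1.5, size=50)
edges = simulate_day(fitness, n_trades=200, rng=rng)
print(len(set(edges)), "distinct directed links")
```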

  1. Carbon emissions, logistics volume and GDP in China: empirical analysis based on panel data model.

    PubMed

    Guo, Xiaopeng; Ren, Dongfang; Shi, Jiaxing

    2016-12-01

    This paper studies the relationship among carbon emissions, GDP, and logistics by using a panel data model and a combination of statistical and econometric theory. The model is based on historical data for 10 typical provinces and cities in China during 2005-2014. It adds a logistics variable to those used in previous studies, proxied by the freight turnover of each province. Carbon emissions are calculated from the annual consumption of coal, oil, and natural gas; GDP is the gross domestic product. The results showed that logistics volume and GDP both contribute to carbon emissions and that the long-term relationships differ across cities in China, mainly reflecting differences in development mode, economic structure, and level of logistics development. After specification testing, a variable-coefficient panel model was established, and the influence of GDP and logistics on carbon emissions was obtained from the relationships among the variables. The paper concludes with its main findings and provides recommendations for rational planning of urban sustainable development and environmental protection in China.
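
    As a sketch of the estimation strategy, the fragment below fits a simple fixed-effects (province-dummy) panel regression on synthetic data. The paper ultimately uses a variable-coefficient panel model, so this shows only a common baseline step, and all variable names and values are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# toy panel: 10 provinces x 10 years, all values invented
rng = np.random.default_rng(8)
n_prov, n_yr = 10, 10
df = pd.DataFrame({
    "province": np.repeat(np.arange(n_prov), n_yr),
    "log_gdp": rng.normal(8.0, 1.0, n_prov * n_yr),
    "log_freight": rng.normal(6.0, 1.0, n_prov * n_yr),
})
df["log_co2"] = (0.5 * df["log_gdp"] + 0.3 * df["log_freight"]
                 + rng.normal(0.0, 0.1, len(df)))

# fixed-effects baseline via province dummies (within-style estimator)
fit = smf.ols("log_co2 ~ log_gdp + log_freight + C(province)", data=df).fit()
print(fit.params[["log_gdp", "log_freight"]])
```

    A variable-coefficient specification would instead let the slopes on log_gdp and log_freight differ by province, which is how the paper captures the regional differences it reports.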

  2. Determining Survey Satisficing of Online Longitudinal Survey Data in the Multicenter AIDS Cohort Study: A Group-Based Trajectory Analysis

    PubMed Central

    Di, Junrui; Li, Ying; Friedman, M Reuel; Reddy, Susheel; Surkan, Pamela J; Shoptaw, Steven

    2016-01-01

    Background Survey satisficing occurs when participants respond to survey questions rapidly without carefully reading or comprehending them. Studies have demonstrated the occurrence of survey satisficing, which can degrade survey quality, particularly in longitudinal studies. Objective The aim of this study is to use a group-based trajectory analysis method to identify satisficers when similar survey questions were asked periodically in a long-standing cohort, and to examine factors associated with satisficing in the surveys having sensitive human immunodeficiency virus (HIV)-related behavioral questions. Methods Behavioral data were collected semiannually online at all four sites of the Multicenter AIDS Cohort Study (MACS) from October 2008 through March 2013. Based on the start and end times, and the word counts per variable, response speed (word counts per second) for each participant visit was calculated. Two-step group-based trajectory analyses of the response speed across 9 study visits were performed to identify potential survey satisficing. Generalized linear models with repeated measures were used to investigate the factors associated with satisficing on HIV-related behavioral surveys. Results Among the total 2138 male participants, the median baseline age was 51 years (interquartile range, 45-58); most of the participants were non-Hispanic white (62.72%, 1341/2138) and college graduates (46.59%, 996/2138), and half were HIV seropositive (50.00%, 1069/2138). A total of 543 men (25.40%, 543/2138) were considered potential satisficers with respect to their increased trajectory tendency of response speed. In the multivariate analysis, being 10 years older at the baseline visit increased the odds of satisficing by 44% (OR 1.44, 95% CI 1.27-1.62, P<.001). Compared with the non-Hispanic white participants, non-Hispanic black participants were 122% more likely to satisfice the HIV-related behavioral survey (OR 2.22, 95% CI 1.69-2.91, P<.001), and 99% more likely

  3. Voice Disorder Management Competencies: A Survey of School-Based Speech-Language Pathologists in Nebraska

    ERIC Educational Resources Information Center

    Teten, Amy F.; DeVeney, Shari L.; Friehe, Mary J.

    2016-01-01

    Purpose: The purpose of this survey was to determine the self-perceived competence levels in voice disorders of practicing school-based speech-language pathologists (SLPs) and identify correlated variables. Method: Participants were 153 master's level, school-based SLPs with a Nebraska teaching certificate and/or licensure who completed a survey,…

  4. An empirical approach to predicting long term behavior of metal particle based recording media

    NASA Technical Reports Server (NTRS)

    Hadad, Allan S.

    1991-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O, resulting in acicular, polycrystalline, body centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed in order for iron particles to be considered a viable recording medium for long term archival (i.e., 25+ years) information storage. The primary means of establishing stability is through passivation, or controlled oxidation, of the iron particle's surface. Since iron particles used for magnetic recording are small, additional oxidation has a direct impact on performance, especially where archival storage of recorded information for long periods of time is important. Further stabilization chemistry/processes had to be developed to guarantee that iron particles could be considered a viable long term recording medium. In an effort to retard the diffusion of iron ions through the oxide layer, other elements such as silicon, aluminum, and chromium have been added to the base iron to promote denser scale formation, to alleviate some of the non-stoichiometric behavior of the oxide, or both. The presence of water vapor has been shown to disrupt the passive layer, subsequently increasing the oxidation rate of the iron. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity for silicon-containing iron particles between 50-120 deg C and 3-89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to predictive capability, is discussed.

  5. Intelligence in Bali--A Case Study on Estimating Mean IQ for a Population Using Various Corrections Based on Theory and Empirical Findings

    ERIC Educational Resources Information Center

    Rindermann, Heiner; te Nijenhuis, Jan

    2012-01-01

    A high-quality estimate of the mean IQ of a country requires giving a well-validated test to a nationally representative sample, which usually is not feasible in developing countries. So, we used a convenience sample and four corrections based on theory and empirical findings to arrive at a good-quality estimate of the mean IQ in Bali. Our study…

  6. Lessons Learned: Cultural and linguistic enhancement of surveys through community-based participatory research

    PubMed Central

    Formea, Christine M.; Mohamed, Ahmed A.; Hassan, Abdullahi; Osman, Ahmed; Weis, Jennifer A.; Sia, Irene G.; Wieland, Mark L.

    2014-01-01

    Background Surveys are frequently implemented in community-based participatory research (CBPR), but adaptation and translation of surveys can be logistically and methodologically challenging when working with immigrant and refugee populations. Objective To describe a process of participatory survey adaptation and translation. Methods Within an established CBPR partnership, a survey about diabetes was adapted for health literacy and local relevance and then translated through a process of forward translation, group deliberation, and back translation. Lessons Learned The group deliberation process was the most time-intensive and important component of the process. The process enhanced community ownership of the larger project while maximizing local applicability of the product. Conclusions A participatory process of survey adaptation and translation resulted in significant revisions to approximate semantic, cultural, and conceptual equivalence with the original surveys. This approach is likely to enhance community acceptance of the survey instrument during the implementation phase. PMID:25435559

  7. The dappled nature of causes of psychiatric illness: replacing the organic-functional/hardware-software dichotomy with empirically based pluralism.

    PubMed

    Kendler, K S

    2012-04-01

    Our tendency to see the world of psychiatric illness in dichotomous and opposing terms has three major sources: the philosophy of Descartes, the state of neuropathology in late nineteenth century Europe (when disorders were divided into those with and without demonstrable pathology and labeled, respectively, organic and functional), and the influential concept of computer functionalism wherein the computer is viewed as a model for the human mind-brain system (brain=hardware, mind=software). These mutually reinforcing dichotomies, which have had a pernicious influence on our field, make a clear prediction about how 'difference-makers' (aka causal risk factors) for psychiatric disorders should be distributed in nature. In particular, are psychiatric disorders like our laptops, which, when they malfunction, can be cleanly divided into those with software versus hardware problems? I propose 11 categories of difference-makers for psychiatric illness from molecular genetics through culture and review their distribution in schizophrenia, major depression and alcohol dependence. In no case do these distributions resemble that predicted by the organic-functional/hardware-software dichotomy. Instead, the causes of psychiatric illness are dappled, distributed widely across multiple categories. We should abandon Cartesian and computer-functionalism-based dichotomies as scientifically inadequate and an impediment to our ability to integrate the diverse information about psychiatric illness our research has produced. Empirically based pluralism provides a rigorous but dappled view of the etiology of psychiatric illness. Critically, it is based not on how we wish the world to be but on how the difference-makers for psychiatric illness are in fact distributed.

  8. An improved empirical model of electron and ion fluxes at geosynchronous orbit based on upstream solar wind conditions

    DOE PAGES

    Denton, M. H.; Henderson, M. G.; Jordanova, V. K.; ...

    2016-07-01

    In this study, a new empirical model of the electron fluxes and ion fluxes at geosynchronous orbit (GEO) is introduced, based on observations by Los Alamos National Laboratory (LANL) satellites. The model provides flux predictions in the energy range ~1 eV to ~40 keV, as a function of local time, energy, and the strength of the solar wind electric field (the negative product of the solar wind speed and the z component of the magnetic field). Given appropriate upstream solar wind measurements, the model provides a forecast of the fluxes at GEO with a ~1 h lead time. Model predictions are tested against in-sample observations from LANL satellites and also against out-of-sample observations from the Compact Environmental Anomaly Sensor II detector on the AMC-12 satellite. The model does not reproduce all structure seen in the observations. However, for the intervals studied here (quiet and storm times) the normalized root-mean-square deviation < ~0.3. It is intended that the model will improve forecasting of the spacecraft environment at GEO and also provide improved boundary/input conditions for physical models of the magnetosphere.
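
    A sketch of the model's driver and lookup logic: the solar wind electric-field parameter is the negative product of speed and Bz (the formula stated in the abstract), and fluxes are then read from a climatology binned by local time, energy, and driver strength. The binning scheme and placeholder table below are entirely hypothetical; only the driver formula follows the text.

```python
import numpy as np

def solar_wind_driver(v_kms, bz_nt):
    """Solar wind electric-field driver: the negative product of speed
    and Bz. With V in km/s and Bz in nT the result is in mV/m;
    southward IMF (Bz < 0) gives positive driving."""
    return -v_kms * bz_nt * 1e-3

def predicted_flux(local_time_h, energy_kev, driver_mvm, table):
    """Read a flux from a (local time x energy x driver) climatology.
    The bin edges and the table itself are hypothetical placeholders."""
    lt_bin = int(local_time_h) % 24
    en_bin = int(np.clip(np.log10(energy_kev) + 3, 0, 4))  # ~1 eV-40 keV
    dr_bin = int(np.clip(driver_mvm // 2, 0, 4))           # 2 mV/m bins
    return table[lt_bin, en_bin, dr_bin]

# toy usage
table = np.ones((24, 5, 5))              # placeholder flux climatology
e_sw = solar_wind_driver(450.0, -5.0)    # 2.25 mV/m
print(e_sw, predicted_flux(14.5, 10.0, e_sw, table))
```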

  9. Empirically Based Profiles of the Early Literacy Skills of Children With Language Impairment in Early Childhood Special Education.

    PubMed

    Justice, Laura; Logan, Jessica; Kaderavek, Joan; Schmitt, Mary Beth; Tompkins, Virginia; Bartlett, Christopher

    2015-01-01

    The purpose of this study was to empirically determine whether specific profiles characterize preschool-aged children with language impairment (LI) with respect to their early literacy skills (print awareness, name-writing ability, phonological awareness, alphabet knowledge); the primary interest was to determine if one or more profiles suggested vulnerability for future reading problems. Participants were 218 children enrolled in early childhood special education classrooms, 95% of whom received speech-language services. Children were administered an assessment of early literacy skills in the fall of the academic year. Based on results of latent profile analysis, four distinct literacy profiles were identified, with the single largest profile (55% of children) representing children with generally poor literacy skills across all areas examined. Children in the two low-risk categories had higher oral language skills than those in the high-risk and moderate-risk profiles. Across three of the four early literacy measures, children with language as their primary disability had higher scores than those with LI concomitant with other disabilities. These findings indicate that there are specific profiles of early literacy skills among children with LI, with about one half of children exhibiting a profile indicating potential susceptibility for future reading problems.

  10. An improved empirical model of electron and ion fluxes at geosynchronous orbit based on upstream solar wind conditions

    SciTech Connect

    Denton, M. H.; Henderson, M. G.; Jordanova, V. K.; Thomsen, M. F.; Borovsky, J. E.; Woodroffe, J.; Hartley, D. P.; Pitchford, D.

    2016-07-01

    In this study, a new empirical model of the electron fluxes and ion fluxes at geosynchronous orbit (GEO) is introduced, based on observations by Los Alamos National Laboratory (LANL) satellites. The model provides flux predictions in the energy range ~1 eV to ~40 keV, as a function of local time, energy, and the strength of the solar wind electric field (the negative product of the solar wind speed and the z component of the magnetic field). Given appropriate upstream solar wind measurements, the model provides a forecast of the fluxes at GEO with a ~1 h lead time. Model predictions are tested against in-sample observations from LANL satellites and also against out-of-sample observations from the Compact Environmental Anomaly Sensor II detector on the AMC-12 satellite. The model does not reproduce all structure seen in the observations. However, for the intervals studied here (quiet and storm times) the normalized root-mean-square deviation < ~0.3. It is intended that the model will improve forecasting of the spacecraft environment at GEO and also provide improved boundary/input conditions for physical models of the magnetosphere.

  11. Combined magnetic and kinetic control of advanced tokamak steady state scenarios based on semi-empirical modelling

    NASA Astrophysics Data System (ADS)

    Moreau, D.; Artaud, J. F.; Ferron, J. R.; Holcomb, C. T.; Humphreys, D. A.; Liu, F.; Luce, T. C.; Park, J. M.; Prater, R.; Turco, F.; Walker, M. L.

    2015-06-01

    This paper shows that semi-empirical data-driven models based on a two-time-scale approximation for the magnetic and kinetic control of advanced tokamak (AT) scenarios can be advantageously identified from simulated rather than real data, and used for control design. The method is applied to the combined control of the safety factor profile, q(x), and normalized pressure parameter, βN, using DIII-D parameters and actuators (on-axis co-current neutral beam injection (NBI) power, off-axis co-current NBI power, electron cyclotron current drive power, and ohmic coil). The approximate plasma response model was identified from simulated open-loop data obtained using a rapidly converging plasma transport code, METIS, which includes an MHD equilibrium and current diffusion solver, and combines plasma transport nonlinearity with 0D scaling laws and 1.5D ordinary differential equations. The paper discusses the results of closed-loop METIS simulations, using the near-optimal ARTAEMIS control algorithm (Moreau D et al 2013 Nucl. Fusion 53 063020) for steady state AT operation. With feedforward plus feedback control, the steady state target q-profile and βN are satisfactorily tracked with a time scale of about 10 s, despite large disturbances applied to the feedforward powers and plasma parameters. The robustness of the control algorithm with respect to disturbances of the H&CD actuators and of plasma parameters such as the H-factor, plasma density and effective charge, is also shown.

  12. Empirically-Based Crop Insurance for China: A Pilot Study in the Down-middle Yangtze River Area of China

    NASA Astrophysics Data System (ADS)

    Wang, Erda; Yu, Yang; Little, Bertis B.; Chen, Zhongxin; Ren, Jianqiang

    Factors that caused slow growth in crop insurance participation and its ultimate failure in China were multi-faceted, including high agricultural production risk, low participation rates, inadequate public awareness, high loss ratios, and insufficient and interrupted government financial support. Thus, a clear and present need for data-driven analyses and empirically-based risk management exists in China. In the present investigation, agricultural production data for two crops (corn, rice) in five counties of Jiangxi and Hunan provinces were used to design a pilot crop insurance program for China. The program (1) provides 75% coverage, (2) offers a 55% premium rate reduction for the farmer compared to the catastrophic coverage most recently offered, and (3) uses the currently approved governmental premium subsidy level. Thus a safety net for Chinese farmers that helps maintain agricultural production at a level of self-sufficiency, and that costs less than half as much as current plans, requires one change to the program: ≥80% of producers must participate in an area.

  13. New insights into Hoogsteen base pairs in DNA duplexes from a structure-based survey

    PubMed Central

    Zhou, Huiqing; Hintze, Bradley J.; Kimsey, Isaac J.; Sathyamoorthy, Bharathwaj; Yang, Shan; Richardson, Jane S.; Al-Hashimi, Hashim M.

    2015-01-01

    Hoogsteen (HG) base pairs (bps) provide an alternative pairing geometry to Watson–Crick (WC) bps and can play unique functional roles in duplex DNA. Here, we use structural features unique to HG bps (syn purine base, HG hydrogen bonds and constricted C1′–C1′ distance across the bp) to search for HG bps in X-ray structures of DNA duplexes in the Protein Data Bank. The survey identifies 106 A•T and 34 G•C HG bps in DNA duplexes, many of which are undocumented in the literature. It also uncovers HG-like bps with syn purines lacking HG hydrogen bonds or constricted C1′–C1′ distances that are analogous to conformations that have been proposed to populate the WC-to-HG transition pathway. The survey reveals HG preferences similar to those observed for transient HG bps in solution by nuclear magnetic resonance, including stronger preferences for A•T versus G•C bps, TA versus GG steps, and also suggests enrichment at terminal ends with a preference for 5′-purine. HG bps induce small local perturbations in neighboring bps and, surprisingly, a small but significant degree of DNA bending (∼14°) directed toward the major groove. The survey provides insights into the preferences and structural consequences of HG bps in duplex DNA. PMID:25813047
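    The geometric screening criteria translate naturally into code. The sketch below classifies a base pair from two of the quoted features; the numeric cutoffs (syn glycosidic torsion near 0°, C1′-C1′ below ~9 Å versus ~10.5 Å for Watson-Crick) are approximations supplied for illustration, not the survey's exact thresholds.

        def classify_bp(chi_deg, c1c1_ang):
            # chi_deg: purine glycosidic torsion (degrees); syn is roughly
            # -90..+90 and anti roughly 180 +/- 90. c1c1_ang: C1'-C1'
            # distance in angstroms. Cutoffs are illustrative only.
            syn = -90.0 <= ((chi_deg + 180.0) % 360.0) - 180.0 <= 90.0
            constricted = c1c1_ang < 9.0
            if syn and constricted:
                return "Hoogsteen-like"
            if syn:
                return "syn purine, non-constricted (possible WC-to-HG intermediate)"
            return "Watson-Crick-like"

        print(classify_bp(chi_deg=60.0, c1c1_ang=8.5))   # Hoogsteen-like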

  14. Fall 1994 wildlife and vegetation survey, Norton Air Force Base, California

    SciTech Connect

    Not Available

    1994-12-15

    The fall 1994 wildlife and vegetation surveys were completed October 3-7, 1994, at Norton Air Force Base (AFB), California. Two biologists from CDM Federal Programs, the U.S. Environmental Protection Agency (EPA) regional biologist and the Oak Ridge National Laboratory (ORNL) lead biologist conducted the surveys. A habitat assessment of three Installation Restoration Project (IRP) sites at Norton Air Force Base was also completed during the fall survey period. The IRP sites include: Landfill No. 2 (Site 2); the Industrial Wastewater Treatment Plant (IWTP) area; and Former Fire Training Area No. 1 (Site 5). The assessments were designed to qualitatively characterize the sites of concern, identify potential ecological receptors, and provide information for Remedial Design/Remedial Action activities. A Reference Area (Santa Ana River Wash) and the base urban areas were also characterized. The reference area assessment was performed to provide a baseline for comparison with the IRP site habitats. The fall 1994 survey is the second of up to four surveys that may be completed. In order to develop a complete understanding of all plant and animal species using the base, these surveys were planned to be conducted over four seasons. Species composition can vary widely during the course of a year in Southern California, and therefore, seasonal surveys will provide the most complete and reliable data to address changes in habitat structure and wildlife use of the site. Subsequent surveys will focus on seasonal wildlife observations and a spring vegetation survey.

  15. Discussion on climate oscillations: CMIP5 general circulation models versus a semi-empirical harmonic model based on astronomical cycles

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola

    2013-11-01

Power spectra of global surface temperature (GST) records (available since 1850) reveal major periodicities at about 9.1, 10-11, 19-22 and 59-62 years. Equivalent oscillations are found in numerous multisecular paleoclimatic records. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC Fifth Assessment Report (AR5, 2013), are analyzed and found unable to reconstruct this variability. In particular, from 2000 to 2013.5 a GST plateau is observed while the GCMs predicted a warming rate of about 2 °C/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations more accurately fits the GST records at multiple time scales. For example, a quasi 60-year natural oscillation simultaneously explains the 1850-1880, 1910-1940 and 1970-2000 warming periods, the 1880-1910 and 1940-1970 cooling periods and the post 2000 GST plateau. This hypothesis implies that about 50% of the ~ 0.5 °C global surface warming observed from 1970 to 2000 was due to natural oscillations of the climate system, not to anthropogenic forcing as modeled by the CMIP3 and CMIP5 GCMs. Consequently, the climate sensitivity to CO2 doubling should be reduced by half, for example from the 2.0-4.5 °C range (as claimed by the IPCC, 2007) to 1.0-2.3 °C with a likely median of ~ 1.5 °C instead of ~ 3.0 °C. Also, modern paleoclimatic temperature reconstructions showing a larger preindustrial variability than the hockey-stick shaped temperature reconstructions developed in the early 2000s imply a weaker anthropogenic effect and a stronger solar contribution to climatic changes. The observed natural oscillations could be driven by astronomical forcings. The ~ 9.1 year oscillation appears to be a combination of long soli-lunar tidal oscillations, while quasi 10-11, 20 and 60 year oscillations are typically found among major solar and heliospheric oscillations driven mostly by Jupiter and Saturn movements. Solar models based

  16. Autonomous and Remote-Controlled Airborne and Ground-Based Robotic Platforms for Adaptive Geophysical Surveying

    NASA Astrophysics Data System (ADS)

    Spritzer, J. M.; Phelps, G. A.

    2011-12-01

Low-cost autonomous and remote-controlled robotic platforms have opened the door to precision-guided geophysical surveying. Over the past two years, the U.S. Geological Survey, Senseta, NASA Ames Research Center, and Carnegie Mellon University Silicon Valley have developed and deployed small autonomous and remotely controlled vehicles for geophysical investigations. The purpose of this line of investigation is to 1) increase the analytical capability, resolution, and repeatability, and 2) decrease the time, and potentially the cost and man-power, necessary to conduct near-surface geophysical surveys. Current technology has advanced to the point where vehicles can perform geophysical surveys autonomously, freeing the geoscientist to process and analyze the incoming data in near-real time. This has enabled geoscientists to monitor survey parameters; process, analyze and interpret the incoming data; and test geophysical models in the same field session. This new approach, termed adaptive surveying, provides the geoscientist with choices of how the remainder of the survey should be conducted. Autonomous vehicles follow pre-programmed survey paths, which can be utilized to easily repeat surveys on the same path over large areas without the operator fatigue and error that plague man-powered surveys. While initial deployments with autonomous systems required a larger field crew than a man-powered survey, operational experience will decrease costs and man-power requirements over time. Using a low-cost, commercially available chassis as the base for autonomous surveying robotic systems promises to provide higher precision and efficiency than human-powered techniques. An experimental survey successfully demonstrated the adaptive techniques described. A magnetic sensor was mounted on a small rover, which autonomously drove a prescribed course designed to provide an overview of the study area. Magnetic data were relayed to the base station periodically, processed and gridded. A

  17. An Empirical Kaiser Criterion.

    PubMed

    Braeken, Johan; van Assen, Marcel A L M

    2016-03-31

In exploratory factor analysis (EFA), most popular methods for dimensionality assessment, such as the scree plot, the Kaiser criterion, or, the current gold standard, parallel analysis, are based on eigenvalues of the correlation matrix. To further the understanding and development of factor retention methods, results on population and sample eigenvalue distributions are introduced based on random matrix theory and Monte Carlo simulations. These results are used to develop a new factor retention method, the Empirical Kaiser Criterion. The performance of the Empirical Kaiser Criterion and parallel analysis is examined in typical research settings, with multiple scales that are desired to be relatively short, but still reliable. Theoretical and simulation results illustrate that the new Empirical Kaiser Criterion performs as well as parallel analysis in typical research settings with uncorrelated scales, but much better when scales are both correlated and short. We conclude that the Empirical Kaiser Criterion is a powerful and promising factor retention method, because it is based on distribution theory of eigenvalues, shows good performance, is easily visualized and computed, and is useful for power analysis and sample size planning for EFA.
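    For readers who want to experiment, the sketch below implements the reference-eigenvalue comparison as we understand the published criterion: each sample eigenvalue is compared with a reference that combines the asymptotic null value (1 + sqrt(p/n))² with the variance already absorbed by earlier factors, floored at 1. Treat the exact formula as an assumption to verify against the paper.

        import numpy as np

        def empirical_kaiser(eigenvalues, n, p):
            # n: sample size; p: number of variables. Eigenvalues are those
            # of the p x p correlation matrix. Formula hedged: check the
            # original paper before relying on this exact form.
            l = np.sort(np.asarray(eigenvalues))[::-1]
            retained = 0
            for j in range(p):
                ref = max((p - l[:j].sum()) / (p - j) * (1 + np.sqrt(p / n)) ** 2, 1.0)
                if l[j] > ref:
                    retained += 1
                else:
                    break
            return retained

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 10))
        eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))
        print(empirical_kaiser(eig, n=500, p=10))   # expect 0 for pure noise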

  18. Detection of sea otters in boat-based surveys of Prince William Sound, Alaska

    USGS Publications Warehouse

    Udevitz, Mark S.; Bodkin, James L.; Costa, Daniel P.

    1995-01-01

    Boat-based surveys have been commonly used to monitor sea otter populations, but there has been little quantitative work to evaluate detection biases that may affect these surveys. We used ground-based observers to investigate sea otter detection probabilities in a boat-based survey of Prince William Sound, Alaska. We estimated that 30% of the otters present on surveyed transects were not detected by boat crews. Approximately half (53%) of the undetected otters were missed because the otters left the transects, apparently in response to the approaching boat. Unbiased estimates of detection probabilities will be required for obtaining unbiased population estimates from boat-based surveys of sea otters. Therefore, boat-based surveys should include methods to estimate sea otter detection probabilities under the conditions specific to each survey. Unbiased estimation of detection probabilities with ground-based observers requires either that the ground crews detect all of the otters in observed subunits, or that there are no errors in determining which crews saw each detected otter. Ground-based observer methods may be appropriate in areas where nearly all of the sea otter habitat is potentially visible from ground-based vantage points.
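    A back-of-envelope version of the implied correction, using hypothetical counts: if ground crews see everything on calibration subunits, the boat crew's detection probability and a corrected abundance follow directly.

        def estimate_p(boat_seen, ground_total):
            # Detection probability from double-observed calibration
            # subunits (ground counts assumed complete, as the abstract
            # requires). Numbers below are hypothetical.
            return boat_seen / ground_total

        p = estimate_p(boat_seen=70, ground_total=100)   # 30% missed
        survey_count = 1234                               # hypothetical raw count
        print(survey_count / p)                           # corrected estimate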

  19. Teaching Margery and Julian in Anthology-Based Survey Courses

    ERIC Educational Resources Information Center

    Petersen, Zina

    2006-01-01

    Recognizing that many of us teach the medieval English women mystics Margery Kempe and Julian of Norwich in survey courses, this essay attempts to put these writers in context for teachers who may have only a passing familiarity with the period. Focusing on passages of their writings found in the Longman and Norton anthologies of British…

  20. School/Community-Based Alcoholism/Substance Abuse Prevention Survey.

    ERIC Educational Resources Information Center

    Owan, Tom Choken; And Others

    This report describes school and community efforts to prevent alcoholism and substance abuse among American Indian and Alaskan Native youth. In 1986, the Indian Health Service (IHS) surveyed Bureau of Indian Affairs schools, public schools with large Indian enrollments, and community groups involved in 225 IHS-funded alcohol and substance abuse…

  1. Six-Month Results for the Kelly Air Force Base Compressed Work Week Survey

    DTIC Science & Technology

    1993-07-01

[Extraction fragment from the DTIC report documentation page and table of contents. Recoverable subject terms: Air Force Base workers, compressed work week (CWS), attitude survey, lifestyle; appendices: (A) The Kelly AFB Attitude Survey, (B) Responses to the Lifestyle and Job Related Questions (1-91) on the Survey.] ...on CWS for a 6-month period. There are few published studies regarding the impact of CWS on the lifestyle or quality of life of the employee

  2. Frequency recognition in an SSVEP-based brain computer interface using empirical mode decomposition and refined generalized zero-crossing.

    PubMed

    Wu, Chi-Hsun; Chang, Hsiang-Chih; Lee, Po-Lei; Li, Kuen-Shing; Sie, Jyun-Jie; Sun, Chia-Wei; Yang, Chia-Yen; Li, Po-Hung; Deng, Hua-Ting; Shyu, Kuo-Kai

    2011-03-15

This paper presents an empirical mode decomposition (EMD) and refined generalized zero crossing (rGZC) approach to achieve frequency recognition in steady-state visual evoked potential (SSVEP)-based brain computer interfaces (BCIs). Six light emitting diode (LED) flickers with high flickering rates (30, 31, 32, 33, 34, and 35 Hz) functioned as visual stimulators to induce the subjects' SSVEPs. EEG signals recorded in the Oz channel were segmented into data epochs (0.75 s). Each epoch was then decomposed into a series of oscillation components, representing fine-to-coarse information of the signal, called intrinsic mode functions (IMFs). The instantaneous frequencies in each IMF were calculated by refined generalized zero-crossing (rGZC). IMFs with mean instantaneous frequencies (f(GZC)) between 29.5 Hz and 35.5 Hz (i.e., 29.5≤f(GZC)≤35.5 Hz) were designated as SSVEP-related IMFs. Due to the time-locked and phase-locked characteristics of SSVEP, the induced SSVEPs had the same frequency as the visual stimulator being gazed at. The LED flicker that contributed the majority of the frequency content in SSVEP-related IMFs was chosen as the gaze target. This study tested the proposed system in five male subjects (mean age=25.4±2.07 y/o). Each subject attempted to activate four virtual commands by inputting a sequence of cursor commands on an LCD screen. The average information transfer rate (ITR) and accuracy were 36.99 bits/min and 84.63%. This study demonstrates that EMD is capable of extracting SSVEP data in an SSVEP-based BCI system.
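    A minimal sketch of the IMF-selection step, assuming the PyEMD package (pip install EMD-signal) and a crude zero-crossing frequency estimate in place of the paper's refined generalized zero-crossing (rGZC) estimator:

        import numpy as np
        from PyEMD import EMD  # assumed dependency; pip install EMD-signal

        def ssvep_related_imfs(epoch, fs, band=(29.5, 35.5)):
            # Decompose one EEG epoch with EMD and keep IMFs whose mean
            # zero-crossing frequency falls inside the stimulation band.
            imfs = EMD()(np.asarray(epoch, dtype=float))
            keep = []
            for imf in imfs:
                zc = np.sum(np.diff(np.signbit(imf)) != 0)   # zero crossings
                f_mean = zc * fs / (2.0 * len(imf))          # ~mean frequency
                if band[0] <= f_mean <= band[1]:
                    keep.append(imf)
            return keep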

  3. Single-trial analysis of cortical oscillatory activities during voluntary movements using empirical mode decomposition (EMD)-based spatiotemporal approach.

    PubMed

    Lee, Po-Lei; Shang, Li-Zen; Wu, Yu-Te; Shu, Chih-Hung; Hsieh, Jen-Chuen; Lin, Yung-Yang; Wu, Chi-Hsun; Liu, Yu-Lu; Yang, Chia-Yen; Sun, Chia-Wei; Shyu, Kuo-Kai

    2009-08-01

This study presents a method based on empirical mode decomposition (EMD) and a spatial template-based matching approach to extract sensorimotor oscillatory activities from multi-channel magnetoencephalographic (MEG) measurements during right index finger lifting. The longitudinal gradiometer of the sensor unit presenting the most prominent somatosensory evoked field (SEF) was selected, and each single-trial recording on that channel was decomposed into a set of intrinsic mode functions (IMFs). The correlations between each IMF of the selected channel and the raw data on the other channels were computed and represented as a spatial map. Sensorimotor-related IMFs, whose correlational spatial maps exhibited large values over the primary sensorimotor area (SMI), were selected via a spatial-template matching process. Trial-specific alpha and beta bands were determined in the sensorimotor-related oscillatory activities using a two-spectrum comparison between the spectra obtained from the baseline period (-4 to -3 s) and the movement-onset period (-0.5 to 0.5 s). Sensorimotor-related oscillatory activities were filtered within the trial-specific frequency bands to resolve task-related oscillatory activities. Results demonstrated that the optimal phase and amplitude information were preserved, not only for alpha suppression (event-related desynchronization) and beta rebound (event-related synchronization) but also for profound analysis of subtle dynamics across trials. The retention of high SNR in the extracted oscillatory activities allows various source estimation methods to be applied to study the intricate brain dynamics of motor control mechanisms. The present study enables investigation of the cortical pathophysiology of movement disorders on a trial-by-trial basis and offers an effective alternative for participants or patients who cannot endure lengthy procedures or sustain long experiments.

  4. The ethics of feedback of HIV test results in population-based surveys of HIV infection.

    PubMed

    Maher, Dermot

    2013-12-01

    Population-based disease prevalence surveys raise ethical questions, including whether participants should be routinely told their test results. Ethical guidelines call for informing survey participants of any clinically relevant finding to enable appropriate management. However, in anonymous surveys of human immunodeficiency virus (HIV) infection, participants can "opt out" of being given their test results or are offered the chance to undergo voluntary HIV testing in local counselling and testing services. This is aimed at minimizing survey participation bias. Those who opt out of being given their HIV test results and who do not seek their results miss the opportunity to receive life-saving antiretroviral therapy. The justification for HIV surveys without routine feedback of results to participants is based on a public health utility argument: that the benefits of more rigorous survey methods - reduced participation bias - outweigh the benefits to individuals of knowing their HIV status. However, people with HIV infection have a strong immediate interest in knowing their HIV status. In consideration of the ethical value of showing respect for people and thereby alleviating suffering, an argument based on public health utility is not an appropriate justification. In anonymous HIV surveys as well as other prevalence surveys of treatable conditions in any setting, participation should be on the basis of routine individual feedback of results as an integral part of fully informed participation. Ensuring that surveys are ethically sound may stimulate participation, increase a broader uptake of HIV testing and reduce stigmatization of people who are HIV-positive.

  5. Assessing changes to South African maize production areas in 2055 using empirical and process-based crop models

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2010-12-01

    Rising temperatures and altered precipitation patterns associated with climate change pose a significant threat to crop production, particularly in developing countries. In South Africa, a semi-arid country with a diverse agricultural sector, anthropogenic climate change is likely to affect staple crops and decrease food security. Here, we focus on maize production, South Africa’s most widely grown crop and one with high socio-economic value. We build on previous coarser-scaled studies by working at a finer spatial resolution and by employing two different modeling approaches: the process-based DSSAT Cropping System Model (CSM, version 4.5), and an empirical distribution model (Maxent). For climate projections, we use an ensemble of 10 general circulation models (GCMs) run under both high and low CO2 emissions scenarios (SRES A2 and B1). The models were down-scaled to historical climate records for 5838 quinary-scale catchments covering South Africa (mean area = 164.8 km2), using a technique based on self-organizing maps (SOMs) that generates precipitation patterns more consistent with observed gradients than those produced by the parent GCMs. Soil hydrological and mechanical properties were derived from textural and compositional data linked to a map of 26422 land forms (mean area = 46 km2), while organic carbon from 3377 soil profiles was mapped using regression kriging with 8 spatial predictors. CSM was run using typical management parameters for the several major dryland maize production regions, and with projected CO2 values. The Maxent distribution model was trained using maize locations identified using annual phenology derived from satellite images coupled with airborne crop sampling observations. Temperature and precipitation projections were based on GCM output, with an additional 10% increase in precipitation to simulate higher water-use efficiency under future CO2 concentrations. The two modeling approaches provide spatially explicit projections of

  6. A survey of surveys

    SciTech Connect

    Kent, S.M.

    1994-11-01

A new era for the field of Galactic structure is about to be opened with the advent of wide-area digital sky surveys. In this article, the author reviews the status and prospects for research for three new ground-based surveys: the Sloan Digital Sky Survey (SDSS), the Deep Near-Infrared Survey of the Southern Sky (DENIS) and the Two Micron All Sky Survey (2MASS). These surveys will permit detailed studies of Galactic structure and stellar populations in the Galaxy with unprecedented detail. Extracting the information, however, will be challenging.

  7. Libraries and Desktop Storage Options: Results of a Web-Based Survey.

    ERIC Educational Resources Information Center

    Hendricks, Arthur; Wang, Jian

    2002-01-01

    Reports the results of a Web-based survey that investigated what plans, if any, librarians have for dealing with the expected obsolescence of the floppy disk and still retain effective library service. Highlights include data storage options, including compact disks, zip disks, and networked storage products; and a copy of the Web survey.…

  8. The VIMOS Public Extragalactic Redshift Survey (VIPERS). PCA-based automatic cleaning and reconstruction of survey spectra

    NASA Astrophysics Data System (ADS)

    Marchetti, A.; Garilli, B.; Granett, B. R.; Guzzo, L.; Iovino, A.; Scodeggio, M.; Bolzonella, M.; de la Torre, S.; Abbas, U.; Adami, C.; Bottini, D.; Cappi, A.; Cucciati, O.; Davidzon, I.; Franzetti, P.; Fritz, A.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; Polletta, M.; Pollo, A.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Arnouts, S.; Bel, J.; Branchini, E.; Coupon, J.; De Lucia, G.; Ilbert, O.; Moutard, T.; Moscardini, L.; Zamorani, G.

    2017-03-01

Context. Identifying spurious reduction artefacts in galaxy spectra is a challenge for large surveys. Aims: We present an algorithm for identifying and repairing spurious residual features in sky-subtracted galaxy spectra by using data from the VIMOS Public Extragalactic Redshift Survey (VIPERS) as a test case. Methods: The algorithm uses principal component analysis (PCA) applied to the galaxy spectra in the observed frame to identify sky line residuals imprinted at characteristic wavelengths. We further model the galaxy spectra in the rest-frame using PCA to estimate the most probable continuum in the corrupted spectral regions, which are then repaired. Results: We apply the method to 90 000 spectra from the VIPERS survey and compare the results with a subset for which careful editing was performed by hand. We find that the automatic technique reproduces the time-consuming manual cleaning in a uniform and objective manner across a large data sample. The mask data products produced in this work are released together with the VIPERS second public data release (PDR-2). Based on observations collected at the European Southern Observatory, Cerro Paranal, Chile, using the Very Large Telescope under programs 182.A-0886 and partly 070.A-9007. Also based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA at the Canada-France-Hawaii Telescope (CFHT), that is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at TERAPIX and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, which is a collaborative project of NRC and CNRS. The VIPERS web site is http://www.vipers.inaf.it/.
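    A minimal sketch of the repair step, assuming an array of rest-frame spectra (n_spectra x n_pixels) and a boolean mask of pixels corrupted by sky-line residuals; a low-rank PCA reconstruction stands in for the corrupted pixels. This is not the VIPERS pipeline itself, and fitting the PCA on the corrupted sample is a simplification.

        import numpy as np
        from sklearn.decomposition import PCA

        def repair(spectra, bad, n_components=8):
            # Fit a PCA continuum model on the sample and replace corrupted
            # pixels with the low-rank reconstruction.
            pca = PCA(n_components=n_components).fit(spectra)
            model = pca.inverse_transform(pca.transform(spectra))
            repaired = spectra.copy()
            repaired[bad] = model[bad]
            return repaired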

  9. Motion Trajectories for Wide-area Surveying with a Rover-based Distributed Spectrometer

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Anderson, Gary; Wilson, Edmond

    2006-01-01

A mobile ground survey application that employs remote sensing as a primary means of area coverage is highlighted. It is distinguished from mobile robotic area coverage problems that employ contact or proximity-based sensing. The focus is on a specific concept for performing mobile surveys in search of biogenic gases on planetary surfaces using a distributed spectrometer -- a rover-based instrument designed for wide measurement coverage of promising search areas. Navigation algorithms for executing circular and spiral survey trajectories are presented for wide-area distributed spectroscopy and evaluated based on area covered and distance traveled.
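    For concreteness, a spiral survey path can be generated as waypoints on an Archimedean spiral; the sketch below is a generic construction, not the paper's navigation algorithm, and all parameter names are ours.

        import numpy as np

        def spiral_waypoints(center, spacing, n_turns, pts_per_turn=36):
            # `spacing` is the radial gap between successive loops, i.e. the
            # desired line spacing of the survey.
            theta = np.linspace(0, 2 * np.pi * n_turns, n_turns * pts_per_turn)
            r = spacing * theta / (2 * np.pi)
            x = center[0] + r * np.cos(theta)
            y = center[1] + r * np.sin(theta)
            return np.column_stack([x, y])

        print(spiral_waypoints((0.0, 0.0), spacing=2.0, n_turns=3).shape)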

  10. Disk Luminosity Function Based on the Lowell Proper Motion Survey

    NASA Astrophysics Data System (ADS)

    Kim, Mee-Jeong; Lee, Sang-Gak

    1991-12-01

The disk stellar luminosity function has been derived from stars in the Lowell Proper Motion Survey, which contains about 9000 stars with μ ≥ 0.27 arcsec/yr and 8 < m_pg < 17, together with bright stars in the Smithsonian Astrophysical Observatory (SAO) Star Catalogue. The luminosity function was obtained from stars within 20 pc by Luyten's mean-absolute-magnitude method, using the reduced proper motion diagram to select disk stars. Magnitudes and colors in the SAO Star Catalogue as well as in the Lowell Proper Motion Survey have been transformed to the UBV system from the published UBV data. It has been found that stars with proper motion higher than the original limit of the proper motion survey are missed when the relation between absolute magnitude and reduced proper motion is applied to sample stars without considering the dispersion in magnitude. Correction factors for missing stars have been estimated according to their limits of proper motion, which depend on the absolute magnitude. The resulting luminosity function shows Wielen's dip at M_B ~ 10 and a systematic enhancement of stars, on average about Δ log Φ(M_B) ~ 0.2, compared with Luyten's luminosity function.
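    For reference, the reduced proper motion used to select disk stars combines apparent magnitude and proper motion; a minimal helper using the standard definition (not specific to this paper):

        import numpy as np

        def reduced_proper_motion(m, mu_arcsec_per_yr):
            # H = m + 5*log10(mu) + 5, the standard definition; with
            # photographic magnitudes this is H_pg.
            return m + 5.0 * np.log10(mu_arcsec_per_yr) + 5.0

        print(reduced_proper_motion(12.0, 0.27))   # ~14.2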

  11. Scrutinizing a Survey-Based Measure of Science and Mathematics Teacher Knowledge: Relationship to Observations of Teaching Practice

    NASA Astrophysics Data System (ADS)

    Talbot, Robert M.

    2016-12-01

    There is a clear need for valid and reliable instrumentation that measures teacher knowledge. However, the process of investigating and making a case for instrument validity is not a simple undertaking; rather, it is a complex endeavor. This paper presents the empirical case of one aspect of such an instrument validation effort. The particular instrument under scrutiny was developed in order to determine the effect of a teacher education program on novice science and mathematics teachers' strategic knowledge (SK). The relationship between novice science and mathematics teachers' SK as measured by a survey and their SK as inferred from observations of practice using a widely used observation protocol is the subject of this paper. Moderate correlations between parts of the observation-based construct and the SK construct were observed. However, the main finding of this work is that the context in which the measurement is made (in situ observations vs. ex situ survey) is an essential factor in establishing the validity of the measurement itself.

  12. Predicting Student Performance in Web-Based Distance Education Courses Based on Survey Instruments Measuring Personality Traits and Technical Skills

    ERIC Educational Resources Information Center

    Hall, Michael

    2008-01-01

Two common web-based surveys, "Is Online Learning Right for Me?" and "What Technical Skills Do I Need?", were combined into a single survey instrument and given to 228 on-campus and 83 distance education students. The students were enrolled in four different classes (business, computer information services, criminal justice, and…

  13. Modeling magnetic fields from a DC power cable buried beneath San Francisco Bay based on empirical measurements

    SciTech Connect

    Kavet, Robert; Wyman, Megan T.; Klimley, A. Peter; Carretero, Luis

    2016-02-25

Here, the Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable’s path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable’s contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured values. The
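    A sketch of the "ideal" forward model the profiles are fit against: the field of an infinite straight wire summed vectorially with a uniform geomagnetic field, evaluated along a crossing line. Geometry, sign conventions, and the numbers below are our illustrative assumptions, not the paper's fitted values.

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

        def field_profile(x, depth, current, b_earth):
            # x: horizontal offsets from the cable (m); depth: cable depth
            # below the sensor (m); b_earth: ambient geomagnetic vector (T).
            # The cable is treated as an infinite straight wire along y.
            r2 = x**2 + depth**2
            bx = -MU0 * current * depth / (2 * np.pi * r2)
            bz = MU0 * current * x / (2 * np.pi * r2)
            return np.sqrt((b_earth[0] + bx)**2 + b_earth[1]**2
                           + (b_earth[2] + bz)**2)

        x = np.linspace(-100, 100, 401)
        b = field_profile(x, depth=10.0, current=1000.0,
                          b_earth=(2e-5, 0.0, 4.3e-5))
        print("anomaly (max - min):", b.max() - b.min(), "T")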

  14. Modeling Magnetic Fields from a DC Power Cable Buried Beneath San Francisco Bay Based on Empirical Measurements

    PubMed Central

    Kavet, Robert; Wyman, Megan T.; Klimley, A. Peter

    2016-01-01

    The Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable’s path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable’s contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured values. The

  15. Modeling magnetic fields from a DC power cable buried beneath San Francisco Bay based on empirical measurements

    DOE PAGES

    Kavet, Robert; Wyman, Megan T.; Klimley, A. Peter; ...

    2016-02-25

Here, the Trans Bay Cable (TBC) is a ±200-kilovolt (kV), 400 MW 85-km long High Voltage Direct Current (DC) buried transmission line linking Pittsburg, CA with San Francisco, CA (SF) beneath the San Francisco Estuary. The TBC runs parallel to the migratory route of various marine species, including green sturgeon, Chinook salmon, and steelhead trout. In July and August 2014, an extensive series of magnetic field measurements were taken using a pair of submerged Geometrics magnetometers towed behind a survey vessel in four locations in the San Francisco estuary along profiles that cross the cable’s path; these included the San Francisco-Oakland Bay Bridge (BB), the Richmond-San Rafael Bridge (RSR), the Benicia-Martinez Bridge (Ben) and an area in San Pablo Bay (SP) in which a bridge is not present. In this paper, we apply basic formulas that ideally describe the magnetic field from a DC cable summed vectorially with the background geomagnetic field (in the absence of other sources that would perturb the ambient field) to derive characteristics of the cable that are otherwise not immediately observable. Magnetic field profiles from measurements taken along 170 survey lines were inspected visually for evidence of a distinct pattern representing the presence of the cable. Many profiles were dominated by field distortions unrelated to the cable caused by bridge structures or other submerged objects, and the cable’s contribution to the field was not detectable. BB, with 40 of the survey lines, did not yield usable data for these reasons. The unrelated anomalies could be up to 100 times greater than those from the cable. In total, discernible magnetic field profiles measured from 76 survey lines were regressed against the equations, representing eight days of measurement. The modeled field anomalies due to the cable (the difference between the maximum and minimum field along the survey line at the cable crossing) were virtually identical to the measured

  16. Development of An Empirical Water Quality Model for Stormwater Based on Watershed Land Use in Puget Sound

    SciTech Connect

    Cullinan, Valerie I.; May, Christopher W.; Brandenberger, Jill M.; Judd, Chaeli; Johnston, Robert K.

    2007-03-29

    The Sinclair and Dyes Inlet watershed is located on the west side of Puget Sound in Kitsap County, Washington, U.S.A. (Figure 1). The Puget Sound Naval Shipyard (PSNS), U.S Environmental Protection Agency (USEPA), the Washington State Department of Ecology (WA-DOE), Kitsap County, City of Bremerton, City of Bainbridge Island, City of Port Orchard, and the Suquamish Tribe have joined in a cooperative effort to evaluate water-quality conditions in the Sinclair-Dyes Inlet watershed and correct identified problems. A major focus of this project, known as Project ENVVEST, is to develop Water Clean-up (TMDL) Plans for constituents listed on the 303(d) list within the Sinclair and Dyes Inlet watershed. Segments within the Sinclair and Dyes Inlet watershed were listed on the State of Washington’s 1998 303(d) because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue (WA-DOE 2003). Stormwater loading was identified by ENVVEST as one potential source of sediment contamination, which lacked sufficient data for a contaminant mass balance calculation for the watershed. This paper summarizes the development of an empirical model for estimating contaminant concentrations in all streams discharging into Sinclair and Dyes Inlets based on watershed land use, 18 storm events, and wet/dry season baseflow conditions between November 2002 and May 2005. Stream pollutant concentrations along with estimates for outfalls and surface runoff will be used in estimating the loading and ultimately in establishing a Water Cleanup Plan (TMDL) for the Sinclair-Dyes Inlet watershed.

  17. Pathways from parental AIDS to child psychological, educational and sexual risk: developing an empirically-based interactive theoretical model.

    PubMed

    Cluver, Lucie; Orkin, Mark; Boyes, Mark E; Sherr, Lorraine; Makasi, Daphne; Nikelo, Joy

    2013-06-01

    Increasing evidence demonstrates negative psychological, health, and developmental outcomes for children associated with parental HIV/AIDS illness and death. However, little is known about how parental AIDS leads to negative child outcomes. This study used a structural equation modelling approach to develop an empirically-based theoretical model of interactive relationships between parental or primary caregiver AIDS-illness, AIDS-orphanhood and predicted intervening factors associated with children's psychological distress, educational access and sexual health. Cross-sectional data were collected in 2009-2011, from 6002 children aged 10-17 years in three provinces of South Africa using stratified random sampling. Comparison groups included children orphaned by AIDS, orphaned by other causes and non-orphans, and children whose parents or primary caregivers were unwell with AIDS, unwell with other causes or healthy. Participants reported on psychological symptoms, educational access, and sexual health risks, as well as hypothesized sociodemographic and intervening factors. In order to build an interactive theoretical model of multiple child outcomes, multivariate regression and structural equation models were developed for each individual outcome, and then combined into an overall model. Neither AIDS-orphanhood nor parental AIDS-illness were directly associated with psychological distress, educational access, or sexual health. Instead, significant indirect effects of AIDS-orphanhood and parental AIDS-illness were obtained on all measured outcomes. Child psychological, educational and sexual health risks share a common set of intervening variables including parental disability, poverty, community violence, stigma, and child abuse that together comprise chain effects. In all models, parental AIDS-illness had stronger effects and more risk pathways than AIDS-orphanhood, especially via poverty and parental disability. AIDS-orphanhood and parental AIDS-illness impact

  18. A hybrid model for PM₂.₅ forecasting based on ensemble empirical mode decomposition and a general regression neural network.

    PubMed

    Zhou, Qingping; Jiang, Haiyan; Wang, Jianzhou; Zhou, Jianling

    2014-10-15

Exposure to high concentrations of fine particulate matter (PM₂.₅) can cause serious health problems because PM₂.₅ contains microscopic solid or liquid droplets that are sufficiently small to be ingested deep into human lungs. Thus, daily prediction of PM₂.₅ levels is notably important for regulatory plans that inform the public and restrict social activities in advance when harmful episodes are foreseen. A hybrid EEMD-GRNN (ensemble empirical mode decomposition-general regression neural network) model based on data preprocessing and analysis is first proposed in this paper for one-day-ahead prediction of PM₂.₅ concentrations. The EEMD part is utilized to decompose original PM₂.₅ data into several intrinsic mode functions (IMFs), while the GRNN part is used for the prediction of each IMF. The hybrid EEMD-GRNN model is trained using input variables obtained from a principal component regression (PCR) model to remove redundancy. These input variables accurately and succinctly reflect the relationships between PM₂.₅ and both air quality and meteorological data. The model is trained with data from January 1 to November 1, 2013 and is validated with data from November 2 to November 21, 2013 in Xi'an, China. The experimental results show that the developed hybrid EEMD-GRNN model outperforms a single GRNN model without EEMD, a multiple linear regression (MLR) model, a PCR model, and a traditional autoregressive integrated moving average (ARIMA) model. The hybrid model, with fast and accurate results, can be used to develop rapid air quality warning systems.
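    A minimal sketch of the decompose-predict-recombine scheme, assuming PyEMD (pip install EMD-signal) for EEMD and implementing a GRNN as Gaussian-kernel regression; the PCR-based input selection and residue/trend handling are omitted, and parameter names are ours.

        import numpy as np
        from PyEMD import EEMD  # assumed dependency; pip install EMD-signal

        def grnn_predict(X_train, y_train, x, sigma=0.5):
            # General regression neural network = Gaussian-kernel regression.
            d2 = np.sum((X_train - x) ** 2, axis=1)
            w = np.exp(-d2 / (2 * sigma**2))
            return np.dot(w, y_train) / (w.sum() + 1e-12)

        def forecast_next(series, lags=3, sigma=0.5):
            # Decompose the series into IMFs, train one GRNN per IMF on
            # lagged values, and sum the one-step-ahead component forecasts.
            imfs = EEMD()(np.asarray(series, dtype=float))
            total = 0.0
            for imf in imfs:
                X = np.array([imf[i:i + lags] for i in range(len(imf) - lags)])
                y = imf[lags:]
                total += grnn_predict(X, y, imf[-lags:], sigma)
            return total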

  19. Empirical Equation Based Chirality (n, m) Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    PubMed Central

    Arefin, Md Shamsul

    2012-01-01

This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between existing experimental and theoretical Kataura plot.
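    To illustrate the kind of screening involved, the sketch below enumerates semiconducting (n, m) candidates whose diameter matches a measured RBM frequency, using one common literature parameterization ω_RBM = A/d + B; this is not the paper's own set of empirical equations, and the constants are assumptions.

        import numpy as np

        A, B = 223.5, 12.5   # omega_RBM = A/d + B (cm^-1, d in nm); a common
                             # literature parameterization, not this paper's.
        A_CC = 0.246         # graphene lattice constant (nm)

        def diameter(n, m):
            return A_CC * np.sqrt(n * n + n * m + m * m) / np.pi

        def candidates(omega_rbm, tol=0.05):
            # Semiconducting (n, m) whose predicted RBM frequency matches the
            # measured one within `tol` (relative); the paper narrows such
            # candidates further with the optical transition energies.
            d_target = A / (omega_rbm - B)
            hits = []
            for n in range(4, 30):
                for m in range(0, n + 1):
                    if (n - m) % 3 == 0:   # metallic / quasi-metallic
                        continue
                    if abs(diameter(n, m) - d_target) / d_target < tol:
                        hits.append((n, m))
            return hits

        print(candidates(250.0))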

  20. An empirical study on the matrix-based protein representations and their combination with sequence-based approaches.

    PubMed

    Nanni, Loris; Lumini, Alessandra; Brahnam, Sheryl

    2013-03-01

Many domains have a stake in the development of reliable systems for automatic protein classification. Of particular interest in recent studies of automatic protein classification is the exploration of new methods for extracting features from a protein that enhance classification for specific problems. These methods have proven very useful in one or two domains, but they have failed to generalize well across several domains (i.e. classification problems). In this paper, we evaluate several feature extraction approaches for representing proteins with the aim of sequence-based protein classification. Several protein representations are evaluated, those starting from: the position specific scoring matrix (PSSM) of the proteins; the amino-acid sequence; a matrix representation of the protein, of dimension (length of the protein) × 20, obtained using the substitution matrices for representing each amino-acid as a vector. A valuable result is that a texture descriptor can be extracted from the PSSM protein representation which improves the performance of standard descriptors based on the PSSM representation. Experimentally, we develop our systems by comparing several protein descriptors on nine different datasets. Each descriptor is used to train a support vector machine (SVM) or an ensemble of SVMs. Although different stand-alone descriptors work well on some datasets (but not on others), we have discovered that fusion among classifiers trained using different descriptors obtains a good performance across all the tested datasets. Matlab code/Datasets used in the proposed paper are available at http://www.bias.csr.unibo.it
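    As a rough illustration of treating the PSSM as an image and extracting a texture descriptor from it (the paper's specific descriptor may differ), a local-binary-pattern histogram over a PSSM-like matrix:

        import numpy as np
        from skimage.feature import local_binary_pattern  # scikit-image

        def pssm_texture(pssm, P=8, R=1.0):
            # Treat the (length x 20) PSSM as a grayscale image and build a
            # uniform-LBP histogram as a fixed-length texture feature.
            lbp = local_binary_pattern(np.asarray(pssm, float), P, R,
                                       method="uniform")
            hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2),
                                   density=True)
            return hist

        print(pssm_texture(np.random.rand(120, 20)).shape)   # (10,)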

  1. Empirical and targeted therapy of candidemia with fluconazole versus echinocandins: a propensity score-derived analysis of a population-based, multicentre prospective cohort.

    PubMed

    López-Cortés, L E; Almirante, B; Cuenca-Estrella, M; Garnacho-Montero, J; Padilla, B; Puig-Asensio, M; Ruiz-Camps, I; Rodríguez-Baño, J

    2016-08-01

    We compared the clinical efficacy of fluconazole and echinocandins in the treatment of candidemia in real practice. The CANDIPOP study is a prospective, population-based cohort study on candidemia carried out between May 2010 and April 2011 in 29 Spanish hospitals. Using strict inclusion criteria, we separately compared the impact of empirical and targeted therapy with fluconazole or echinocandins on 30-day mortality. Cox regression, including a propensity score (PS) for receiving echinocandins, stratified analysis on the PS quartiles and PS-based matched analyses, were performed. The empirical and targeted therapy cohorts comprised 316 and 421 cases, respectively; 30-day mortality was 18.7% with fluconazole and 33.9% with echinocandins (p 0.02) in the empirical therapy group and 19.8% with fluconazole and 27.7% with echinocandins (p 0.06) in the targeted therapy group. Multivariate Cox regression analysis including PS showed that empirical therapy with fluconazole was associated with better prognosis (adjusted hazard ratio 0.38; 95% confidence interval 0.17-0.81; p 0.01); no differences were found within each PS quartile or in cases matched according to PS. Targeted therapy with fluconazole did not show a significant association with mortality in the Cox regression analysis (adjusted hazard ratio 0.77; 95% confidence interval 0.41-1.46; p 0.63), in the PS quartiles or in PS-matched cases. The results were similar among patients with severe sepsis and septic shock. Empirical or targeted treatment with fluconazole was not associated with increased 30-day mortality compared to echinocandins among adults with candidemia.

  2. Wastewater Characterization Survey, Laughlin Air Force Base, Texas

    DTIC Science & Technology

    1989-02-01

[Extraction fragment from the DTIC report cover and documentation page: USAFOEHL Report 89-009EQO0105BWA, Wastewater Characterization Survey, Laughlin AFB TX, by Charles W. Attebery, 1Lt, USAF, BSC, and Robert D. Binovi, Lt Col. The fragment notes that the report was reviewed and approved for publication, including release to foreign nations.]

  3. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
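    The core pattern is easy to sketch: sample the expensive code at a handful of design points, fit a cheap surrogate, and query the surrogate instead. Below, a full quadratic response surface stands in for the metamodel (kriging, neural networks, etc. follow the same pattern); the target function and numbers are illustrative.

        import numpy as np

        def quad_features(X):
            # Full quadratic basis in two design variables.
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2,
                                    x1 * x2, x1**2, x2**2])

        f = lambda X: X[:, 0] ** 2 + 3 * X[:, 1] + np.sin(X[:, 0])  # "expensive" code
        rng = np.random.default_rng(1)
        X = rng.uniform(-2, 2, size=(30, 2))      # design of experiments
        beta, *_ = np.linalg.lstsq(quad_features(X), f(X), rcond=None)

        X_new = rng.uniform(-2, 2, size=(5, 2))
        print(quad_features(X_new) @ beta)        # surrogate predictions
        print(f(X_new))                           # truth, for comparison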

  4. Correlation analysis of lung cancer and urban spatial factor: based on survey in Shanghai

    PubMed Central

    Xu, Wangyue; Tang, Jian; Jiang, Xiji

    2016-01-01

Background The density of particulate matter (PM) in mega-cities in China such as Beijing and Shanghai has exceeded basic health standards in recent years. Human exposure to PM has been identified as a traceable and controllable factor among the many complicated risk factors for lung cancer. While improving air quality requires tremendous effort and time, PM density can shift with adjustments to the built environment. The urban built environment has also been shown to be directly relevant to respiratory disease. Studies have separately explored the indoor and outdoor factors affecting respiratory diseases, but more comprehensive spatial factors need to be analyzed to understand the cumulative effect of the built environment on the respiratory system. This interdisciplinary study examines the impact of both indoor (including age of housing, interval after decoration, indoor humidity, etc.) and outdoor spatial factors (including density, parking, green spaces, etc.) on lung cancer. Methods A survey of lung cancer patients and a control group was conducted in 2014 and 2015. A total of 472 interviewees were randomly selected from a pool of local residents who have resided in Shanghai for more than 5 years. Data were collected on their socio-demographic factors, lifestyle factors, and external and internal residential area factors. Regression models were established based on the collected data to analyze the associations between lung cancer and urban spatial factors. Results The regression models illustrate that lung cancer is significantly associated with a number of spatial factors. Significant outdoor spatial factors include external traffic volume (P=0.003), main plant type (P=0.035 for trees) of internal green space, internal water body (P=0.027) and land use of surrounding blocks (P=0.005 for residential areas of 7-9 floors, P=0.000 for residential areas of 4-6 floors, P=0.006 for business/commercial areas over 10 floors, P=0.005 for

  5. Empirically Based Profiles of the Early Literacy Skills of Children with Language Impairment in Early Childhood Special Education

    ERIC Educational Resources Information Center

    Justice, Laura; Logan, Jessica; Kaderavek, Joan; Schmitt, Mary Beth; Tompkins, Virginia; Bartlett, Christopher

    2015-01-01

    The purpose of this study was to empirically determine whether specific profiles characterize preschool-aged children with language impairment (LI) with respect to their early literacy skills (print awareness, name-writing ability, phonological awareness, alphabet knowledge); the primary interest was to determine if one or more profiles suggested…

  6. Modeling invariant object processing based on tight integration of simulated and empirical data in a Common Brain Space

    PubMed Central

    Peters, Judith C.; Reithler, Joel; Goebel, Rainer

    2012-01-01

    Recent advances in Computer Vision and Experimental Neuroscience provided insights into mechanisms underlying invariant object recognition. However, due to the different research aims in both fields models tended to evolve independently. A tighter integration between computational and empirical work may contribute to cross-fertilized development of (neurobiologically plausible) computational models and computationally defined empirical theories, which can be incrementally merged into a comprehensive brain model. After reviewing theoretical and empirical work on invariant object perception, this article proposes a novel framework in which neural network activity and measured neuroimaging data are interfaced in a common representational space. This enables direct quantitative comparisons between predicted and observed activity patterns within and across multiple stages of object processing, which may help to clarify how high-order invariant representations are created from low-level features. Given the advent of columnar-level imaging with high-resolution fMRI, it is time to capitalize on this new window into the brain and test which predictions of the various object recognition models are supported by this novel empirical evidence. PMID:22408617

  7. Use of Evidence-Based Practice Resources and Empirically Supported Treatments for Posttraumatic Stress Disorder among University Counseling Center Psychologists

    ERIC Educational Resources Information Center

    Juel, Morgen Joray

    2012-01-01

    In the present study, an attempt was made to determine the degree to which psychologists at college and university counseling centers (UCCs) utilized empirically supported treatments with their posttraumatic stress disorder (PTSD) clients. In addition, an attempt was made to determine how frequently UCC psychologists utilized a number of…

  8. SU-E-T-05: A 2D EPID Transit Dosimetry Model Based On An Empirical Quadratic Formalism

    SciTech Connect

    Tan, Y; Metwaly, M; Glegg, M; Baggarley, S; Elliott, A

    2014-06-01

Purpose: To describe a 2D electronic portal imaging device (EPID) transit dosimetry model, based on an empirical quadratic formalism, that can predict either EPID or in-phantom dose distributions for comparison with the EPID-captured image or the treatment planning system (TPS) dose, respectively. Methods: A quadratic equation can be used to relate the reduction in intensity of an exit beam to the equivalent path length of the attenuator. The calibration involved deriving coefficients from a set of dose planes measured for homogeneous phantoms of known thicknesses under reference conditions. In this study, calibration dose planes were measured with EPID and ionisation chamber (IC) in water for the same reference beam (6 MV, 100 MU, 20×20 cm²) and the same set of thicknesses (0-30 cm). Since the same calibration conditions were used, the EPID and IC measurements can be related through the quadratic equation. Consequently, EPID transit dose can be predicted from TPS-exported dose planes, and in-phantom dose can be predicted using the EPID distribution captured during treatment as an input. The model was tested with 4 open fields, 6 wedge fields, and 7 IMRT fields on homogeneous and heterogeneous phantoms. Comparisons were done using 2D absolute gamma (3%/3 mm) and results were validated against measurements with a commercial 2D array device. Results: The gamma pass rates for comparisons between EPID measured and predicted doses ranged from 93.6% to 100.0% for all fields and phantoms tested. Results from this study agreed with 2D array measurements to within 3.1%. Meanwhile, in-phantom comparisons between TPS computed and predicted doses ranged from 91.6% to 100.0%. Validation with the 2D array device was not possible for in-phantom comparisons. Conclusion: A 2D EPID transit dosimetry model for treatment verification was described and proven to be accurate. The model has the advantage of being generic and allows comparisons at the EPID plane as well as at multiple planes in-phantom.
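    A single-pixel sketch of the quadratic calibration: fit signal versus slab thickness, then invert the quadratic to recover an equivalent path length from a measured signal. The calibration numbers below are made-up placeholders, not the paper's data.

        import numpy as np

        thickness = np.array([0., 5., 10., 15., 20., 25., 30.])   # cm of water
        signal = np.array([1.00, .78, .62, .50, .41, .34, .285])  # relative

        coeff = np.polyfit(thickness, signal, 2)   # signal(t) ~ a*t^2 + b*t + c

        def predict_signal(t):
            return np.polyval(coeff, t)

        def equivalent_path_length(s):
            # Invert the quadratic, taking the physical root on the
            # calibrated interval (signal decreases with thickness).
            a, b, c = coeff
            disc = np.sqrt(b * b - 4 * a * (c - s))
            return (-b - disc) / (2 * a)

        print(equivalent_path_length(predict_signal(12.0)))       # ~12.0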

  9. Retrieval of ice thickness from radar-altimeter data based on empirical relation between ice thickness and freeboard

    NASA Astrophysics Data System (ADS)

    Alexandrov, V.; Sandven, S.

    2009-04-01

The basic technique of computing sea ice thickness by satellite altimetry is to measure freeboard (that is, the height of the ice or snow surface above water) from the difference between the surface height of the larger ice floes and the height of the thin ice or water surface in the major leads. The ice freeboard measurements are then converted to ice thickness by assuming hydrostatic equilibrium and using fixed densities of ice, sea water and snow, as well as snow depth [Laxon et al., 2003]. Our studies revealed that the natural variability of sea ice density results in a significant uncertainty of ice thickness retrieval, which can reach ± 70 cm for thick first-year ice and multiyear ice. It was found that the interannual and regional variability of snow depth, which is less than 10 cm in most regions, causes an uncertainty in the calculated ice thickness of ± 20 cm for multiyear ice and ± 30 cm for first-year ice. The present knowledge on parameterization of ice density and, to some degree, snow depth and density, as well as their dependence on ice thickness, precludes accurate calculation of ice thickness from measured ice freeboard values using the isostatic equilibrium equation. Another possible approach to ice thickness retrieval from radar-altimeter data is to use empirical relations between ice thickness and freeboard. The most extensive data set containing sea ice and snow measurements was collected during aircraft landings associated with the Soviet Union's historical Sever airborne and North Pole drifting station programs. The data set contains measurements of 23 parameters, including ice thickness, ice freeboard and snow depth, which were measured at the same time in 688 landings in 1980-1982, 1984-1986, and 1988. The following regression equation, relating average ice thickness and average ice freeboard, has been derived from these data: H_ice = 8.3098 F_est + 35.739. The obtained regression allows estimation of ice thickness from measured
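    In code, applying the quoted regression is a one-liner; note that the abstract does not restate the measurement units, so they are left as in the source data.

        def ice_thickness(freeboard):
            # Empirical Sever regression quoted in the abstract:
            # H_ice = 8.3098 * F_est + 35.739 (units as in the source data,
            # not restated in the abstract).
            return 8.3098 * freeboard + 35.739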

  10. An empirical survey on the influence of machining parameters on tool wear in diamond turning of large single crystal silicon optics

    SciTech Connect

    Blaedel, K L; Carr, J W; Davis, P J; Goodman, W; Haack, J K; Krulewich, D; McClellan, M; Syn, C K; Zimmermann, M.

    1999-07-01

    The research described in this paper is a continuation of the collaborative efforts by Lawrence Livermore National Laboratory (LLNL), Schafer Corporation and TRW to develop a process for single point diamond turning (SPDT) of large single crystal silicon (SCSi) optical substrates on the Large Optic Diamond Turning Machine (LODTM). The principal challenge to obtaining long track lengths in SCSi has been to identify a set of machining parameters which yield a process that provides both low and predictable tool wear. Identifying such a process for SCSi has proven to be a formidable task because multiple crystallographic orientations with a range of hardness values are encountered when machining conical and annular optical substrates. The LODTM cutting program can compensate for tool wear if it is predictable. However, if the tool wear is not predictable then the figured area of the optical substrate may have unacceptably high error that cannot be removed by post-polishing. The emphasis of this survey was limited to elucidating the influence of cutting parameters on the tool wear. We present two preliminary models that can be used to predict tool wear over the parameter space investigated. During the past two and one-half years a series of three evolutionary investigations was performed. The first investigation, the Parameter Assessment Study (PAS), was designed to survey fundamental machining parameters and assess their influence on tool wear [1]. The results of the PAS were used as a point of departure for designing the second investigation, the Parameter Selection Study (PSS). The goal of the PSS was to explore the trends identified in the PAS in more detail, to determine if the experimental results obtained in the PAS could be repeated on a different diamond turning machine (DTM), and to select a more optimal set of machining parameters that could be used in subsequent investigations such as the Fluid Down-Select Study (FDS). The goal of the FDS was to compare candidate cutting fluids.

  11. An empirical evaluation of two theoretically-based hypotheses on the directional association between self-worth and hope.

    PubMed

    McDavid, Lindley; McDonough, Meghan H; Smith, Alan L

    2015-06-01

    Fostering self-worth and hope are important goals of positive youth development (PYD) efforts, yet intervention design is complicated by contrasting theoretical hypotheses regarding the directional association between these constructs. Therefore, within a longitudinal design we tested: (1) that self-worth predicts changes in hope (self-theory; Harter, 1999), and (2) that hope predicts changes in self-worth (hope theory; Snyder, 2002) over time. Youth (N = 321; M_age = 10.33 years) in a physical activity-based PYD program completed surveys 37-45 days prior to and on the second day and third-to-last day of the program. A latent variable panel model that included autoregressive and cross-lagged paths indicated that self-worth was a significant predictor of change in hope, but hope did not predict change in self-worth. Therefore, the directional association between self-worth and hope is better explained by self-theory, and PYD programs should aim to enhance perceptions of self-worth to build perceptions of hope.

  12. OBSERVATIONS OF BINARY STARS WITH THE DIFFERENTIAL SPECKLE SURVEY INSTRUMENT. V. TOWARD AN EMPIRICAL METAL-POOR MASS–LUMINOSITY RELATION

    SciTech Connect

    Horch, Elliott P.; Van Altena, William F.; Demarque, Pierre; Howell, Steve B.; Everett, Mark E.; Ciardi, David R.; Teske, Johanna K.; Henry, Todd J.; Winters, Jennifer G. E-mail: william.vanaltena@yale.edu E-mail: steve.b.howell@nasa.gov E-mail: ciardi@ipac.caltech.edu E-mail: thenry@astro.gsu.edu

    2015-05-15

    In an effort to better understand the details of the stellar structure and evolution of metal-poor stars, the Gemini North telescope was used on two occasions to take speckle imaging data of a sample of known spectroscopic binary stars and other nearby stars in order to search for and resolve close companions. The observations were obtained using the Differential Speckle Survey Instrument, which takes data in two filters simultaneously. The results presented here are of 90 observations of 23 systems in which one or more companions was detected, and six stars where no companion was detected to the limit of the camera capabilities at Gemini. In the case of the binary and multiple stars, these results are then further analyzed to make first orbit determinations in five cases, and orbit refinements in four other cases. The mass information is derived, and since the systems span a range in metallicity, a study is presented that compares our results with the expected trend in total mass as derived from the most recent Yale isochrones as a function of metal abundance. These data suggest that metal-poor main-sequence stars are less massive at a given color than their solar-metallicity analogues in a manner consistent with that predicted from the theory.

  13. Cryptosporidiosis in Indonesia: a hospital-based study and a community-based survey.

    PubMed

    Katsumata, T; Hosea, D; Wasito, E B; Kohno, S; Hara, K; Soeparto, P; Ranuh, I G

    1998-10-01

    Hospital-based and community-based studies were conducted to understand the prevalence and mode of transmission of Cryptosporidium parvum infection in Surabaya, Indonesia. In both studies, people with and without diarrhea were examined for oocysts. The community-based survey included questionnaires administered to community members and stool examinations of cats. Questionnaires covered demographic information, health status, and hygienic indicators. In the hospital, C. parvum oocysts were found in 26 (2.8%) of 917 patients with diarrhea and 15 (1.4%) of 1,043 control patients. Children under two years of age were the most susceptible. The prevalence was higher during the rainy season. The community-based study again showed that C. parvum oocysts were frequently detected in diarrhea samples (8.2%), exclusively during the rainy season. Thirteen (2.4%) of 532 cats passed C. parvum oocysts. A multiple logistic regression model indicated that contact with cats, rain, flood, and crowded living conditions are significant risk factors for Cryptosporidium infection.

  14. Energy-Based Acoustic Source Localization Methods: A Survey

    PubMed Central

    Meng, Wei; Xiao, Wendong

    2017-01-01

    Energy-based source localization is an important problem in wireless sensor networks (WSNs), which has been studied actively in the literature. Numerous localization algorithms, e.g., maximum likelihood estimation (MLE) and nonlinear-least-squares (NLS) methods, have been reported. In the literature, there are relevant review papers for localization in WSNs, e.g., for distance-based localization. However, not much work related to energy-based source localization is covered in the existing review papers. Energy-based methods are proposed and specially designed for a WSN due to its limited sensor capabilities. This paper aims to give a comprehensive review of these different algorithms for energy-based single and multiple source localization problems, their merits and demerits and to point out possible future research directions. PMID:28212281
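
    A compact illustration of the nonlinear-least-squares (NLS) approach mentioned above, assuming the usual isotropic energy-decay model e_i = S / ||x - s_i||^alpha; the sensor layout, decay exponent, and noise level are invented for the demonstration.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    alpha = 2.0                                   # free-field spreading loss
    true_src, true_S = np.array([3.0, 7.0]), 500.0

    rng = np.random.default_rng(0)
    energy = true_S / np.linalg.norm(sensors - true_src, axis=1)**alpha
    energy += rng.normal(scale=0.05 * energy)     # sensor noise

    def residuals(p):
        x, S = p[:2], p[2]
        return S / np.linalg.norm(sensors - x, axis=1)**alpha - energy

    # NLS from a crude initial guess (e.g., near the loudest sensor).
    fit = least_squares(residuals, x0=[5.0, 5.0, energy.max()])
    print(fit.x)                                  # -> roughly [3, 7, 500]
    ```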

  15. Energy-Based Acoustic Source Localization Methods: A Survey.

    PubMed

    Meng, Wei; Xiao, Wendong

    2017-02-15

    Energy-based source localization is an important problem in wireless sensor networks (WSNs), which has been studied actively in the literature. Numerous localization algorithms, e.g., maximum likelihood estimation (MLE) and nonlinear-least-squares (NLS) methods, have been reported. In the literature, there are relevant review papers for localization in WSNs, e.g., for distance-based localization. However, not much work related to energy-based source localization is covered in the existing review papers. Energy-based methods are proposed and specially designed for a WSN due to its limited sensor capabilities. This paper aims to give a comprehensive review of these different algorithms for energy-based single and multiple source localization problems, their merits and demerits and to point out possible future research directions.

  16. Tectonic map of Liberia based on geophysical and geological surveys

    USGS Publications Warehouse

    Behrendt, John Charles; Wotorson, Cletus S.

    1972-01-01

    Interpretation of the results of aeromagnetic, total-gamma radioactivity, and gravity surveys combined with geologic data for western Liberia from White and Leo (1969) and other geologic information allows the construction of a tectonic map of Liberia. The map approximately delineates the boundaries between the Liberian (ca. 2700 m.y.) province in the northwestern two-thirds of the country, the Eburnean (ca. 2000 m.y.) province in the southeastern one-third, and the Pan-African (ca. 550 m.y.) province in the coastal area of the northwestern two-thirds of the country. Rock foliation and tectonic structural features trend northeastward in the Liberian province, east-northeastward to north-northeastward in the Eburnean province, and northwestward in the Pan-African province. Linear residual magnetic anomalies 20-80 km wide and 200-600 gammas in amplitude, following the northeast structural trend typical of the Liberian age province, cross the entire country and extend into Sierra Leone and Ivory Coast.

  17. Copula-based IDF curves and empirical rainfall thresholds for flash floods and rainfall-induced landslides

    NASA Astrophysics Data System (ADS)

    Bezak, Nejc; Šraj, Mojca; Mikoš, Matjaž

    2016-10-01

    Floods, landslides and debris flows are natural events that occur all over the world and are often induced by extreme rainfall conditions. Several extreme events occurred in Slovenia (Europe) in the last 25 years that caused 18 casualties and approximately 500 million Euros of economic loss. The intensity-duration-frequency (IDF) relationship was constructed using the Frank copula function for several rainfall stations using high-resolution rainfall data with an average subsample length of 34 years. The empirical rainfall threshold curves were also evaluated for selected extreme events. Post-event analyses showed that rainfall characteristics triggering flash floods and landslides are different. The sensitivity analysis results indicate that the inter-event time definition (IETD) and subsample definition methodology can have a significant influence on the position of rainfall events in the intensity-duration space, the constructed IDF curves and on the relationship between the empirical rainfall threshold curves and the IDF curves constructed using the copula approach. Furthermore, a combination of several empirical rainfall thresholds with an appropriate high-density rainfall measurement network can be used as part of the early warning system of the initiation of landslides and debris flows. However, different rainfall threshold curves should be used for lowland and mountainous areas in Slovenia.
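
    For readers unfamiliar with the copula step, the Frank copula has the closed form C(u,v) = -(1/theta) ln[1 + (e^(-theta*u) - 1)(e^(-theta*v) - 1)/(e^(-theta) - 1)], from which a joint intensity-duration return period follows directly; the parameter and probabilities below are arbitrary, and the paper's fitting procedure is not reproduced here.

    ```python
    import numpy as np

    def frank_copula_cdf(u, v, theta):
        """Frank copula C(u, v); theta != 0 sets the dependence strength."""
        num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
        return -np.log1p(num / (np.exp(-theta) - 1.0)) / theta

    def joint_return_period_and(u, v, theta, mu=1.0):
        """Return period for exceeding BOTH marginal quantiles u and v;
        mu is the mean inter-event time in years."""
        return mu / (1.0 - u - v + frank_copula_cdf(u, v, theta))

    print(joint_return_period_and(0.99, 0.99, theta=5.0))  # ~1850 years here
    ```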

  18. Issues in nursing: strategies for an Internet-based, computer-assisted telephone survey.

    PubMed

    Piamjariyakul, Ubolrat; Bott, Marjorie J; Taunton, Roma Lee

    2006-08-01

    The study describes the design and implementation of an Internet-based, computer-assisted telephone survey about the care-planning process in 107 long-term care facilities in the Midwest. Two structured telephone surveys were developed to interview the care-planning coordinators and their team members. Questionmark Perception Software Version 3 was used to develop the surveys in a wide range of formats. The responses were drawn into a database that was exported to a spreadsheet format and converted to a statistical format by the Information Technology team. Security of the database was protected. Training sessions were provided to project staff. The interviews were tape-recorded for quality checks. Inter-rater reliabilities ranged from 95% to 100% agreement. Investigators should consider using Internet-based survey tools, especially for multisite studies, because they allow access to larger samples at less cost. Exploring multiple software systems for the best fit to the study requirements is essential.

  19. Market-Based Multirobot Coordination: A Comprehensive Survey and Analysis

    DTIC Science & Technology

    2005-12-01


  20. Cost Model/Data Base Catalog Non-DoD/Academic Survey. Volume 1. Project Summary

    DTIC Science & Technology

    1988-10-30

    In this and the previous effort, MCR looked at many existing cost models; the majority of the studies were limited in scope, either to a certain service or ... This volume summarizes work performed under Task 1, Survey Non-DoD Cost Analysis Tools, and Task 2, Survey Academic Institution Cost Analysis Tools. Work on the other two ...

  1. Specialize or risk disappearance - empirical evidence of anisomerism based on comparative and developmental studies of gnathostome head and limb musculature.

    PubMed

    Diogo, Rui; Ziermann, Janine M; Linde-Medina, Marta

    2015-08-01

    William K. Gregory was one of the most influential authors defending the existence of an evolutionary trend in vertebrates from a higher degree of polyisomerism (more polyisomeric or 'serial' anatomical structures arranged along any body axis) to cases of anisomerism (specialization or loss of at least some original polyisomeric structures). Anisomerism was the subject of much interest during the 19th and the beginning of the 20th centuries, particularly due to the influence of the Romantic German School and the notion of 'primitive archetype' and because it was conceptually linked to other crucial biological issues (e.g. complexity, scala naturae, progress, modularity or phenotypic integration). However, discussions on anisomerism and related issues (e.g. Williston's law) have been almost exclusively based on hard tissues. Here we provide the first detailed empirical test, and discussion, of anisomerism based on quantitative data obtained from phylogenetic and comparative analyses of the head and forelimb muscles of gnathostomes. Our results strongly support the existence of such a trend in both forelimb and head musculature. For instance, the last common ancestor (LCA) of extant tetrapods likely had 38 polyisomeric muscles (PMs) out of a total of 70 forelimb muscles (i.e. 54%), whereas in the LCAs of extant amniotes and of mammals these numbers were 38/73 (52%) and 21/67 (31%), and in humans are 11/59 (19%). Interestingly, the number of PMs that became specialized during the forelimb evolutionary transition from the LCA of extant tetrapods to humans (13) is very similar to the number of PMs that became lost (14), indicating that both specialization and loss contributed equally to the trend towards anisomerism. By contrast, during the evolution of the head musculature from the LCA of gnathostomes to humans a total of 27 PMs were lost whereas only one muscle became specialized. Importantly, the evolutionary trend towards anisomerism is not related to a general

  2. Land-based lidar mapping: a new surveying technique to shed light on rapid topographic change

    USGS Publications Warehouse

    Collins, Brian D.; Kayen, Robert

    2006-01-01

    The rate of natural change in such dynamic environments as rivers and coastlines can sometimes overwhelm the monitoring capacity of conventional surveying methods. In response to this limitation, U.S. Geological Survey (USGS) scientists are pioneering new applications of light detection and ranging (lidar), a laser-based scanning technology that promises to greatly increase our ability to track rapid topographic changes and manage their impact on affected communities.

  3. Schiff bases: a short survey on an evergreen chemistry tool.

    PubMed

    Qin, Wenling; Long, Sha; Panunzio, Mauro; Biondi, Stefano

    2013-10-08

    The review reports a short biography of the Italian naturalized chemist Hugo Schiff and an outline of the synthesis and use of his most popular discovery: the imines, widely known as Schiff bases. Recent developments of their "metallo-imine" variants are described. The applications of Schiff bases in organic synthesis as partners in Staudinger and hetero-Diels-Alder reactions, as "privileged" ligands in organometallic complexes, and as biologically active intermediates/targets are reported as well.

  4. Empirical agreement in model validation.

    PubMed

    Jebeile, Julie; Barberousse, Anouk

    2016-04-01

    Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion, as a model can be adjusted to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate the uses of empirical agreement within the process of model validation.

  5. Differences in Rate of Response to Web-Based Surveys among College Students

    ERIC Educational Resources Information Center

    Mitra, Ananda; Jain-Shukla, Parul; Robbins, Adrienne; Champion, Heather; Durant, Robert

    2008-01-01

    This article provides a broad overview of the definition of web-based surveys examining some of the benefits and burdens related to using the Web for data collection. It draws upon the experience of two years of data collection on 10 university campuses demonstrating that there are noticeable differences in the speed with which web-based surveys…

  6. Empirically based Suggested Insights into the Concept of False-Self Defense: Contributions From a Study on Normalization of Children With Disabilities.

    PubMed

    Eichengreen, Adva; Hoofien, Dan; Bachar, Eytan

    2016-02-01

    The concept of the false self has been used widely in psychoanalytic theory and practice but seldom in empirical research. In this empirically based study, elevated features of false-self defense were hypothetically associated with risk factors attendant on processes of rehabilitation and integration of children with disabilities, processes that encourage adaptation of the child to the able-bodied environment. Self-report questionnaires and in-depth interviews were conducted with 88 deaf and hard-of-hearing students and a comparison group of 88 hearing counterparts. Results demonstrate that despite the important contribution of rehabilitation and integration to the well-being of these children, these efforts may put the child at risk of increased use of the false-self defense. The empirical findings suggest two general theoretical conclusions: (1) The Winnicottian concept of the environment, usually confined to the parent-child relationship, can be understood more broadly as including cultural, social, and rehabilitational variables that both influence the parent-child relationship and operate independently of it. (2) The monolithic conceptualization of the false self may be more accurately unpacked to reveal two distinct subtypes: the compliant and the split false self.

  7. A survey on evolutionary algorithm based hybrid intelligence in bioinformatics.

    PubMed

    Li, Shan; Kang, Liying; Zhao, Xing-Ming

    2014-01-01

    With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for the bioinformatists to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, the hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, the hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction about the applications of hybrid intelligent methods, in particular those based on evolutionary algorithm, in bioinformatics. In particular, we focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks.
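
    As a concrete instance of the EA-based feature selection discussed in the review, below is a bare-bones genetic algorithm over binary feature masks (one-point crossover, bit-flip mutation, truncation selection) on synthetic data; all constants are arbitrary, and a real application would use a task-specific fitness such as cross-validated classifier accuracy.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data: 40 candidate features, of which only the first 5 matter.
    X = rng.normal(size=(200, 40))
    w = np.zeros(40)
    w[:5] = [3.0, -2.0, 1.5, 2.5, -1.0]
    y = X @ w + rng.normal(scale=0.5, size=200)

    def fitness(mask):
        """Penalised goodness of fit of a candidate feature subset."""
        if not mask.any():
            return -np.inf
        beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
        rss = np.sum((y - X[:, mask] @ beta) ** 2)
        return -rss - 5.0 * mask.sum()                 # parsimony penalty

    pop = rng.random((30, 40)) < 0.2                   # initial random subsets
    for _ in range(60):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[::-1][:10]]   # truncation selection
        children = []
        for _ in range(len(pop)):
            a, b = parents[rng.integers(10, size=2)]
            cut = rng.integers(1, 40)
            child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
            children.append(child ^ (rng.random(40) < 0.02))  # bit-flip mutation
        pop = np.array(children)

    best = pop[np.argmax([fitness(m) for m in pop])]
    print(np.flatnonzero(best))                        # ideally features 0..4
    ```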

  8. A Survey on Evolutionary Algorithm Based Hybrid Intelligence in Bioinformatics

    PubMed Central

    Li, Shan; Zhao, Xing-Ming

    2014-01-01

    With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for the bioinformatists to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, the hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, the hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction about the applications of hybrid intelligent methods, in particular those based on evolutionary algorithm, in bioinformatics. In particular, we focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks. PMID:24729969

  9. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2016-06-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. Color histogram is a widely used descriptor for CBIR. Conventional method of extracting color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three dimensional histogram corresponding to the color space used. To address the above deficiencies, different global and local histogram methods are proposed in recent research. Different ways of extracting local histograms to have spatial correspondence, invariant colour histogram to add deformation and viewpoint invariance and fuzzy linking method to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining color histogram. In this paper the performance of CBIR based on different global and local color histograms in three different color spaces, namely, RGB, HSV, L*a*b* and also with three distance measures Euclidean, Quadratic and Histogram intersection are surveyed, to choose appropriate method for future research.
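
    A small sketch of the local-histogram idea and the histogram-intersection measure surveyed above, operating on a single channel (e.g., hue) scaled to [0, 1]; the grid size, bin count, and synthetic images are illustrative only.

    ```python
    import numpy as np

    def local_histograms(img, bins=8, grid=2):
        """Concatenated per-block histograms of one channel in [0, 1].
        grid=1 reduces to the conventional global histogram; larger grids
        retain the spatial layout that the global descriptor discards."""
        h, w = img.shape
        feats = []
        for i in range(grid):
            for j in range(grid):
                block = img[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
                hist, _ = np.histogram(block, bins=bins, range=(0.0, 1.0))
                feats.append(hist / max(hist.sum(), 1))   # normalise per block
        return np.concatenate(feats)

    def histogram_intersection(hq, hd):
        """Swain-Ballard intersection similarity; 1.0 for identical histograms."""
        return np.minimum(hq, hd).sum() / hd.sum()

    rng = np.random.default_rng(0)
    img_q = np.clip(rng.normal(0.4, 0.1, (64, 64)), 0, 1)  # stand-in hue channels
    img_d = np.clip(rng.normal(0.6, 0.1, (64, 64)), 0, 1)
    print(histogram_intersection(local_histograms(img_q), local_histograms(img_d)))
    ```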

  10. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2017-02-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. Color histogram is a widely used descriptor for CBIR. Conventional method of extracting color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three dimensional histogram corresponding to the color space used. To address the above deficiencies, different global and local histogram methods are proposed in recent research. Different ways of extracting local histograms to have spatial correspondence, invariant colour histogram to add deformation and viewpoint invariance and fuzzy linking method to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining color histogram. In this paper the performance of CBIR based on different global and local color histograms in three different color spaces, namely, RGB, HSV, L*a*b* and also with three distance measures Euclidean, Quadratic and Histogram intersection are surveyed, to choose appropriate method for future research.

  11. Empirical ugri-UBVRc transformations for galaxies

    NASA Astrophysics Data System (ADS)

    Cook, David O.; Dale, Daniel A.; Johnson, Benjamin D.; Van Zee, Liese; Lee, Janice C.; Kennicutt, Robert C.; Calzetti, Daniela; Staudaher, Shawn M.; Engelbracht, Charles W.

    2014-11-01

    We present empirical colour transformations between Sloan Digital Sky Survey ugri and Johnson-Cousins UBVRc photometry for nearby galaxies (D < 11 Mpc). We use the Local Volume Legacy (LVL) galaxy sample where there are 90 galaxies with overlapping observational coverage for these two filter sets. The LVL galaxy sample consists of normal, non-starbursting galaxies. We also examine how well the LVL galaxy colours are described by previous transformations derived from standard calibration stars and model-based galaxy templates. We find significant galaxy colour scatter around most of the previous transformation relationships. In addition, the previous transformations show systematic offsets between transformed and observed galaxy colours which are visible in observed colour-colour trends. The LVL-based galaxy transformations show no systematic colour offsets and reproduce the observed colour-colour galaxy trends.
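
    Transformations of this kind are simple linear colour relations fitted by least squares; the sketch below fits one to invented galaxy colours, so the recovered coefficients are stand-ins rather than the LVL values derived in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    g_r = rng.uniform(0.2, 0.9, 90)                    # SDSS g - r colours
    B_V = 0.90 * g_r + 0.21 + rng.normal(0, 0.03, 90)  # synthetic Johnson B - V

    # Empirical transformation B - V = a*(g - r) + b, fitted by least squares.
    a, b = np.polyfit(g_r, B_V, 1)
    rms = np.std(B_V - (a * g_r + b))
    print(f"B-V = {a:.2f}(g-r) + {b:.2f}  (rms scatter {rms:.3f} mag)")
    ```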

  12. A Survey of Permanently-Manned Lunar Base Concepts

    DTIC Science & Technology

    1989-06-01

    ... of water and air. Furthermore, the resupply of water, air, and food is necessary. When the initial lunar base is established, research will be ... directed to develop simple biogenerative systems that will supply a portion of the required food, and regenerate air and water. Requirements for an eventual ... terrestrial hydrogen to form water. If frozen water is found at the poles, the need for exported water will be eliminated. Residual iron is a by-product.

  13. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R]

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  14. Radiological survey at the Puget Sound Naval Shipyard and Naval Submarine Base, Bangor

    SciTech Connect

    Fowler, T.W.; Cox, C.

    1998-07-01

    This report presents results of a radiological survey conducted in September 1996 by the National Air and Radiation Environmental Laboratory (NAREL) to assess levels of environmental radioactivity around Puget Sound Naval Shipyard (PSNS) at Bremerton, WA, and the Naval Submarine Base (NSBB) near Bangor, WA. The purpose of the survey was to assess whether the construction, maintenance, overhaul, refueling, or operation of nuclear-powered warships has created elevated levels of radioactivity that could expose nearby populations or contaminate the environment. During this survey 227 samples were collected: 126 at the PSNS study site; 73 at the NSBB site; 21 from background locations; and 7 near the outfall of the Bremerton sewage treatment plant. Samples included drinking water, harbor water, sediment, sediment cores, and biota. All samples were analyzed for gross alpha and beta activities and gamma-emitting radionuclides, and some samples were also analyzed for radium-226 and isotopes of uranium, plutonium, and thorium. In addition to sample collection and analysis, radiation surveys were performed using portable survey instruments to detect gamma radiation. Based on this radiological survey, practices regarding nuclear-powered warship operations at PSNS and NSBB have resulted in no increases in radioactivity causing significant population exposure or contamination of the environment.

  15. Trajectory-Based Visual Localization in Underwater Surveying Missions

    PubMed Central

    Burguera, Antoni; Bonin-Font, Francisco; Oliver, Gabriel

    2015-01-01

    We present a new vision-based localization system applied to an autonomous underwater vehicle (AUV) with limited sensing and computation capabilities. Traditional EKF-SLAM approaches are usually expensive in terms of execution time; the approach presented in this paper strengthens this method by adopting a trajectory-based schema that reduces the computational requirements. The pose of the vehicle is estimated using an extended Kalman filter (EKF), which predicts the vehicle motion by means of a visual odometer and corrects these predictions using the data associations (loop closures) between the current frame and the previous ones. One of the most important steps in this procedure is the image registration method, as it reinforces the data association and, thus, makes it possible to close loops reliably. Since the use of standard EKFs entails linearization errors that can distort the vehicle pose estimations, the approach has also been tested using an iterated extended Kalman filter (IEKF). Experiments have been conducted using a real underwater vehicle in controlled scenarios and in shallow sea waters, showing an excellent performance with very small errors, both in the vehicle pose and in the overall trajectory estimates. PMID:25594602
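
    A minimal planar EKF showing the predict/correct cycle described above, with visual odometry as the motion input and an absolute (x, y) fix standing in for a loop-closure correction; the noise matrices and measurements are invented, and the paper's image-registration machinery is not included.

    ```python
    import numpy as np

    def predict(x, P, u, Q):
        """Propagate state [x, y, heading] with a body-frame odometry step."""
        dx, dy, dth = u
        c, s = np.cos(x[2]), np.sin(x[2])
        x_new = np.array([x[0] + c*dx - s*dy, x[1] + s*dx + c*dy, x[2] + dth])
        F = np.array([[1, 0, -s*dx - c*dy],       # Jacobian of the motion model
                      [0, 1,  c*dx - s*dy],
                      [0, 0, 1]])
        return x_new, F @ P @ F.T + Q

    def update(x, P, z, R):
        """Correct the pose with an absolute (x, y) observation."""
        H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        return x + K @ (z - H @ x), (np.eye(3) - K @ H) @ P

    x, P = np.zeros(3), np.eye(3) * 0.01
    Q, R = np.diag([0.02, 0.02, 0.005]), np.diag([0.1, 0.1])
    for _ in range(10):                               # odometry drift accumulates...
        x, P = predict(x, P, u=(1.0, 0.0, 0.05), Q=Q)
    x, P = update(x, P, z=np.array([9.7, 2.3]), R=R)  # ...until a loop closure
    print(x, np.diag(P))
    ```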

  16. NVO, surveys and archiving ground-based data

    NASA Astrophysics Data System (ADS)

    Huchra, John P.

    2002-12-01

    The era of extremely large, public databases in astronomy is upon us. Such databases will open (are opening!) the field to new research and new researchers. However it is important to be sure the resources are available to properly archive groundbased astronomical data, and include the necessary quality checks and calibrations. An NVO without a proper archive will have limited usefulness. This also implies that with limited resources, not all data can or should be archived. NASA already has a very good handle on US space-based astronomical data. Agencies and organizations that operate astronomical facilities, particularly groundbased observatories, need to plan and budget for these activities now. We should not underestimate the effort required to produce high quality data products that will be useful for the broader community.

  17. A Comparative Investigation of TPB and Altruism Frameworks for an Empirically Based Communication Approach to Enhance Paper Recycling

    ERIC Educational Resources Information Center

    Chaisamrej, Rungrat; Zimmerman, Rick S.

    2014-01-01

    This research compared the ability of the theory of planned behavior (TPB) and the altruism framework (AM) to predict paper-recycling behavior. It comprised formative research and a major survey. Data collected from 628 undergraduate students in Thailand were analyzed using structural equation modeling. Results showed that TPB was superior…

  18. A Household-Based Distribution-Sensitive Human Development Index: An Empirical Application to Mexico, Nicaragua and Peru

    ERIC Educational Resources Information Center

    Lopez-Calva, Luis F.; Ortiz-Juarez, Eduardo

    2012-01-01

    In measuring human development, one of the main concerns relates to the inclusion of a measure that penalizes inequalities in the distribution of achievements across the population. Using indicators from nationally representative household surveys and census data, this paper proposes a straightforward methodology to estimate a household-based…

  19. Sleepwalking in Parkinson's disease: a questionnaire-based survey.

    PubMed

    Oberholzer, Michael; Poryazova, Rositsa; Bassetti, Claudio L

    2011-07-01

    Sleepwalking (SW) corresponds to a complex sleep-associated behavior that includes locomotion, mental confusion, and amnesia. SW is present in about 10% of children and 2-3% of adults. In a retrospective series of 165 patients with Parkinson's disease (PD), we found adult-onset ("de novo") SW in six (4%) of them. The aim of this study was to assess prospectively and systematically the frequency and characteristics of SW in PD patients. A questionnaire including items on sleep quality, sleep disorders, and specifically also SW and REM sleep behavior disorder (RBD), as well as PD characteristics and severity, was sent to the members of the national PD patients organization in Switzerland. In the study, 36/417 patients (9%) reported SW, of whom 22 (5%) had adult-onset SW. Patients with SW had significantly longer disease duration (p = 0.035), reported hallucinations (p = 0.004) and nightmares (p = 0.003) more often, and had higher scores suggestive of RBD on a validated questionnaire (p = 0.001). Patients with SW were also sleepier (trend toward a higher Epworth Sleepiness Scale score, p = 0.055). Our data suggest that SW in PD patients is (1) more common than in the general population and (2) associated with RBD, nightmares, and hallucinations. Further studies including polysomnographic recordings are needed to confirm the results of this questionnaire-based analysis, to understand the relationship between SW and other nighttime wandering behaviors in PD, and to clarify the underlying mechanisms.

  20. Questionnaire-based survey of parturition in the queen.

    PubMed

    Musters, J; de Gier, J; Kooistra, H S; Okkens, A C

    2011-06-01

    The lack of scientific data concerning whether parturition in the queen proceeds normally or not may prevent veterinarians and cat owners from recognizing parturition problems in time. A questionnaire-based study of parturition in 197 queens was performed to determine several parameters of parturition and their influence on its progress. The mean length of gestation was 65.3 days (range 57 to 72 days) and it decreased with increasing litter size (P = 0.02). The median litter size was 4.5 kittens (range 1 to 9), with more males (53%) than females (46%) (P = 0.05). Sixty-nine percent of the kittens were born in anterior presentation and 31% in posterior presentation, indicating that either can be considered normal in the cat. Males were born in posterior position (34%) more often than females (26%) (P = 0.03). The mean birth weight was 98 g (range of 35 to 167 g) and decreased with increasing litter size (P < 0.01). Mean birth weight was higher in males and kittens born in posterior presentation (P < 0.01). Forty-four (5%) of the 887 kittens were stillborn. This was not correlated with the presentation at expulsion but stillborn kittens were more often female (P = 0.02) and weighed less than those born alive (P = 0.04). The median interkitten time was 30 min (range 2 to 343 min) and 95% were born within 100 min after expulsion of the preceding kitten. The interkitten time as a measure of the progress of parturition was not influenced by the kitten's gender, presentation at expulsion, birth weight, or stillbirth, or by the parity of the queen. The results of this study can be used to develop reference values for parturition parameters in the queen, both to determine whether a given parturition is abnormal and as the basis for a parturition protocol.

  1. Correcting for cell-type effects in DNA methylation studies: reference-based method outperforms latent variable approaches in empirical studies.

    PubMed

    Hattab, Mohammad W; Shabalin, Andrey A; Clark, Shaunna L; Zhao, Min; Kumar, Gaurav; Chan, Robin F; Xie, Lin Ying; Jansen, Rick; Han, Laura K M; Magnusson, Patrik K E; van Grootheest, Gerard; Hultman, Christina M; Penninx, Brenda W J H; Aberg, Karolina A; van den Oord, Edwin J C G

    2017-01-30

    Based on an extensive simulation study, McGregor and colleagues recently recommended the use of surrogate variable analysis (SVA) to control for the confounding effects of cell-type heterogeneity in DNA methylation association studies in scenarios where no cell-type proportions are available. As their recommendation was mainly based on simulated data, we sought to replicate findings in two large-scale empirical studies. In our empirical data, SVA did not fully correct for cell-type effects, its performance was somewhat unstable, and it carried a risk of missing true signals caused by removing variation that might be linked to actual disease processes. By contrast, a reference-based correction method performed well and did not show these limitations. A disadvantage of this approach is that if reference methylomes are not (publicly) available, they will need to be generated once for a small set of samples. However, given the notable risk we observed for cell-type confounding, we argue that, to avoid introducing false-positive findings into the literature, it could be well worth making this investment. Please see the related Correspondence article: https://genomebiology.biomedcentral.com/articles/10.1186/s13059-017-1149-7 and related Research article: https://genomebiology.biomedcentral.com/articles/10.1186/s13059-016-0935-y.
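
    A toy sketch of the reference-based idea, assuming (Houseman-style) that bulk methylation is approximately a non-negative mixture of reference methylomes: estimate per-sample cell-type proportions with non-negative least squares, then residualise each CpG on those proportions before association testing. All data below are simulated.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(3)
    n_cpg, n_types, n_samp = 500, 4, 40
    ref = rng.uniform(0, 1, (n_cpg, n_types))         # reference methylomes
    props = rng.dirichlet(np.ones(n_types), n_samp)   # true mixing proportions
    beta = props @ ref.T + rng.normal(0, 0.02, (n_samp, n_cpg))

    est = np.array([nnls(ref, b)[0] for b in beta])   # one NNLS fit per sample
    est /= est.sum(axis=1, keepdims=True)             # renormalise to proportions

    # Regress each CpG on the estimated proportions; keep the residuals.
    X = np.column_stack([np.ones(n_samp), est[:, :-1]])  # drop one type (collinear)
    beta_adj = beta - X @ np.linalg.lstsq(X, beta, rcond=None)[0]
    print(np.abs(est - props).mean())                 # proportion recovery error
    ```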

  2. Tidal wind, temperature, and density fields from 80-400 km: Results from a physics-based empirical fit model to TIMED observations

    NASA Astrophysics Data System (ADS)

    Oberheide, Jens; Forbes, Jeffrey M.; Zhang, Xiaoli; Bruinsma, Sean; Ward, William E.

    A physics-based empirical fit model (HME, Hough Mode Extensions) is constrained with tidal temperature and wind observations in the MLT region made by the SABER and TIDI instruments on the TIMED satellite. The resulting amplitudes and phases represent self-consistent solutions of the tidal equations for upward propagating tides in horizontal wind, vertical wind, temperature and density, from pole-to-pole and up to 400 km altitude. Monthly mean averages for 2002-2008 are presented for the most important migrating and nonmigrating diurnal and semidiurnal tides. Results are validated with ground-based observations (MLT region) and in-situ tidal diagnostics from the CHAMP satellite (400 km). Although designed as a TIMED-based tidal climatology, the fit approach might be usable to further improve empirical upper atmosphere models, particularly in terms of nonmigrating tides: Each tide (wavenumber/frequency pair) is described by 2-4 latitude and height independent fit parameters per month and 2-4 corresponding time independent two-dimensional basis functions (HMEs). The fit parameters are the same for winds, temperature, and density.
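
    In essence the fit reduces, per tide and per month, to a small linear least-squares problem: project the observed complex tidal amplitudes (amplitude and phase folded into one complex number) onto the tabulated HME basis functions. The sketch below uses random stand-ins for the basis and the observations.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_obs, n_hme = 120, 2                         # observations, basis functions
    B = rng.normal(size=(n_obs, n_hme)) + 1j * rng.normal(size=(n_obs, n_hme))
    w_true = np.array([1.5 - 0.5j, 0.3 + 0.8j])   # the 2-4 fit parameters per month
    obs = B @ w_true + 0.05 * (rng.normal(size=n_obs) + 1j * rng.normal(size=n_obs))

    w, *_ = np.linalg.lstsq(B, obs, rcond=None)   # complex least squares
    print(w.round(3))                             # recovers ~[1.5-0.5j, 0.3+0.8j]
    # The fitted weights reconstruct winds, temperature and density wherever the
    # basis functions are tabulated: field = B_new @ w.
    ```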

  3. The Problems with Access to Compulsory Education in China and the Effects of the Policy of Direct Subsidies to Students: An Empirical Study Based on a Small Sample

    ERIC Educational Resources Information Center

    Yanqing, Ding

    2012-01-01

    After a brief review of the achievements and the problems in compulsory education enrollment in the thirty years since the reform and opening up, this study analyzes the current compulsory education enrollment and dropout rates in China's least-developed regions and the factors affecting school enrollment based on survey data from a small sample…

  4. Exoplanets -New Results from Space and Ground-based Surveys

    NASA Astrophysics Data System (ADS)

    Udry, Stephane

    The exploration of the outer solar system and in particular of the giant planets and their environments is an on-going process with the Cassini spacecraft currently around Saturn, the Juno mission to Jupiter preparing to depart and two large future space missions planned to launch in the 2020-2025 time frame for the Jupiter system and its satellites (Europa and Ganymede) on the one hand, and the Saturnian system and Titan on the other hand [1,2]. Titan, Saturn's largest satellite, is the only other object in our Solar system to possess an extensive nitrogen atmosphere, host to an active organic chemistry, based on the interaction of N2 with methane (CH4). Following the Voyager flyby in 1980, Titan has been intensely studied from the ground-based large telescopes (such as the Keck or the VLT) and by artificial satellites (such as the Infrared Space Observatory and the Hubble Space Telescope) for the past three decades. Prior to Cassini-Huygens, Titan's atmospheric composition was thus known to us from the Voyager missions and also through the explorations by the ISO. Our perception of Titan had thus greatly been enhanced accordingly, but many questions remained as to the nature of the haze surrounding the satellite and the composition of the surface. The recent revelations by the Cassini-Huygens mission have managed to surprise us with many discoveries [3-8] and have yet to reveal more of the interesting aspects of the satellite. The Cassini-Huygens mission to the Saturnian system has been an extraordinary success for the planetary community since the Saturn-Orbit-Insertion (SOI) in July 2004 and again the very successful probe descent and landing of Huygens on January 14, 2005. One of its main targets was Titan. Titan was revealed to be a complex world more like the Earth than any other: it has a dense mostly nitrogen atmosphere and active climate and meteorological cycles where the working fluid, methane, behaves under Titan conditions the way that water does on

  5. Empirical model of global thermospheric temperature and composition based on data from the OGO-6 quadrupole mass spectrometer

    NASA Technical Reports Server (NTRS)

    Hedin, A. E.; Mayr, H. G.; Reber, C. A.; Spencer, N. W.; Carignan, G. R.

    1972-01-01

    An empirical global model for magnetically quiet conditions has been derived from longitudinally averaged N2, O, and He densities by means of an expansion in spherical harmonics. The data were obtained by the OGO-6 neutral mass spectrometer and cover the altitude range 400 to 600 km for the period 27 June 1969 to 13 May 1971. The accuracy of the analytical description is of the order of the experimental error for He and O and about three times experimental error for N2, thus providing a reasonable overall representation of the satellite observations. Two model schemes are used: one representing densities extrapolated to 450 km and one representing densities extrapolated to 120 km with exospheric temperatures inferred from N2 densities. Using the best fit model parameters the global thermospheric structure is presented in the form of a number of contour plots.
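
    Because the densities are longitudinally averaged, the zonal (m = 0) slice of such a spherical-harmonic expansion is simply a Legendre series in sin(latitude); the sketch below fits a low-order series of that form to hypothetical log-density values.

    ```python
    import numpy as np
    from numpy.polynomial import legendre as L

    lat = np.radians(np.linspace(-80, 80, 33))
    x = np.sin(lat)

    # Hypothetical zonal-mean log10 He density: P0, P1 and P2 structure + noise.
    rng = np.random.default_rng(4)
    logn = 13.0 - 0.6 * x + 0.4 * (1.5 * x**2 - 0.5) + rng.normal(0, 0.02, x.size)

    coef = L.legfit(x, logn, deg=3)               # coefficients of P0..P3
    resid = logn - L.legval(x, coef)
    print(coef.round(3), np.abs(resid).max().round(3))
    ```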

  6. A 16-year examination of domestic violence among Asians and Asian Americans in the empirical knowledge base: a content analysis.

    PubMed

    Yick, Alice G; Oomen-Early, Jody

    2008-08-01

    Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most atheoretical and focusing on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, with most guided by logical positivism using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings.

  7. Estimating nonresponse bias in a telephone-based health surveillance survey in New York City.

    PubMed

    Lim, Sungwoo; Immerwahr, Stephen; Lee, Sunghee; Harris, Tiffany G

    2013-10-15

    Despite concerns about nonresponse bias due to decreasing response rates, telephone surveys remain a viable option for conducting local population-based surveillance. However, this becomes problematic for urban populations, which typically have higher nonresponse rates. Unfortunately, traditional methods of evaluating nonresponse bias pose challenges for public health practitioners due to high costs. In this study, we sought to increase understanding of survey nonresponse at the zip code level in an urban area and to demonstrate the use of a practical tool for assessing nonresponse bias. Data from the 2008 New York City Community Health Survey, a landline telephone survey of residential households in New York, New York, were matched with zip-code-level data from the 2000 US Census. Although response rates varied across zip codes and zip-code-level sociodemographic characteristics, estimated nonresponse bias for the 5 health measures (general health status, current health insurance coverage, asthma, binge drinking, and physical activity) was not substantial (ranging from -3.8% to 2.4%). Findings confirmed previous research that survey participation rates can vary a great deal across small areas and that there is no direct relationship between response rates and nonresponse bias. This study highlights the importance of assessing nonresponse bias for local urban surveys and demonstrates a workable assessment tool.
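
    The key point can be illustrated with the standard decomposition of nonresponse bias for a sample mean, which makes clear why a low response rate alone need not produce a large bias; the numbers below are hypothetical.

    ```python
    # bias(ybar_respondents) ~= (1 - response_rate) * (ybar_r - ybar_nr):
    # it takes BOTH substantial nonresponse AND a respondent/nonrespondent
    # difference on the outcome to generate a large bias.
    response_rate = 0.35
    ybar_r = 0.78    # e.g., insurance coverage among respondents
    ybar_nr = 0.74   # proxied from census-linked zip-code data

    bias = (1 - response_rate) * (ybar_r - ybar_nr)
    print(f"estimated bias: {bias:+.3f}")  # +0.026, comparable to the range above
    ```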

  8. Evaluation of Midwater Trawl Selectivity and its Influence on Acoustic-Based Fish Population Surveys

    NASA Astrophysics Data System (ADS)

    Williams, Kresimir

    Trawls are used extensively during fisheries abundance surveys to derive estimates of fish density and, in the case of acoustic-based surveys, to identify acoustically sampled fish populations. However, trawls are selective in what fish they retain, resulting in biased estimates of density, species, and size compositions. Selectivity of the midwater trawl used in acoustic-based surveys of walleye pollock (Theragra chalcogramma) was evaluated using multiple methods. The effects of trawl selectivity on the acoustic-based survey abundance estimates and the stock assessment were evaluated for the Gulf of Alaska walleye pollock population. Selectivity was quantified using recapture, or pocket, nets attached to the outside of the trawl. Pocket net catches were modeled using a hierarchical Bayesian model to provide uncertainty in selectivity parameter estimates. Significant under-sampling of juvenile pollock by the midwater trawl was found, with lengths at 50% retention ranging from 14-26 cm over three experiments. Escapement was found to be light dependent, with more fish escaping in dark conditions. The highest escapement rates were observed in the aft of the trawl near the codend, through the bottom panel of the trawl. The behavioral mechanisms involved in the process of herding and escapement were evaluated using stereo-cameras, a DIDSON high-frequency imaging sonar, and pocket nets. Fish maintained greater distances from the trawl panel during daylight, suggesting trawl modifications such as increased visibility of netting materials may evoke stronger herding responses and increased retention of fish. Selectivity and catchability of pollock by the midwater trawl were also investigated using acoustic density as an independent estimate of fish abundance to compare with trawl catches. A modeling framework was developed to evaluate potential explanatory factors for selectivity and catchability. Selectivity estimates were dependent on which vessel was used for the survey.
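
    The retention pattern described above is conventionally summarised by a logistic selectivity curve with parameters L50 and a selection range; the sketch below fits one to hypothetical pocket-net proportions by ordinary nonlinear least squares rather than the hierarchical Bayesian model used in the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Per length bin: fish retained in the codend out of 100 caught (invented).
    length = np.array([8, 12, 16, 20, 24, 28, 32, 36], dtype=float)   # cm
    retained = np.array([3, 10, 25, 52, 80, 92, 97, 99], dtype=float)

    def retention(L, L50, delta):
        """Logistic retention r(L) = 1 / (1 + exp(-(L - L50)/delta))."""
        return 1.0 / (1.0 + np.exp(-(L - L50) / delta))

    (L50, delta), _ = curve_fit(retention, length, retained / 100.0, p0=[20.0, 3.0])
    print(f"L50 = {L50:.1f} cm, selection range (L75-L25) = {2*np.log(3)*delta:.1f} cm")
    ```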

  9. An in-Depth Survey of Visible Light Communication Based Positioning Systems

    PubMed Central

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    While visible light communication (VLC) has become the candidate for the wireless technology of the 21st century due to its inherent advantages, VLC based positioning also has a great chance of becoming the standard approach to positioning. Within the last few years, many studies on VLC based positioning have been published, but there are not many survey works in this field. In this paper, an in-depth survey of VLC based positioning systems is provided. More than 100 papers ranging from pioneering papers to the state-of-the-art in the field were collected and classified based on the positioning algorithms, the types of receivers, and the multiplexing techniques. In addition, current issues and research trends in VLC based positioning are discussed. PMID:27187395

  10. An in-Depth Survey of Visible Light Communication Based Positioning Systems.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-05-12

    While visible light communication (VLC) has become the candidate for the wireless technology of the 21st century due to its inherent advantages, VLC based positioning also has a great chance of becoming the standard approach to positioning. Within the last few years, many studies on VLC based positioning have been published, but there are not many survey works in this field. In this paper, an in-depth survey of VLC based positioning systems is provided. More than 100 papers ranging from pioneering papers to the state-of-the-art in the field were collected and classified based on the positioning algorithms, the types of receivers, and the multiplexing techniques. In addition, current issues and research trends in VLC based positioning are discussed.

  11. An HIV epidemic model based on viral load dynamics: value in assessing empirical trends in HIV virulence and community viral load.

    PubMed

    Herbeck, Joshua T; Mittler, John E; Gottlieb, Geoffrey S; Mullins, James I

    2014-06-01

    Trends in HIV virulence have been monitored since the start of the AIDS pandemic, as studying HIV virulence informs our understanding of HIV epidemiology and pathogenesis. Here, we model changes in HIV virulence as a strictly evolutionary process, using set point viral load (SPVL) as a proxy, to make inferences about empirical SPVL trends from longitudinal HIV cohorts. We develop an agent-based epidemic model based on HIV viral load dynamics. The model contains functions for viral load and transmission, SPVL and disease progression, viral load trajectories in multiple stages of infection, and the heritability of SPVL across transmissions. We find that HIV virulence evolves to an intermediate level that balances infectiousness with longer infected lifespans, resulting in an optimal SPVL of ∼4.75 log10 viral RNA copies/mL. Adaptive viral evolution may explain observed HIV virulence trends: our model produces SPVL trends with magnitudes that are broadly similar to empirical trends. With regard to variation among studies in empirical SPVL trends, results from our model suggest that variation may be explained by the specific epidemic context, e.g. the mean SPVL of the founding lineage or the age of the epidemic, or by improvements in HIV screening and diagnosis that result in sampling biases. We also use our model to examine trends in community viral load, a population-level measure of HIV viral load that is thought to reflect a population's overall transmission potential. We find that community viral load evolves in association with SPVL, in the absence of prevention programs such as antiretroviral therapy, and that the mean community viral load is not necessarily a strong predictor of HIV incidence.
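
    The intermediate-virulence result can be made concrete with a toy trade-off calculation: transmission potential as the product of a transmission rate that rises with SPVL and an infected lifespan that falls with it. The functional forms and constants below are hypothetical stand-ins, not the model's fitted functions.

    ```python
    import numpy as np

    V = np.linspace(2.0, 7.0, 501)        # SPVL grid, log10 RNA copies/mL
    viral_load = 10.0**V

    # Transmission rate per year saturates with viral load (Hill-type curve)...
    rate = 0.3 * viral_load / (viral_load + 1e4)
    # ...while time to AIDS shrinks as the set point rises.
    duration = 25.0 * 10**4.5 / (10**4.5 + viral_load) + 2.0

    potential = rate * duration           # expected transmissions per infection
    print(f"optimal SPVL ~ {V[np.argmax(potential)]:.2f} log10 copies/mL")
    # -> an intermediate optimum (~4.3 with these toy constants)
    ```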

  12. AN A PRIORI INVESTIGATION OF ASTROPHYSICAL FALSE POSITIVES IN GROUND-BASED TRANSITING PLANET SURVEYS

    SciTech Connect

    Evans, Tom M.; Sackett, Penny D.

    2010-03-20

    Astrophysical false positives due to stellar eclipsing binaries pose one of the greatest challenges to ground-based surveys for transiting hot Jupiters. We have used known properties of multiple star systems and hot Jupiter systems to predict, a priori, the number of such false detections and the number of genuine planet detections recovered in two hypothetical but realistic ground-based transit surveys targeting fields close to the galactic plane (b ≈ 10°): a shallow survey covering a magnitude range 10 < V < 13 and a deep survey covering a magnitude range 15 < V < 19. Our results are consistent with the commonly reported experience of false detections outnumbering planet detections by a factor of ≈10 in shallow surveys, while in our synthetic deep survey we find ≈1-2 false detections for every planet detection. We characterize the eclipsing binary configurations that are most likely to cause false detections and find that they can be divided into three main types: (1) two dwarfs undergoing grazing transits, (2) two dwarfs undergoing low-latitude transits in which one component has a substantially smaller radius than the other, and (3) two eclipsing dwarfs blended with one or more physically unassociated foreground stars. We also predict that a significant fraction of hot Jupiter detections are blended with the light from other stars, showing that care must be taken to identify the presence of any unresolved neighbors in order to obtain accurate estimates of planetary radii. This issue is likely to extend to terrestrial planet candidates in the CoRoT and Kepler transit surveys, for which neighbors of much fainter relative brightness will be important.

  13. Earthquake Scenarios Based Upon the Data and Methodologies of the U.S. Geological Survey's National Seismic Hazard Mapping Project

    NASA Astrophysics Data System (ADS)

    Rukstales, K. S.; Petersen, M. D.; Frankel, A. D.; Harmsen, S. C.; Wald, D. J.; Quitoriano, V. R.; Haller, K. M.

    2011-12-01

    The U.S. Geological Survey's (USGS) National Seismic Hazard Mapping Project (NSHMP) utilizes a database of over 500 faults across the conterminous United States to constrain earthquake source models for probabilistic seismic hazard maps. Additionally, the fault database is now being used to produce a suite of deterministic ground motions for earthquake scenarios that are based on the same fault source parameters and empirical ground motion prediction equations used for the probabilistic hazard maps. Unlike the calculated hazard map ground motions, local soil amplification is applied to the scenario calculations based on the best available Vs30 (average shear-wave velocity down to 30 meters) mapping, or in some cases using topographic slope as a proxy. Systematic outputs include all standard USGS ShakeMap products, including GIS, KML, XML, and HAZUS input files. These data are available from the ShakeMap web pages with a searchable archive. The scenarios are being produced within the framework of a geographic information system (GIS) so that alternative scenarios can readily be produced by altering fault source parameters, Vs30 soil amplification, as well as the weighting of ground motion prediction equations used in the calculations. The alternative scenarios can then be used for sensitivity analysis studies to better characterize uncertainty in the source model and convey this information to decision makers. By providing a comprehensive collection of earthquake scenarios based upon the established data and methods of the USGS NSHMP, we hope to provide a well-documented source of data which can be used for visualization, planning, mitigation, loss estimation, and research purposes.

  14. HIV/AIDS Misconceptions among Latinos: Findings from a Population-Based Survey of California Adults

    ERIC Educational Resources Information Center

    Ritieni, Assunta; Moskowitz, Joel; Tholandi, Maya

    2008-01-01

    Misconceptions about HIV/AIDS among Latino adults (N=454) in California were examined using data from a population-based telephone survey conducted in 2000. Common misconceptions concerning modes of HIV transmission included transmission via mosquito or animal bite (64.1%), public facilities (48.3%), or kissing someone on the cheek (24.8%). A…

  15. Sexually Transmitted Diseases and Risk Behaviors among California Farmworkers: Results from a Population-Based Survey

    ERIC Educational Resources Information Center

    Brammeier, Monique; Chow, Joan M.; Samuel, Michael C.; Organista, Kurt C.; Miller, Jamie; Bolan, Gail

    2008-01-01

    Context: The prevalence of sexually transmitted diseases and associated risk behaviors among California farmworkers is not well described. Purpose: To estimate the prevalence of sexually transmitted diseases (STDs) and associated risk behaviors among California farmworkers. Methods: Cross-sectional analysis of population-based survey data from 6…

  16. A Community-Based Activities Survey: Systematically Determining the Impact on and of Faculty

    ERIC Educational Resources Information Center

    Perry, Lane; Farmer, Betty; Onder, David; Tanner, Benjamin; Burton, Carol

    2015-01-01

    As a descriptive case study from Western Carolina University (WCU), this article describes the development of a measuring, monitoring, and tracking system (the WCU Community-based Activities Survey) for faculty engagement in, adoption of, and impact through community engagement practices both internal and external to their courses. This paper will…

  17. Overview of Results from 1994 & 1995 School-Based Decision Making Surveys.

    ERIC Educational Resources Information Center

    Lindle, Jane Clark; Gale, Bruce S.; Curry-White, Brenda

    The 1994 and 1995 School-Based Decision Making (SBDM) Surveys were conducted in the fall of each of those years for the Study of Education Policy. This report compares the 1994 and 1995 responses to three questions: (1) What do people think of the effectiveness of SBDM? (2) Who is involved in the SBDM decisions? and (3) What are councils doing?…

  18. Medical Students' Experiences with Addicted Patients: A Web-Based Survey

    ERIC Educational Resources Information Center

    Midmer, Deana; Kahan, Meldon; Wilson, Lynn

    2008-01-01

    Project CREATE was an initiative to strengthen undergraduate medical education in addictions. As part of a needs assessment, forty-six medical students at Ontario's five medical schools completed a bi-weekly, interactive web-based survey about addiction-related learning events. In all, 704 unique events were recorded, for an average of 16.7…

  19. Our Environment, Our Health: A Community-Based Participatory Environmental Health Survey in Richmond, California

    ERIC Educational Resources Information Center

    Cohen, Alison; Lopez, Andrea; Malloy, Nile; Morello-Frosch, Rachel

    2012-01-01

    This study presents a health survey conducted by a community-based participatory research partnership between academic researchers and community organizers to consider environmental health and environmental justice issues in four neighborhoods of Richmond, California, a low-income community of color living along the fence line of a major oil…

  20. School Nutrition Directors are Receptive to Web-Based Training Opportunities: A National Survey

    ERIC Educational Resources Information Center

    Zoellner, Jamie; Carr, Deborah H.

    2009-01-01

    Purpose/Objective: The purpose of this study was to investigate school nutrition directors' (SNDs) previous experience with web-based training (WBT), interest in utilizing WBT within 14 functional areas, and logistical issues (time, price, educational credits, etc.) of developing and delivering WBT learning modules. Methods: A survey was developed…

  1. Survey Based Needs Assessment; A Paradigm for Planning the Decentralization of Continuing Health Professional Education.

    ERIC Educational Resources Information Center

    Fryer, George E., Jr.; Krugman, Richard D.

    1981-01-01

    Efforts of the SEARCH/AHEC (Statewide Education Activities for Rural Colorado's Health/Area Health Education Center) Program to base the conduct of administration of its most important program component on results of a survey of potential recipients of its services are described. (Author/GK)

  2. What Attributes Determine Severity of Function in Autism? A Web-Based Survey of Stakeholders

    ERIC Educational Resources Information Center

    Di Rezze, Briano; Rosenbaum, Peter; Zwaigenbaum, Lonnie

    2012-01-01

    Service providers and researchers in autism spectrum disorders (ASD) are challenged to categorize clinical variation in function. Classification systems for children with cerebral palsy have enabled clinicians and families to describe levels of function. A web-based survey engaged international ASD stakeholders to advise on considerations of…

  3. Sources of Error in Substance Use Prevalence Surveys

    PubMed Central

    Johnson, Timothy P.

    2014-01-01

    Population-based estimates of substance use patterns have been regularly reported now for several decades. Concerns with the quality of the survey methodologies employed to produce those estimates date back almost as far. Those concerns have led to a considerable body of research specifically focused on understanding the nature and consequences of survey-based errors in substance use epidemiology. This paper reviews and summarizes that empirical research by organizing it within a total survey error model framework that considers multiple types of representation and measurement errors. Gaps in our knowledge of error sources in substance use surveys and areas needing future research are also identified. PMID:27437511

  4. Accuracies of the empirical theories of the escape probability based on Eigen model and Braun model compared with the exact extension of Onsager theory.

    PubMed

    Wojcik, Mariusz; Tachiya, M

    2009-03-14

    This paper deals with the exact extension of the original Onsager theory of the escape probability to the case of finite recombination rate at nonzero reaction radius. The empirical theories based on the Eigen model and the Braun model, which are applicable in the absence and presence of an external electric field, respectively, are based on a wrong assumption that both recombination and separation processes in geminate recombination follow exponential kinetics. The accuracies of the empirical theories are examined against the exact extension of the Onsager theory. The Eigen model gives the escape probability in the absence of an electric field, which is different by a factor of 3 from the exact one. We have shown that this difference can be removed by operationally redefining the volume occupied by the dissociating partner before dissociation, which appears in the Eigen model as a parameter. The Braun model gives the escape probability in the presence of an electric field, which is significantly different from the exact one over the whole range of electric fields. Appropriate modification of the original Braun model removes the discrepancy at zero or low electric fields, but it does not affect the discrepancy at high electric fields. In all the above theories it is assumed that recombination takes place only at the reaction radius. The escape probability in the case when recombination takes place over a range of distances is also calculated and compared with that in the case of recombination only at the reaction radius.
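
    For orientation, the zero-field Onsager result that the Eigen-model comparison refers to can be stated compactly. This is the textbook limit of instantaneous recombination at contact, not the paper's exact extension to a finite recombination rate at a nonzero reaction radius:

    ```latex
    % Zero-field escape probability of a geminate pair created at initial
    % separation r_0, assuming instantaneous recombination at contact:
    \varphi(r_0) = \exp\!\left(-\frac{r_c}{r_0}\right),
    \qquad
    r_c = \frac{e^2}{4\pi\varepsilon_0\varepsilon_r k_B T}
    % r_c is the Onsager radius: the separation at which the Coulomb
    % attraction equals the thermal energy k_B T.
    ```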

  5. Creation of an Empirical Energy-Balance Based Snow Module Simulating Both Snowmelt and Snow Accumulation for Mountain Hydrology

    NASA Astrophysics Data System (ADS)

    Riboust, P.; Le Moine, N.; Thirel, G.; Ribstein, P.

    2015-12-01

    In Nordic and mountainous regions, hydrological processes are more complex than for regular rainfall-driven watersheds. Snow accumulates in winter, acting as a reservoir, and melts during late spring and summer. In order to take these additional natural processes of mountainous watersheds into account, snow modules have been created to help rainfall-runoff models simulate river discharge. Many empirical degree-day snow models have been designed to simulate snowmelt and river discharge when coupled to a rainfall-runoff model, but few of them correctly simulate the amount of snow water equivalent (SWE) at point scale. Simulating correctly not only the amount of snowmelt but also the water content of the snowpack has several potential advantages: it improves the model's reliability and performance for short-term and long-term prediction and for spatial regionalization, and it makes it possible to perform data assimilation using observed snow measurements. The objective of our study is to create a new simple empirical snow module with a structure allowing the use of snow data for calibration or assimilation. We used a model structure close to the snow model defined by M.T. Walter (2005), where each of the processes of the energy balance is parameterized using only temperature and precipitation data. The conductive fluxes into the snowpack have been modeled using analytical solutions to the heat equation with phase change. This model is in-between the degree-day and the physical energy-balance approaches. It has the advantage of using only temperature and precipitation, which are widely available data, while taking energy-balance processes into account without being computationally intensive. Another advantage is that all state variables of the model should be comparable with observable measurements. For the moment, the snow module has been parameterized at point scale and has been tested over Switzerland and the US, using MeteoSwiss and SNOTEL USGS
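
    For contrast with the hybrid parameterization described above, the degree-day end of the spectrum can be written in a few lines from the same two inputs. The sketch below is a generic temperature-index module, not the authors' model (their conductive-flux terms are omitted), and the thresholds and degree-day factor are assumed values:

    ```python
    # Minimal temperature-index (degree-day) snow module. All parameter
    # values are assumptions for illustration only.
    T_SNOW = 0.0  # rain/snow partition temperature (deg C), assumed
    T_MELT = 0.0  # melt threshold temperature (deg C), assumed
    DDF = 3.0     # degree-day factor (mm per deg C per day), assumed

    def step(swe, temp_c, precip_mm):
        """Advance snow water equivalent (SWE, mm) by one day."""
        rain = 0.0
        if temp_c <= T_SNOW:
            swe += precip_mm  # precipitation accumulates as snowfall
        else:
            rain = precip_mm
        melt = min(swe, DDF * max(temp_c - T_MELT, 0.0))
        return swe - melt, melt + rain  # new SWE, liquid water released

    swe = 0.0
    for t, p in [(-5, 10), (-2, 5), (1, 0), (4, 2)]:
        swe, outflow = step(swe, t, p)
        print(f"T={t:+d} C  P={p} mm -> SWE={swe:.1f} mm, outflow={outflow:.1f} mm")
    ```

    Because SWE is an explicit state variable, a simulated series like this one can be compared with, or nudged toward, observed SWE during calibration or assimilation.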

  6. Experience base for Radioactive Waste Thermal Processing Systems: A preliminary survey

    SciTech Connect

    Mayberry, J.; Geimer, R.; Gillins, R.; Steverson, E.M.; Dalton, D.; Anderson, G.L.

    1992-04-01

    In the process of considering thermal technologies for potential treatment of the Idaho National Engineering Laboratory mixed transuranic-contaminated wastes, a preliminary survey of the experience base available from Radioactive Waste Thermal Processing Systems is reported. A list of known commercial radioactive waste facilities in the United States and some international thermal treatment facilities is provided. The survey focuses on US Department of Energy thermal treatment facilities. A brief facility description and a preliminary summary of facility status and problems experienced are provided for a selected subset of the DOE facilities.

  7. Institution-Specific Victimization Surveys: Addressing Legal and Practical Disincentives to Gender-Based Violence Reporting on College Campuses.

    PubMed

    Cantalupo, Nancy Chi

    2014-07-01

    This review brings together both the legal literature and original empirical research regarding the advisability of amending the Jeanne Clery Disclosure of Campus Security Policy and Campus Crime Statistics Act or creating new Department of Education regulations to mandate that all higher education institutions survey their students approximately every 5 years about students' experiences with sexual violence. Legal research conducted regarding the three relevant federal legal regimes shows inconsistent incentives for schools to encourage victim reporting and proactively address sexual violence on campus. Moreover, the original research carried out for this article shows that the experience of institutions that have voluntarily conducted such surveys suggests many benefits not only for students, prospective students, parents, and the general public but also for schools themselves. These experiences confirm the practical viability of a mandated survey by the Department of Education.

  8. Law of Empires.

    ERIC Educational Resources Information Center

    Martz, Carlton

    2001-01-01

    This issue of "Bill of Rights in Action" explores issues raised by empires and imperial law. The first article, "Clash of Empires: The Fight for North America," looks at the clash of empires and the fight for North America during the 18th century. The second article, "When Roman Law Ruled the Western World," examines…

  9. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    SciTech Connect

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along tighter-spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground

  10. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    NASA Technical Reports Server (NTRS)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    1992-01-01

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  11. "Suntelligence" Survey

    MedlinePlus

    ... to the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure ... be able to view a ranking of major cities' suntelligence based on residents' responses to this survey. ...

  12. Illumination Variation-Resistant Video-Based Heart Rate Measurement Using Joint Blind Source Separation and Ensemble Empirical Mode Decomposition.

    PubMed

    Cheng, Juan; Chen, Xun; Xu, Lingxi; Wang, Z Jane

    2016-10-06

    Recent studies have demonstrated that heart rate (HR) could be estimated using video data (e.g., exploring human facial regions of interest (ROIs)) under well controlled conditions. However, in practice, the pulse signals may be contaminated by motions and illumination variations. In this paper, tackling the illumination variation challenge, we propose an illumination-robust framework using joint blind source separation (JBSS) and ensemble empirical mode decomposition (EEMD) to effectively evaluate HR from webcam videos. The framework rests on the hypothesis that the facial ROI and the background ROI share similar illumination variations. The background ROI is then considered as a noise reference sensor to denoise the facial signals, using the JBSS technique to extract the underlying illumination variation sources. Further, the reconstructed illumination-resistant green channel of the facial ROI is detrended and decomposed into a number of intrinsic mode functions (IMFs) using EEMD to estimate the HR. Experimental results demonstrated that the proposed framework could estimate HR more accurately than the state-of-the-art methods. The Bland-Altman plots showed that it led to better agreement with the HR ground truth, with a mean bias of 1.15 beats per minute (bpm), 95% limits of agreement from -15.43 bpm to 17.73 bpm, and a correlation coefficient of 0.53. This study provides a promising solution for realistic non-contact and robust HR measurement applications.
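
    A minimal sketch of the EEMD stage alone (the JBSS denoising step is omitted), assuming the third-party PyEMD package (installed as EMD-signal) and a hypothetical 30 Hz frame rate. Each IMF's dominant spectral peak is screened against a plausible HR band, and the strongest surviving peak is converted to beats per minute:

    ```python
    import numpy as np
    from PyEMD import EEMD  # pip install EMD-signal; third-party package

    FS = 30.0             # video frame rate (Hz), assumed
    HR_BAND = (0.7, 4.0)  # plausible heart-rate band, 42-240 bpm

    def estimate_hr(green_trace):
        """Estimate HR (bpm) from a detrended facial green-channel trace."""
        x = np.asarray(green_trace, dtype=float)
        imfs = EEMD().eemd(x)  # ensemble empirical mode decomposition
        freqs = np.fft.rfftfreq(x.size, d=1.0 / FS)
        best = None
        for imf in imfs:
            spec = np.abs(np.fft.rfft(imf))
            f0 = freqs[np.argmax(spec)]  # dominant frequency of this IMF
            if HR_BAND[0] <= f0 <= HR_BAND[1] and (best is None or spec.max() > best[0]):
                best = (spec.max(), f0)
        return None if best is None else best[1] * 60.0

    # Synthetic check: a noisy 72-bpm (1.2 Hz) pulse sampled at 30 Hz.
    t = np.arange(0, 20, 1.0 / FS)
    trace = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)
    print(estimate_hr(trace))
    ```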

  13. HI4PI: A full-sky H I survey based on EBHIS and GASS

    NASA Astrophysics Data System (ADS)

    HI4PI Collaboration; Ben Bekhti, N.; Flöer, L.; Keller, R.; Kerp, J.; Lenz, D.; Winkel, B.; Bailin, J.; Calabretta, M. R.; Dedes, L.; Ford, H. A.; Gibson, B. K.; Haud, U.; Janowiecki, S.; Kalberla, P. M. W.; Lockman, F. J.; McClure-Griffiths, N. M.; Murphy, T.; Nakanishi, H.; Pisano, D. J.; Staveley-Smith, L.

    2016-10-01

    Context. Measurement of the Galactic neutral atomic hydrogen (H I) column density, NH I, and brightness temperatures, TB, is of high scientific value for a broad range of astrophysical disciplines. In the past two decades, one of the most-used legacy H I datasets has been the Leiden/Argentine/Bonn Survey (LAB). Aims: We release the H I 4π survey (HI4PI), an all-sky database of Galactic H I, which supersedes the LAB survey. Methods: The HI4PI survey is based on data from the recently completed first coverage of the Effelsberg-Bonn H I Survey (EBHIS) and from the third revision of the Galactic All-Sky Survey (GASS). EBHIS and GASS share similar angular resolution and match well in sensitivity. Combined, they are ideally suited to be a successor to LAB. Results: The new HI4PI survey outperforms the LAB in angular resolution (ϑ_FWHM = 16.2 arcmin) and sensitivity (σ_rms = 43 mK). Moreover, it has full spatial sampling and thus overcomes a major drawback of LAB, which severely undersamples the sky. We publish all-sky column density maps of the neutral atomic hydrogen in the Milky Way, along with full spectroscopic data, in several map projections including HEALPix. HI4PI datasets are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/594/A116
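
    A minimal sketch of reading a HEALPix-projection column-density product with the healpy package; the local file name is hypothetical, standing in for a map retrieved from the CDS archive listed above:

    ```python
    import healpy as hp  # pip install healpy

    # Hypothetical local file name for the all-sky N_HI map in HEALPix
    # projection, downloaded from the CDS archive (J/A+A/594/A116).
    nhi = hp.read_map("hi4pi_nhi_hpx.fits")

    nside = hp.get_nside(nhi)  # HEALPix resolution parameter of the map
    l, b = 30.0, 5.0           # example Galactic coordinates (deg)
    pix = hp.ang2pix(nside, l, b, lonlat=True)  # map assumed in Galactic frame
    print(f"N_HI at (l={l}, b={b}): {nhi[pix]:.3e} cm^-2")
    ```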

  14. Our environment, our health: a community-based participatory environmental health survey in Richmond, California.

    PubMed

    Cohen, Alison; Lopez, Andrea; Malloy, Nile; Morello-Frosch, Rachel

    2012-04-01

    This study presents a health survey conducted by a community-based participatory research partnership between academic researchers and community organizers to consider environmental health and environmental justice issues in four neighborhoods of Richmond, California, a low-income community of color living along the fence line of a major oil refinery and near other industrial and mobile sources of pollution. The Richmond health survey aimed to assess local concerns and perceptions of neighborhood conditions, health problems, mobile and stationary hazards, access to health care, and other issues affecting residents of Richmond. Although respondents thought their neighborhoods were good places to live, they expressed concerns about neighborhood stressors and particular sources of pollution, and identified elevated asthma rates for children and long-time Richmond residents. The Richmond health survey offers a holistic, community-centered perspective to understanding local environmental health issues, and can inform future environmental health research and organizing efforts for community-university collaboratives.

  15. 1995 Area 1 bird survey/Zone 1, Operable Unit 2, Robins Air Force Base, Georgia

    SciTech Connect

    Wade, M.C.

    1995-08-01

    Robins Air Force Base is located in Warner Robins, Georgia, approximately 90 miles southeast of Atlanta, Georgia. As part of the Baseline Investigation (CDM Federal 1994), a two-day bird survey was conducted by M. C. Wade (Oak Ridge National Laboratory) and B. A. Beatty (CDM Federal Programs) in May 1995. The subject area of investigation includes the sludge lagoon, Landfill No. 4, and the wetland area east of the landfill and west of Hannah Road (including two ponds). This is known as Area 1. The Area 1 wetlands include bottomland hardwood forest, stream, and pond habitats. The objectives of this survey were to document bird species using the Area 1 wetlands and to see if the change in hydrology (due to the installation of the Sewage Treatment Plant effluent diversion and stormwater runon control systems) has resulted in changes at Area 1 since the previous survey of May 1992 (CDM Federal 1994).

  16. The Importance of Adhering to Details of the Total Design Method (TDM) for Mail Surveys.

    ERIC Educational Resources Information Center

    Dillman, Don A.; And Others

    1984-01-01

    The empirical effects of adherence to details of the Total Design Method (TDM) approach to the design of mail surveys are discussed, based on the implementation of a common survey in 11 different states. The results suggest that greater adherence results in higher response, especially in the later stages of the TDM. (BW)

  17. Considerations for Conducting Web-Based Survey Research With People Living With Human Immunodeficiency Virus Using a Community-Based Participatory Approach

    PubMed Central

    Solomon, Patricia; Worthington, Catherine; Ibáñez-Carrasco, Francisco; Baxter, Larry; Nixon, Stephanie A; Baltzer-Turje, Rosalind; Robinson, Greg; Zack, Elisse

    2014-01-01

    Background: Web- or Internet-based surveys are increasingly popular in health survey research. However, the strengths and challenges of Web-based surveys with people living with human immunodeficiency virus (HIV) are unclear. Objective: The aim of this article is to describe our experience piloting a cross-sectional, Web-based, self-administered survey with adults living with HIV using a community-based participatory research approach. Methods: We piloted a Web-based survey that investigated disability and rehabilitation services use with a sample of adults living with HIV in Canada. Community organizations in five provinces emailed invitations to clients, followed by a thank you/reminder one week later. We obtained survey feedback in a structured phone interview with respondents. Participant responses were transcribed verbatim and analyzed using directed content analysis. Results: Of 30 people living with HIV who accessed the survey link, 24/30 (80%) initiated and 16/30 (53%) completed the survey instrument. A total of 17 respondents participated in post-survey interviews. Participants described the survey instrument as comprehensive, suggesting content validity. The majority (13/17, 76%) felt instruction and item wording were clear and easy to understand, and found the software easy to navigate. Participants felt having a pop-up reminder directing them to missed items would be useful. Conclusions: Strengths of implementing the Web-based survey included: our community-based participatory approach, ease of software use, ability for respondents to complete the questionnaire on one’s own time at one’s own pace, opportunity to obtain geographic variation, and potential for respondent anonymity. Considerations for future survey implementation included: respondent burden and fatigue, the potentially sensitive nature of HIV Web-based research, data management and storage, challenges verifying informed consent, varying computer skills among respondents, and the burden on

  18. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background: The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective: Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods: After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients’ true scores following a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results: We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions: With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
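
    The adaptive logic itself is compact. The sketch below uses a dichotomous Rasch model rather than the partial credit model of the study, with an invented 70-item bank and a fixed 10-item stopping rule: each step administers the unused item with maximum Fisher information at the current ability estimate, then re-estimates ability from all responses so far.

    ```python
    import numpy as np

    def p_correct(theta, b):
        """Rasch probability of endorsing an item of difficulty b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def next_item(theta, bank, used):
        """Pick the unused item with maximum information I = p * (1 - p)."""
        info = [(p_correct(theta, b) * (1.0 - p_correct(theta, b)), i)
                for i, b in enumerate(bank) if i not in used]
        return max(info)[1]

    def update_theta(theta, responses, bank, lr=0.5, steps=25):
        """Crude MLE of ability by gradient ascent on the Rasch likelihood."""
        for _ in range(steps):
            grad = sum(r - p_correct(theta, bank[i]) for i, r in responses)
            theta += lr * grad / max(len(responses), 1)
        return theta

    bank = np.linspace(-2, 2, 70)  # 70 item difficulties, invented
    theta, used, responses = 0.0, set(), []
    for _ in range(10):            # administer 10 items instead of all 70
        i = next_item(theta, bank, used)
        used.add(i)
        r = int(np.random.rand() < p_correct(0.8, bank[i]))  # simulee at 0.8
        responses.append((i, r))
        theta = update_theta(theta, responses, bank)
    print(f"ability estimate after {len(used)} items: {theta:.2f}")
    ```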

  19. The Personal Health Survey

    ERIC Educational Resources Information Center

    Thorne, Frederick C.

    1978-01-01

    The Personal Health Survey (PHS) is a 200-item inventory designed to sample symptomatology as subjective experiences from the 12 principal domains of organ system and psychophysiological functioning. This study investigates the factorial validity of the empirically constructed scales. (Author)

  20. Aggregation, Validation, and Generalization of Qualitative Data - Methodological and Practical Research Strategies Illustrated by the Research Process of an empirically Based Typology.

    PubMed

    Weis, Daniel; Willems, Helmut

    2016-12-12

    The article deals with the question of how aggregated data which allow for generalizable insights can be generated from single-case based qualitative investigations. Thereby, two central challenges of qualitative social research are outlined: First, researchers must ensure that the single-case data can be aggregated and condensed so that new collective structures can be detected. Second, they must apply methods and practices to allow for the generalization of the results beyond the specific study. In the following, we demonstrate how and under what conditions these challenges can be addressed in research practice. To this end, the research process of the construction of an empirically based typology is described. A qualitative study, conducted within the framework of the Luxembourg Youth Report, is used to illustrate this process. Specifically, strategies are presented which increase the likelihood of generalizability or transferability of the results, while also highlighting their limitations.

  1. Toward regional- to continental-scale estimates of vegetation canopy height: An empirical approach based on data from the Shuttle Radar Topography Mission

    NASA Astrophysics Data System (ADS)

    Walker, Wayne S.

    This dissertation investigates the feasibility of exploiting interferometric synthetic aperture radar (InSAR) data acquired during the 2000 Shuttle Radar Topography Mission (SRTM) for the purpose of obtaining regional- to continental-scale estimates of vegetation canopy height. The specific objectives were to (1) assess the quality of SRTM C- and X-band data in the context of canopy height retrieval with an emphasis on vertical accuracy and horizontal resolution, (2) determine the extent to which SRTM C-band data could be used to develop empirical models for canopy height prediction, and (3) develop a robust SRTM-based approach for generating a year-2000 baseline map of canopy height for the conterminous U.S. The assessment of SRTM data quality revealed the presence of a vegetation signal sufficient to support canopy height retrieval. In the vertical dimension, signal quality was found to be most affected by error attributed to residual phase noise, and a novel strategy for error mitigation was developed. In the horizontal dimension, the resolution of the SRTM C- and X-band data was estimated at approximately 45 meters. Pilot studies conducted in Georgia and California demonstrated that empirical estimates of canopy height could be obtained from the SRTM C-band vegetation signal in conjunction with the National Elevation Dataset assuming the availability of sufficient field reference data and an appropriate level of error mitigation. The studies also revealed the importance of stand-level characteristics, including stand size and shape in the context of phase noise reduction and stand structure where regression model development is concerned. Supported by an unprecedented confluence of national geospatial data layers as well as an extensive national reference data network, a proof-of-concept study was designed to evaluate a novel, empirical approach for broad-scale SRTM-based canopy height mapping. The study produced the first-ever InSAR-based map of canopy height
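
    The core empirical move can be sketched in a few lines: difference the SRTM phase-center surface against a bare-earth DEM, then regress field-measured canopy heights on the resulting vegetation signal. All numbers below are invented illustrations, not the dissertation's data, and the linear form is only one plausible model choice:

    ```python
    import numpy as np

    srtm = np.array([312.0, 325.5, 298.2, 340.1, 318.7])  # SRTM heights (m), example
    ned = np.array([301.5, 310.0, 291.0, 322.8, 305.2])   # bare-earth DEM (m), example
    field = np.array([18.0, 26.5, 12.1, 29.4, 22.8])      # field canopy heights (m)

    signal = srtm - ned  # InSAR vegetation signal: phase center above ground
    A = np.vstack([signal, np.ones_like(signal)]).T
    slope, intercept = np.linalg.lstsq(A, field, rcond=None)[0]
    print(f"canopy_height ~ {slope:.2f} * (SRTM - NED) + {intercept:.2f}")
    ```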

  2. In Vitro Comparison of Combination- and Mono-therapy for the Empiric and Optimal Coverage of Bacterial Keratitis Based on Incidence of Infection

    PubMed Central

    Kowalski, Regis P.; Kowalski, Tyler A.; Shanks, Robert M.Q.; Romanowski, Eric G.; Karenchak, Lisa M.; Mah, Francis S.

    2012-01-01

    Purpose: Cefazolin/tobramycin, cefuroxime/gentamicin, and moxifloxacin were compared using bacterial keratitis isolates to determine whether empiric therapy constituted optimal anti-bacterial treatment. Methods: Based on percent incidence of corneal infection, 27 Staphylococcus aureus, 16 Pseudomonas aeruginosa, 10 Serratia marcescens, 4 Moraxella lacunata, 3 Haemophilus influenzae, 9 coagulase-negative Staphylococci, 7 Streptococcus viridans, 6 Streptococcus pneumoniae, 7 assorted Gram-positive isolates, and 11 assorted Gram-negative isolates were tested for MICs to cefazolin, tobramycin, cefuroxime, gentamicin, and moxifloxacin using E-tests to determine susceptibility and potency. Results: The in vitro coverage (susceptible to at least one antibiotic) of cefuroxime/gentamicin (97%) was statistically equal to cefazolin/tobramycin (93%) and moxifloxacin (92%) (p=0.29). Double coverage (susceptible to both antibiotics) was equivalent (p=0.77) for cefuroxime/gentamicin (42%) and cefazolin/tobramycin (40%). The susceptibilities of individual coverage were moxifloxacin (92%), gentamicin (89%), tobramycin (74%), cefazolin (58%), and cefuroxime (52%). Methicillin-resistant Staphylococcus aureus was best covered by gentamicin, 100% (9 of 9). Tobramycin was more potent (p=0.00001) than gentamicin for Pseudomonas aeruginosa, while cefazolin was more potent (p=0.0004) than cefuroxime for Staphylococcus aureus. Conclusions: Although there appears to be no in vitro empiric coverage advantage among cefazolin/tobramycin, cefuroxime/gentamicin, and moxifloxacin monotherapy, potency differences may occur, and optimal treatment can best be determined with laboratory studies. PMID:23132444

  3. An Empirical Determination of the Intergalactic Background Light Using Near-Infrared Deep Galaxy Survey Data Out to 5 Micrometers and the Gamma-Ray Opacity of the Universe

    NASA Technical Reports Server (NTRS)

    Scully, Sean T.; Malkan, Matthew A.; Stecker, Floyd W.

    2014-01-01

    We extend our previous model-independent determination of the intergalactic background light, based purely on galaxy survey data, out to a wavelength of 5 micrometers. Our approach enables us to constrain the range of photon densities, based on the uncertainties from observationally determined luminosity densities and colors. We further determine a 68% confidence upper and lower limit on the opacity of the universe to gamma-rays up to energies of 1.6/(1 + z) teraelectron volts. A comparison of our lower limit redshift-dependent opacity curves to the opacity limits derived from the results of both ground-based air Cerenkov telescope and Fermi-LAT observations of PKS 1424+240 allows us to place a new upper limit on the redshift of this source, independent of IBL modeling.
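
    For context, the opacity enters through the standard attenuation relation, and pair production on an IBL photon is kinematically allowed only above a threshold energy; both relations are standard and are given here only for orientation:

    ```latex
    % Attenuation of an intrinsic source spectrum by the IBL:
    F_{\mathrm{obs}}(E, z) = F_{\mathrm{int}}(E)\, e^{-\tau_{\gamma\gamma}(E, z)}
    % Pair production \gamma\gamma \to e^+ e^- on an IBL photon of energy
    % \epsilon at collision angle \theta requires
    \epsilon \;\ge\; \epsilon_{\mathrm{th}} = \frac{2\, m_e^2 c^4}{E\,(1 - \cos\theta)}
    ```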

  4. An empirical determination of the intergalactic background light using near-infrared deep galaxy survey data out to 5 μm and the gamma-ray opacity of the universe

    SciTech Connect

    Scully, Sean T.; Malkan, Matthew A.; Stecker, Floyd W.

    2014-04-01

    We extend our previous model-independent determination of the intergalactic background light, based purely on galaxy survey data, out to a wavelength of 5 μm. Our approach enables us to constrain the range of photon densities, based on the uncertainties from observationally determined luminosity densities and colors. We further determine a 68% confidence upper and lower limit on the opacity of the universe to γ-rays up to energies of 1.6/(1 + z) TeV. A comparison of our lower limit redshift-dependent opacity curves to the opacity limits derived from the results of both ground-based air Cerenkov telescope and Fermi-LAT observations of PKS 1424+240 allows us to place a new upper limit on the redshift of this source, independent of IBL modeling.

  5. Semivolatile Organic Compounds in Homes: Strategies for Efficient and Systematic Exposure Measurement Based on Empirical and Theoretical Factors

    PubMed Central

    2014-01-01

    Residential exposure can dominate total exposure for commercial chemicals of health concern; however, despite the importance of consumer exposures, methods for estimating household exposures remain limited. We collected house dust and indoor air samples in 49 California homes and analyzed for 76 semivolatile organic compounds (SVOCs)—phthalates, polybrominated diphenyl ethers (PBDEs), polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and pesticides. Sixty chemicals were detected in either dust or air and here we report 58 SVOCs detected in dust for the first time. In dust, phthalates (bis(2-ethylhexyl) phthalate, benzyl butyl phthalate, di-n-butyl phthalate) and flame retardants (PBDE 99, PBDE 47) were detected at the highest concentrations relative to other chemicals at the 95th percentile, while phthalates were highest at the median. Because SVOCs are found in both gas and condensed phases and redistribute from their original source over time, partitioning models can clarify their fate indoors. We use empirical data to validate air-dust partitioning models and use these results, combined with experience in SVOC exposure assessment, to recommend residential exposure measurement strategies. We can predict dust concentrations reasonably well from measured air concentrations (R² = 0.80). Partitioning models and knowledge of chemical K_oa elucidate exposure pathways and suggest priorities for chemical regulation. These findings also inform study design by allowing researchers to select sampling approaches optimized for their chemicals of interest and study goals. While surface wipes are commonly used in epidemiology studies because of ease of implementation, passive air sampling may be more standardized between homes and also relatively simple to deploy. Validation of passive air sampling methods for SVOCs is a priority. PMID:25488487
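
    The partitioning logic referred to above is commonly written as a single equilibrium relation (the standard form from the indoor-chemistry literature, e.g. Weschler and Nazaroff, not an equation quoted from this paper):

    ```latex
    % Equilibrium partitioning of an SVOC between settled dust and indoor air,
    % driven by the octanol-air partition coefficient K_oa:
    \frac{C_{\mathrm{dust}}}{C_{\mathrm{gas}}} \;\approx\; \frac{f_{\mathrm{om}}\, K_{oa}}{\rho_{\mathrm{dust}}}
    % f_om: organic-matter fraction of the dust; rho_dust: dust density.
    % Chemicals with higher K_oa partition more strongly into dust.
    ```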

  6. Empirical chemosensitivity testing in a spheroid model of ovarian cancer using a microfluidics-based multiplex platform

    PubMed Central

    Das, Tamal; Meunier, Liliane; Barbe, Laurent; Provencher, Diane; Guenat, Olivier; Gervais, Thomas; Mes-Masson, Anne-Marie

    2013-01-01

    The use of biomarkers to infer drug response in patients is being actively pursued, yet significant challenges with this approach, including the complicated interconnection of pathways, have limited its application. Direct empirical testing of tumor sensitivity would arguably provide a more reliable predictive value, although it has garnered little attention largely due to the technical difficulties associated with this approach. We hypothesize that the application of recently developed microtechnologies, coupled to more complex 3-dimensional cell cultures, could provide a model to address some of these issues. As a proof of concept, we developed a microfluidic device where spheroids of the serous epithelial ovarian cancer cell line TOV112D are entrapped and assayed for their chemoresponse to carboplatin and paclitaxel, two therapeutic agents routinely used for the treatment of ovarian cancer. In order to index the chemoresponse, we analyzed the spatiotemporal evolution of the mortality fraction, as judged by vital dyes and confocal microscopy, within spheroids subjected to different drug concentrations and treatment durations inside the microfluidic device. To reflect microenvironment effects, we tested the effect of exogenous extracellular matrix and serum supplementation during spheroid formation on their chemotherapeutic response. Spheroids displayed augmented chemoresistance in comparison to monolayer culturing. This resistance was further increased by the simultaneous presence of both extracellular matrix and high serum concentration during spheroid formation. Following exposure to chemotherapeutics, cell death profiles were not uniform throughout the spheroid. The highest cell death fraction was found at the center of the spheroid and the lowest at the periphery. Collectively, the results demonstrate the validity of the approach, and provide the basis for further investigation of chemotherapeutic responses in ovarian cancer using microfluidics technology. In

  7. Semivolatile organic compounds in homes: strategies for efficient and systematic exposure measurement based on empirical and theoretical factors.

    PubMed

    Dodson, Robin E; Camann, David E; Morello-Frosch, Rachel; Brody, Julia G; Rudel, Ruthann A

    2015-01-06

    Residential exposure can dominate total exposure for commercial chemicals of health concern; however, despite the importance of consumer exposures, methods for estimating household exposures remain limited. We collected house dust and indoor air samples in 49 California homes and analyzed for 76 semivolatile organic compounds (SVOCs): phthalates, polybrominated diphenyl ethers (PBDEs), polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and pesticides. Sixty chemicals were detected in either dust or air and here we report 58 SVOCs detected in dust for the first time. In dust, phthalates (bis(2-ethylhexyl) phthalate, benzyl butyl phthalate, di-n-butyl phthalate) and flame retardants (PBDE 99, PBDE 47) were detected at the highest concentrations relative to other chemicals at the 95th percentile, while phthalates were highest at the median. Because SVOCs are found in both gas and condensed phases and redistribute from their original source over time, partitioning models can clarify their fate indoors. We use empirical data to validate air-dust partitioning models and use these results, combined with experience in SVOC exposure assessment, to recommend residential exposure measurement strategies. We can predict dust concentrations reasonably well from measured air concentrations (R² = 0.80). Partitioning models and knowledge of chemical K_oa elucidate exposure pathways and suggest priorities for chemical regulation. These findings also inform study design by allowing researchers to select sampling approaches optimized for their chemicals of interest and study goals. While surface wipes are commonly used in epidemiology studies because of ease of implementation, passive air sampling may be more standardized between homes and also relatively simple to deploy. Validation of passive air sampling methods for SVOCs is a priority.

  8. Estimates for ELF effects: noise-based thresholds and the number of experimental conditions required for empirical searches.

    PubMed

    Weaver, J C; Astumian, R D

    1992-01-01

    Interactions between physical fields and biological systems present difficult conceptual problems. Complete biological systems, even isolated cells, are exceedingly complex. This argues against the pursuit of theoretical models, with the possible consequence that only experimental studies should be considered. In contrast, electromagnetic fields are well understood. Further, some subsystems of cells (viz., cell membranes) can be reasonably represented by physical models. This argues for the pursuit of theoretical models which quantitatively describe interactions of electromagnetic fields with that subsystem. Here we consider the hypothesis that electric fields, not magnetic fields, are the source of interactions. From this it follows that the cell membrane is a relevant subsystem, as the membrane is much more resistive than the intra- or extracellular regions. A general class of interactions is considered: electroconformational changes associated with the membrane. Expected results of such an approach include the dependence of the interaction on key parameters (e.g., cell size, field magnitude, frequency, and exposure time), constraints on threshold exposure conditions, and insight into how experiments might be designed. Further, because it is well established that strong and moderate electric fields interact significantly with cells, estimates of the extrapolated interaction for weaker fields can be sought. By employing signal-to-noise (S/N) ratio criteria, theoretical models can also be used to estimate threshold magnitudes. These estimates are particularly relevant to in vitro conditions, for which most biologically generated background fields are absent. Finally, we argue that if theoretical model predictions are unavailable to guide the selection of experimental conditions, an overwhelmingly large number of different conditions will be needed to find, establish, and characterize bioelectromagnetic effects in an empirical search. This is contrasted with well

  9. How Good Is Crude MDL for Solving the Bias-Variance Dilemma? An Empirical Investigation Based on Bayesian Networks

    PubMed Central

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way so as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike’s Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which not necessarily need be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we also should not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size. PMID:24671204
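
    As a reference point, crude (two-part) MDL is usually operationalized as the following score, whose empirical behavior the paper examines:

    ```latex
    % Crude two-part MDL of a model M with k free parameters fitted by
    % maximum likelihood \hat{\theta} to n data points:
    \mathrm{MDL}(M) = -\log_2 P(D \mid \hat{\theta}, M) + \frac{k}{2} \log_2 n
    % The first term measures fit (bias); the second penalizes model
    % complexity (variance). The selected model minimizes the sum.
    ```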

  10. How good is crude MDL for solving the bias-variance dilemma? An empirical investigation based on Bayesian networks.

    PubMed

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way so as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which not necessarily need be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we also should not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size.

  11. Complex systems approach to scientific publication and peer-review system: development of an agent-based model calibrated with empirical journal data.

    PubMed

    Kovanis, Michail; Porcher, Raphaël; Ravaud, Philippe; Trinquart, Ludovic

    Scientific peer-review and publication systems incur a huge burden in terms of costs and time. Innovative alternatives have been proposed to improve the systems, but assessing their impact in experimental studies is not feasible at a systemic level. We developed an agent-based model by adopting a unified view of peer review and publication systems and calibrating it with empirical journal data in the biomedical and life sciences. We modeled researchers, research manuscripts and scientific journals as agents. Researchers were characterized by their scientific level and resources, manuscripts by their scientific value, and journals by their reputation and acceptance or rejection thresholds. These state variables were used in submodels for various processes such as production of articles, submissions to target journals, in-house and external peer review, and resubmissions. We collected data for a sample of biomedical and life sciences journals regarding acceptance rates, resubmission patterns and total number of published articles. We adjusted submodel parameters so that the agent-based model outputs fit these empirical data. We simulated 105 journals, 25,000 researchers and 410,000 manuscripts over 10 years. A mean of 33,600 articles were published per year; 19 % of submitted manuscripts remained unpublished. The mean acceptance rate was 21 % after external peer review and rejection rate 32 % after in-house review; 15 % publications resulted from the first submission, 47 % the second submission and 20 % the third submission. All decisions in the model were mainly driven by the scientific value, whereas journal targeting and persistence in resubmission defined whether a manuscript would be published or abandoned after one or many rejections. This agent-based model may help in better understanding the determinants of the scientific publication and peer-review systems. It may also help in assessing and identifying the most promising alternative systems of peer
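
    A toy sketch of the submission-resubmission core of such a model follows. Journal thresholds, review noise, and the targeting rule are invented placeholders, whereas the published model calibrates these submodels against empirical journal data:

    ```python
    import random

    random.seed(1)
    # Each journal is reduced to an acceptance threshold; higher = more selective.
    JOURNALS = sorted((random.uniform(0.2, 0.9) for _ in range(20)), reverse=True)

    def submit(value, max_tries=5):
        """A manuscript works down the journal list until accepted or abandoned."""
        tries = 0
        for threshold in JOURNALS:
            if threshold > value + 0.2:
                continue  # authors skip journals far above their manuscript's level
            tries += 1
            review = value + random.gauss(0, 0.1)  # noisy peer-review verdict
            if review >= threshold:
                return threshold, tries  # accepted and published here
            if tries >= max_tries:
                break
        return None, tries  # abandoned unpublished

    outcomes = [submit(random.random()) for _ in range(10000)]
    unpublished = sum(1 for journal, _ in outcomes if journal is None)
    print(f"unpublished fraction: {unpublished / len(outcomes):.0%}")
    ```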

  12. Simulation of Strong Ground Motion Based on Conventional Empirical Green's Functions in the Michoacán State, Mexico

    NASA Astrophysics Data System (ADS)

    Vazquez Rosas, R.; Aguirre Gonzalez, J. J.; Mijares Arellano, H. H.

    2012-12-01

    In the present work, we study the state of Michoacán, one of the most important seismogenic zones in Mexico. Three kinds of sources exist in the state, producing tectonic earthquakes, volcanic earthquakes, and events due to local faults in the region. For this reason, the study of source parameters in the Michoacán state is of vital importance. In this work in particular we applied the simulation of strong ground motions by the conventional empirical Green's function method proposed by Irikura (1986). We installed a temporary network consisting of 6 accelerograph stations across the state, at the following locations: Faro de Brucerías, Aguililla, Apatzingán, Pátzcuaro, Morelia, and Maravatío. The stations form a line that is perpendicular to the coastline and has a total length of 366 km, while the distance between neighboring stations varies from 60 to 80 km. Among all the seismic events recorded at this temporary network, we selected 2 events originating along the coastline of Michoacán (May 2007), with moment magnitudes of 4.3 and 5.1 Mw. In order to calibrate the model, the earthquake of May 31, 2007 (M 5.1) was simulated using the aftershock of May 27 of that year (M 4.3), with satisfactory results, following the same method and considering the ω² spectral model with constant stress drop. Later, we calculated six scenarios for a postulated earthquake of M 7.4. From the six scenarios, the largest peak ground accelerations for each station were 83 cm/s² at Faro de Brucerías, 15.4 cm/s² at Apatzingán, 23 cm/s² at Pátzcuaro, 3.7 cm/s² at Morelia, and 3.0 cm/s² at Maravatío. One limitation of this study is that we used relatively small-magnitude earthquakes. This was a consequence of the relatively short operation period of the temporary network, which had to be limited to 3 months. To improve these simulations it is necessary to have more information about rupture processes of the recorded earthquakes. And likewise, information of future earthquakes in the
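
    For orientation, the conventional EGF method scales a small recorded event up to a target event through a single integer fixed by the seismic-moment ratio (the standard Irikura-type relation, stated generically rather than as the authors' exact implementation):

    ```latex
    % Scaling integer N from the moment ratio, assuming the omega-squared
    % spectral model with equal stress drop for element and target events:
    N = \left( \frac{M_0^{\mathrm{target}}}{M_0^{\mathrm{element}}} \right)^{1/3}
    % The target motion is then synthesized by summing N x N fault elements,
    % each rupturing N times, with delays from rupture propagation and
    % wave travel time to the site.
    ```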

  13. Empirical Research on Performance Improvement

    ERIC Educational Resources Information Center

    Marker, Anthony; Huglin, Linda; Johnsen, Liz

    2006-01-01

    In 2002, James Klein published a study based on a content analysis of research articles in "PIQ" from 1997 through 2000. That study was aimed at determining how much empirical research was being reported in HPT and what the focus of that research was. Klein found that only about one third of the articles published in "PIQ" represented empirical…

  14. Twitter Strategies for Web-Based Surveying: Descriptive Analysis From the International Concussion Study

    PubMed Central

    Düking, Peter; Mellalieu, Stephen D

    2016-01-01

    Background: Social media provides researchers with an efficient means to reach and engage with a large and diverse audience. Twitter allows for the virtual social interaction among a network of users that enables researchers to recruit and administer surveys using snowball sampling. Although using Twitter to administer surveys for research is not new, strategies to improve response rates are yet to be reported. Objective: To compare the potential and actual reach of 2 Twitter accounts that administered a Web-based concussion survey to rugby players and trainers using 2 distinct Twitter-targeting strategies. Furthermore, the study sought to determine the likelihood of receiving a retweet based on the time of the day and day of the week of posting. Methods: A survey based on previous concussion research was exported to the Web-based survey website Survey Monkey. The survey comprised 2 questionnaires, one for players, and one for those involved in the game (eg, coaches and athletic trainers). The Web-based survey was administered using 2 existing Twitter accounts, with each account executing a distinct targeting strategy. A list of potential Twitter accounts to target was drawn up, together with a list of predesigned tweets. The list of accounts to target was divided into ‘High-Profile’ and ‘Low-Profile’, based on each account’s position to attract publicity with a high social interaction potential. The potential reach (number of followers of the targeted account) and actual reach (number of retweets received by each post) between the 2 strategies were compared. The number of retweets received by each account was further analyzed to understand when the most likely time of day, and day of the week, a retweet would be received. Results: The number of retweets received by a Twitter account decreased by 72% when using the ‘high-profile strategy’ compared with the ‘low-profile strategy’ (incidence rate ratio (IRR): 0.28, 95% confidence interval (CI) 0

  15. A survey of individuals in US-based pharmaceutical industry HEOR departments: attitudes on policy topics.

    PubMed

    Neumann, Peter J; Saret, Cayla J

    2013-10-01

    We surveyed US-based leaders in health economics and outcomes research (HEOR) departments in drug and device companies to examine their views on the state of the field. We created a questionnaire that was emailed to 123 US-based senior HEOR professionals at 54 companies. Of the 123 recipients, 74 (60%) completed the survey. Most respondents (92%) expected their company's HEOR use to increase, and 80% reported that their organization's senior management viewed HEOR work as critical. Approximately 62% agreed that Academy of Managed Care Pharmacy (AMCP) dossiers are useful to US health plans, and 55% stated that Food and Drug Administration Modernization Act (FDAMA) Section 114 is useful. Approximately 49% believed the US government should use cost-effectiveness analysis in coverage and reimbursement decisions, but only 31% expected this to occur within 3 years. The findings suggest strong support for the function at senior management levels and optimism about the field.

  16. A Case for Increasing Empirical Attention to Head Start's Home-Based Program: An Exploration of Routine Collaborative Goal Setting

    ERIC Educational Resources Information Center

    Manz, Patricia H.; Lehtinen, Jaana; Bracaliello, Catherine

    2013-01-01

    Collaborative goal setting among home visitors and family members is a mandate for Head Start's home-based program. Yet, a dearth of research is available for advancing evidence-based practices for setting and monitoring home visiting goals or for understanding how family characteristics or program features are associated with them. With the…

  17. Relationship of Student Undergraduate Achievement and Personality Characteristics in a Total Web-Based Environment: An Empirical Study

    ERIC Educational Resources Information Center

    Schniederjans, Marc J.; Kim, Eyong B.

    2005-01-01

    Web-based education is a popular format for the delivery of college courses. Research has shown that it may not be the best form of education for all students. Today, many students (and student advisors) face a choice in course delivery format (i.e., Web-based or more traditional classroom courses). This research study examines the relationship…

  18. Canopy Water Content retrieval at different scales from empirical and physically-based remote sensing methods in the SMOS VAS cal/val site

    NASA Astrophysics Data System (ADS)

    Camacho de Coca, Fernando

    The SMOS (Soil Moisture and Ocean Salinity) space mission of the European Space Agency (ESA) aims to observe soil moisture over the continents and sea surface salinity over the oceans with enough resolution to be used in global climatic studies. Calibration and validation of SMOS products is an essential activity due to the exploratory nature of the mission. After launch, intense field activities will collect in situ information simultaneous with SMOS observations in order to improve the empirical aspects of retrieval algorithms and to validate the products generated from these observations. One of the cal/val sites selected over land is the Valencia Anchor Station site (with an extended area for SMOS of 125 x 125 km2), where soil moisture will be measured intensively in 2010. In order to validate the SMOS soil moisture algorithms, it is necessary to characterize the water content in the vegetation canopy layer. This paper describes the retrieval of canopy water content (CWC) at the Valencia Anchor Station site at different scales using high-resolution satellite sensors and low-resolution MODIS data. Two different approaches have been used: (i) an empirical approach for mapping CWC over the VAS site during field campaigns, and (ii) an artificial neural network trained with radiative transfer model simulations, to analyze the performance of this approach for mapping CWC. Two field campaigns were carried out to collect CWC at ground level in the study area over vineyards and irrigated crops. Measured CWC values ranged between 0.05 and 0.2 kg/m2 for vineyard crops and between 0.1 and 1.5 kg/m2 over irrigated crops, which corresponds to the full range of expected variations in the study area. An empirical relationship based on a multivariate ordinary least squares algorithm between the collected ground dataset and concomitant reflectance values coming from high resolution (HR) satellite sensors has been selected to produce CWC HR maps
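
    The empirical CWC mapping step above is a multivariate ordinary least squares fit between ground-measured CWC and concomitant reflectances; a minimal sketch follows, with band values and plot data invented rather than taken from the campaigns.

    import numpy as np

    # rows = ground plots; columns = HR sensor reflectances (assumed bands,
    # e.g. red, NIR, SWIR)
    reflectance = np.array([
        [0.08, 0.35, 0.20],
        [0.07, 0.40, 0.18],
        [0.10, 0.30, 0.25],
        [0.06, 0.45, 0.15],
    ])
    cwc = np.array([0.12, 0.60, 0.08, 1.10])  # kg/m2, within the reported 0.05-1.5 range

    X = np.column_stack([np.ones(len(cwc)), reflectance])  # add intercept column
    coef, *_ = np.linalg.lstsq(X, cwc, rcond=None)         # multivariate OLS fit
    cwc_pred = X @ coef   # in practice the fit is applied pixel-by-pixel to imagery
    print(coef, cwc_pred)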

  19. Natural language processing-based COTS software and related technologies survey.

    SciTech Connect

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  20. One-Year Results for the Kelly Air Force Base Compressed Work Week Survey

    DTIC Science & Technology

    1994-01-01

    …home and at work, resulting in social stability. [Subject terms: Compressed Work Week; Lifestyle; Attitude Survey; Air Force Base] …for 1-year. Few published studies have investigated the impact of CWS on the lifestyle or quality of life of the employee, particularly over extended… grouped into lifestyle subcategories (family, community, health, leisure, social, cultural, sleep, and finances) or job-related subcategories

  1. A Survey of Partition-Based Techniques for Copy-Move Forgery Detection

    PubMed Central

    Nathalie Diane, Wandji Nanda; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques. PMID:25152931

  2. A survey of partition-based techniques for copy-move forgery detection.

    PubMed

    Diane, Wandji Nanda Nathalie; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques.

  3. A new tool for spatiotemporal pattern decomposition based on empirical mode decomposition: A case study of monthly mean precipitation in Taihu Lake Basin, China

    NASA Astrophysics Data System (ADS)

    Chenhua, Shen; Yani, Yan

    2017-02-01

    We present a new tool for spatiotemporal pattern decomposition and use it to decompose the spatiotemporal patterns of monthly mean precipitation from January 1957 to May 2015 in Taihu Lake Basin, China. Our goal is to show that this new tool can mine more hidden information than empirical orthogonal function (EOF) analysis. First, the time series obtained by averaging over the study region is decomposed into a set of intrinsic mode functions (IMFs) and a residue by means of empirical mode decomposition (EMD). These IMFs are then taken as explanatory variables, with the precipitation time series at each station as the dependent variable, and a linear multivariate regression equation is derived whose estimated coefficients are interpreted physically as spatial coefficients: each coefficient is an orthogonal projection between an IMF and the precipitation series at that station. Spatial patterns are presented based on these spatial coefficients. The spatiotemporal patterns comprise temporal patterns and spatial patterns at various timescales: the temporal patterns are obtained by means of EMD, and from them the spatial patterns at the corresponding timescales are derived. The proposed tool has been applied to decompose the spatiotemporal pattern of monthly mean precipitation in Taihu Lake Basin, China. Since the spatial patterns are associated with intrinsic frequencies, new and distinct spatial patterns are detected and explained physically. Our analysis shows that this new tool is reliable and applicable for geophysical data in the presence of nonstationarity and long-range correlation, can handle nonstationary spatiotemporal series, and has the capacity to extract more hidden time-frequency information on spatiotemporal patterns.
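
    A minimal sketch of the pipeline just described, assuming the PyEMD package (pip install EMD-signal) for the EMD step; the regional-mean and station series are synthetic stand-ins.

    import numpy as np
    from PyEMD import EMD

    rng = np.random.default_rng(0)
    t = np.arange(700)  # e.g. months since January 1957
    regional_mean = (np.sin(2 * np.pi * t / 12)      # annual cycle
                     + 0.1 * t / 700                 # weak trend
                     + rng.normal(0, 0.2, t.size))   # noise

    # Decompose the regional-mean series into IMFs plus a residue.
    emd = EMD()
    emd.emd(regional_mean)
    imfs, residue = emd.get_imfs_and_residue()

    # Regress one station's series on the IMFs and residue; the fitted
    # coefficients play the role of that station's "spatial coefficients".
    station = regional_mean + rng.normal(0, 0.3, t.size)
    X = np.vstack([imfs, residue]).T
    coef, *_ = np.linalg.lstsq(X, station, rcond=None)
    print(coef)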

  4. Consultants’ Perceptions of School Counselors’ Ability to Implement an Empirically-Based Intervention for Adolescent Social Anxiety Disorder

    PubMed Central

    Warner, Carrie Masia; Brice, Chad; Esseling, Petra G.; Stewart, Catherine E.; Mufson, Laura; Herzig, Kathleen

    2013-01-01

    Social anxiety is highly prevalent but often goes untreated. Although school-based CBT programs are efficacious when delivered by specialized psychologists, it is unclear whether school counselors can implement these interventions effectively, which is essential for promoting sustainable school programs. We present an initial consultation strategy to support school counselors' implementation of group CBT for social anxiety and an evaluation of counselors' treatment fidelity. Counselors were highly adherent to the treatment, but competence varied depending on how it was measured. Counselors and consultants demonstrated good agreement on adherence, but relatively modest correspondence in competence ratings. We discuss future directions for school-based implementation efforts informed by these initial findings. PMID:23716144

  5. Empirical models of terrestrial trapped radiation.

    PubMed

    Panasyuk, M I

    1996-01-01

    A survey of empirical models of particles (electrons, protons and heavier ions) of the Earth's radiation belts developed to date is presented. Results of intercomparison of the different models, as well as comparison with experimental data, are reported. Aspects of further development of radiation condition modelling in near-Earth space, including the development of dynamic models, are discussed.

  6. A Review of Theoretical and Empirical Advancements

    ERIC Educational Resources Information Center

    Wang, Mo; Henkens, Kene; van Solinge, Hanna

    2011-01-01

    In this article, we review both theoretical and empirical advancements in retirement adjustment research. After reviewing and integrating current theories about retirement adjustment, we propose a resource-based dynamic perspective to apply to the understanding of retirement adjustment. We then review empirical findings that are associated with…

  7. Estimation of daily global solar radiation using wavelet regression, ANN, GEP and empirical models: A comparative study of selected temperature-based approaches

    NASA Astrophysics Data System (ADS)

    Sharifi, Sayed Saber; Rezaverdinejad, Vahid; Nourani, Vahid

    2016-11-01

    Although sunshine-based models generally perform better than temperature-based models for estimating solar radiation, the limited availability of sunshine duration records makes the development of temperature-based methods inevitable. This paper presents a comparative study between Artificial Neural Networks (ANNs), Gene Expression Programming (GEP), Wavelet Regression (WR) and 5 selected temperature-based empirical models for estimating daily global solar radiation. A new combination of inputs comprising four readily accessible parameters was employed: daily mean clearness index (KT), temperature range (ΔT), theoretical sunshine duration (N) and extraterrestrial radiation (Ra). Ten statistical indicators, combined into a Global Performance Indicator (GPI), were used to ascertain the suitability of the models. The performance of the selected models across the range of solar radiation values was depicted by quantile-quantile (Q-Q) plots; comparing these plots makes it evident that ANNs can cover a broader range of solar radiation values. The results indicate that the performance of the ANN model was clearly superior to the other models. The findings also demonstrate that the WR model performed well and gave highly accurate estimates of daily global solar radiation.
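
    For context, one widely used temperature-based empirical model of the kind compared here is the Hargreaves-Samani relation, which links daily global radiation to the temperature range and extraterrestrial radiation; whether it is among the 5 selected models is not stated, so this is purely illustrative.

    import numpy as np

    def hargreaves_samani(tmax_c, tmin_c, ra_mj, krs=0.16):
        """Estimate daily global solar radiation Rs (MJ m-2 day-1).

        Rs = krs * sqrt(Tmax - Tmin) * Ra, where krs is an empirical
        coefficient (about 0.16 inland, 0.19 coastal) and Ra is the
        extraterrestrial radiation for the day and latitude.
        """
        return krs * np.sqrt(tmax_c - tmin_c) * ra_mj

    # Hypothetical summer day: Tmax = 31 C, Tmin = 18 C, Ra = 38.1 MJ m-2 day-1
    print(hargreaves_samani(31.0, 18.0, 38.1))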

  8. PM3 semi-empirical IR spectra simulations for metal complexes of Schiff bases of sulfa drugs

    NASA Astrophysics Data System (ADS)

    Topacli, C.; Topacli, A.

    2003-06-01

    The molecular structures and infrared spectra of Co, Ni, Cu and Zn complexes of two Schiff base ligands, viz. N-(o-vanillinidene)sulfanilamide (oVSaH) and N-(o-vanillinidene)sulfamerazine (oVSmrzH), are studied in detail by the PM3 method. It is shown that the structures proposed for the compounds on the basis of microanalytical, magnetic and various spectral data are consistent with the IR spectra simulated by the PM3 method. Coordination effects on the ν(CN) and ν(C-O) modes of the Schiff base ligands are in close agreement with the observed results.

  9. A Parameter Identification Method for Helicopter Noise Source Identification and Physics-Based Semi-Empirical Modeling

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric, II; Schmitz, Fredric H.

    2010-01-01

    A new physics-based parameter identification method for rotor harmonic noise sources is developed using an acoustic inverse simulation technique. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. This new method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor Blade-Vortex Interaction (BVI) noise, allowing accurate estimates of BVI noise to be made for operating conditions based on a small number of measurements taken at different operating conditions.

  10. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to Part... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year... cars, pickup trucks, vans, minivans, and sport utility vehicles), measures seat belt use by all...

  11. 23 CFR Appendix C to Part 1240 - Certification (Calendar Year 1998 Survey Based on Survey Approved Under 23 U.S.C. 153)

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Pt. 1240, App. C Appendix C to Part... Certification-Calendar Year 1998 Seat Belt Use Survey State of Seat Belt Use Rate Reported for Calendar Year... cars, pickup trucks, vans, minivans, and sport utility vehicles), measures seat belt use by all...

  12. Dynamic Interaction: A Measurement Development and Empirical Evaluation of Knowledge Based Systems and Web 2.0 Decision Support Mashups

    ERIC Educational Resources Information Center

    Beemer, Brandon Alan

    2010-01-01

    The research presented in this dissertation focuses on the organizational and consumer need for knowledge based support in unstructured domains, by developing a measurement scale for dynamic interaction. Addressing this need is approached and evaluated from two different perspectives. The first approach is the development of Knowledge Based…

  13. Analyzing Interactions by an IIS-Map-Based Method in Face-to-Face Collaborative Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Yang, Kaicheng; Huang, Ronghuai

    2012-01-01

    This study proposes a new method named the IIS-map-based method for analyzing interactions in face-to-face collaborative learning settings. This analysis method is conducted in three steps: firstly, drawing an initial IIS-map according to collaborative tasks; secondly, coding and segmenting information flows into information items of IIS; thirdly,…

  14. Examining the Potential of Web-Based Multimedia to Support Complex Fine Motor Skill Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Papastergiou, Marina; Pollatou, Elisana; Theofylaktou, Ioannis; Karadimou, Konstantina

    2014-01-01

    Research on the utilization of the Web for complex fine motor skill learning that involves whole body movements is still scarce. The aim of this study was to evaluate the impact of the introduction of a multimedia web-based learning environment, which was targeted at a rhythmic gymnastics routine consisting of eight fine motor skills, into an…

  15. An Empirical Research of Chinese Learners' Acquisition of the English Article System--Based on Syntactic Misanalysis Account

    ERIC Educational Resources Information Center

    Jian, Shi

    2013-01-01

    In the field of applied linguistics, the English article is the acknowledged teaching and learning difficulty and receives lots of attention in second language acquisition (SLA). This paper, based on the Syntactic Misanalysis Account (SMA) advocated by Trenkic in which L2 articles are analyzed as adjectives by L2ers, proposes the English article…

  16. The Effects of the Use of Activity-Based Costing Software in the Learning Process: An Empirical Analysis

    ERIC Educational Resources Information Center

    Tan, Andrea; Ferreira, Aldónio

    2012-01-01

    This study investigates the influence of the use of accounting software in teaching activity-based costing (ABC) on the learning process. It draws upon the Theory of Planned Behaviour and uses the end-user computer satisfaction (EUCS) framework to examine students' satisfaction with the ABC software. The study examines students' satisfaction with…

  17. A Cross-Cultural Usability Study on the Internationalization of User Interfaces Based on an Empirical Five Factor Model

    ERIC Educational Resources Information Center

    Chakraborty, Joyram

    2009-01-01

    With the internationalization of e-commerce, it is no longer viable to design one user interface for all environments. Web-based applications and services can be accessed from all over the globe. To account for this globalization process, software developers need to understand that simply accounting for language translation of their websites for…

  18. Comparison of Expert-Based and Empirical Evaluation Methodologies in the Case of a CBL Environment: The ''Orestis'' Experience

    ERIC Educational Resources Information Center

    Karoulis, Athanasis; Demetriadis, Stavros; Pombortsis, Andreas

    2006-01-01

    This paper compares several interface evaluation methods applied in the case of a computer based learning (CBL) environment, during a longitudinal study performed in three European countries, Greece, Germany, and Holland, and within the framework of an EC funded Leonardo da Vinci program. The paper firstly considers the particularities of the CBL…

  19. An Empirical Comparison of an Expert Systems Approach and an IRT Approach to Computer-Based Adaptive Mastery Testing.

    ERIC Educational Resources Information Center

    Luk, HingKwan

    This study examined whether an expert system approach involving intelligent selection of items (EXSPRT-I) is as efficient as item response theory (IRT) based three-parameter adaptive mastery testing (AMT) when there are enough subjects to estimate the three IRT item parameters for all items in the test and when subjects in the item parameter…

  20. Historic Building Information Modelling - Adding intelligence to laser and image based surveys of European classical architecture

    NASA Astrophysics Data System (ADS)

    Murphy, Maurice; McGovern, Eugene; Pavia, Sara

    2013-02-01

    Historic Building Information Modelling (HBIM) is a novel prototype library of parametric objects, based on historic architectural data and a system of cross platform programmes for mapping parametric objects onto point cloud and image survey data. The HBIM process begins with remote collection of survey data using a terrestrial laser scanner combined with digital photo modelling. The next stage involves the design and construction of a parametric library of objects, which are based on the manuscripts ranging from Vitruvius to 18th century architectural pattern books. In building parametric objects, the problem of file format and exchange of data has been overcome within the BIM ArchiCAD software platform by using geometric descriptive language (GDL). The plotting of parametric objects onto the laser scan surveys as building components to create or form the entire building is the final stage in the reverse engineering process. The final HBIM product is the creation of full 3D models including detail behind the object's surface concerning its methods of construction and material make-up. The resultant HBIM can automatically create cut sections, details and schedules in addition to the orthographic projections and 3D models (wire frame or textured) for both the analysis and conservation of historic objects, structures and environments.

  1. Historic Building Information Modelling - Adding Intelligence to Laser and Image Based Surveys

    NASA Astrophysics Data System (ADS)

    Murphy, M.; McGovern, E.; Pavia, S.

    2011-09-01

    Historic Building Information Modelling (HBIM) is a novel prototype library of parametric objects based on historic data and a system of cross platform programmes for mapping parametric objects onto a point cloud and image survey data. The HBIM process begins with remote collection of survey data using a terrestrial laser scanner combined with digital photo modelling. The next stage involves the design and construction of a parametric library of objects, which are based on the manuscripts ranging from Vitruvius to 18th century architectural pattern books. In building parametric objects, the problem of file format and exchange of data has been overcome within the BIM ArchiCAD software platform by using geometric descriptive language (GDL). The plotting of parametric objects onto the laser scan surveys as building components to create or form the entire building is the final stage in the reverse engineering process. The final HBIM product is the creation of full 3D models including detail behind the object's surface concerning its methods of construction and material make-up. The resultant HBIM can automatically create cut sections, details and schedules in addition to the orthographic projections and 3D models (wire frame or textured).

  2. NUV Star Catalog from the Lunar-based Ultraviolet Telescope Survey: First Release

    NASA Astrophysics Data System (ADS)

    Meng, Xian-Min; Han, Xu-Hui; Wei, Jian-Yan; Wang, Jing; Cao, Li; Qiu, Yu-Lei; Wu, Chao; Deng, Jin-Song; Cai, Hong-Bo; Xin, Li-Ping

    2016-11-01

    We present a star catalog extracted from the Lunar-based Ultraviolet Telescope (LUT) survey program. LUT's observable sky area is a circular belt around the Moon's north pole, and the survey program covers a preferred area of about 2400 deg2 which includes a region of the Galactic plane. The data are processed with an automatic pipeline which copes with stray light contamination, artificial sources, cosmic rays, flat field calibration, photometry and so on. In the first release version, the catalog provides high confidence sources which have been cross-identified with the Tycho-2 catalog. All the sources have a signal-to-noise ratio larger than 5, and the corresponding magnitude limit is typically 14.4 mag, but can be as deep as ˜16 mag if stray light contamination is at the lowest level. A total of 86 467 stars are recorded in the catalog. The full catalog in electronic form is available online.

  3. National surveys of radiofrequency field strengths from radio base stations in Africa.

    PubMed

    Joyner, Ken H; Van Wyk, Marthinus J; Rowley, Jack T

    2014-01-01

    The authors analysed almost 260 000 measurement points from surveys of radiofrequency (RF) field strengths near radio base stations in seven African countries over two time frames, from 2001 to 2003 and 2006 to 2012. The results of the national surveys were compared, chronological trends investigated and potential exposures compared by technology and with frequency modulation (FM) radio. The key findings from these data are that, irrespective of country, year and mobile technology, RF fields at ground level were only a small fraction of the international human RF exposure recommendations. Importantly, there has been no significant increase in typical measured levels since the introduction of 3G services. The mean levels in these African countries are similar to the reported levels for countries of Asia, Europe and North America using similar mobile technologies. The median level for the FM services in South Africa was comparable to the individual mobile services but generally lower than their combined levels.
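
    A sketch of how such measurements are expressed as a fraction of the recommendations, assuming the ICNIRP (1998) general-public power-density limits (f/200 W/m2 for 400-2000 MHz, 10 W/m2 above 2 GHz); the measured values below are invented.

    def icnirp_public_limit_wm2(freq_mhz):
        """ICNIRP 1998 general-public power-density limit (W/m2) by frequency."""
        if 400 <= freq_mhz <= 2000:
            return freq_mhz / 200.0
        if freq_mhz > 2000:
            return 10.0
        raise ValueError("band not covered in this sketch")

    # service -> (frequency in MHz, measured power density in W/m2), assumed values
    measurements = {
        "GSM900":  (900,  2.4e-4),
        "GSM1800": (1800, 1.1e-4),
        "UMTS":    (2100, 0.6e-4),
    }

    # Multi-band exposures add as a sum of per-band quotients.
    quotient = sum(s / icnirp_public_limit_wm2(f) for f, s in measurements.values())
    print(f"total exposure quotient: {quotient:.2e}")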

  4. A national survey of school-based, adolescent suicide prevention programs.

    PubMed

    Garland, A; Shaffer, D; Whittle, B

    1989-11-01

    A national survey of suicide prevention programs was conducted to determine the number, distribution and content of school-based, curriculum programs for adolescents. One hundred fifteen programs were identified. The total number of students and schools targeted for prevention efforts more than doubled during the academic years 1984/1985 to 1986/1987. Content of the programs was similar, with nearly all including information on suicide warning signs and other facts, as well as on accessing community mental health resources. Most included a separate component for school staff and parents. Ninety-five percent subscribed to the view that suicide is most commonly a response to extreme stress or pressure and could happen to anyone. Possible negative implications of this "stress model" of suicide were discussed. While this survey is an important first step in providing a description of these programs, more evaluative research is needed to determine what effect, if any, these programs have on suicidal behavior.

  5. National surveys of radiofrequency field strengths from radio base stations in Africa

    PubMed Central

    Joyner, Ken H.; Van Wyk, Marthinus J.; Rowley, Jack T.

    2014-01-01

    The authors analysed almost 260 000 measurement points from surveys of radiofrequency (RF) field strengths near radio base stations in seven African countries over two time frames, from 2001 to 2003 and 2006 to 2012. The results of the national surveys were compared, chronological trends investigated and potential exposures compared by technology and with frequency modulation (FM) radio. The key findings from these data are that, irrespective of country, year and mobile technology, RF fields at ground level were only a small fraction of the international human RF exposure recommendations. Importantly, there has been no significant increase in typical measured levels since the introduction of 3G services. The mean levels in these African countries are similar to the reported levels for countries of Asia, Europe and North America using similar mobile technologies. The median level for the FM services in South Africa was comparable to the individual mobile services but generally lower than their combined levels. PMID:24044904

  6. The Cepheid Period-Luminosity Relation in M31 Based on the PHAT Survey

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel; Sarajedini, A.; Stanek, K. Z.; Dalcanton, J.; Williams, B. F.; Dolphin, A. E.; PHAT Team

    2013-01-01

    Using Hubble Space Telescope Advanced Camera for Surveys (HST/ACS) observations from the Panchromatic Hubble Andromeda Treasury (PHAT), we present a new optical period-luminosity relation for Cepheid variables in M31. Photometry of Cepheids from the DIRECT ground-based survey is fit to light curve templates, which are used to derive the phase of the variation. The phases, periods, and light curve templates are then applied in order to de-phase the high precision HST photometry. This results in a period-luminosity relation with a significantly smaller dispersion than is obtained with random-phase observations. We then apply this relation to derive a new distance modulus to M31. Support for this work was provided by NASA through grant number HST-GO-12055 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555.
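
    Illustrative only (not the PHAT pipeline): a linear period-luminosity fit and a distance-modulus estimate from synthetic de-phased mean magnitudes, taking one published V-band calibration (Benedict et al. 2007) as the assumed absolute relation.

    import numpy as np

    log_p = np.array([0.6, 0.8, 1.0, 1.2, 1.4])       # log10(period / days), synthetic
    m_app = np.array([21.9, 21.3, 20.7, 20.2, 19.6])  # de-phased mean apparent mags, synthetic

    # Fit the apparent PL relation m = a + b * (log10 P - 1).
    b, a = np.polyfit(log_p - 1.0, m_app, 1)

    # Assumed absolute V-band PL calibration: M = -4.05 - 2.43 * (log10 P - 1).
    M_abs = -4.05 - 2.43 * (log_p - 1.0)

    mu = np.mean(m_app - M_abs)   # distance modulus estimate
    print(round(b, 2), round(a, 2), round(mu, 2))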

  7. Teaching Knowledge with Curriculum-Based Technology: Development of a Survey Instrument for Pre-Service Teachers

    ERIC Educational Resources Information Center

    Ozden, Sule Yilmaz; Mouza, Chrystalla; Shinas, Valerie Harlow

    2016-01-01

    The purpose of this quantitative study was to develop and test an accessible and interpretable survey instrument intended to measure pre-service teachers' Technological Pedagogical Content Knowledge (TPACK). The "Survey of Teaching Knowledge with Curriculum-Based Technology" was developed and administered to 124 pre-service teachers…

  8. The National Nursing Assistant Survey: Improving the Evidence Base for Policy Initiatives to Strengthen the Certified Nursing Assistant Workforce

    ERIC Educational Resources Information Center

    Squillace, Marie R.; Remsburg, Robin E.; Harris-Kojetin, Lauren D.; Bercovitz, Anita; Rosenoff, Emily; Han, Beth

    2009-01-01

    Purpose: This study introduces the first National Nursing Assistant Survey (NNAS), a major advance in the data available about certified nursing assistants (CNAs) and a rich resource for evidence-based policy, practice, and applied research initiatives. We highlight potential uses of this new survey using select population estimates as examples of…

  9. Development and validation of a web-based questionnaire for surveying the health and working conditions of high-performance marine craft populations

    PubMed Central

    de Alwis, Manudul Pahansen; Lo Martire, Riccardo; Äng, Björn O; Garme, Karl

    2016-01-01

    Background High-performance marine craft crews are susceptible to various adverse health conditions caused by multiple interactive factors. However, there are limited epidemiological data available for assessment of working conditions at sea. Although questionnaire surveys are widely used for identifying exposures, outcomes and associated risks with high accuracy levels, until now, no validated epidemiological tool exists for surveying occupational health and performance in these populations. Aim To develop and validate a web-based questionnaire for epidemiological assessment of occupational and individual risk exposure pertinent to the musculoskeletal health conditions and performance in high-performance marine craft populations. Method A questionnaire for investigating the association between work-related exposure, performance and health was initially developed by a consensus panel under four subdomains, viz. demography, lifestyle, work exposure and health and systematically validated by expert raters for content relevance and simplicity in three consecutive stages, each iteratively followed by a consensus panel revision. The item content validity index (I-CVI) was determined as the proportion of experts giving a rating of 3 or 4. The scale content validity index (S-CVI/Ave) was computed by averaging the I-CVIs for the assessment of the questionnaire as a tool. Finally, the questionnaire was pilot tested. Results The S-CVI/Ave increased from 0.89 to 0.96 for relevance and from 0.76 to 0.94 for simplicity, resulting in 36 items in the final questionnaire. The pilot test confirmed the feasibility of the questionnaire. Conclusions The present study shows that the web-based questionnaire fulfils previously published validity acceptance criteria and is therefore considered valid and feasible for the empirical surveying of epidemiological aspects among high-performance marine craft crews and similar populations. PMID:27324717
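
    The two validity indices are defined explicitly above and translate directly into code: I-CVI is the proportion of expert raters giving an item a 3 or 4, and S-CVI/Ave is the mean of the I-CVIs across items. The ratings below are invented for illustration.

    import numpy as np

    # rows = questionnaire items, columns = expert raters (1-4 scale), assumed values
    ratings = np.array([
        [4, 3, 4, 4],
        [3, 3, 2, 4],
        [4, 4, 4, 3],
    ])

    i_cvi = (ratings >= 3).mean(axis=1)  # per-item content validity index
    s_cvi_ave = i_cvi.mean()             # scale-level index (averaging method)
    print(i_cvi, round(s_cvi_ave, 2))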

  10. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they yield self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and the Systems Modeling Language (SysML). This approach uses SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to the organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement using the systems engineering spiral method to expose finer levels of detail as necessary. For example, the bottom-up, Use Case

  11. Social Media and Evidence-Based Maternity Care: A Cross-Sectional Survey Study

    PubMed Central

    Dekker, Rebecca L.; King, Sarah; Lester, Kara

    2016-01-01

    The purpose of this study was to describe how people use social media to find and disseminate information about evidence-based maternity care. We used a cross-sectional Internet-based survey design in which 1,661 participants were recruited from childbirth-related blogs. Participants answered questions about how they find, use, and share evidence-based maternity information using social media. Overall, the women in this study were highly engaged in using social media to find and share maternity information. Most respondents were very interested in reading evidence-based maternity care articles online, and most intended to use the information they found, even though a substantial percentage had no intention of discussing it with their childbirth educators or physicians. PMID:27445448

  12. Social Media and Evidence-Based Maternity Care: A Cross-Sectional Survey Study.

    PubMed

    Dekker, Rebecca L; King, Sarah; Lester, Kara

    2016-01-01

    The purpose of this study was to describe how people use social media to find and disseminate information about evidence-based maternity care. We used a cross-sectional Internet-based survey design in which 1,661 participants were recruited from childbirth-related blogs. Participants answered questions about how they find, use, and share evidence-based maternity information using social media. Overall, the women in this study were highly engaged in using social media to find and share maternity information. Most respondents were very interested in reading evidence-based maternity care articles online, and most intended to use the information they found, even though a substantial percentage had no intention of discussing it with their childbirth educators or physicians.

  13. Survey of RF exposure levels from mobile telephone base stations in Australia.

    PubMed

    Henderson, S I; Bangay, M J

    2006-01-01

    This paper reports the results of an exposure level survey of radiofrequency electromagnetic energy originating from mobile telephone base station antennas. Measurements of CDMA800, GSM900, GSM1800, and 3G (UMTS) signals were performed at distances ranging from 50 to 500 m from 60 base stations in five Australian cities. The exposure levels from these mobile telecommunications base stations were found to be well below the general public exposure limits of the ICNIRP guidelines and the Australian radiofrequency standard (ARPANSA RPS3). The highest recorded level from a single base station was 7.8 x 10(-3) W/m(2), which translates to 0.2% of the general public exposure limit.

  14. An empirical model of H2O, CO2 and CO coma distributions and production rates for comet 67P/Churyumov-Gerasimenko based on ROSINA/DFMS measurements and AMPS-DSMC simulations

    NASA Astrophysics Data System (ADS)

    Hansen, Kenneth C.; Altwegg, Kathrin; Bieler, Andre; Berthelier, Jean-Jacques; Calmonte, Ursina; Combi, Michael R.; De Keyser, Johan; Fiethe, Björn; Fougere, Nicolas; Fuselier, Stephen; Gombosi, T. I.; Hässig, Myrtha; Huang, Zhenguang; Le Roy, Léna; Rubin, Martin; Tenishev, Valeriy; Toth, Gabor; Tzou, Chia-Yu; ROSINA Team

    2016-10-01

    We have previously used results from the AMPS DSMC (Adaptive Mesh Particle Simulator Direct Simulation Monte Carlo) model to create an empirical model of the near-comet water (H2O) coma of comet 67P/Churyumov-Gerasimenko. In this work we create additional empirical models for the coma distributions of CO2 and CO. The AMPS simulations are based on ROSINA DFMS (Rosetta Orbiter Spectrometer for Ion and Neutral Analysis, Double Focusing Mass Spectrometer) data taken over the entire timespan of the Rosetta mission. The empirical model is created using AMPS DSMC results which are extracted from simulations at a range of radial distances, rotation phases and heliocentric distances. The simulation results are then averaged over a comet rotation and fitted to an empirical model distribution. Model coefficients are then fitted to piecewise-linear functions of heliocentric distance. The final product is an empirical model of the coma distribution which is a function of heliocentric distance, radial distance, and sun-fixed longitude and latitude angles. The model clearly mimics the behavior of water shifting production from North to South across the inbound equinox, while the CO2 production is always in the South. The empirical model can be used to de-trend the spacecraft motion from the ROSINA COPS and DFMS data. The ROSINA instrument measures the neutral coma density at a single point, and the measured value is influenced by the location of the spacecraft relative to the comet and the comet-sun line. Using the empirical coma model we can correct for the position of the spacecraft and compute a total production rate based on single-point measurements. We also present the coma production rates as a function of heliocentric distance for the entire Rosetta mission. This work was supported by contracts JPL#1266313 and JPL#1266314 from the US Rosetta Project and NASA grant NNX14AG84G from the Planetary Atmospheres Program.
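
    A sketch of the final step described above: coefficients tabulated at a few heliocentric distances are evaluated as piecewise-linear functions, then combined into a coma density. The node values and the functional form of the density are assumptions for illustration, not the paper's fitted model.

    import numpy as np

    rh_nodes = np.array([1.3, 2.0, 3.0])  # heliocentric distance nodes (AU), assumed
    c0_nodes = np.array([8.0, 5.5, 3.0])  # fitted coefficient values, assumed
    c1_nodes = np.array([1.2, 0.8, 0.4])

    def coma_density(rh_au, r_km, lat_rad):
        # Piecewise-linear evaluation of each coefficient in heliocentric distance.
        c0 = np.interp(rh_au, rh_nodes, c0_nodes)
        c1 = np.interp(rh_au, rh_nodes, c1_nodes)
        # Hypothetical form: 1/r^2 falloff modulated by sun-fixed latitude.
        return 10 ** c0 * (1.0 + c1 * np.sin(lat_rad)) / r_km ** 2

    print(coma_density(1.5, 30.0, 0.3))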

  15. What Do Stroke Patients Look for in Game-Based Rehabilitation: A Survey Study.

    PubMed

    Hung, Ya-Xuan; Huang, Pei-Chen; Chen, Kuan-Ta; Chu, Woei-Chyn

    2016-03-01

    Stroke is one of the most common causes of physical disability, and early, intensive, and repetitive rehabilitation exercises are crucial to the recovery of stroke survivors. Unfortunately, research shows that only one third of stroke patients actually perform recommended exercises at home, because of the repetitive and mundane nature of conventional rehabilitation exercises. Thus, motivating stroke survivors to persist with monotonous rehabilitation is a significant issue in the therapy process. Game-based rehabilitation systems have the potential to encourage patients to continue rehabilitation exercises at home. However, these systems are still rarely adopted in patients' homes. Discovering and eliminating the obstacles to promoting game-based rehabilitation at home is therefore essential. For this purpose, we conducted a study to collect and analyze the opinions and expectations of stroke patients and clinical therapists. The study is composed of 2 parts: a rehab-preference survey - interviews with both patients and therapists to understand current practices, challenges, and expectations for game-based rehabilitation systems; and a rehab-compatibility survey - a gaming experiment with therapists to determine which commercial games are compatible with rehabilitation. The study was conducted with 30 outpatients with stroke and 19 occupational therapists from 2 rehabilitation centers in Taiwan. Our surveys show that game-based rehabilitation systems can make rehabilitation exercises more appealing and provide personalized motivation for various stroke patients. Patients prefer to perform rehabilitation exercises with more diverse and fun games, and need cost-effective rehabilitation systems, which are often built on commodity hardware. Our study also sheds light on incorporating existing design-for-fun games into rehabilitation systems. We envision the results being helpful in developing a platform which enables rehab-compatible (i.e., existing, appropriately

  16. Identifying significant covariates for anti-HIV treatment response: mechanism-based differential equation models and empirical semiparametric regression models.

    PubMed

    Huang, Yangxin; Liang, Hua; Wu, Hulin

    2008-10-15

    In this paper, a mechanism-based ordinary differential equation (ODE) model and a flexible semiparametric regression model are employed to identify the significant covariates for antiretroviral response in AIDS clinical trials. We consider the treatment effect as a function of three factors (or covariates): pharmacokinetics, drug adherence and susceptibility. Both clinical and simulated data examples are given to illustrate these two different kinds of modeling approaches. We found that the ODE model is more powerful for modeling the mechanism-based nonlinear relationship between treatment effects and virological response biomarkers. The ODE model is also better at identifying the significant factors for virological response, although it is slightly liberal and tends to include more factors (or covariates) in the model. The semiparametric mixed-effects regression model is very flexible for fitting the virological response data, but it is too liberal to identify the correct factors for the virological response and may sometimes miss them. The ODE model is also biologically justifiable and good for predictions and simulations under various biological scenarios. Its limitations include the high cost of computation and the requirement of biological assumptions that may not be easy to validate. The methodologies reviewed in this paper are also generally applicable to studies of other viruses such as hepatitis B virus or hepatitis C virus.
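
    A standard target-cell-limited viral dynamics model of the mechanism-based kind compared here (not necessarily the paper's exact equations), with the treatment effect entering as a time-varying efficacy that scales infectivity; all parameter values are illustrative.

    from scipy.integrate import solve_ivp

    # Illustrative parameters: production, death, infectivity, clearance rates.
    lam, d, k, delta, p, c = 1e4, 0.01, 8e-7, 0.7, 100.0, 13.0

    def e_t(t):
        # Time-varying drug efficacy in [0, 1]; in the paper's framing this
        # would depend on pharmacokinetics, adherence and susceptibility.
        return 0.85

    def rhs(t, y):
        T, I, V = y  # target cells, infected cells, free virus
        dT = lam - d * T - (1 - e_t(t)) * k * T * V
        dI = (1 - e_t(t)) * k * T * V - delta * I
        dV = p * I - c * V
        return [dT, dI, dV]

    sol = solve_ivp(rhs, (0, 100), [1e6, 1e4, 1e5])
    print(sol.y[2, -1])  # viral load at day 100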

  17. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    PubMed Central

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and to determine the effective factors involved in supplier selection and their priorities. The paper is based on the results of a literature review, experts' opinion acquisition, statistical analysis and the application of MADM models to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined and a framework is built around them; then, according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  18. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    PubMed

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and to determine the effective factors involved in supplier selection and their priorities. The paper is based on the results of a literature review, experts' opinion acquisition, statistical analysis and the application of MADM models to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined and a framework is built around them; then, according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry.

  19. Subjectivity of LiDAR-Based Offset Measurements: Results from a Public Online Survey

    NASA Astrophysics Data System (ADS)

    Salisbury, J. B.; Arrowsmith, R.; Rockwell, T. K.; Haddad, D. E.; Zielke, O.; Madden, C.

    2012-12-01

    Geomorphic features (e.g., stream channels) that are offset in an earthquake can be measured to determine slip at that location. Analysis of these and other offset features can provide useful information for generating fault slip distributions. Remote analyses of active fault zones using high-resolution LiDAR data have recently been pursued in several studies, but there is a lack of consistency between users both for data analysis and results reporting. Individual investigators typically make offset measurements in a particular study area with their own protocols for measurement, assessing uncertainty, and quality rating, yet there is no coherent understanding of the reliability and repeatability of the measurements from observer to observer. We invited the participation of colleagues, interested geoscience communities, and the general public to measure ten geomorphic offsets from active faults in western North America using remote measurement methods that span a range of complexity (e.g., paper image and scale, the Google Earth ruler tool, and a MATLAB GUI for calculating backslip required to properly restore tectonic deformation) to explore the subjectivity involved with measuring geomorphic offsets. We provided a semi-quantitative quality-rating rubric for a description of offset quality, but there was a general lack of quality rating/offset uncertainty reporting. Survey responses (including mapped fault traces and piercing lines) were anonymously submitted along with user experience information. We received 11 paper-, 28 Google Earth-, and 16 MATLAB-based survey responses, though not all individuals measured every feature provided. For all survey methods, the majority of responses are in close agreement. However, large discrepancies arise where users interpret landforms differently, specifically the pre-earthquake morphologies and total offset accumulation of geomorphic features. Experienced users make more consistent measurements, whereas beginners less
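
    A sketch of the back-slip idea behind the MATLAB GUI mentioned above: points mapped on one side of the fault are translated along strike by a trial slip, and the analyst judges when the offset feature (e.g., a channel) is restored. The fault geometry and coordinates are synthetic.

    import numpy as np

    strike = np.deg2rad(310.0)  # fault azimuth, assumed
    unit = np.array([np.sin(strike), np.cos(strike)])  # along-strike unit vector (E, N)

    # Mapped points of the offset channel on the far side of the fault (E, N in metres).
    channel_far_side = np.array([[12.0, 3.0], [13.5, 6.0], [15.0, 9.0]])

    def backslip(points, slip_m):
        """Translate far-side points back along strike by a trial slip."""
        return points - slip_m * unit

    # Try several trial slips; in the GUI the user inspects which one aligns
    # the channel across the fault trace.
    for trial in (2.0, 4.0, 6.0):
        print(trial, backslip(channel_far_side, trial)[0])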

  20. Teaching earth science in the field: GPS-based educational trails as a practically relevant, empirical verified approach

    NASA Astrophysics Data System (ADS)

    Kisser, Thomas

    2015-04-01

    GPS devices are in common use in daily life and are used increasingly often in geography classes. At present, the specialist literature on them is merely descriptive and thematically reduced to the function of orientation. The questions of whether they are a suitable tool for teaching earth science content, and whether the lasting learning success differs from that of normal lessons held in a classroom, have not been answered. Neurobiological and educational-psychology findings support the idea that students completing a GPS-based educational trail learn more successfully than students in a "normal" class: a successful contextualization of modern geomedia stimulates motivation. Geocaches are also suitable for didactic structuring. The order of the "Geopoints" is chosen in such a way that the structure of the landscape is displayed adequately. The students are engaged affectively by the real-life encounters and experience their environment consciously. The concept presented here, the "GPS-based educational trail", differs from a normal geocache, which is merely a hide-and-seek game: the main focus lies on field work and earth science, and the GPS devices are used for orientation between the Geopoints. In order to obtain two groups whose developmental psychology and age-related levels of cognitive and methodological competence differ as much as possible, classes from grade 5 (11 years old) and grade 11 (17 years old) were chosen. The different cognitive states of development require different didactic approaches. For grade 11, the topic "rearrangements of fluvial topography" is a possible one: using the example of the anthropogenic rearrangement of the Rheinaue wetlands near Karlsruhe, the interdependency between humans and the environment can be shown. The "Nördlinger Ries" between the Swabian and the Franconian Jura was chosen for grade 5. The typical elements of the Swabian Jura (karst formation, hydrogeology