Science.gov

Sample records for empirically based theoretical

  1. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, for αβγMγ steels, the results of the empirical and theoretical models showed excellent agreement, within acceptable error, with experimental results reported in other references.
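
    The Zener-Hollomon formulation referenced in this abstract is standard in hot-working analysis. As a minimal sketch (not the article's fitted model), the temperature-compensated strain rate Z = ε̇·exp(Q/RT) can be combined with the sine-hyperbolic flow-stress law; every material constant below is a hypothetical placeholder rather than a value fitted for functionally graded steels.

    ```python
    import numpy as np

    # Sketch of a Zener-Hollomon (Z-H) flow-stress estimate.
    # All constants are hypothetical placeholders, not the generalized
    # constants fitted for functionally graded steels in the article.
    R = 8.314        # gas constant, J/(mol*K)
    Q = 350e3        # apparent activation energy, J/mol (assumed)
    A = 1.0e13       # material constant, 1/s (assumed)
    alpha = 0.012    # stress multiplier, 1/MPa (assumed)
    n = 5.0          # stress exponent (assumed)

    def zener_hollomon(strain_rate, T_kelvin):
        """Z = strain_rate * exp(Q / (R*T))."""
        return strain_rate * np.exp(Q / (R * T_kelvin))

    def flow_stress(strain_rate, T_kelvin):
        """Sine-hyperbolic law: sigma = (1/alpha) * asinh((Z/A)**(1/n))."""
        Z = zener_hollomon(strain_rate, T_kelvin)
        return np.arcsinh((Z / A) ** (1.0 / n)) / alpha

    # Example: flow stress (MPa) at 1273 K for two strain rates (1/s)
    print(flow_stress(np.array([0.01, 1.0]), 1273.0))
    ```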

  2. Distributed optical fiber-based theoretical and empirical methods monitoring hydraulic engineering subjected to seepage velocity

    NASA Astrophysics Data System (ADS)

    Su, Huaizhi; Tian, Shiguang; Cui, Shusheng; Yang, Meng; Wen, Zhiping; Xie, Wei

    2016-09-01

    In order to systematically investigate the general principles and methods for monitoring seepage velocity in hydraulic engineering, theoretical analysis and physical experiments were carried out based on distributed fiber-optic temperature sensing (DTS) technology. To analyze the coupling between the seepage field and the temperature field in embankment dam or dike engineering, a simplified model was constructed to describe the coupling relationship between the two fields. Different optical fiber arrangement schemes and temperature measurement approaches were applied to the model, and an inversion analysis approach was further used. A theoretical method for monitoring seepage velocity in hydraulic engineering was finally proposed. A new concept, the effective thermal conductivity, was introduced by analogy with the thermal conductivity coefficient in the transient hot-wire method. The combined influence of heat conduction and seepage is well reflected by this concept, which proved to be a promising basis for developing an empirical method for monitoring seepage velocity in hydraulic engineering.

  3. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education. PMID:22987194

  4. A Review of Theoretical and Empirical Advancements

    ERIC Educational Resources Information Center

    Wang, Mo; Henkens, Kene; van Solinge, Hanna

    2011-01-01

    In this article, we review both theoretical and empirical advancements in retirement adjustment research. After reviewing and integrating current theories about retirement adjustment, we propose a resource-based dynamic perspective to apply to the understanding of retirement adjustment. We then review empirical findings that are associated with…

  5. Theoretical and empirical bases for dialect-neutral language assessment: contributions from theoretical and applied linguistics to communication disorders.

    PubMed

    Pearson, Barbara Zurer

    2004-02-01

    Three avenues of theoretical research provide insights for discovering abstract properties of language that are subject to disorder and amenable to assessment: (1) the study of universal grammar and its acquisition; (2) descriptions of African American English (AAE) Syntax, Semantics, and Phonology within theoretical linguistics; and (3) the study of specific language impairment (SLI) cross-linguistically. Abstract linguistic concepts were translated into a set of assessment protocols that were used to establish normative data on language acquisition (developmental milestones) in typically developing AAE children ages 4 to 9 years. Testing AAE-speaking language impaired (LI) children and both typically developing (TD) and LI Mainstream American English (MAE)-learning children on these same measures provided the data to select assessments for which (1) TD MAE and AAE children performed the same, and (2) TD performance was reliably different from LI performance in both dialect groups. PMID:15088229

  6. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence

    ERIC Educational Resources Information Center

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the…

  7. Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools

    ERIC Educational Resources Information Center

    Samdal, Oddrun; Rowling, Louise

    2011-01-01

    Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…

  8. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  9. Theoretical and Empirical Descriptions of Thermospheric Density

    NASA Astrophysics Data System (ADS)

    Solomon, S. C.; Qian, L.

    2004-12-01

    The longest-term and most accurate overall description of the density of the upper thermosphere is provided by analysis of changes in the ephemerides of Earth-orbiting satellites. Empirical models of the thermosphere developed in part from these measurements can do a reasonable job of describing thermospheric properties on a climatological basis, but the promise of first-principles global general circulation models of the coupled thermosphere/ionosphere system is that a true high-resolution, predictive capability may ultimately be developed for thermospheric density. However, several issues are encountered when attempting to tune such models so that they accurately represent absolute densities as a function of altitude, and their changes on solar-rotational and solar-cycle time scales. Among these are the crucial ones of getting the heating rates (from both solar and auroral sources) right, getting the cooling rates right, and establishing the appropriate boundary conditions. There are also several ancillary issues, such as the problem of registering a pressure-coordinate model onto an altitude scale, and dealing with possible departures from hydrostatic equilibrium in empirical models. Thus, tuning a theoretical model to match empirical climatology may be difficult, even in the absence of high temporal or spatial variation of the energy sources. We will discuss some of the challenges involved, and show comparisons of simulations using the NCAR Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) to empirical model estimates of neutral thermosphere density and temperature. We will also show some recent simulations using measured solar irradiance from the TIMED/SEE instrument as input to the TIE-GCM.
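
    The pressure-to-altitude registration issue mentioned above reduces, under hydrostatic equilibrium, to integrating dz = −H·d(ln p) with scale height H = kT/(mg). The sketch below illustrates that step only; the temperature, composition, and pressure levels are assumed values, not TIE-GCM or empirical-model output.

    ```python
    import numpy as np

    # Sketch: register a pressure-coordinate profile onto an altitude scale
    # assuming hydrostatic equilibrium: dz = -H d(ln p), H = k*T/(m*g).
    # Temperature, composition, and pressure levels are illustrative only.
    k = 1.380649e-23        # Boltzmann constant, J/K
    g = 9.5                 # gravitational acceleration aloft, m/s^2 (approx.)
    m = 16 * 1.6605e-27     # mean molecular mass, kg (atomic oxygen, assumed)

    p = np.logspace(-2, -7, 50)         # pressure levels, Pa (descending)
    T = np.full_like(p, 1000.0)         # assumed temperature on each level, K

    z = np.zeros_like(p)
    z[0] = 120e3                        # assumed altitude of the lowest level, m
    for i in range(1, len(p)):
        H = k * 0.5 * (T[i] + T[i - 1]) / (m * g)   # local scale height
        z[i] = z[i - 1] - H * np.log(p[i] / p[i - 1])

    print(f"top level: {z[-1]/1e3:.0f} km at p = {p[-1]:.1e} Pa")
    ```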

  10. Theoretical modeling of stream potholes based upon empirical observations from the Orange River, Republic of South Africa

    NASA Astrophysics Data System (ADS)

    Springer, Gregory S.; Tooth, Stephen; Wohl, Ellen E.

    2006-12-01

    Potholes carved into streambeds can be important components of channel incision, but they have received little quantitative attention. Here, empirical evidence from three sites along the Orange River, Republic of South Africa, demonstrates that pothole radius and depth are strongly correlated through a simple power law. Where radius is the dependent variable, the exponent of the power law describes the rate of increase in radius with increasing depth. Erosion within potholes is complexly related to erosion on the adjacent bed. Erosion efficiencies within small, hemispherical potholes must be high if the potholes are to survive in the face of bed translation (incision). As potholes deepen, however, the necessary efficiencies decline rapidly. Increasing concavity associated with growth imposes stricter constraints; comparatively deep potholes must erode orders of magnitude larger volumes of substrate than shallower potholes in the face of bed retreat. Hemispherical potholes are eventually converted to cylindrical potholes, the geometries of which favor enlargement while they are small. Geometric models constructed using the power law show unambiguously that more substrate is eroded by volume from cylindrical pothole walls during growth than from cylindrical pothole floors. Grinders thus play a secondary role to suspended sediment entrained within the vortices that occur in potholes. Continued growth leads to coalescence with other potholes or destruction through block detachment, depending on local geology. The combination of geology and erosion mechanisms may determine whether a strath or inner channel develops as a consequence of the process.
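
    The radius-depth relation described above is a simple power law, r = a·d^b, which can be fit by least squares in log-log space. The sketch below shows that fit on made-up depth/radius pairs; these are placeholders, not the Orange River measurements.

    ```python
    import numpy as np

    # Sketch: fit the power law r = a * d**b by least squares in log space.
    # Depth/radius values are hypothetical placeholders, not field data.
    depth  = np.array([0.10, 0.25, 0.50, 1.00, 2.00])   # m (assumed)
    radius = np.array([0.12, 0.20, 0.30, 0.45, 0.70])   # m (assumed)

    b, log_a = np.polyfit(np.log(depth), np.log(radius), 1)
    a = np.exp(log_a)
    print(f"r ≈ {a:.2f} * d^{b:.2f}")
    ```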

  11. Semivolatile Organic Compounds in Homes: Strategies for Efficient and Systematic Exposure Measurement Based on Empirical and Theoretical Factors

    PubMed Central

    2014-01-01

    Residential exposure can dominate total exposure for commercial chemicals of health concern; however, despite the importance of consumer exposures, methods for estimating household exposures remain limited. We collected house dust and indoor air samples in 49 California homes and analyzed for 76 semivolatile organic compounds (SVOCs)—phthalates, polybrominated diphenyl ethers (PBDEs), polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and pesticides. Sixty chemicals were detected in either dust or air and here we report 58 SVOCs detected in dust for the first time. In dust, phthalates (bis(2-ethylhexyl) phthalate, benzyl butyl phthalate, di-n-butyl phthalate) and flame retardants (PBDE 99, PBDE 47) were detected at the highest concentrations relative to other chemicals at the 95th percentile, while phthalates were highest at the median. Because SVOCs are found in both gas and condensed phases and redistribute from their original source over time, partitioning models can clarify their fate indoors. We use empirical data to validate air-dust partitioning models and use these results, combined with experience in SVOC exposure assessment, to recommend residential exposure measurement strategies. We can predict dust concentrations reasonably well from measured air concentrations (R2 = 0.80). Partitioning models and knowledge of chemical Koa elucidate exposure pathways and suggest priorities for chemical regulation. These findings also inform study design by allowing researchers to select sampling approaches optimized for their chemicals of interest and study goals. While surface wipes are commonly used in epidemiology studies because of ease of implementation, passive air sampling may be more standardized between homes and also relatively simple to deploy. Validation of passive air sampling methods for SVOCs is a priority. PMID:25488487
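
    The validated air-dust partitioning described above (Koa is the octanol-air partition coefficient) can be illustrated generically as a log-log regression of dust concentration on air concentration and Koa. The sketch below shows only that generic approach with hypothetical placeholder values; it is not the paper's data or its fitted model.

    ```python
    import numpy as np

    # Generic air-dust partitioning regression sketch:
    #   log10(C_dust) ~ b0 + b1*log10(C_air) + b2*log10(Koa)
    # All values are hypothetical placeholders, not the study's measurements.
    log_air  = np.array([-1.2, -0.5, 0.1, 0.8, 1.5])     # log10 air conc. (assumed)
    log_koa  = np.array([ 9.0, 10.0, 11.0, 10.5, 12.0])  # log10 Koa (assumed)
    log_dust = np.array([ 1.0,  2.1,  3.0,  3.4,  4.8])  # log10 dust conc. (assumed)

    X = np.column_stack([np.ones_like(log_air), log_air, log_koa])
    coef, *_ = np.linalg.lstsq(X, log_dust, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((log_dust - pred)**2) / np.sum((log_dust - log_dust.mean())**2)
    print("coefficients:", np.round(coef, 3), " R^2:", round(r2, 2))
    ```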

  12. Competence and drug use: theoretical frameworks, empirical evidence and measurement.

    PubMed

    Lindenberg, C S; Solorzano, R; Kelley, M; Darrow, V; Gendrop, S C; Strickland, O

    1998-01-01

    Statistics show that use of harmful substances (alcohol, cigarettes, marijuana, cocaine) among women of childbearing age is widespread and serious. Numerous theoretical models and empirical studies have attempted to explain the complex factors that lead individuals to use drugs. The Social Stress Model of Substance Abuse [1] is one model developed to explain parameters that influence drug use. According to the model, the likelihood of an individual engaging in drug use is seen as a function of the stress level and the extent to which it is offset by stress modifiers such as social networks, social competencies, and resources. The variables of the denominator are viewed as interacting with each other to buffer the impact of stress [1]. This article focuses on one of the constructs in this model: that of competence. It presents a summary of theoretical and conceptual formulations for the construct of competence, a review of empirical evidence for the association of competence with drug use, and describes the preliminary development of a multi-scale instrument designed to assess drug protective competence among low-income Hispanic childbearing women. Based upon theoretical and empirical studies, eight domains of drug protective competence were identified and conceptually defined. Using subscales from existing instruments with psychometric evidence for their validity and reliability, a multi-scale instrument was developed to assess drug protective competence. Hypothesis testing was used to assess construct validity. Four drug protective competence domains (social influence, sociability, self-worth, and control/responsibility) were found to be statistically associated with drug use behaviors. Although not statistically significant, expected trends were observed between drug use and the other four domains of drug protective competence (intimacy, nurturance, goal directedness, and spiritual directedness). Study limitations and suggestions for further psychometric testing

  13. Empirical and theoretical analysis of complex systems

    NASA Astrophysics Data System (ADS)

    Zhao, Guannan

    structures evolve on a similar timescale to individual-level transmission, we investigated the process of transmission through a model population comprising social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.
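
    The power law test referred to above is presumably of the maximum-likelihood type popularized by Clauset, Shalizi, and Newman. A minimal sketch of its core step, the MLE for the exponent of a continuous power law with a known lower cutoff, is shown below; the synthetic data and the omission of the bootstrap goodness-of-fit stage (the part that benefits most from parallel computation) are simplifications.

    ```python
    import numpy as np

    # Sketch: MLE for a continuous power law p(x) ~ x**(-alpha), x >= xmin:
    #   alpha_hat = 1 + n / sum(ln(x / xmin))
    # Data are synthetic; the KS/bootstrap goodness-of-fit test is omitted.
    rng = np.random.default_rng(0)
    xmin, true_alpha, n = 1.0, 2.5, 10_000
    x = xmin * (1.0 - rng.random(n)) ** (-1.0 / (true_alpha - 1.0))  # inverse-CDF sampling

    alpha_hat = 1.0 + n / np.sum(np.log(x / xmin))
    print("estimated alpha:", round(alpha_hat, 3))
    ```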

  14. The Generality of Empirical and Theoretical Explanations of Behavior

    PubMed Central

    Guilhardi, Paulo; Church, Russell M.

    2009-01-01

    For theoretical explanations of data, parameter values estimated from a single dependent measure from one procedure are used to predict alternative dependent measures from many procedures. Theoretical explanations were compared to empirical explanations of data in which known functions and principles were used to fit only selected dependent measures. The comparison focused on the ability of theoretical and empirical explanations to generalize across samples of the data, across dependent measures of behavior, and across different procedures. Rat and human data from fixed-interval and peak procedures, in which principles (e.g., scalar timing) are well known, were described and fit by a theory with independent modules for perception, memory, and decision. The theoretical approach consisted of fitting closed-form equations of the theory to response rate gradients calculated from the data, simulating responses using parameter values previously estimated, and comparing theoretical predictions with dependent measures not used to estimate parameters. Although the empirical and theoretical explanations provided similar fits to the response rate gradients that generalized across samples and had the same number of parameters, only the theoretical explanation generalized across procedures and dependent measures. PMID:19429213

  15. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    ERIC Educational Resources Information Center

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  16. Empirical and theoretical models of terrestrial trapped radiation

    SciTech Connect

    Panasyuk, M.I.

    1996-07-01

    A survey of current Skobeltsyn Institute of Nuclear Physics, Moscow State University (INP MSU) empirical and theoretical models of particles (electrons, protons and heavier ions) of the Earth's radiation belts developed to date is presented. Results of intercomparison of the different models as well as comparison with experimental data are reported. Aspects of further development of radiation condition modelling in near-Earth space are discussed. © 1996 American Institute of Physics.

  17. Segmented crystalline scintillators: empirical and theoretical investigation of a high quantum efficiency EPID based on an initial engineering prototype CsI(Tl) detector.

    PubMed

    Sawant, Amit; Antonuk, Larry E; El-Mohri, Youcef; Zhao, Qihua; Wang, Yi; Li, Yixin; Du, Hong; Perna, Louis

    2006-04-01

    Modern-day radiotherapy relies on highly sophisticated forms of image guidance in order to implement increasingly conformal treatment plans and achieve precise dose delivery. One of the most important goals of such image guidance is to delineate the clinical target volume from surrounding normal tissue during patient setup and dose delivery, thereby avoiding dependence on surrogates such as bony landmarks. In order to achieve this goal, it is necessary to integrate highly efficient imaging technology, capable of resolving soft-tissue contrast at very low doses, within the treatment setup. In this paper we report on the development of one such modality, which comprises a nonoptimized, prototype electronic portal imaging device (EPID) based on a 40 mm thick, segmented crystalline CsI(Tl) detector incorporated into an indirect-detection active matrix flat panel imager (AMFPI). The segmented detector consists of a matrix of 160 × 160 optically isolated, crystalline CsI(Tl) elements spaced at 1016 μm pitch. The detector was coupled to an indirect detection-based active matrix array having a pixel pitch of 508 μm, with each detector element registered to 2 × 2 array pixels. The performance of the prototype imager was evaluated under very low-dose radiotherapy conditions and compared to that of a conventional megavoltage AMFPI based on a Lanex Fast-B phosphor screen. Detailed quantitative measurements were performed in order to determine the x-ray sensitivity, modulation transfer function, noise power spectrum, and detective quantum efficiency (DQE). In addition, images of a contrast-detail phantom and an anthropomorphic head phantom were also acquired. The prototype imager exhibited approximately 22 times higher zero-frequency DQE (approximately 22%) compared to that of the conventional AMFPI (approximately 1%). The measured zero-frequency DQE was found to be lower than theoretical upper limits (approximately 27%) calculated from Monte Carlo simulations, which

  18. Segmented crystalline scintillators: Empirical and theoretical investigation of a high quantum efficiency EPID based on an initial engineering prototype CsI(Tl) detector

    SciTech Connect

    Sawant, Amit; Antonuk, Larry E.; El-Mohri, Youcef; Zhao Qihua; Wang Yi; Li Yixin; Du Hong; Perna, Louis

    2006-04-15

    Modern-day radiotherapy relies on highly sophisticated forms of image guidance in order to implement increasingly conformal treatment plans and achieve precise dose delivery. One of the most important goals of such image guidance is to delineate the clinical target volume from surrounding normal tissue during patient setup and dose delivery, thereby avoiding dependence on surrogates such as bony landmarks. In order to achieve this goal, it is necessary to integrate highly efficient imaging technology, capable of resolving soft-tissue contrast at very low doses, within the treatment setup. In this paper we report on the development of one such modality, which comprises a nonoptimized, prototype electronic portal imaging device (EPID) based on a 40 mm thick, segmented crystalline CsI(Tl) detector incorporated into an indirect-detection active matrix flat panel imager (AMFPI). The segmented detector consists of a matrix of 160 × 160 optically isolated, crystalline CsI(Tl) elements spaced at 1016 μm pitch. The detector was coupled to an indirect detection-based active matrix array having a pixel pitch of 508 μm, with each detector element registered to 2 × 2 array pixels. The performance of the prototype imager was evaluated under very low-dose radiotherapy conditions and compared to that of a conventional megavoltage AMFPI based on a Lanex Fast-B phosphor screen. Detailed quantitative measurements were performed in order to determine the x-ray sensitivity, modulation transfer function, noise power spectrum, and detective quantum efficiency (DQE). In addition, images of a contrast-detail phantom and an anthropomorphic head phantom were also acquired. The prototype imager exhibited approximately 22 times higher zero-frequency DQE (approximately 22%) compared to that of the conventional AMFPI (approximately 1%). The measured zero-frequency DQE was found to be lower than theoretical upper limits (approximately 27%) calculated from Monte Carlo simulations, which were based solely on the x

  19. Empirical STORM-E Model: I. Theoretical and Observational Basis

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III

    2013-01-01

    Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 μm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 μm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 μm VER is fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 μm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
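
    The linear impulse-response framework mentioned above can be illustrated schematically: a storm-time enhancement is modeled as the convolution of a geomagnetic index with a causal filter whose taps are estimated by least squares. The filter length, the exponential shape of the synthetic response, the use of ap, and all numbers below are assumptions for illustration; this is not the STORM-E fit.

    ```python
    import numpy as np

    # Schematic impulse-response fit: ratio(t) ≈ 1 + sum_k h[k] * ap[t-k].
    # The ap series, the "true" filter, and the filter length K are
    # hypothetical placeholders, not the SABER/STORM-E quantities.
    rng = np.random.default_rng(1)
    T, K = 500, 12
    ap = rng.gamma(shape=2.0, scale=10.0, size=T)        # synthetic index
    h_true = 0.002 * np.exp(-np.arange(K) / 4.0)         # assumed decaying response
    ratio = 1.0 + np.convolve(ap, h_true)[:T] + 0.01 * rng.standard_normal(T)

    # Lagged design matrix; column k holds ap delayed by k steps.
    X = np.column_stack([np.concatenate([np.zeros(k), ap[:T - k]]) for k in range(K)])
    h_hat, *_ = np.linalg.lstsq(X, ratio - 1.0, rcond=None)
    print("recovered filter, first 4 taps:", np.round(h_hat[:4], 4))
    ```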

  20. Converging Paradigms: A Reflection on Parallel Theoretical Developments in Psychoanalytic Metapsychology and Empirical Dream Research.

    PubMed

    Schmelowszky, Ágoston

    2016-08-01

    In the last decades one can perceive a striking parallelism between the shifting perspective of leading representatives of empirical dream research concerning their conceptualization of dreaming and the paradigm shift within clinically based psychoanalytic metapsychology with respect to its theory on the significance of dreaming. In metapsychology, dreaming becomes more and more a central metaphor of mental functioning in general. The theories of Klein, Bion, and Matte-Blanco can be considered as milestones of this paradigm shift. In empirical dream research, the competing theories of Hobson and of Solms respectively argued for and against the meaningfulness of the dream-work in the functioning of the mind. In the meantime, empirical data coming from various sources seemed to prove the significance of dream consciousness for the development and maintenance of adaptive waking consciousness. Metapsychological speculations and hypotheses based on empirical research data seem to point in the same direction, promising for contemporary psychoanalytic practice a more secure theoretical base. In this paper the author brings together these diverse theoretical developments and presents conclusions regarding psychoanalytic theory and technique, as well as proposing an outline of an empirical research plan for testing the specificity of psychoanalysis in developing dream formation. PMID:27500705

  1. Whole-body cryotherapy: empirical evidence and theoretical perspectives.

    PubMed

    Bleakley, Chris M; Bieuzen, François; Davison, Gareth W; Costello, Joseph T

    2014-01-01

    Whole-body cryotherapy (WBC) involves short exposures to air temperatures below -100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC. PMID:24648779

  2. Whole-body cryotherapy: empirical evidence and theoretical perspectives

    PubMed Central

    Bleakley, Chris M; Bieuzen, François; Davison, Gareth W; Costello, Joseph T

    2014-01-01

    Whole-body cryotherapy (WBC) involves short exposures to air temperatures below −100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC. PMID:24648779

  3. Ability and Learning: A Theoretical and Empirical Synthesis.

    ERIC Educational Resources Information Center

    Haertel, Geneva D.; Walberg, Herbert J.

    To gauge the relationship between intellectual ability and learning, the authors review the work of 20 theorists and analyze empirical correlations at both the elementary and secondary school levels. Intellectual ability is defined in the paper as including intelligence, prior learning, special aptitudes, and other cognitive characteristics. The…

  4. Potential benefits of remote sensing: Theoretical framework and empirical estimate

    NASA Technical Reports Server (NTRS)

    Eisgruber, L. M.

    1972-01-01

    A theoretical framework is outlined for estimating social returns from research and application of remote sensing. The approximate dollar magnitude of a particular application of remote sensing, namely estimates of corn, soybean, and wheat production, is given. Finally, some comments are made on the limitations of this procedure and on the implications of the results.

  5. Alternative Information Theoretic Measures of Television Messages: An Empirical Test.

    ERIC Educational Resources Information Center

    Danowski, James A.

    This research examines two information theoretic measures of media exposure within the same sample of respondents and examines their relative strengths in predicting self-reported aggression. The first measure is the form entropy (DYNUFAM) index of Watt and Krull, which assesses the structural and organizational properties of specific television…

  6. The ascent of man: Theoretical and empirical evidence for blatant dehumanization.

    PubMed

    Kteily, Nour; Bruneau, Emile; Waytz, Adam; Cotterill, Sarah

    2015-11-01

    Dehumanization is a central concept in the study of intergroup relations. Yet although theoretical and methodological advances in subtle, "everyday" dehumanization have progressed rapidly, blatant dehumanization remains understudied. The present research attempts to refocus theoretical and empirical attention on blatant dehumanization, examining when and why it provides explanatory power beyond subtle dehumanization. To accomplish this, we introduce and validate a blatant measure of dehumanization based on the popular depiction of evolutionary progress in the "Ascent of Man." We compare blatant dehumanization to established conceptualizations of subtle and implicit dehumanization, including infrahumanization, perceptions of human nature and human uniqueness, and implicit associations between ingroup-outgroup and human-animal concepts. Across 7 studies conducted in 3 countries, we demonstrate that blatant dehumanization is (a) more strongly associated with individual differences in support for hierarchy than subtle or implicit dehumanization, (b) uniquely predictive of numerous consequential attitudes and behaviors toward multiple outgroup targets, (c) predictive above prejudice, and (d) reliable over time. Finally, we show that blatant, but not subtle, dehumanization spikes immediately after incidents of real intergroup violence and strongly predicts support for aggressive actions like torture and retaliatory violence (after the Boston Marathon bombings and Woolwich attacks in England). This research extends theory on the role of dehumanization in intergroup relations and intergroup conflict and provides an intuitive, validated empirical tool to reliably measure blatant dehumanization. PMID:26121523

  7. Theoretical, Methodological, and Empirical Approaches to Cost Savings: A Compendium

    SciTech Connect

    M Weimar

    1998-12-10

    This publication summarizes and contains the original documentation for understanding why the U.S. Department of Energy's (DOE's) privatization approach provides cost savings and the different approaches that could be used in calculating cost savings for the Tank Waste Remediation System (TWRS) Phase I contract. The initial section summarizes the approaches in the different papers. The appendices are the individual source papers, which have been reviewed by individuals outside of the Pacific Northwest National Laboratory and the TWRS Program. Appendix A provides a theoretical basis for and estimate of the level of savings that can be obtained from a fixed-price contract with performance risk maintained by the contractor. Appendix B provides the methodology for determining cost savings when comparing a fixed-price contractor with a Management and Operations (M&O) contractor (cost-plus contractor). Appendix C summarizes the economic model used to calculate cost savings and provides hypothetical output from preliminary calculations. Appendix D provides the summary of the approach for the DOE-Richland Operations Office (RL) estimate of the M&O contractor to perform the same work as BNFL Inc. Appendix E contains information on cost growth and per-metric-ton-of-glass costs for high-level waste at two other DOE sites, West Valley and Savannah River. Appendix F addresses a risk allocation analysis of the BNFL proposal that indicates that the current approach is still better than the alternative.

  8. Submarine gas hydrate estimation: Theoretical and empirical approaches

    SciTech Connect

    Ginsburg, G.D.; Soloviev, V.A.

    1995-12-01

    The published submarine gas hydrate resource estimates are based on the concepts of their continuous extent over large areas and depth intervals and/or regionally high hydrate concentrations in sediments. The observational data are in conflict with these concepts. At present such estimates cannot be made to an accuracy better than an order of magnitude. The amount of methane in shallow subbottom (seepage-associated) gas-hydrate accumulations is estimated at 10¹⁴ m³ STP, and in deep-seated hydrates at 10¹⁵ m³, according to observational data. From a genetic standpoint, the gas hydrate potential can, for the time being, only be assessed as far less than 10¹⁷ m³, because the rates of the related hydrogeological and geochemical processes have not been adequately studied.

  9. The sensations of everyday life: empirical, theoretical, and pragmatic considerations.

    PubMed

    Dunn, W

    2001-01-01

    The experience of being human is embedded in the sensory events of everyday life. This lecture reviews the sensory processing literature, including neuroscience and social science perspectives. Dunn's Model of Sensory Processing is introduced, and the evidence supporting this model is summarized. Specifically, using Sensory Profile questionnaires (i.e., items describing responses to sensory events in daily life; persons mark the frequency of each behavior), persons from birth to 90 years of age demonstrate four sensory processing patterns: sensory seeking, sensory avoiding, sensory sensitivity, and low registration. These patterns are based on a person's neurological thresholds and self-regulation strategies. Psychophysiology studies verify these sensory processing patterns; persons with strong preferences in each pattern also have unique patterns of habituation and responsivity in skin conductance. Studies also indicate that persons with disabilities respond differently than peers on these questionnaires, suggesting underlying poor sensory processing in certain disorders, including autism, attention deficit hyperactivity disorder, developmental delays, and schizophrenia. The author proposes relationships between sensory processing and temperament and personality traits. The four categories of temperament share some consistency with the four sensory processing patterns described in Dunn's model. As with temperament, each person has some level of responsiveness within each sensory processing preference (i.e., a certain amount of seeking, avoiding, etc., not one or the other). The author suggests that one's sensory processing preferences simultaneously reflect his or her nervous system needs and form the basis for the manifestation of temperament and personality. The final section of this lecture outlines parameters for developing best practice that supports interventions based on this knowledge. PMID:12959225

  10. An Analysis of Enabling School Structure: Theoretical, Empirical, and Research Considerations

    ERIC Educational Resources Information Center

    Sinden, James E.; Hoy, Wayne K.; Sweetland, Scott R.

    2004-01-01

    The construct of enabling school structure is empirically analyzed in this qualitative study of high schools. First, the theoretical underpinning of enabling school structure is developed. Then, six high schools, which were determined to have enabling structures in a large quantitative study of Ohio schools, were analyzed in depth using…

  11. Theoretical Foundation of Zisman's Empirical Equation for Wetting of Liquids on Solid Surfaces

    ERIC Educational Resources Information Center

    Zhu, Ruzeng; Cui, Shuwen; Wang, Xiaosong

    2010-01-01

    Theories of wetting of liquids on solid surfaces under the condition that van der Waals force is dominant are briefly reviewed. We show theoretically that Zisman's empirical equation for wetting of liquids on solid surfaces is a linear approximation of the Young-van der Waals equation in the wetting region, and we express the two parameters in…

  12. A Unified Model of Knowledge Sharing Behaviours: Theoretical Development and Empirical Test

    ERIC Educational Resources Information Center

    Chennamaneni, Anitha; Teng, James T. C.; Raja, M. K.

    2012-01-01

    Research and practice on knowledge management (KM) have shown that information technology alone cannot guarantee that employees will volunteer and share knowledge. While previous studies have linked motivational factors to knowledge sharing (KS), we took a further step to thoroughly examine this theoretically and empirically. We developed a…

  13. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  14. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  15. An empirical and theoretical investigation of the intensities of 4f-4f electronic transitions

    SciTech Connect

    Devlin, M.T.

    1987-01-01

    The intensities of certain lanthanide 4f-4f electronic transitions exhibit extraordinary sensitivity to the ligand environment near a lanthanide ion, and empirical and theoretical investigations of these 4f-4f electric-dipole transitions are reported herein. From these studies, the mechanistic basis of 4f-4f electric-dipole transition intensities is evaluated. Additionally, correlations between the structure of lanthanide-ligand complexes and empirically observed electronic transition intensities are developed. The general applicability and utility of these spectra-structure correlations are also evaluated. The influence of the ligand environment on 4f-4f transition intensities is investigated by measuring the absorption spectra of a series of well-characterized neodymium (Nd³⁺)-, holmium (Ho³⁺)-, and erbium (Er³⁺)-ligand complexes. Trends in the absorption intensity spectra of these lanthanide complexes are related to specific structural features of each complex. The empirically observed spectral trends are evaluated by theoretically investigating the mechanism by which 4f-4f electric-dipole transitions occur. Two separate models of 4f-4f electronic transitions, the static-coupling and the dynamic-coupling models, are incorporated into the general Judd-Ofelt intensity theory. Using these two models, theoretical calculations of 4f-4f electronic transition intensities are performed. The results of these calculations are in good agreement with empirically observed 4f-4f electronic transition intensities, and they are useful in rationalizing the observed spectra-structure correlations.

  16. Color and psychological functioning: a review of theoretical and empirical work

    PubMed Central

    Elliot, Andrew J.

    2015-01-01

    In the past decade there has been increased interest in research on color and psychological functioning. Important advances have been made in theoretical work and empirical work, but there are also important weaknesses in both areas that must be addressed for the literature to continue to develop apace. In this article, I provide brief theoretical and empirical reviews of research in this area, in each instance beginning with a historical background and recent advancements, and proceeding to an evaluation focused on weaknesses that provide guidelines for future research. I conclude by reiterating that the literature on color and psychological functioning is at a nascent stage of development, and by recommending patience and prudence regarding conclusions about theory, findings, and real-world application. PMID:25883578

  17. Dignity in the care of older people – a review of the theoretical and empirical literature

    PubMed Central

    Gallagher, Ann; Li, Sarah; Wainwright, Paul; Jones, Ian Rees; Lee, Diana

    2008-01-01

    Background Dignity has become a central concern in UK health policy in relation to older and vulnerable people. The empirical and theoretical literature relating to dignity is extensive and as likely to confound and confuse as to clarify the meaning of dignity for nurses in practice. The aim of this paper is critically to examine the literature and to address the following questions: What does dignity mean? What promotes and diminishes dignity? And how might dignity be operationalised in the care of older people? This paper critically reviews the theoretical and empirical literature relating to dignity and clarifies the meaning and implications of dignity in relation to the care of older people. If nurses are to provide dignified care, clarification is an essential first step. Methods This is a review article, critically examining papers reporting theoretical perspectives and empirical studies relating to dignity. The following databases were searched: ASSIA, BHI, CINAHL, Social Services Abstracts, IBSS, Web of Knowledge Social Sciences Citation Index and Arts & Humanities Citation Index, supplemented by the location of books and chapters in the philosophy literature. An analytical approach was adopted to the publications reviewed, focusing on the objectives of the review. Results and discussion We review a range of theoretical and empirical accounts of dignity and identify key dignity-promoting factors evident in the literature, including staff attitudes and behaviour; environment; culture of care; and the performance of specific care activities. Although there is scope to learn more about cultural aspects of dignity we know a good deal about dignity in care in general terms. Conclusion We argue that what is required is to provide sufficient support and education to help nurses understand dignity and adequate resources to operationalise dignity in their everyday practice. Using the themes identified from our review we offer proposals for the direction of future research. PMID:18620561

  18. Conceptual and empirical problems with game theoretic approaches to language evolution

    PubMed Central

    Watumull, Jeffrey; Hauser, Marc D.

    2014-01-01

    The importance of game theoretic models to evolutionary theory has been in formulating elegant equations that specify the strategies to be played and the conditions to be satisfied for particular traits to evolve. These models, in conjunction with experimental tests of their predictions, have successfully described and explained the costs and benefits of varying strategies and the dynamics for establishing equilibria in a number of evolutionary scenarios, including especially cooperation, mating, and aggression. Over the past decade or so, game theory has been applied to model the evolution of language. In contrast to the aforementioned scenarios, however, we argue that these models are problematic due to conceptual confusions and empirical deficiencies. In particular, these models conflate the computations and representations of our language faculty (mechanism) with its utility in communication (function); model languages as having different fitness functions for which there is no evidence; depend on assumptions for the starting state of the system, thereby begging the question of how these systems evolved; and to date, have generated no empirical studies at all. Game theoretic models of language evolution have therefore failed to advance how or why language evolved, or why it has the particular representations and computations that it does. We conclude with some brief suggestions for how this situation might be ameliorated, enabling this important theoretical tool to make substantive empirical contributions. PMID:24678305

  19. Conceptual and empirical problems with game theoretic approaches to language evolution.

    PubMed

    Watumull, Jeffrey; Hauser, Marc D

    2014-01-01

    The importance of game theoretic models to evolutionary theory has been in formulating elegant equations that specify the strategies to be played and the conditions to be satisfied for particular traits to evolve. These models, in conjunction with experimental tests of their predictions, have successfully described and explained the costs and benefits of varying strategies and the dynamics for establishing equilibria in a number of evolutionary scenarios, including especially cooperation, mating, and aggression. Over the past decade or so, game theory has been applied to model the evolution of language. In contrast to the aforementioned scenarios, however, we argue that these models are problematic due to conceptual confusions and empirical deficiencies. In particular, these models conflate the computations and representations of our language faculty (mechanism) with its utility in communication (function); model languages as having different fitness functions for which there is no evidence; depend on assumptions for the starting state of the system, thereby begging the question of how these systems evolved; and to date, have generated no empirical studies at all. Game theoretic models of language evolution have therefore failed to advance how or why language evolved, or why it has the particular representations and computations that it does. We conclude with some brief suggestions for how this situation might be ameliorated, enabling this important theoretical tool to make substantive empirical contributions. PMID:24678305

  20. Why It Is Hard to Find Genes Associated With Social Science Traits: Theoretical and Empirical Considerations

    PubMed Central

    Lee, James J.; Benjamin, Daniel J.; Beauchamp, Jonathan P.; Glaeser, Edward L.; Borst, Gregoire; Pinker, Steven; Laibson, David I.

    2013-01-01

    Objectives. We explain why traits of interest to behavioral scientists may have a genetic architecture featuring hundreds or thousands of loci with tiny individual effects rather than a few with large effects and why such an architecture makes it difficult to find robust associations between traits and genes. Methods. We conducted a genome-wide association study at 2 sites, Harvard University and Union College, measuring more than 100 physical and behavioral traits with a sample size typical of candidate gene studies. We evaluated predictions that alleles with large effect sizes would be rare and most traits of interest to social science are likely characterized by a lack of strong directional selection. We also carried out a theoretical analysis of the genetic architecture of traits based on R.A. Fisher’s geometric model of natural selection and empirical analyses of the effects of selection bias and phenotype measurement stability on the results of genetic association studies. Results. Although we replicated several known genetic associations with physical traits, we found only 2 associations with behavioral traits that met the nominal genome-wide significance threshold, indicating that physical and behavioral traits are mainly affected by numerous genes with small effects. Conclusions. The challenge for social science genomics is the likelihood that genes are connected to behavioral variation by lengthy, nonlinear, interactive causal chains, and unraveling these chains requires allying with personal genomics to take advantage of the potential for large sample sizes as well as continuing with traditional epidemiological studies. PMID:23927501

  1. The Role of Trait Emotional Intelligence in Academic Performance: Theoretical Overview and Empirical Update.

    PubMed

    Perera, Harsha N

    2016-01-01

    Considerable debate still exists among scholars over the role of trait emotional intelligence (TEI) in academic performance. The dominant theoretical position is that TEI should be orthogonal or only weakly related to achievement; yet, there are strong theoretical reasons to believe that TEI plays a key role in performance. The purpose of the current article is to provide (a) an overview of the possible theoretical mechanisms linking TEI with achievement and (b) an update on empirical research examining this relationship. To elucidate these theoretical mechanisms, the overview draws on multiple theories of emotion and regulation, including TEI theory, social-functional accounts of emotion, and expectancy-value and psychobiological model of emotion and regulation. Although these theoretical accounts variously emphasize different variables as focal constructs, when taken together, they provide a comprehensive picture of the possible mechanisms linking TEI with achievement. In this regard, the article redresses the problem of vaguely specified theoretical links currently hampering progress in the field. The article closes with a consideration of directions for future research. PMID:26515326

  2. On the complex relationship between energy expenditure and longevity: Reconciling the contradictory empirical results with a simple theoretical model.

    PubMed

    Hou, Chen; Amunugama, Kaushalya

    2015-07-01

    The relationship between energy expenditure and longevity has been a central theme in aging studies. Empirical studies have yielded controversial results, which cannot be reconciled by existing theories. In this paper, we present a simple theoretical model based on first principles of energy conservation and allometric scaling laws. The model takes into consideration the energy tradeoffs between life history traits and the efficiency of energy utilization, and offers quantitative and qualitative explanations for a set of seemingly contradictory empirical results. We show that oxidative metabolism can affect cellular damage and longevity in different ways in animals with different life histories and under different experimental conditions. Qualitative data and the linearity between energy expenditure, cellular damage, and lifespan assumed in previous studies are not sufficient to understand the complexity of the relationships. Our model provides a theoretical framework for quantitative analyses and predictions. The model is supported by a variety of empirical studies, including studies on the cellular damage profile during ontogeny; the intra- and inter-specific correlations between body mass, metabolic rate, and lifespan; and the effects on lifespan of (1) diet restriction and genetic modification of growth hormone, (2) the cold and exercise stresses, and (3) manipulations of antioxidants. PMID:26086438
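
    For background, the allometric scaling laws invoked above are typically quarter-power relations; as a generic illustration (not the specific energy-budget model of this paper), whole-organism metabolic rate, mass-specific metabolic rate, and lifespan are often written as:

    ```latex
    % Generic quarter-power allometric relations (illustrative background only,
    % not the specific model of the paper):
    B \;=\; B_0\,M^{3/4}, \qquad
    \frac{B}{M} \;\propto\; M^{-1/4}, \qquad
    L \;\propto\; M^{1/4}
    ```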

  3. Theoretical and empirical qualification of a mechanical-optical interface for parallel optics links

    NASA Astrophysics Data System (ADS)

    Chuang, S.; Schoellner, D.; Ugolini, A.; Wakjira, J.; Wolf, G.; Gandhi, P.; Persaud, A.

    2015-03-01

    As the implementation of parallel optics continues to evolve, development of a universal coupling interface between VCSEL/PD arrays and the corresponding photonic turn connector is necessary. A newly developed monolithic mechanical-optical interface efficiently couples optical transmit/receive arrays to the accompanying fiber optic connector. This paper describes the optical model behind the coupling interface and validates the model using empirical measurements. Optical modeling will address how the interface is adaptable to the broad range of VCSEL/PD optical parameters from commercially available VCSEL hardware manufacturers; the optical model will illustrate coupling efficiencies versus launch specifications. Theoretical modeling will examine system sensitivity through Monte Carlo simulations and provide alignment tolerance requirements. Empirical results will be presented to validate the optical model predictions and subsequent system performance. Functionality will be demonstrated through optical loss and coupling efficiency measurements. System metrics will include characterizations such as eye diagram results and link loss measurements.

  4. Scientific thinking in young children: theoretical advances, empirical research, and policy implications.

    PubMed

    Gopnik, Alison

    2012-09-28

    New theoretical ideas and empirical research show that very young children's learning and thinking are strikingly similar to much learning and thinking in science. Preschoolers test hypotheses against data and make causal inferences; they learn from statistics and informal experimentation, and from watching and listening to others. The mathematical framework of probabilistic models and Bayesian inference can describe this learning in precise ways. These discoveries have implications for early childhood education and policy. In particular, they suggest both that early childhood experience is extremely important and that the trend toward more structured and academic early childhood programs is misguided. PMID:23019643

  5. SAGE II/Umkehr ozone comparisons and aerosols effects: An empirical and theoretical study. Final report

    SciTech Connect

    Newchurch, M.

    1997-09-15

    The objectives of this research were to: (1) examine empirically the aerosol effect on Umkehr ozone profiles using SAGE II aerosol and ozone data; (2) examine theoretically the aerosol effect on Umkehr ozone profiles; (3) examine the differences between SAGE II ozone profiles and both old- and new-format Umkehr ozone profiles for ozone-trend information; (4) reexamine SAGE I-Umkehr ozone differences with the most recent version of SAGE I data; and (5) contribute to the SAGE II science team.

  6. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    ERIC Educational Resources Information Center

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  7. Common liability to addiction and “gateway hypothesis”: Theoretical, empirical and evolutionary perspective

    PubMed Central

    Vanyukov, Michael M.; Tarter, Ralph E.; Kirillova, Galina P.; Kirisci, Levent; Reynolds, Maureen D.; Kreek, Mary Jeanne; Conway, Kevin P.; Maher, Brion S.; Iacono, William G.; Bierut, Laura; Neale, Michael C.; Clark, Duncan B.; Ridenour, Ty A.

    2013-01-01

    Background Two competing concepts address the development of involvement with psychoactive substances: the “gateway hypothesis” (GH) and common liability to addiction (CLA). Method The literature on theoretical foundations and empirical findings related to both concepts is reviewed. Results The data suggest that drug use initiation sequencing, the core GH element, is variable and opportunistic rather than uniform and developmentally deterministic. The association between risks for use of different substances, if any, can be more readily explained by common underpinnings than by specific staging. In contrast, the CLA concept is grounded in genetic theory and supported by data identifying common sources of variation in the risk for specific addictions. This commonality has identifiable neurobiological substrate and plausible evolutionary explanations. Conclusions Whereas the “gateway” hypothesis does not specify mechanistic connections between “stages”, and does not extend to the risks for addictions, the concept of common liability to addictions incorporates sequencing of drug use initiation as well as extends to related addictions and their severity, provides a parsimonious explanation of substance use and addiction co-occurrence, and establishes a theoretical and empirical foundation to research in etiology, quantitative risk and severity measurement, as well as targeted non-drug-specific prevention and early intervention. PMID:22261179

  8. Theoretical and empirical scale dependency of Z-R relationships: Evidence, impacts, and correction

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Barthès, Laurent; Mallet, Cécile

    2013-07-01

    Estimation of rainfall intensities from radar measurements relies to a large extent on power-law relationships between rain rates R and radar reflectivities Z, i.e., Z = a*R^b. These relationships are generally applied without regard to scale, which is questionable since the nonlinearity of these relations could lead to undesirable discrepancies when combined with scale aggregation. Since the parameters (a,b) are expected to be related to drop size distribution (DSD) properties, they are often derived at disdrometer scale, not at radar scale, which could lead to errors at the latter. We propose to investigate the statistical behavior of Z-R relationships across scales on both theoretical and empirical sides. Theoretically, it is shown that claimed multifractal properties of rainfall processes could constrain the parameters (a,b) such that the exponent b would be scale independent but the prefactor a would grow as a (slow) power law of time or space scale. In the empirical part (which may be read independently of the theoretical considerations), high-resolution disdrometer (Dual-Beam Spectropluviometer) data of rain rates and reflectivity factors are considered at various integration times in the range 15 s to 64 min. A variety of regression techniques is applied to Z-R scatterplots at all these time scales, establishing empirical evidence of a behavior consistent with theoretical considerations: a grows as a 0.1 power law of scale while b decreases slightly. The properties of a are suggested to be closely linked to inhomogeneities in the DSDs, since extensions of Z-R relationships involving (here, strongly nonconstant) normalization parameters of the DSDs seem to be more robust across scales. The scale dependence of simple Z = a*R^b relationships is argued to be a possible source of overestimation of rainfall intensities or accumulations. Several ways of correcting such scaling biases (which can reach >15-20% in terms of relative error) are suggested.
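    A minimal sketch of the empirical side of such an analysis, under stated assumptions: synthetic rain-rate and reflectivity series stand in for the disdrometer data, and a single log-log least-squares fit replaces the paper's variety of regression techniques. The point is only the mechanics of re-estimating (a, b) after each time aggregation; real series have temporal correlation and multifractal structure that this toy lacks, so the direction and size of the drift should be taken from the paper, not from the printout.

    ```python
    import numpy as np

    # Sketch only: fit Z = a * R**b in log-log space at several aggregation times.
    # R15, Z15 are synthetic stand-ins for 15 s disdrometer series; the noisy
    # Marshall-Palmer-like law generating them is an assumption for illustration.

    def fit_power_law(R, Z):
        """Return (a, b) of Z = a * R**b from a log-log linear regression."""
        mask = (R > 0) & (Z > 0)
        b, log_a = np.polyfit(np.log(R[mask]), np.log(Z[mask]), 1)
        return np.exp(log_a), b

    def aggregate(x, k):
        """Block-average consecutive groups of k samples (time aggregation)."""
        n = (len(x) // k) * k
        return x[:n].reshape(-1, k).mean(axis=1)

    rng = np.random.default_rng(0)
    n = 4 * 60 * 24 * 3                                   # ~3 days of 15 s samples
    R15 = rng.lognormal(mean=0.0, sigma=1.0, size=n)      # rain rate (mm/h)
    Z15 = 200.0 * R15**1.6 * rng.lognormal(0.0, 0.3, n)   # reflectivity with scatter

    for k in (1, 4, 16, 64, 256):                         # 15 s up to 64 min
        a, b = fit_power_law(aggregate(R15, k), aggregate(Z15, k))
        print(f"aggregation {k * 15:>5d} s : a = {a:7.1f}, b = {b:4.2f}")
    ```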

  9. Evolution of the empirical and theoretical foundations of eyewitness identification reform.

    PubMed

    Clark, Steven E; Moreland, Molly B; Gronlund, Scott D

    2014-04-01

    Scientists in many disciplines have begun to raise questions about the evolution of research findings over time (Ioannidis in Epidemiology, 19, 640-648, 2008; Jennions & Møller in Proceedings of the Royal Society, Biological Sciences, 269, 43-48, 2002; Mullen, Muellerleile, & Bryan in Personality and Social Psychology Bulletin, 27, 1450-1462, 2001; Schooler in Nature, 470, 437, 2011), since many phenomena exhibit decline effects-reductions in the magnitudes of effect sizes as empirical evidence accumulates. The present article examines empirical and theoretical evolution in eyewitness identification research. For decades, the field has held that there are identification procedures that, if implemented by law enforcement, would increase eyewitness accuracy, either by reducing false identifications, with little or no change in correct identifications, or by increasing correct identifications, with little or no change in false identifications. Despite the durability of this no-cost view, it is unambiguously contradicted by data (Clark in Perspectives on Psychological Science, 7, 238-259, 2012a; Clark & Godfrey in Psychonomic Bulletin & Review, 16, 22-42, 2009; Clark, Moreland, & Rush, 2013; Palmer & Brewer in Law and Human Behavior, 36, 247-255, 2012), raising questions as to how the no-cost view became well-accepted and endured for so long. Our analyses suggest that (1) seminal studies produced, or were interpreted as having produced, the no-cost pattern of results; (2) a compelling theory was developed that appeared to account for the no-cost pattern; (3) empirical results changed over the years, and subsequent studies did not reliably replicate the no-cost pattern; and (4) the no-cost view survived despite the accumulation of contradictory empirical evidence. Theories of memory that were ruled out by early data now appear to be supported by data, and the theory developed to account for early data now appears to be incorrect. PMID:24258271

  10. Theoretic Fit and Empirical Fit: The Performance of Maximum Likelihood versus Generalized Least Squares Estimation in Structural Equation Models.

    ERIC Educational Resources Information Center

    Olsson, Ulf Henning; Troye, Sigurd Villads; Howell, Roy D.

    1999-01-01

    Used simulation to compare the ability of maximum likelihood (ML) and generalized least-squares (GLS) estimation to provide theoretic fit in models that are parsimonious representations of a true model. The better empirical fit obtained for GLS, compared with ML, was obtained at the cost of lower theoretic fit. (Author/SLD)

  11. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    NASA Astrophysics Data System (ADS)

    Kuhlicke, C.

    2009-04-01

    … that the flood was far beyond people's power of imagination (nescience). The reason is that, prior to the flood, an institutionalized space of experience and horizon of expectation existed which did not allow for the possibility that the "stability" of the river was artificially created by engineering works that reduce its naturally given variability. Based on the empirical findings and the theoretical reasoning, overall conclusions are drawn and implications for flood risk management under conditions of global environmental change are outlined.

  12. How beauty works. Theoretical mechanisms and two empirical applications on students' evaluation of teaching.

    PubMed

    Wolbring, Tobias; Riordan, Patrick

    2016-05-01

    Plenty of studies show that the physical appearance of a person affects a variety of outcomes in everyday life. However, due to an incomplete theoretical explication and empirical problems in disentangling different beauty effects, it is unclear which mechanisms are at work. To clarify how beauty works we present explanations from evolutionary theory and expectation states theory and show where both perspectives differ and where interlinkage appears promising. Using students' evaluations of teaching we find observational and experimental evidence for the different causal pathways of physical attractiveness. First, independent raters strongly agree over the physical attractiveness of a person. Second, attractive instructors receive better student ratings. Third, students attend classes of attractive instructors more frequently - even after controlling for teaching quality. Fourth, we find no evidence that attractiveness effects become stronger if rater and ratee are of the opposite sex. Finally, the beauty premium turns into a penalty if an attractive instructor falls short of students' expectations. PMID:26973043

  13. The adaptive evolution of virulence: a review of theoretical predictions and empirical tests.

    PubMed

    Cressler, Clayton E; McLEOD, David V; Rozins, Carly; VAN DEN Hoogen, Josée; Day, Troy

    2016-06-01

    Why is it that some parasites cause high levels of host damage (i.e. virulence) whereas others are relatively benign? There are now numerous reviews of virulence evolution in the literature but it is nevertheless still difficult to find a comprehensive treatment of the theory and data on the subject that is easily accessible to non-specialists. Here we attempt to do so by distilling the vast theoretical literature on the topic into a set of relatively few robust predictions. We then provide a comprehensive assessment of the available empirical literature that tests these predictions. Our results show that there have been some notable successes in integrating theory and data but also that theory and empiricism in this field do not 'speak' to each other very well. We offer a few suggestions for how the connection between the two might be improved. PMID:26302775

  14. The Influence of Education and Socialization on Radicalization: An Exploration of Theoretical Presumptions and Empirical Research.

    PubMed

    Pels, Trees; de Ruyter, Doret J

    2012-06-01

    BACKGROUND AND OBJECTIVE: Research into radicalization does not pay much attention to education. This is remarkable and possibly misses an important influence on the process of radicalization. Therefore, this article sets out to explore the relation between education on the one hand and the onset or prevention of radicalization on the other hand. METHOD: This article is a theoretical literature review. It has analyzed empirical studies, mainly from European countries, about the educational aims, content and style of Muslim parents and parents with (extreme) right-wing sympathies. RESULTS: Research examining similarity in right-wing sympathies between parents and children yields mixed results, but studies among adolescents point to a significant concordance. Research also showed that authoritarian parenting may play a significant role. Similar research among Muslim families was not found. While raising children with distrust and an authoritarian parenting style is prevalent, the impact on adolescents has not been investigated. The empirical literature we reviewed does not give sufficient evidence to conclude that a democratic ideal in education and an authoritative educational style are conducive to the development of a democratic attitude. CONCLUSION: There is a knowledge gap with regard to the influence of education on the onset or the prevention of radicalization. Schools and families are underappreciated sources of informal social control and social capital and therefore the gap should be closed. If there is a better understanding of the effect of education, policy as well as interventions can be developed to assist parents and teachers in preventing radicalization. PMID:22611328

  15. From the bench to modeling--R0 at the interface between empirical and theoretical approaches in epidemiology of environmentally transmitted infectious diseases.

    PubMed

    Ivanek, Renata; Lahodny, Glenn

    2015-02-01

    transmission rate of infection and the pathogen growth rate in the environment. Moreover, we identified experimental conditions for which the theoretical R0 predictions based on the hypotheses H2 and H3 differ greatly, which would assist their discrimination and conclusive validation against future empirical studies. Once a valid theoretical R0 is identified for Salmonella Typhimurium in mice, its generalizability to other host-pathogen-environment systems should be tested. The present study may serve as a template for integrated empirical and theoretical research of R0 in the epidemiology of ETIDs. PMID:25441048

  16. Theoretical Foundations for Evidence-Based Health Informatics: Why? How?

    PubMed

    Scott, Philip J; Georgiou, Andrew; Hyppönen, Hannele; Craven, Catherine K; Rigby, Michael; Brender McNair, Jytte

    2016-01-01

    A scientific approach to health informatics requires sound theoretical foundations. Health informatics implementation would be more effective if evidence-based and guided by theories about what is likely to work in what circumstances. We report on a Medinfo 2015 workshop on this topic jointly organized by the EFMI Working Group on Assessment of Health Information Systems and the IMIA Working Group on Technology Assessment and Quality Development. We discuss the findings of the workshop and propose an approach to consolidate empirical knowledge into testable middle-range theories. PMID:27577457

  17. Coaching and guidance with patient decision aids: A review of theoretical and empirical evidence

    PubMed Central

    2013-01-01

    Background Coaching and guidance are structured approaches that can be used within or alongside patient decision aids (PtDAs) to facilitate the process of decision making. Coaching is provided by an individual, and guidance is embedded within the decision support materials. The purpose of this paper is to: a) present updated definitions of the concepts “coaching” and “guidance”; b) present an updated summary of current theoretical and empirical insights into the roles played by coaching/guidance in the context of PtDAs; and c) highlight emerging issues and research opportunities in this aspect of PtDA design. Methods We identified literature published since 2003 on shared decision making theoretical frameworks inclusive of coaching or guidance. We also conducted a sub-analysis of randomized controlled trials included in the 2011 Cochrane Collaboration Review of PtDAs with search results updated to December 2010. The sub-analysis was conducted on the characteristics of coaching and/or guidance included in any trial of PtDAs and trials that allowed the impact of coaching and/or guidance with PtDA to be compared to another intervention or usual care. Results Theoretical evidence continues to justify the use of coaching and/or guidance to better support patients in the process of thinking about a decision and in communicating their values/preferences with others. In 98 randomized controlled trials of PtDAs, 11 trials (11.2%) included coaching and 63 trials (64.3%) provided guidance. Compared to usual care, coaching provided alongside a PtDA improved knowledge and decreased mean costs. The impact on some other outcomes (e.g., participation in decision making, satisfaction, option chosen) was more variable, with some trials showing positive effects and other trials reporting no differences. For values-choice agreement, decisional conflict, adherence, and anxiety there were no differences between groups. None of these outcomes were worse when patients were exposed

  18. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  19. Multisystemic Therapy: An Empirically Supported, Home-Based Family Therapy Approach.

    ERIC Educational Resources Information Center

    Sheidow, Ashli J.; Woodford, Mark S.

    2003-01-01

    Multisystemic Therapy (MST) is a well-validated, evidenced-based treatment for serious clinical problems presented by adolescents and their families. This article is an introduction to the MST approach and outlines key clinical features, describes the theoretical underpinnings, and discusses the empirical support for MST's effectiveness with a…

  20. A theoretical and empirical analysis of context: neighbourhoods, smoking and youth.

    PubMed

    Frohlich, Katherine L; Potvin, Louise; Chabot, Patrick; Corin, Ellen

    2002-05-01

    Numerous studies are currently addressing the issue of contextual effects on health and disease outcomes. The majority of these studies fall short of providing a theoretical basis with which to explain what context is and how it affects individual disease outcomes. We propose a theoretical model, entitled collective lifestyles, which brings together three concepts from practice theory: social structure, social practices and agency. We do so in an attempt to move away from both behavioural and structural-functionalist explanations of the differential distribution of disease outcomes among areas by including a contextualisation of health behaviours that considers their meaning. We test the framework using the empirical example of smoking and pre-adolescents in 32 communities across Québec, Canada. Social structure is operationalised as characteristics and resources; characteristics are the socio-economic aggregate characteristics of individuals culled from the 1996 Canadian Census, and resources are what regulates and transforms smoking practices. Information about social practices was collected in focus groups with pre-adolescents from four of the participating communities. Using zero-order and partial correlations we find that a portrait of communities emerges. Where there is a high proportion of more socio-economically advantaged people, resources tend to be more smoking discouraging, with the opposite being true for disadvantaged communities. Upon analysis of the focus group material, however, we find that the social practices in communities do not necessarily reflect the "objectified" measures of social structure. We suggest that a different conceptualisation of accessibility and lifestyle in contextual studies may enable us to improve our grasp on how differential rates of disease come about in local areas. PMID:12058856

  1. Evaluation of theoretical and empirical water vapor sorption isotherm models for soils

    NASA Astrophysics Data System (ADS)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per; de Jonge, Lis W.

    2016-01-01

    The mathematical characterization of water vapor sorption isotherms of soils is crucial for modeling processes such as volatilization of pesticides and diffusive and convective water vapor transport. Although numerous physically based and empirical models were previously proposed to describe sorption isotherms of building materials, food, and other industrial products, knowledge about the applicability of these functions for soils is noticeably lacking. We present an evaluation of nine models for characterizing adsorption/desorption isotherms for a water activity range from 0.03 to 0.93 based on measured data of 207 soils with widely varying textures, organic carbon contents, and clay mineralogy. In addition, the potential applicability of the models for prediction of sorption isotherms from known clay content was investigated. While, in general, all investigated models described the measured adsorption and desorption isotherms reasonably well, distinct differences were observed between physical and empirical models and between model equations with different degrees of freedom. There were also considerable differences in model performance for adsorption and desorption data. While regression analysis relating model parameters and clay content and subsequent model application for prediction of measured isotherms showed promise for the majority of investigated soils, for soils with distinctly kaolinitic or smectitic clay mineralogy, predicted isotherms did not closely match the measurements.
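    As a sketch of what fitting one such isotherm model looks like in practice, the snippet below fits the GAB equation with scipy; the GAB model is used purely as an example (the record does not list the nine models evaluated), and the water activity/water content pairs are synthetic placeholders, not data from the 207 soils.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gab(aw, wm, C, K):
        """GAB isotherm: equilibrium water content as a function of water activity aw."""
        return wm * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

    # synthetic placeholder data (water activity, g water per g dry soil)
    aw = np.array([0.03, 0.10, 0.30, 0.50, 0.70, 0.85, 0.93])
    w = np.array([0.004, 0.009, 0.017, 0.024, 0.035, 0.052, 0.075])

    (wm, C, K), _ = curve_fit(gab, aw, w, p0=(0.02, 10.0, 0.7), maxfev=10000)
    rmse = np.sqrt(np.mean((gab(aw, wm, C, K) - w) ** 2))
    print(f"wm = {wm:.4f}, C = {C:.2f}, K = {K:.3f}, RMSE = {rmse:.5f}")
    ```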

  2. Empirical corroboration of an earlier theoretical resolution to the UV paradox of insect polarized skylight orientation.

    PubMed

    Wang, Xin; Gao, Jun; Fan, Zhiguo

    2014-02-01

    It is surprising that many insect species use only the ultraviolet (UV) component of the polarized skylight for orientation and navigation purposes, while both the intensity and the degree of polarization of light from the clear sky are lower in the UV than at longer (blue, green, red) wavelengths. Why have these insects chosen the UV part of the polarized skylight? This strange phenomenon is called the "UV-sky-pol paradox". Although earlier several speculations tried to resolve this paradox, they did this without any quantitative data. A theoretical and computational model has convincingly explained why it is advantageous for certain animals to detect celestial polarization in the UV. We performed a sky-polarimetric approach and built a polarized skylight sensor that models the processing of polarization signals by insect photoreceptors. Using this model sensor, we carried out measurements under clear and cloudy sky conditions. Our results showed that light from the cloudy sky has maximal degree of polarization in the UV. Furthermore, under both clear and cloudy skies the angle of polarization of skylight can be detected with a higher accuracy. By this, we corroborated empirically the soundness of the earlier computational resolution of the UV-sky-pol paradox. PMID:24402685

  4. Safety climate and injuries: an examination of theoretical and empirical relationships.

    PubMed

    Beus, Jeremy M; Payne, Stephanie C; Bergman, Mindy E; Arthur, Winfred

    2010-07-01

    Our purpose in this study was to meta-analytically address several theoretical and empirical issues regarding the relationships between safety climate and injuries. First, we distinguished between extant safety climate-->injury and injury-->safety climate relationships for both organizational and psychological safety climates. Second, we examined several potential moderators of these relationships. Meta-analyses revealed that injuries were more predictive of organizational safety climate than safety climate was predictive of injuries. Additionally, the injury-->safety climate relationship was stronger for organizational climate than for psychological climate. Moderator analyses revealed that the degree of content contamination in safety climate measures inflated effects, whereas measurement deficiency attenuated effects. Additionally, moderator analyses showed that as the time period over which injuries were assessed lengthened, the safety climate-->injury relationship was attenuated. Supplemental meta-analyses of specific safety climate dimensions also revealed that perceived management commitment to safety is the most robust predictor of occupational injuries. Contrary to expectations, the operationalization of injuries did not meaningfully moderate safety climate-injury relationships. Implications and recommendations for future research and practice are discussed. PMID:20604591

  5. Solubility of caffeine from green tea in supercritical CO2: a theoretical and empirical approach.

    PubMed

    Gadkari, Pravin Vasantrao; Balaraman, Manohar

    2015-12-01

    Decaffeination of fresh green tea was carried out with supercritical CO2 in the presence of ethanol as co-solvent. The solubility of caffeine in supercritical CO2 varied from 44.19 × 10(-6) to 149.55 × 10(-6) (mole fraction) over a pressure and temperature range of 15 to 35 MPa and 313 to 333 K, respectively. The maximum solubility of caffeine was obtained at 25 MPa and 323 K. Experimental solubility data were correlated with the theoretical equation of state models Peng-Robinson (PR), Soave Redlich-Kwong (SRK), and Redlich-Kwong (RK). The RK model reproduced the experimental data with 15.52 % average absolute relative deviation (AARD). In contrast, the empirical Gordillo model gave the best fit to the experimental data, with only 0.96 % AARD. Under supercritical conditions, the solubility of caffeine in the tea matrix was lower than the solubility of pure caffeine. Further, the solubility of caffeine in supercritical CO2 was compared with the solubility of pure caffeine in conventional solvents, and a maximum solubility of 90 × 10(-3) mole fraction was obtained with chloroform. PMID:26604372
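    For orientation, the Redlich-Kwong equation of state named above has the standard pure-component form below; the actual correlation of mixture solubility additionally requires mixing rules and fitted interaction parameters, which this record does not give.

    ```latex
    P \;=\; \frac{RT}{V_m - b} \;-\; \frac{a}{\sqrt{T}\,V_m\,(V_m + b)},
    \qquad
    a = 0.42748\,\frac{R^{2} T_c^{2.5}}{P_c},
    \qquad
    b = 0.08664\,\frac{R T_c}{P_c}
    ```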

  6. Rural Employment, Migration, and Economic Development: Theoretical Issues and Empirical Evidence from Africa. Africa Rural Employment Paper No. 1.

    ERIC Educational Resources Information Center

    Byerlee, Derek; Eicher, Carl K.

    Employment problems in Africa were examined with special emphasis on rural employment and migration within the context of overall economic development. A framework was provided for analyzing rural employment in development; that framework was used to analyze empirical information from Africa; and theoretical issues were raised in analyzing rural…

  7. Calculation of theoretical and empirical nutrient N critical loads in the mixed conifer ecosystems of southern California.

    PubMed

    Breiner, Joan; Gimeno, Benjamin S; Fenn, Mark

    2007-01-01

    Edaphic, foliar, and hydrologic forest nutrient status indicators from 15 mixed conifer forest stands in the Sierra Nevada, San Gabriel Mountains, and San Bernardino National Forest were used to estimate empirical or theoretical critical loads (CL) for nitrogen (N) as a nutrient. Soil acidification response to N deposition was also evaluated. Robust empirical relationships were found relating N deposition to plant N uptake (N in foliage), N fertility (litter C/N ratio), and soil acidification. However, no consistent empirical CL were obtained when the thresholds for parameters indicative of N excess from other types of ecosystems were used. Similarly, the highest theoretical CL for nutrient N calculated using the simple mass balance steady state model (estimates ranging from 1.4 to 8.8 kg N/ha/year) was approximately a factor of two lower than the empirical observations. Further research is needed to derive the thresholds for indicators associated with the impairment of these mixed conifer forests exposed to chronic N deposition within a Mediterranean climate. Further development or parameterization of models for the calculation of theoretical critical loads suitable for these ecosystems will also be an important aspect of future critical loads research. PMID:17450298

  8. Theoretical and Empirical Comparisons between Two Models for Continuous Item Responses.

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2002-01-01

    Analyzed the relations between two continuous response models intended for typical response items: the linear congeneric model and Samejima's continuous response model (CRM). Illustrated the relations described using an empirical example and assessed the relations through a simulation study. (SLD)

  9. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support. PMID:25778674

  10. A Model of Resource Allocation in Public School Districts: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Chambers, Jay G.

    This paper formulates a comprehensive model of resource allocation in a local public school district. The theoretical framework specified could be applied equally well to any number of local public social service agencies. Section 1 develops the theoretical model describing the process of resource allocation. This involves the determination of the…

  11. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings. PMID:26980128

  12. Culminating Experience Empirical and Theoretical Research Projects, University of Tennessee at Chattanooga, Spring, 2005

    ERIC Educational Resources Information Center

    Watson, Sandy White, Ed.

    2005-01-01

    This document represents a sample collection of master's theses from the University of Tennessee at Chattanooga's Teacher Education Program, spring semester, 2005. The majority of these student researchers were simultaneously student teaching while writing their theses. Studies were empirical and conceptual in nature and demonstrate some ways in…

  13. Why Do People Need Self-Esteem? A Theoretical and Empirical Review

    ERIC Educational Resources Information Center

    Pyszczynski, Tom; Greenberg, Jeff; Solomon, Sheldon; Arndt, Jamie; Schimel, Jeff

    2004-01-01

    Terror management theory (TMT; J. Greenberg, T. Pyszczynski, & S. Solomon, 1986) posits that people are motivated to pursue positive self-evaluations because self-esteem provides a buffer against the omnipresent potential for anxiety engendered by the uniquely human awareness of mortality. Empirical evidence relevant to the theory is reviewed…

  14. Development of an Axiomatic Theory of Organization/Environment Interaction: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Ganey, Rodney F.

    The goal of this paper was to develop a theory of organization/environment interaction by examining the impact of perceived environmental uncertainty on organizational processes and on organizational goal attainment. It examines theories from the organizational environment literature and derives corollaries that are empirically tested using a data…

  15. Empirical social-ecological system analysis: from theoretical framework to latent variable structural equation model.

    PubMed

    Asah, Stanley Tanyi

    2008-12-01

    The social-ecological system (SES) approach to natural resource management holds enormous promise towards achieving sustainability. Despite this promise, social-ecological interactions are complex and elusive; they require simplification to guide effective application of the SES approach. The complex, adaptive and place-specific nature of human-environment interactions impedes determination of state and trends in SES parameters of interest to managers and policy makers. Based on a rigorously developed systemic theoretical model, this paper integrates field observations, interviews, surveys, and latent variable modeling to illustrate the development of simplified and easily interpretable indicators of the state of, and trends in, relevant SES processes. Social-agricultural interactions in the Logone floodplain, in the Lake Chad basin, served as case study. This approach is found to generate simplified determinants of the state of SESs, easily communicable across the array of stakeholders common in human-environment interactions. The approach proves to be useful for monitoring SESs, guiding interventions, and assessing the effectiveness of interventions. It incorporates real time responses to biophysical change in understanding coarse scale processes within which finer scales are embedded. This paper emphasizes the importance of merging quantitative and qualitative methods for effective monitoring and assessment of SESs. PMID:18773239

  16. Image-Based Empirical Modeling of the Plasmasphere

    NASA Technical Reports Server (NTRS)

    Adrian, Mark L.; Gallagher, D. L.

    2008-01-01

    A new suite of empirical models of plasmaspheric plasma based on remote, global images from the IMAGE EUV instrument is proposed for development. The purpose of these empirical models is to establish the statistical properties of the plasmasphere as a function of conditions. This suite of models will mark the first time the plasmaspheric plume is included in an empirical model. Development of these empirical plasmaspheric models will support synoptic studies (such as for wave propagation and growth, energetic particle loss through collisions and dust transport as influenced by charging) and serves as a benchmark against which physical models can be tested. The ability to know that a specific global density distribution occurs in response to specific magnetospheric and solar wind factors is a huge advantage over all previous in-situ based empirical models. The consequence of creating these new plasmaspheric models will be to provide much higher fidelity and much richer quantitative descriptions of the statistical properties of plasmaspheric plasma in the inner magnetosphere, whether that plasma is in the main body of the plasmasphere, nearby during recovery or in the plasmaspheric plume. Model products to be presented include statistical probabilities for being in the plasmasphere, near thermal He+ density boundaries and the complexity of its spatial structure.

  17. Public Disaster Communication and Child and Family Disaster Mental Health: a Review of Theoretical Frameworks and Empirical Evidence.

    PubMed

    Houston, J Brian; First, Jennifer; Spialek, Matthew L; Sorenson, Mary E; Koch, Megan

    2016-06-01

    Children have been identified as particularly vulnerable to psychological and behavioral difficulties following disaster. Public child and family disaster communication is one public health tool that can be utilized to promote coping/resilience and ameliorate maladaptive child reactions following an event. We conducted a review of the public disaster communication literature and identified three main functions of child and family disaster communication: fostering preparedness, providing psychoeducation, and conducting outreach. Our review also indicates that schools are a promising system for child and family disaster communication. We complete our review with three conclusions. First, theoretically, there appears to be a great opportunity for public disaster communication focused on child disaster reactions. Second, empirical research assessing the effects of public child and family disaster communication is essentially nonexistent. Third, despite the lack of empirical evidence in this area, there is opportunity for public child and family disaster communication efforts that address new domains. PMID:27086315

  18. Responses to Commentaries on Advances in Empirically Based Assessment.

    ERIC Educational Resources Information Center

    McConaughy, Stephanie H.

    1993-01-01

    Author of article (this issue) describing research program to advance assessment of children's behavioral and emotional problems; presenting conceptual framework for multiaxial empirically based assessment; and summarizing research efforts to develop cross-informant scales for scoring parent, teacher, and self-reports responds to commentaries on…

  19. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  20. Theoretical NMR correlations based Structure Discussion

    PubMed Central

    2011-01-01

    The constitutional assignment of natural products by NMR spectroscopy is usually based on 2D NMR experiments like COSY, HSQC, and HMBC. The actual difficulty of the structure elucidation problem depends more on the type of the investigated molecule than on its size. As soon as HMBC data are involved or a large number of heteroatoms is present, multiple solutions may fit the same data set. Structure elucidation software can be used to find such alternative constitutional assignments and help in the discussion in order to find the correct solution. But this is rarely done. This article describes the use of theoretical NMR correlation data in the structure elucidation process with WEBCOCON, not for the initial constitutional assignments, but to define how well a suggested molecule could have been described by NMR correlation data. The results of this analysis can be used to decide on further steps needed to assure the correctness of the structural assignment. As a first step, the analysis of the deviation of carbon chemical shifts is performed, comparing chemical shifts predicted for each possible solution with the experimental data. The application of this technique to three well-known compounds is shown. Using NMR correlation data alone for the description of the constitutions is not always enough, even when including 13C chemical shift prediction. PMID:21797997

  1. Denoising ECG signal based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Zhi-dong, Zhao; Liu, Juan; Wang, Sheng-tao

    2011-10-01

    The electrocardiogram (ECG) has been used extensively for the detection of heart disease. Frequently the signal is corrupted by various kinds of noise, such as muscle noise, electromyogram (EMG) interference, and instrument noise. In this paper, a new ECG denoising method is proposed based on the recently developed ensemble empirical mode decomposition (EEMD). The noisy ECG signal is decomposed into a series of intrinsic mode functions (IMFs). The statistically significant information content is reconstructed using an empirical energy model of the IMFs. Noisy ECG signals collected from clinical recordings are processed using the method. The results show that, in contrast with traditional methods, the new denoising method can achieve optimal denoising of the ECG signal.
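    A rough sketch of this kind of pipeline, under explicit assumptions: the decomposition is delegated to the third-party PyEMD package (an assumed dependency, not something named in the record), and the IMF-selection rule is a simplified energy threshold rather than the paper's empirical energy model.

    ```python
    import numpy as np
    from PyEMD import EEMD   # assumed dependency (PyPI package "EMD-signal")

    # Sketch of EEMD-based denoising: decompose, drop noise-dominated IMFs, rebuild.
    # The selection rule below (first-IMF energy as a white-noise reference with a
    # geometric decay across modes) is a simplification for illustration only.

    def eemd_denoise(x, trials=100, decay=2.0):
        imfs = EEMD(trials=trials).eemd(x)            # shape (n_imfs, len(x))
        energies = np.mean(imfs ** 2, axis=1)         # mean power per IMF
        expected_noise = energies[0] / decay ** np.arange(len(imfs))
        keep = energies > expected_noise              # modes carrying more than noise
        keep[0] = False                               # first IMF treated as pure noise
        return imfs[keep].sum(axis=0)

    # usage with a synthetic stand-in for a clinical ECG record
    fs = 360.0
    t = np.arange(0, 10, 1 / fs)
    clean = np.exp(-((t % 0.8) - 0.4) ** 2 / 0.001)   # crude periodic QRS-like pulses
    noisy = clean + 0.1 * np.random.randn(t.size)
    denoised = eemd_denoise(noisy)
    print("input SNR :", 10 * np.log10(np.var(clean) / np.var(noisy - clean)))
    print("output SNR:", 10 * np.log10(np.var(clean) / np.var(denoised - clean)))
    ```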

  2. Mechanisms of risk and resilience in military families: theoretical and empirical basis of a family-focused resilience enhancement program.

    PubMed

    Saltzman, William R; Lester, Patricia; Beardslee, William R; Layne, Christopher M; Woodward, Kirsten; Nash, William P

    2011-09-01

    Recent studies have confirmed that repeated wartime deployment of a parent exacts a toll on military children and families and that the quality and functionality of familial relations is linked to force preservation and readiness. As a result, family-centered care has increasingly become a priority across the military health system. FOCUS (Families OverComing Under Stress), a family-centered, resilience-enhancing program developed by a team at UCLA and Harvard Schools of Medicine, is a primary initiative in this movement. In a large-scale implementation project initiated by the Bureau of Navy Medicine, FOCUS has been delivered to thousands of Navy, Marine, Navy Special Warfare, Army, and Air Force families since 2008. This article describes the theoretical and empirical foundation and rationale for FOCUS, which is rooted in a broad conception of family resilience. We review the literature on family resilience, noting that an important next step in building a clinically useful theory of family resilience is to move beyond developing broad "shopping lists" of risk indicators by proposing specific mechanisms of risk and resilience. Based on the literature, we propose five primary risk mechanisms for military families and common negative "chain reaction" pathways through which they undermine the resilience of families contending with wartime deployments and parental injury. In addition, we propose specific mechanisms that mobilize and enhance resilience in military families and that comprise central features of the FOCUS Program. We describe these resilience-enhancing mechanisms in detail, followed by a discussion of the ways in which evaluation data from the program's first 2 years of operation supports the proposed model and the specified mechanisms of action. PMID:21655938

  3. Ecological risk and resilience perspective: a theoretical framework supporting evidence-based practice in schools.

    PubMed

    Powers, Joelle D

    2010-10-01

    Multidisciplinary school practitioners are clearly being called to use evidence-based practices from reputable sources such as their own professional organizations and federal agencies. In spite of this encouragement, most schools are not regularly employing empirically supported interventions. This paper further promotes the use of this approach by describing the theoretical support for evidence-based practice in schools. The ecological risk and resilience theoretical framework presented fills a gap in the literature and advocates for evidence-based practice in schools by illustrating how it can assist practitioners such as school social workers to better address problems associated with school failure. PMID:21082473

  4. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    ERIC Educational Resources Information Center

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  5. Corrective Feedback in L2 Writing: Theoretical Perspectives, Empirical Insights, and Future Directions

    ERIC Educational Resources Information Center

    Van Beuningen, Catherine

    2010-01-01

    The role of (written) corrective feedback (CF) in the process of acquiring a second language (L2) has been an issue of considerable controversy among theorists and researchers alike. Although CF is a widely applied pedagogical tool and its use finds support in SLA theory, practical and theoretical objections to its usefulness have been raised…

  6. Predicting Child Abuse Potential: An Empirical Investigation of Two Theoretical Frameworks

    ERIC Educational Resources Information Center

    Begle, Angela Moreland; Dumas, Jean E.; Hanson, Rochelle F.

    2010-01-01

    This study investigated two theoretical risk models predicting child maltreatment potential: (a) Belsky's (1993) developmental-ecological model and (b) the cumulative risk model in a sample of 610 caregivers (49% African American, 46% European American; 53% single) with a child between 3 and 6 years old. Results extend the literature by using a…

  7. Rural Schools, Social Capital and the Big Society: A Theoretical and Empirical Exposition

    ERIC Educational Resources Information Center

    Bagley, Carl; Hillyard, Sam

    2014-01-01

    The paper commences with a theoretical exposition of the current UK government's policy commitment to the idealised notion of the Big Society and the social capital currency underpinning its formation. The paper positions this debate in relation to the rural and adopts an ethnographically-informed methodological approach to provide an…

  8. Perceptual Organization in Schizophrenia Spectrum Disorders: Empirical Research and Theoretical Implications

    ERIC Educational Resources Information Center

    Uhlhaas, Peter J.; Silverstein, Steven M.

    2005-01-01

    The research into perceptual organization in schizophrenia spectrum disorders has found evidence for and against a perceptual organization deficit and has interpreted the data from within several different theoretical frameworks. A synthesis of this evidence, however, reveals that this body of work has produced reliable evidence for deficits in…

  9. Multiple Embedded Inequalities and Cultural Diversity in Educational Systems: A Theoretical and Empirical Exploration

    ERIC Educational Resources Information Center

    Verhoeven, Marie

    2011-01-01

    This article explores the social construction of cultural diversity in education, with a view to social justice. It examines how educational systems organize ethno-cultural difference and how this process contributes to inequalities. Theoretical resources are drawn from social philosophy as well as from recent developments in social organisation…

  10. Three essays on energy and environmental economics: Empirical, applied, and theoretical

    NASA Astrophysics Data System (ADS)

    Karney, Daniel Houghton

    Energy and environmental economics are closely related fields, as nearly all forms of energy production generate pollution and thus nearly all forms of environmental policy affect energy production and consumption. The three essays in this dissertation are related by their common themes of energy and environmental economics, but they differ in their methodologies. The first chapter is an empirical exercise that looks at the relationship between electricity price deregulation and maintenance outages at nuclear power plants. The second chapter is an applied theory paper that investigates environmental regulation in a multiple-pollutant setting. The third chapter develops a new methodology for the construction of analytical general equilibrium models that can be used to study topics in energy and environmental economics.

  11. Chronic Pain in a Couples Context: A Review and Integration of Theoretical Models and Empirical Evidence

    PubMed Central

    Leonard, Michelle T.; Cano, Annmarie; Johansen, Ayna B.

    2007-01-01

    Researchers have become increasingly interested in the social context of chronic pain conditions. The purpose of this article is to provide an integrated review of the evidence linking marital functioning with chronic pain outcomes including pain severity, physical disability, pain behaviors, and psychological distress. We first present an overview of existing models that identify an association between marital functioning and pain variables. We then review the empirical evidence for a relationship between pain variables and several marital functioning variables including marital satisfaction, spousal support, spouse responses to pain, and marital interaction. On the basis of the evidence, we present a working model of marital and pain variables, identify gaps in the literature, and offer recommendations for research and clinical work. Perspective The authors provide a comprehensive review of the relationships between marital functioning and chronic pain variables to advance future research and help treatment providers understand marital processes in chronic pain. PMID:16750794

  12. Colour in insect thermoregulation: empirical and theoretical tests in the colour-changing grasshopper, Kosciuscola tristis.

    PubMed

    Umbers, K D L; Herberstein, M E; Madin, J S

    2013-01-01

    Body colours can result in different internal body temperatures, but evidence for the biological significance of colour-induced temperature differences is inconsistent. We investigated the relationship between body colour and temperature in a model insect species that rapidly changes colour. We used an empirical approach and constructed a heat budget model to quantify whether a colour change from black to turquoise has a role in thermoregulation for the chameleon grasshopper (Kosciuscola tristis). Our study shows that colour change in K. tristis provides relatively small temperature differences that vary greatly with wind speed (0.55 °C at 1 m s(-1) to 0.05 °C at 10 m s(-1)). The biological significance of this difference is unclear and we discuss the requirement for more studies that directly test hypotheses regarding the fitness effects of colour in manipulating body temperature. PMID:23108152
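    As a schematic of why the colour effect shrinks with wind speed, consider a generic steady-state energy balance (an illustrative simplification that ignores longwave exchange, not the paper's full heat budget model): absorbed shortwave radiation is balanced by convective loss, so the colour-dependent temperature excess scales inversely with a convection coefficient that grows with wind speed.

    ```latex
    \alpha\, A_p\, S \;=\; h(u)\, A_c\,(T_b - T_a)
    \quad\Longrightarrow\quad
    \Delta T_b \;=\; \frac{(\alpha_{\mathrm{black}} - \alpha_{\mathrm{turquoise}})\, A_p\, S}{h(u)\, A_c},
    ```

    where \alpha is the cuticle's shortwave absorptivity, S the solar irradiance, A_p and A_c the projected and convective areas, T_b and T_a body and air temperature, and h(u) a convection coefficient increasing with wind speed u.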

  13. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    PubMed Central

    Reiber, Karin

    2011-01-01

    The project „Evidence-based Nursing Education – Preparatory Stage“, funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach, which extends beyond the aims of this project, is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a…

  14. Guiding Empirical and Theoretical Explorations of Organic Matter Decay By Synthesizing Temperature Responses of Enzyme Kinetics, Microbes, and Isotope Fluxes

    NASA Astrophysics Data System (ADS)

    Billings, S. A.; Ballantyne, F.; Lehmeier, C.; Min, K.

    2014-12-01

    Soil organic matter (SOM) transformation rates generally increase with temperature, but whether this is realized depends on soil-specific features. To develop predictive models applicable to all soils, we must understand two key, ubiquitous features of SOM transformation: the temperature sensitivity of myriad enzyme-substrate combinations and temperature responses of microbial physiology and metabolism, in isolation from soil-specific conditions. Predicting temperature responses of production of CO2 vs. biomass is also difficult due to soil-specific features: we cannot know the identity of active microbes nor the substrates they employ. We highlight how recent empirical advances describing SOM decay can help develop theoretical tools relevant across diverse spatial and temporal scales. At a molecular level, temperature effects on purified enzyme kinetics reveal distinct temperature sensitivities of decay of diverse SOM substrates. Such data help quantify the influence of microbial adaptations and edaphic conditions on decay, have permitted computation of the relative availability of carbon (C) and nitrogen (N) liberated upon decay, and can be used with recent theoretical advances to predict changes in mass-specific respiration rates as microbes maintain biomass C:N with changing temperature. Enhancing system complexity, we can subject microbes to temperature changes while controlling growth rate and without altering substrate availability or identity of the active population, permitting calculation of variables typically inferred in soils: microbial C use efficiency (CUE) and isotopic discrimination during C transformations. Quantified declines in CUE with rising temperature are critical for constraining model CUE estimates, and known changes in δ13C of respired CO2 with temperature are useful for interpreting δ13C-CO2 at diverse scales. We suggest empirical studies important for advancing knowledge of how microbes respond to temperature, and ideas for theoretical…
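    Two of the quantities this abstract leans on have compact standard definitions, reproduced here for orientation (generic forms, not the authors' specific parameterization): the Arrhenius temperature dependence of a decay rate constant, and the carbon use efficiency of microbial growth.

    ```latex
    k(T) \;=\; A\, e^{-E_a/(R\,T)},
    \qquad
    \mathrm{CUE} \;=\; \frac{\mu_C}{\mu_C + R_C},
    ```

    where E_a is the activation energy of the enzyme-substrate reaction, and \mu_C and R_C are the carbon fluxes allocated to biomass growth and respiration; a CUE that declines with warming means R_C rises faster with temperature than \mu_C does.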

  15. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    ERIC Educational Resources Information Center

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  16. Advanced airfoil design empirically based transonic aircraft drag buildup technique

    NASA Technical Reports Server (NTRS)

    Morrison, W. D., Jr.

    1976-01-01

    To systematically investigate the potential of advanced airfoils in advanced preliminary design studies, empirical relationships were derived, based on available wind tunnel test data, through which total drag is determined recognizing all major aircraft geometric variables. This technique recognizes a single design lift coefficient and Mach number for each aircraft. Using this technique, drag polars are derived for all Mach numbers up to M_Design + 0.05 and for lift coefficients from C_L,Design - 0.40 to C_L,Design + 0.20.

  17. Theoretical performance assessment and empirical analysis of super-resolution under unknown affine sensor motion.

    PubMed

    Thelen, Brian J; Valenzuela, John R; LeBlanc, Joel W

    2016-04-01

    This paper deals with super-resolution (SR) processing and associated theoretical performance assessment for under-sampled video data collected from a moving imaging platform with unknown motion and assuming a relatively flat scene. This general scenario requires joint estimation of the high-resolution image and the parameters that determine a projective transform that relates the collected frames to one another. A quantitative assessment of the variance in the random error as achieved through a joint-estimation approach (e.g., SR image reconstruction and motion estimation) is carried out via the general framework of M-estimators and asymptotic statistics. This approach provides a performance measure on estimating the fine-resolution scene when there is a lack of perspective information and represents a significant advancement over previous work that considered only the more specific scenario of mis-registration. A succinct overview of the theoretical framework is presented along with some specific results on the approximate random error for the case of unknown translation and affine motions. A comparison is given between the approximated random error and that actually achieved by an M-estimator approach to the joint-estimation problem. These results provide insight on the reduction in SR reconstruction accuracy when jointly estimating unknown inter-frame affine motion. PMID:27140759
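
    To make the joint-estimation idea above concrete, here is a minimal numpy sketch of the objective an M-estimator for super-resolution minimizes over the scene and the motion parameters. It uses a simplified translation-only, integer-decimation forward model invented for illustration, not the paper's projective/affine formulation.

```python
import numpy as np

def forward(x_hr, shift, factor=2):
    """Hypothetical forward model: integer shift of the high-res scene, then decimation."""
    shifted = np.roll(x_hr, shift, axis=(0, 1))
    return shifted[::factor, ::factor]

def joint_objective(x_hr, shifts, frames, factor=2):
    """Sum of squared residuals over all low-resolution frames; joint estimation
    searches over both the high-resolution image and the motion parameters."""
    return sum(np.sum((y - forward(x_hr, s, factor)) ** 2)
               for y, s in zip(frames, shifts))

# Tiny synthetic example: true high-res scene and two shifted, noisy low-res frames
rng = np.random.default_rng(0)
truth = rng.random((16, 16))
true_shifts = [(0, 0), (1, 1)]
frames = [forward(truth, s) + 0.01 * rng.standard_normal((8, 8)) for s in true_shifts]

print(joint_objective(truth, true_shifts, frames))     # small: noise only
print(joint_objective(truth, [(0, 0), (3, 2)], frames))  # larger: wrong motion estimate
```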

  18. Why do people need self-esteem? A theoretical and empirical review.

    PubMed

    Pyszczynski, Tom; Greenberg, Jeff; Solomon, Sheldon; Arndt, Jamie; Schimel, Jeff

    2004-05-01

    Terror management theory (TMT; J. Greenberg, T. Pyszczynski, & S. Solomon, 1986) posits that people are motivated to pursue positive self-evaluations because self-esteem provides a buffer against the omnipresent potential for anxiety engendered by the uniquely human awareness of mortality. Empirical evidence relevant to the theory is reviewed showing that high levels of self-esteem reduce anxiety and anxiety-related defensive behavior, reminders of one's mortality increase self-esteem striving and defense of self-esteem against threats in a variety of domains, high levels of self-esteem eliminate the effect of reminders of mortality on both self-esteem striving and the accessibility of death-related thoughts, and convincing people of the existence of an afterlife eliminates the effect of mortality salience on self-esteem striving. TMT is compared with other explanations for why people need self-esteem, and a critique of the most prominent of these, sociometer theory, is provided. PMID:15122930

  19. Pharmaceuticals, political money, and public policy: a theoretical and empirical agenda.

    PubMed

    Jorgensen, Paul D

    2013-01-01

    Why, when confronted with policy alternatives that could improve patient care, public health, and the economy, does Congress neglect those goals and tailor legislation to suit the interests of pharmaceutical corporations? In brief, for generations, the pharmaceutical industry has convinced legislators to define policy problems in ways that protect its profit margin. It reinforces this framework by selectively providing information and by targeting campaign contributions to influential legislators and allies. In this way, the industry displaces the public's voice in developing pharmaceutical policy. Unless citizens mobilize to confront the political power of pharmaceutical firms, objectionable industry practices and public policy will not change. Yet we need to refine this analysis. I propose a research agenda to uncover pharmaceutical influence. It develops the theory of dependence corruption to explain how the pharmaceutical industry is able to deflect the broader interests of the general public. It includes empirical studies of lobbying and campaign finance to uncover the means drug firms use to: (1) shape the policy framework adopted and information used to analyze policy; (2) subsidize the work of political allies; and (3) influence congressional voting. PMID:24088146

  20. Predicting child abuse potential: an empirical investigation of two theoretical frameworks.

    PubMed

    Begle, Angela Moreland; Dumas, Jean E; Hanson, Rochelle F

    2010-01-01

    This study investigated two theoretical risk models predicting child maltreatment potential: (a) Belsky's (1993) developmental-ecological model and (b) the cumulative risk model in a sample of 610 caregivers (49% African American, 46% European American; 53% single) with a child between 3 and 6 years old. Results extend the literature by using a widely accepted and valid risk instrument rather than occurrence rates (e.g., reports to child protective services, observations). Results indicated Belsky's developmental-ecological model, in which risk markers were organized into three separate conceptual domains, provided a poor fit to the data. In contrast, the cumulative risk model, which included the accumulation of risk markers, was significant in predicting child abuse potential. PMID:20390812

  1. Empirical, theoretical, and practical advantages of the HEXACO model of personality structure.

    PubMed

    Ashton, Michael C; Lee, Kibeom

    2007-05-01

    The authors argue that a new six-dimensional framework for personality structure--the HEXACO model--constitutes a viable alternative to the well-known Big Five or five-factor model. The new model is consistent with the cross-culturally replicated finding of a common six-dimensional structure containing the factors Honesty-Humility (H), Emotionality (E), Extraversion (X), Agreeableness (A), Conscientiousness (C), and Openness to Experience (O). Also, the HEXACO model predicts several personality phenomena that are not explained within the B5/FFM, including the relations of personality factors with theoretical biologists' constructs of reciprocal and kin altruism and the patterns of sex differences in personality traits. In addition, the HEXACO model accommodates several personality variables that are poorly assimilated within the B5/FFM. PMID:18453460

  2. UVCS Empirical Constraints on Theoretical Models of Solar Wind Source Regions

    NASA Astrophysics Data System (ADS)

    Kohl, J. L.; Cranmer, S. R.; Miralles, M. P.; Panasyuk, A.; Strachan, L.

    2007-12-01

    Spectroscopic observations from the Ultraviolet Coronagraph Spectrometer (UVCS) on the Solar and Heliospheric Observatory (SOHO) have resulted in empirical models of polar coronal holes, polar plumes, coronal jets and streamers. These findings have been used to make significant progress toward identifying and characterizing the physical processes that produce extended heating in the corona and accelerate fast and slow solar wind streams. The UVCS scientific observations, which began in April 1996 and continue at this writing, have provided determinations of proton and minor ion temperatures (including evidence for anisotropic microscopic velocity distributions in coronal holes and quiescent equatorial streamers), outflow velocities, and elemental abundances. The variations in these quantities over the solar cycle also have been determined. For example, observations of large polar coronal holes at different phases of the solar cycle indicate that line width is positively correlated with outflow speed and anti-correlated with electron density. This paper will review these results, and present new results from measurements taken as the current solar activity cycle approaches solar minimum. The results regarding preferential ion heating and acceleration of heavy ions (i.e., O5+) in polar coronal holes have contributed in a major way to the advances in understanding solar wind acceleration that have occurred during the past decade. It is important to verify and confirm the key features of these findings. Hence, the results from a new analysis of an expanded set of UVCS data from polar coronal holes at solar minimum by S. R. Cranmer, A. Panasyuk and J. L. Kohl will be presented. This work has been supported by the National Aeronautics and Space Administration (NASA) under Grants NNG06G188G and NNX07AL72G and NNX06AG95G to the Smithsonian Astrophysical Observatory.

  3. Lay attitudes toward deception in medicine: Theoretical considerations and empirical evidence

    PubMed Central

    Pugh, Jonathan; Kahane, Guy; Maslen, Hannah; Savulescu, Julian

    2016-01-01

    Background: There is a lack of empirical data on lay attitudes toward different sorts of deception in medicine. However, lay attitudes toward deception should be taken into account when we consider whether deception is ever permissible in a medical context. The objective of this study was to examine lay attitudes of U.S. citizens toward different sorts of deception across different medical contexts. Methods: A one-time online survey was administered to U.S. users of the Amazon “Mechanical Turk” website. Participants were asked to answer questions regarding a series of vignettes depicting different sorts of deception in medical care, as well as a question regarding their general attitudes toward truth-telling. Results: Of the 200 respondents, the majority found the use of placebos in different contexts to be acceptable following partial disclosure but found it to be unacceptable if it involved outright lying. Also, 55.5% of respondents supported the use of sham surgery in clinical research, although 55% claimed that it would be unacceptable to deceive patients in this research, even if this would improve the quality of the data from the study. Respondents supported fully informing patients about distressing medical information in different contexts, especially when the patient is suffering from a chronic condition. In addition, 42.5% of respondents believed that it is worse to deceive someone by providing the person with false information than it is to do so by giving the person true information that is likely to lead them to form a false belief, without telling them other important information that shows it to be false. However, 41.5% believed that the two methods of deception were morally equivalent. Conclusions: Respondents believed that some forms of deception were acceptable in some circumstances. While the majority of our respondents opposed outright lying in medical contexts, they were prepared to support partial disclosure and the use of

  4. Adult Coping with Childhood Sexual Abuse: A Theoretical and Empirical Review

    PubMed Central

    Walsh, Kate; Fortier, Michelle A.; DiLillo, David

    2009-01-01

    Coping has been suggested as an important element in understanding the long-term functioning of individuals with a history of child sexual abuse (CSA). The present review synthesizes the literature on coping with CSA, first by examining theories of coping with trauma, and, second by examining how these theories have been applied to studies of coping in samples of CSA victims. Thirty-nine studies were reviewed, including eleven descriptive studies of the coping strategies employed by individuals with a history of CSA, eighteen correlational studies of the relationship between coping strategies and long-term functioning of CSA victims, and ten investigations in which coping was examined as a mediational factor in relation to long-term outcomes. These studies provide initial information regarding early sexual abuse and subsequent coping processes. However, this literature is limited by several theoretical and methodological issues, including a failure to specify the process of coping as it occurs, a disparity between theory and research, and limited applicability to clinical practice. Future directions of research are discussed and include the need to understand coping as a process, identification of coping in relation to adaptive outcomes, and considerations of more complex mediational and moderational processes in the study of coping with CSA. PMID:20161502

  5. Swahili women since the nineteenth century: theoretical and empirical considerations on gender and identity construction.

    PubMed

    Gower, R; Salm, S; Falola, T

    1996-01-01

    This paper provides an analysis and update on the theoretical discussion about the link between gender and identity and uses a group of Swahili women in eastern Africa as an example of how this link works in practice. The first part of the study provides a brief overview of gender theory related to the terms "gender" and "identity." It is noted that gender is only one aspect of identity and that the concept of gender has undergone important changes such as the reconceptualization of the terms "sex" and "gender." The second part of the study synthesizes the experiences of Swahili women in the 19th century when the convergence of gender and class was very important. The status of Muslim women is reviewed, and it is noted that even influential women practiced purdah and that all Swahili women experienced discrimination, which inhibited their opportunities for socioeconomic mobility. Slavery and concubinage were widespread during this period, and the participation of Islamic women in spirit possession cults was a way for women to express themselves culturally. The separation of men and women in Swahili culture led to the development of two distinct subcultures, which excluded women from most aspects of public life. The third part of the study looks at the experiences of Swahili women since the 19th century both during and after the colonial period. It is shown that continuity exists in trends observed over a period of 200 years. For example, the mobility of Swahili women remains limited by Islam, but women do exert influence behind the scenes. It is concluded that the socioeconomic status of Swahili women has been shaped more by complex forces such as class, ethnicity, religion, and geographic area than by the oppression of Islam and colonialism. This study indicates that gender cannot be studied in isolation from other salient variables affecting identity. PMID:12292423

  6. Innovation in Information Technology: Theoretical and Empirical Study in SMQR Section of Export Import in Automotive Industry

    NASA Astrophysics Data System (ADS)

    Edi Nugroho Soebandrija, Khristian; Pratama, Yogi

    2014-03-01

    This paper aims to address innovation in information technology in both a theoretical and an empirical study. Specifically, both aspects relate to Shortage Mispacking Quality Report (SMQR) claims in export and import in the automotive industry. The paper discusses the major aspects of innovation, information technology, performance and competitive advantage. The empirical study of PT. Astra Honda Motor (AHM) refers to SMQR claims, communication systems, and systems analysis and design. Both the major aspects and the empirical study are briefly discussed in the Introduction, and a more detailed discussion of the related aspects is conducted in the other sections of this paper, in particular in the Literature Review, in terms of classical and updated references of current research. The increase in SMQR claims and the communication problems at PT. Astra Daihatsu Motor (PT. ADM), which still uses email, lengthen the claim settlement time and ultimately lead to the rejection of SMQR claims by the supplier. To address this problem, an integrated communication system was designed to manage the SMQR claim communication process between PT. ADM and its suppliers. The system that was analyzed and designed is expected to facilitate the claim communication process so that it can be run in accordance with the procedure, fulfil the target claim settlement time, and eliminate the difficulties and problems of the previous manual, email-based communication system. The system was designed using the system development life cycle approach of Kendall & Kendall (2006), covering the SMQR problem communication process, the judgment process by the supplier, the claim process, the claim payment process and the claim monitoring process. After the appropriate system designs for managing SMQR claims were obtained, the system was implemented and the improvement in claim communication

  7. Alignment of Standards and Assessment: A Theoretical and Empirical Study of Methods for Alignment

    ERIC Educational Resources Information Center

    Nasstrom, Gunilla; Henriksson, Widar

    2008-01-01

    Introduction: In a standards-based school-system alignment of policy documents with standards and assessment is important. To be able to evaluate whether schools and students have reached the standards, the assessment should focus on the standards. Different models and methods can be used for measuring alignment, i.e. the correspondence between…

  8. Theoretical and Empirical Underpinnings of the What Works Clearinghouse Attrition Standard for Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Deke, John; Chiang, Hanley

    2014-01-01

    Meeting the What Works Clearinghouse (WWC) attrition standard (or one of the attrition standards based on the WWC standard) is now an important consideration for researchers conducting studies that could potentially be reviewed by the WWC (or other evidence reviews). Understanding the basis of this standard is valuable for anyone seeking to meet…

  9. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  10. An empirically based electrosource horizon lead-acid battery model

    SciTech Connect

    Moore, S.; Eshani, M.

    1996-09-01

    An empirically based mathematical model of a lead-acid battery for use in the Texas A&M University Electrically Peaking Hybrid (ELPH) computer simulation is presented. The battery model is intended to overcome intuitive difficulties with currently available models by employing direct relationships between state-of-charge, voltage, and power demand. The model input is the power demand or load. Model outputs include voltage, an instantaneous battery efficiency coefficient and a state-of-charge indicator. A time- and current-dependent voltage hysteresis is employed to ensure correct voltage tracking under the highly transient conditions inherent to a hybrid electric drivetrain.
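
    Since the abstract does not give the empirical relationships themselves, the following Python sketch only illustrates the structure such a model could take: power demand in, voltage/efficiency/state-of-charge out. All numeric relationships (open-circuit voltage curve, internal resistance, capacity) are hypothetical placeholders rather than the authors' fitted data, and the hysteresis term is omitted.

```python
import numpy as np

class EmpiricalLeadAcidModel:
    """Structural sketch of a power-demand-driven battery model (placeholder numbers)."""

    def __init__(self, capacity_wh=1200.0, r_internal=0.05):
        self.capacity_wh = capacity_wh   # hypothetical usable energy
        self.r_internal = r_internal     # hypothetical internal resistance (ohm)
        self.soc = 1.0                   # state of charge, 1.0 = full

    def open_circuit_voltage(self, soc):
        # Placeholder empirical OCV(SOC) relationship (roughly linear for lead-acid)
        return 11.8 + 1.0 * soc

    def step(self, power_demand_w, dt_s):
        """Advance one time step; positive power = discharge."""
        v_oc = self.open_circuit_voltage(self.soc)
        # Terminal current from P = V*I with V = Voc - I*R (quadratic, physical root)
        disc = v_oc ** 2 - 4.0 * self.r_internal * power_demand_w
        i = (v_oc - np.sqrt(max(disc, 0.0))) / (2.0 * self.r_internal)
        v_term = v_oc - i * self.r_internal
        efficiency = v_term / v_oc if power_demand_w >= 0 else v_oc / v_term
        self.soc -= (power_demand_w * dt_s / 3600.0) / self.capacity_wh
        self.soc = min(max(self.soc, 0.0), 1.0)
        return v_term, efficiency, self.soc

batt = EmpiricalLeadAcidModel()
print(batt.step(power_demand_w=500.0, dt_s=1.0))  # (voltage, efficiency, SOC)
```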

  11. Coping, acculturation, and psychological adaptation among migrants: a theoretical and empirical review and synthesis of the literature

    PubMed Central

    Kuo, Ben C.H.

    2014-01-01

    Given the continuous, dynamic demographic changes internationally due to intensive worldwide migration and globalization, the need to more fully understand how migrants adapt and cope with acculturation experiences in their new host cultural environment is imperative and timely. However, a comprehensive review of what we currently know about the relationship between coping behavior and acculturation experience for individuals undergoing cultural changes has not yet been undertaken. Hence, the current article aims to compile, review, and examine cumulative cross-cultural psychological research that sheds light on the relationships among coping, acculturation, and psychological and mental health outcomes for migrants. To this end, this present article reviews prevailing literature pertaining to: (a) the stress and coping conceptual perspective of acculturation; (b) four theoretical models of coping, acculturation and cultural adaptation; (c) differential coping pattern among diverse acculturating migrant groups; and (d) the relationship between coping variabilities and acculturation levels among migrants. In terms of theoretical understanding, this review points to the relative strengths and limitations associated with each of the four theoretical models on coping-acculturation-adaptation. These theories and the empirical studies reviewed in this article further highlight the central role of coping behaviors/strategies in the acculturation process and outcome for migrants and ethnic populations, both conceptually and functionally. Moreover, the review shows that across studies culturally preferred coping patterns exist among acculturating migrants and migrant groups and vary with migrants' acculturation levels. Implications and limitations of the existing literature for coping, acculturation, and psychological adaptation research are discussed and recommendations for future research are put forth. PMID:25750766

  12. Dialectical behavior therapy for borderline personality disorder: theoretical and empirical foundations.

    PubMed

    Shearin, E N; Linehan, M M

    1994-01-01

    Dialectical behavior therapy (DBT) is a cognitive-behavioral psychotherapy developed by Linehan for parasuicidal patients with a diagnosis of borderline personality disorder (BPD). DBT is based on a biosocial theory that views BPD as primarily a dysfunction of the emotion regulation system. The treatment is organized around a hierarchy of behavioral goals that vary in different modes of therapy. In two randomized trials, DBT has shown superiority in reducing parasuicide, medical risk of parasuicides, number of hospital days, dropout from treatment and anger while improving social adjustment. Most gains were maintained through a 1-year follow-up. In one process study testing DBT theory, dialectical techniques balancing acceptance and change were more effective than pure change or acceptance techniques in reducing suicidal behavior. PMID:8010153

  13. The Ease of Language Understanding (ELU) model: theoretical, empirical, and clinical advances

    PubMed Central

    Rönnberg, Jerker; Lunner, Thomas; Zekveld, Adriana; Sörqvist, Patrik; Danielsson, Henrik; Lyxell, Björn; Dahlström, Örjan; Signoret, Carine; Stenfelt, Stefan; Pichora-Fuller, M. Kathleen; Rudner, Mary

    2013-01-01

    Working memory is important for online language processing during conversation. We use it to maintain relevant information, to inhibit or ignore irrelevant information, and to attend to conversation selectively. Working memory helps us to keep track of and actively participate in conversation, including taking turns and following the gist. This paper examines the Ease of Language Understanding model (i.e., the ELU model, Rönnberg, 2003; Rönnberg et al., 2008) in light of new behavioral and neural findings concerning the role of working memory capacity (WMC) in uni-modal and bimodal language processing. The new ELU model is a meaning prediction system that depends on phonological and semantic interactions in rapid implicit and slower explicit processing mechanisms that both depend on WMC albeit in different ways. It is based on findings that address the relationship between WMC and (a) early attention processes in listening to speech, (b) signal processing in hearing aids and its effects on short-term memory, (c) inhibition of speech maskers and its effect on episodic long-term memory, (d) the effects of hearing impairment on episodic and semantic long-term memory, and finally, (e) listening effort. New predictions and clinical implications are outlined. Comparisons with other WMC and speech perception models are made. PMID:23874273

  14. Size-dependent standard deviation for growth rates: empirical results and theoretical modeling.

    PubMed

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H Eugene; Grosse, I

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation sigma(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation sigma(R) on the average value of the wages with a scaling exponent beta approximately 0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation sigma(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of sigma(R) on the average payroll with a scaling exponent beta approximately -0.08 . Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable. PMID:18643131

  15. Size-dependent standard deviation for growth rates: Empirical results and theoretical modeling

    NASA Astrophysics Data System (ADS)

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H. Eugene; Grosse, I.

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation σ(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation σ(R) on the average value of the wages with a scaling exponent β≈0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation σ(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of σ(R) on the average payroll with a scaling exponent β≈-0.08 . Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.
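
    The power-law dependence σ(R) ∝ S^(-β) described above can be estimated from size and growth-rate data by a log-log regression. The short Python sketch below demonstrates the procedure on synthetic data; the exponent and the sample are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "economic units": sizes spanning several orders of magnitude
sizes = 10 ** rng.uniform(2, 8, size=5000)
beta_true = 0.14

# Growth rates whose standard deviation shrinks with size as sigma ~ S^(-beta)
growth_rates = rng.laplace(scale=sizes ** (-beta_true))

# Bin by size and compute the standard deviation of growth rates in each bin
bins = np.logspace(2, 8, 13)
idx = np.digitize(sizes, bins)
mean_size = np.array([sizes[idx == k].mean() for k in range(1, len(bins)) if np.any(idx == k)])
sigma_r = np.array([growth_rates[idx == k].std() for k in range(1, len(bins)) if np.any(idx == k)])

# Scaling exponent from a straight-line fit in log-log coordinates
slope, intercept = np.polyfit(np.log10(mean_size), np.log10(sigma_r), 1)
print(f"estimated beta = {-slope:.3f} (true value {beta_true})")
```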

  16. Periodic limb movements of sleep: empirical and theoretical evidence supporting objective at-home monitoring

    PubMed Central

    Moro, Marilyn; Goparaju, Balaji; Castillo, Jelina; Alameddine, Yvonne; Bianchi, Matt T

    2016-01-01

    Introduction Periodic limb movements of sleep (PLMS) may increase cardiovascular and cerebrovascular morbidity. However, most people with PLMS are either asymptomatic or have nonspecific symptoms. Therefore, predicting elevated PLMS in the absence of restless legs syndrome remains an important clinical challenge. Methods We undertook a retrospective analysis of demographic data, subjective symptoms, and objective polysomnography (PSG) findings in a clinical cohort with or without obstructive sleep apnea (OSA) from our laboratory (n=443 with OSA, n=209 without OSA). Correlation analysis and regression modeling were performed to determine predictors of periodic limb movement index (PLMI). Markov decision analysis with TreeAge software compared strategies to detect PLMS: in-laboratory PSG, at-home testing, and a clinical prediction tool based on the regression analysis. Results Elevated PLMI values (>15 per hour) were observed in >25% of patients. PLMI values in No-OSA patients correlated with age, sex, self-reported nocturnal leg jerks, restless legs syndrome symptoms, and hypertension. In OSA patients, PLMI correlated only with age and self-reported psychiatric medications. Regression models indicated only a modest predictive value of demographics, symptoms, and clinical history. Decision modeling suggests that at-home testing is favored as the pretest probability of PLMS increases, given plausible assumptions regarding PLMS morbidity, costs, and assumed benefits of pharmacological therapy. Conclusion Although elevated PLMI values were commonly observed, routinely acquired clinical information had only weak predictive utility. As the clinical importance of elevated PLMI continues to evolve, it is likely that objective measures such as PSG or at-home PLMS monitors will prove increasingly important for clinical and research endeavors. PMID:27540316
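
    The decision-analytic finding quoted above (at-home testing favored as the pretest probability rises) can be illustrated with a toy expected-cost threshold calculation. All costs, probabilities and test characteristics below are hypothetical placeholders, not the values used in the authors' TreeAge model.

```python
import numpy as np

# Hypothetical costs (arbitrary units) and test characteristics -- placeholders only
COST_HOME_TEST = 250.0       # at-home limb-movement monitor
COST_MISSED_PLMS = 800.0     # assumed downstream cost of an undetected elevated PLMI
COST_FALSE_POSITIVE = 300.0  # assumed cost of unnecessary follow-up
SENS_HOME, SPEC_HOME = 0.85, 0.80

def expected_cost_no_test(p):
    """Rely on clinical prediction alone: every elevated PLMI goes undetected."""
    return p * COST_MISSED_PLMS

def expected_cost_home(p):
    """At-home objective testing with imperfect sensitivity/specificity."""
    missed = p * (1 - SENS_HOME) * COST_MISSED_PLMS
    false_pos = (1 - p) * (1 - SPEC_HOME) * COST_FALSE_POSITIVE
    return COST_HOME_TEST + missed + false_pos

for p in np.linspace(0.1, 0.9, 9):
    better = "at-home test" if expected_cost_home(p) < expected_cost_no_test(p) else "no objective test"
    print(f"pretest probability {p:.1f}: favored strategy -> {better}")
```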

  17. Comparison between empirical and physically based models of atmospheric correction

    NASA Astrophysics Data System (ADS)

    Mandanici, E.; Franci, F.; Bitelli, G.; Agapiou, A.; Alexakis, D.; Hadjimitsis, D. G.

    2015-06-01

    A number of methods have been proposed for the atmospheric correction of multispectral satellite images, based either on atmosphere modelling or on the images themselves. Full radiative transfer models require a great deal of ancillary information about the atmospheric conditions at the acquisition time, whereas image-based methods cannot account for all the phenomena involved. Therefore, the aim of this paper is the comparison of different atmospheric correction methods for multispectral satellite images. The experimentation was carried out on a study area located in the catchment of the Yialias river, 20 km south of Nicosia, the Cyprus capital. The following models, both empirical and physically based, were tested: dark object subtraction, QUAC, empirical line, 6SV, and FLAASH. They were applied to a Landsat 8 multispectral image. The spectral signatures of ten different land cover types were measured during a field campaign in 2013, and 15 samples were collected for laboratory measurements in a second campaign in 2014. A GER 1500 spectroradiometer was used; this instrument records electromagnetic radiation from 350 up to 1050 nm over 512 different channels, each covering about 1.5 nm. The measured spectral signatures were used to simulate the reflectance values for the multispectral sensor bands by applying relative spectral response filters. These data were taken as ground truth to assess the accuracy of the different image correction models. The results do not allow us to establish which method is the most accurate: the physics-based methods describe the shape of the signatures better, whereas the image-based models perform better regarding the overall albedo.
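
    Of the methods compared above, dark object subtraction (DOS) is the simplest image-based correction. The Python sketch below shows a minimal per-band DOS on an arbitrary array; the percentile choice and the synthetic scene are illustrative assumptions, not the paper's processing chain.

```python
import numpy as np

def dark_object_subtraction(image, percentile=0.1):
    """Per-band dark object subtraction.

    image: float array of shape (bands, rows, cols) holding radiance or DN values.
    The darkest pixels in each band are assumed to owe their signal entirely to
    atmospheric path radiance, which is subtracted from the whole band.
    """
    corrected = np.empty_like(image)
    for b in range(image.shape[0]):
        dark_value = np.percentile(image[b], percentile)
        corrected[b] = np.clip(image[b] - dark_value, 0.0, None)
    return corrected

# Toy example: 3 bands of synthetic radiance with an additive haze offset per band
rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 1.0, size=(3, 100, 100)) + np.array([0.20, 0.10, 0.05])[:, None, None]
corrected = dark_object_subtraction(scene)
print("band minima before:", scene.min(axis=(1, 2)))
print("band minima after: ", corrected.min(axis=(1, 2)))
```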

  18. Theoretical bases of radar (selected pages)

    NASA Astrophysics Data System (ADS)

    Shirman, Ya. D.; Golikov, V. N.; Busygin, I. N.; Kostin, G. A.; Manshos, V. N.

    1987-06-01

    A textbook is presented for radio engineering departments of institutions of higher education that prepare specialists in radar. Its special feature is the use of statistical methods of analysis as the single basis. The principles of construction and the theory of optimum detection equipment in the presence of interference are given; methods for obtaining radar information are examined, taking into account achievements in the optimum processing of serrated radar signals, the laws governing secondary radiation, and radio wave propagation. A large number of examples are given, permitting the reader to master the main questions of theory and its application more rapidly.

  19. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining the torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, which in this case is a ratchet wrench push and pull operation. What is presented here is a comparison of the computed or predicted results of the model with the actual measured values for the composite motion.
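
    The least-squares reduction described above, fitting single-variable second-degree polynomials for joint torque, can be illustrated with numpy; the sample torque and position values below are invented placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical isolated-joint measurements: elbow flexion torque (N*m) vs. angle (deg)
position_deg = np.array([0, 20, 40, 60, 80, 100, 120])
torque_nm    = np.array([28, 41, 52, 58, 55, 46, 30])

# Second-degree polynomial torque(position) by least squares, as in the table of equations
coeffs = np.polyfit(position_deg, torque_nm, deg=2)
torque_model = np.poly1d(coeffs)

print("fitted coefficients (a2, a1, a0):", np.round(coeffs, 4))
print("predicted torque at 70 deg:", round(float(torque_model(70.0)), 1), "N*m")

# The same reduction would be repeated for torque vs. velocity and for each joint and
# rotational plane, and the resulting polynomials combined to evaluate composite motions.
```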

  20. Developing an empirical base for clinical nurse specialist education.

    PubMed

    Stahl, Arleen M; Nardi, Deena; Lewandowski, Margaret A

    2008-01-01

    This article reports on the design of a clinical nurse specialist (CNS) education program using National Association of Clinical Nurse Specialists (NACNS) CNS competencies to guide CNS program clinical competency expectations and curriculum outcomes. The purpose is to contribute to the development of an empirical base for education and credentialing of CNSs. The NACNS CNS core competencies and practice competencies in all 3 spheres of influence guided the creation of clinical competency grids for this university's practicum courses. This project describes the development, testing, and application of these clinical competency grids that link the program's CNS clinical courses with the NACNS CNS competencies. These documents guide identification, tracking, measurement, and evaluation of the competencies throughout the clinical practice portion of the CNS program. This ongoing project will continue to provide data necessary to the benchmarking of CNS practice competencies, which is needed to evaluate the effectiveness of direct practice performance and the currency of graduate nursing education. PMID:18438164

  1. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  2. Open-circuit sensitivity model based on empirical parameters for a capacitive-type MEMS acoustic sensor

    NASA Astrophysics Data System (ADS)

    Lee, Jaewoo; Jeon, J. H.; Je, C. H.; Lee, S. Q.; Yang, W. S.; Lee, S.-G.

    2016-03-01

    An empirically based open-circuit sensitivity model for a capacitive-type MEMS acoustic sensor is presented. To evaluate the characteristics of the open-circuit sensitivity intuitively, the empirical model is proposed and analysed by using a lumped spring-mass model and a pad test sample without a parallel plate capacitor for the parasitic capacitance. The model is composed of three different parameter groups: empirical, theoretical, and mixed data. The empirical residual stress, extracted from the measured pull-in voltage of 16.7 V and the measured surface topology of the diaphragm, was +13 MPa, resulting in an effective spring constant of 110.9 N/m. The parasitic capacitance of the two probing pads, including the substrate part, was 0.25 pF. Furthermore, to verify the proposed model, the modelled open-circuit sensitivity was compared with the measured value. The MEMS acoustic sensor had an open-circuit sensitivity of -43.0 dBV/Pa at 1 kHz with a bias of 10 V, while the modelled open-circuit sensitivity was -42.9 dBV/Pa, showing good agreement in the range from 100 Hz to 18 kHz. This validates the empirically based open-circuit sensitivity model for designing capacitive-type MEMS acoustic sensors.
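
    To make the lumped-parameter reasoning concrete, here is a rough Python sketch of how an open-circuit sensitivity in dBV/Pa can be estimated from a parallel-plate lumped model. The gap and diaphragm area are invented placeholder values (only the spring constant, bias and parasitic capacitance come from the abstract), so this illustrates the model form rather than reproducing the paper's numbers.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

# Values reported in the abstract
K_EFF  = 110.9      # effective spring constant, N/m
V_BIAS = 10.0       # bias voltage, V
C_PARA = 0.25e-12   # parasitic capacitance, F

# Hypothetical geometry (NOT from the abstract) used only to complete the example
GAP    = 3.0e-6     # air gap, m
RADIUS = 0.4e-3     # diaphragm radius, m
AREA   = np.pi * RADIUS ** 2

def open_circuit_sensitivity_db(v_bias, k_eff, gap, area, c_para):
    """Small-signal sensitivity of a biased parallel-plate capacitive microphone."""
    c_active = EPS0 * area / gap                 # active sensing capacitance
    x_per_pa = area / k_eff                      # diaphragm displacement per unit pressure
    v_per_pa = v_bias * x_per_pa / gap           # open-circuit voltage per Pa (small signal)
    v_per_pa *= c_active / (c_active + c_para)   # attenuation by the parasitic capacitance
    return 20.0 * np.log10(v_per_pa)

print(f"estimated sensitivity: "
      f"{open_circuit_sensitivity_db(V_BIAS, K_EFF, GAP, AREA, C_PARA):.1f} dBV/Pa")
```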

  3. Noise cancellation in IR video based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2013-05-01

    Currently there is a huge demand for simple, low-cost IR cameras for both civil and military applications, among which one of the most common is the surveillance of restricted access zones. In the design of low-cost IR cameras it is necessary to avoid several elements present in more sophisticated cameras, such as the refrigeration systems and temperature control of the detectors, as well as a mechanical modulator of the incident radiation (chopper). Consequently, the detection algorithms must reliably separate the target signal from the high noise and drift caused by temporal variations of the background image of the scene and from the additional drift due to the thermal instability of the detectors. A very important step towards this goal is the design of a preprocessing stage to eliminate noise, and in this work we propose using the Empirical Mode Decomposition (EMD) method for this purpose. To evaluate the quality of the reconstructed clean signal, the Average-to-Peak Ratio is assessed to measure the effectiveness in reconstructing the waveform of the target signal. We compare the EMD method with a classical noise-cancellation method based on the Discrete Wavelet Transform (DWT). Simulation results show that the proposed EMD-based scheme performs better than the traditional one.
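
    A minimal sketch of EMD-based denoising on a one-dimensional signal is shown below, assuming the PyEMD package is available. The synthetic pixel signal, the choice to discard only the first IMF, and the crude figure of merit are all illustrative assumptions, not the authors' processing chain.

```python
import numpy as np
from PyEMD import EMD  # assumes the PyEMD package (pip install EMD-signal) is installed

# Synthetic stand-in for one pixel's temporal signal: slow background drift,
# a target-like pulse, and broadband noise (placeholder data, not real IR video)
t = np.linspace(0.0, 10.0, 2000)
drift = 0.5 * t + 2.0 * np.sin(0.2 * np.pi * t)
pulse = np.exp(-0.5 * ((t - 5.0) / 0.15) ** 2)
noisy = drift + pulse + 0.3 * np.random.default_rng(0).standard_normal(t.size)

# Empirical Mode Decomposition into intrinsic mode functions (IMFs)
imfs = EMD().emd(noisy, t)

# Simple denoising choice for illustration: discard the first (highest-frequency)
# IMF, which mostly carries noise, and rebuild the signal from the remaining modes
denoised = imfs[1:].sum(axis=0)

def average_to_peak_ratio(x):
    """Crude figure of merit in the spirit of the Average-to-Peak Ratio."""
    return np.mean(np.abs(x)) / np.max(np.abs(x))

print("IMFs extracted:", imfs.shape[0])
print("APR noisy   :", round(average_to_peak_ratio(noisy - drift), 3))
print("APR denoised:", round(average_to_peak_ratio(denoised - drift), 3))
```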

  4. Empirically and theoretically determined spatial and temporal variability of the Late Holocene sea level in the South-Central Pacific (Invited)

    NASA Astrophysics Data System (ADS)

    Eisenhauer, A.; Rashid, R. J.; Hallmann, N.; Stocchi, P.; Fietzke, J.; Camoin, G.; Vella, C.; Samankassou, E.

    2013-12-01

    We present U/Th-dated fossil corals which were collected from reef platforms on three islands (Moorea, Huahine and Bora Bora) of the Society Islands, French Polynesia. In particular, U/Th-dated fossil microatolls precisely constrain the timing and amplitude of sea-level variations at and after the 'Holocene Sea Level Maximum, HSLM' because microatolls grow close to or even directly at the current sea-level position. We found that sea level reached a subsidence-corrected position of at least ~1.5 m above present sea level (apsl) at ~5.4 ka before present (BP) relative to Huahine island and a maximum amplitude of at least ~2.0 m apsl at ~2.0 ka BP relative to Moorea. Between 5.4 and 2 ka, minimum sea level oscillated between 1.5 and 2 m for ~3 ka and then declined to the present position after ~2 ka BP. Based on statistical arguments on the coral age distribution, the HSLM is constrained to an interval of 3.5 ± 0.8 ka. Former studies, in general accord with our data, show that sea level in French Polynesia was ~1 m higher than present between 5,000 and 1,250 yrs BP and that a highstand was reached between 2,000 and 1,500 yrs BP (Pirazzoli and Montaggioni, 1988) and persisted until 1,200 yrs BP in the Tuamotu Archipelago (Pirazzoli and Montaggioni, 1986). Modeling of the Late Holocene sea-level rise performed during the course of this study, taking the glacio-isostatic and ocean syphoning effects into account, predicts a Late Holocene sea-level highstand of ~1 m apsl at ~4 ka BP for Bora Bora, which is in general agreement with the statistical interpretation of our empirical data. However, the modeled HSLM amplitude of ~1 m apsl is considerably smaller than that indicated by the empirical data, which suggest amplitudes of more than 2 m. Furthermore, the theoretical model predicts a continuously falling sea level after ~4 ka to the present. This is in contrast to the empirical data, which indicate a sea level remaining above at least ~1 m apsl between 5 ka and 2 ka then followed by a certain

  5. Painting by numbers: nanoparticle-based colorants in the post-empirical age.

    PubMed

    Klupp Taylor, Robin N; Seifrt, Frantisek; Zhuromskyy, Oleksandr; Peschel, Ulf; Leugering, Günter; Peukert, Wolfgang

    2011-06-17

    The visual appearance of the artificial world is largely governed by films or composites containing particles with at least one dimension smaller than a micron. Over the past century and a half, the optical properties of such materials have been scrutinized and a broad range of colorant products, based mostly on empirical microstructural improvements, developed. With the advent of advanced synthetic approaches capable of tailoring particle shape, size and composition on the nanoscale, the question of what is the optimum particle for a certain optical property can no longer be answered solely by experimentation. Instead, new and improved computational approaches are required to invert the structure-function relationship. This progress report reviews the development in our understanding of this relationship and indicates recent examples of how theoretical design is taking an ever increasingly important role in the search for enhanced or multifunctional colorants. PMID:21538592

  6. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies

    PubMed Central

    2016-01-01

    Background Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers’ activities and their contextual components or constructs to pursue these health related goals. Establishing such framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers’ health systematically. Objective The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? Methods A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. Results The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and

  7. Methods for combining a theoretical and an empirical approach in modelling pressure and flow control valves for CAE-programs for fluid power circuits

    NASA Astrophysics Data System (ADS)

    Handroos, Heikki

    An analytical mathematical model for a fluid power valve uses equations based on physical laws. Its parameters consist of physical coefficients, dimensions of the internal elements, spring constants, etc., which are not provided by the component manufacturers; the valve has to be dismantled in order to determine their values. In addition, the model corresponds only to a particular type of valve construction and contains a large number of parameters. This is a major common problem in computer aided engineering (CAE) programs for fluid power circuits. Methods for solving this problem by combining a theoretical and an empirical approach are presented. Analytical models for single-stage pressure and flow control valves are brought into forms that contain fewer parameters, whose values can be determined from measured characteristic curves. The least squares criterion is employed to identify the parameter values describing the steady state of a valve; the steady-state characteristic curves required for this identification are quite often provided by the manufacturers. The parameters describing the dynamics of a valve are determined by a simple noncomputational method based on dynamic characteristic curves that can be easily measured. The importance of the identification accuracy of the different parameters of the single-stage pressure relief valve model is compared using a parameter sensitivity analysis method, and a new comparison method called the relative mean value criterion is used to compare the influence of variations of the different parameters on a nominal dynamic response.
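
    As an illustration of the steady-state identification step described above, the following Python sketch fits a reduced orifice-type flow equation to a catalogue-style pressure/flow characteristic by least squares. The functional form and the data points are hypothetical stand-ins, since the paper's reduced model forms are not reproduced in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def flow_model(dp, k_v, dp_crack):
    """Reduced steady-state model: no flow below a cracking pressure, then an
    orifice-like square-root dependence on the excess pressure drop."""
    return k_v * np.sqrt(np.clip(dp - dp_crack, 0.0, None))

# Hypothetical measured characteristic curve: pressure drop (bar) vs flow (L/min)
dp_meas   = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 80.0, 100.0])
flow_meas = np.array([0.0,  3.1,  7.4, 12.9, 16.5, 19.4, 21.9])

params, _ = curve_fit(flow_model, dp_meas, flow_meas, p0=[2.0, 5.0])
k_v, dp_crack = params
print(f"identified flow coefficient k_v = {k_v:.2f} (L/min)/sqrt(bar)")
print(f"identified cracking pressure    = {dp_crack:.1f} bar")
```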

  8. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

    Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcome broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked. Summary The "empirical turn" in bioethics signals a need for

  9. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists services. PMID:22149903

  10. Theoretical bases for conducting certain technological processes in space

    NASA Technical Reports Server (NTRS)

    Okhotin, A. S.

    1979-01-01

    Dimensionless conservation equations are presented, together with the theoretical bases of fluid behavior aboard orbiting satellites, with application to the processes of manufacturing crystals in weightlessness. The small residual gravitational acceleration is shown to increase the separation of bands of varying concentration. Natural convection is shown to have no practical effect on crystallization from a liquid melt, and barodiffusion is also negligibly small under realistic conditions of weightlessness. The effects of surface tension become increasingly large, and suggestions are made for further research.
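
    The claim that natural convection is negligible in weightlessness can be illustrated by comparing a buoyancy-driven dimensionless group at terrestrial and residual microgravity accelerations. The melt properties below are rough, hypothetical values chosen only to show the scaling, not figures from the report.

```python
# Rough comparison of the Grashof number on Earth vs. in microgravity.
# Gr = g * beta * dT * L**3 / nu**2 scales linearly with g, so reducing g by ~1e-6
# suppresses buoyancy-driven (natural) convection by the same factor.

G_EARTH = 9.81            # m/s^2
G_MICRO = 1e-6 * G_EARTH  # typical residual acceleration on an orbiting platform

# Hypothetical melt properties and scales (illustrative only)
BETA = 1e-4     # thermal expansion coefficient, 1/K
DT   = 10.0     # characteristic temperature difference, K
L    = 0.02     # characteristic length of the melt zone, m
NU   = 3e-7     # kinematic viscosity, m^2/s

def grashof(g, beta=BETA, dt=DT, length=L, nu=NU):
    return g * beta * dt * length ** 3 / nu ** 2

print(f"Gr on Earth:        {grashof(G_EARTH):.2e}")
print(f"Gr in microgravity: {grashof(G_MICRO):.2e}")
print(f"ratio:              {grashof(G_MICRO) / grashof(G_EARTH):.1e}")
```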

  11. Segmented Labor Markets: A Review of the Theoretical and Empirical Literature and Its Implication for Educational Planning.

    ERIC Educational Resources Information Center

    Carnoy, Martin

    The study reviews orthodox theories of labor markets, presents new formulations of segmentation theory, and provides empirical tests of segmentation in the United States and several developing nations. Orthodox labor market theory views labor as being paid for its contribution to production and that investment in education and vocational training…

  12. E-learning in engineering education: a theoretical and empirical study of the Algerian higher education institution

    NASA Astrophysics Data System (ADS)

    Benchicou, Soraya; Aichouni, Mohamed; Nehari, Driss

    2010-06-01

    Technology-mediated education or e-learning is growing globally both in scale and delivery capacity due to the large diffusion of the ubiquitous information and communication technologies (ICT) in general and the web technologies in particular. This statement has not yet been fully supported by research, especially in developing countries such as Algeria. The purpose of this paper was to identify directions for addressing the needs of academics in higher education institutions in Algeria in order to adopt the e-learning approach as a strategy to improve the quality of education. The paper will report results of an empirical study that measures the readiness of the Algerian higher education institutions towards the implementation of ICT in the educational process and the attitudes of faculty members towards the application of the e-learning approach in engineering education. Three main objectives were targeted, namely: (a) to provide an initial evaluation of faculty members' attitudes and perceptions towards web-based education; (b) to report on their perceived requirements for implementing e-learning in university courses; (c) to provide initial input for a collaborative process of developing an institutional strategy for e-learning. Statistical analysis of the survey results indicates that the Algerian higher education institution, which adopted the Licence - Master and Doctorate educational system, is facing a big challenge to take advantage of emerging technological innovations and the advent of e-learning to further develop its teaching programmes and to enhance the quality of education in engineering fields. The successful implementation of this modern approach is shown to depend largely on a set of critical success factors that would include: 1. The extent to which the institution will adopt a formal and official e-learning strategy. 2. The extent to which faculty members will adhere to and adopt this strategy and develop ownership of the various measures in the

  13. A Physically Based Theoretical Model of Spore Deposition for Predicting Spread of Plant Diseases.

    PubMed

    Isard, Scott A; Chamecki, Marcelo

    2016-03-01

    A physically based theory for predicting spore deposition downwind from an area source of inoculum is presented. The modeling framework is based on theories of turbulence dispersion in the atmospheric boundary layer and applies only to spores that escape from plant canopies. A "disease resistance" coefficient is introduced to convert the theoretical spore deposition model into a simple tool for predicting disease spread at the field scale. Results from the model agree well with published measurements of Uromyces phaseoli spore deposition and measurements of wheat leaf rust disease severity. The theoretical model has the advantage over empirical models in that it can be used to assess the influence of source distribution and geometry, spore characteristics, and meteorological conditions on spore deposition and disease spread. The modeling framework is refined to predict the detailed two-dimensional spatial pattern of disease spread from an infection focus. Accounting for the time variations of wind speed and direction in the refined modeling procedure improves predictions, especially near the inoculum source, and enables application of the theoretical modeling framework to field experiment design. PMID:26595112
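
    Because the abstract describes the model only qualitatively, the sketch below uses a generic source-depletion form for deposition downwind of an area source and a "disease resistance" coefficient converting deposition to a severity index. Both the functional form and every parameter are illustrative assumptions, not the published model.

```python
import numpy as np

def deposition_flux(x, c0=100.0, u=3.0, h=10.0, v_d=0.02):
    """Generic source-depletion sketch: airborne spore concentration c0 (spores/m^3)
    leaving the source area is advected at wind speed u within a mixing depth h and
    is depleted exponentially by deposition at velocity v_d. Returns spores m^-2 s^-1."""
    conc = c0 * np.exp(-v_d * x / (u * h))
    return v_d * conc

def disease_severity(dep_flux, resistance_coeff=0.05):
    """Hypothetical 'disease resistance' coefficient mapping deposition to severity."""
    return resistance_coeff * dep_flux

for x in (10, 50, 100, 500, 1000):
    d = deposition_flux(x)
    print(f"{x:5d} m downwind: deposition {d:6.3f} spores m^-2 s^-1, "
          f"severity index {disease_severity(d):.3f}")
```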

  14. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is a high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should also be developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance in a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent ions and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory-scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal. PMID:27108213
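
    The abstract does not give the correlations themselves, so the sketch below only illustrates the usual dimensionless-number workflow: fitting an assumed power-law correlation of the form Sh = a·Re^b·Sc^c to laboratory data by log-linear least squares. The data points and the functional form are placeholders, not the authors' models.

        import numpy as np

        # Hypothetical laboratory observations: Reynolds, Schmidt, and Sherwood numbers.
        Re = np.array([120.0, 250.0, 480.0, 900.0, 1500.0])
        Sc = np.array([600.0, 600.0, 800.0, 800.0, 1000.0])
        Sh = np.array([14.0, 21.0, 33.0, 47.0, 68.0])

        # Fit Sh = a * Re**b * Sc**c  =>  log Sh = log a + b log Re + c log Sc
        X = np.column_stack([np.ones_like(Re), np.log(Re), np.log(Sc)])
        coef, *_ = np.linalg.lstsq(X, np.log(Sh), rcond=None)
        a, b, c = np.exp(coef[0]), coef[1], coef[2]
        print(f"Sh ~ {a:.3f} * Re^{b:.2f} * Sc^{c:.2f}")

        # The fitted correlation can then be used to predict mass-transfer behavior
        # (and hence ion removal) at other operating conditions within the calibrated range.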

  15. A theoretical and empirical investigation into the willingness-to-pay function for new innovative drugs by Germany's health technology assessment agency (IQWiG).

    PubMed

    Gandjour, Afschin

    2013-11-01

    Under the recently enacted pharmaceutical price and reimbursement regulation in Germany, new drugs are subject to a rapid assessment to determine whether there is sufficient evidence of added clinical benefits compared with the existing standard of treatment. If such added benefits are confirmed, manufacturers and representatives of the Statutory Health Insurance (SHI) are expected to negotiate an appropriate reimbursement price. If parties fail to reach an agreement, a final decision on the reimbursement price will be made by an arbitration body. If one of the parties involved so wishes, the Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, IQWiG) will be commissioned with a formal evaluation of costs and benefits of the product in question. IQWiG will make a recommendation for a reimbursement price based on the 'efficiency frontier' in a therapeutic area. The purpose of the assessments is to provide support for decision-making bodies that act on behalf of the SHI insurants. To determine the willingness to pay for new drugs, IQWiG uses the following decision rule: the incremental cost-effectiveness ratio of a new drug compared with the next effective intervention should not be higher than that of the next effective intervention compared with its comparator. The purpose of this paper was to investigate the theoretical and empirical relationship between the willingness to pay for drugs and their health benefits. The analysis shows that, across disease areas, IQWiG's willingness to pay has a curvilinear relationship with health benefits. Future research may address the validity of the willingness-to-pay function from the viewpoint of the individual SHI insurants. PMID:25595007
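
    The decision rule quoted above can be made concrete with a small calculation: the incremental cost-effectiveness ratio (ICER) of the new drug must not exceed the ICER of the preceding step on the efficiency frontier. The figures below are hypothetical and only illustrate the arithmetic of the rule.

        def icer(delta_cost, delta_benefit):
            """Incremental cost-effectiveness ratio (cost per unit of added benefit)."""
            return delta_cost / delta_benefit

        # Hypothetical efficiency-frontier data (euros and an arbitrary clinical-benefit
        # unit); these are not actual IQWiG figures.
        icer_existing = icer(delta_cost=4000.0, delta_benefit=0.50)  # next effective vs. its comparator
        icer_new      = icer(delta_cost=9000.0, delta_benefit=1.00)  # new drug vs. next effective

        # IQWiG's rule as described in the abstract: the new drug's ICER should not be
        # higher than that of the preceding step on the frontier.
        acceptable = icer_new <= icer_existing
        print(f"ICER existing = {icer_existing:.0f}, ICER new = {icer_new:.0f}, acceptable = {acceptable}")

        # Ceiling on the incremental cost implied by the rule for the new drug's benefit gain:
        max_delta_cost = icer_existing * 1.00
        print(f"Implied ceiling on incremental cost: {max_delta_cost:.0f} euros")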

  16. Theoretical geology

    NASA Astrophysics Data System (ADS)

    Mikeš, Daniel

    2010-05-01

    erroneous assumptions and do not solve the very fundamental issue that lies at the base of the problem. This problem is straightforward and obvious: a sedimentary system is inherently four-dimensional (3 spatial dimensions + 1 temporal dimension). Any method using an inferior number of dimensions is bound to fail to describe the evolution of a sedimentary system. It is indicative of the present-day geological world that such fundamental issues are overlooked; the only reason one can point to is the so-called "rationality" of today's society. Simple "common sense" leads us to the conclusion that in this case the empirical method is bound to fail and that the only method that can solve the problem is the theoretical approach. This reasoning is completely trivial for the traditional exact sciences, such as physics and mathematics, and for applied sciences such as engineering. However, it is not for geology, a science that was traditionally descriptive and jumped to empirical science, skipping the stage of theoretical science. I argue that this gap of theoretical geology is left open and needs to be filled. Every discipline in geology lacks a theoretical base. This base can only be filled by the theoretical/inductive approach and cannot be filled by the empirical/deductive approach. Once a critical mass of geologists realises this flaw in today's geology, we can start solving the fundamental problems in geology.

  17. Accuracy of Population Validity and Cross-Validity Estimation: An Empirical Comparison of Formula-Based, Traditional Empirical, and Equal Weights Procedures.

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Bilgic, Reyhan; Edwards, Jack E.; Fleer, Paul F.

    1999-01-01

    Performed an empirical Monte Carlo study using predictor and criterion data from 84,808 U.S. Air Force enlistees. Compared formula-based, traditional empirical, and equal-weights procedures. Discusses issues for basic research on validation and cross-validation. (SLD)

  18. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation.

    PubMed

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-01-01

    Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method for reconstructing prompt, informed strain/stress responses at the hot spots of structures based on strain measurements at remote locations. The structural responses measured from the usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Then, two numerical examples (a two-span beam and a 19,956-degree-of-freedom simplified airfoil) are used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam commonly used in air vehicles and spacecraft. The experiments collect the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions and a detailed error analysis is also provided. PMID:27537889
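
    A minimal sketch of the extrapolation step described above: modal components of the measured strain (obtained in the paper by empirical mode decomposition) are scaled by mode-shape ratios taken from a finite element model and superposed at the unmeasured hot spot. The mode-shape values and modal responses below are synthetic stand-ins, not the authors' data.

        import numpy as np

        # Assume an FE model provides modal strain values at the measured location
        # and at the critical (unmeasured) hot spot for the first few modes.
        phi_measured = np.array([1.00, 0.60, 0.25])   # modal strain at sensor location (hypothetical)
        phi_critical = np.array([1.80, -0.40, 0.10])  # modal strain at hot spot (hypothetical)

        # Modal responses at the sensor, e.g. obtained by empirical mode decomposition
        # of the measured strain history (here: synthetic sinusoids standing in for IMFs).
        t = np.linspace(0.0, 1.0, 1000)
        modal_resp = np.vstack([
            50e-6 * np.sin(2 * np.pi * 5 * t),    # mode 1 contribution at the sensor
            20e-6 * np.sin(2 * np.pi * 23 * t),   # mode 2
            5e-6 * np.sin(2 * np.pi * 61 * t),    # mode 3
        ])

        # Transformation: scale each modal component by the ratio of mode-shape
        # values, then superpose to reconstruct the hot-spot strain history.
        ratios = phi_critical / phi_measured
        strain_hot_spot = (ratios[:, None] * modal_resp).sum(axis=0)
        print(strain_hot_spot[:5])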

  19. The Importance of Emotion in Theories of Motivation: Empirical, Methodological, and Theoretical Considerations from a Goal Theory Perspective

    ERIC Educational Resources Information Center

    Turner, Julianne C.; Meyer, Debra K.; Schweinle, Amy

    2003-01-01

    Despite its importance to educational psychology, prominent theories of motivation have mostly ignored emotion. In this paper, we review theoretical conceptions of the relation between motivation and emotion and discuss the role of emotion in understanding student motivation in classrooms. We demonstrate that emotion is one of the best indicators…

  20. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448
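
    A minimal sketch of the likelihood view described above: the Poisson negative log-likelihood of a linear-nonlinear-Poisson model, here with an exponential nonlinearity assumed for simplicity (the MID estimator itself uses a non-parametric nonlinearity), fitted to synthetic data by plain gradient descent.

        import numpy as np

        def lnp_negloglik(w, X, y, dt=0.01):
            """Poisson negative log-likelihood of a linear-nonlinear-Poisson model.
            X: (T, d) stimulus matrix, y: (T,) spike counts, w: (d,) filter.
            An exponential nonlinearity is assumed here for illustration."""
            rate = np.exp(X @ w) * dt                  # conditional intensity per bin
            return np.sum(rate - y * np.log(rate + 1e-12))

        # Tiny synthetic example: recover a filter direction by crude gradient descent.
        rng = np.random.default_rng(0)
        T, d = 5000, 8
        X = rng.standard_normal((T, d))
        w_true = rng.standard_normal(d)
        y = rng.poisson(np.exp(X @ w_true * 0.3) * 0.01)

        w = np.zeros(d)
        for _ in range(200):                            # fixed-step gradient descent
            rate = np.exp(X @ w) * 0.01
            grad = X.T @ (rate - y)                     # gradient of the negative log-likelihood
            w -= 1e-4 * grad
        print("NLL:", lnp_negloglik(w, X, y))
        print("correlation with true filter:", np.corrcoef(w, w_true)[0, 1])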

  1. Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure

    PubMed Central

    Berisha, Visar; Wisler, Alan; Hero, Alfred O.; Spanias, Andreas

    2015-01-01

    Information divergence functions play a critical role in statistics and information theory. In this paper we show that a non-parametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between training and test distributions. We confirm the theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks. PMID:26807014

  2. Why resilience is unappealing to social science: Theoretical and empirical investigations of the scientific use of resilience.

    PubMed

    Olsson, Lennart; Jerneck, Anne; Thoren, Henrik; Persson, Johannes; O'Byrne, David

    2015-05-01

    Resilience is often promoted as a boundary concept to integrate the social and natural dimensions of sustainability. However, it is a troubled dialogue from which social scientists may feel detached. To explain this, we first scrutinize the meanings, attributes, and uses of resilience in ecology and elsewhere to construct a typology of definitions. Second, we analyze core concepts and principles in resilience theory that cause disciplinary tensions between the social and natural sciences (system ontology, system boundary, equilibria and thresholds, feedback mechanisms, self-organization, and function). Third, we provide empirical evidence of the asymmetry in the use of resilience theory in ecology and environmental sciences compared to five relevant social science disciplines. Fourth, we contrast the unification ambition in resilience theory with methodological pluralism. Throughout, we develop the argument that incommensurability and unification constrain the interdisciplinary dialogue, whereas pluralism drawing on core social scientific concepts would better facilitate integrated sustainability research. PMID:26601176

  3. Why resilience is unappealing to social science: Theoretical and empirical investigations of the scientific use of resilience

    PubMed Central

    Olsson, Lennart; Jerneck, Anne; Thoren, Henrik; Persson, Johannes; O’Byrne, David

    2015-01-01

    Resilience is often promoted as a boundary concept to integrate the social and natural dimensions of sustainability. However, it is a troubled dialogue from which social scientists may feel detached. To explain this, we first scrutinize the meanings, attributes, and uses of resilience in ecology and elsewhere to construct a typology of definitions. Second, we analyze core concepts and principles in resilience theory that cause disciplinary tensions between the social and natural sciences (system ontology, system boundary, equilibria and thresholds, feedback mechanisms, self-organization, and function). Third, we provide empirical evidence of the asymmetry in the use of resilience theory in ecology and environmental sciences compared to five relevant social science disciplines. Fourth, we contrast the unification ambition in resilience theory with methodological pluralism. Throughout, we develop the argument that incommensurability and unification constrain the interdisciplinary dialogue, whereas pluralism drawing on core social scientific concepts would better facilitate integrated sustainability research. PMID:26601176

  4. Empirical Analysis and Refinement of Expert System Knowledge Bases

    PubMed Central

    Weiss, Sholom M.; Politakis, Peter; Ginsberg, Allen

    1986-01-01

    Recent progress in knowledge base refinement for expert systems is reviewed. Knowledge base refinement is characterized by the constrained modification of rule-components in an existing knowledge base. The goals are to localize specific weaknesses in a knowledge base and to improve an expert system's performance. Systems that automate some aspects of knowledge base refinement can have a significant impact on the related problems of knowledge base acquisition, maintenance, verification, and learning from experience. The SEEK empirical analysis and refinement system is reviewed and its successor system, SEEK2, is introduced. Important areas for future research in knowledge base refinement are described.

  5. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    NASA Astrophysics Data System (ADS)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

    Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and characterization of laser irradiated tissue. However, an accurate and simple analytical equation for estimating diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling diffuse reflectance from tissue.
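
    A hedged sketch of the surface-fitting step: a lookup table of reflectance over absorption and reduced scattering coefficients (here a synthetic stand-in rather than Monte Carlo output) is fitted with an assumed low-order polynomial surface by least squares. The functional form and the printed error are illustrative only and do not reproduce the paper's sub-1% figure.

        import numpy as np

        # Hypothetical lookup table: diffuse reflectance R on a grid of absorption
        # (mu_a) and reduced scattering (mu_s') coefficients, in mm^-1.
        mu_a = np.linspace(0.01, 0.5, 20)
        mu_sp = np.linspace(0.5, 3.0, 20)
        A, S = np.meshgrid(mu_a, mu_sp)
        R_table = S / (S + 8.0 * A)      # stand-in surface; a real table comes from MC runs

        # Fit an assumed empirical form:
        # R ~ c0 + c1*mu_s' + c2*mu_a + c3*mu_s'*mu_a + c4*mu_a**2
        X = np.column_stack([np.ones(A.size), S.ravel(), A.ravel(),
                             (S * A).ravel(), (A**2).ravel()])
        coef, *_ = np.linalg.lstsq(X, R_table.ravel(), rcond=None)
        R_fit = (X @ coef).reshape(R_table.shape)

        rel_err = np.abs(R_fit - R_table) / R_table
        print("max relative error of the fitted surface: %.3f" % rel_err.max())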

  6. Fleet Fatality Risk and its Sensitivity to Vehicle Mass Change in Frontal Vehicle-to-Vehicle Crashes, Using a Combined Empirical and Theoretical Model.

    PubMed

    Shi, Yibing; Nusholtz, Guy S

    2015-11-01

    The objective of this study is to analytically model the fatality risk in frontal vehicle-to-vehicle crashes of the current vehicle fleet, and its sensitivity to vehicle mass change. A model is built upon an empirical risk ratio-mass ratio relationship from field data and a theoretical mass ratio-velocity change ratio relationship dictated by conservation of momentum. The fatality risk of each vehicle is averaged over the closing velocity distribution to arrive at the mean fatality risks. The risks of the two vehicles are summed and averaged over all possible crash partners to find the societal mean fatality risk associated with a subject vehicle of a given mass from a fleet specified by a mass distribution function. Based on risk exponent and mass distribution from a recent fleet, the subject vehicle mean fatality risk is shown to increase, while at the same time that for the partner vehicles decreases, as the mass of the subject vehicle decreases. The societal mean fatality risk, the sum of these, incurs a penalty with respect to a fleet with complete mass equality. This penalty reaches its minimum (~8% for the example fleet) for crashes with a subject vehicle whose mass is close to the fleet mean mass. The sensitivity, i.e., the rate of change of the societal mean fatality risk with respect to the mass of the subject vehicle is assessed. Results from two sets of fully regression-based analyses, Kahane (2012) and Van Auken and Zellner (2013), are approximately compared with the current result. The general magnitudes of the results are comparable, but differences exist at a more detailed level. The subject vehicle-oriented societal mean fatality risk is averaged over all possible subject vehicle masses of a given fleet to obtain the overall mean fatality risk of the fleet. It is found to increase approximately linearly at a rate of about 0.8% for each 100 lb decrease in mass of all vehicles in the fleet. PMID:26660748
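
    The momentum argument underlying the theoretical part of the model can be sketched directly: in a frontal crash in which both vehicles reach a common velocity, conservation of momentum gives dv1/dv2 = m2/m1, and each delta-v feeds a fatality-risk function. The power-law risk and all numbers below are hypothetical placeholders for the paper's empirical risk relationship.

        def delta_vs(m1, m2, closing_speed):
            """Velocity changes of two vehicles in a frontal crash that reach a common
            velocity (conservation of momentum): dv1/dv2 = m2/m1."""
            dv1 = closing_speed * m2 / (m1 + m2)
            dv2 = closing_speed * m1 / (m1 + m2)
            return dv1, dv2

        def fatality_risk(dv, k=1e-8, n=4.0):
            """Hypothetical power-law risk in delta-v, standing in for the empirical
            risk ratio-mass ratio relationship used in the paper."""
            return min(1.0, k * dv**n)

        m_subject, m_partner = 1200.0, 1600.0     # kg (illustrative)
        closing = 80.0                            # km/h (illustrative)
        dv_s, dv_p = delta_vs(m_subject, m_partner, closing)
        societal = fatality_risk(dv_s) + fatality_risk(dv_p)
        print(f"dv_subject={dv_s:.1f} km/h, dv_partner={dv_p:.1f} km/h, societal risk={societal:.4f}")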

  7. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    SciTech Connect

    Neymark, J.; Roberts, D.

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  8. Are prejudices against disabled persons determined by personality characteristics? Reviewing a theoretical approach on the basis of empirical research findings.

    PubMed

    Cloerkes, G

    1981-01-01

    Taking as a point of departure the results obtained from research on prejudice, many authors believe that the quality of attitudes toward disabled persons is influenced by the personality structure of the nondisabled. In order to verify this assumption, a secondary analysis of 67 empirical studies was undertaken. These studies referred to different personality variables such as authoritarianism, ethnocentrism, dogmatism, rigidity, intolerance of ambiguity, cognitive simplicity, anxiety, ego-weakness, self-concept, body-concept, aggressiveness, empathy, intelligence, etc. The results can be summarized as follows: statistical criteria show that single personality traits have relatively little influence on attitudes towards disabled persons. An adequate evaluation of the research findings is complicated by, at times, considerable methodological problems which arise when applying the proper test instruments to non-clinical populations. Marked correlations are to be found in particular in the case of authoritarianism, ethnocentrism, intolerance of ambiguity, anxiety, and ego-weakness. The intercorrelations between most of the personality variables are, however, rather high, so that a cumulation of "extreme" factors may, in fact, sometimes result in particularly unfavorable attitudes toward the disabled. Thus, personality-related research findings do provide certain valuable explanations. Special attention should be devoted to the multiple connections between personality structure and social structure. PMID:6452419

  9. The Demand for Cigarettes as Derived from the Demand for Weight Loss: A Theoretical and Empirical Investigation.

    PubMed

    Cawley, John; Dragone, Davide; Von Hinke Kessler Scholder, Stephanie

    2016-01-01

    This paper offers an economic model of smoking and body weight and provides new empirical evidence on the extent to which the demand for cigarettes is derived from the demand for weight loss. In the model, smoking causes weight loss in addition to having direct utility benefits and direct health consequences. It predicts that some individuals smoke for weight loss and that the practice is more common among those who consider themselves overweight and those who experience greater disutility from excess weight. We test these hypotheses using nationally representative data in which adolescents are directly asked whether they smoke to control their weight. We find that, among teenagers who smoke frequently, 46% of girls and 30% of boys are smoking in part to control their weight. As predicted by the model, this practice is significantly more common among those who describe themselves as too fat and among groups that tend to experience greater disutility from obesity. We conclude by discussing the implications of these findings for tax policy; specifically, the demand for cigarettes is less price elastic among those who smoke for weight loss, all else being equal. Public health efforts to reduce smoking initiation and encourage cessation may wish to design campaigns to alter the derived nature of cigarette demand, especially among adolescent girls. PMID:25346511

  10. Theoretical investigation of graphene-based photonic modulators

    PubMed Central

    Gosciniak, Jacek; Tan, Dawn T. H.

    2013-01-01

    Integration of electronics and photonics for future applications requires an efficient conversion of electrical to optical signals. The excellent electronic and photonic properties of graphene make it a suitable material for integrated systems with extremely wide operational bandwidth. In this paper, we analyze a novel modulator geometry based on the rib photonic waveguide configuration with double-layer graphene placed between the slab and the ridge. A theoretical analysis of the graphene-based electro-absorption modulator shows that 3 dB modulation with a ~600 nm-long waveguide is possible, resulting in an energy per bit below 1 fJ/bit. The optical bandwidth of such modulators exceeds 12 THz, with an operation speed ranging from 160 GHz to 850 GHz and limited only by graphene resistance. The performance of the modulators was evaluated using a figure of merit defined as the ratio between extinction ratio and insertion loss, which was found to exceed 220. PMID:23719514

  11. Theoretical detection ranges for acoustic based manatee avoidance technology.

    PubMed

    Phillips, Richard; Niezrecki, Christopher; Beusse, Diedrich O

    2006-07-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of watercraft collisions in Florida's coastal waterways. To reduce the number of collisions, warning systems based upon detecting manatee vocalizations have been proposed. One aspect of the feasibility of an acoustically based warning system relies upon the distance at which a manatee vocalization is detectable. Assuming a mixed spreading model, this paper presents a theoretical analysis of the system detection capabilities operating within various background and watercraft noise conditions. This study combines measured source levels of manatee vocalizations with the modeled acoustic properties of manatee habitats to develop a method for determining the detection range and hydrophone spacing requirements for acoustic-based manatee avoidance technologies. In quiet environments (background noise approximately 70 dB), it was estimated that manatee vocalizations are detectable at approximately 250 m, with a 6 dB detection threshold. In louder environments (background noise approximately 100 dB), the detection range drops to 2.5 m. In a habitat with 90 dB of background noise, a passing boat with a maximum noise floor of 120 dB would be the limiting factor when it is within approximately 100 m of a hydrophone. The detection range was also found to be strongly dependent on the manatee vocalization source level. PMID:16875213
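
    A sketch of the range calculation implied above, assuming a mixed-spreading transmission loss of 15*log10(r) and solving SL - TL(r) >= NL + DT for the range r. The source level used below is hypothetical, chosen only so that the output is consistent with the approximate 250 m and 2.5 m figures quoted in the abstract.

        def detection_range(source_level_db, noise_level_db, detection_threshold_db=6.0,
                            spreading_coeff=15.0):
            """Largest range r (metres, re 1 m) at which the received level
            SL - spreading_coeff*log10(r) still exceeds the noise level by the
            detection threshold. Mixed spreading (15 log r) is an assumption."""
            excess = source_level_db - noise_level_db - detection_threshold_db
            return 10.0 ** (excess / spreading_coeff)

        # Hypothetical levels (dB re 1 uPa @ 1 m for SL, dB re 1 uPa for NL):
        print(detection_range(source_level_db=112.0, noise_level_db=70.0))   # quiet habitat, ~250 m
        print(detection_range(source_level_db=112.0, noise_level_db=100.0))  # noisy habitat, ~2.5 m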

  12. Genetic load, inbreeding depression, and hybrid vigor covary with population size: An empirical evaluation of theoretical predictions.

    PubMed

    Lohr, Jennifer N; Haag, Christoph R

    2015-12-01

    Reduced population size is thought to have strong consequences for evolutionary processes as it enhances the strength of genetic drift. In its interaction with selection, this is predicted to increase the genetic load, reduce inbreeding depression, and increase hybrid vigor, and in turn affect phenotypic evolution. Several of these predictions have been tested, but comprehensive studies controlling for confounding factors are scarce. Here, we show that populations of Daphnia magna, which vary strongly in genetic diversity, also differ in genetic load, inbreeding depression, and hybrid vigor in a way that strongly supports theoretical predictions. Inbreeding depression is positively correlated with genetic diversity (a proxy for Ne), and genetic load and hybrid vigor are negatively correlated with genetic diversity. These patterns remain significant after accounting for potential confounding factors and indicate that, in small populations, a large proportion of the segregation load is converted into fixed load. Overall, the results suggest that the nature of genetic variation for fitness-related traits differs strongly between large and small populations. This has large consequences for evolutionary processes in natural populations, such as selection on dispersal, breeding systems, ageing, and local adaptation. PMID:26497949

  13. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  14. Empirically Based School Interventions Targeted at Academic and Mental Health Functioning

    ERIC Educational Resources Information Center

    Hoagwood, Kimberly E.; Olin, S. Serene; Kerker, Bonnie D.; Kratochwill, Thomas R.; Crowe, Maura; Saka, Noa

    2007-01-01

    This review examines empirically based studies of school-based mental health interventions. The review identified 64 out of more than 2,000 articles published between 1990 and 2006 that met methodologically rigorous criteria for inclusion. Of these 64 articles, only 24 examined both mental health "and" educational outcomes. The majority of…

  15. Empirical and theoretical dosimetry in support of whole body radio frequency (RF) exposure in seated human volunteers at 220 MHz.

    PubMed

    Allen, Stewart J; Adair, Eleanor R; Mylacraine, Kevin S; Hurt, William; Ziriax, John

    2005-09-01

    This study reports the dosimetry performed to support an experiment that measured physiological responses of seated volunteer human subjects exposed to 220 MHz fields. Exposures were performed in an anechoic chamber which was designed to provide uniform fields for frequencies of 100 MHz or greater. A vertical half-wave dipole with a 90-degree reflector was used to optimize the field at the subject's location. The vertically polarized E field was incident on the dorsal side of the phantoms and human volunteers. The dosimetry plan required measurement of stationary probe drift, field strengths as a function of distance, electric and magnetic field maps at 200, 225, and 250 cm from the dipole antenna, and specific absorption rate (SAR) measurements using a human phantom, as well as theoretical predictions of SAR with the finite difference time domain (FDTD) method. An NBS (National Bureau of Standards, now NIST, National Institute of Standards and Technology, Boulder, CO) 10 cm loop antenna was positioned 150 cm to the right, 100 cm above, and 60 cm behind the subject (toward the transmitting antenna) and was read prior to each subject's exposure and at 5 min intervals during all RF exposures. Transmitter stability was determined by measuring plate voltage, plate current, screen voltage, and grid voltage for the driver and final amplifiers before and at 5 min intervals throughout the RF exposures. These dosimetry measurements assured accurate and consistent exposures. FDTD calculations were used to determine the SAR distribution in a seated human subject. This study reports the necessary dosimetry to precisely control exposure levels for studies of the physiological consequences of human volunteer exposures to 220 MHz. PMID:15931686

  16. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1996-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique.

  17. A new entropy based on a group-theoretical structure

    NASA Astrophysics Data System (ADS)

    Curado, Evaldo M. F.; Tempesta, Piergiulio; Tsallis, Constantino

    2016-03-01

    A multi-parametric version of the nonadditive entropy S_q is introduced. This new entropic form, denoted by S_{a,b,r}, possesses many interesting statistical properties, and it reduces to the entropy S_q for b = 0, a = r := 1 - q (hence to the Boltzmann-Gibbs entropy S_BG for b = 0, a = r → 0). The construction of the entropy S_{a,b,r} is based on a general group-theoretical approach recently proposed by one of us, Tempesta (2016). Indeed, essentially all the properties of this new entropy are obtained as a consequence of the existence of a rational group law, which expresses the structure of S_{a,b,r} with respect to the composition of statistically independent subsystems. Depending on the choice of the parameters, the entropy S_{a,b,r} can be used to cover a wide range of physical situations, in which the measure of the accessible phase space increases, say, exponentially with the number of particles N of the system, or even stabilizes, as N increases, to a limiting value. This paves the way to the use of this entropy in contexts where the size of the phase space does not increase as fast as the number of its constituting particles (or subsystems) increases.
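
    For reference, the q-entropy that S_{a,b,r} generalizes has the standard form below, with the Boltzmann-Gibbs entropy recovered as q → 1; the explicit multi-parametric form of S_{a,b,r} is not given in the abstract and is therefore not reproduced here.

        S_q = k\,\frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1},
        \qquad
        \lim_{q \to 1} S_q = -k \sum_{i=1}^{W} p_i \ln p_i = S_{BG}.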

  18. Organizing the public health-clinical health interface: theoretical bases.

    PubMed

    St-Pierre, Michèle; Reinharz, Daniel; Gauthier, Jacques-Bernard

    2006-01-01

    This article addresses the issue of the interface between public health and clinical health within the context of the search for networking approaches geared to a more integrated delivery of health services. The articulation of an operative interface is complicated by the fact that the definition of networking modalities involves complex intra- and interdisciplinary and intra- and interorganizational systems across which a new transversal dynamics of intervention practices and exchanges between service structures must be established. A better understanding of the situation is reached by shedding light on the rationale underlying the organizational methods that form the bases of the interface between these two sectors of activity. The Quebec experience demonstrates that neither the structural-functionalist approach, which emphasizes remodelling establishment structures and functions as determinants of integration, nor the structural-constructivist approach, which prioritizes distinct fields of practice in public health and clinical health, adequately serves the purpose of networking and integration. Consequently, a theoretical reframing is imperative. In this regard, structuration theory, which fosters the simultaneous study of methods of inter-structure coordination and inter-actor cooperation, paves the way for a better understanding of the situation and, in turn, to the emergence of new integration possibilities. PMID:16645802

  19. Polymer electrolyte membrane fuel cell fault diagnosis based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Damour, Cédric; Benne, Michel; Grondin-Perez, Brigitte; Bessafi, Miloud; Hissel, Daniel; Chabriat, Jean-Pierre

    2015-12-01

    A diagnosis tool for water management is relevant for improving the reliability and lifetime of polymer electrolyte membrane fuel cells (PEMFCs). This paper presents a novel signal-based diagnosis approach, based on Empirical Mode Decomposition (EMD), dedicated to PEMFCs. EMD is an empirical, intuitive, direct and adaptive signal processing method, without pre-determined basis functions. The proposed diagnosis approach relies on the decomposition of the FC output voltage to detect and isolate flooding and drying faults. The low computational cost of EMD, the reduced number of required measurements, and the high accuracy of flooding and drying fault diagnosis make this approach a promising online diagnosis tool for managing PEMFC degraded modes.

  20. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
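
    A minimal sketch of the behavioral-recognition step: each candidate strategy is represented by a probabilistic finite automaton (assumed here to have been learned already from the training traces), an observed action sequence is scored by its log-likelihood under each automaton, and the best-scoring strategy is reported. The automata, states, and action names below are hypothetical.

        import math

        # Each strategy is modeled by a probabilistic finite automaton:
        # transitions[state][action] = (next_state, probability). Hypothetical PFAs,
        # assumed to have been learned from training traces of each strategy.
        strategy_pfas = {
            "wall_follow": {"s0": {"turn": ("s1", 0.7), "forward": ("s0", 0.3)},
                            "s1": {"turn": ("s1", 0.2), "forward": ("s0", 0.8)}},
            "random_walk": {"s0": {"turn": ("s0", 0.5), "forward": ("s0", 0.5)}},
        }

        def log_likelihood(pfa, actions, start="s0"):
            """Log-probability of an action sequence under a PFA (missing transitions
            get a vanishingly small probability)."""
            state, ll = start, 0.0
            for a in actions:
                nxt, p = pfa.get(state, {}).get(a, (state, 1e-12))
                ll += math.log(p)
                state = nxt
            return ll

        observed = ["forward", "turn", "forward", "forward", "turn", "forward"]
        scores = {name: log_likelihood(pfa, observed) for name, pfa in strategy_pfas.items()}
        print(max(scores, key=scores.get), scores)   # behavioral recognition: best-scoring strategy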

  1. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.

  2. Towards an Empirically Based Parametric Explosion Spectral Model

    SciTech Connect

    Ford, S R; Walter, W R; Ruppert, S; Matzel, E; Hauk, T; Gok, R

    2009-08-31

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never before been tested. The focus of our work is on the local and regional distances (< 2000 km) and phases (Pn, Pg, Sn, Lg) necessary to see small explosions. We are developing a parametric model of the nuclear explosion seismic source spectrum that is compatible with the earthquake-based geometrical spreading and attenuation models developed using the Magnitude Distance Amplitude Correction (MDAC) techniques (Walter and Taylor, 2002). The explosion parametric model will be particularly important in regions without any prior explosion data for calibration. The model is being developed using the available body of seismic data at local and regional distances for past nuclear explosions at foreign and domestic test sites. Parametric modeling is a simple and practical approach for widespread monitoring applications, prior to the capability to carry out fully deterministic modeling. The achievable goal of our parametric model development is to be able to predict observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing. The relationship between the parametric equations and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source.

  3. The Empirical Investigation of Perspective-Based Reading

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Green, Scott; Laitenberger, Oliver; Shull, Forrest; Sorumgard, Sivert; Zelkowitz, Marvin V.

    1995-01-01

    We consider reading techniques a fundamental means of achieving high quality software. Due to the lack of research in this area, we are experimenting with the application and comparison of various reading techniques. This paper deals with our experiences with Perspective-Based Reading (PBR), a particular reading technique for requirements documents. The goal of PBR is to provide operational scenarios where members of a review team read a document from a particular perspective (e.g., tester, developer, user). Our assumption is that the combination of different perspectives provides better coverage of the document than the same number of readers using their usual technique. To test the efficacy of PBR, we conducted two runs of a controlled experiment in the environment of the NASA GSFC Software Engineering Laboratory (SEL), using developers from the environment. The subjects read two types of documents, one generic in nature and the other from the NASA domain, using two reading techniques, PBR and their usual technique. The results from these experiments, as well as the experimental design, are presented and analyzed. When there is a statistically significant distinction, PBR performs better than the subjects' usual technique. However, PBR appears to be more effective on the generic documents than on the NASA documents.

  4. An Empirically Based Error-Model for Radar Rainfall Estimates

    NASA Astrophysics Data System (ADS)

    Ciach, G. J.

    2004-05-01

    Mathematical modeling of the way radar rainfall (RR) approximates the physical truth is a prospective method to quantify the RR uncertainties. In this approach one can represent RR in the form of an "observation equation," that is, as a function of the corresponding true rainfall and a random error process. The error process describes the cumulative effect of all the sources of RR uncertainties. We present the results of our work on the identification and estimation of this relationship. They are based on the Level II reflectivity data from the WSR-88D radar in Tulsa, Oklahoma, and rainfall measurements from 23 surrounding Oklahoma Mesonet raingauges. Accumulation intervals from one hour to one day were analyzed using this sample. The raingauge accumulations were used as an approximation of the true rainfall in this study. The RR error-model that we explored is factorized into a deterministic distortion, which is a function of the true rainfall, and a multiplicative random error factor that is a positively-defined random variable. The distribution of the error factor depends on the true rainfall; however, its expectation in this representation is always equal to one (all the biases are modeled by the deterministic component). With this constraint, the deterministic distortion function can be defined as the conditional mean of RR conditioned on the true rainfall. We use nonparametric regression to estimate the deterministic distortion, and the variance and quantiles of the random error factor, as functions of the true rainfall. The results show that the deterministic distortion is a nonlinear function of the true rainfall that indicates systematic overestimation of weak rainfall and underestimation of strong rainfall (conditional bias). The standard deviation of the error factor is a decreasing function of the true rainfall that ranges from about 0.8 for weak rainfall to about 0.3 for strong rainfall. For larger time-scales, both the deterministic distortion and the
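
    A sketch of the observation-equation structure described above, RR = h(r_true)·ε with E[ε] = 1: synthetic gauge and radar values are generated under an assumed distortion and error distribution, and h and the error-factor spread are recovered by a crude binned (nonparametric) estimate. All distributions and numbers are hypothetical, not the Tulsa WSR-88D results.

        import numpy as np

        rng = np.random.default_rng(1)
        true_rain = rng.gamma(shape=1.2, scale=4.0, size=20000)      # "true" (gauge) rainfall, mm

        # Synthetic radar rainfall following the observation equation RR = h(r_true) * eps,
        # with a distortion that overestimates weak and underestimates strong rain,
        # and a multiplicative error factor with unit mean (all choices hypothetical).
        h_true = lambda r: 1.6 * r**0.85
        eps = rng.lognormal(mean=-0.5 * 0.3**2, sigma=0.3, size=true_rain.size)  # E[eps] = 1
        radar_rain = h_true(true_rain) * eps

        # Crude nonparametric estimates of h(r) = E[RR | r] and of std(eps | r) by binning.
        bins = np.quantile(true_rain, np.linspace(0, 1, 21))
        idx = np.clip(np.digitize(true_rain, bins) - 1, 0, 19)
        for b in range(0, 20, 5):
            sel = idx == b
            h_hat = radar_rain[sel].mean()
            sd_eps = (radar_rain[sel] / h_hat).std()
            print(f"bin {b:2d}: mean true={true_rain[sel].mean():6.2f} mm, "
                  f"h_hat={h_hat:6.2f}, std(eps)~{sd_eps:.2f}")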

  5. Implementing community-based provider participation in research: an empirical study

    PubMed Central

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  6. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  7. Theoretical and empirical study of single-substance, upward two-phase flow in a constant-diameter adiabatic pipe

    SciTech Connect

    Laoulache, R.N.; Maeder, P.F.; DiPippo, R.

    1987-05-01

    A scheme is developed to describe the upward flow of a two-phase mixture of a single substance in a vertical adiabatic constant area pipe. The scheme is based on dividing the mixture into a homogeneous core surrounded by a liquid film. This core may be a mixture of bubbles in a contiguous liquid phase, or a mixture of droplets in a contiguous vapor phase. Emphasis is placed upon the latter case since the range of experimental measurements of pressure, temperature, and void fraction collected in this study falls in the "slug-churn"-"annular" flow regimes. The core is turbulent, whereas the liquid film may be laminar or turbulent. Turbulent stresses are modeled by using Prandtl's mixing-length theory. The working fluid is dichlorotetrafluoroethane, CClF2-CClF2, known as refrigerant 114 (R-114); the two-phase mixture is generated from the single-phase substance by the process of flashing. In this study, the effect of the Froude and Reynolds numbers on the liquid film characteristics is examined. The compressibility is accounted for through the acceleration pressure gradient of the core and not directly through the Mach number. An expression for an interfacial friction coefficient between the turbulent core and the liquid film is developed; it is similar to Darcy's friction coefficient for a single-phase flow in a rough pipe. Finally, an actual steam-water geothermal well is simulated; it is based on actual field data from New Zealand. A similarity theory is used to predict the steam-water mixture pressure and temperature starting with laboratory measurements on the flow of R-114.

  8. Theoretical and empirical study of single-substance, upward two-phase flow in a constant-diameter adiabatic pipe

    SciTech Connect

    Laoulache, R.N.; Maeder, P.F.; DiPippo, R.

    1987-05-01

    A scheme is developed to describe the upward flow of a two-phase mixture of a single substance in a vertical adiabatic constant area pipe. The scheme is based on dividing the mixture into a homogeneous core surrounded by a liquid film. This core may be a mixture of bubbles in a contiguous liquid phase, or a mixture of droplets in a contiguous vapor phase. The core is turbulent, whereas the liquid film may be laminar or turbulent. The working fluid is dichlorotetrafluoroethane, CClF2-CClF2, known as refrigerant 114 (R-114); the two-phase mixture is generated from the single phase substance by the process of flashing. In this study, the effect of the Froude and Reynolds numbers on the liquid film characteristics is examined. An expression for an interfacial friction coefficient between the turbulent core and the liquid film is developed; it is similar to Darcy's friction coefficient for a single phase flow in a rough pipe. Results indicate that for the range of Reynolds and Froude numbers considered, the liquid film is likely to be turbulent rather than laminar. The study also shows that two-dimensional effects are important, and the flow is never fully developed either in the film or the core. In addition, the new approach for the turbulent film is capable of predicting a local net flow rate that may be upward, downward, stationary, or stalled. An actual steam-water geothermal well is simulated. A similarity theory is used to predict the steam-water mixture pressure and temperature starting with laboratory measurements on the flow of R-114. Results indicate that the theory can be used to predict the pressure gradient in the two-phase region based on laboratory measurements.

  9. School-Based Management and Paradigm Shift in Education an Empirical Study

    ERIC Educational Resources Information Center

    Cheng, Yin Cheong; Mok, Magdalena Mo Ching

    2007-01-01

    Purpose: This paper aims to report empirical research investigating how school-based management (SBM) and paradigm shift (PS) in education are closely related to teachers' student-centered teaching and students' active learning in a sample of Hong Kong secondary schools. Design/methodology/approach: It is a cross-sectional survey research…

  10. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  11. An Empirically-Based Statewide System for Identifying Quality Pre-Kindergarten Programs

    ERIC Educational Resources Information Center

    Williams, Jeffrey M.; Landry, Susan H.; Anthony, Jason L.; Swank, Paul R.; Crawford, April D.

    2012-01-01

    This study presents an empirically-based statewide system that links information about pre-kindergarten programs with children's school readiness scores to certify pre-kindergarten classrooms as promoting school readiness. Over 8,000 children from 1,255 pre-kindergarten classrooms were followed longitudinally for one year. Pre-kindergarten quality…

  12. Development of an Empirically Based Questionnaire to Investigate Young Students' Ideas about Nature of Science

    ERIC Educational Resources Information Center

    Chen, Sufen; Chang, Wen-Hua; Lieu, Sang-Chong; Kao, Huey-Lien; Huang, Mao-Tsai; Lin, Shu-Fen

    2013-01-01

    This study developed an empirically based questionnaire to monitor young learners' conceptions of nature of science (NOS). The questionnaire, entitled Students' Ideas about Nature of Science (SINOS), measured views on theory-ladenness, use of creativity and imagination, tentativeness of scientific knowledge, durability of scientific knowledge,…

  13. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  14. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  15. Empirical vs. Expected IRT-Based Reliability Estimation in Computerized Multistage Testing (MST)

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Breithaupt, Krista; Tessema, Aster; Chuah, David

    2006-01-01

    Two IRT-based procedures to estimate test reliability for a certification exam that used both adaptive (via a MST model) and non-adaptive design were considered in this study. Both procedures rely on calibrated item parameters to estimate error variance. In terms of score variance, one procedure (Method 1) uses the empirical ability distribution…

  16. Satellite-based empirical models linking river plume dynamics with hypoxic area andvolume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  17. Use of an Empirically Based Marriage Education Program by Religious Organizations: Results of a Dissemination Trial

    ERIC Educational Resources Information Center

    Markman, Howard J.; Whitton, Sarah W.; Kline, Galena H.; Stanley, Scott M.; Thompson, Huette; St. Peters, Michelle; Leber, Douglas B.; Olmos-Gallo, P. Antonio; Prado, Lydia; Williams, Tamara; Gilbert, Katy; Tonelli, Laurie; Bobulinski, Michelle; Cordova, Allen

    2004-01-01

    We present an evaluation of the extent to which an empirically based couples' intervention program was successfully disseminated in the community. Clergy and lay leaders from 27 religious organizations who were trained to deliver the Prevention and Relationship Enhancement Program (PREP) were contacted approximately yearly for 5 years following…

  18. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  19. Sexual functioning and partner relationships in women with turner syndrome: some empirical data and theoretical considerations regarding sexual desire.

    PubMed

    Rolstad, Susanna Göthlin; Möller, Anders; Bryman, Inger; Boman, Ulla Wide

    2007-01-01

    The aim of this study was to describe marital status, sexual history, and sexual functioning in a group of women with Turner syndrome, and to compare the results with general Swedish population data. The sample consists of 57 women over 18 years of age. Data were collected from an interview, and using two self-report questionnaires: the McCoy Sexual Rating Scale and the Relationship Rating Scale (RS). Compared to population data, the women with Turner syndrome were less likely to have a partner and had had their sexual debut later. Single women differed more from the general population than did women with a partner, regarding sexual desire and sexual activity. Several women with a partner reported sexual problems, but unanimously reported being satisfied with their sex life and partner relationship. The level of sexual desire in women with Turner syndrome is discussed in relation to Levine's model of human sexual desire, where psychological and social motivational factors are considered in addition to a biologically based sexual drive (Levine, 1992). PMID:17454521

  20. Teaching the Rhythms of English: A New Theoretical Base.

    ERIC Educational Resources Information Center

    Faber, David

    1986-01-01

    Presents some reasons why more emphasis should be placed on the mastery of the rhythmic features of the target language in foreign language teaching. An account of an important recent theoretical contribution to the description of the principles underlying English speech rhythm is included. (SED)

  1. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
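
    A minimal sketch of the kind of non-linear diffusion filter studied here, assuming a Perona-Malik scheme with explicit time stepping and one common choice of diffusivity function; the parameter values are illustrative only.

```python
import numpy as np

def perona_malik(img, n_iter=50, kappa=20.0, dt=0.2):
    """Explicit-scheme Perona-Malik diffusion on a 2-D array."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # forward differences to the four neighbours (periodic borders via
        # np.roll keep the sketch short)
        dN = np.roll(u, -1, axis=0) - u
        dS = np.roll(u,  1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u,  1, axis=1) - u
        # exponential diffusivity g(s) = exp(-(s / kappa)^2) preserves edges
        cN = np.exp(-(dN / kappa) ** 2)
        cS = np.exp(-(dS / kappa) ** 2)
        cE = np.exp(-(dE / kappa) ** 2)
        cW = np.exp(-(dW / kappa) ** 2)
        u += dt * (cN * dN + cS * dS + cE * dE + cW * dW)
    return u
```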

  2. [Comorbidity of substance use and other psychiatric disorders--theoretical foundation and evidence based therapy].

    PubMed

    Gouzoulis-Mayfrank, E

    2008-05-01

    The coincidence of two or more psychiatric disorders in the same person (comorbidity or dual diagnosis) is no rare exception. It is rather common and therapeutically highly relevant. Comorbid patients exhibit frequently severe manifestations of the disorder(s) and they require intensive treatment to meet their special needs and the interdependencies of their disorders. The present overview deals with the theoretical foundations of comorbidity of substance use and other psychiatric disorders. We present data on the prevalence of different comorbidities and discuss the models, which have been proposed to explain how substance use and other disorders relate with each other. Furthermore, we describe the clinical characteristics and long-term course of comorbid patients, as well as some general therapeutic principles including the advantages of integrated therapeutic programmes. In addition, we carried out a systematic literature search on specific pharmaco- and psychotherapies for common comorbidities using the databases MEDLINE, EMBASE and PsycInfo (up to December 2007), and assessed the methodological quality of the identified trials. Based on this search we present the empirical evidence for the effectiveness of specific treatments and make therapeutic recommendations which are graded according to the strength of existing evidence. In conclusion, integrated treatment programs are more effective, provided they take into account the multiple deficits of comorbid patients, adjust and adapt the different therapeutic components to each other, and set realistic goals. The next step should be a broader application of integrated treatment programs and their adoption as standard treatment within the national health systems. PMID:18557218

  3. Theoretical magnetograms based on quantitative simulation of a magnetospheric substorm

    NASA Technical Reports Server (NTRS)

    Chen, C.-K.; Wolf, R. A.; Karty, J. L.; Harel, M.

    1982-01-01

    Substorm currents derived from the Rice University computer simulation of the September 19, 1976 substorm event are used to compute theoretical magnetograms as a function of universal time for various stations, integrating the Biot-Savart law over a maze of about 2700 wires and bands that carry the ring, Birkeland and horizontal ionospheric currents. A comparison of theoretical results with corresponding observations leads to a claim of general agreement, especially for stations at high and middle magnetic latitudes. Model results suggest that the ground magnetic field perturbations arise from complicated combinations of different kinds of currents, and that magnetic field disturbances due to different but related currents cancel each other out despite the inapplicability of Fukushima's (1973) theorem. It is also found that the dawn-dusk asymmetry in the horizontal magnetic field disturbance component at low latitudes is due to a net downward Birkeland current at noon, a net upward current at midnight, and, generally, antisunward-flowing electrojets.

  4. 78 FR 54464 - Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Premier Empire Energy, LLC; Supplemental Notice That Initial Market-Based... above-referenced proceeding, of Premier Empire Energy, LLC's application for market-based rate...

  5. Theoretical Investigations of Plasma-Based Accelerators and Other Advanced Accelerator Concepts

    SciTech Connect

    Shuets, G.

    2004-05-21

    Theoretical investigations of plasma-based accelerators and other advanced accelerator concepts. The focus of the work was on the development of plasma-based and structure-based accelerating concepts, including laser-plasma, plasma channel, and microwave-driven plasma accelerators.

  6. Theoretical magnetograms based on quantitative simulation of a magnetospheric substorm

    SciTech Connect

    Chen, C.; Wolf, R.A.; Harel, M.; Karty, J.L.

    1982-08-01

    Using substorm currents derived from the Rice computer simulation of the substorm event of September 19, 1976, we have computed theoretical magnetograms as a function of universal time for various stations. A theoretical Dst has also been computed. Our computed magnetograms were obtained by integrating the Biot-Savart law over a maze of approximately 2700 wires and bands that carry the ring currents, the Birkeland currents, and the horizontal ionospheric currents. Ground currents and dynamo currents were neglected. Computed contributions to the magnetic field perturbation from eleven different kinds of currents are displayed (e.g., ring currents, northern hemisphere Birkeland currents). First, overall agreement of theory and data is generally satisfactory, especially for stations at high and mid-magnetic latitudes. Second, model results suggest that the ground magnetic field perturbations arise from very complicated combinations of different kinds of currents and that the magnetic field disturbances due to different but related currents often cancel each other, despite the fact that complicated inhomogeneous conductivities in our model prevent rigorous application of Fukushima's theorem. Third, both the theoretical and observed Dst decrease during the expansion phase of the substorm, but data indicate that Dst relaxes back toward its initial value within about an hour after the peak of the substorm. Fourth, the dawn-dusk asymmetry in the horizontal component of magnetic field disturbance at low latitudes in a substorm is essentially due to a net downward Birkeland current at noon, net upward current at midnight, and generally antisunward flowing electrojets; it is not due to a physical partial ring current injected into the duskside of the inner magnetosphere.
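
    The core numerical step described here, integrating the Biot-Savart law over a collection of current-carrying wire segments, can be sketched as follows. The segment geometry and currents are placeholders; this is not the Rice simulation's current system.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def biot_savart(obs, starts, ends, currents):
    """Magnetic field (tesla) at point `obs` from straight wire segments.

    starts, ends : (N, 3) arrays of segment end points (m)
    currents     : (N,) array of segment currents (A)
    Each segment is treated as a single dl element at its midpoint, which is
    adequate when segments are short compared with the distance to `obs`.
    """
    dl = ends - starts                       # (N, 3) segment vectors
    mid = 0.5 * (starts + ends)              # (N, 3) midpoints
    r = obs - mid                            # (N, 3) separation vectors
    r_norm = np.linalg.norm(r, axis=1, keepdims=True)
    dB = MU0 / (4 * np.pi) * currents[:, None] * np.cross(dl, r) / r_norm**3
    return dB.sum(axis=0)
```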

  7. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice

    PubMed Central

    Shean, Glenn D.

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for changes needed to foster implementation are discussed. Limitations of PORT therapy listings that are based on therapy outcome efficacy studies are discussed, and cross-cultural and course and outcome studies of correlates of recovery are summarized. PMID:23738068

  8. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice.

    PubMed

    Shean, Glenn D

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for changes needed to foster implementation are discussed. Limitations of PORT therapy listings that are based on therapy outcome efficacy studies are discussed, and cross-cultural and course and outcome studies of correlates of recovery are summarized. PMID:23738068

  9. Effectiveness of a Theoretically-Based Judgment and Decision Making Intervention for Adolescents

    PubMed Central

    Knight, Danica K.; Dansereau, Donald F.; Becan, Jennifer E.; Rowan, Grace A.; Flynn, Patrick M.

    2014-01-01

    Although adolescents demonstrate capacity for rational decision making, their tendency to be impulsive, place emphasis on peers, and ignore potential consequences of their actions often translates into higher risk-taking including drug use, illegal activity, and physical harm. Problems with judgment and decision making contribute to risky behavior and are core issues for youth in treatment. Based on theoretical and empirical advances in cognitive science, the Treatment Readiness and Induction Program (TRIP) represents a curriculum-based decision making intervention that can be easily inserted into a variety of content-oriented modalities as well as administered as a separate therapeutic course. The current study examined the effectiveness of TRIP for promoting better judgment among 519 adolescents (37% female; primarily Hispanic and Caucasian) in residential substance abuse treatment. Change over time in decision making and premeditation (i.e., thinking before acting) was compared among youth receiving standard operating practice (n = 281) versus those receiving standard practice plus TRIP (n = 238). Change in TRIP-specific content knowledge was examined among clients receiving TRIP. Premeditation improved among youth in both groups; TRIP clients showed greater improvement in decision making. TRIP clients also reported significant increases over time in self-awareness, positive-focused thinking (e.g., positive self-talk, goal setting), and recognition of the negative effects of drug use. While both genders showed significant improvement, males showed greater gains in metacognitive strategies (i.e., awareness of one’s own cognitive process) and recognition of the negative effects of drug use. These results suggest that efforts to teach core thinking strategies and apply/practice them through independent intervention modules may benefit adolescents when used in conjunction with content-based programs designed to change problematic behaviors. PMID:24760288

  10. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

    To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multisentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multi-sentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  11. Population forecasts and confidence intervals for Sweden: a comparison of model-based and empirical approaches.

    PubMed

    Cohen, J E

    1986-02-01

    This paper compares several methods of generating confidence intervals for forecasts of population size. Two rest on a demographic model for age-structured populations with stochastic fluctuations in vital rates. Two rest on empirical analyses of past forecasts of population sizes of Sweden at five-year intervals from 1780 to 1980 inclusive. Confidence intervals produced by the different methods vary substantially. The relative sizes differ in the various historical periods. The narrowest intervals offer a lower bound on uncertainty about the future. Procedures for estimating a range of confidence intervals are tentatively recommended. A major lesson is that finitely many observations of the past and incomplete theoretical understanding of the present and future can justify at best a range of confidence intervals for population projections. Uncertainty attaches not only to the point forecasts of future population, but also to the estimates of those forecasts' uncertainty. PMID:3484356

  12. Measuring microscopic evolution processes of complex networks based on empirical data

    NASA Astrophysics Data System (ADS)

    Chi, Liping

    2015-04-01

    Aiming at understanding the microscopic mechanism of complex systems in real world, we perform the measurement that characterizes the evolution properties on two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree keeps almost unchanged during the whole time range. At each time step the external links attached to a new node are about c = 1.1 and the internal links added between existing nodes are approximately m = 8. For the Scientific Collaboration data, it is a cumulated result of all the authors from 1893 up to the considered year. There is no deletion of nodes and links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents of the degree distribution p(k) ∼ k^(−γ) of these two empirical datasets, γ_data, are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide an insight into capturing the microscopic dynamical processes that govern the network topology.
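
    Comparing γ_data with γ_theory requires estimating the power-law exponent from an empirical degree sequence; a standard way to do this is a maximum-likelihood (Hill-type) estimator, sketched below on a synthetic sample. The degree data and the k_min choice are illustrative assumptions.

```python
import numpy as np

def powerlaw_exponent(degrees, k_min=1):
    """MLE estimate of gamma for p(k) ~ k^(-gamma), using degrees >= k_min."""
    k = np.asarray([d for d in degrees if d >= k_min], dtype=float)
    # discrete-data approximation of the continuous MLE
    return 1.0 + len(k) / np.sum(np.log(k / (k_min - 0.5)))

# Illustrative degree sequence drawn from a rounded Pareto distribution
rng = np.random.default_rng(1)
sample = np.round(rng.pareto(1.5, 20000) + 1).astype(int)  # true gamma ~ 2.5
print(powerlaw_exponent(sample, k_min=2))
```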

  13. Theoretic base of Edge Local Mode triggering by vertical displacements

    SciTech Connect

    Wang, Z. T.; He, Z. X.; Wang, Z. H.; Wu, N.; Tang, C. J.

    2015-05-15

    Vertical instability is studied with R-dependent displacement. For Solovev's configuration, the stability boundary of the vertical instability is calculated. The pressure gradient is a destabilizing factor, which is contrary to Rebhan's result. The equilibrium parallel current density, j_∥, at the plasma boundary is a drive of the vertical instability similar to peeling-ballooning modes; however, the vertical instability cannot be stabilized by the magnetic shear, which tends towards infinity near the separatrix. The induced current observed in the Edge Local Mode (ELM) triggering experiment by vertical modulation is derived. The theory provides a theoretical explanation for the mitigation of type-I ELMs on ASDEX Upgrade. The principle could also be used for ITER.

  14. [Theoretical analysis of recompression-based therapies of decompression illness].

    PubMed

    Nikolaev, V P; Sokolov, G M; Komarevtsev, V N

    2011-01-01

    Theoretical analysis is concerned with the benefits of oxygen, air and nitrogen-helium-oxygen recompression schedules used to treat decompression illness in divers. Mathematical modeling of tissue bubble dynamics during diving shows that one-hour oxygen recompression to 200 kPa does not essentially diminish the size of a bubble enclosed in a layer that reduces tenfold the intensity of gas diffusion from bubbles. However, these bubbles dissolve fully in all the body tissues equally after 2-hr. air compression to 800 kPa and ensuing 2-day decompression by the Russian navy tables, and 1.5-hr. N-He-O2 compression to this pressure followed by 5-day decompression. The overriding advantage of the gas mixture recompression is that it obviates the narcotic action of nitrogen at the peak of chamber pressure and does not create dangerous tissue supersaturation and conditions for emergence of large bubbles at the end of decompression. PMID:21970044

  15. Information Theoretic Similarity Measures for Content Based Image Retrieval.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.

    2001-01-01

    Content-based image retrieval is based on the idea of extracting visual features from images and using them to index images in a database. Proposes similarity measures and an indexing algorithm based on information theory that permits an image to be represented as a single number. When used in conjunction with vectors, this method displays…

  16. A theoretical drought classification method for the multivariate drought index based on distribution properties of standardized drought indices

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Xia, Youlong; Ouyang, Wei; Shen, Xinyi

    2016-06-01

    Drought indices have been commonly used to characterize different properties of drought and the need to combine multiple drought indices for accurate drought monitoring has been well recognized. Based on linear combinations of multiple drought indices, a variety of multivariate drought indices have recently been developed for comprehensive drought monitoring to integrate drought information from various sources. For operational drought management, it is generally required to determine thresholds of drought severity for drought classification to trigger a mitigation response during a drought event to aid stakeholders and policy makers in decision making. Though the classification of drought categories based on univariate drought indices has been well studied, the classification method for a multivariate drought index has been less explored, mainly due to the lack of information about its distribution property. In this study, a theoretical drought classification method is proposed for the multivariate drought index, based on a linear combination of multiple indices. Based on the distribution property of the standardized drought index, a theoretical distribution of the linear combined index (LDI) is derived, which can be used for classifying drought with the percentile approach. Application of the proposed method for drought classification of LDI, based on standardized precipitation index (SPI), standardized soil moisture index (SSI), and standardized runoff index (SRI) is illustrated with climate division data from California, United States. Results from comparison with the empirical methods show a satisfactory performance of the proposed method for drought classification.
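
    A minimal sketch of the classification idea, assuming the component indices are standard normal: form the linear combined index, rescale it by the standard deviation implied by the weights and the inter-index correlations, and map values to categories through percentile-based thresholds. The weights, category labels, and cut probabilities are illustrative, not the paper's.

```python
import numpy as np
from scipy import stats

def linear_combined_index(indices, weights):
    """LDI as a weighted sum of standardized indices (e.g. SPI, SSI, SRI),
    re-standardized so that its marginal distribution is approximately N(0,1)."""
    z = np.asarray(indices, dtype=float)          # shape (n_indices, n_months)
    w = np.asarray(weights, dtype=float)
    ldi = w @ z
    return ldi / np.sqrt(w @ np.corrcoef(z) @ w)  # scale by combined std. dev.

def classify(ldi, cut_probs=(0.02, 0.05, 0.10, 0.20)):
    """Map LDI values to drought categories via percentile-based thresholds."""
    cuts = stats.norm.ppf(cut_probs)              # theoretical cut points
    labels = np.array(["D4", "D3", "D2", "D1", "none"])
    return labels[np.searchsorted(cuts, ldi)]
```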

  17. The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs

    NASA Astrophysics Data System (ADS)

    Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah

    2016-03-01

    We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO’s features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.

  18. Theoretical performance analysis for CMOS based high resolution detectors.

    PubMed

    Jain, Amit; Bednarek, Daniel R; Rudin, Stephen

    2013-03-01

    High resolution imaging capabilities are essential for accurately guiding successful endovascular interventional procedures. Present x-ray imaging detectors are not always adequate due to their inherent limitations. The newly-developed high-resolution micro-angiographic fluoroscope (MAF-CCD) detector has demonstrated excellent clinical image quality; however, further improvement in performance and physical design may be possible using CMOS sensors. We have thus calculated the theoretical performance of two proposed CMOS detectors which may be used as a successor to the MAF. The proposed detectors have a 300 μm thick HL-type CsI phosphor, a 50 μm-pixel CMOS sensor with and without a variable gain light image intensifier (LII), and are designated MAF-CMOS-LII and MAF-CMOS, respectively. For the performance evaluation, linear cascade modeling was used. The detector imaging chains were divided into individual stages characterized by one of the basic processes (quantum gain, binomial selection, stochastic and deterministic blurring, additive noise). Ranges of readout noise and exposure were used to calculate the detectors' MTF and DQE. The MAF-CMOS showed slightly better MTF than the MAF-CMOS-LII, but the MAF-CMOS-LII showed far better DQE, especially for lower exposures. The proposed detectors can have improved MTF and DQE compared with the present high resolution MAF detector. The performance of the MAF-CMOS is excellent for the angiography exposure range; however it is limited at fluoroscopic levels due to additive instrumentation noise. The MAF-CMOS-LII, having the advantage of the variable LII gain, can overcome the noise limitation and hence may perform exceptionally for the full range of required exposures; however, it is more complex and hence more expensive. PMID:24353390

  19. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS.
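
    The "invert the model" step can be illustrated generically: given any forward model that redistributes charge into trails during readout, an estimate of the original column is obtained by fixed-point iteration on the observed column. The exponential trail model below is a toy stand-in, not the paper's trap model.

```python
import numpy as np

def apply_trails(col, frac=0.05, scale=10.0, length=70):
    """Toy forward model: each pixel leaks a small fraction of its charge into
    an exponential trail over the next `length` pixels (readout = +index)."""
    kernel = frac * np.exp(-np.arange(1, length + 1) / scale)
    out = col * (1.0 - kernel.sum())
    for k, fk in enumerate(kernel, start=1):
        out[k:] += fk * col[:-k]
    return out

def restore(observed, n_iter=5):
    """Invert the forward model by fixed-point iteration on the observed column."""
    est = observed.astype(float).copy()
    for _ in range(n_iter):
        est += observed - apply_trails(est)
    return est
```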

  20. An Empirical Typology of Residential Care/Assisted Living Based on a Four-State Study

    ERIC Educational Resources Information Center

    Park, Nan Sook; Zimmerman, Sheryl; Sloane, Philip D.; Gruber-Baldini, Ann L.; Eckert, J. Kevin

    2006-01-01

    Purpose: Residential care/assisted living describes diverse facilities providing non-nursing home care to a heterogeneous group of primarily elderly residents. This article derives typologies of assisted living based on theoretically and practically grounded evidence. Design and Methods: We obtained data from the Collaborative Studies of Long-Term…

  1. Exploring multi/full polarised SAR imagery for understanding surface soil moisture and roughness by using semi-empirical and theoretical models and field experiments

    NASA Astrophysics Data System (ADS)

    Dong, Lu; Marzahn, Philip; Ludwig, Ralf

    2010-05-01

    -range digital photogrammetry for surface roughness retrieval. A semi-empirical model is tested and a theoretical model AIEM is utilised for further understanding. Results demonstrate that the semi-empirical soil moisture retrieval algorithm, which was developed in studies in humid climate conditions, must be carefully adapted to the drier Mediterranean environment. Modifying the approach by incorporating regional field data, led to a considerable improvement of the algorithms performance. In addition, it is found that the current representation of soil surface roughness in the AIEM is insufficient to account for the specific heterogeneities on the field scale. The findings in this study indicate the necessity for future research, which must be extended to a more integrated combination of current sensors, e.g. ENVISAT/ASAR, ALOS/PALSAR and Radarsat-2 imagery and advanced development of soil moisture retrieval model for multi/full polarised radar imagery.

  2. Theoretical Foundations of "Competitive Team-Based Learning"

    ERIC Educational Resources Information Center

    Hosseini, Seyed Mohammad Hassan

    2010-01-01

    This paper serves as a platform to precisely substantiate the success of "Competitive Team-Based Learning" (CTBL) as an effective and rational educational approach. To that end, it brings to the fore part of the (didactic) theories and hypotheses which in one way or another delineate and confirm the mechanisms under which successful…

  3. Why Problem-Based Learning Works: Theoretical Foundations

    ERIC Educational Resources Information Center

    Marra, Rose M.; Jonassen, David H.; Palmer, Betsy; Luft, Steve

    2014-01-01

    Problem-based learning (PBL) is an instructional method where student learning occurs in the context of solving an authentic problem. PBL was initially developed out of an instructional need to help medical school students learn their basic sciences knowledge in a way that would be more lasting while helping to develop clinical skills…

  4. EXPERIMENTAL AND THEORETICAL EVALUATIONS OF OBSERVATIONAL-BASED TECHNIQUES

    EPA Science Inventory

    Observational Based Methods (OBMs) can be used by EPA and the States to develop reliable ozone controls approaches. OBMs use actual measured concentrations of ozone, its precursors, and other indicators to determine the most appropriate strategy for ozone control. The usual app...

  5. Flavor symmetry based MSSM: Theoretical models and phenomenological analysis

    NASA Astrophysics Data System (ADS)

    Babu, K. S.; Gogoladze, Ilia; Raza, Shabbar; Shafi, Qaisar

    2014-09-01

    We present a class of supersymmetric models in which symmetry considerations alone dictate the form of the soft SUSY breaking Lagrangian. We develop a class of minimal models, denoted as sMSSM—for flavor symmetry-based minimal supersymmetric standard model—that respect a grand unified symmetry such as SO(10) and a non-Abelian flavor symmetry H which suppresses SUSY-induced flavor violation. Explicit examples are constructed with the flavor symmetry being gauged SU(2)H and SO(3)H with the three families transforming as 2+1 and 3 representations, respectively. A simple solution is found in the case of SU(2)H for suppressing the flavor violating D-terms based on an exchange symmetry. Explicit models based on SO(3)H without the D-term problem are developed. In addition, models based on discrete non-Abelian flavor groups are presented which are automatically free from D-term issues. The permutation group S3 with a 2+1 family assignment, as well as the tetrahedral group A4 with a 3 assignment are studied. In all cases, a simple solution to the SUSY CP problem is found, based on spontaneous CP violation leading to a complex quark mixing matrix. We develop the phenomenology of the resulting sMSSM, which is controlled by seven soft SUSY breaking parameters for both the 2+1 assignment and the 3 assignment of fermion families. These models are special cases of the phenomenological MSSM (pMSSM), but with symmetry restrictions. We discuss the parameter space of sMSSM compatible with LHC searches, B-physics constraints and dark matter relic abundance. Fine-tuning in these models is relatively mild, since all SUSY particles can have masses below about 3 TeV.

  6. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

    Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of the uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Second, we developed a physics based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during the decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300K to 1000K while propagating parametric uncertainty and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics based models and found that the uncertainty in the kinetics predicted by the physics based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum level model. We found that the species change within the bed occurring during the decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
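
    The empirical-model step can be sketched as an Arrhenius fit to scattered (temperature, rate) data, with the fit covariance used as a simple uncertainty estimate; the data values below are invented for illustration and are not the compiled literature data.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J mol-1 K-1

def arrhenius(T, lnA, Ea):
    """ln k = ln A - Ea / (R T); returns the log of the rate constant."""
    return lnA - Ea / (R * T)

# Illustrative (made-up) data: temperatures (K) and ln of observed rates (1/s)
T_data = np.array([500.0, 550.0, 600.0, 650.0, 700.0, 750.0])
lnk_data = np.array([-9.8, -8.1, -6.9, -5.8, -5.0, -4.2])

popt, pcov = curve_fit(arrhenius, T_data, lnk_data, p0=(5.0, 1.0e5))
lnA, Ea = popt
lnA_sd, Ea_sd = np.sqrt(np.diag(pcov))  # 1-sigma uncertainties from the fit
```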

  7. HIRS-AMTS satellite sounding system test - Theoretical and empirical vertical resolving power. [High resolution Infrared Radiation Sounder - Advanced Moisture and Temperature Sounder

    NASA Technical Reports Server (NTRS)

    Thompson, O. E.

    1982-01-01

    The present investigation is concerned with the vertical resolving power of satellite-borne temperature sounding instruments. Information is presented on the capabilities of the High Resolution Infrared Radiation Sounder (HIRS) and a proposed sounding instrument called the Advanced Moisture and Temperature Sounder (AMTS). Two quite different methods for assessing the vertical resolving power of satellite sounders are discussed. The first is the theoretical method of Conrath (1972), which was patterned after the work of Backus and Gilbert (1968). The Backus-Gilbert-Conrath (BGC) approach includes a formalism for deriving a retrieval algorithm for optimizing the vertical resolving power. However, a retrieval algorithm constructed in the BGC optimal fashion is not necessarily optimal as far as actual temperature retrievals are concerned. Thus, an independent criterion for vertical resolving power is discussed. The criterion is based on actual retrievals of signal structure in the temperature field.

  8. Theoretically predicted Fox-7 based new high energy density molecules

    NASA Astrophysics Data System (ADS)

    Ghanta, Susanta

    2016-08-01

    CHNO-based high energy density molecules (HEDMs) are designed computationally on the FOX-7 (1,1-dinitro-2,2-diaminoethylene) skeleton. We report structures, stability and detonation properties of these new molecules. A systematic analysis is presented for the crystal density, the activation energy for nitro-to-nitrite isomerisation and the C-NO2 bond dissociation energy of these molecules. Atoms in molecules (AIM) calculations have been performed to interpret the weak intra-molecular H-bonding interactions and the stability of the C-NO2 bonds. The structure optimization, frequency and bond dissociation energy calculations have been performed at the B3LYP level of theory using the G03 quantum chemistry package. Some of the designed molecules are found to be more promising HEDMs than the FOX-7 molecule, and are proposed as candidates for synthesis.

  9. Empirical and theoretical investigation of the noise performance of indirect detection, active matrix flat-panel imagers (AMFPIs) for diagnostic radiology.

    PubMed

    Siewerdsen, J H; Antonuk, L E; el-Mohri, Y; Yorkston, J; Huang, W; Boudry, J M; Cunningham, I A

    1997-01-01

    Noise properties of active matrix, flat-panel imagers under conditions relevant to diagnostic radiology are investigated. These studies focus on imagers based upon arrays with pixels incorporating a discrete photodiode coupled to a thin-film transistor, both fabricated from hydrogenated amorphous silicon. These optically sensitive arrays are operated with an overlying x-ray converter to allow indirect detection of incident x rays. External electronics, including gate driver circuits and preamplification circuits, are also required to operate the arrays. A theoretical model describing the signal and noise transfer properties of the imagers under conditions relevant to diagnostic radiography, fluoroscopy, and mammography is developed. This frequency-dependent model is based upon a cascaded systems analysis wherein the imager is conceptually divided into a series of stages having intrinsic gain and spreading properties. Predictions from the model are compared with x-ray sensitivity and noise measurements obtained from individual pixels from an imager with a pixel format of 1536 x 1920 pixels at a pixel pitch of 127 microns. The model is shown to be in excellent agreement with measurements obtained with diagnostic x rays using various phosphor screens. The model is used to explore the potential performance of existing and hypothetical imagers for application in radiography, fluoroscopy, and mammography as a function of exposure, additive noise, and fill factor. These theoretical predictions suggest that imagers of this general design incorporating a CsI:Tl intensifying screen can be optimized to provide detective quantum efficiency (DQE) superior to existing screen-film and storage phosphor systems for general radiography and mammography. For fluoroscopy, the model predicts that with further optimization of a-Si:H imagers, DQE performance approaching that of the best x-ray image intensifier systems may be possible. The results of this analysis suggest strategies for
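
    A minimal sketch of the cascaded-systems bookkeeping described here: mean quanta and the noise power spectrum are propagated through quantum gain and stochastic blurring stages, additive electronic noise enters at the output, and DQE(f) is the ratio of output to input squared signal-to-noise ratio. The stage parameters below are placeholders, not the fitted values of the a-Si:H imager model.

```python
import numpy as np

def cascade_dqe(q0, stages, add_noise_nps, freqs):
    """Propagate mean quanta and NPS through gain / stochastic-blur stages.

    stages: list of ("gain", mean_gain, gain_variance) or ("blur", mtf_array).
    Returns DQE(f) = SNR_out^2 / SNR_in^2 for a Poisson input of q0 quanta/area.
    """
    q, nps, mtf = q0, q0 * np.ones_like(freqs), np.ones_like(freqs)
    for stage in stages:
        if stage[0] == "gain":
            _, g, var_g = stage
            nps = g**2 * nps + var_g * q   # gain stage noise transfer
            q *= g
        else:                              # stochastic blur with given MTF
            t = stage[1]
            nps = (nps - q) * t**2 + q     # correlated part blurred, Poisson part not
            mtf *= t
    nps = nps + add_noise_nps              # additive electronic noise at the output
    return (q * mtf) ** 2 / (q0 * nps)

# Illustrative chain: detection (binomial), conversion gain, phosphor blur, coupling
f = np.linspace(0.0, 5.0, 101)                 # spatial frequency, cycles/mm
chain = [("gain", 0.8, 0.8 * 0.2),             # quantum detection efficiency
         ("gain", 1500.0, 1500.0**2 * 0.25),   # optical gain with Swank-like spread
         ("blur", np.exp(-(f / 3.0) ** 2)),    # phosphor MTF
         ("gain", 0.5, 0.5 * 0.5)]             # optical coupling / fill factor
dqe = cascade_dqe(1000.0, chain, add_noise_nps=50.0**2, freqs=f)
```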

  10. An Empirical Pixel-Based Correction for Imperfect CTE. I. HST's Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Anderson, Jay; Bedin, Luigi R.

    2010-09-01

    We use an empirical approach to characterize the effect of charge-transfer efficiency (CTE) losses in images taken with the Wide-Field Channel of the Advanced Camera for Surveys (ACS). The study is based on profiles of warm pixels in 168 dark exposures taken between 2009 September and October. The dark exposures allow us to explore charge traps that affect electrons when the background is extremely low. We develop a model for the readout process that reproduces the observed trails out to 70 pixels. We then invert the model to convert the observed pixel values in an image into an estimate of the original pixel values. We find that when we apply this image-restoration process to science images with a variety of stars on a variety of background levels, it restores flux, position, and shape. This means that the observed trails contain essentially all of the flux lost to inefficient CTE. The Space Telescope Science Institute is currently evaluating this algorithm with the aim of optimizing it and eventually providing enhanced data products. The empirical procedure presented here should also work for other epochs (e.g., pre-SM4), though the parameters may have to be recomputed for the time when ACS was operated at a higher temperature than the current -81°C. Finally, this empirical approach may also hold promise for other instruments, such as WFPC2, STIS, the ACS's HRC, and even WFC3/UVIS. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555.

  11. Advances on Empirical Mode Decomposition-based Time-Frequency Analysis Methods in Hydrocarbon Detection

    NASA Astrophysics Data System (ADS)

    Chen, H. X.; Xue, Y. J.; Cao, J.

    2015-12-01

    Empirical mode decomposition (EMD), which is a data-driven adaptive decomposition method and is not limited by time-frequency uncertainty spreading, is proved to be more suitable for seismic signals which are nonlinear and non-stationary. Compared with other Fourier-based and wavelet-based time-frequency methods, EMD-based time-frequency methods have higher temporal and spatial resolution and yield hydrocarbon interpretations with more statistical significance. The empirical mode decomposition algorithm has now evolved from EMD to Ensemble EMD (EEMD) to Complete Ensemble EMD (CEEMD). Even though EMD-based time-frequency methods offer many promising features for analyzing and processing geophysical data, there are some limitations or defects in EMD-based time-frequency methods. This presentation will present a comparative study on hydrocarbon detection using seven EMD-based time-frequency analysis methods, which include: (1) first, EMD combined with Hilbert transform (HT) as a time-frequency analysis method is used for hydrocarbon detection; and (2) second, Normalized Hilbert transform (NHT) and HU Methods respectively combined with HT as improved time-frequency analysis methods are applied for hydrocarbon detection; and (3) third, EMD combined with Teager-Kaiser energy (EMD/TK) is investigated for hydrocarbon detection; and (4) fourth, EMD combined with wavelet transform (EMDWave) as a seismic attenuation estimation method is comparatively studied; and (5) fifth, EEMD- and CEEMD-based time-frequency analysis methods used as highlight volumes technology are studied. The differences between these methods in hydrocarbon detection will be discussed. The question of getting a meaningful instantaneous frequency by HT and mode-mixing issues in EMD will be analysed. The work was supported by NSFC under grant Nos. 41430323, 41404102 and 41274128.
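
    A minimal sketch of the first variant (EMD followed by the Hilbert transform), assuming the PyEMD package and SciPy are available; the synthetic trace and parameter choices are illustrative.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumed available, e.g. via `pip install EMD-signal`

def hilbert_spectrum(trace, dt):
    """Decompose a trace into IMFs and return per-IMF instantaneous amplitude
    and frequency, the ingredients of an EMD time-frequency map."""
    imfs = EMD().emd(trace)                       # intrinsic mode functions
    analytic = hilbert(imfs, axis=-1)
    amp = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic), axis=-1)
    inst_freq = np.gradient(phase, dt, axis=-1) / (2.0 * np.pi)
    return imfs, amp, inst_freq

# Synthetic example: a decaying 25 Hz "reflection" plus weak random noise
dt = 0.002
t = np.arange(0.0, 1.0, dt)
trace = np.sin(2 * np.pi * 25 * t) * np.exp(-3 * t) + 0.1 * np.random.randn(t.size)
imfs, amp, freq = hilbert_spectrum(trace, dt)
```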

  12. Imitative Modeling as a Theoretical Base for Instructing Language-Disordered Children

    ERIC Educational Resources Information Center

    Courtright, John A.; Courtright, Illene C.

    1976-01-01

    A modification of A. Bandura's social learning theory (imitative modeling) was employed as a theoretical base for language instruction with eight language disordered children (5 to 10 years old). (Author/SBH)

  13. Experimental and Theoretical Study of Microturbine-Based BCHP System

    SciTech Connect

    Fairchild, P.D.

    2001-07-12

    On-site and near-site distributed power generation (DG), as part of a Buildings Cooling, Heating and Power (BCHP) system, brings both electricity and waste heat from the DG sources closer to the end user's electric and thermal loads. Consequently, the waste heat can be used as input power for heat-activated air conditioners, chillers, and desiccant dehumidification systems; to generate steam for space heating; or to provide hot water for laundry, kitchen, cleaning services and/or rest rooms. By making use of what is normally waste heat, BCHP systems meet a building's electrical and thermal loads with a lower input of fossil fuel, yielding resource efficiencies of 40 to 70% or more. To ensure the success of BCHP systems, interactions of a DG system-such as a microturbine and thermal heat recovery units under steady-state modes of operation with various exhaust back pressures-must be considered. This article studies the performance and emissions of a 30-kW microturbine over a range of design and off-design conditions in steady-state operating mode with various back pressures. In parallel with the experimental part of the project, a BCHP mathematical model was developed describing basic thermodynamic and hydraulic processes in the system, heat and material balances, and the relationship of the balances to the system configuration. The model can determine the efficiency of energy conversion both for an individual microturbine unit and for the entire BCHP system for various system configurations and external loads. Based on actual data from a 30-kW microturbine, linear analysis was used to obtain an analytical relationship between the changes in the thermodynamic and hydraulic parameters of the system. The actual data show that, when the backpressure at the microturbine exhaust outlet is increased to the maximum of 7 in. WC (0.017 atm), the microturbine's useful power output decreases by 3.5% at a full power setting of 30 kW and by 5.5% at a one-third power setting

  14. A theoretically based determination of bowen-ratio fetch requirements

    USGS Publications Warehouse

    Stannard, D.I.

    1997-01-01

    Determination of fetch requirements for accurate Bowen-ratio measurements of latent- and sensible-heat fluxes is more involved than for eddy-correlation measurements because Bowen-ratio sensors are located at two heights, rather than just one. A simple solution to the diffusion equation is used to derive an expression for Bowen-ratio fetch requirements, downwind of a step change in surface fluxes. These requirements are then compared to eddy-correlation fetch requirements based on the same diffusion equation solution. When the eddy-correlation and upper Bowen-ratio sensor heights are equal, and the available energy upwind and downwind of the step change is constant, the Bowen-ratio method requires less fetch than does eddy correlation. Differences in fetch requirements between the two methods are greatest over relatively smooth surfaces. Bowen-ratio fetch can be reduced significantly by lowering the lower sensor, as well as the upper sensor. The Bowen-ratio fetch model was tested using data from a field experiment where multiple Bowen-ratio systems were deployed simultaneously at various fetches and heights above a field of bermudagrass. Initial comparisons were poor, but improved greatly when the model was modified (and operated numerically) to account for the large roughness of the upwind cotton field.
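
    The energy-balance partition underlying the Bowen-ratio method can be sketched directly: β is formed from the temperature and vapour-pressure differences between the two sensor heights, and the available energy is then split into sensible and latent heat. The psychrometric constant and the example numbers are typical values, not data from this study.

```python
def bowen_ratio_fluxes(dT, de, Rn, G, gamma=0.066):
    """Partition available energy with the Bowen-ratio method.

    dT, de : temperature (degC) and vapour-pressure (kPa) differences between
             the upper and lower sensors
    Rn, G  : net radiation and soil heat flux (W m-2)
    gamma  : psychrometric constant (kPa degC-1), ~0.066 near sea level
    """
    beta = gamma * dT / de          # Bowen ratio, H / LE
    LE = (Rn - G) / (1.0 + beta)    # latent heat flux
    H = beta * LE                   # sensible heat flux
    return beta, H, LE

# e.g. dT = 0.8 degC, de = 0.35 kPa, Rn = 450 W m-2, G = 60 W m-2
print(bowen_ratio_fluxes(0.8, 0.35, 450.0, 60.0))
```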

  15. Theoretical study of impurity effects in iron-based superconductors

    NASA Astrophysics Data System (ADS)

    Navarro Gastiasoro, Maria; Hirschfeld, Peter; Andersen, Brian

    2013-03-01

    Several open questions remain unanswered for the iron-based superconductors (FeSC), including the importance of electronic correlations and the symmetry of the superconducting order parameter. Motivated by recent STM experiments which show a fascinating variety of resonant defect states in FeSC, we adopt a realistic five-band model including electronic Coulomb correlations to study local effects of disorder in the FeSC. In order to minimize the number of free parameters, we use the pairing interactions obtained from spin-fluctuation exchange to determine the homogeneous superconducting state. The ability of local impurity potentials to induce resonant states depends on their scattering strength V_imp; in addition, for appropriate V_imp, such states are associated with local orbital- and magnetic order. We investigate the density of states near such impurities and show how tunneling experiments may be used to probe local induced order. In the SDW phase, we show how C2 symmetry-breaking dimers are naturally formed around impurities which also form cigar-like (π,π) structures embedded in the (π,0) magnetic bulk phase. Such electronic dimers have been shown to be candidates for explaining the so-called nematogens observed previously by QPI in Co-doped CaFe2As2.

  16. Awareness-based game-theoretic space resource management

    NASA Astrophysics Data System (ADS)

    Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.

    2009-05-01

    Over recent decades, the space environment has become more complex, with a significant increase in space debris and a greater density of spacecraft, which poses great difficulties to efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations by (a) accommodating awareness modeling and updating and (b) collaborative search and tracking of space objects. The basic approach is described as follows. First, partition the relevant region of interest into distinct cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user-specified requirements. Note that when an object has an intelligent response to the sensing event, the sensor assigned to observe an intelligent object may switch from time to time between a strong, active signal mode and a passive mode to maximize the total amount of information to be obtained over a multi-step time horizon and avoid risks. Fourth, if all explicitly specified requirements are satisfied and there are still more sensing resources available, we assign the additional sensing resources to objects without explicitly specified requirements via an information-based approach. Finally, sensor scheduling is applied to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method with a realistic space resources management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and track with multiple space-borne observers.

  17. Rare-earth element based permanent magnets: a theoretical investigation

    NASA Astrophysics Data System (ADS)

    Chouhan, Rajiv K.; Paudyal, Durga

    Permanent magnetic materials with large magnetization and high magnetocrystalline anisotropy are important for technical applications. In this context rare-earth (R) element based materials are good candidates because of their localized 4f electrons. The 4f crystal field splitting provides a large part of the magnetic anisotropy, depending upon the crystal environment. The d spin-orbit coupling of the alloyed transition-metal component provides additional anisotropy. RCo5 and its derivative R2Co17 are known compounds for large magnetic anisotropy. Here we have performed electronic structure calculations to predict new materials in this class by employing site substitutions. In these investigations, we have performed density functional theory including on-site electron correlation (DFT+U) and L-S coupling calculations. The results show that the abundant Ce substitution in R sites and Ti/Zr substitutions in some of the Co sites help reduce criticality without substantially affecting the magnetic moment and magnetic anisotropy in these materials. This work is supported by the Critical Materials Institute, an Energy Innovation Hub funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Advanced Manufacturing Office.

  18. Implementation of an empirically based drug and violence prevention and intervention program in public school settings.

    PubMed

    Cunningham, P B; Henggeler, S W

    2001-06-01

    Describes the implementation of a collaborative preventive intervention project (Healthy Schools) designed to reduce levels of bullying and related antisocial behaviors in children attending two urban middle schools serving primarily African American students. These schools have high rates of juvenile violence, as reflected by suspensions and expulsions for behavioral problems. Using a quasi-experimental design, empirically based drug and violence prevention programs, Bullying Prevention and Project ALERT, are being implemented at each middle school. In addition, an intensive evidence-based intervention, multisystemic therapy, is being used to target students at high risk of expulsion and court referral. Hence, the proposed project integrates both universal approaches to prevention and a model that focuses on indicated cases. Targeted outcomes, by which the effectiveness of this comprehensive school-based program will be measured, are reduced youth violence, reduced drug use, and improved psychosocial functioning of participating youth. PMID:11393922

  19. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach than the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.
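
    The proposed workflow can be sketched with the dip filter and the seislet transform left as placeholder callables (hypothetical names), since neither is a standard library routine; only the control flow of dip separation, per-band thresholding, and recombination is shown.

```python
import numpy as np

def dip_separated_denoise(data, dip_filter, seislet, iseislet, n_bands=5, perc=95):
    """Sketch of the dip-separated structural filtering strategy.

    dip_filter(data, band, n_bands) -> one dip component      (placeholder)
    seislet / iseislet              -> forward / inverse seislet transform
                                       for that component      (placeholders)
    """
    denoised = np.zeros_like(data)
    for band in range(n_bands):
        component = dip_filter(data, band, n_bands)   # EMD-based dip band
        coeffs = seislet(component)
        thresh = np.percentile(np.abs(coeffs), perc)  # threshold level
        coeffs[np.abs(coeffs) < thresh] = 0.0         # keep strongest coefficients
        denoised += iseislet(coeffs)                  # denoised dip band
    return denoised
```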

  20. Determination of knock characteristics in spark ignition engines: an approach based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Li, Ning; Yang, Jianguo; Zhou, Rui; Liang, Caiping

    2016-04-01

    Knock is one of the major constraints on improving the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determine the knock characteristics in SI engines. By adding a uniformly distributed and finite white Gaussian noise, the EEMD can preserve signal continuity at different scales and therefore alleviates the mode-mixing problem occurring in the classic empirical mode decomposition (EMD). The feasibility of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from the combustion chamber and the vibration signal measured from the cylinder head is investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and vibration signal, even in the initial stage of knock. Finally, by comparing the application results with those obtained by short-time Fourier transform (STFT), Wigner-Ville distribution (WVD) and discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated.
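
    A hedged sketch of how EEMD-based knock detection might be wired up, assuming the PyEMD package is available: decompose a single-cycle pressure (or vibration) segment and use the energy of the highest-frequency IMFs as a knock indicator. The IMF band, noise width, and thresholding rule are illustrative choices, not the paper's.

```python
import numpy as np
from PyEMD import EEMD  # assumed available, e.g. via `pip install EMD-signal`

def knock_index(signal, n_high_imfs=2, noise_width=0.05, trials=100):
    """Energy of the highest-frequency IMFs as a crude knock indicator."""
    eemd = EEMD(trials=trials, noise_width=noise_width)
    imfs = eemd.eemd(np.asarray(signal, dtype=float))
    # IMF 0 carries the highest-frequency oscillations (knock resonance band)
    high = imfs[:n_high_imfs]
    return float(np.sum(high**2))

# A cycle could be flagged as knocking if its index exceeds a threshold
# calibrated on non-knocking reference cycles, e.g.
#   knocking = knock_index(cycle) > mean_ref + 3 * std_ref
```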

  1. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2016-04-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach than the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  2. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

    We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on the Web portals where the user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time-series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent’s action on a post, its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent’s action. The simulations are performed for the case of a constant flux of agents and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure that are comparable with the ones in the empirical system of popular posts. In view of purely emotion-driven agent actions, this type of comparison provides a quantitative measure for the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative

  3. Consistent climate-driven spatial patterns of terrestrial ecosystem carbon fluxes in the northern hemisphere: a theoretical framework and synthesis of empirical evidence

    NASA Astrophysics Data System (ADS)

    Yu, G.; Niu, S.; Chen, Z.; Zhu, X.

    2013-12-01

    A predictive understanding of terrestrial ecosystem carbon fluxes has developed slowly, largely owing to the lack of broad generalizations, a theoretical framework, and clearly defined hypotheses. We synthesized eddy flux data from different regions of the northern hemisphere and previously published papers, then developed a framework for the climate controls on the geoecological patterns of terrestrial ecosystem C fluxes and proposed the underlying mechanisms. Based on the case studies and synthesis, we found that the spatial patterns of ecosystem C fluxes in China, in Asia, and across three continents of the northern hemisphere all followed general patterns: they are predominantly controlled by temperature and precipitation, supporting and further developing the traditional theory of 'climate controls on the spatial patterns of ecosystem productivity' embodied in the Miami and other models. Five hypotheses were proposed to explain the ecological mechanisms and processes that give rise to the climate-driven spatial patterns of C fluxes: (1) the two key processes determining gross primary productivity (GPP), i.e. growing season length and carbon uptake capacity, are jointly controlled by temperature and precipitation; (2) ecosystem respiration (ER) is also predominantly determined by temperature and precipitation, as well as substrate supply; (3) the components of ecosystem C fluxes are closely coupled with each other in response to climate change; (4) vegetation types and soil nutrients in a particular area are fundamentally determined by environmental factors, which may affect C fluxes within a certain range but cannot change the climate-driven pattern of C fluxes at large scales; (5) land use only changes the magnitude of C fluxes but does not change the spatial patterns or their climate dependence. All of these hypotheses were well supported by the evidence from the data synthesis, which could provide the foundation for a theoretical framework for better understanding and predicting geoecological patterns of terrestrial ecosystem C fluxes.

  4. Empirical calibrations of optical absorption-line indices based on the stellar library MILES

    NASA Astrophysics Data System (ADS)

    Johansson, Jonas; Thomas, Daniel; Maraston, Claudia

    2010-07-01

    Stellar population models of absorption-line indices are an important tool for the analysis of stellar population spectra. They are most accurately modelled through empirical calibrations of absorption-line indices with stellar parameters such as effective temperature, metallicity and surface gravity, the so-called fitting functions. Here we present new empirical fitting functions for the 25 optical Lick absorption-line indices based on the new stellar library Medium resolution INT Library of Empirical Spectra (MILES). The major improvements with respect to the Lick/IDS library are the better sampling of stellar parameter space, a generally higher signal-to-noise ratio and a careful flux calibration. In fact, we find that errors on individual index measurements in MILES are considerably smaller than in Lick/IDS. Instead, we find the rms of the residuals between the final fitting functions and the data to be dominated by errors in the stellar parameters. We provide fitting functions for both Lick/IDS and MILES spectral resolutions and compare our results with other fitting functions in the literature. A FORTRAN 90 code is available online in order to simplify the implementation in stellar population models. We further calculate the offsets in index measurements between the Lick/IDS system and a flux-calibrated system. For this purpose, we use the three libraries MILES, ELODIE and STELIB. We find that offsets are negligible in some cases, most notably for the widely used indices Hβ, Mgb, Fe5270 and Fe5335. In a number of cases, however, the difference between the flux-calibrated library and Lick/IDS is significant, with the offsets depending on index strengths. Interestingly, there is no general agreement between the three libraries for a large number of indices, which hampers the derivation of a universal offset between the Lick/IDS and flux-calibrated systems.
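    Fitting functions of this kind are typically low-order polynomials in the stellar parameters fitted by (weighted) least squares. The sketch below, in Python, shows one such fit for a single index; the quadratic basis in theta = 5040/Teff, [Fe/H], and log g and the weighting scheme are illustrative assumptions, not the exact functional forms adopted in the paper.

```python
import numpy as np

def design_matrix(theta, feh, logg):
    """Quadratic fitting-function basis in theta = 5040/Teff, [Fe/H], log g."""
    return np.column_stack([
        np.ones_like(theta), theta, feh, logg,
        theta**2, feh**2, logg**2,
        theta * feh, theta * logg, feh * logg,
    ])

def fit_index(theta, feh, logg, index, index_err):
    """Weighted least-squares fit of one absorption-line index to stellar parameters."""
    X = design_matrix(theta, feh, logg)
    w = 1.0 / index_err                        # inverse-error weights
    coeffs, *_ = np.linalg.lstsq(X * w[:, None], index * w, rcond=None)
    return coeffs

def predict_index(coeffs, theta, feh, logg):
    """Evaluate the fitted function at new stellar parameters."""
    return design_matrix(theta, feh, logg) @ coeffs
```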

  5. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises are comprised of multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  6. A theoretical model of drumlin formation based on observations at Múlajökull, Iceland

    NASA Astrophysics Data System (ADS)

    Iverson, Neal R.; McCracken, Reba; Zoet, Lucas; Benediktsson, Ívar; Schomacker, Anders; Johnson, Mark; Finlayson, Andrew; Phillips, Emrys; Everest, Jeremy

    2016-04-01

    Theoretical models of drumlin formation have generally been developed in isolation from observations in modern drumlin-forming environments, a major limitation on the empiricism necessary to confidently formulate models and test them. Observations at a rare modern drumlin field exposed by the recession of the Icelandic surge-type glacier Múlajökull allow an empirically grounded and physically based model of drumlin formation to be formulated and tested. Till fabrics based on anisotropy of magnetic susceptibility and clast orientations, along with stratigraphic observations and results of ground-penetrating radar, indicate that drumlin relief results from basal till deposition on drumlins and erosion between them. These data also indicate that surges cause till deposition both on and between drumlins and provide no evidence of the longitudinally compressive or extensional strain in till that would be expected if flux divergence in a deforming bed were significant. Over 2000 measurements of till density, together with consolidation tests on the till, indicate that effective stresses on the bed were higher between drumlins than within them. This observation agrees with evidence that subglacial water drainage during normal flow of the glacier is through channels in low areas between drumlins and that crevasse swarms, which reduce total normal stresses on the bed, are coincident with drumlins. In the new model, slip of ice over a bed with a sinusoidal perturbation, crevasse swarms, and flow of subglacial water toward R-channels that bound the bed undulation during periods of normal flow result in effective stresses that increase toward channels and decrease from the stoss to the lee sides of the undulation. This effective-stress pattern causes till entrainment and erosion by regelation infiltration (Rempel, 2008, JGR, 113) that peaks at the heads of incipient drumlins and near R-channels, while bed shear is inhibited by effective stresses too high to allow

  7. Polarizable Empirical Force Field for Hexopyranose Monosaccharides Based on the Classical Drude Oscillator

    PubMed Central

    2015-01-01

    A polarizable empirical force field based on the classical Drude oscillator is presented for the hexopyranose form of selected monosaccharides. Parameter optimization targeted quantum mechanical (QM) dipole moments, solute–water interaction energies, vibrational frequencies, and conformational energies. Validation of the model was based on experimental data on crystals, densities of aqueous sugar solutions, diffusion constants of glucose, and rotational preferences of the exocyclic hydroxymethyl of d-glucose and d-galactose in aqueous solution, as well as additional QM data. Notably, the final model involves a single electrostatic model for all sixteen diastereomers of the monosaccharides, indicating the transferability of the polarizable model. The presented parameters are anticipated to lay the foundation for a comprehensive polarizable force field for saccharides that will be compatible with the polarizable Drude parameters for lipids and proteins, allowing for simulations of glycolipids and glycoproteins. PMID:24564643

  8. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task since they play a crucial role in e-commerce nowadays. This letter reports an empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena that characterize the selection mechanism of web users and outline the relevance of these phenomena to the information recommendation problem.
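    A minimal Python sketch of a collaborative-similarity computation on a user-object bipartite network is given below; object-object similarity is taken as the cosine overlap of their user sets, and a user's collaborative similarity as the mean pairwise similarity of the selected objects, which is one common convention and may differ in normalization from the index defined in the letter.

```python
import numpy as np

def collaborative_similarity(A):
    """Per-user collaborative similarity on a user-object bipartite network.

    A : (n_users, n_objects) binary adjacency matrix.
    """
    deg_obj = A.sum(axis=0)                        # object degrees
    overlap = A.T @ A                              # users shared by each object pair
    norm = np.sqrt(np.outer(deg_obj, deg_obj))
    sim = np.divide(overlap, norm, out=np.zeros_like(norm), where=norm > 0)
    np.fill_diagonal(sim, 0.0)

    cs = np.zeros(A.shape[0])
    for u in range(A.shape[0]):
        objs = np.flatnonzero(A[u])                # objects selected by user u
        k = len(objs)
        if k > 1:
            cs[u] = sim[np.ix_(objs, objs)].sum() / (k * (k - 1))
    return cs
```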

  9. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

    In this research, a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed based on empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. Thus, this methodology is foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of the microstructures to their performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of the desired property. This technique was verified in our previous work by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide). The results obtained were found to be very encouraging.
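    As a rough illustration of the non-parametric mapping from loading parameters and geometry to a measured property, the sketch below uses an RBF-kernel ridge regressor from scikit-learn as a stand-in for the radial basis function network mentioned above; the feature columns and numerical values are invented for demonstration only.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

# Hypothetical training table: [beam length (um), film thickness (um), load (uN)] -> deflection (um)
X = np.array([[100.0, 1.0, 5.0], [150.0, 1.5, 5.0], [200.0, 1.0, 10.0],
              [120.0, 2.0, 8.0], [180.0, 1.2, 6.0], [160.0, 1.8, 9.0]])
y = np.array([0.8, 1.9, 4.1, 1.1, 2.6, 2.3])   # invented values for illustration

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

# RBF-kernel ridge regression as a stand-in for a radial basis function network
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1e-3)
model.fit(X_train, y_train)
print("predicted:", model.predict(X_test), "measured:", y_test)
```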

  10. Polarizable Empirical Force Field for Acyclic Poly-Alcohols Based on the Classical Drude Oscillator

    PubMed Central

    He, Xibing; Lopes, Pedro E. M.; MacKerell, Alexander D.

    2014-01-01

    A polarizable empirical force field for acyclic polyalcohols based on the classical Drude oscillator is presented. The model is optimized with an emphasis on the transferability of the developed parameters among molecules of different sizes in this series and on the condensed-phase properties validated against experimental data. The importance of the explicit treatment of electronic polarizability in empirical force fields is demonstrated for this series of molecules with vicinal hydroxyl groups that can form cooperative intra- and intermolecular hydrogen bonds. Compared to the CHARMM additive force field, improved treatment of the electrostatic interactions avoids overestimation of the gas-phase dipole moments, results in significant improvement in the treatment of the conformational energies, and leads to the correct balance of intra- and intermolecular hydrogen bonding of glycerol, as evidenced by the calculated heat of vaporization being in excellent agreement with experiment. Computed condensed-phase data, including crystal lattice parameters and volumes and densities of aqueous solutions, are in better agreement with experimental data than the corresponding additive model. Such improvements are anticipated to significantly improve the treatment of polymers in general, including biological macromolecules. PMID:23703219

  11. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    ERIC Educational Resources Information Center

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  12. Conformational polymorphism in a Schiff-base macrocyclic organic ligand: an experimental and theoretical study.

    PubMed

    Lo Presti, Leonardo; Soave, Raffaella; Longhi, Mariangela; Ortoleva, Emanuele

    2010-10-01

    Polymorphism in the highly flexible organic Schiff-base macrocycle ligand 3,6,9,17,20,23-hexa-azapentacyclo(23.3.1.1(11,15).0(2,6).0(16,20))triaconta-1(29),9,11,13,15(30),23,25,27-octaene (DIEN, C24H30N6) has been studied by single-crystal X-ray diffraction and both solid-state and gas-phase density functional theory (DFT) calculations. In the literature, only solvated structures of the title compound are known. Two new polymorphs and a new solvated form of DIEN, all obtained from the same solvent with different crystallization conditions, are presented for the first time. They all have P1̄ symmetry, with the macrocycle positioned on inversion centres. The two unsolvated polymorphic forms differ in the number of molecules in the asymmetric unit Z', density and cohesive energy. Theoretical results confirm that the most stable form is (II°), with Z' = 1.5. Two distinct molecular conformations have been found, named 'endo' or 'exo' according to the orientation of the imine N atoms, which can be directed towards the interior or the exterior of the macrocycle. The endo arrangement is ubiquitous in the solid state and is shared by two independent molecules which constitute an invariant supramolecular synthon in all the known crystal forms of DIEN. It is also the most stable arrangement in the gas phase. The exo form, on the other hand, appears only in phase (II°), which contains both the conformers. Similarities and differences among the occurring packing motifs, as well as solvent effects, are discussed with the aid of Hirshfeld surface fingerprint plots and correlated to the results of the energy analysis. A possible interconversion path in the gas phase between the endo and the exo conformers has been found by DFT calculations; it consists of a two-step mechanism with activation energies of the order of 30-40 kJ mol⁻¹. These findings have been related to the empirical evidence that the most stable phase (II°) is also the last appearing one, in

  13. Developing Empirically Based, Culturally Grounded Drug Prevention Interventions for Indigenous Youth Populations

    PubMed Central

    Okamoto, Scott K.; Helm, Susana; Pel, Suzanne; McClain, Latoya L.; Hill, Amber P.; Hayashida, Janai K. P.

    2012-01-01

    This article describes the relevance of a culturally grounded approach toward drug prevention development for indigenous youth populations. This approach builds drug prevention from the “ground up” (i.e., from the values, beliefs, and worldviews of the youth who are the intended consumers of the program) and is contrasted with efforts that focus on adapting existing drug prevention interventions to fit the norms of different youth ethnocultural groups. The development of an empirically based drug prevention program focused on rural Native Hawaiian youth is described as a case example of culturally grounded drug prevention development for indigenous youth, and the impact of this effort on the validity of the intervention and on community engagement and investment in the development of the program is discussed. Finally, implications of this approach for behavioral health services and the development of an indigenous prevention science are discussed. PMID:23188485

  14. A Human ECG Identification System Based on Ensemble Empirical Mode Decomposition

    PubMed Central

    Zhao, Zhidong; Yang, Lei; Chen, Diandian; Luo, Yi

    2013-01-01

    In this paper, a human electrocardiogram (ECG) identification system based on ensemble empirical mode decomposition (EEMD) is designed. A robust preprocessing method comprising noise elimination, heartbeat normalization and quality measurement is proposed to eliminate the effects of noise and heart rate variability. The system is independent of the heart rate. The ECG signal is decomposed into a number of intrinsic mode functions (IMFs), and Welch spectral analysis is used to extract the significant heartbeat signal features. Principal component analysis is used to reduce the dimensionality of the feature space, and the K-nearest neighbors (K-NN) method is applied as the classifier tool. The proposed human ECG identification system was tested on standard MIT-BIH ECG databases: the ST change database, the long-term ST database, and the PTB database. The system achieved an identification accuracy of 95% for 90 subjects, demonstrating the effectiveness of the proposed method in terms of accuracy and robustness. PMID:23698274
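    A condensed Python sketch of the described pipeline (EEMD, Welch spectra of the leading IMFs as features, PCA, then k-NN) is shown below; the decomposition itself is assumed to be supplied by an external `eemd` callable (for example from the PyEMD package), and the sampling rate, number of IMFs, window length, and PCA dimensionality are illustrative values only.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def heartbeat_features(beat, eemd, fs=360, n_imfs=4, nperseg=128):
    """Welch spectral features of the first few IMFs of one normalized heartbeat.

    eemd : callable returning an array of IMFs (assumed available externally).
    """
    imfs = eemd(beat)[:n_imfs]
    feats = []
    for imf in imfs:
        _, pxx = welch(imf, fs=fs, nperseg=min(nperseg, len(imf)))
        feats.append(pxx)
    return np.concatenate(feats)

def build_identifier(feature_matrix, subject_labels):
    """PCA for dimensionality reduction followed by a k-NN classifier."""
    clf = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=3))
    clf.fit(feature_matrix, subject_labels)
    return clf
```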

  15. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis that describes the expected remaining time of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors, such as age and gender, that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. PMID:26936529

  16. Conformational studies of (2'-5') polynucleotides: theoretical computations of energy, base morphology, helical structure, and duplex formation.

    PubMed Central

    Srinivasan, A R; Olson, W K

    1986-01-01

    A detailed theoretical analysis has been carried out to probe the conformational characteristics of (2'-5') polynucleotide chains. Semi-empirical energy calculations are used to estimate the preferred torsional combinations of the monomeric repeating unit. The resulting morphology of adjacent bases and the tendency to form regular single-stranded structures are determined by standard computational procedures. The torsional preferences are in agreement with available NMR measurements on model compounds. The tendencies to adopt base-stacked and intercalative geometries are markedly depressed compared to those in (3'-5') chains. Very limited families of regular monomerically repeating single-stranded (2'-5') helices are found. Base stacking, however, can be enhanced (but helix formation is at the same time depressed) in mixed-puckered chains. Constrained (2'-5') duplex structures have been constructed from a search of all intervening glycosyl and sugar conformations that form geometrically feasible phosphodiester linkages. Both A- and B-type base stacking are found to generate non-standard backbone torsions and mixed glycosyl/sugar combinations. The 2'- and 5'-residues are locked in totally different arrangements and are thereby prevented from generating long helical structures. PMID:2426656

  17. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    ERIC Educational Resources Information Center

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  18. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  19. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to provide a critical analysis of whether these policies can be considered organizational-ethical instruments that support healthcare institutions in taking institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of the results of former Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, the content, and the impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice. PMID:24420744

  20. Empirical mode decomposition based background removal and de-noising in polarization interference imaging spectrometer.

    PubMed

    Zhang, Chunmin; Ren, Wenyi; Mu, Tingkui; Fu, Lili; Jia, Chenling

    2013-02-11

    Based on empirical mode decomposition (EMD), background removal and de-noising procedures for data taken by a polarization interference imaging spectrometer (PIIS) are implemented. Through numerical simulation, it is shown that the data processing methods are effective. The assumption that the noise mostly exists in the first intrinsic mode function is verified, and the parameters of the EMD thresholding de-noising method are determined. For comparison, wavelet-based and windowed-Fourier-transform-based thresholding de-noising methods are introduced. The de-noised results are evaluated by the SNR, spectral resolution, and peak value of the de-noised spectra. All the methods are used to suppress the effects of Gaussian and Poisson noise, and the de-noising efficiency is higher for spectra contaminated by Gaussian noise. The interferogram obtained by the PIIS is processed by the proposed methods, and both a background-free interferogram and a noise-free spectrum are obtained effectively. The adaptive and robust EMD-based methods are effective for background removal and de-noising in the PIIS. PMID:23481716
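    The core idea, noise concentrated in the leading IMF(s) and a slowly varying background carried by the final IMFs and residue, can be sketched as follows in Python; the IMFs are assumed to be precomputed, and the universal-threshold rule and the one-IMF noise/background split are illustrative choices rather than the parameters determined in the paper.

```python
import numpy as np

def soft(x, tau):
    """Soft thresholding toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def emd_background_and_denoise(imfs, n_noise=1, n_background=1):
    """Split precomputed IMFs of an interferogram into denoised signal and background.

    imfs : array (n_imfs, n_samples), ordered fine-to-coarse (EMD convention).
    The first n_noise IMFs are treated as noise-dominated and soft-thresholded;
    the last n_background IMFs are treated as the slowly varying background and
    returned separately. Both counts are illustrative choices.
    """
    thresholded = np.zeros_like(imfs[0])
    for imf in imfs[:n_noise]:
        sigma = np.median(np.abs(imf)) / 0.6745          # robust noise estimate
        tau = sigma * np.sqrt(2.0 * np.log(imf.size))    # universal threshold
        thresholded += soft(imf, tau)
    signal_part = imfs[n_noise:len(imfs) - n_background].sum(axis=0)
    background = imfs[len(imfs) - n_background:].sum(axis=0)
    return thresholded + signal_part, background
```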

  1. Target detection for low cost uncooled MWIR cameras based on empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Piñeiro-Ave, José; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando; Artés-Rodríguez, Antonio

    2014-03-01

    In this work, a novel method for detecting low intensity fast moving objects with low cost Medium Wavelength Infrared (MWIR) cameras is proposed. The method is based on background subtraction in a video sequence obtained with a low density Focal Plane Array (FPA) of the newly available uncooled lead selenide (PbSe) detectors. Thermal instability, along with the lack of specific electronics and mechanical devices for canceling the effect of distortion, makes background image identification very difficult. As a result, the identification of targets is performed in low signal-to-noise ratio (SNR) conditions, which may considerably restrict the sensitivity of the detection algorithm. These problems are addressed in this work by means of a new technique based on empirical mode decomposition, which accomplishes both drift estimation and target detection. Given that background estimation is the most important stage for detection, a preliminary denoising step enabling better drift estimation is designed. Comparisons are conducted against a denoising technique based on the wavelet transform and also against traditional drift estimation methods such as Kalman filtering and running average. The simulation results show that the proposed scheme has superior performance.
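    A simplified per-pixel version of the drift-estimation-plus-detection idea might look like the Python sketch below; the EMD routine is assumed to be available externally, and the split between 'fast' IMFs (noise and transient targets) and 'slow' IMFs (drift) as well as the detection threshold are illustrative assumptions.

```python
import numpy as np

def detect_targets(pixel_series, emd, n_fast=2, k=4.0):
    """Per-pixel target detection from a temporal intensity series.

    emd : callable returning IMFs ordered fine-to-coarse (assumed available).
    The slow IMFs plus residue estimate the sensor drift/background; the fast
    IMFs carry noise and possible fast-moving targets, which are flagged where
    they exceed k times a robust noise scale. n_fast and k are illustrative.
    """
    imfs = emd(pixel_series)
    fast = imfs[:n_fast].sum(axis=0)       # noise + transient targets
    drift = imfs[n_fast:].sum(axis=0)      # estimated background drift
    sigma = 1.4826 * np.median(np.abs(fast - np.median(fast)))  # robust scale (MAD)
    detections = np.abs(fast) > k * sigma
    return detections, drift
```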

  2. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies the region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth of field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it publicly available. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to implement an optimal gaze tracking system more easily. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768

  3. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators that can be validated and adopted by higher education institutions to evaluate the quality of teaching and learning and to serve as evaluation criteria for human resource management and development in Thai higher education institutions. The main purpose of this study was to develop empirical indicators of a…

  4. Biological clues on neuronal degeneration based on theoretical fits of decay patterns: towards a mathematical neuropathology.

    PubMed

    Triarhou, Lazaros C

    2010-01-01

    The application of the best mathematical fit to quantitative data on cell death over time in models of nervous abiotrophies can yield useful clues as to the cellular properties of degenerative processes. We review data obtained in two neurogenetic models of movement disorders in the laboratory mouse, the 'Purkinje cell degeneration' (pcd) mutant, a model of cerebellar ataxia, and the 'weaver' (wv) mutant, a combined degeneration of multiple systems including the mesostriatal dopaminergic pathway. In the cerebellum of pcd mice, analyses of transsynaptic granule cell death subsequent to the genetically determined degeneration of Purkinje cells show that granule neuron fallout follows a typical pattern of exponential decay. In the midbrain of weaver mice, regression fits show that dopaminergic neuron fallout combines two independent components, an initial exponential decay superseded by a linear regression, with a threshold around 100 days. The biological connotations of such analyses are discussed in light of the empirical observations and the theoretical simulation models. The theoretical connotations may link neuron loss to specific cellular idiosyncrasies in elucidating the pathogenesis of chronic neurodegenerative disorders, including Parkinson's disease. PMID:20383806
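    The two decay patterns described above can be reproduced with standard nonlinear least-squares fits; the Python sketch below uses scipy.optimize.curve_fit with invented placeholder counts, so the numbers serve only to show how the exponential and exponential-plus-linear fits would be compared.

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential_decay(t, n0, k):
    """Pure exponential fallout (as reported for pcd granule cells)."""
    return n0 * np.exp(-k * t)

def exp_plus_linear(t, n0, k, a, b):
    """Exponential component superseded by a linear component (weaver-type pattern)."""
    return n0 * np.exp(-k * t) + a - b * t

# placeholder data: age (days) vs surviving neuron number (illustrative only)
t = np.array([10, 30, 60, 100, 150, 200, 300], dtype=float)
n = np.array([9500, 7200, 5100, 3600, 3000, 2600, 1900], dtype=float)

popt_exp, _ = curve_fit(exponential_decay, t, n, p0=(10000, 0.01), maxfev=20000)
popt_mix, _ = curve_fit(exp_plus_linear, t, n, p0=(8000, 0.02, 3000, 5.0), maxfev=20000)

# compare fits by residual sum of squares to pick the better decay model
for name, model, popt in [("exponential", exponential_decay, popt_exp),
                          ("exp + linear", exp_plus_linear, popt_mix)]:
    rss = np.sum((n - model(t, *popt)) ** 2)
    print(name, popt, "RSS =", rss)
```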

  5. Evaluation of Physically and Empirically Based Models for the Estimation of Green Roof Evapotranspiration

    NASA Astrophysics Data System (ADS)

    Digiovanni, K. A.; Montalto, F. A.; Gaffin, S.; Rosenzweig, C.

    2010-12-01

    Green roofs and other urban green spaces can provide a variety of valuable benefits, including reduction of the urban heat island effect, reduction of stormwater runoff, carbon sequestration, oxygen generation, and air pollution mitigation. As many of these benefits are directly linked to the processes of evaporation and transpiration, accurate and representative estimation of urban evapotranspiration (ET) is a necessary tool for predicting and quantifying such benefits. However, many common ET estimation procedures were developed for agricultural applications and thus carry inherent assumptions that may only rarely be applicable to urban green spaces. Various researchers have identified the estimation of expected urban ET rates as a critical yet poorly studied component of urban green space performance prediction, and they note that further evaluation is needed to reconcile differences in predictions from varying ET modeling approaches. A small-scale green roof lysimeter setup situated on the green roof of the Ethical Culture Fieldston School in the Bronx, NY has been the focus of ongoing monitoring initiated in June 2009. The experimental setup includes a 0.6 m by 1.2 m lysimeter replicating the anatomy of the 500 m² green roof of the building, with a roof membrane, drainage layer, 10 cm media depth, and a planting of a variety of Sedum species. Soil moisture and qualitative runoff measurements are also recorded in the lysimeter, while a weather station situated on the rooftop records climatological data. Direct quantification of actual evapotranspiration (AET) from the green roof weighing lysimeter was achieved through a mass balance approach during periods without precipitation and drainage. A comparison of AET to estimates of potential evapotranspiration (PET) calculated from empirically and physically based ET models was performed in order to evaluate the applicability of conventional ET equations for the estimation of ET from green roofs. Results have
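    The mass-balance AET computation and a simple empirical PET estimate can be sketched as follows in Python; the lysimeter area is taken from the dimensions quoted above, the mass series is invented, and the Hargreaves equation stands in for whichever empirical PET models were actually evaluated in the study.

```python
import numpy as np

def aet_from_lysimeter(mass_kg, area_m2=0.6 * 1.2):
    """Daily AET (mm) from lysimeter mass loss during rain- and drainage-free days.

    1 kg of water over 1 m^2 corresponds to a 1 mm depth, so AET = -(delta mass) / area.
    """
    return -np.diff(mass_kg) / area_m2

def pet_hargreaves(tmin_c, tmax_c, ra_mm_day):
    """Hargreaves reference ET (mm/day); one common empirical PET model.

    ra_mm_day is extraterrestrial radiation expressed as an equivalent
    evaporation depth; tmin/tmax are the daily air temperature extremes.
    """
    tmean = (tmin_c + tmax_c) / 2.0
    return 0.0023 * ra_mm_day * np.sqrt(tmax_c - tmin_c) * (tmean + 17.8)

# illustrative values only
mass = np.array([55.20, 54.95, 54.72, 54.51])   # daily lysimeter mass (kg)
print("AET (mm/day):", aet_from_lysimeter(mass))
print("PET (mm/day):", pet_hargreaves(np.array([18.0]), np.array([29.0]), np.array([15.3])))
```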

  6. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
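    The propensity score-based matched analysis can be outlined as in the Python sketch below; the column names, covariate set, and caliper are illustrative assumptions, and the matching is 1:1 nearest-neighbor with replacement rather than necessarily the exact scheme used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def ps_match(df, treatment="inadequate_therapy", outcome="death_30d",
             covariates=("age", "charlson", "pitt", "neutropenia", "severe_sepsis"),
             caliper=0.05):
    """1:1 nearest-neighbor matching on the propensity score.

    df : pandas DataFrame with one row per BSI episode (column names are hypothetical).
    """
    X, t = df[list(covariates)].to_numpy(), df[treatment].to_numpy()
    # propensity score: probability of receiving inadequate empirical therapy
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    treated, controls = np.flatnonzero(t == 1), np.flatnonzero(t == 0)
    nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
    dist, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
    keep = dist.ravel() <= caliper                         # discard poor matches
    matched_t, matched_c = treated[keep], controls[idx.ravel()[keep]]

    # crude mortality comparison within the matched cohort
    risk_t = df[outcome].to_numpy()[matched_t].mean()
    risk_c = df[outcome].to_numpy()[matched_c].mean()
    return risk_t, risk_c, int(keep.sum())
```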

  7. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

    An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  8. A hybrid filtering method based on a novel empirical mode decomposition for friction signals

    NASA Astrophysics Data System (ADS)

    Li, Chengwei; Zhan, Liwei

    2015-12-01

    During a measurement, the measured signal usually contains noise. To remove the noise and preserve the important features of the signal, we introduce a hybrid filtering method that uses a new intrinsic mode function (NIMF) and a modified Hausdorff distance. The NIMF is defined as the difference between the noisy signal and each intrinsic mode function (IMF), which is obtained by empirical mode decomposition (EMD), ensemble EMD, complementary ensemble EMD, or complete ensemble EMD with adaptive noise (CEEMDAN). The relevant mode selection is based on the similarity between the first NIMF and the rest of the NIMFs. With this filtering method, the EMD and its improved versions are used to filter the simulation and friction signals. The friction signal between an airplane tire and the runway is recorded during a simulated airplane touchdown and features spikes of various amplitudes and noise. The filtering effectiveness of the four hybrid filtering methods is compared and discussed. The results show that the filtering method based on CEEMDAN outperforms the other signal filtering methods.
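    One plausible reading of the NIMF-based selection rule is sketched below in Python: NIMFs are formed by subtracting each IMF from the noisy signal, a modified Hausdorff distance measures how strongly each NIMF departs from the first one, and the filtered signal keeps only the IMFs beyond the largest jump in that distance. The cutoff heuristic is an assumption for illustration, not necessarily the criterion used in the paper.

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff distance between two signals viewed as 2-D point sets.

    Crude (index, value) embedding, O(n^2) memory; fine for short demo signals.
    """
    pa = np.column_stack([np.arange(a.size), a])
    pb = np.column_stack([np.arange(b.size), b])
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

def hybrid_filter(x, imfs):
    """Hybrid EMD filter using NIMFs (heuristic cutoff, for illustration only).

    x    : 1-D noisy signal
    imfs : array (n_imfs, n_samples) of precomputed IMFs, fine-to-coarse
    """
    nimfs = x - imfs                                   # one NIMF per IMF (broadcast)
    dists = np.array([modified_hausdorff(nimfs[0], nimf) for nimf in nimfs[1:]])
    if dists.size >= 2:
        # largest jump in dissimilarity marks the first signal-dominated IMF
        cutoff = int(np.argmax(np.diff(dists))) + 2
    else:
        cutoff = 1
    return imfs[cutoff:].sum(axis=0)
```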

  9. Feasibility of an empirically based program for parents of preschoolers with autism spectrum disorder.

    PubMed

    Dababnah, Sarah; Parish, Susan L

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, The Incredible Years, tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the intervention. Quantitative assessments of the program revealed fidelity was generally maintained, with the exception of program-specific videos. Qualitative data from individual post-intervention interviews reported parents benefited most from child emotion regulation strategies, play-based child behavior skills, parent stress management, social support, and visual resources. More work is needed to further refine the program to address parent self-care, partner relationships, and the diverse behavioral and communication challenges of children across the autism spectrum. Furthermore, parent access and retention could potentially be increased by providing in-home childcare vouchers and a range of times and locations in which to offer the program. The findings suggest The Incredible Years is a feasible intervention for parents seeking additional support for child- and family-related challenges and offers guidance to those communities currently using The Incredible Years or other related parenting programs with families of children with autism spectrum disorder. PMID:25717131

  10. Empirical mode decomposition-based motion artifact correction method for functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Gu, Yue; Han, Junxia; Liang, Zhenhu; Yan, Jiaqing; Li, Zheng; Li, Xiaoli

    2016-01-01

    Functional near-infrared spectroscopy (fNIRS) is a promising technique for monitoring brain activity. However, it is sensitive to motion artifacts. Many methods have been developed for motion correction, such as spline interpolation, wavelet filtering, and kurtosis-based wavelet filtering. We propose a motion correction method based on empirical mode decomposition (EMD), which is applied to segments of data identified as having motion artifacts. The EMD method is adaptive, data-driven, and well suited for nonstationary data. To test the performance of the proposed EMD method and to compare it with other motion correction methods, we used simulated hemodynamic responses added to real resting-state fNIRS data. The EMD method reduced mean squared error in 79% of channels and increased signal-to-noise ratio in 78% of channels. Moreover, it produced the highest Pearson's correlation coefficient between the recovered signal and the original signal, significantly better than the comparison methods (p<0.01, paired t-test). These results indicate that the proposed EMD method is a first choice method for motion artifact correction in fNIRS.
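    A simplified segment-wise sketch of the idea is given below in Python; the EMD routine is assumed to be supplied externally, and the rule for discarding artifact-dominated IMFs (standard deviation much larger than that of the clean portion of the channel) is an illustrative criterion rather than the one used in the paper.

```python
import numpy as np

def contiguous_runs(mask):
    """Yield (start, stop) index pairs of contiguous True runs in a boolean mask."""
    idx = np.flatnonzero(np.diff(np.r_[0, mask.astype(int), 0]))
    return zip(idx[::2], idx[1::2])

def correct_motion_segments(signal, artifact_mask, emd, keep_ratio=3.0):
    """Segment-wise EMD-based motion artifact correction (simplified sketch).

    signal        : 1-D fNIRS channel time series
    artifact_mask : boolean array marking samples flagged as motion-contaminated
    emd           : callable returning IMFs ordered fine-to-coarse (assumed available)
    """
    corrected = signal.copy()
    clean_std = signal[~artifact_mask].std()
    for start, stop in contiguous_runs(artifact_mask):
        imfs = emd(signal[start:stop])
        # keep only IMFs whose amplitude is comparable to the clean signal
        keep = [imf for imf in imfs if imf.std() <= keep_ratio * clean_std]
        corrected[start:stop] = np.sum(keep, axis=0) if keep else signal[start:stop].mean()
    return corrected
```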