Science.gov

Sample records for 5-point likert-type scale

  1. Reporting and Interpreting Scores Derived from Likert-Type Scales

    ERIC Educational Resources Information Center

    Warmbrod, J. Robert

    2014-01-01

    Forty-nine percent of the 706 articles published in the "Journal of Agricultural Education" from 1995 to 2012 reported quantitative research with at least one variable measured by a Likert-type scale. Grounded in the classical test theory definition of reliability and the tenets basic to Likert-scale measurement methodology, for the…

  2. Using Likert-Type Scales in the Social Sciences

    ERIC Educational Resources Information Center

    Croasmun, James T.; Ostrom, Lee

    2011-01-01

Likert scales are useful in social science and attitude research projects. The General Self-Efficacy Exam is a test used to determine whether factors in educational settings affect participants' learning self-efficacy. The original instrument had 10 efficacy items and used a 4-point Likert scale. The Cronbach's alphas for the original test ranged…

  3. The Effects of Halo and Leniency on Cooperating Teacher Reports Using Likert-Type Rating Scales.

    ERIC Educational Resources Information Center

    Phelps, LeAdelle; And Others

    1986-01-01

    This study evaluated the adequacy of a Likert-type scale cooperating teacher report. Results demonstrated a significant presence of leniency error and halo effect, leaving highly questionable the validity of the report as a whole. Findings are discussed. (Author/MT)

  4. Examining Perceptions and Attitudes: A Review of Likert-Type Scales Versus Q-Methodology.

    PubMed

    Ho, Grace W K

    2016-07-24

The purpose of this article is to compare and discuss the use of Likert-type scales and Q-methodology to examine perceptions and attitudes in nursing research. This article provides a brief review of each approach, and how they have been used to advance our knowledge in health-related perceptions and attitudes. Although Likert-type scales are economical, efficient, and easy to analyze, the results can be difficult to interpret or translate into meaningful practice. In contrast, Q-methodology yields holistic and in-depth information on what the prevailing perceptions and attitudes are, but its conduct is logistically challenging and the generalizability of its results can be limited. The appropriate uses of either or both approaches to answer different research questions will be discussed. Nurse scientists are called upon to continue our exploration, utilization, and expansion of unique methodologies that directly speak to a meaningful examination of these important constructs in nursing research.

  5. A Review of the Reliability and Validity of Likert-Type Scales for People with Intellectual Disability

    ERIC Educational Resources Information Center

    Hartley, S. L.; MacLean, W. E., Jr.

    2006-01-01

    Background: Likert-type scales are increasingly being used among people with intellectual disability (ID). These scales offer an efficient method for capturing a wide range of variance in self-reported attitudes and behaviours. This review is an attempt to evaluate the reliability and validity of Likert-type scales in people with ID. Methods:…

  6. How Differences among Data Collectors Are Reflected in the Reliability and Validity of Data Collected by Likert-Type Scales?

    ERIC Educational Resources Information Center

    Köksal, Mustafa Serdar; Ertekin, Pelin; Çolakoglu, Özgür Murat

    2014-01-01

    The purpose of this study is to investigate association of data collectors' differences with the differences in reliability and validity of scores regarding affective variables (motivation toward science learning and science attitude) that are measured by Likert-type scales. Four researchers trained in data collection and seven science teachers…

  7. A Psychometric Evaluation of 4-Point and 6-Point Likert-Type Scales in Relation to Reliability and Validity.

    ERIC Educational Resources Information Center

    Chang, Lei

    1994-01-01

    Reliability and validity of 4-point and 6-point scales were assessed using a new model-based approach to fit empirical data from 165 graduate students completing an attitude measure. Results suggest that the issue of four- versus six-point scales may depend on the empirical setting. (SLD)

  8. A Two-Decision Model for Responses to Likert-Type Items

    ERIC Educational Resources Information Center

    Thissen-Roe, Anne; Thissen, David

    2013-01-01

    Extreme response set, the tendency to prefer the lowest or highest response option when confronted with a Likert-type response scale, can lead to misfit of item response models such as the generalized partial credit model. Recently, a series of intrinsically multidimensional item response models have been hypothesized, wherein tendency toward…

  9. Ramsay Curve IRT for Likert-Type Data

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2007-01-01

    Ramsay curve item response theory (RC-IRT) was recently developed to detect and correct for nonnormal latent variables when unidimensional IRT models are fitted to data using maximum marginal likelihood estimation. The purpose of this research is to evaluate the performance of RC-IRT for Likert-type item responses with varying test lengths, sample…

  10. Estimating Ordinal Reliability for Likert-Type and Ordinal Item Response Data: A Conceptual, Empirical, and Practical Guide

    ERIC Educational Resources Information Center

    Gadermann, Anne M.; Guhn, Martin; Zumbo, Bruno D.

    2012-01-01

    This paper provides a conceptual, empirical, and practical guide for estimating ordinal reliability coefficients for ordinal item response data (also referred to as Likert, Likert-type, ordered categorical, or rating scale item responses). Conventionally, reliability coefficients, such as Cronbach's alpha, are calculated using a Pearson…
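    The Pearson-based convention the abstract refers to can be sketched in a few lines: Cronbach's alpha computed directly from raw item responses. The data below are hypothetical, and an ordinal alpha would instead substitute a polychoric correlation matrix (typically via a dedicated psychometrics package); this sketch shows only the conventional calculation.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Conventional (Pearson-based) Cronbach's alpha from raw responses.

        items: (n_respondents, n_items) array of Likert-type responses.
        """
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)        # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 5-point Likert responses: 6 respondents, 4 items
    data = np.array([
        [4, 5, 4, 4],
        [2, 1, 2, 2],
        [3, 3, 3, 4],
        [5, 5, 4, 5],
        [1, 2, 1, 1],
        [3, 4, 3, 3],
    ])
    print(round(cronbach_alpha(data), 2))  # → 0.97
    ```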

  11. The Response Scale for the Intellectual Disability Module of the WHOQOL: 5-Point or 3-Point?

    ERIC Educational Resources Information Center

    Fang, J.; Fleck, M. P.; Green, A.; McVilly, K.; Hao, Y.; Tan, W.; Fu, R.; Power, M.

    2011-01-01

    Objective: To deal with the question of whether a 5-point response Likert scale should be changed to a 3-point scale when used in the field testing of people with intellectual disabilities (IDs), which was raised after the pilot study of World Health Organization Quality of Life (WHOQOL)-DIS, a module being developed with the World Health…

  12. A Comparison of Anchor-Item Designs for the Concurrent Calibration of Large Banks of Likert-Type Items

    ERIC Educational Resources Information Center

    Garcia-Perez, Miguel A.; Alcala-Quintana, Rocio; Garcia-Cueto, Eduardo

    2010-01-01

    Current interest in measuring quality of life is generating interest in the construction of computerized adaptive tests (CATs) with Likert-type items. Calibration of an item bank for use in CAT requires collecting responses to a large number of candidate items. However, the number is usually too large to administer to each subject in the…

  13. Value-Eroding Teacher Behaviors Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Arseven, Zeynep; Kiliç, Abdurrahman; Sahin, Seyma

    2016-01-01

The present study aims to develop a valid and reliable scale for determining teachers' value-eroding behaviors and, by extension, their value judgments. The items of the "Value-eroding Teacher Behaviors Scale" were designed in the form of a 5-point Likert-type rating scale. The exploratory factor analysis (EFA) was conducted to…

  14. Preparing Attitude Scale to Define Students' Attitudes about Environment, Recycling, Plastic and Plastic Waste

    ERIC Educational Resources Information Center

    Avan, Cagri; Aydinli, Bahattin; Bakar, Fatma; Alboga, Yunus

    2011-01-01

The aim of this study is to introduce an attitude scale in order to define students' attitudes about the environment, recycling, plastics, and plastic waste. In this study, 80 attitude statements using a 5-point Likert-type scale were prepared and administered to 492 sixth-grade students in the city center of Kastamonu, Turkey. The scale consists of…

  15. A mixed-binomial model for Likert-type personality measures.

    PubMed

    Allik, Jüri

    2014-01-01

    Personality measurement is based on the idea that values on an unobservable latent variable determine the distribution of answers on a manifest response scale. Typically, it is assumed in the Item Response Theory (IRT) that latent variables are related to the observed responses through continuous normal or logistic functions, determining the probability with which one of the ordered response alternatives on a Likert-scale item is chosen. Based on an analysis of 1731 self- and other-rated responses on the 240 NEO PI-3 questionnaire items, it was proposed that a viable alternative is a finite number of latent events which are related to manifest responses through a binomial function which has only one parameter-the probability with which a given statement is approved. For the majority of items, the best fit was obtained with a mixed-binomial distribution, which assumes two different subpopulations who endorse items with two different probabilities. It was shown that the fit of the binomial IRT model can be improved by assuming that about 10% of random noise is contained in the answers and by taking into account response biases toward one of the response categories. It was concluded that the binomial response model for the measurement of personality traits may be a workable alternative to the more habitual normal and logistic IRT models.

  16. A mixed-binomial model for Likert-type personality measures

    PubMed Central

    Allik, Jüri

    2014-01-01

    Personality measurement is based on the idea that values on an unobservable latent variable determine the distribution of answers on a manifest response scale. Typically, it is assumed in the Item Response Theory (IRT) that latent variables are related to the observed responses through continuous normal or logistic functions, determining the probability with which one of the ordered response alternatives on a Likert-scale item is chosen. Based on an analysis of 1731 self- and other-rated responses on the 240 NEO PI-3 questionnaire items, it was proposed that a viable alternative is a finite number of latent events which are related to manifest responses through a binomial function which has only one parameter—the probability with which a given statement is approved. For the majority of items, the best fit was obtained with a mixed-binomial distribution, which assumes two different subpopulations who endorse items with two different probabilities. It was shown that the fit of the binomial IRT model can be improved by assuming that about 10% of random noise is contained in the answers and by taking into account response biases toward one of the response categories. It was concluded that the binomial response model for the measurement of personality traits may be a workable alternative to the more habitual normal and logistic IRT models. PMID:24847291
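    As a rough illustration of the model the two records above describe (not the authors' code, and with made-up counts rather than the NEO PI-3 responses), a two-component binomial mixture for a 5-category item can be fit by maximum likelihood:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import binom

    # Responses on a 5-point Likert item coded 0..4, i.e. m = 4 "trials".
    # Hypothetical frequency data, purely for illustration.
    m = 4
    responses = np.array([0] * 10 + [1] * 25 + [2] * 20 + [3] * 30 + [4] * 15)

    def neg_log_lik(params):
        """Negative log-likelihood of a two-component binomial mixture."""
        w = 1 / (1 + np.exp(-params[0]))   # mixing weight, kept in (0, 1)
        p1 = 1 / (1 + np.exp(-params[1]))  # endorsement probability, group 1
        p2 = 1 / (1 + np.exp(-params[2]))  # endorsement probability, group 2
        lik = w * binom.pmf(responses, m, p1) + (1 - w) * binom.pmf(responses, m, p2)
        return -np.log(lik).sum()

    fit = minimize(neg_log_lik, x0=[0.0, -1.0, 1.0], method="Nelder-Mead")
    w, p1, p2 = 1 / (1 + np.exp(-fit.x))
    print(f"weight={w:.2f}  p1={p1:.2f}  p2={p2:.2f}")
    ```

    The logistic transform keeps all three parameters in (0, 1) during unconstrained optimization; the article's additional refinements (a random-noise component and response-bias terms) are omitted here.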

  17. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  18. A Mathematical Approach in Evaluating Biotechnology Attitude Scale: Rough Set Data Analysis

    ERIC Educational Resources Information Center

    Narli, Serkan; Sinan, Olcay

    2011-01-01

    Individuals' thoughts and attitudes towards biotechnology have been investigated in many countries. A Likert-type scale is the most commonly used scale to measure attitude. However, the weak side of a likert-type scale is that different responses may produce the same score. The Rough set method has been regarded to address this shortcoming. A…

  19. Designing the Nuclear Energy Attitude Scale.

    ERIC Educational Resources Information Center

    Calhoun, Lawrence; And Others

    1988-01-01

    Presents a refined method for designing a valid and reliable Likert-type scale to test attitudes toward the generation of electricity from nuclear energy. Discusses various tests of validity that were used on the nuclear energy scale. Reports results of administration and concludes that the test is both reliable and valid. (CW)

  20. An Exploration of the Validity of the Unbounded Write-In Scale.

    ERIC Educational Resources Information Center

    Stapleton, Laura M.; Edmonds, Meaghan

    An exploratory reliability and validity study was conducted of a relatively new response scale developed in the marketing field. Unlike many Likert-type scales, the "unbounded write-in" scale is claimed to produce distributions that more closely approximate normal distributions. This type of scale has been used in large-scale marketing studies.…

  1. The Role of Reading Comprehension in Responses to Positively and Negatively Worded Items on Rating Scales

    ERIC Educational Resources Information Center

    Weems, Gail H.; Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2006-01-01

    Should instruments, such as Likert-type scales, contain both positively worded and negatively worded items within the same scale (i.e. mixed format)? Recent evidence suggests that the use of scales with a mixed format can adversely affect the psychometric properties of scales. In particular, the mean item response to the positively worded items…

  2. Development of an Attitude Scale towards High School Physics Lessons

    ERIC Educational Resources Information Center

    Yavas, Pervin Ünlü; Çagan, Sultan

    2017-01-01

    The aim of this study was to develop a Likert type attitude scale for high school students with regard to high school physics lessons. The research was carried out with high school students who were studying in Ankara. First, the opinions of 105 high school students about physics lessons were obtained and then 55 scale items were determined from…

  3. Factor Analysis of the Omega Scale: A Scale Designed To Measure the Attitudes of College Students toward Their Own Deaths and the Disposition of Their Bodies.

    ERIC Educational Resources Information Center

    Staik, Irene M.

    A study was undertaken to provide a factor analysis of the Omega Scale, a 25-item, Likert-type scale developed in 1984 to assess attitudes toward death and funerals and other body disposition practices. The Omega Scale was administered to 250 students enrolled in introductory psychology classes at two higher education institutions in Alabama.…

  4. The Intuitive Eating Scale: Development and Preliminary Validation

    ERIC Educational Resources Information Center

    Hawks, Steven; Merrill, Ray M.; Madanat, Hala N.

    2004-01-01

    This article describes the development and validation of an instrument designed to measure the concept of intuitive eating. To ensure face and content validity for items used in the Likert-type Intuitive Eating Scale (IES), content domain was clearly specified and a panel of experts assessed the validity of each item. Based on responses from 391…

  5. The Development of a Behavior Patterns Rating Scale for Preservice Teachers

    ERIC Educational Resources Information Center

    Caliskan, Nihat; Kuzu, Okan; Kuzu, Yasemin

    2017-01-01

The purpose of this study was to develop a rating scale that can be used to evaluate the behavior patterns of preservice teachers (PSTs) within the organization people pattern. By reviewing the related literature on people patterns, a preliminary scale of 38 items with a five-point Likert-type format was prepared. The number of items was reduced to 29 after…

  6. Development of a Scale for Measuring Teachers' Attitudes toward Students' Inappropriate Behaviour

    ERIC Educational Resources Information Center

    Malak, Md. Saiful; Sharma, Umesh; Deppeler, Joanne M.

    2017-01-01

    This study aimed at developing a valid and reliable instrument for measuring attitudes of primary schoolteachers toward inappropriate student behaviour. A systematic approach was used to develop the scale. Results provide preliminary evidence that the new instrument (consisting of 13 items on a six-point Likert type scale) meets the standards for…

  7. A Cross-Cultural Validation Study of the Computer Attitude Scale.

    ERIC Educational Resources Information Center

    Kim, JinGyu; And Others

    The reliability and factorial validity of the Computer Attitudes Scale (CAS) was assessed with college students in South Korea. The CAS was developed for use with high school students, but has been used in higher education in the United States. It is a Likert-type scale of 30 positive and negative statements about the use of computers, and is one…

  8. Determination of Reliability and Validity for Myself as a Teacher Scale.

    ERIC Educational Resources Information Center

    Handley, Herbert M.; Thomson, James R., Jr.

    The reliability and validity of the Myself as a Teacher Scale (MTS), developed to assess the self-concept of teachers, were studied. Materials developed by David P. Butts and Robert Howe were used to construct a 62-item Likert-type scale asking individuals to rate themselves on certain criteria. After a pilot study with 92 preservice teachers and…

  9. Retrospective Assessment of Childhood Sexual and Physical Abuse: A Comparison of Scaled and Behaviorally Specific Approaches

    ERIC Educational Resources Information Center

    DiLillo, David; Fortier, Michelle A.; Hayes, Sarah A.; Trask, Emily; Perry, Andrea R.; Messman-Moore, Terri; Fauchier, Angele; Nash, Cindy

    2006-01-01

    This study compared retrospective reports of childhood sexual and physical abuse as assessed by two measures: the Childhood Trauma Questionnaire (CTQ), which uses a Likert-type scaling approach, and the Computer Assisted Maltreatment Inventory (CAMI), which employs a behaviorally specific means of assessment. Participants included 1,195…

  10. In Search of the Optimal Number of Response Categories in a Rating Scale

    ERIC Educational Resources Information Center

    Lee, Jihyun; Paek, Insu

    2014-01-01

    Likert-type rating scales are still the most widely used method when measuring psychoeducational constructs. The present study investigates a long-standing issue of identifying the optimal number of response categories. A special emphasis is given to categorical data, which were generated by the Item Response Theory (IRT) Graded-Response Modeling…

  11. The Development of a Competence Scale for Learning Science: Inquiry and Communication

    ERIC Educational Resources Information Center

    Chang, Huey-Por; Chen, Chin-Chang; Guo, Gwo-Jen; Cheng, Yeong-Jin; Lin, Chen-Yung; Jen, Tsung-Hau

    2011-01-01

    The objective of this study was to develop an instrument to measure school students' competence in learning science as part of a large research project in Taiwan. The instrument consisted of 29 self-report, Likert-type items divided into 2 scales: Competence in Scientific Inquiry and Competence in Communication. The Competence in Scientific…

  12. Reliability and validity of the Student Perceptions of School Cohesion Scale in a sample of Salvadoran secondary school students

    PubMed Central

    2009-01-01

    Background Despite a growing body of research from the United States and other industrialized countries on the inverse association between supportive social relationships in the school and youth risk behavior engagement, research on the measurement of supportive school social relationships in Central America is limited. We examined the psychometric properties of the Student Perceptions of School Cohesion (SPSC) scale, a 10-item scale that asks students to rate with a 5-point Likert-type response scale their perceptions of the school social environment, in a sample of public secondary school students (mean age = 15 years) living in central El Salvador. Methods Students (n = 982) completed a self-administered questionnaire that included the SPSC scale along with measures of youth health risk behaviors based on the Center for Disease Control and Prevention's Youth Risk Behavior Survey. Exploratory factor analysis was used to assess the factor structure of the scale, and two internal consistency estimates of reliability were computed. Construct validity was assessed by examining whether students who reported low school cohesion were significantly more likely to report physical fighting and illicit drug use. Results Results indicated that the SPSC scale has three latent factors, which explained 61.6% of the variance: supportive school relationships, student-school connectedness, and student-teacher connectedness. The full scale and three subscales had good internal consistency (rs = .87 and α = .84 for the full scale; rs and α between .71 and .75 for the three subscales). Significant associations were found between the full scale and all three subscales with physical fighting (p ≤ .001) and illicit drug use (p < .05). Conclusion Findings provide evidence of reliability and validity of the SPSC for the measurement of supportive school relationships in Latino adolescents living in El Salvador. These findings provide a foundation for further research on school cohesion

  13. Appropriate Statistical Analysis for Two Independent Groups of Likert-Type Data

    ERIC Educational Resources Information Center

    Warachan, Boonyasit

    2011-01-01

    The objective of this research was to determine the robustness and statistical power of three different methods for testing the hypothesis that ordinal samples of five and seven Likert categories come from equal populations. The three methods are the two sample t-test with equal variances, the Mann-Whitney test, and the Kolmogorov-Smirnov test. In…
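    The three procedures compared in the record above can be run with scipy; the data here are random placeholders standing in for two independent groups of 5-category responses, not the study's samples:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Two hypothetical independent groups of 5-point Likert responses (1..5)
    group_a = rng.integers(1, 6, size=40)
    group_b = rng.integers(1, 6, size=40)

    t_res = stats.ttest_ind(group_a, group_b, equal_var=True)  # two-sample t-test
    u_res = stats.mannwhitneyu(group_a, group_b)               # Mann-Whitney U
    ks_res = stats.ks_2samp(group_a, group_b)                  # Kolmogorov-Smirnov

    for name, res in [("t", t_res), ("U", u_res), ("KS", ks_res)]:
        print(f"{name}: statistic={res.statistic:.3f}, p={res.pvalue:.3f}")
    ```

    With heavily tied ordinal data like this, scipy falls back to asymptotic (tie-corrected) p-values for the rank-based tests, which is one reason robustness studies like the one above matter.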

  14. A method for estimating spikelet number per panicle: Integrating image analysis and a 5-point calibration model.

    PubMed

    Zhao, Sanqin; Gu, Jiabing; Zhao, Youyong; Hassan, Muhammad; Li, Yinian; Ding, Weimin

    2015-11-06

Spikelet number per panicle (SNPP) is one of the most important yield components used to estimate rice yields. The use of high-throughput quantitative image analysis methods for understanding the diversity of the panicle has increased rapidly. However, it is difficult to simultaneously extract panicle branch and spikelet/grain information from images at the same resolution due to the different scales of these traits. To use a lower resolution and still meet the accuracy requirement, we proposed an interdisciplinary method that integrates image analysis and a 5-point calibration model to rapidly estimate SNPP. First, a linear relationship model between the total length of the primary branch (TLPB) and the SNPP was established based on the physiological characteristics of the panicle. Second, the TLPB and area (the primary branch region) traits were rapidly extracted by developing an image analysis algorithm. Finally, a 5-point calibration method was adopted to improve the universality of the model. With the proposed method, the SNPP estimation error was less than 10% for more than 90% of the panicle samples, an accuracy consistent with manual measurements. The proposed method uses available concepts and techniques for automated estimation of rice yield information.

  15. Using Visual Analogue Scales in eHealth: Non-Response Effects in a Lifestyle Intervention

    PubMed Central

    Reips, Ulf-Dietrich; Wienert, Julian; Lippke, Sonia

    2016-01-01

    Background Visual analogue scales (VASs) have been shown to be valid measurement instruments and a better alternative to Likert-type scales in Internet-based research, both empirically and theoretically [1,2]. Upsides include more differentiated responses, better measurement level, and less error. Their feasibility and properties in the context of eHealth, however, have not been examined so far. Objective The present study examined VASs in the context of a lifestyle study conducted online, measuring the impact of VASs on distributional properties and non-response. Method A sample of 446 participants with a mean age of 52.4 years (standard deviation (SD) = 12.1) took part in the study. The study was carried out as a randomized controlled trial, aimed at supporting participants over 8 weeks with an additional follow-up measurement. In addition to the randomized questionnaire, participants were further randomly assigned to either a Likert-type or VAS response scale version of the measures. Results Results showed that SDs were lower for items answered via VASs, 2P (Y ≥ 47 | n=55, P=.5) < .001. Means did not differ across versions. Participants in the VAS version showed lower dropout rates than participants in the Likert version, odds ratio = 0.75, 90% CI (0.58-0.98), P=.04. Number of missing values did not differ between questionnaire versions. Conclusions The VAS is shown to be a valid instrument in the eHealth context, offering advantages over Likert-type scales. The results of the study provide further support for the use of VASs in Internet-based research, extending the scope to senior samples in the health context. Trial Registration Clinicaltrials.gov NCT01909349; https://clinicaltrials.gov/ct2/show/NCT01909349 (Archived by WebCite at http://www.webcitation.org/6h88sLw2Y) PMID:27334562

  16. Development of Attitudes Towards Homosexuality Scale for Indians (AHSI).

    PubMed

    Ahuja, Kanika K

    2017-02-02

    Attitudes towards homosexuality vary across cultures, with the legal and societal position being rather complicated in India. This study describes the process of developing and validating a Likert-type scale to assess attitudes toward homosexuality amongst heterosexuals. Phase 1 describes the development of the scale. Items were written based on thematic analysis of narratives generated from 50 college students and reviewing existing scales. After administering the 70-item scale to 68 participants, item analysis yielded 20 statements with item-total correlations over .70. Cronbach alpha was .97. In Phase 2, the 20-item Attitudes Towards Homosexuality Scale for Indians (AHSI) was administered to 142 participants. Analysis yielded a corrected split-half correlation of .91. Further, AHSI discriminated between women and men; between liberal arts and STEM/business students; and those who reported interpersonal contact with gay men and lesbian women and those who did not. The scale has satisfactory reliability and shows promising construct validity.

  17. Teacher Trainees' Strategies for Managing the Behaviours of Students with Special Needs

    ERIC Educational Resources Information Center

    Ali, Manisah Mohd.; Abdullah, Rozila; Majid, Rosadah Abdul

    2014-01-01

    This study aimed to determine how a group of teacher trainees handled challenging behaviour by students during teaching practice. A total of 35 teacher trainees from the special education programme of a local university were chosen as respondents. A questionnaire based on a 5-point Likert-type scale was administered in this study. The data were…

  18. Observing Coronal Mass Ejections from the Sun-Earth L5 Point

    NASA Astrophysics Data System (ADS)

    Gopalswamy, N.; Davila, J. M.; St Cyr, O. C.

    2013-12-01

Coronal mass ejections (CMEs) are the most energetic phenomena in the heliosphere and are known to be responsible for severe space weather. Most of the current knowledge on CMEs accumulated over the past few decades has been derived from observations made from the Sun-Earth line, which is not the ideal vantage point to observe Earth-affecting CMEs (Gopalswamy et al., 2011a,b). The STEREO mission viewed CMEs from points away from the Sun-Earth line and demonstrated the importance of such observations in understanding the three-dimensional structure of CMEs and their true kinematics. In this paper, we show that it is advantageous to observe CMEs from the Sun-Earth L5 point in studying CMEs that affect Earth. In particular, these observations are important in identifying that part of the CME that is likely to arrive at Earth. L5 observations are critical for several aspects of CME studies: (i) they can provide the near-Sun space speed of CMEs, which is an important input for modeling Earth-arriving CMEs, (ii) backside and frontside CMEs can be readily distinguished even without inner coronal imagers, and (iii) preceding CMEs in the path of Earth-affecting CMEs can be identified for a better estimate of the travel time, which may not be possible from the Sun-Earth line. We also discuss how the L5 vantage point compares with the Sun-Earth L4 point for observing Earth-affecting CMEs. References Gopalswamy, N., Davila, J. M., St. Cyr, O. C., Sittler, E. C., Auchère, F., Duvall, T. L., Hoeksema, J. T., Maksimovic, M., MacDowall, R. J., Szabo, A., Collier, M. R. (2011a), Earth-Affecting Solar Causes Observatory (EASCO): A potential International Living with a Star Mission from Sun-Earth L5 JASTP 73, 658-663, DOI: 10.1016/j.jastp.2011.01.013 Gopalswamy, N., Davila, J. M., Auchère, F., Schou, J., Korendyke, C. M. Shih, A., Johnston, J. C., MacDowall, R. J., Maksimovic, M., Sittler, E., et al. (2011b), Earth-Affecting Solar Causes Observatory (EASCO): a mission at

  19. Scales

    MedlinePlus

    Scales are a visible peeling or flaking of outer skin layers. These layers are called the stratum ... Scales may be caused by dry skin, certain inflammatory skin conditions, or infections. Eczema , ringworm , and psoriasis ...

  20. Quantification of the UK 5-point breast imaging classification and mapping to BI-RADS to facilitate comparison with international literature

    PubMed Central

    Taylor, K; Britton, P; O'Keeffe, S; Wallis, M G

    2011-01-01

    Objective The UK 5-point breast imaging scoring system, recently formalised by the Royal College of Radiologists Breast Group, does not specify the likelihood of malignancy in each category. The breast imaging and reporting data system (BI-RADS) is widely used throughout North America and much of Europe. The main purpose of this study is to quantify the cancer likelihood of each of the UK 5-point categories and map them to comparable BI-RADS categories to facilitate comparison with North American and European literature and publication of UK research abroad. Methods During the 8 year study period, mammogram and ultrasound results were UK scored and the percentage of cancer outcomes within each group calculated. These were then compared with the percentage incidence of the BI-RADS categories. Results Of 23 741 separate assessment episodes, 15 288 mammograms and 10 642 ultrasound examinations were evaluated. There was a direct correlation between UK scoring and BI-RADS for categories 1 and 5. UK Score 2 lipomas and simple cysts correlated with BI-RADS 2, with the remaining UK Score 2 lesions (mostly fibroadenomas) assigned to BI-RADS 3. BI-RADS 4 incorporates a wide range of cancer risk (2–95%) with subdivisions a, b and c indicating increasing, but unspecified, likelihood of malignancy. UK Score 3 correlated with BI-RADS 4 a/b and UK Score 4 corresponded with BI-RADS 4c. Conclusion This study quantifies the cancer likelihood of the UK scoring and maps them to parallel BI-RADS categories, with equivalent cancer risks. This facilitates the ability to share UK research data and clinical practice on an international scale. PMID:22011830
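    The category correspondence reported in the abstract above can be summarized as a simple lookup table (a sketch of the reported mapping only, not clinical guidance):

    ```python
    # UK 5-point breast imaging scores mapped to BI-RADS categories,
    # as quantified in the study abstract above. UK Score 2 splits by lesion type.
    uk_to_birads = {
        "1": "1",
        "2 (lipoma/simple cyst)": "2",
        "2 (other, e.g. fibroadenoma)": "3",
        "3": "4a/b",
        "4": "4c",
        "5": "5",
    }

    for uk, birads in uk_to_birads.items():
        print(f"UK {uk} -> BI-RADS {birads}")
    ```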

  1. Internal consistency of a five-item form of the Francis Scale of Attitude Toward Christianity among adolescent students.

    PubMed

    Campo-Arias, Adalberto; Oviedo, Heidi Celina; Cogollo, Zuleima

    2009-04-01

    The short form of the Francis Scale of Attitude Toward Christianity (L. J. Francis, 1992) is a 7-item Likert-type scale that shows high homogeneity among adolescents. The psychometric performance of a shorter version of this scale has not been explored. The authors aimed to determine the internal consistency of a 5-item form of the Francis Scale of Attitude Toward Christianity among 405 students from a school in Cartagena, Colombia. The authors computed Cronbach's alpha coefficient for the 5 items with the greatest corrected item-total score correlations. The version without Items 2 and 7 showed an internal consistency of .87. The 5-item version of the Francis Scale of Attitude Toward Christianity exhibited higher internal consistency than did the 7-item version. Future researchers should corroborate this finding.
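    The internal-consistency statistic used here follows the classical formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with invented 5-point item data (not the study's):

```python
# Cronbach's alpha from the classical formula. The item data below are
# invented for illustration; they are not from the study.
from statistics import variance

def cronbach_alpha(items):
    """items: k lists, each one item's scores across the same respondents."""
    k = len(items)
    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Five hypothetical 5-point Likert items answered by six respondents:
items = [
    [5, 4, 4, 2, 3, 5],
    [4, 4, 5, 1, 3, 5],
    [5, 3, 4, 2, 2, 4],
    [4, 4, 4, 1, 3, 5],
    [5, 5, 4, 2, 3, 4],
]
print(round(cronbach_alpha(items), 3))  # 0.958
```

    Dropping the weakest items (as the authors did with Items 2 and 7) can raise alpha when the remaining items are more homogeneous.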

  2. Scales

    ScienceCinema

    Murray Gibson

    2016-07-12

    Musical scales involve notes that, sounded simultaneously (chords), sound good together. The result is the left brain meeting the right brain — a Pythagorean interval of overlapping notes. This synergy would suggest less difference between the working of the right brain and the left brain than common wisdom would dictate. The pleasing sound of harmony comes when two notes share a common harmonic, meaning that their frequencies are in simple integer ratios, such as 3/2 (G/C) or 5/4 (E/C).
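    The shared-harmonic claim can be checked numerically: two notes whose frequencies sit in a 3/2 ratio have overtone series that coincide. The base frequency below is an illustrative choice (C taken as 264 Hz), not a value from the talk.

```python
# Notes in a simple integer frequency ratio share harmonics: here C and G
# (ratio 3/2). The base frequency is an illustrative assumption.
C = 264.0                    # hypothetical C (Hz)
G = C * 3 / 2                # perfect fifth above C (396 Hz)

c_harmonics = {C * n for n in range(1, 9)}   # first 8 overtones of C
g_harmonics = {G * n for n in range(1, 9)}   # first 8 overtones of G

shared = sorted(c_harmonics & g_harmonics)
print(shared)  # [792.0, 1584.0]: C's 3rd harmonic equals G's 2nd, and so on
```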

  3. Scales

    SciTech Connect

    Murray Gibson

    2007-04-27

    Musical scales involve notes that, sounded simultaneously (chords), sound good together. The result is the left brain meeting the right brain — a Pythagorean interval of overlapping notes. This synergy would suggest less difference between the working of the right brain and the left brain than common wisdom would dictate. The pleasing sound of harmony comes when two notes share a common harmonic, meaning that their frequencies are in simple integer ratios, such as 3/2 (G/C) or 5/4 (E/C).

  4. Selection, Optimization, and Compensation: The Structure, Reliability, and Validity of Forced-Choice versus Likert-Type Measures in a Sample of Late Adolescents

    ERIC Educational Resources Information Center

    Geldhof, G. John; Gestsdottir, Steinunn; Stefansson, Kristjan; Johnson, Sara K.; Bowers, Edmond P.; Lerner, Richard M.

    2015-01-01

    Intentional self-regulation (ISR) undergoes significant development across the life span. However, our understanding of ISR's development and function remains incomplete, in part because the field's conceptualization and measurement of ISR vary greatly. A key sample case involves how Baltes and colleagues' Selection, Optimization,…

  5. A small mission concept to the Sun-Earth Lagrangian L5 point for innovative solar, heliospheric and space weather science

    NASA Astrophysics Data System (ADS)

    Lavraud, B.; Liu, Y.; Segura, K.; He, J.; Qin, G.; Temmer, M.; Vial, J.-C.; Xiong, M.; Davies, J. A.; Rouillard, A. P.; Pinto, R.; Auchère, F.; Harrison, R. A.; Eyles, C.; Gan, W.; Lamy, P.; Xia, L.; Eastwood, J. P.; Kong, L.; Wang, J.; Wimmer-Schweingruber, R. F.; Zhang, S.; Zong, Q.; Soucek, J.; An, J.; Prech, L.; Zhang, A.; Rochus, P.; Bothmer, V.; Janvier, M.; Maksimovic, M.; Escoubet, C. P.; Kilpua, E. K. J.; Tappin, J.; Vainio, R.; Poedts, S.; Dunlop, M. W.; Savani, N.; Gopalswamy, N.; Bale, S. D.; Li, G.; Howard, T.; DeForest, C.; Webb, D.; Lugaz, N.; Fuselier, S. A.; Dalmasse, K.; Tallineau, J.; Vranken, D.; Fernández, J. G.

    2016-08-01

    We present a concept for a small mission to the Sun-Earth Lagrangian L5 point for innovative solar, heliospheric and space weather science. The proposed INvestigation of Solar-Terrestrial Activity aNd Transients (INSTANT) mission is designed to identify how solar coronal magnetic fields drive eruptions, mass transport and particle acceleration that impact the Earth and the heliosphere. INSTANT is the first mission designed to (1) obtain measurements of coronal magnetic fields from space and (2) determine coronal mass ejection (CME) kinematics with unparalleled accuracy. Thanks to innovative instrumentation at a vantage point that provides the most suitable perspective view of the Sun-Earth system, INSTANT would uniquely track the whole chain of fundamental processes driving space weather at Earth. We present the science requirements, payload and mission profile that fulfill ambitious science objectives within small mission programmatic boundary conditions.

  6. INSTANT: a Small Mission Concept to the Sun-Earth Lagrangian L5 Point for Innovative Solar, Heliospheric and Space Weather Sciences

    NASA Astrophysics Data System (ADS)

    Lavraud, B.; Liu, Y.

    2015-12-01

    We present a small mission concept to the Sun-Earth Lagrangian L5 point for innovative solar, heliospheric and space weather sciences. The proposed INvestigation of Solar-Terrestrial Activity aNd Transients (INSTANT) mission concept is designed to identify how solar coronal magnetic fields drive eruptions, mass transport and particle acceleration that impact the Earth and the heliosphere. The INSTANT concept would be the first to (1) obtain measurements of coronal magnetic fields from space, and (2) determine coronal mass ejection (CME) kinematics with unparalleled accuracy. Thanks to innovative instrumentation at a vantage point that provides the most suitable perspective view of the Sun-Earth system, INSTANT would, in addition, uniquely track the whole chain of fundamental processes driving space weather. We present the science requirements, payload and mission profile which fulfill ambitious science objectives within small mission programmatic boundary conditions.

  7. Development and Validation of a Scale to Assess Students' Attitude towards Animal Welfare

    NASA Astrophysics Data System (ADS)

    Mazas, Beatriz; Rosario Fernández Manzanal, Mª; Zarza, Francisco Javier; Adolfo María, Gustavo

    2013-07-01

    This work presents the development of a scale of attitudes of secondary-school and university students towards animal welfare. A questionnaire was drawn up following a Likert-type scale attitude assessment model. Four components or factors, which globally measure animal welfare, are proposed to define the object of the attitude. The components are animal abuse for pleasure or due to ignorance (C1), leisure with animals (C2), farm animals (C3) and animal abandonment (C4). The final version of the questionnaire contains 29 items that are evenly distributed among the four components indicated, guaranteeing that each component is one-dimensional. A sample of 329 students was used to validate the scale. These students were aged between 11 and 25, and were from secondary schools in Aragon and the University in Zaragoza (Aragon's main and largest city, located in NE Spain). The scale shows good internal reliability, with a Cronbach's alpha value of 0.74. The questionnaire was later given to 1,007 students of similar levels and ages to the sample used in the validation, the results of which are presented in this study. The most relevant results show significant differences in gender and level of education in some of the components of the scale, observing that women and university students rate animal welfare more highly.

  8. The Self-Assessment Scale of Cognitive Complaints in Schizophrenia: A validation study in Tunisian population

    PubMed Central

    Johnson, Ines; Kebir, Oussama; Ben Azouz, Olfa; Dellagi, Lamia; Rabah, Yasmine; Tabbane, Karim

    2009-01-01

    Background Despite a huge, well-documented literature on cognitive deficits in schizophrenia, little is known about patients' own perception of their cognitive functioning. The purpose of our study was to create a scale to collect the subjective cognitive complaints of patients suffering from schizophrenia whose mother tongue is the Tunisian Arabic dialect, and to carry out a validation study of this scale. Methods The authors constructed the Self-Assessment Scale of Cognitive Complaints in Schizophrenia (SASCCS) based on a questionnaire covering the five cognitive domains most frequently reported in the literature to be impaired in schizophrenia. The scale consisted of 21 Likert-type questions dealing with memory, attention, executive functions, language and praxis. The authors then studied the psychometric qualities of the scale among 105 patients suffering from schizophrenia spectrum disorders (based on DSM-IV criteria). Patients were evaluated using the Positive and Negative Syndrome Scale (PANSS), the Global Assessment of Functioning scale (GAF) and the Calgary Depression Scale (CDS). Results The scale's reliability proved good, with a Cronbach's alpha coefficient of 0.85 indicating good internal consistency. The intra-class correlation coefficient at 11 weeks was 0.77, suggesting good stability over time. Principal component analysis with Oblimin rotation yielded six factors accounting for 58.28% of the total variance of the scale. Conclusion Given the good psychometric properties revealed in this study, the SASCCS appears reliable for measuring schizophrenic patients' perception of their own cognitive impairment. This kind of evaluation cannot substitute for objective measures of cognitive performance in schizophrenia; its purpose is to allow patients to express their own well-being and satisfaction with quality of life.

  9. Mixture Random-Effect IRT Models for Controlling Extreme Response Style on Rating Scales

    PubMed Central

    Huang, Hung-Yu

    2016-01-01

    Respondents are often requested to provide a response to Likert-type or rating-scale items during the assessment of attitude, interest, and personality to measure a variety of latent traits. Extreme response style (ERS), which is defined as a consistent and systematic tendency of a person to locate on a limited number of available rating-scale options, may distort the test validity. Several latent trait models have been proposed to address ERS, but all these models have limitations. Mixture random-effect item response theory (IRT) models for ERS are developed in this study to simultaneously identify the mixtures of latent classes from different ERS levels and detect the possible differential functioning items that result from different latent mixtures. The model parameters can be recovered fairly well in a series of simulations that use Bayesian estimation with the WinBUGS program. In addition, the model parameters in the developed models can be used to identify items that are likely to elicit ERS. The results show that a long test and large sample can improve the parameter estimation process; the precision of the parameter estimates increases with the number of response options, and the model parameter estimation outperforms the person parameter estimation. Ignoring the mixtures and ERS results in substantial rank-order changes in the target latent trait and a reduced classification accuracy of the response styles. An empirical survey of emotional intelligence in college students is presented to demonstrate the applications and implications of the new models. PMID:27853444
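    The mixture random-effect IRT models above require Bayesian MCMC estimation (WinBUGS) and are well beyond a few lines of code, but the response pattern they target is easy to picture. The sketch below computes a naive endpoint-share index on invented 5-point data, purely as an illustration of what extreme response style looks like; it is not the paper's model.

```python
# Naive, illustrative index of extreme response style (ERS): the share of a
# respondent's answers at the scale endpoints. Data are invented.

def extreme_share(responses, low=1, high=5):
    """Fraction of answers at either endpoint of a low..high rating scale."""
    return sum(r in (low, high) for r in responses) / len(responses)

balanced = [2, 3, 4, 3, 2, 4, 3, 3, 2, 4]   # midpoint-oriented respondent
extreme = [1, 5, 5, 1, 5, 1, 5, 5, 1, 5]    # extreme-style respondent
print(extreme_share(balanced))  # 0.0
print(extreme_share(extreme))   # 1.0
```

    Two respondents with the same underlying trait level can thus produce very different raw scores, which is the distortion the mixture models are built to separate out.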

  10. Reflective Thinking Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Basol, Gulsah; Evin Gencel, Ilke

    2013-01-01

    The purpose of this study was to adapt the Reflective Thinking Scale to Turkish and investigate its validity and reliability with a sample of Turkish university students. The Reflective Thinking Scale (RTS) is a 5-point Likert scale (1 = Agree Completely, 3 = Neutral, 5 = Not Agree Completely), intended to measure reflective…

  11. Development and evaluation of a regional, large-scale interprofessional collaborative care summit.

    PubMed

    Foote, Edward F; Clarke, Virginia; Szarek, John L; Waters, Sharon K; Walline, Vera; Shea, Diane; Goss, Sheryl; Farrell, Marian; Easton, Diana; Dunleavy, Erin; Arscott, Karen

    2015-01-01

    The Northeastern/Central Pennsylvania Interprofessional Education Coalition (NECPA IPEC) is a coalition of faculty from multiple smaller academic institutions with a mission to promote interprofessional education. An interprofessional learning program was organized, which involved 676 learners from 10 different institutions representing 16 unique professions, and took place at seven different institutions simultaneously. The program was a 3-hour long summit which focused on the management of a patient with ischemic stroke. A questionnaire consisting of the Interprofessional Education Perception Scale (IEPS) questionnaire (pre-post summit), Likert-type questions, and open comment questions explored the learners' perceptions of the session and their attitudes toward interprofessional learning. Responses were analyzed using descriptive statistics and statistical tests for difference and qualitative thematic coding. The attitude of learners toward interprofessional education (as measured by the IEPS) was quite high even prior to the summit, so there were no significant changes after the summit. However, a high percentage of learners and facilitators agreed that the summit met its objective and was effective. In addition, the thematic analysis of the open-ended questions confirmed that students learned from the experience with a sense of the core competencies of interprofessional education and practice. A collaborative approach to delivering interprofessional learning is time and work intensive but beneficial to learners.

  12. Analyzing Likert Data

    ERIC Educational Resources Information Center

    Boone, Harry N., Jr.; Boone, Deborah A.

    2012-01-01

    This article provides information for Extension professionals on the correct analysis of Likert data. The analyses of Likert-type and Likert scale data require unique data analysis procedures, and as a result, misuses and/or mistakes often occur. This article discusses the differences between Likert-type and Likert scale data and provides…
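    The article's distinction has direct computational consequences, sketched below with invented responses: a single Likert-type item is ordinal data, so medians and modes are the appropriate summaries, whereas a Likert scale score (the sum of several items) is commonly treated as interval data and summarized with means and standard deviations.

```python
# Hedged sketch of the Likert-type vs. Likert-scale distinction; all
# responses are invented for illustration.
from statistics import mean, median, mode, stdev

item_responses = [4, 5, 3, 4, 2, 4, 5, 4]   # one Likert-type item (ordinal)
print(median(item_responses))               # 4.0
print(mode(item_responses))                 # 4

# Composite Likert-scale scores: four items summed per respondent.
respondents = [[4, 3, 4, 5], [2, 2, 3, 2], [5, 5, 4, 4]]
scale_scores = [sum(r) for r in respondents]
print(scale_scores)                         # [16, 9, 18]
print(round(mean(scale_scores), 2))         # 14.33
print(round(stdev(scale_scores), 2))        # 4.73
```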

  13. Scale development to measure attitudes toward unauthorized migration into a foreign country.

    PubMed

    VAN DER Veer, Kees; Ommundsen, Reidar; Krumov, Krum; VAN LE, Hao; Larsen, Knud S

    2008-08-01

    This study reports on the development and cross-national utility of a Likert-type scale measuring attitudes toward unauthorized migration into a foreign country, in two samples from "migrant-sending" nations. In the first phase, a pool of 86 attitude statements was administered to a sample of 505 undergraduate students in Bulgaria (22.5% male; M age = 23, SD = 4.8). Exploratory factor analysis resulted in six factors and a reduction to 34 items. The results yielded an overall alpha of 0.92, with alphas for subscales ranging from 0.70 to 0.89. In the second phase, the 34-item scale was administered in a survey of 180 undergraduates from Sofia University in Bulgaria (16.7% male, M age = 23, SD = 4.8), plus 150 undergraduates from Hanoi State University in Vietnam (14.7% male, M age = 19, SD = 1.8). Results yielded a 19-item scale with no gender differences and satisfactory alpha coefficients of 0.87 and 0.89 for the Vietnamese and Bulgarian samples, respectively. This scale, equally applicable in both samples, includes items reflecting salient facets of attitudes toward unauthorized migration. An exploratory principal component analysis of the Bulgarian and Vietnamese versions of the 19-item scale yielded three factors accounting for 54% and 47% of the variance, respectively. A Procrustes analysis indicates high conceptual equivalence between the two samples for factors 1 and 2, and moderate equivalence for factor 3. This study lends support to the idea that, despite different cultural meanings, there is a common meaning space in culturally diverse societies.

  14. Scales, scales and more scales.

    PubMed

    Weitzenhoffer, Andre M

    2002-01-01

    This article examines the nature, uses, and limitations of the large variety of existing so-called hypnosis scales, that is, instruments that have been proposed for the assessment of hypnotic behavior. Although the major aim of most of the scales is ostensibly to assess several aspects of hypnotic states, they are generally found to say little about these and much more about responses to suggestions. The greatest application of these scales is in research, but they also have a limited place in clinical work.

  15. Children's Self-Efficacy Scale: Initial Psychometric Studies

    ERIC Educational Resources Information Center

    Martinelli, Selma de Cassia; Bartholomeu, Daniel; Caliatto, Susana Gakyia; Sassi, Adriana de Grecci

    2009-01-01

    This article describes the development of a self-efficacy measure for elementary school children. A sample of 514 children, ages 8 to 11, enrolled in Grades 2 to 4 of public schools in Brazil was investigated. The scale included 78 descriptive items about academic situations, in which the child was required to respond on a 5-point scale, the…

  16. Validation of Scale of Commitment to Democratic Values among Secondary Students

    ERIC Educational Resources Information Center

    Gafoor, K. Abdul

    2015-01-01

    This study reports development of a reliable and valid instrument for assessing the commitment to democratic values among secondary school students in Kerala from 57 likert type statements originally developed in 2007 by Gafoor and Thushara to assess commitment to nine values avowed in the Indian Constitution. Nine separate maximum likelihood…

  17. Developing a Scale for Quality of Using Learning Strategies

    ERIC Educational Resources Information Center

    Tasci, Guntay; Yurdugul, Halil

    2016-01-01

    This study aims to develop a measurement tool to measure the quality of using learning strategies. First, the quality of using learning strategies was described based on the literature. The 32 items in the 5-point Likert scale were then administered to 320 prospective teachers, and they were analysed with exploratory factor analysis using…

  18. The Impact of Outliers on Cronbach's Coefficient Alpha Estimate of Reliability: Ordinal/Rating Scale Item Responses

    ERIC Educational Resources Information Center

    Liu, Yan; Wu, Amery D.; Zumbo, Bruno D.

    2010-01-01

    In a recent Monte Carlo simulation study, Liu and Zumbo showed that outliers can severely inflate the estimates of Cronbach's coefficient alpha for continuous item response data--visual analogue response format. Little, however, is known about the effect of outliers for ordinal item response data--also commonly referred to as Likert, Likert-type,…

  19. The Arabic Scale of Death Anxiety (ASDA): Its Development, Validation, and Results in Three Arab Countries

    ERIC Educational Resources Information Center

    Abdel-Khalek, Ahmed M.

    2004-01-01

    The Arabic Scale of Death Anxiety (ASDA) was constructed and validated in a sample of undergraduates (17-33 yrs) in 3 Arab countries, Egypt (n = 418), Kuwait (n = 509), and Syria (n = 709). In its final form, the ASDA consists of 20 statements. Each item is answered on a 5-point intensity scale anchored by 1: No, and 5: Very much. Alpha…

  20. Iowa's Severity Rating Scales for Communication Disabilities: Preschool, Ages 2-5 Years.

    ERIC Educational Resources Information Center

    Freilinger, J. Joseph, Ed.; And Others

    The Iowa Severity Rating Scales are designed to provide general guidelines which may be used as a part of the clinical speech and language program to obtain uniform identification of preschool children with communication disabilities. Section 1 contains definitions, an explanation of the severity classification (a 5 point scale ranging from 0 for…

  1. [Standardization of the Greek version of Zung's Self-rating Anxiety Scale (SAS)].

    PubMed

    Samakouri, M; Bouhos, G; Kadoglou, M; Giantzelidou, A; Tsolaki, K; Livaditis, M

    2012-01-01

    The Self-rating Anxiety Scale (SAS), introduced by Zung, has been widely used in research and in clinical practice for the detection of anxiety. The present study aims at standardizing the Greek version of the SAS. The SAS consists of 20 items rated on a 1-4 Likert-type scale; the total SAS score may vary from 20 (no anxiety at all) to 80 (severe anxiety). Two hundred and fifty-four participants (114 male and 140 female), comprising psychiatric patients, physically ill individuals and members of the general population, aged 45.40±11.35 years, completed the following: (a) a demographic characteristics questionnaire, (b) the Greek version of the SAS, (c) Spielberger's Modified Greek State-Trait Anxiety Scale (STAI-Gr.-X) and (d) the Zung Depression Rating Scale (ZDRS). Seventy-six participants answered the SAS twice within a median period of 12 days. The following parameters were calculated: (a) the internal consistency of the SAS in terms of Cronbach's α coefficient, (b) its test-retest reliability in terms of the Intraclass Correlation Coefficient (ICC) and (c) its concurrent and convergent validities through Spearman's rho correlations of its score with both the state and trait subscales of the STAI-Gr.-X and with the ZDRS. In addition, in order to evaluate the discriminant validity of the SAS, the scale scores of the three groups of participants (psychiatric patients, physically ill and general population individuals) were compared among each other using Kruskal-Wallis and Mann-Whitney U tests. Cronbach's alpha for the SAS equals 0.897, while the ICC for its test-retest reliability equals 0.913. Spearman's rho concerning validity: (a) when the SAS is compared to the STAI-Gr.-X (state), it equals 0.767; (b) when the SAS is compared to the STAI-Gr.-X (trait), it equals 0.802; and (c) when the SAS is compared to the ZDRS, it equals 0.835. The mentally ill scored significantly higher on the SAS than both the physically ill and the general population. In conclusion, the Greek version of the SAS presents very satisfactory psychometric properties regarding reliability and validity.

  2. Mountaineers' risk perception in outdoor-adventure sports: a study of sex and sports experience.

    PubMed

    Demirhan, Giyasettin

    2005-06-01

    The purpose of this study was to examine the risk perception of mountaineers (expert, less experienced, nonparticipant) for 19 outdoor-adventure sports in relation to sex and sports experience. A total of 299 experienced mountaineers (90 women, 209 men), 321 less-experienced mountaineers (110 women, 211 men) and 193 volunteer nonparticipants in sport (95 women and 98 men) took part. Data were collected with items on a 5-point Likert-type scale. Test-retest reliability over 15 days ranged from .64 to .86. A two-way (sex × group) analysis of variance showed that men's mean risk perception was lower than that of women for orienteering, mountain biking, rowing, surfing, sailing, nordic skiing, tour skiing, snowboarding, parachuting, and cliff jumping. Also, experienced mountaineers' mean risk perception was lower than that of the less experienced.

  3. Validation of the Pornography Consumption Inventory in a Sample of Male Brazilian University Students.

    PubMed

    Baltieri, Danilo Antonio; Aguiar, Ana Saito Junqueira; de Oliveira, Vitor Henrique; de Souza Gatti, Ana Luisa; de Souza Aranha E Silva, Renata Almeida

    2015-01-01

    Few measures are available to examine pornography use constructs, and this can compromise the reliability of statements regarding harmful use of pornography. This study aimed to confirm the factorial validity and internal consistency of the Pornography Consumption Inventory in a sample of male Brazilian university students. The inventory consists of a 4-factor, 15-item, 5-point Likert-type scale. After translation and back-translation of the inventory, it was administered to 100 male medical students. An initial model that included all 15 items of the inventory showed some substandard fit indices. Therefore, another model was tested, excluding an item that had loaded onto two different factors. Goodness-of-fit indices were better for the new model. Overall, findings from this study support using the inventory on Portuguese-speaking individuals. With additional replication across populations, other settings, and treatment-seeking patients, the Pornography Consumption Inventory could also potentially be shortened to 14 items.

  4. Improving Palliative Care Team Meetings: Structure, Inclusion, and "Team Care".

    PubMed

    Brennan, Caitlin W; Kelly, Brittany; Skarf, Lara Michal; Tellem, Rotem; Dunn, Kathleen M; Poswolsky, Sheila

    2016-07-01

    Increasing demands on palliative care teams point to the need for continuous improvement to ensure teams are working collaboratively and efficiently. This quality improvement initiative focused on improving interprofessional team meeting efficiency and subsequently patient care. Meeting start and end times improved from a mean of approximately 9 and 6 minutes late in the baseline period, respectively, to a mean of 4.4 minutes late (start time) and ending early in our sustainability phase. Mean team satisfaction improved from 2.4 to 4.5 on a 5-point Likert-type scale. The improvement initiative clarified communication about patients' plans of care, thus positively impacting team members' ability to articulate goals to other professionals, patients, and families. We propose several recommendations in the form of a team meeting "toolkit."

  5. Group-based preference assessment for children and adolescents in a residential setting: examining developmental, clinical, gender, and ethnic differences.

    PubMed

    Resetar Volz, Jennifer L; Cook, Clayton R

    2009-11-01

    This study examines developmental, clinical, gender, and ethnic group differences in preference in residentially placed children and adolescents. In addition, this study considers whether residentially placed youth prefer stimuli currently being used as rewards as part of a campuswide token economy system and whether youth would identify preferred stimuli that are not currently offered. The article discusses a survey devised specifically for the purpose of this study. Stimuli currently offered as rewards are listed and rated on a 5-point Likert-type scale. Results indicate that the majority of stimuli available within the token economy system were rated as preferred. Also, significant developmental, clinical, gender, and ethnic group differences are found, indicating the benefit of considering group-level characteristics when designing and implementing a groupwide token economy system. The implications of the results and directions for future research are discussed.

  6. The Development of Will Perception Scale and Practice in a Psycho-Education Program with Its Validity and Reliability

    ERIC Educational Resources Information Center

    Yener, Özen

    2014-01-01

    In this research, we aim to develop a 5-point Likert scale, establish its validity and reliability, and use it in an experimental application in order to measure the will perception of teenagers and adults. With this aim, items were first drawn, in original or modified form, from various scales, and an item pool including 61 items…

  7. Validation of the Persian Version of the 8-Item Morisky Medication Adherence Scale (MMAS-8) in Iranian Hypertensive Patients

    PubMed Central

    Moharamzad, Yashar; Saadat, Habibollah; Shahraki, Babak Nakhjavan; Rai, Alireza; Saadat, Zahra; Aerab-Sheibani, Hossein; Naghizadeh, Mohammad Mehdi; Morisky, Donald E.

    2015-01-01

    The reliability and validity of the 8-item Morisky Medication Adherence Scale (MMAS-8) was assessed in a sample of Iranian hypertensive patients. In this multi-center study which lasted from August to October 2014, a total of 200 patients who were suffering from hypertension (HTN) and were taking anti-hypertensive medication(s) were included. The cases were accessed through private and university health centers in the cities of Tehran, Karaj, Kermanshah, and Bafgh in Iran and were interviewed face-to-face by the research team. The validated Persian translation of the MMAS-8 was provided by the owner of this scale. This scale contains 7 questions with “Yes” or “No” response choices and an additional Likert-type question (totally 8 questions). The total score ranges from 0 to 8 with higher scores reflecting better medication adherence. Mean (±SD) overall MMAS-8 score was 5.57 (±1.86). There were 108 (54%), 62 (31%), and 30 (15%) patients in the low, moderate, and high adherence groups. Internal consistency was acceptable with an overall Cronbach’s α coefficient of 0.697 and test–retest reliability showed good reproducibility (r= 0.940); P< 0.001. Overall score of the MMAS-8 was significantly correlated with systolic BP (r= - 0.306) and diastolic BP (r= - 0.279) with P< 0.001 for both BP measurements. The Chi-square test showed a significant relationship between adherence level and BP control (P= 0.016). The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the scale were 92.8%, 22.3%, 52.9%, and 76.7%, respectively. The Persian version of the MMAS had acceptable reliability and validity in Iranian hypertensive patients. This scale can be used as a standard and reliable tool in future studies to determine medication adherence of Persian-speaking patients with chronic conditions. PMID:25946926
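    The low/moderate/high grouping of the 0-8 total reported above can be sketched as a small classifier. The cut points used below (8 = high, 6 to <8 = moderate, <6 = low) are the convention commonly used in the wider MMAS-8 literature, stated here as an assumption rather than quoted from this paper's text.

```python
# Conventional MMAS-8 adherence grouping (assumed cut points, per the wider
# MMAS-8 literature rather than this paper's text).

def mmas8_group(score):
    """Classify an MMAS-8 total score (0-8) into an adherence group."""
    if not 0 <= score <= 8:
        raise ValueError("MMAS-8 total score must lie in [0, 8]")
    if score == 8:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"

print(mmas8_group(5.57))  # low -- the sample's mean score falls in this group
```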

  8. Student Attitudes toward Web-Enhanced Instruction in an Educational Technology Course

    ERIC Educational Resources Information Center

    Alghazo, Iman M.

    2006-01-01

    This study aimed at investigating students' attitudes toward Web-enhanced instruction in an educational technology course taught in the College of Education at the United Arab Emirates University. The sample of the study consisted of (66) college female students. A survey with 5 point Likert-type items and open-ended questions was used to collect…

  9. Scale Development for Measuring and Predicting Adolescents’ Leisure Time Physical Activity Behavior

    PubMed Central

    Ries, Francis; Romero Granados, Santiago; Arribas Galarraga, Silvia

    2009-01-01

    … Rephrasing the items and scoring them on a Likert-type scale greatly enhanced the subscales' reliability. An identical factorial structure was extracted for both culturally different samples. The obtained factors, namely perceived physical competence, parents' physical activity, perceived resources support, attitude toward physical activity and perceived parental support, corresponded to the original TPB constructs as hypothesized. PMID:24149606

  10. Scale and scaling in soils

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scale is recognized as a central concept in the description of the hierarchical organization of our world. Pressing environmental and societal problems require an understanding of how processes operate at different scales, and how they can be linked across scales. Soil science, as many other dis...

  11. Maslowian Scale.

    ERIC Educational Resources Information Center

    Falk, C.; And Others

    The development of the Maslowian Scale, a method of revealing a picture of one's needs and concerns based on Abraham Maslow's levels of self-actualization, is described. This paper also explains how the scale is supported by the theories of L. Kohlberg, C. Rogers, and T. Rusk. After a literature search, a list of statements was generated…

  12. Development of a new resilience scale: the Connor-Davidson Resilience Scale (CD-RISC).

    PubMed

    Connor, Kathryn M; Davidson, Jonathan R T

    2003-01-01

    Resilience may be viewed as a measure of stress-coping ability and, as such, could be an important target of treatment in anxiety, depression, and stress reactions. We describe a new rating scale to assess resilience. The Connor-Davidson Resilience Scale (CD-RISC) comprises 25 items, each rated on a 5-point scale (0-4), with higher scores reflecting greater resilience. The scale was administered to subjects in the following groups: community sample, primary care outpatients, general psychiatric outpatients, clinical trial of generalized anxiety disorder, and two clinical trials of PTSD. The reliability, validity, and factor analytic structure of the scale were evaluated, and reference scores for study samples were calculated. Sensitivity to treatment effects was examined in subjects from the PTSD clinical trials. The scale demonstrated good psychometric properties and factor analysis yielded five factors. A repeated measures ANOVA showed that an increase in CD-RISC score was associated with greater improvement during treatment. Improvement in CD-RISC score was noted in proportion to overall clinical global improvement, with the greatest increase noted in subjects with the highest global improvement and deterioration in CD-RISC score in those with minimal or no global improvement. The CD-RISC has sound psychometric properties and distinguishes between those with greater and lesser resilience. The scale demonstrates that resilience is modifiable and can improve with treatment, with greater improvement corresponding to higher levels of global improvement.

  13. Multidimensional scaling

    PubMed Central

    Papesh, Megan H.; Goldinger, Stephen D.

    2012-01-01

    The concept of similarity, or a sense of "sameness" among things, is pivotal to theories in the cognitive sciences and beyond. Similarity, however, is a difficult thing to measure. Multidimensional scaling (MDS) is a tool by which researchers can obtain quantitative estimates of similarity among groups of items. More formally, MDS refers to a set of statistical techniques that are used to reduce the complexity of a data set, permitting visual appreciation of the underlying relational structures contained therein. The current paper provides an overview of MDS. We discuss key aspects of performing this technique, such as methods that can be used to collect similarity estimates, analytic techniques for treating proximity data, and various concerns regarding interpretation of the MDS output. MDS analyses of two novel data sets are also included, highlighting in step-by-step fashion how MDS is performed, and key issues that may arise during analysis. PMID:23359318
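    As a concrete illustration of the technique the abstract surveys, the following is a minimal sketch of classical (metric) MDS via double-centering and eigendecomposition, using NumPy; the four-item distance matrix is made up for illustration and is not from the paper's data sets.

    ```python
    import numpy as np

    def classical_mds(D, k=2):
        """Classical (Torgerson) MDS: embed n items in k dimensions
        from a symmetric n x n matrix D of pairwise distances."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
        idx = np.argsort(vals)[::-1][:k]         # keep the k largest
        L = np.sqrt(np.clip(vals[idx], 0, None))
        return vecs[:, idx] * L                  # n x k coordinates

    # Four items lying on a line: distance between items i and j is |i - j|
    D = np.abs(np.subtract.outer(np.arange(4.0), np.arange(4.0)))
    X = classical_mds(D, k=1)
    # The recovered 1-D coordinates reproduce the original spacing
    # (up to sign and translation).
    ```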

  14. Impact of the Number of Response Categories and Anchor Labels on Coefficient Alpha and Test-Retest Reliability

    ERIC Educational Resources Information Center

    Weng, Li-Jen

    2004-01-01

    A total of 1,247 college students participated in this study on the effect of scale format on the reliability of Likert-type rating scales. The number of response categories ranged from 3 to 9. Anchor labels on the scales were provided for each response option or for the end points only. The results indicated that the scales with few response…
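    Coefficient alpha, the reliability index examined above, is straightforward to compute from a respondents-by-items score matrix; the small data set below is made up for illustration.

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Coefficient alpha for a respondents-by-items score matrix.

        alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
        """
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                        # number of items
        item_vars = scores.var(axis=0, ddof=1)     # per-item sample variance
        total_var = scores.sum(axis=1).var(ddof=1) # variance of total scores
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Illustrative 4 respondents x 3 Likert items (made-up data)
    data = [[1, 2, 2],
            [3, 3, 4],
            [4, 4, 5],
            [2, 3, 3]]
    print(round(cronbach_alpha(data), 3))  # -> 0.971
    ```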

  15. Effects of Scale, Question Location, Order of Response Alternatives, and Season on Self-Reported Noise Annoyance Using ICBEN Scales: A Field Experiment

    PubMed Central

    Brink, Mark; Schreckenberg, Dirk; Vienneau, Danielle; Cajochen, Christian; Wunderli, Jean-Marc; Probst-Hensch, Nicole; Röösli, Martin

    2016-01-01

    The type of noise annoyance scale and aspects of its presentation such as response format or location within a questionnaire and other contextual factors may affect self-reported noise annoyance. By means of a balanced experimental design, the effect of type of annoyance question and corresponding scale (5-point verbal vs. 11-point numerical ICBEN (International Commission on Biological Effects of Noise) scale), presentation order of scale points (ascending vs. descending), question location (early vs. late within the questionnaire), and survey season (autumn vs. spring) on reported road traffic noise annoyance was investigated in a postal survey with a stratified random sample of 2386 Swiss residents. Our results showed that early appearance of annoyance questions was significantly associated with higher annoyance scores. Questionnaires filled out in autumn were associated with a significantly higher annoyance rating than in the springtime. No effect was found for the order of response alternatives. Standardized average annoyance scores were slightly higher using the 11-point numerical scale whereas the percentage of highly annoyed respondents was higher based on the 5-point scale, using common cutoff points. In conclusion, placement and presentation of annoyance questions within a questionnaire, as well as the time of the year a survey is carried out, have small but demonstrable effects on the degree of self-reported noise annoyance. PMID:27886110
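    The comparison of "percentage highly annoyed" across scale formats relies on cutoff conventions. The sketch below is hedged: the linear rescaling to 0-100 and the cutoff value of 72 are common conventions assumed here, not details given in the abstract.

    ```python
    def percent_highly_annoyed(responses, n_points, cutoff=72.0):
        """Share of respondents whose rescaled annoyance meets the cutoff.

        Assumes responses are coded 1..n_points and maps them linearly
        onto 0-100 so that scales with different numbers of points can
        be compared; the cutoff of 72 is a commonly used convention.
        """
        rescaled = [(r - 1) * 100.0 / (n_points - 1) for r in responses]
        hits = sum(1 for s in rescaled if s >= cutoff)
        return 100.0 * hits / len(responses)

    # Made-up responses on a 5-point verbal scale; points 4 and 5
    # rescale to 75 and 100, so both clear the 72 cutoff.
    print(percent_highly_annoyed([1, 2, 5, 4, 3, 5, 2, 4], n_points=5))  # -> 50.0
    ```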

  16. Effects of Scale, Question Location, Order of Response Alternatives, and Season on Self-Reported Noise Annoyance Using ICBEN Scales: A Field Experiment.

    PubMed

    Brink, Mark; Schreckenberg, Dirk; Vienneau, Danielle; Cajochen, Christian; Wunderli, Jean-Marc; Probst-Hensch, Nicole; Röösli, Martin

    2016-11-23

    The type of noise annoyance scale and aspects of its presentation such as response format or location within a questionnaire and other contextual factors may affect self-reported noise annoyance. By means of a balanced experimental design, the effect of type of annoyance question and corresponding scale (5-point verbal vs. 11-point numerical ICBEN (International Commission on Biological Effects of Noise) scale), presentation order of scale points (ascending vs. descending), question location (early vs. late within the questionnaire), and survey season (autumn vs. spring) on reported road traffic noise annoyance was investigated in a postal survey with a stratified random sample of 2386 Swiss residents. Our results showed that early appearance of annoyance questions was significantly associated with higher annoyance scores. Questionnaires filled out in autumn were associated with a significantly higher annoyance rating than in the springtime. No effect was found for the order of response alternatives. Standardized average annoyance scores were slightly higher using the 11-point numerical scale whereas the percentage of highly annoyed respondents was higher based on the 5-point scale, using common cutoff points. In conclusion, placement and presentation of annoyance questions within a questionnaire, as well as the time of the year a survey is carried out, have small but demonstrable effects on the degree of self-reported noise annoyance.

  17. The Development of the Francis Moral Values Scales: A Study among 16- to 18-Year-Old Students Taking Religious Studies at A Level in the UK

    ERIC Educational Resources Information Center

    Village, Andrew; Francis, Leslie J.

    2016-01-01

    This article reports on the development of scales for measuring moral values in three domains: anti-social behaviour, sex and relationships, and substance use. Students studying religion at A level in 25 schools were invited to respond to 32 Likert items that referred to a wide range of moral issues and behaviours, employing a 5-point response…

  18. The Purdue Interest Questionnaire--An Interest Inventory to Assist Engineering Students in Their Career Planning.

    ERIC Educational Resources Information Center

    LeBold, William K.; And Others

    The Purdue Interest Questionnaire (PIQ), a 264-item Likert-type scale, was developed to assist engineering students in their career planning. The six engineering scales identify specialized fields: aeronautical, chemical, civil, electrical, industrial, or mechanical. For students planning to transfer out of engineering, four scales identify…

  19. Ohio Army National Guard Mental Health Initiative: Risk and Resilience Factors for Combat-Related Posttraumatic Psychopathology and Post Combat Adjustment

    DTIC Science & Technology

    2013-10-01

    To screen for depression effectively, results indicate use of the cardinal first two items, items representing fatigue, appetite and sleep changes...as done by Cannon et al. (2007). The PHQ-9 uses a Likert-type scale with four response options ranging from 0 = "Not at all" to 3 = "Nearly every day"...symptom over the past month by rating items on a five-point Likert-type scale (1 = "not at all" to 5 = "extremely"); however, in our study we assessed

  20. Development of a Walking Safety Scale for Older Adults, Part I: Content Validity of the GEM Scale

    PubMed Central

    Boudreault, Renée; Rousseau, Jacqueline; Bourbonnais, Daniel; Nadeau, Sylvie; Dubé, François

    2008-01-01

    Purpose: The Grille d’évaluation de la sécurité à la marche (GEM scale) is a performance-based tool developed to fill the need for an objective assessment of walking safety for older adults. It underwent a three-phase process of content validation. Method: A mailed questionnaire was used to assess the representativeness of the walking items (5-point pertinence scale). Subsequently, two physiotherapist focus groups (n = 20) were held to further evaluate the relevance of the scale and the walking items. Finally, a pilot study was completed with 3 raters administering the GEM scale to 12 hospitalized patients. Results: Comments and descriptive statistics (percentages) were analyzed from the questionnaire results and focus groups. On completion of the pilot study, which assessed 12 patients on the GEM scale, additional analyses were performed to address the theoretical background, the administration manual, the walking items, the scoring scale, and interpretation of the scale. Following each step, modifications were made to reflect the results of the analyses. Conclusion: The three-phase content-validation process demonstrated the relevance of this instrument and its representativeness as a walking safety assessment tool for older adults. PMID:20145759

  1. Measuring Group Dynamics: An Exploratory Trial

    ERIC Educational Resources Information Center

    Phan, Loan T.; Rivera, Edil Torres; Volker, Martin A.; Garrett, Michael T.

    2004-01-01

    This article reports on the development of a scale used to assess and measure group dynamics during group supervision counselling courses (practicum and internship). A 20-item Likert-type scale was administered to 200 counsellors-in-training master's students. Reliability and validity data are described. An exploratory factor analysis yielded…

  2. Secondary Reading: What High School Teachers Think. A Research Study of Secondary Teachers' Opinions on the Need for Teaching Reading Skills.

    ERIC Educational Resources Information Center

    Lloyd, Bruce A.

    To discover the opinions of high school teachers regarding the need for teaching reading skills, an amended Vaughan Scale was administered to 388 teachers in 15 high schools in Michigan. The questionnaire consisted of 15 questions to which teachers responded on a Likert-type scale. The results supported the following conclusions: (1) teaching…

  3. Designing, Testing, and Validating an Attitudinal Survey on an Environmental Topic: A Groundwater Pollution Survey Instrument for Secondary School Students

    ERIC Educational Resources Information Center

    Lacosta-Gabari, Idoya; Fernandez-Manzanal, Rosario; Sanchez-Gonzalez, Dolores

    2009-01-01

    Research on the assessment of environmental attitudes has increased significantly in recent years. The development of specific attitude scales for specific environmental problems has often been proposed. This paper describes the Groundwater Pollution Test (GPT), a 19-item survey instrument using a Likert-type scale. The survey has been used with…

  4. An Evaluation of the Effectiveness of the New Primary School Mathematics Curriculum in Practice

    ERIC Educational Resources Information Center

    Gomleksiz, Mehmet Nuri; Bulut, Ilhami

    2007-01-01

    The aim of this study is to determine and compare the views of primary school teachers on the implementation and effectiveness of the new primary school mathematics curriculum. For that aim, a 32-item Likert-type Mathematics Curriculum Scale was developed. The reliability of the scale was tested through Cronbach Alpha (0.98), Spearman-Brown (0.93)…

  5. Rural Principal Attitudes toward Poverty and the Poor

    ERIC Educational Resources Information Center

    Gholson, Melissa L.

    2015-01-01

    This study used Yun and Weaver's (2010) Attitudes toward Poverty Short Form (ATP-SF) of twenty-one items on a Likert-type scale to determine the poverty attitudes of 309 principals in a rural Appalachian state in the United States. The study compared the poverty attitudes from the ATP-SF scaled score as a dependent variable to the following…

  6. Teachers' Ratings of Preschool Children's Behaviours. Discussion Paper No. 2.

    ERIC Educational Resources Information Center

    Metham, John

    This paper reports upon the evaluation and implementation of a 30-item Likert-type rating scale for teachers to use in assessing children's behaviors within preschool classrooms. The Preschool Observation Scale (POS) was developed to evaluate programs of the Mt. Druitt Early Childhood Project, North Ryde, Australia. Items were constructed on the…

  7. Attitude, Gender and Achievement in Computer Programming

    ERIC Educational Resources Information Center

    Baser, Mustafa

    2013-01-01

    The aim of this research was to explore the relationship among students' attitudes toward programming, gender and academic achievement in programming. The scale used for measuring students' attitudes toward programming was developed by the researcher and consisted of 35 five-point Likert type items in four subscales. The scale was administered to…

  8. Student Attitude Toward Information Retrieval.

    ERIC Educational Resources Information Center

    Adair, Charles H.; Allen, Rodney F.

    This is an individually administered rating scale designed to evaluate teacher trainee attitudes toward an information retrieval system. A major goal of the scale is to seek responses that measure students' reactions to the cognitive interest and motivational nature of the information retrieval system through the use of Likert-type items. The…

  9. The Effect on Prospective Teachers of the Learning Environment Supported by Dynamic Statistics Software

    ERIC Educational Resources Information Center

    Koparan, Timur

    2016-01-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study…

  10. Validation of a realistic simulator for veterinary gastrointestinal endoscopy training.

    PubMed

    Usón-Gargallo, Jesús; Usón-Casaús, Jesús M; Pérez-Merino, Eva M; Soria-Gálvez, Federico; Morcillo, Esther; Enciso, Silvia; Sánchez-Margallo, Francisco M

    2014-01-01

    This article reports on the face, content, and construct validity of a new realistic composite simulator (Simuldog) used to provide training in canine gastrointestinal flexible endoscopy. The basic endoscopic procedures performed on the simulator were esophagogastroduodenoscopy (EGD), gastric biopsy (GB), and gastric foreign body removal (FBR). Construct validity was assessed by comparing the performance of novices (final-year veterinary students and recent graduates without endoscopic experience, n=30) versus experienced subjects (doctors in veterinary medicine who had performed more than 50 clinical upper gastrointestinal endoscopic procedures as a surgeon, n=15). Tasks were scored based on completion time, and specific rating scales were developed to assess performance. Internal consistency and inter-rater agreement were assessed. Face and content validity were determined using a 5-point Likert-type scale questionnaire. The novices needed considerably more time than the experts to perform EGD, GB, and FBR, and their performance scores were significantly lower (p<.010). Inter-rater agreement and the internal validity of the rating scales were good. Face validity was excellent, and both groups agreed that the endoscopy scenarios were very realistic. The experts highly valued the usefulness of Simuldog for veterinary training and as a tool for assessing endoscopic skills. Simuldog is the first validated model specifically developed to be used as a training tool for endoscopy techniques in small animals.

  11. Glazer Narrative Composition Scale.

    ERIC Educational Resources Information Center

    Glazer, Joan

    Designed to assess the quality of children's narrative compositions, the Glazer Narrative Composition Scale (GNCS) consists of eighteen scales outlined under plot, theme, setting, characterization, and style. Each scale is scored 1, 2, or 3, depending on how much of the scale element is present in the narrative, with the highest possible score…

  12. Scaling: An Items Module

    ERIC Educational Resources Information Center

    Tong, Ye; Kolen, Michael J.

    2010-01-01

    "Scaling" is the process of constructing a score scale that associates numbers or other ordered indicators with the performance of examinees. Scaling typically is conducted to aid users in interpreting test results. This module describes different types of raw scores and scale scores, illustrates how to incorporate various sources of…

  13. A Study of the Relationship between Learner Preference and Student Achievement and Attitudes in an Instructional Television Course.

    ERIC Educational Resources Information Center

    Nadel, Judith L.

    The effects of interactive television (ITV) learning on the attitudes and achievement of different types of learners were investigated using as subjects 97 students who were taking courses at remote sites through the University of Southern Maine ITV system. First, a Likert-type scale for measuring learner preferences was developed and validated.…

  14. Pro-Recreational Sex Morality, Religiosity, and Causal Attribution of Homosexual Attitudes.

    ERIC Educational Resources Information Center

    Embree, Robert A.

    Homosexual cognitive victimization is a term which emphasizes social evaluation of sexual behaviors judged in terms of sexual preference. Individual differences in cognitive victimization of homosexuals were examined in two studies. In the first study, undergraduate students (N=78) completed Likert-type rating scales measuring homosexual cognitive…

  15. Compulsory Book Reading at School and within Leisure

    ERIC Educational Resources Information Center

    Pavlovic, Slavica

    2015-01-01

    This paper deals with the attitudes of secondary school pupils towards compulsory book reading at school, an integral part of the Croatian Language and Literature teaching subject, and its possible impact on their book (not-)reading in their leisure time. It is based on research carried out through a five-point Likert-type scale in…

  16. Teacher Participation in Decision Making and Its Impact on School and Teachers

    ERIC Educational Resources Information Center

    Sarafidou, Jasmin-Olga; Chatziioannidis, Georgios

    2013-01-01

    Purpose: The purpose of this paper is to examine teacher involvement in different domains of decision making in Greek primary schools and explore associations with school and teacher variables. Design/methodology/approach: A survey employing self-administered questionnaires, with a Likert-type scale assessing teachers' actual and desired…

  17. Assessing Parent Satisfaction.

    ERIC Educational Resources Information Center

    Cleminshaw, Helen; Guidubaldi, John

    Although actual or projected satisfaction with parenting is important in determining whether a couple will become parents and how large their family will be, only minimal research has assessed parental satisfaction. The Cleminshaw-Guidubaldi Parent Satisfaction Scale, a 50-item Likert-type instrument designed to measure components of satisfaction…

  18. Training Evaluation as an Integral Component of Training for Performance.

    ERIC Educational Resources Information Center

    Lapp, H. J., Jr.

    A training evaluation system should address four major areas: reaction, learning, behavior, and results. The training evaluation system at GPU Nuclear Corporation addresses each of these areas through practical approaches such as course and program evaluation. GPU's program evaluation instrument uses a Likert-type scale to assess task development,…

  19. Slavic and Italian Canadian Attitudes towards Authority.

    ERIC Educational Resources Information Center

    Ryan, Michael G.

    Predicting that Italian Canadians would hold attitudes of greater hostility and anxiety toward authority than Slavic Canadians, this study, using 58 part-time summer students (29 Italians and 29 Slavs) at three universities in Canada, analyzed the subjects' responses to a five-response-option Likert-type scale. Results confirmed the early…

  20. Some Determinants of Citizen Attitudes toward Community Resource Development.

    ERIC Educational Resources Information Center

    Clark, Robert C.; Timothy, Earl E.

    This study examines, by adaptation of the Likert-type scale, the attitudes of a random sample of Bayfield citizens toward a new national park comprising 20 of the Apostle Islands and 42,000 acres of Lake Superior shoreline in northern Wisconsin, and its possible impact on the community. With the use of correlation analysis and regression…

  1. The Content of a College-Level Outdoor Leadership Course.

    ERIC Educational Resources Information Center

    Green, Paul

    This research study used the Delphi technique to determine the ideal content of a college-level outdoor leadership course for land-based outdoor pursuits in the Pacific Northwest. Topics were generated and value-rated by 61 Pacific Northwest outdoor leaders using a Likert-type scale in three separate questionnaires. Thirty-five topics were…

  2. Influence of Item Direction on Student Responses in Attitude Assessment.

    ERIC Educational Resources Information Center

    Campbell, Noma Jo; Grissom, Stephen

    To investigate the effects of wording in attitude test items, a five-point Likert-type rating scale was administered to 173 undergraduate education majors. The test measured attitudes toward college and self, and contained 38 positively-worded items. Thirty-eight negatively-worded items were also written to parallel the positive statements.…

  3. Estonian Science and Non-Science Students' Attitudes towards Mathematics at University Level

    ERIC Educational Resources Information Center

    Kaldo, Indrek; Reiska, Priit

    2012-01-01

    This article investigates the attitudes and beliefs towards studying mathematics by university level students. A total of 970 randomly chosen, first year, Estonian bachelor students participated in the study (of which 498 were science students). Data were collected using a Likert-type scale questionnaire and analysed with a respect to field of…

  4. Student Teachers' Attitudes Concerning Understanding the Nature of Science in Turkey

    ERIC Educational Resources Information Center

    Sahin, Nurettin; Deniz, Sabahattin; Gorgen, Izzet

    2006-01-01

    Nature of science is defined as one of the dimensions of scientific literacy. The main aim of this study was both to investigate the attitudes of secondary school social and science branch post-graduate (non-thesis master's) teacher candidates about the Nature of Science (NOS) and to compare their attitudes towards NOS. A 12-item Likert-type scale for teacher…

  5. Measures to Combat Research Phobia among Undergraduates for Knowledge Creation in Imo State

    ERIC Educational Resources Information Center

    Ihebereme, Chioma I.

    2012-01-01

    The study examined the measures to combat research phobia among undergraduates in order to achieve knowledge creation. The study used Alvan Ikoku Federal College of Education Owerri in Imo State as case study. An 11-item four point Likert-type scale of Agreed (A) = 4 points, Strongly Agreed (SA) = 3 points, Disagreed (D) = 2 points and Strongly…

  6. Evaluating a Geology Curriculum for Non-Majors.

    ERIC Educational Resources Information Center

    Boone, William J.

    Two key factors affecting the success of non-major science courses are students' perceptions of topic difficulty and interest. An attitudinal survey administered to 300 college students, after completion of a college science course, evaluated their attitudes toward a geology curriculum. Using a Likert-type scale, students rated their level of…

  7. Predictors of Organizational Commitment for Faculty and Administrators of a Private Christian University

    ERIC Educational Resources Information Center

    Schroder, Ralph

    2008-01-01

    Faculty and administrators of a private Christian university responded to measures of overall, intrinsic, and extrinsic job satisfaction as well as organizational and religious commitment. The survey measured responses on a five-point Likert-type scale. Data were statistically analyzed by using descriptive statistics and factor analysis. Results…

  8. Beliefs about Language Learning Held by Students and Their Teacher (A Pilot Study).

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    A study investigated the beliefs about second language learning among nine students of English as a Second Language (all female), and their teacher at Queen Arwa University (Yemen). The survey instrument consisted of five demographic statements and 47 statements concerning language learning in a Likert-type scaled response format. Results indicate…

  9. Students and Their Teachers of Arabic: Beliefs about Language Learning.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    A study investigated beliefs about second language learning held by 27 adult students and 10 teachers of Arabic at the Yemen Language Center. The survey instrument consisted of 5 demographic statements and 47 statements concerning language learning in a Likert-type scaled response format. Results indicate students and teachers generally agreed…

  10. Teachers' Perceptions of the Teaching of Acids and Bases in Swedish Upper Secondary Schools

    ERIC Educational Resources Information Center

    Drechsler, Michal; Van Driel, Jan

    2009-01-01

    We report in this paper on a study of chemistry teachers' perceptions of their teaching in upper secondary schools in Sweden, regarding models of acids and bases, especially the Bronsted and the Arrhenius model. A questionnaire consisting of a Likert-type scale was developed, which focused on teachers' knowledge of different models, knowledge of…

  11. The Arts, Social Inclusion and Social Class: The Case of Dance

    ERIC Educational Resources Information Center

    Sanderson, Patricia

    2008-01-01

    This article places the results of an empirical research study on the relationship between the social class factor and young people's perceptions of dance within the context of recent British government initiatives promoting social and educational inclusion through the arts. Four Likert-type dance attitude scales that were developed from pupil…

  12. Jordanian Social Studies Teachers' Perceptions of Competency Needed for Implementing Technology in the Classroom

    ERIC Educational Resources Information Center

    Al Bataineh, Mohammad; Anderson, Sharon

    2015-01-01

    This study used a cross-sectional, ten-point Likert-type scale survey design, to examine the perception of Jordanian seventh to twelfth-grade social studies teachers of the competency needed for technology implementation in their classrooms. The instrument for this study was a modified version of a survey developed by Kelly (2003) called the…

  13. Attitudes of the Student Teachers in English Language Teaching Programs towards Microteaching Technique

    ERIC Educational Resources Information Center

    Ogeyik, Muhlise Cosgun

    2009-01-01

    This paper evaluates the attitudes of student teachers towards microteaching experiences. The research was conducted with a total of 57 fourth year students attending the ELT Department at Trakya University, in Turkey. The data were collected via a Likert type scale developed by the researcher. The research results were evaluated regarding the…

  14. Within-Subject Comparison of Changes in a Pretest-Posttest Design

    ERIC Educational Resources Information Center

    Hennig, Christian; Mullensiefen, Daniel; Bargmann, Jens

    2010-01-01

    The authors propose a method to compare the influence of a treatment on different properties within subjects. The properties are measured by several Likert-type-scaled items. The results show that many existing approaches, such as repeated measurement analysis of variance on sum and mean scores, a linear partial credit model, and a graded response…

  15. The Use of Technology by Nonformal Environmental Educators

    ERIC Educational Resources Information Center

    Peffer, Tamara Elizabeth; Bodzin, Alec M.; Smith, Judith Duffield

    2013-01-01

    This study examined the use of instructional and learning technologies by nonformal environmental educators. A 40-question survey was developed to inquire about practitioner demographics, technology use in practice, and beliefs about technology. The survey consisted of multiple choice, open-ended questions, and a Likert-type scale component--the…

  16. Attitudes of the Public and Citizen Advisory Committee Members Toward Land and Water Resources in the Maumee River Basin.

    ERIC Educational Resources Information Center

    Taylor, Calvin Lee

    The reported study was conducted to determine the extent to which active participants of a Citizen's Advisory Committee (CAC) were representative of the general public in land and water resource attitudes. All 39 members of the Maumee River Basin Level B CAC and a random sample of 400 Basin residents were given a Likert-type scale to measure their…

  17. Assessment of the Perceived School Loneliness and Isolation of Mentally Retarded and Nonretarded Students.

    ERIC Educational Resources Information Center

    Luftig, Richard L.

    1988-01-01

    Perceived school loneliness and isolation of 73 partially mainstreamed retarded students (mean age 13.5 years) and their nonretarded peers was assessed using a five-point Likert-type loneliness scale. Retarded students reported significantly more loneliness and isolation than nonretarded peers suggesting that mainstreaming by itself does not…

  18. Teacher Leader Human Relations Skills: A Comparative Study

    ERIC Educational Resources Information Center

    Roby, Douglas E.

    2012-01-01

    In this study, 142 graduate school teachers working in schools throughout southwestern Ohio assessed their human relation skills. A human relations survey was used for the study, and results were compared with colleagues assessing the teachers in the study. The survey was developed using a Likert-type scale, and was based on key elements affecting…

  19. Teachers and Challenging Behavior: Knowledge, Views, and Practices

    ERIC Educational Resources Information Center

    Westling, David L.

    2010-01-01

    Seventy teachers (38 special education and 32 general education teachers) completed a questionnaire using Likert-type scales to describe several traits and conditions about themselves and students with challenging behavior. Results indicated that most teachers did not use many effective strategies or receive sufficient support, and viewed…

  20. Second Field Test of the AEL Measure of School Capacity for Improvement

    ERIC Educational Resources Information Center

    Copley, Lisa D.; Meehan, Merrill L.; Howley, Caitlin W.; Hughes, Georgia K.

    2005-01-01

    The major purpose of the second field test of the AEL MSCI instrument was to assess the psychometric properties of the refined version with a larger, more diverse group of respondents. The first objective of this field test was to expand the four-point Likert-type response scale to six points in order to yield more variance in responses. The…
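    When a response scale is expanded from four to six points, as described above, scores from the two formats can be put on a common footing by linear rescaling; this mapping is a standard convention, not a procedure reported in the abstract.

    ```python
    def rescale(x, old_points, new_points):
        """Linearly map a rating from a 1..old_points scale
        to the corresponding position on a 1..new_points scale."""
        return 1 + (x - 1) * (new_points - 1) / (old_points - 1)

    # A 3 on the original 4-point scale lands at about 4.33 on the 6-point scale
    print(round(rescale(3, 4, 6), 2))  # -> 4.33
    ```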

  1. Perceived Stress Events by Teachers.

    ERIC Educational Resources Information Center

    Meinke, Dean L.; And Others

    A study was designed to identify patterns or relationships among test items used to measure teacher stress and develop a measure of teacher stress using a Likert-type scale. The subjects of the study were 89 experienced elementary and secondary school teachers, who rated 50 items related to events in a classroom situation or school setting, using…

  2. Development and Initial Validation of the Medical Fear Survey-Short Version

    ERIC Educational Resources Information Center

    Olatunji, Bunmi O.; Ebesutani, Chad; Sawchuk, Craig N.; McKay, Dean; Lohr, Jeffrey M.; Kleinknecht, Ronald A.

    2012-01-01

    The present investigation employs item response theory (IRT) to develop an abbreviated Medical Fear Survey (MFS). Application of IRT analyses in Study 1 (n = 931) to the original 50-item MFS resulted in a 25-item shortened version. Examination of the location parameters also resulted in a reduction of the Likert-type scaling of the MFS by removing…

  3. Trends and Issues Affecting Economic Development in Ohio, 2001-2005.

    ERIC Educational Resources Information Center

    Thomas, Jerold R.; Safrit, R. Dale

    Fourteen economic development practitioners were asked to participate in a modified Delphi study that attempted to provide a level of agreement about future trends and issues that affect economic development at the county level in Ohio. Literature from several fields was reviewed to find potential trends and issues and, using a Likert-type scale,…

  4. Investigation of Primary Students' Motivation Levels towards Science Learning

    ERIC Educational Resources Information Center

    Sevinc, Betul; Ozmen, Haluk; Yigit, Nevzat

    2011-01-01

    The present research was conducted with 518 students enrolled at the 6th, 7th and 8th classes of primary schools. A Likert-type scale developed by Tuan, Chin and Shieh (2005) and translated into Turkish by Yilmaz and Cavas (2007) was used to examine the motivation levels of students towards science learning. Research findings revealed that gender,…

  5. Generalized IRT Models for Extreme Response Style

    ERIC Educational Resources Information Center

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Extreme response style (ERS) is a systematic tendency for a person to endorse extreme options (e.g., strongly disagree, strongly agree) on Likert-type or rating-scale items. In this study, we develop a new class of item response theory (IRT) models to account for ERS so that the target latent trait is free from the response style and the tendency…
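    For exploratory data checks (not the IRT modelling the paper develops), a crude ERS proxy is simply the share of extreme categories a respondent endorses; a minimal sketch, with made-up responses:

```python
def ers_index(responses, low=1, high=5):
    """Proportion of extreme endorsements (lowest or highest category)
    across a respondent's Likert-type answers. A rough ERS proxy only,
    not the IRT-based models described in the abstract above."""
    extremes = sum(1 for r in responses if r == low or r == high)
    return extremes / len(responses)

# A respondent who picks the extreme options on 6 of 8 items:
print(ers_index([1, 5, 5, 1, 3, 2, 5, 1]))  # 0.75
```

    A respondent with an index near 1.0 answers almost exclusively at the endpoints, which is the pattern the IRT models above are designed to separate from the target trait.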

  6. TEFL Textbook Evaluation: From Teachers' Perspectives

    ERIC Educational Resources Information Center

    Tok, Hidayet

    2010-01-01

    This study aims to examine the advantages and disadvantages of one type of TEFL materials, English language textbook "Spot On", used in state primary schools in Turkey. Sample of the research consists of 46 English teachers chosen randomly from state primary schools in Malatya and Adiyaman city centres. A five-point Likert-type scale was used…

  7. The Art Appreciation Component of Visual Literacy: Examples of Guided Approaches to Viewing Art.

    ERIC Educational Resources Information Center

    Demery, Marie

    Likert-type rating scales were designed and used to help college students perceive, understand, and value the beauty and content of a piece of art. The subjects for the project were 100 college students enrolled in two art appreciation courses at Texas College. Their classification ranged from freshman to senior, with majors mainly in business,…

  8. Parental Recall of Pre-School Behavior Related to ADHD and Disruptive Behavior Disorder

    ERIC Educational Resources Information Center

    Ercan, Eyup Sabri; Somer, Oya; Amado, Sonia; Thompson, Dennis

    2005-01-01

    The aim of this study was to examine the contribution of Age of Onset Criterion (AOC) to the diagnosis of Attention Deficit Hyperactivity Disorder (ADHD) and disruptive behavior disorder. For this purpose, a 10-item Likert-type Parent Assessment of Pre-school Behavior Scale (PARPS), developed by the experimenters, was used to examine the presence…

  9. Effect of anti-smoking advertisements on Turkish adolescents.

    PubMed

    Unal, E; Gokler, M E; Metintas, S; Kalyoncu, C

    2016-12-12

    The aim of the present study was to determine the perception of 10 anti-smoking advertisements in 1434 Turkish adolescents. We used the Effectiveness of the Anti-smoking Advertisements Scale, which included 6 items for each advertisement; each item was assessed on a 5-point Likert-type scale. Multiple logistic regression analysis was used to determine the factors associated with the impact of the advertisements. All the advertisements were more effective for adolescents who had never smoked compared to ex-smokers and current smokers. We also noted that, regardless of age, smoking status decreased the effectiveness of all the advertisements. Previous studies have shown that smokers have a negative attitude towards anti-smoking messages. In the present study, the most effective advertisements among adolescents were those with "Sponge and tar", "Smoking harms in every breath" and "Children want to grow". In conclusion, although anti-smoking campaigns are targeted towards adults, they also have a strong influence on adolescents. The main target population for advertisements should be individuals aged < 15 years who have not yet started smoking.

  10. The Effect of Response Format on the Psychometric Properties of the Narcissistic Personality Inventory: Consequences for Item Meaning and Factor Structure.

    PubMed

    Ackerman, Robert A; Donnellan, M Brent; Roberts, Brent W; Fraley, R Chris

    2016-04-01

    The Narcissistic Personality Inventory (NPI) is currently the most widely used measure of narcissism in social/personality psychology. It is also relatively unique because it uses a forced-choice response format. We investigate the consequences of changing the NPI's response format for item meaning and factor structure. Participants were randomly assigned to one of three conditions: 40 forced-choice items (n = 2,754), 80 single-stimulus dichotomous items (i.e., separate true/false responses for each item; n = 2,275), or 80 single-stimulus rating scale items (i.e., 5-point Likert-type response scales for each item; n = 2,156). Analyses suggested that the "narcissistic" and "nonnarcissistic" response options from the Entitlement and Superiority subscales refer to independent personality dimensions rather than high and low levels of the same attribute. In addition, factor analyses revealed that although the Leadership dimension was evident across formats, dimensions with entitlement and superiority were not as robust. Implications for continued use of the NPI are discussed.

  11. A Sense of Scale.

    ERIC Educational Resources Information Center

    Tretter, Thomas R.; Jones, M. Gail

    2003-01-01

    Points out the importance of an understanding of a sense of scale and presents an activity that uses distance or time as a measure. The activity illustrates for students what the universe would look like at various scales. (DDR)

  12. Small Scale Organic Techniques

    ERIC Educational Resources Information Center

    Horak, V.; Crist, DeLanson R.

    1975-01-01

    Discusses the advantages of using small scale experimentation in the undergraduate organic chemistry laboratory. Describes small scale filtration techniques as an example of a semi-micro method applied to small quantities of material. (MLH)

  13. Cross-scale morphology

    USGS Publications Warehouse

    Allen, Craig R.; Holling, Crawford S.; Garmestani, Ahjond S.; El-Shaarawi, Abdel H.; Piegorsch, Walter W.

    2013-01-01

    The scaling of physical, biological, ecological and social phenomena is a major focus of efforts to develop simple representations of complex systems. Much of the attention has been on discovering universal scaling laws that emerge from simple physical and geometric processes. However, there are regular patterns of departures both from those scaling laws and from continuous distributions of attributes of systems. Those departures often demonstrate the development of self-organized interactions between living systems and physical processes over narrower ranges of scale.

  14. Civilian PTSD Scales

    ERIC Educational Resources Information Center

    Shapinsky, Alicia C.; Rapport, Lisa J.; Henderson, Melinda J.; Axelrod, Bradley N.

    2005-01-01

    Strong associations between civilian posttraumatic stress disorder (PTSD) scales and measures of general psychological distress suggest that the scales are nonspecific to PTSD. Three common PTSD scales were administered to 122 undergraduates who had experienced an emotionally salient, nontraumatic event: a college examination. Results indicated…

  15. Classroom Observation Scales.

    ERIC Educational Resources Information Center

    Emmer, Edmund T.

    Nine scales were developed to measure a series of classroom behavior variables derived from a factor analytic study of five observation systems. The scales are multipoint check lists which are behaviorally referenced by different amounts and types of classroom behaviors. The scales measure such aspects of classroom behavior as teacher-initiated…

  16. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  17. Educational Scale-Making

    ERIC Educational Resources Information Center

    Nespor, Jan

    2004-01-01

    The article explores the complexities of educational scale-making. "Educational scales" are defined as the spatial and temporal orders generated as pupils and teachers move and are moved through educational systems; scales are "envelopes of spacetime" into which certain school-based identities (and not others) can be folded.…

  18. Reading Graduated Scales.

    ERIC Educational Resources Information Center

    Hall, Lucien T., Jr.

    1982-01-01

    Ways of teaching students to read scales are presented as process instructions that are probably overlooked or taken for granted by most instructors. Scales on such devices as thermometers, rulers, spring scales, speedometers, and thirty-meter tape are discussed. (MP)

  19. Schroeder Composition Scale.

    ERIC Educational Resources Information Center

    Schroeder, Thomas S.

    Designed to describe the writing behaviors of elementary and junior high school children, the Schroeder Composition Scale is an analytic scale. For eleven of the criteria in the scale, the scoring is simply "yes" or "no" indicating whether the writing does or does not have the characteristic. Five other items identify…

  20. The Positivity Scale

    ERIC Educational Resources Information Center

    Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John

    2012-01-01

    Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of…

  1. The Problem Behaviour Checklist: short scale to assess challenging behaviours

    PubMed Central

    Nagar, Jessica; Evans, Rosie; Oliver, Patricia; Bassett, Paul; Liedtka, Natalie; Tarabi, Aris

    2016-01-01

    Background Challenging behaviour, especially in intellectual disability, covers a wide range that is in need of further evaluation. Aims To develop a short but comprehensive instrument for all aspects of challenging behaviour. Method In the first part of a two-stage enquiry, a 28-item scale was constructed to examine the components of challenging behaviour. Following a simple factor analysis this was developed further to create a new short scale, the Problem Behaviour Checklist (PBCL). The scale was subsequently used in a randomised controlled trial and tested for interrater reliability. Scores were also compared with a standard scale, the Modified Overt Aggression Scale (MOAS). Results Seven identified factors – personal violence, violence against property, self-harm, sexually inappropriate, contrary, demanding and disappearing behaviour – were scored on a 5-point scale. A subsequent factor analysis with the second population showed demanding, violent and contrary behaviour to account for most of the variance. Interrater reliability using weighted kappa showed good agreement (0.91; 95% CI 0.83–0.99). Good agreement was also shown with scores on the MOAS, and a score of 1 on the PBCL showed high sensitivity (97%) and specificity (85%) for a threshold MOAS score of 4. Conclusions The PBCL appears to be a suitable and practical scale for assessing all aspects of challenging behaviour. PMID:27703753
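    The sensitivity and specificity figures reported for the PBCL cut-off follow from a standard 2×2 calculation; a minimal sketch, using hypothetical counts chosen only to illustrate the arithmetic (the paper's raw counts are not given here):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for a PBCL score >= 1 against an MOAS score >= 4:
sens, spec = sens_spec(tp=97, fn=3, tn=85, fp=15)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.97, 0.85
```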

  2. Manual of Scaling Methods

    NASA Technical Reports Server (NTRS)

    Bond, Thomas H. (Technical Monitor); Anderson, David N.

    2004-01-01

    This manual reviews the derivation of the similitude relationships believed to be important to ice accretion and examines ice-accretion data to evaluate their importance. Both size scaling and test-condition scaling methods employing the resulting similarity parameters are described, and experimental icing tests performed to evaluate scaling methods are reviewed with results. The material included applies primarily to unprotected, unswept geometries, but some discussion of how to approach other situations is included as well. The studies given here and scaling methods considered are applicable only to Appendix-C icing conditions. Nearly all of the experimental results presented have been obtained in sea-level tunnels. Recommendations are given regarding which scaling methods to use for both size scaling and test-condition scaling, and icing test results are described to support those recommendations. Facility limitations and size-scaling restrictions are discussed. Finally, appendices summarize the air, water and ice properties used in NASA scaling studies, give expressions for each of the similarity parameters used and provide sample calculations for the size-scaling and test-condition scaling methods advocated.

  3. Salzburger State Reactance Scale (SSR Scale)

    PubMed Central

    2015-01-01

    This paper describes the construction and empirical evaluation of an instrument for measuring state reactance, the Salzburger State Reactance (SSR) Scale. The results of a confirmatory factor analysis supported a hypothesized three-factor structure: experience of reactance, aggressive behavioral intentions, and negative attitudes. Correlations with divergent and convergent measures support the validity of this structure. The SSR Subscales were strongly related to the other state reactance measures. Moreover, the SSR Subscales showed modest positive correlations with trait measures of reactance. The SSR Subscales correlated only slightly or not at all with neighboring constructs (e.g., autonomy, experience of control). The only exception was fairness scales, which showed moderate correlations with the SSR Subscales. Furthermore, a retest analysis confirmed the temporal stability of the scale. Suggestions for further validation of this questionnaire are discussed. PMID:27453806

  4. Scale and scaling in agronomy and environmental sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scale is of paramount importance in environmental studies, engineering, and design. The unique course covers the following topics: scale and scaling, methods and theories, scaling in soils and other porous media, scaling in plants and crops; scaling in landscapes and watersheds, and scaling in agro...

  5. Recursive scaled DCT

    NASA Astrophysics Data System (ADS)

    Hou, Hsieh-Sheng

    1991-12-01

    Among the various image data compression methods, the discrete cosine transform (DCT) has become the most popular in performing gray-scale image compression and decompression. However, the computational burden in performing a DCT is heavy. For example, in a regular DCT, at least 11 multiplications are required for processing an 8 × 1 image block. The idea of the scaled-DCT is that more than half the multiplications in a regular DCT are unnecessary, because they can be formulated as scaling factors of the DCT coefficients, and these coefficients may be scaled back in the quantization process. A fast recursive algorithm for computing the scaled-DCT is presented in this paper. The formulations are derived based on practical considerations of applying the scaled-DCT algorithm to image data compression and decompression. These include the considerations of flexibility of processing different sizes of DCT blocks and the actual savings of the required number of arithmetic operations. Due to the recursive nature of this algorithm, a higher-order scaled-DCT can be obtained from two lower-order scaled DCTs. Thus, a scaled-DCT VLSI chip designed according to this algorithm may process different sizes of DCT under software control. To illustrate the unique properties of this recursive scaled-DCT algorithm, the one-dimensional formulations are presented with several examples exhibited in signal flow-graph forms.
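    The core scaling idea can be illustrated in a few lines of Python (a conceptual sketch only, not Hou's recursive flow graph): the per-coefficient normalization factors of the DCT-II are pulled out of the transform and merged into the quantizer's step sizes, so the transform proper needs fewer multiplications while the quantized output is unchanged.

```python
import math

N = 8

def dct8(x):
    """Reference 8-point DCT-II with orthonormal scaling, direct O(N^2) form."""
    out = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
        scale = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(scale * s)
    return out

def dct8_unscaled(x):
    """The same cosine sums without the per-coefficient factors; those
    factors are folded into the quantization step instead."""
    return [sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
            for k in range(N)]

x = [float(v) for v in range(8)]
q = 10.0  # nominal quantization step size
scales = [math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N) for k in range(N)]

normal = [round(c / q) for c in dct8(x)]                                 # scale, then quantize
scaled = [round(u / (q / s)) for u, s in zip(dct8_unscaled(x), scales)]  # fold scale into step
print(normal == scaled)  # True: identical quantized coefficients
```

    The direct form above is deliberately naive; the point is only that dividing by an adjusted step size `q / s` recovers exactly what scaling each coefficient and then quantizing would have produced.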

  6. PULSE SCALING SYSTEM

    DOEpatents

    Kandiah, K.

    1954-06-01

    Pulse scaling systems embodying multi-electrode gaseous-discharge tubes of the type having a plurality of stable discharge paths are described. The novelty of this particular system lies in the simplification of the stepping arrangement between successive tubes. In one form the invention provides a multistage scaler comprising a pulse generator, a first multi-electrode scaling tube of the type set forth coupled to said generator to receive transfer pulses therefrom and one or more succeeding multi-electrode scaling tubes each deriving its transfer pulses from preceding scaling tubes.

  7. Evaluating the Effectiveness of the 2003-2004 NASA CONNECT™ Program

    NASA Technical Reports Server (NTRS)

    Caton, Randall H.; Pinelli, Thomas E.; Giersch, Christopher E.; Holmes, Ellen B.; Lambert, Matthew A.

    2005-01-01

    NASA CONNECT is an Emmy-award-winning series of instructional (distance learning) programs for grades 6-8. Produced by the NASA Center for Distance Learning, the nine programs in the 2003-2004 NASA CONNECT series are research-, inquiry-, standards-, teacher-, and technology-based and include a 30-minute program, an educator guide containing a hands-on activity, and a web-based component. The 1,500 randomly selected NASA CONNECT registered users were invited to complete an electronic (self-reported) survey that employed a 5-point Likert-type scale. Regarding NASA CONNECT, respondents reported that the programs (1) enhance the teaching of mathematics, science, and technology (4.53); (2) are aligned with the national mathematics, science, and technology standards (4.52); (3) raise student awareness of careers requiring mathematics, science, and technology (4.48); (4) demonstrate the application of mathematics, science, and technology (4.47); and (5) present women and minorities performing challenging engineering and science tasks (4.50).

  8. Barriers and Enablers to Enacting Child and Youth Related Injury Prevention Legislation in Canada

    PubMed Central

    Rothman, Linda; Pike, Ian; Belton, Kathy; Olsen, Lise; Fuselli, Pam; Macpherson, Alison

    2016-01-01

    Injury prevention policy is crucial for the safety of Canada’s children; however legislation is not adopted uniformly across the country. This study aimed to identify key barriers and enablers to enacting injury prevention legislation. Purposive snowball sampling identified individuals involved in injury prevention throughout Canada. An online survey asked respondents to identify policies that were relevant to them, and whether legislation existed in their province. Respondents rated the importance of barriers or enablers using a 5-point Likert-type scale and included open-ended comments. Fifty-seven respondents identified the most common injury topics: bicycle helmets (44, 77%), cell phone-distracted driving (36, 63%), booster seats (28, 49%), ski helmets (24, 42%), and graduated driver’s licensing (21, 37%). The top enablers were research/surveillance, managerial/political support and professional group consultation, with much variability between injury topics. Open-ended comments emphasized the importance of a united opinion as an enabler and barriers included costs of protective equipment and inadequate enforcement of legislation. The results highlighted the importance of strategies that include research, management and community collaboration and that injury prevention topics should be addressed individually as information may be lost if topics are considered together. Findings can inform the process of turning injury prevention evidence into action. PMID:27399745

  9. Stakeholder Perspectives on Creating and Maintaining Trust in Community-Academic Research Partnerships.

    PubMed

    Frerichs, Leah; Kim, Mimi; Dave, Gaurav; Cheney, Ann; Hassmiller Lich, Kristen; Jones, Jennifer; Young, Tiffany L; Cene, Crystal W; Varma, Deepthi S; Schaal, Jennifer; Black, Adina; Striley, Catherine W; Vassar, Stefanie; Sullivan, Greer; Cottler, Linda B; Brown, Arleen; Burke, Jessica G; Corbie-Smith, Giselle

    2017-02-01

    Community-academic research partnerships aim to build stakeholder trust in order to improve the reach and translation of health research, but there is limited empirical research regarding effective ways to build trust. This multisite study was launched to identify similarities and differences among stakeholders' perspectives of antecedents to trust in research partnerships. In 2013-2014, we conducted a mixed-methods concept mapping study with participants from three major stakeholder groups who identified and rated the importance of different antecedents of trust on a 5-point Likert-type scale. Study participants were community members (n = 66), health care providers (n = 38), and academic researchers (n = 44). All stakeholder groups rated "authentic communication" and "reciprocal relationships" the highest in importance. Community members rated "communication/methodology to resolve problems" (M = 4.23, SD = 0.58) significantly higher than academic researchers (M = 3.87, SD = 0.67) and health care providers (M = 3.89, SD = 0.62; p < .01) and had different perspectives regarding the importance of issues related to "sustainability." The importance of communication and relationships across stakeholders indicates the importance of colearning processes that involve the exchange of knowledge and skills. The differences uncovered suggest specific areas where attention and skill building may be needed to improve trust within partnerships. More research on how partnerships can improve communication specific to problem solving and sustainability is merited.

  10. A decision-making framework for total ownership cost management of complex systems: A Delphi study

    NASA Astrophysics Data System (ADS)

    King, Russel J.

    This qualitative study, using a modified Delphi method, was conducted to develop a decision-making framework for the total ownership cost management of complex systems in the aerospace industry. The primary focus of total ownership cost is to look beyond the purchase price when evaluating complex system life cycle alternatives. A thorough literature review and the opinions of a group of qualified experts resulted in a compilation of total ownership cost best practices, cost drivers, key performance factors, applicable assessment methods, practitioner credentials and potential barriers to effective implementation. The expert panel provided responses to the study questions using a 5-point Likert-type scale. Data were analyzed and provided to the panel members for review and discussion with the intent to achieve group consensus. As a result of the study, the experts agreed that a total ownership cost analysis should (a) be as simple as possible using historical data; (b) establish cost targets, metrics, and penalties early in the program; (c) monitor the targets throughout the product lifecycle and revise them as applicable historical data becomes available; and (d) directly link total ownership cost elements with other success factors during program development. The resultant study framework provides the business leader with incentives and methods to develop and implement strategies for controlling and reducing total ownership cost over the entire product life cycle when balancing cost, schedule, and performance decisions.

  11. Nurses' and Physicians' Perceptions of Older People and Attitudes towards Older People: Ageism in a Hospital in Turkey.

    PubMed

    Polat, Ulkü; Karadağ, Ayişe; Ulger, Zekeriya; Demir, Nevra

    2014-06-27

    Nurses and physicians provide health care for a growing number of older people as a result of the rapid increase in the life expectancies of older people. Health professionals' negative attitudes towards older people affect the quality of health care offered to these individuals. The sample for this study included 110 nurses and 57 physicians working in the medical and surgical clinics of a university hospital. A questionnaire form and the Ageism Attitude Scale (AAS) were used to collect the data. A 5-point Likert-type format was utilised for the AAS. The AAS total mean score was 80.02±2.64 for nurses and 83.17±9.09 for physicians. The difference between these mean scores was statistically significant (p<0.05). For the AAS subdimension "limiting the life of the older people", the physicians' score (35.14±6.22) was significantly higher than the nurses' score (33.22±3.59). In this study, nurses' and physicians' attitudes, approaches, and considerations were found to be generally positive.

  12. Depression and socio-economical burden are more common in primary caregivers of patients who are not aware of their cancer: TURQUOISE Study by the Palliative Care Working Committee of the Turkish Oncology Group (TOG).

    PubMed

    Tanriverdi, O; Yavuzsen, T; Turhal, S; Kilic, D; Yalcin, S; Ozkan, A; Uzunoglu, S; Uysal-Sonmez, O; Akman, T; Aktas, B; Ulger, S; Babacan, T; Komurcu, S; Yaren, A; Cay-Senler, F

    2016-05-01

    In this study, we aimed to determine the personal, social and economic burden and the frequency of depression, as well as in caregivers of cancer patients who are being treated with chemotherapy in Turkey. The study is designed as a cross-sectional survey study using a 5-point Likert-type response scale, and the last part of the questionnaire includes the Beck Depression Inventory. The depression rate was found to be 64% (n = 476) among all subjects (n = 968), with 91% of those with depression demonstrating signs of mild depression. In this study, a significant difference was found between the presence of depression and age (young), sex (female), educational level (high), economic status (low), financial loss during treatment, patient's lack of knowledge about his/her diagnosis, metastatic disease and short survival time. In addition, 64% of all subjects had concerns of getting cancer, and 44% of all subjects had feelings of anger/rage against other people. In a multivariate regression analysis, the patient's lack of knowledge of the diagnosis was the independent risk factor. In conclusion, depression incidence and burden rate increased among cancer caregivers, and care burden was highly associated with depression. Accordingly, approaches to reducing the psycho-social effects of cancer should focus intensively on both the patients and their caregivers in Turkey.

  13. Poetry Methods Rating Scale.

    ERIC Educational Resources Information Center

    Gallo, Donald R.

    Designed to assess high school teachers' attitudes about teaching poetry, this questionnaire asked teachers to respond to a 38-item poetry methods rating scale (PMRS) on a seven-point scale (from "strongly agree" to "strongly disagree"). The items for the questionnaire were derived from a study of popular methods texts for…

  14. Memorial symptom assessment scale.

    PubMed

    Chang, Victor T; Hwang, Shirley S; Thaler, Howard T; Kasimis, Basil S; Portenoy, Russell K

    2004-04-01

    Patients with advanced illnesses often have multiple symptoms. As interest in palliative care and interventions for symptom control increase, the ability to assess multiple symptoms has become more important. A number of instruments have been developed to meet this need in cancer patients. This article reviews the development and applications of a multidimensional instrument, the Memorial Symptom Assessment Scale. The Memorial Symptom Assessment Scale covers 32 symptoms and three dimensions: frequency, severity, and distress. Shorter versions have also been developed: the Memorial Symptom Assessment Scale Short Form (32 symptoms with one dimension), the Condensed Memorial Symptom Assessment Scale (14 symptoms with one dimension), and a version for children aged 7-12 years. A distinctive feature is the summary subscales for physical distress, psychological distress, and the Global Distress Index. The Memorial Symptom Assessment Scale has proven useful in describing symptom epidemiology; in studies of the role of symptoms in pain, fatigue, and spirituality; as a predictor of survival; and in proxy assessments of pain. It has been used in studies of cancer and AIDS patients, and patients with advanced medical illnesses. Possible future roles of instruments such as the Memorial Symptom Assessment Scale include use in clinical trials, pharmacoeconomic analyses, definition of symptom clusters and symptom burden, the development of symptom outcome measures, symptom monitoring, and improving care for patients. Continued research is needed on the versions of the Memorial Symptom Assessment Scale and other symptom instruments in different populations and applications.
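    The instrument's general structure (each symptom rated on several dimensions, with summary subscales averaged over item subsets) can be sketched as a small data structure. The symptom names, ratings, and averaging rule below are illustrative assumptions only, not the MSAS scoring manual:

```python
# Hypothetical per-symptom ratings on three dimensions (illustrative only;
# the real MSAS defines its own item sets and scoring rules).
ratings = {
    "pain":     {"frequency": 3, "severity": 2, "distress": 4},
    "fatigue":  {"frequency": 4, "severity": 3, "distress": 3},
    "worrying": {"frequency": 2, "severity": 2, "distress": 3},
}

def subscale_mean(ratings, symptoms, dimension):
    """Average one dimension over a chosen subset of symptoms."""
    vals = [ratings[s][dimension] for s in symptoms]
    return sum(vals) / len(vals)

print(subscale_mean(ratings, ["pain", "fatigue"], "distress"))  # 3.5
```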

  15. Modelling Rating Scales.

    ERIC Educational Resources Information Center

    Linacre, John M.

    Determination of the intentions of the test developer is fundamental to the choice of the analytical model for a rating scale. For confirmatory analysis, the developer's intentions inform the choice of the general form of the model, representing the manner in which the respondent interacts with the scale; these intentions also inform the choice of…

  16. Pre-Kindergarten Scale.

    ERIC Educational Resources Information Center

    Flynn, Tim

    This 25-item scale for rating prekindergarten children concerns personal and cognitive skills. Directions for using the scale are provided. Personal skills include personal hygiene, communication skills, eating habits, relationships with the teacher, peer relations, and personal behavior. Cognitive skills rated are verbal skills, object…

  17. Basic Structure Content Scaling.

    ERIC Educational Resources Information Center

    Jackson, Douglas N.; Helmes, Edward

    1979-01-01

    A basic structure approach is proposed for obtaining multidimensional scale values for attitude, achievement, or personality items from response data. The technique permits the unconfounding of scale values due to response bias and content and partitions item indices of popularity or difficulty among a number of relevant dimensions. (Author/BH)

  18. Teaching Satisfaction Scale

    ERIC Educational Resources Information Center

    Ho, Chung-Lim; Au, Wing-Tung

    2006-01-01

    The present study proposes a teaching satisfaction measure and examines the validity of its scores. The measure is based on the Life Satisfaction Scale (LSS). Scores on the five-item Teaching Satisfaction Scale (TSS) were validated on a sample of 202 primary and secondary school teachers and favorable psychometric properties were found. As…

  19. The Family Constellation Scale.

    ERIC Educational Resources Information Center

    Lemire, David

    The Family Constellation Scale (FC Scale) is an instrument that assesses perceived birth order in families. It can be used in counseling to help initiate conversations about various traits and assumptions that tend to characterize first-born, middle-born, youngest-born, and only children. It provides both counselors and clients insights…

  20. Teacher Observation Scales.

    ERIC Educational Resources Information Center

    Purdue Univ., Lafayette, IN. Educational Research Center.

    The Teacher Observation Scales include four instruments: Observer Rating Scale (ORS), Reading Strategies Check List, Arithmetic Strategies Check List, and Classroom Description. These instruments utilize trained observers to describe the teaching behavior, instructional strategies and physical characteristics in each classroom. On the ORS, teacher…

  1. Scaling up as Catachresis

    ERIC Educational Resources Information Center

    Tobin, Joseph

    2005-01-01

    The metaphor of scaling up is the wrong one to use for describing and prescribing educational change. Many of the strategies being employed to achieve scaling up are counter-productive: they conceive of practitioners as delivery agents or consumers, rather than as co-constructors of change. An approach to educational innovation based on the…

  2. Thoughts on Scale

    ERIC Educational Resources Information Center

    Schoenfeld, Alan H.

    2015-01-01

    This essay reflects on the challenges of thinking about scale--of making sense of phenomena such as continuous professional development (CPD) at the system level, while holding on to detail at the finer grain size(s) of implementation. The stimuli for my reflections are three diverse studies of attempts at scale--an attempt to use ideas related to…

  3. Commitment to Health Scale.

    PubMed

    Kelly, Cynthia W

    2005-01-01

    The Commitment to Health Scale (CHS) was developed to predict likelihood of clients being able to permanently adopt new health-promoting behaviors. Commitment is based on the association between starting new health behaviors and long-term performance of those behaviors. The CHS evolved from an examination of Prochaska and DiClemente's Stages of Change Algorithm, Decisional Balance Scale, and Strong and Weak Principle (Velicer, Rossi, Prochaska, & DiClemente, 1996). Scale items were assessed by classical and Rasch measurement methods. The research was performed in three separate studies at various locations in the United States and included approximately 1100 subjects. A new unidimensional variable was identified called Commitment to Health. Internal consistency reliability of the scale was .94 (Cronbach's alpha). External validity and reliability were assessed based on expected and observed ordering and between known groups. Scale scores predicted self-reported health behaviors and body mass index.
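The internal-consistency figure quoted above (Cronbach's alpha) is straightforward to compute from an item-response matrix. A minimal pure-Python sketch, with invented 5-point item data for illustration (not data from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    n = len(items[0])

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var / variance(totals))

# Five respondents answering three 5-point items (hypothetical data).
items = [
    [4, 5, 3, 2, 4],
    [4, 4, 3, 2, 5],
    [5, 5, 2, 1, 4],
]
print(round(cronbach_alpha(items), 3))  # 0.922
```

When the items are perfectly correlated copies of one another, the statistic reaches its ceiling of 1.0, which is a quick sanity check on an implementation.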

  4. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  5. Composite rating scales.

    PubMed

    Martinez-Martin, Pablo

    2010-02-15

    Rating scales are instruments that are very frequently used by clinicians to perform patient assessments. Typically, rating scales grade the attribute on an ordinal level of measurement, i.e., a rank ordering, meaning that the numbers assigned to the different ranks (item scores) do not represent 'real numbers' or 'physical magnitudes'. Single-item scales have some advantages, such as simplicity and low respondent burden, but they may also suffer from disadvantages, such as ambiguous score meanings and low responsiveness. Multi-item scales, in contrast, seem more adequate for assessment of complex constructs, allowing for detailed evaluation. Total scores representing the value of the construct may be quite precise and thus the responsiveness of the scale may be high. The most common strategy for obtaining the total score is the sum of the item scores, a strategy that constitutes one of the most important problems with these types of scales. A summative score of ordinal figures is not a 'real magnitude' and may have little sense. This paper is a review of the theoretical frameworks of the main theories used to develop rating scales (Classical Test Theory and Item Response Theory). Bearing in mind that no alternative is perfect, additional research in this field and judicious decisions are called for.

  6. Quadratic Generalized Scale Invariance

    NASA Astrophysics Data System (ADS)

    Lovejoy, S.; Schertzer, D.; Addor, J. B.

Nearly twenty years ago, two of us argued that in order to account for the scaling stratification of the atmosphere, an anisotropic "unified scaling model" of the atmosphere was required, with elliptical dimension 23/9 = 2.555... "in between" the standard 3-D (small-scale) and 2-D (large-scale) models. This model was based on the formalism of generalized scale invariance (GSI). Physically, GSI is justified by arguing that various conserved fluxes (energy, buoyancy force variance, etc.) should define the appropriate notion of scale. In a recent large-scale satellite cloud image analysis, we directly confirmed this model by studying the isotropic (angle-averaged) horizontal cloud statistics. Mathematically, GSI is based on a group of scale-changing operators and their generators, but to date both analyses (primarily of cloud images) and numerical (multifractal) simulations have been limited to the special case of linear GSI. This has shown that cloud texture can plausibly be associated with local linearizations. However, realistic morphologies involve spatially varying textures; the full nonlinear GSI is clearly necessary. In this talk, we first show that the observed angle-averaged (multi)scaling statistics give only a relatively weak constraint on the nonlinear generator: the latter can be expressed as a self-similar (isotropic) part plus a deviatoric part described (in two dimensions) by an arbitrary scalar potential which contains all the information about the cloud morphology. We then show (using a theorem due to Poincaré) how to reduce nonlinear GSI to linear GSI plus a nonlinear coordinate transformation numerically, using this to take multifractal GSI modelling to the next level of approximation: quadratic GSI. We show many examples of the corresponding simulations, which include transitions between various morphologies (including cyclones), and we discuss the results in relation to satellite cloud images.

  7. Allometric Scaling in Biology

    NASA Astrophysics Data System (ADS)

    Banavar, Jayanth

    2009-03-01

The unity of life is expressed not only in the universal basis of inheritance and energetics at the molecular level, but also in the pervasive scaling of traits with body size at the whole-organism level. More than 75 years ago, Kleiber and Brody and Proctor independently showed that the metabolic rates, B, of mammals and birds scale as the three-quarter power of their mass, M. Subsequent studies showed that most biological rates and times scale as M^-1/4 and M^1/4, respectively, and that these so-called quarter-power scaling relations hold for a variety of organisms, from unicellular prokaryotes and eukaryotes to trees and mammals. The wide applicability of Kleiber's law, across the 22 orders of magnitude of body mass from minute bacteria to giant whales and sequoias, raises the hope that there is some simple general explanation that underlies the incredible diversity of form and function. We will present a general theoretical framework for understanding the relationship between metabolic rate, B, and body mass, M. We show how the pervasive quarter-power biological scaling relations arise naturally from optimal directed resource supply systems. This framework robustly predicts that: 1) whole-organism power and resource supply rate, B, scale as M^3/4; 2) most other rates, such as heart rate and maximal population growth rate, scale as M^-1/4; 3) most biological times, such as blood circulation time and lifespan, scale as M^1/4; and 4) the average velocity of flow through the network, v, such as the speed of blood and oxygen delivery, scales as M^1/12. Our framework is valid even when there is no underlying network. Our theory is applicable to unicellular organisms as well as to large animals and plants. This work was carried out in collaboration with Amos Maritan along with Jim Brown, John Damuth, Melanie Moses, Andrea Rinaldo, and Geoff West.
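The quarter-power relations described above are easy to sanity-check numerically. A small sketch (the normalization constants b0, r0, t0 are arbitrary placeholders, not values from the talk):

```python
def metabolic_rate(mass, b0=1.0):
    # Kleiber's law: B = b0 * M^(3/4)
    return b0 * mass ** 0.75

def biological_rate(mass, r0=1.0):
    # e.g. heart rate, which scales as M^(-1/4)
    return r0 * mass ** -0.25

def biological_time(mass, t0=1.0):
    # e.g. lifespan or blood circulation time, which scale as M^(1/4)
    return t0 * mass ** 0.25

# A 16x heavier animal has 8x the metabolic rate but half the heart rate,
# so metabolic rate *per unit mass* falls as M^(-1/4).
print(metabolic_rate(16) / metabolic_rate(1))
print(biological_rate(16) / biological_rate(1))
```

Note that the product of a rate (M^-1/4) and its matching time (M^1/4) is mass-independent, which is why, for example, total heartbeats per lifetime are roughly constant across mammals.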

  8. Sulfate scale dissolution

    SciTech Connect

    Morris, R.L.; Paul, J.M.

    1992-01-28

This patent describes a method for removing barium sulfate scale. It comprises contacting the scale with an aqueous solution having a pH of about 8 to about 14 and consisting essentially of a chelating agent comprising a polyaminopolycarboxylic acid or salt of such an acid in a concentration of 0.1 to 1.0 M, and anions of a monocarboxylic acid selected from mercaptoacetic acid, hydroxyacetic acid, aminoacetic acid, or salicylic acid in a concentration of 0.1 to 1.0 M and which is soluble in the solution under the selected pH conditions, to dissolve the scale.

  9. Clinical rating scales.

    PubMed

    Relja, Maja

    2012-01-01

    In Parkinson's disease (PD), rating scales are used to assess the degree of disease-related disability and to titrate long-term treatment to each phase of the disease. Recognition of non-motor symptoms required modification of existing widely used scales to integrate non-motor elements. In addition, new scales have been developed for the assessment of non-motor symptoms. In this article, assessment of PD patients will be discussed, particularly for non-motor symptoms such as pain and fatigue.

  10. On nature's scaling effects

    NASA Technical Reports Server (NTRS)

    Wilkins, Dick J.

    1994-01-01

    This presentation afforded the opportunity to look back in the literature to discover scaling effects in nature that might be relevant to composites. Numerous examples were found in nature's approaches to wood, teeth, horns, leaves, eggs, feathers, etc. Nature transmits tensile forces rigidly with cohesive bonds, while dealing with compression forces usually through noncompressible hydraulics. The optimum design scaling approaches for aircraft were also reviewed for comparison with similitude laws. Finally, some historical evidence for the use of Weibull scaling in composites was reviewed.

  11. Scaling the Universe

    NASA Astrophysics Data System (ADS)

    Frankel, Norman E.

    2014-04-01

A model is presented for the origin of the large-scale structures of the universe and their mass-radius scaling law. The physics is conventional, orthodox, but it is used to fashion a highly unorthodox model of the origin of the galaxies, their groups, clusters, superclusters, and great walls. The scaling law fits the observational results, and the model offers new suggestions and predictions. These include a largest, supreme cosmic structure, and possible implications for the recently observed pressing cosmological anomalies.

  12. Pulsar time scale

    SciTech Connect

    Il'in, V.G.; Llyasov, Yu.P.; Kuz'min, A.D.; Pushkin, S.B.; Palii, G.N.; Shabanova, T.V.; Shchitov, Yu.P.

    1984-05-01

In this article a new time scale is proposed: pulsar time (PT), based on the regular sequence of time intervals between pulses of a pulsar's radio emission. In discussing variations in the arrival times of pulsar radio emissions, three kinds of variations in the radiation periods are described. PSR 0834+06 is used as the basic reference pulsar. Time scales are also determined for reference pulsars PSR 0905+08 and PSR 1919+21. The initial parameters for the three reference pulsars needed for maintaining a PT scale are presented. The basic PT scale is defined as the continuous sequence of time intervals between radio-emission pulses of the basic reference pulsar.

  13. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  14. Digital scale converter

    DOEpatents

    Upton, Richard G.

    1978-01-01

A digital scale converter is provided for binary coded decimal (BCD) conversion. The converter may be programmed to convert a BCD value on a first scale to the equivalent value on a second scale according to a known ratio. The value to be converted is loaded into a first BCD counter and counted down to zero, while a second BCD counter registers counts starting from zero or from an offset value, depending upon the conversion. Programmable rate multipliers are used to generate pulses to the counters at selected rates for the proper conversion ratio. The value present in the second counter when the first counter reaches zero is the equivalent value on the second scale. This value may be read out and displayed on a conventional seven-segment digital display.
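The two-counter scheme can be modelled in software: the input counter is decremented to zero while pulses are gated into the output counter at the conversion ratio. A sketch, with a Bresenham-style accumulator standing in for the patent's programmable rate multiplier (the Celsius-to-Fahrenheit ratio and offset are illustrative, not from the patent):

```python
def convert(value, num, den, offset=0):
    """Simulate the two-counter converter.

    The first counter counts down from `value`; for every `den` input
    pulses the rate multiplier delivers `num` pulses to the second
    counter, which starts at `offset`.
    """
    out = offset
    acc = 0
    for _ in range(value):       # count the first counter down to zero
        acc += num
        while acc >= den:        # emit output pulses at the num/den rate
            acc -= den
            out += 1
    return out

# Celsius -> Fahrenheit uses ratio 9/5 and offset 32.
print(convert(100, 9, 5, offset=32))  # 212
```

Because the accumulator only ever emits whole pulses, fractional results are truncated, just as a hardware counter would truncate them: 37 °C converts to 98, the floor of 98.6.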

  15. Scaling the Geologic Past

    ERIC Educational Resources Information Center

    Gerritts, Mary

    1975-01-01

    Describes construction of a Geologic Time Scale on a 100 foot roll of paper and suggests activities concerning its use. Includes information about fossils and suggestions for conducting a fossil field trip with students. (BR)

  16. Scaling in sensitivity analysis

    USGS Publications Warehouse

    Link, W.A.; Doherty, P.F.

    2002-01-01

Population matrix models allow sets of demographic parameters to be summarized by a single value λ, the finite rate of population increase. The consequences of change in individual demographic parameters are naturally measured by the corresponding changes in λ; sensitivity analyses compare demographic parameters on the basis of these changes. These comparisons are complicated by issues of scale. Elasticity analysis attempts to deal with issues of scale by comparing the effects of proportional changes in demographic parameters, but leads to inconsistencies in evaluating demographic rates. We discuss this and other problems of scaling in sensitivity analysis, and suggest a simple criterion for choosing appropriate scales. We apply our suggestions to data for the killer whale, Orcinus orca.
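The quantities being compared can be made concrete with a small matrix model: the sensitivity of the growth rate to entry a_ij is v_i w_j / <v, w> (v and w the dominant left and right eigenvectors), and the elasticity rescales this by a_ij divided by the growth rate. A pure-Python sketch with an invented 2x2 projection matrix (not the killer whale data); elasticities of any matrix model sum to 1, which the checks below exploit:

```python
def dominant_eig(A, iters=2000):
    """Dominant eigenvalue and right eigenvector by power iteration."""
    n = len(A)
    w = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # eigenvalue estimate
        w = [x / lam for x in w]       # renormalize
    return lam, w

def sensitivities(A):
    lam, w = dominant_eig(A)
    # Left eigenvector of A = right eigenvector of the transpose.
    n = len(A)
    At = [[A[j][i] for j in range(n)] for i in range(n)]
    _, v = dominant_eig(At)
    vw = sum(v[i] * w[i] for i in range(n))
    sens = [[v[i] * w[j] / vw for j in range(n)] for i in range(n)]
    elas = [[A[i][j] / lam * sens[i][j] for j in range(n)] for i in range(n)]
    return lam, sens, elas

# Hypothetical 2-stage model: adult fecundities 1 and 2, juvenile survival 0.5.
A = [[1.0, 2.0], [0.5, 0.0]]
lam, sens, elas = sensitivities(A)
print(round(lam, 4))  # the golden ratio, ~1.618
```

Sensitivities report the effect of an absolute change in a_ij, elasticities the effect of a proportional change; the inconsistencies the authors discuss arise because survival probabilities and fecundities live on different natural scales.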

  17. The Improbability scale

    SciTech Connect

    Ritchie, David J.; /Fermilab

    2005-03-01

    The Improbability Scale (IS) is proposed as a way of communicating to the general public the improbability (and by implication, the probability) of events predicted as the result of scientific research. Through the use of the Improbability Scale, the public will be able to evaluate more easily the relative risks of predicted events and draw proper conclusions when asked to support governmental and public policy decisions arising from that research.

  18. Magnetron injection gun scaling

    NASA Astrophysics Data System (ADS)

    Lawson, W.

    1988-04-01

    A set of tradeoff equations was simplified to obtain scaling laws for magnetron injection guns (MIGs). The constraints are chosen to examine the maximum-peak-power capabilities of MIGs. The scaling laws are compared with exact solutions of the design equations and are supported by MIG simulations in which each MIG is designed to double the beam power of an existing design by adjusting one of the four fundamental parameters.

  19. Ensemble Pulsar Time Scale

    NASA Astrophysics Data System (ADS)

    Yin, D. S.; Gao, Y. P.; Zhao, S. H.

    2016-05-01

Millisecond pulsars can generate another type of time scale that is totally independent of the atomic time scale, because the physical mechanisms of the pulsar time scale and the atomic time scale are quite different from each other. Usually the pulsar timing observational data are not evenly sampled, and the intervals between data points range from several hours to more than half a month; moreover, these data sets are sparse, all of which makes it difficult to generate an ensemble pulsar time scale. Hence, a new algorithm to calculate the ensemble pulsar time scale is proposed. First, we use cubic spline interpolation to densify the data set and make the intervals between data points even. Then we employ the Vondrak filter to smooth the data set and remove high-frequency noise, and finally we adopt the weighted average method to generate the ensemble pulsar time scale. The pulsar timing residuals represent the clock difference between the pulsar time and atomic time, and high-precision pulsar timing data provide that clock difference measurement with a high signal-to-noise ratio, which is fundamental to generating pulsar time. We use the latest released NANOGrav (North American Nanohertz Observatory for Gravitational Waves) 9-year data set, which includes observations of 37 millisecond pulsars made with the 100-meter Green Bank telescope and the 305-meter Arecibo telescope, to generate the ensemble pulsar time scale. We find that the algorithm used in this paper can lower the influence of noise in the timing residuals and improve the long-term stability of the pulsar time. Results show that the long-term (> 1 yr) frequency stability of the pulsar time is better than 3.4×10^-15.
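The densify-smooth-average pipeline can be sketched in miniature. A pure-Python illustration on synthetic residuals, where linear interpolation and a moving average stand in for the cubic-spline densification and Vondrak filtering of the actual algorithm, and the weights are inverse variances (all data here are invented):

```python
import random

def interp_even(times, values, grid):
    """Interpolate unevenly sampled (time, value) pairs onto an even grid.

    Linear interpolation; the paper's algorithm uses a cubic spline.
    """
    out = []
    for t in grid:
        k = max(i for i, ti in enumerate(times) if ti <= t)
        if k == len(times) - 1:
            out.append(values[-1])
        else:
            f = (t - times[k]) / (times[k + 1] - times[k])
            out.append(values[k] + f * (values[k + 1] - values[k]))
    return out

def smooth(xs, win=5):
    """Moving average, a crude stand-in for the Vondrak filter."""
    h = win // 2
    return [sum(xs[max(0, i - h):i + h + 1]) / len(xs[max(0, i - h):i + h + 1])
            for i in range(len(xs))]

def ensemble(series):
    """Inverse-variance weighted average of per-pulsar residual series."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    weights = [1.0 / (var(s) + 1e-12) for s in series]
    wsum = sum(weights)
    return [sum(w * s[i] for w, s in zip(weights, series)) / wsum
            for i in range(len(series[0]))]

random.seed(1)
grid = list(range(100))
series = []
for _ in range(3):  # three pulsars with uneven, noisy sampling
    times = sorted(random.sample(range(100), 40))
    if times[0] != 0: times.insert(0, 0)
    if times[-1] != 99: times.append(99)
    vals = [random.gauss(0.0, 1.0) for _ in times]
    series.append(smooth(interp_even(times, vals, grid)))
pt = ensemble(series)
print(len(pt))  # 100
```

The weighting step is where a quiet pulsar dominates the ensemble: a series with smaller residual variance receives proportionally more weight, which is what suppresses the noisiest clocks.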

  20. Delusion assessment scales.

    PubMed

    Forgácová, L'ubica

    2008-03-01

Since the beginning of the 19th century, delusions have been classified mainly by their content or theme. Clinical psychopathological investigation requires additional variables that will allow investigators to describe the structure of delusional experience more accurately. Delusions are multidimensional constructs that may change across the various mental disorders. Several authors have developed rating scales with the aim to measure individual dimensions of delusional structure. In this paper, common rating scales are mentioned and the main characteristics of the Simple Delusional Syndrome Scale (SDSS) are summarized. The SDSS scale consists of 7 items (logical organization, systemization, stability, conviction, influence on the action, extension, and insertion), scored from 1 to 5. Results of the statistical analysis confirm good psychometric characteristics of the scale, with Cronbach's coefficient alpha = 0.8327. The SDSS may contribute to a better understanding and diagnosis of delusional disorders and, using statistical methods, can help quantify the relationship between the delusional syndrome and the primary disease process. The SDSS scale may also be utilized in the assessment of changes occurring in delusional syndromes depending on the therapeutic effect of psychopharmacological drugs.

  1. Goal attainment scaling (GAS) in rehabilitation: a practical guide.

    PubMed

    Turner-Stokes, Lynne

    2009-04-01

Goal attainment scaling is a mathematical technique for quantifying the achievement (or otherwise) of goals set, and it can be used in rehabilitation. Because several different approaches are described in the literature, this article presents a simple practical approach to encourage uniformity in its application. It outlines the process of setting goals appropriately, so that the achievement of each goal can be measured on a 5-point scale ranging from -2 to +2, and then explains a method for quantifying the outcome in a single aggregated goal attainment score. This method gives a numerical T-score which is normally distributed about a mean of 50 (if the goals are achieved precisely) with a standard deviation of 10 around this mean (if the goals are overachieved or underachieved). If desired, the approach encompasses weighting of goals to reflect the opinion of the patient on the personal importance of the goal and the opinion of the therapist or team on the difficulty of achieving the goal. Some practical tips are offered, as well as a simple spreadsheet (in Microsoft Excel) allowing easy calculation of the T-scores.
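The aggregation described above is conventionally the Kiresuk-Sherman T-score. A minimal sketch, assuming the standard formula with rho = 0.3 as the conventionally assumed inter-goal correlation (the article's spreadsheet may differ in detail); weights default to 1:

```python
import math

def gas_t_score(scores, weights=None, rho=0.3):
    """Goal attainment scaling T-score (Kiresuk-Sherman form).

    scores  : goal attainment on the -2..+2 scale
    weights : optional importance-x-difficulty weights
    T = 50 + 10 * sum(w_i x_i) / sqrt((1-rho)*sum(w_i^2) + rho*(sum w_i)^2)
    """
    if weights is None:
        weights = [1.0] * len(scores)
    num = sum(w * x for w, x in zip(weights, scores))
    den = math.sqrt((1 - rho) * sum(w * w for w in weights)
                    + rho * sum(weights) ** 2)
    return 50 + 10 * num / den

# Three goals, all achieved exactly as expected (score 0) -> T = 50.
print(gas_t_score([0, 0, 0]))  # 50.0
# One goal somewhat overachieved pulls the score above 50.
print(round(gas_t_score([1, 0, 0]), 1))
```

Because the numerator is zero whenever every goal lands exactly on its expected level, T = 50 regardless of the weights, which matches the article's description of the mean.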

  2. Fire toxicity scaling

    SciTech Connect

    Braun, E.; Levin, B.C.; Paabo, M.; Gurman, J.; Holt, T.

    1987-02-01

The toxicity of the thermal-decomposition products from two flexible polyurethane foams (with and without a fire retardant) and a cotton upholstery fabric was evaluated by a series of small-scale tests and large-scale single mock-up upholstery chair tests during smoldering or flaming decomposition. In addition, other fire property data such as rates of heat release, effective heats of combustion, specific gas species yields, and smoke obscuration were measured. The degree of toxicity observed during and following the flaming tests (both large-scale room burns and the NBS Toxicity Tests) could be explained by a 3-Gas Model which includes the combined toxicological effects of CO, CO2, and HCN. Essentially no animal deaths were noted during the thirty-minute exposures to the non-flaming or smoldering combustion products produced in the NBS Toxicity Test Method or the large-scale room test. In the large-scale room tests, little toxicological difference was noted between decomposition products from the burn room and a second room 12 meters away.

  3. Development of scale inhibitors

    SciTech Connect

    Gill, J.S.

    1996-12-01

    During the last fifty years, scale inhibition has gone from an art to a science. Scale inhibition has changed from simple pH adjustment to the use of optimized dose of designer polymers from multiple monomers. The water-treatment industry faces many challenges due to the need to conserve water, availability of only low quality water, increasing environmental regulations of the water discharge, and concern for human safety when using acid. Natural materials such as starch, lignin, tannin, etc., have been replaced with hydrolytically stable organic phosphates and synthetic polymers. Most progress in scale inhibition has come from the use of synergistic mixtures and copolymerizing different functionalities to achieve specific goals. Development of scale inhibitors requires an understanding of the mechanism of crystal growth and its inhibition. This paper discusses the historic perspective of scale inhibition and the development of new inhibitors based on the understanding of the mechanism of crystal growth and the use of powerful tools like molecular modeling to visualize crystal-inhibitor interactions.

  4. Atomic Scale Plasmonic Switch.

    PubMed

    Emboras, Alexandros; Niegemann, Jens; Ma, Ping; Haffner, Christian; Pedersen, Andreas; Luisier, Mathieu; Hafner, Christian; Schimmel, Thomas; Leuthold, Juerg

    2016-01-13

The atom sets an ultimate scaling limit to Moore's law in the electronics industry. While electronics research already explores atomic-scale devices, photonics research still deals with devices at the micrometer scale. Here we demonstrate that photonic scaling, similar to electronics, is only limited by the atom. More precisely, we introduce an electrically controlled plasmonic switch operating at the atomic scale. The switch allows for fast and reproducible switching by means of the relocation of an individual or, at most, a few atoms in a plasmonic cavity. Depending on the location of the atom, either of two distinct plasmonic cavity resonance states is supported. Experimental results show reversible digital optical switching with an extinction ratio of 9.2 dB and operation at room temperature up to MHz rates with femtojoule (fJ) power consumption for a single switch operation. This demonstration of an integrated quantum device that allows control of photons at the atomic level opens intriguing perspectives for a fully integrated and highly scalable chip platform, a platform where optics, electronics, and memory may be controlled at the single-atom level.

  5. Full Scale Tunnel model

    NASA Technical Reports Server (NTRS)

    1929-01-01

Interior view of Full-Scale Tunnel (FST) model. (Small human figures have been added for scale.) On June 26, 1929, Elton W. Miller wrote to George W. Lewis proposing the construction of a model of the full-scale tunnel: 'The excellent energy ratio obtained in the new wind tunnel of the California Institute of Technology suggests that before proceeding with our full scale tunnel design, we ought to investigate the effect on energy ratio of such factors as: 1. small included angle for the exit cone; 2. carefully designed return passages of circular section as far as possible, without sudden changes in cross sections; 3. tightness of walls. It is believed that much useful information can be obtained by building a model of about 1/16 scale, that is, having a closed throat of 2 ft. by 4 ft. The outside dimensions would be about 12 ft. by 25 ft. in plan and the height 4 ft. Two propellers will be required about 28 in. in diameter, each to be driven by direct current motor at a maximum speed of 4500 R.P.M. Provision can be made for altering the length of certain portions, particularly the exit cone, and possibly for the application of boundary layer control in order to effect satisfactory air flow.'

  6. Spatial ecology across scales.

    PubMed

    Hastings, Alan; Petrovskii, Sergei; Morozov, Andrew

    2011-04-23

The international conference 'Models in population dynamics and ecology 2010: animal movement, dispersal and spatial ecology' took place at the University of Leicester, UK, on 1-3 September 2010, focusing on mathematical approaches to spatial population dynamics and emphasizing cross-scale issues. Exciting new developments in scaling up from individual-level movement to descriptions of this movement at the macroscopic level highlighted the importance of mechanistic approaches, with different descriptions at the microscopic level leading to different ecological outcomes. At higher levels of organization, different macroscopic descriptions of movement also led to different properties at the ecosystem and larger scales. New developments, from Lévy flight descriptions to the incorporation of new methods from physics and elsewhere, are revitalizing research in spatial ecology, which will both increase understanding of fundamental ecological processes and lead to tools for better management.

  7. Scale of dark QCD

    NASA Astrophysics Data System (ADS)

    Bai, Yang; Schwaller, Pedro

    2014-03-01

    Most of the mass of ordinary matter has its origin from quantum chromodynamics (QCD). A similar strong dynamics, dark QCD, could exist to explain the mass origin of dark matter. Using infrared fixed points of the two gauge couplings, we provide a dynamical mechanism that relates the dark QCD confinement scale to our QCD scale, and hence provides an explanation for comparable dark baryon and proton masses. Together with a mechanism that generates equal amounts of dark baryon and ordinary baryon asymmetries in the early Universe, the similarity of dark matter and ordinary matter energy densities can be naturally explained. For a large class of gauge group representations, the particles charged under both QCD and dark QCD, necessary ingredients for generating the infrared fixed points, are found to have masses at 1-2 TeV, which sets the scale for dark matter direct detection and novel collider signatures involving visible and dark jets.

  8. Generalized scale invariant theories

    NASA Astrophysics Data System (ADS)

    Padilla, Antonio; Stefanyszyn, David; Tsoukalas, Minas

    2014-03-01

We present the most general actions of a single scalar field and two scalar fields coupled to gravity, consistent with second-order field equations in four dimensions, possessing local scale invariance. We apply two different methods to arrive at our results. One method, Ricci gauging, was known in the literature, and we find it to produce the same result for the case of one scalar field as a more efficient method presented here. However, we also find our more efficient method to be much more general when we consider two scalar fields. Locally scale invariant actions are also presented for theories with more than two scalar fields coupled to gravity, and we explain how one could construct the most general actions for any number of scalar fields. Our generalized scale invariant actions have obvious applications to early Universe cosmology and include, for example, the Bezrukov-Shaposhnikov action as a subset.

  9. Absolute neutrino mass scale

    NASA Astrophysics Data System (ADS)

    Capelli, Silvia; Di Bari, Pasquale

    2013-04-01

Neutrino oscillation experiments firmly established non-vanishing neutrino masses, a result that can be regarded as a strong motivation to extend the Standard Model. In spite of being the lightest massive particles, neutrinos likely represent an important bridge to new physics at very high energies and offer new opportunities to address some of the current cosmological puzzles, such as the matter-antimatter asymmetry of the Universe and Dark Matter. In this context, the determination of the absolute neutrino mass scale is a key issue within modern High Energy Physics. The talks in this parallel session describe the current exciting experimental activity aiming to determine the absolute neutrino mass scale, and offer an overview of a few models beyond the Standard Model that have been proposed to explain the neutrino masses, giving a prediction for the absolute neutrino mass scale and solving the cosmological puzzles.

  10. Irreversibility time scale.

    PubMed

    Gallavotti, G

    2006-06-01

Entropy creation rate is introduced for a system interacting with thermostats (i.e., for a system subject to internal conservative forces interacting with "external" thermostats via conservative forces) and a fluctuation theorem for it is proved. As an application, a time scale is introduced, to be interpreted as the time over which irreversibility becomes manifest in a process leading from an initial to a final stationary state of a mechanical system in a general nonequilibrium context. The time scale is evaluated in a few examples, including the classical Joule-Thomson process (gas expansion in a vacuum).

  11. Fundamentals of Zoological Scaling.

    ERIC Educational Resources Information Center

    Lin, Herbert

    1982-01-01

    The following animal characteristics are considered to determine how properties and characteristics of various systems change with system size (scaling): skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing-flapping, and maximum sizes of flying and hovering…

  12. SCALING UNDERWATER EXPLODING WIRES

    DTIC Science & Technology

    heat of detonation of TNT in calories per gram. This scaling behavior extends the law of similarity six decades in terms of weight, from pounds to micropounds. The peak pressure for exploding-wire phenomena has been obtained from data and is empirically expressed as pm = 26,800 (cube root of W/R) to

  13. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  14. Scale, Composition, and Technology

    ERIC Educational Resources Information Center

    Victor, Peter A.

    2009-01-01

    Scale (gross domestic product), composition (goods and services), and technology (impacts per unit of goods and services) in combination are the proximate determinants in an economy of the resources used, wastes generated, and land transformed. In this article, we examine relationships among these determinants to understand better the contribution…

  15. Student Descriptor Scale Manual.

    ERIC Educational Resources Information Center

    Goetz, Lori; And Others

    The Student Descriptor Scale (SDS) was developed as a validation measure to determine whether students described and counted by states as "severely handicapped" were, indeed, students with severe disabilities. The SDS addresses nine characteristics: intellectual disability, health impairment, need for toileting assistance, upper torso motor…

  16. The Spiritual Competency Scale

    ERIC Educational Resources Information Center

    Robertson, Linda A.

    2010-01-01

    This study describes the development of the Spiritual Competency Scale, which was based on the Association for Spiritual, Ethical and Religious Values in Counseling's original Spiritual Competencies. Participants were 662 counseling students from religiously based and secular universities nationwide. Exploratory factor analysis revealed a 22-item,…

  17. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment for Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  18. The Social Integration Scale.

    ERIC Educational Resources Information Center

    Ross, Susan M.; Straus, Murray A.

    The Social Integration Scale (SIS) is intended to facilitate empirical research on the applicability of control theory to many types of adult crime, including "street crime," white collar crime, and physical assaults on spouses. There are five subscales: (1) belief (belief in law and social control); (2) commitment (psychological…

  19. Bristol Stool Form Scale

    MedlinePlus

    Bristol Stool Form Scale: Type 1 - Separate hard lumps, like nuts; Type 2 - Sausage-shaped but lumpy; Type 3 - Like a sausage or snake but with cracks on its surface; Type 4 - Like a sausage or snake, smooth and soft ...

  20. Scaling School Turnaround

    ERIC Educational Resources Information Center

    Herman, Rebecca

    2012-01-01

    This article explores the research on turning around low performing schools to summarize what we know, what we don't know, and what this means for scaling school turnaround efforts. "School turnaround" is defined here as quick, dramatic gains in academic achievement for persistently low performing schools. The article first considers the…

  1. Allometric scaling of countries

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Yu, Tongkui

    2010-11-01

    As huge complex systems consisting of geographic regions, natural resources, people and economic entities, countries follow the allometric scaling law which is ubiquitous in ecological and urban systems. We systematically investigated the allometric scaling relationships between a large number of macroscopic properties and the geographic (area), demographic (population) and economic (GDP, gross domestic product) sizes of countries respectively. We found that most of the economic, trade, energy consumption and communication related properties have significant super-linear (the exponent is larger than 1) or nearly linear allometric scaling relations with GDP. Meanwhile, the geographic (arable area, natural resources, etc.), demographic (labor force, military age population, etc.) and transportation-related properties (road length, airports) have significant and sub-linear (the exponent is smaller than 1) allometric scaling relations with area. Several differences between countries and cities in their power-law relations with population were pointed out. First, population increases sub-linearly with area in countries. Second, GDP increases linearly with population in countries, not super-linearly as in cities. Finally, electricity or oil consumption per capita increases with population faster in countries than in cities.
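    The super-linear and sub-linear exponents above come from fitting a power law y = c·x^β on log-log axes. A minimal sketch of such a fit, using synthetic data (the sizes, exponent, and noise level below are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical illustration: estimate the allometric exponent beta in
# y = c * x**beta by ordinary least squares on log-transformed data.
# The synthetic "countries" below are assumptions, not the paper's data.
rng = np.random.default_rng(0)
x = 10 ** rng.uniform(4, 7, size=200)           # GDP-like sizes
beta_true, c_true = 1.15, 2.0                   # assumed super-linear exponent
y = c_true * x**beta_true * rng.lognormal(0, 0.1, size=200)

# Fit log y = log c + beta * log x; polyfit returns slope first
beta_hat, logc_hat = np.polyfit(np.log(x), np.log(y), 1)
print(round(beta_hat, 2))  # close to 1.15
```

    An estimated exponent above 1 indicates super-linear scaling; below 1, sub-linear.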

  2. Scaling Applications in hydrology

    NASA Astrophysics Data System (ADS)

    Gebremichael, Mekonnen

    2010-05-01

    Besides downscaling applications, scaling properties of hydrological fields can be used to address a variety of research questions. In this presentation, we will use scaling properties to address questions related to satellite evapotranspiration algorithms, precipitation-streamflow relationships, and hydrological model calibration. Most of the existing satellite-based evapotranspiration (ET) algorithms have been developed using fine-resolution Landsat TM and ASTER data. However, these algorithms are often applied to coarse-resolution MODIS data. Our results show that applying the satellite-based algorithms, which are developed at ASTER resolution, to MODIS resolution leads to ET estimates that (1) preserve the overall spatial pattern (spatial correlation in excess of 0.90), (2) increase the spatial standard deviation and maximum value, (3) have modest conditional bias: underestimate low ET rates (< 1 mm/day) and overestimate high ET rates; the overestimation is within 20%. The results emphasize the need for exploring alternatives for estimation of ET from MODIS. Understanding the relationship between the scaling properties of precipitation and streamflow is important in a number of applications. We present the results of a detailed river flow fluctuation analysis on daily records from 14 stations in the Flint River basin in Georgia in the United States with focus on effect of watershed area on long memory of river flow fluctuations. The areas of the watersheds draining to the stations range from 22 km2 to 19,606 km2. Results show that large watersheds have more persistent flow fluctuations and stronger long-term (time greater than scale break point) memory than small watersheds while precipitation time series shows weak long-term correlation. We conclude that a watershed acts as a 'filter' for a 'white noise' precipitation with more significant filtering in case of large watersheds. 
Finally, we compare the scaling properties of simulated and observed spatial soil

  3. Scales of mantle heterogeneity

    NASA Astrophysics Data System (ADS)

    Moore, J. C.; Akber-Knutson, S.; Konter, J.; Kellogg, J.; Hart, S.; Kellogg, L. H.; Romanowicz, B.

    2004-12-01

    A long-standing question in mantle dynamics concerns the scale of heterogeneity in the mantle. Mantle convection tends to both destroy (through stirring) and create (through melt extraction and subduction) heterogeneity in bulk and trace element composition. Over time, these competing processes create variations in geochemical composition along mid-oceanic ridges and among oceanic islands, spanning a range of scales from extremely long wavelength (for example, the DUPAL anomaly) to very small scale (for example, variations amongst melt inclusions). While geochemical data and seismic observations can be used to constrain the length scales of mantle heterogeneity, dynamical mixing calculations can illustrate the processes and timescales involved in stirring and mixing. At the Summer 2004 CIDER workshop on Relating Geochemical and Seismological Heterogeneity in the Earth's Mantle, an interdisciplinary group evaluated scales of heterogeneity in the Earth's mantle using a combined analysis of geochemical data, seismological data and results of numerical models of mixing. We mined the PetDB database for isotopic data from glass and whole rock analyses for the Mid-Atlantic Ridge (MAR) and the East Pacific Rise (EPR), projecting them along the ridge length. We examined Sr isotope variability along the East Pacific rise by looking at the difference in Sr ratio between adjacent samples as a function of distance between the samples. The East Pacific Rise exhibits an overall bowl shape of normal MORB characteristics, with higher values in the higher latitudes (there is, however, an unfortunate gap in sampling, roughly 2000 km long). These background characteristics are punctuated with spikes in values at various locations, some, but not all of which are associated with off-axis volcanism. A Lomb-Scargle periodogram for unevenly spaced data was utilized to construct a power spectrum of the scale lengths of heterogeneity along both ridges. 
Using the same isotopic systems (Sr, Nd

  4. Application of psychometric theory to the measurement of voice quality using rating scales.

    PubMed

    Shrivastav, Rahul; Sapienza, Christine M; Nandur, Vuday

    2005-04-01

    Rating scales are commonly used to study voice quality. However, recent research has demonstrated that perceptual measures of voice quality obtained using rating scales suffer from poor interjudge agreement and reliability, especially in the mid-range of the scale. These findings, along with those obtained using multidimensional scaling (MDS), have been interpreted to show that listeners perceive voice quality in an idiosyncratic manner. Based on psychometric theory, the present research explored an alternative explanation for the poor interlistener agreement observed in previous research. This approach suggests that poor agreement between listeners may result, in part, from measurement errors related to a variety of factors rather than true differences in the perception of voice quality. In this study, 10 listeners rated breathiness for 27 vowel stimuli using a 5-point rating scale. Each stimulus was presented to the listeners 10 times in random order. Interlistener agreement and reliability were calculated from these ratings. Agreement and reliability were observed to improve when multiple ratings of each stimulus from each listener were averaged and when standardized scores were used instead of absolute ratings. The probability of exact agreement was found to be approximately .9 when using averaged ratings and standardized scores. In contrast, the probability of exact agreement was only .4 when a single rating from each listener was used to measure agreement. These findings support the hypothesis that poor agreement reported in past research partly arises from errors in measurement rather than individual differences in the perception of voice quality.
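    The finding that averaging repeated ratings improves agreement follows from classical measurement error: the error of a mean of n independent ratings shrinks roughly as 1/sqrt(n). A toy simulation under assumed error levels (none of these numbers come from the study):

```python
import numpy as np

# Toy simulation (not the study's data): each listener's single rating of a
# stimulus is the "true" breathiness plus random measurement error. Averaging
# repeated ratings shrinks that error by ~1/sqrt(n), which is one reason
# agreement improved when 10 ratings per stimulus were averaged.
rng = np.random.default_rng(1)
true_scores = rng.uniform(1, 5, size=27)       # 27 stimuli, as in the study
noise = rng.normal(0, 0.8, size=(10, 27))      # 10 repeats, assumed error sd

single = true_scores + noise[0]                # one rating per stimulus
averaged = true_scores + noise.mean(axis=0)    # mean of 10 ratings

err_single = np.abs(single - true_scores).mean()
err_avg = np.abs(averaged - true_scores).mean()
print(err_avg < err_single)  # True: averaging cuts measurement error
```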

  5. Measuring Growth with Vertical Scales

    ERIC Educational Resources Information Center

    Briggs, Derek C.

    2013-01-01

    A vertical score scale is needed to measure growth across multiple tests in terms of absolute changes in magnitude. Since the warrant for subsequent growth interpretations depends upon the assumption that the scale has interval properties, the validation of a vertical scale would seem to require methods for distinguishing interval scales from…

  6. ELECTRONIC PULSE SCALING CIRCUITS

    DOEpatents

    Cooke-Yarborough, E.H.

    1958-11-18

    Electronic pulse scaling circuits of the kind comprising a series of bi-stable elements connected in sequence, usually in the form of a ring so as to be cyclically repetitive at the highest scaling factor, are described. The scaling circuit comprises a ring system of bi-stable elements each arranged on turn-off to cause a succeeding element of the ring to be turned on, and one being arranged on turn-off to cause a further element of the ring to be turned on. In addition, separate means are provided for applying a turn-off pulse to all the elements simultaneously, and for resetting the elements to a starting condition at the end of each cycle.
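    The counting behaviour of such a ring scaler can be mimicked in software. The sketch below is an assumed abstraction, not a model of the patented circuit: one stage is "on" at a time, each pulse advances the ring, and a completed cycle produces one carry, dividing the input pulse rate by the number of stages.

```python
# Assumed software analogue of a ring scaler: N bistable stages, exactly one
# "on" at a time; each input pulse turns the current stage off and the next
# stage on. A full cycle of N pulses produces one carry.
class RingScaler:
    def __init__(self, n_stages: int):
        self.n = n_stages
        self.state = 0          # index of the stage currently "on"
        self.carries = 0        # completed cycles

    def pulse(self) -> None:
        self.state = (self.state + 1) % self.n
        if self.state == 0:     # end of cycle: back to starting condition
            self.carries += 1

    def reset(self) -> None:    # the patent's separate resetting means
        self.state = 0

scaler = RingScaler(10)         # scale-of-ten ring
for _ in range(25):
    scaler.pulse()
print(scaler.carries, scaler.state)  # 2 5: 25 pulses = 2 full cycles + 5
```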

  7. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  8. Scaling macroscopic aquatic locomotion

    NASA Astrophysics Data System (ADS)

    Gazzola, Mattia; Argentina, Médéric; Mahadevan, L.

    2014-10-01

    Inertial aquatic swimmers that use undulatory gaits range in length L from a few millimetres to 30 metres, across a wide array of biological taxa. Using elementary hydrodynamic arguments, we uncover a unifying mechanistic principle characterizing their locomotion by deriving a scaling relation that links swimming speed U to body kinematics (tail beat amplitude A and frequency ω) and fluid properties (kinematic viscosity ν). This principle can be simply couched as the power law Re ~ Sw^α, where Re = UL/ν >> 1 and Sw = ωAL/ν, with α = 4/3 for laminar flows, and α = 1 for turbulent flows. Existing data from over 1,000 measurements on fish, amphibians, larvae, reptiles, mammals and birds, as well as direct numerical simulations, are consistent with our scaling. We interpret our results as the consequence of the convergence of aquatic gaits to the performance limits imposed by hydrodynamics.
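    Treating the proportionality Re ~ Sw^α as an equality with unit prefactor (an illustration only; the abstract reports a scaling relation, not these exact constants), the swimming speed implied for an assumed swimmer follows directly from the definitions Re = UL/ν and Sw = ωAL/ν:

```python
# Sketch of the scaling relation Re ~ Sw^alpha, treating the proportionality
# as an equality for illustration. The parameter values for this "fish" and
# the laminar/turbulent crossover are assumptions chosen to exercise the
# formula, not values from the paper.
nu = 1.0e-6        # kinematic viscosity of water, m^2/s
L = 0.3            # body length, m
A = 0.06           # tail-beat amplitude, m (assumed ~0.2 L)
omega = 10.0       # tail-beat frequency, rad/s (assumed)

Sw = omega * A * L / nu                 # swim number
alpha = 1.0 if Sw > 1e4 else 4.0 / 3.0  # assumed crossover to turbulence
Re = Sw ** alpha                        # Re ~ Sw^alpha with unit prefactor
U = Re * nu / L                         # swimming speed from Re = U L / nu
print(f"Sw = {Sw:.3g}, Re = {Re:.3g}, U = {U:.3g} m/s")
```

    For these assumed values the swimmer sits in the turbulent regime (α = 1), giving a speed of order a body length per second.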

  9. Scaling macroscopic aquatic locomotion

    NASA Astrophysics Data System (ADS)

    Gazzola, Mattia; Argentina, Mederic; Mahadevan, Lakshminarayanan

    2014-11-01

    Inertial aquatic swimmers that use undulatory gaits range in length L from a few millimeters to 30 meters, across a wide array of biological taxa. Using elementary hydrodynamic arguments, we uncover a unifying mechanistic principle characterizing their locomotion by deriving a scaling relation that links swimming speed U to body kinematics (tail beat amplitude A and frequency ω) and fluid properties (kinematic viscosity ν). This principle can be simply couched as the power law Re ~ Sw^α, where Re = UL/ν >> 1 and Sw = ωAL/ν, with α = 4/3 for laminar flows, and α = 1 for turbulent flows. Existing data from over 1000 measurements on fish, amphibians, larvae, reptiles, mammals and birds, as well as direct numerical simulations, are consistent with our scaling. We interpret our results as the consequence of the convergence of aquatic gaits to the performance limits imposed by hydrodynamics.

  10. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.

  11. The Extragalactic Distance Scale

    NASA Astrophysics Data System (ADS)

    Livio, Mario; Donahue, Megan; Panagia, Nino

    1997-07-01

    Participants; Preface; Foreword; Early history of the distance scale problem, S. van den Bergh; Cosmology: from Hubble to HST, M. S. Turner; Age constraints from nucleocosmochronology, J. Truran; The ages of globular clusters, P. Demarque; The linearity of the Hubble flow, M. Postman; Gravitational lensing and the extragalactic distance scale, R. D. Blandford and T. Kundic; Using the cosmic microwave background to constrain the Hubble constant, A. Lasenby and M. Jones; Cepheids as distance indicators, N. R. Tanvir; The I-band Tully-Fisher relation and the Hubble constant, R. Giovanelli; The calibration of type Ia supernovae as standard candles, A. Saha; Focusing in on the Hubble constant, G. A. Tammann & M. Federspiel; Interim report on the calibration of the Tully-Fisher relation in the HST Key Project to measure the Hubble constant, J. Mould et al.; Hubble Space Telescope Key Project on the extragalactic distance scale, W. L. Freedman, B. F. Madore and R. C. Kennicutt; Novae as distance indicators, M. Livio; Verifying the planetary nebula luminosity function method, G. H. Jacoby; On the possible use of radio supernovae for distance determinations, K. W. Weiler et al.; Post-AGB stars as standard candles, H. Bond; Helium core flash at the tip of the red giant branch: a population II distance indicator, B. F. Madore, W. L. Freedman and S. Sakai; Globular clusters as distance indicators, B. C. Whitmore; Detached eclipsing binaries as primary distance and age indicators, B. Paczynski; Light echoes: geometric measurement of galaxy distances, W. B. Sparks; The SBF survey of galaxy distances, J. L. Tonry; Extragalactic distance scales: the long and short of it, V. Trimble.

  12. Full Scale Wind Tunnel

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Construction of motor fairing for the fan motors of the Full-Scale Tunnel (FST). The motors and their supporting structures were enclosed in aerodynamically smooth fairings to minimize resistance to the air flow. Close examination of this photograph reveals the complicated nature of constructing a wind tunnel. This motor fairing, like almost every other structure in the FST, represents a one-of-a-kind installation.

  13. Is this scaling nonlinear?

    PubMed Central

    2016-01-01

    One of the most celebrated findings in complex systems in the last decade is that different indexes y (e.g. patents) scale nonlinearly with the population x of the cities in which they appear, i.e. y ~ x^β with β ≠ 1. More recently, the generality of this finding has been questioned in studies that used new databases and different definitions of city boundaries. In this paper, we investigate the existence of nonlinear scaling using a probabilistic framework in which fluctuations are accounted for explicitly. In particular, we show that this framework allows us not only to (i) estimate β and confidence intervals, but also to (ii) quantify the evidence in favour of β ≠ 1 and (iii) test the hypothesis that the observations are compatible with the nonlinear scaling. We employ this framework to compare five different models to 15 different datasets and we find that the answers to points (i)-(iii) crucially depend on the fluctuations contained in the data, on how they are modelled, and on the fact that the city sizes are heavy-tailed distributed. PMID:27493764
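    The core of points (i) and (ii) can be illustrated with an ordinary least-squares fit on log-log axes and a normal-approximation confidence interval for β. This is a simplified stand-in for the paper's probabilistic framework, run on synthetic data (all values below are assumptions):

```python
import numpy as np

# Illustrative check of nonlinear scaling y ~ x^beta on assumed synthetic
# cities (not the paper's datasets): fit beta on log-log axes and build a
# rough normal-approximation confidence interval to ask whether beta != 1.
rng = np.random.default_rng(7)
x = 10 ** rng.uniform(4, 7, 500)                   # city populations
y = 0.05 * x**1.12 * rng.lognormal(0, 0.3, 500)    # super-linear index

lx, ly = np.log(x), np.log(y)
X = np.column_stack([np.ones_like(lx), lx])
coef, res, *_ = np.linalg.lstsq(X, ly, rcond=None)
beta = coef[1]
sigma2 = res[0] / (len(lx) - 2)                    # residual variance
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
lo, hi = beta - 1.96 * se, beta + 1.96 * se
print(lo > 1.0 or hi < 1.0)  # True here: the interval excludes beta = 1
```

    As the paper stresses, the verdict depends on how the fluctuations are modelled; the log-normal noise assumed here is only one choice.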

  14. Urban scaling in Europe

    PubMed Central

    Bettencourt, Luís M. A.; Lobo, José

    2016-01-01

    Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system. PMID:26984190

  15. Urban scaling in Europe.

    PubMed

    Bettencourt, Luís M A; Lobo, José

    2016-03-01

    Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system.

  16. Global Scale Solar Disturbances

    NASA Astrophysics Data System (ADS)

    Title, A. M.; Schrijver, C. J.; DeRosa, M. L.

    2013-12-01

    The combination of the STEREO and SDO missions has allowed, for the first time, imagery of the entire Sun. This, coupled with the high cadence, broad thermal coverage, and large dynamic range of the Atmospheric Imaging Assembly on SDO, has allowed the discovery of impulsive solar disturbances that can significantly affect a hemisphere or more of the solar volume. Such events are often, but not always, associated with M- and X-class flares. GOES C- and even B-class flares are also associated with these large-scale disturbances. Key to the recognition of the large-scale disturbances was the creation of log-difference movies: by taking the log of images before differencing, events in the corona become much more evident. Because such events cover such a large portion of the solar volume, their passage can affect the dynamics of the entire corona as it adjusts to and recovers from them. In some cases this may lead to another flare or filament ejection, but in general direct causal evidence of 'sympathetic' behavior is lacking. However, evidence is accumulating that these large-scale events create an environment that encourages other solar instabilities to occur. Understanding the source of these events and how the energy that drives them is built up, stored, and suddenly released is critical to understanding the origins of space weather. Example events and comments on their relevance will be presented.

  17. Advanced scale conditioning agents

    SciTech Connect

    Davis, Jeff; Battaglia, Philip J.

    2004-06-01

    A technical description of Advanced Scale Conditioning Agent (ASCA) technology was published in the May-June 2003 edition of the Nuclear Plant Journal. That article described the development of advanced scale conditioning agent programs and the specific agent types used to keep the secondary side of steam generators within a pressurized water reactor free of deposited corrosion products and corrosion-inducing contaminants, to ensure their long-term operation. This article describes the first two plant applications of advanced scale conditioning agents, implemented at Southern Nuclear Operating Company's Vogtle Units 1 and 2 during their 2002 scheduled outages, to minimize tube degradation and maintain full power operation using the most effective techniques while minimizing outage costs. The goal was to remove three to four fuel cycles of deposits from each steam generator so that, after future chemical cleaning activities, ASCAs could be used to maintain the cleanliness of the steam generators without the need for additional chemical cleaning efforts. The goal was achieved, along with several other benefits that resulted in cost savings to the plant.

  18. Micro-Scale Thermoacoustics

    NASA Astrophysics Data System (ADS)

    Offner, Avshalom; Ramon, Guy Z.

    2016-11-01

    Thermoacoustic phenomena - conversion of heat to acoustic oscillations - may be harnessed for construction of reliable, practically maintenance-free engines and heat pumps. Specifically, miniaturization of thermoacoustic devices holds great promise for cooling of micro-electronic components. However, as device size is pushed down to the micrometer scale, it is expected that non-negligible slip effects will exist at the solid-fluid interface. Accordingly, new theoretical models for thermoacoustic engines and heat pumps were derived, accounting for a slip boundary condition. These models are essential for the design process of micro-scale thermoacoustic devices that will operate under ultrasonic frequencies. Stability curves for engines - representing the onset of self-sustained oscillations - were calculated with both no-slip and slip boundary conditions, revealing improvement in the performance of engines with slip at the resonance frequency range applicable for micro-scale devices. Maximum achievable temperature difference curves for thermoacoustic heat pumps were calculated, revealing the negative effect of slip on the ability to pump heat up a temperature gradient. The authors acknowledge the support from the Nancy and Stephen Grand Technion Energy Program (GTEP).

  19. Color quality scale

    NASA Astrophysics Data System (ADS)

    Davis, Wendy; Ohno, Yoshi

    2010-03-01

    The color rendering index (CRI) has been shown to have deficiencies when applied to white light-emitting-diode-based sources. Furthermore, evidence suggests that the restricted scope of the CRI unnecessarily penalizes some light sources with desirable color qualities. To solve the problems of the CRI and include other dimensions of color quality, the color quality scale (CQS) has been developed. Although the CQS uses many of the elements of the CRI, there are a number of fundamental differences. Like the CRI, the CQS is a test-samples method that compares the appearance of a set of reflective samples when illuminated by the test lamp to their appearance under a reference illuminant. The CQS uses a larger set of reflective samples, all of high chroma, and combines the color differences of the samples with a root mean square. Additionally, the CQS does not penalize light sources for causing increases in the chroma of object colors but does penalize sources with smaller rendered color gamut areas. The scale of the CQS is converted to span 0-100, and the uniform object color space and chromatic adaptation transform used in the calculations are updated. Supplementary scales have also been developed for expert users.
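    The root-mean-square combination step mentioned above can be sketched as follows; the per-sample color differences and the 0-100 conversion factor here are illustrative assumptions, not the official CQS constants:

```python
import math

# Sketch of a CQS-style combination step: per-sample color differences are
# combined by root mean square (unlike the CRI's arithmetic mean), so a few
# badly rendered samples weigh more heavily. Both the delta-E values and the
# scaling factor below are assumptions, not the published CQS formula.
delta_e = [2.0, 3.5, 1.0, 8.0, 2.5]            # assumed per-sample differences
rms = math.sqrt(sum(d * d for d in delta_e) / len(delta_e))
score = max(0.0, 100.0 - 3.1 * rms)            # assumed 0-100 conversion
print(round(rms, 2), round(score, 1))
```

    Because the combination is quadratic, the single large difference (8.0) dominates the result more than it would under a simple average.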

  20. Estimation of local spatial scale

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1987-01-01

    The concept of local scale asserts that for a given class of psychophysical measurements, performance at any two visual field locations is equated by magnifying the targets by the local scale associated with each location. Local scale has been hypothesized to be equal to cortical magnification or alternatively to the linear density of receptors or ganglion cells. Here, it is shown that it is possible to estimate local scale without prior knowledge about the scale or its physiological basis.

  1. The Loneliness Questionnaire-Short Version: an evaluation of reverse-worded and non-reverse-worded items via item response theory.

    PubMed

    Ebesutani, Chad; Drescher, Christopher F; Reise, Steven P; Heiden, Laurie; Hight, Terry L; Damon, John D; Young, John

    2012-01-01

    Although reverse-worded items have often been incorporated in scale construction to minimize the effects of acquiescent reporting biases, some researchers have more recently begun questioning this approach and wondering whether the advantages associated with incorporating reverse-worded items are worth the complexities that they bring to measures (e.g., Brown, 2003; Marsh, 1996). In this study, we used item response theory (IRT) to determine whether there is statistical justification to eliminate the reverse-worded items (e.g., "I have lots of friends") from the Loneliness Questionnaire (LQ; Asher, Hymel, & Renshaw, 1984) and retain only the non-reverse-worded items (e.g., "I'm lonely") to inform the provision of a shortened LQ version. Using a large sample of children (Grades 2-7; n = 6,784) and adolescents (Grades 8-12; n = 4,941), we examined the psychometric properties of the 24-item LQ and found support for retaining the 9 non-reverse-worded LQ items to make up a shortened measure of loneliness in youth. We found that the non-reverse-worded items were associated with superior psychometric properties relative to the reverse-worded items with respect to reliability and IRT parameters (e.g., discrimination and item information). A 3-point Likert-type scale was also found to be more suitable for measuring loneliness across both children and adolescents compared to the original 5-point scale. The relative contributions of reverse-worded and non-reverse-worded items in scale development for youth instruments are also discussed.
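    The IRT quantities the study compares (discrimination and item information) can be illustrated with the standard two-parameter logistic (2PL) model; the item parameters below are made up for illustration:

```python
import math

# Standard 2PL item response model: item information peaks at theta = b and
# grows with the square of the discrimination a, which is why items with
# higher discrimination are the more informative ones. Parameter values here
# are assumptions, not estimates from the LQ data.
def p_2pl(theta: float, a: float, b: float) -> float:
    """Probability of endorsing the item under the 2PL model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta: float, a: float, b: float) -> float:
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

# A high-discrimination item (a = 2.0) is more informative near its
# difficulty than a low-discrimination one (a = 0.8).
print(item_information(0.0, 2.0, 0.0) > item_information(0.0, 0.8, 0.0))  # True
```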

  2. Mechanism for salt scaling

    NASA Astrophysics Data System (ADS)

    Valenza, John J., II

    Salt scaling is superficial damage caused by freezing a saline solution on the surface of a cementitious body. The damage consists of the removal of small chips or flakes of binder. The discovery of this phenomenon in the early 1950s prompted hundreds of experimental studies, which clearly elucidated the characteristics of this damage. In particular it was shown that a pessimum salt concentration exists, where a moderate salt concentration (˜3%) results in the most damage. Despite the numerous studies, the mechanism responsible for salt scaling has not been identified. In this work it is shown that salt scaling is a result of the large thermal expansion mismatch between ice and the cementitious body, and that the mechanism responsible for damage is analogous to glue-spalling. When ice forms on a cementitious body a bi-material composite is formed. The thermal expansion coefficient of the ice is ˜5 times that of the underlying body, so when the temperature of the composite is lowered below the melting point, the ice goes into tension. Once this stress exceeds the strength of the ice, cracks initiate in the ice and propagate into the surface of the cementitious body, removing a flake of material. The glue-spall mechanism accounts for all of the characteristics of salt scaling. In particular, a theoretical analysis is presented which shows that the pessimum concentration is a consequence of the effect of brine pockets on the mechanical properties of ice, and that the damage morphology is accounted for by fracture mechanics. Finally, empirical evidence is presented that proves that the glue-spall mechanism is the primary cause of salt scaling. The primary experimental tool used in this study is a novel warping experiment, where a pool of liquid is formed on top of a thin (˜3 mm) plate of cement paste. Stresses in the plate, including thermal expansion mismatch, result in warping of the plate, which is easily detected. This technique revealed the existence of

  3. Comparing the theoretical versions of the Beaufort scale, the T-Scale and the Fujita scale

    NASA Astrophysics Data System (ADS)

    Meaden, G. Terence; Kochev, S.; Kolendowicz, L.; Kosa-Kiss, A.; Marcinoniene, Izolda; Sioutas, Michalis; Tooming, Heino; Tyrrell, John

    2007-02-01

    2005 is the bicentenary of the Beaufort Scale and its wind-speed codes: the marine version in 1805 and the land version later. In the 1920s when anemometers had come into general use, the Beaufort Scale was quantified by a formula based on experiment. In the early 1970s two tornado wind-speed scales were proposed: (1) an International T-Scale based on the Beaufort Scale; and (2) Fujita's damage scale developed for North America. The International Beaufort Scale and the T-Scale share a common root in having an integral theoretical relationship with an established scientific basis, whereas Fujita's Scale introduces criteria that make its intensities non-integral with Beaufort. Forces on the T-Scale, where T stands for Tornado force, span the range 0 to 10, which is highly useful worldwide. The shorter range of Fujita's Scale (0 to 5) is acceptable for American use but less convenient elsewhere. To illustrate the simplicity of the decimal T-Scale, the mean hurricane wind speed of Beaufort 12 is T2 on the T-Scale but F1.121 on the F-Scale; while a tornado wind speed of T9 (= B26) becomes F4.761. However, the three wind scales can be unified by either making F-Scale numbers exactly half the magnitude of T-Scale numbers [i.e. F'half = T / 2 = (B / 4) - 2] or by doubling the numbers of this revised version to give integral equivalence with the T-Scale. The result is a decimal formula F'double = T = (B / 2) - 4 named the TF-Scale, where TF stands for Tornado Force. This harmonious 10-digit scale has all the criteria needed for world-wide practical effectiveness.
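
    The Beaufort-to-TF conversion above is simple arithmetic, sketched here as a pair of hypothetical helper functions (the formula T = (B / 2) - 4 is quoted from the abstract; the function names are illustrative only):

    ```python
    # Sketch of the TF-Scale conversion described in the abstract.
    # T = (B / 2) - 4, where B is the Beaufort number.

    def beaufort_to_tf(b: float) -> float:
        """Convert a Beaufort number B to the decimal TF-Scale."""
        return b / 2 - 4

    def tf_to_beaufort(t: float) -> float:
        """Inverse conversion: B = 2 * (T + 4)."""
        return 2 * (t + 4)

    # Checks against the worked examples quoted in the abstract:
    assert beaufort_to_tf(12) == 2   # mean hurricane wind speed: B12 -> T2
    assert beaufort_to_tf(26) == 9   # tornado wind speed: T9 = B26
    ```
    
    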

  4. Nestedness across biological scales.

    PubMed

    Cantor, Mauricio; Pires, Mathias M; Marquitti, Flavia M D; Raimundo, Rafael L G; Sebastián-González, Esther; Coltri, Patricia P; Perez, S Ivan; Barneche, Diego R; Brandt, Débora Y C; Nunes, Kelly; Daura-Jorge, Fábio G; Floeter, Sergio R; Guimarães, Paulo R

    2017-01-01

    Biological networks pervade nature. They describe systems throughout all levels of biological organization, from molecules regulating metabolism to species interactions that shape ecosystem dynamics. Network thinking has revealed recurrent organizational patterns in complex biological systems, such as the formation of semi-independent groups of connected elements (modularity) and non-random distributions of interactions among elements. Other structural patterns, such as nestedness, have been primarily assessed in ecological networks formed by two non-overlapping sets of elements; information on its occurrence on other levels of organization is lacking. Nestedness occurs when interactions of less connected elements form proper subsets of the interactions of more connected elements. Only recently have these properties begun to be appreciated in one-mode networks (where all elements can interact), which describe a much wider variety of biological phenomena. Here, we compute nestedness in a diverse collection of one-mode networked systems from six different levels of biological organization depicting gene and protein interactions, complex phenotypes, animal societies, metapopulations, food webs and vertebrate metacommunities. Our findings suggest that nestedness emerges independently of interaction type or biological scale and reveal that disparate systems can share nested organization features characterized by inclusive subsets of interacting elements with decreasing connectedness. We primarily explore the implications of a nested structure for each of these studied systems, then theorize on how nested networks are assembled. We hypothesize that nestedness emerges across scales due to processes that, although system-dependent, may share a general compromise between two features: specificity (the number of interactions the elements of the system can have) and affinity (how these elements can be connected to each other). Our findings suggesting occurrence of nestedness

  6. Reconsidering Fault Slip Scaling

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Wech, A.; Creager, K. C.; Obara, K.; Agnew, D. C.

    2015-12-01

    The scaling of fault slip events given by the relationship between the scalar moment M0, and duration T, potentially provides key constraints on the underlying physics controlling slip. Many studies have suggested that measurements of M0 and T are related as M0=KfT3 for 'fast' slip events (earthquakes) and M0=KsT for 'slow' slip events, in which Kf and Ks are proportionality constants, although some studies have inferred intermediate relations. Here 'slow' and 'fast' refer to slip front propagation velocities, either so slow that seismic radiation is too small or long period to be measurable or fast enough that dynamic processes may be important for the slip process and measurable seismic waves radiate. Numerous models have been proposed to explain the differing M0-T scaling relations. We show that a single, simple dislocation model of slip events within a bounded slip zone may explain nearly all M0-T observations. Rather than different scaling for fast and slow populations, we suggest that within each population the scaling changes from M0 proportional to T3 to T when the slipping area reaches the slip zone boundaries and transitions from unbounded, 2-dimensional to bounded, 1-dimensional growth. This transition has not been apparent previously for slow events because data have sampled only the bounded regime and may be obscured for earthquakes when observations from multiple tectonic regions are combined. We have attempted to sample the expected transition between bounded and unbounded regimes for the slow slip population, measuring tremor cluster parameters from catalogs for Japan and Cascadia and using them as proxies for small slow slip event characteristics. For fast events we employed published earthquake slip models. Observations corroborate our hypothesis, but highlight observational difficulties. 
We find that M0-T observations for both slow and fast slip events, spanning 12 orders of magnitude in M0, are consistent with a single model based on dislocation
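
    The two regimes above differ only in the exponent n of M0 = K T^n (n = 3 for fast events, n = 1 for slow events). As a purely illustrative sketch (not code from the study), the exponent can be recovered as the slope of log M0 against log T; the function name and the synthetic event lists are hypothetical:

    ```python
    # Illustration: estimating the exponent n in M0 = K * T**n
    # from (duration, moment) pairs via a log-log least-squares slope.
    import math

    def loglog_slope(durations, moments):
        """Least-squares slope of log(M0) versus log(T)."""
        xs = [math.log(t) for t in durations]
        ys = [math.log(m) for m in moments]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    # Synthetic 'fast' events obeying M0 = Kf * T**3:
    fast = [(t, 1e16 * t**3) for t in (1, 2, 5, 10)]
    # Synthetic 'slow' events obeying M0 = Ks * T:
    slow = [(t, 1e12 * t) for t in (1e4, 1e5, 1e6)]

    print(round(loglog_slope(*zip(*fast)), 3))   # exponent ~3 (fast scaling)
    print(round(loglog_slope(*zip(*slow)), 3))   # exponent ~1 (slow scaling)
    ```

    A change in the fitted slope across the population, rather than two fixed slopes, is what the bounded/unbounded growth hypothesis above would predict.
    
    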

  7. Soil organic carbon across scales.

    PubMed

    O'Rourke, Sharon M; Angers, Denis A; Holden, Nicholas M; McBratney, Alex B

    2015-10-01

    Mechanistic understanding of scale effects is important for interpreting the processes that control the global carbon cycle. Greater attention should be given to scale in soil organic carbon (SOC) science so that we can devise better policy to protect/enhance existing SOC stocks and ensure sustainable use of soils. Global issues such as climate change require consideration of SOC stock changes at the global and biosphere scale, but human interaction occurs at the landscape scale, with consequences at the pedon, aggregate and particle scales. This review evaluates our understanding of SOC across all these scales in the context of the processes involved in SOC cycling at each scale and with emphasis on stabilizing SOC. Current synergy between science and policy is explored at each scale to determine how well each is represented in the management of SOC. An outline of how SOC might be integrated into a framework of soil security is examined. We conclude that SOC processes at the biosphere to biome scales are not well understood. Instead, SOC has come to be viewed as a large-scale pool subject to carbon flux. Better understanding exists for SOC processes operating at the scales of the pedon, aggregate and particle. At the landscape scale, the influence of large- and small-scale processes has the greatest interaction and is exposed to the greatest modification through agricultural management. Policy implemented at regional or national scale tends to focus at the landscape scale without due consideration of the larger scale factors controlling SOC or the impacts of policy for SOC at the smaller SOC scales. What is required is a framework that can be integrated across a continuum of scales to optimize SOC management.

  8. Small scale sanitation technologies.

    PubMed

    Green, W; Ho, G

    2005-01-01

    Small scale systems can improve the sustainability of sanitation systems as they more easily close the water and nutrient loops. They also provide alternate solutions to centrally managed large scale infrastructures. Appropriate sanitation provision can improve the lives of people with inadequate sanitation through health benefits and reuse products, as well as reduce ecological impacts. In the literature there seems to be no compilation of a wide range of available onsite sanitation systems around the world that encompasses black and greywater treatment plus stand-alone dry and urine separation toilet systems. Seventy technologies have been identified and classified according to the different waste source streams. Sub-classification based on major treatment methods included aerobic digestion, composting and vermicomposting, anaerobic digestion, sand/soil/peat filtration and constructed wetlands. Potential users or suppliers of sanitation systems can choose from the wide range of technologies available and examine the different treatment principles used in the technologies. Sanitation systems need to be selected according to the local social, economic and environmental conditions and should aim to be sustainable.

  9. Spectral multidimensional scaling

    PubMed Central

    Aflalo, Yonathan; Kimmel, Ron

    2013-01-01

    An important tool in information analysis is dimensionality reduction. There are various approaches for large data simplification by scaling its dimensions down that play a significant role in recognition and classification tasks. The efficiency of dimension reduction tools is measured in terms of memory and computational complexity, which are usually a function of the number of the given data points. Sparse local operators that involve substantially less than quadratic complexity at one end, and faithful multiscale models with quadratic cost at the other end, make the design of a dimension reduction procedure a delicate balance between modeling accuracy and efficiency. Here, we combine the benefits of both and propose a low-dimensional multiscale modeling of the data, at a modest computational cost. The idea is to project the classical multidimensional scaling problem into the data spectral domain extracted from its Laplace–Beltrami operator. There, embedding into a low-dimensional Euclidean space is accomplished while optimizing for a small number of coefficients. We provide theoretical support and demonstrate that by working in the natural eigenspace of the data, one can reduce the process complexity while maintaining the model fidelity. As examples, we efficiently canonize nonrigid shapes by embedding their intrinsic metric into , a method often used for matching and classifying almost isometric articulated objects. Finally, we demonstrate the method by exposing the style in which handwritten digits appear in a large collection of images. We also visualize clustering of digits by treating images as feature points that we map to a plane. PMID:24108352
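
    For orientation, the "classical multidimensional scaling problem" that the paper accelerates can be sketched in a few lines: double-center the squared distance matrix and embed along its top eigenvectors. This is plain classical MDS, not the spectral projection the paper proposes, and the function name is illustrative:

    ```python
    # Minimal classical MDS sketch: recover coordinates from pairwise distances.
    import numpy as np

    def classical_mds(D, k):
        """Embed points into R^k given a symmetric matrix D of pairwise distances."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        vals, vecs = np.linalg.eigh(B)
        order = np.argsort(vals)[::-1][:k]       # largest eigenvalues first
        return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

    # Three collinear points at 0, 1, 3: a 1-D embedding recovers the gaps.
    D = np.array([[0., 1., 3.],
                  [1., 0., 2.],
                  [3., 2., 0.]])
    X = classical_mds(D, 1)
    recovered = np.abs(X[:, 0][:, None] - X[:, 0][None, :])
    assert np.allclose(recovered, D)
    ```

    The eigendecomposition here costs O(n^3) in the number of data points; the spectral approach in the paper exists precisely to avoid paying that full cost.
    
    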

  10. Scaling the Kondo lattice.

    PubMed

    Yang, Yi-feng; Fisk, Zachary; Lee, Han-Oh; Thompson, J D; Pines, David

    2008-07-31

    The origin of magnetic order in metals has two extremes: an instability in a liquid of local magnetic moments interacting through conduction electrons, and a spin-density wave instability in a Fermi liquid of itinerant electrons. This dichotomy between 'local-moment' magnetism and 'itinerant-electron' magnetism is reminiscent of the valence bond/molecular orbital dichotomy present in studies of chemical bonding. The class of heavy-electron intermetallic compounds of cerium, ytterbium and various 5f elements bridges the extremes, with itinerant-electron magnetic characteristics at low temperatures that grow out of a high-temperature local-moment state. Describing this transition quantitatively has proved difficult, and one of the main unsolved problems is finding what determines the temperature scale for the evolution of this behaviour. Here we present a simple, semi-quantitative solution to this problem that provides a basic framework for interpreting the physics of heavy-electron materials and offers the prospect of a quantitative determination of the physical origin of their magnetic ordering and superconductivity. It also reveals the difference between the temperature scales that distinguish the conduction electrons' response to a single magnetic impurity and their response to a lattice of local moments, and provides an updated version of the well-known Doniach diagram.

  11. Scaling in Transportation Networks

    PubMed Central

    Louf, Rémi; Roth, Camille; Barthelemy, Marc

    2014-01-01

    Subway systems span most large cities, and railway networks most countries in the world. These networks are fundamental in the development of countries and their cities, and it is therefore crucial to understand their formation and evolution. However, if the topological properties of these networks are fairly well understood, how they relate to population and socio-economical properties remains an open question. We propose here a general coarse-grained approach, based on a cost-benefit analysis that accounts for the scaling properties of the main quantities characterizing these systems (the number of stations, the total length, and the ridership) with the substrate's population, area and wealth. More precisely, we show that the length, number of stations and ridership of subways and rail networks can be estimated knowing the area, population and wealth of the underlying region. These predictions are in good agreement with data gathered for about subway systems and more than railway networks in the world. We also show that train networks and subway systems can be described within the same framework, but with a fundamental difference: while the interstation distance seems to be constant and determined by the typical walking distance for subways, the interstation distance for railways scales with the number of stations. PMID:25029528

  12. Scale in Education Research: Towards a Multi-Scale Methodology

    ERIC Educational Resources Information Center

    Noyes, Andrew

    2013-01-01

    This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…

  13. A Validity Scale for the Sharp Consumer Satisfaction Scales.

    ERIC Educational Resources Information Center

    Tanner, Barry A.; Stacy, Webb, Jr.

    1985-01-01

    A validity scale for the Sharp Consumer Satisfaction Scale was developed and used in experiments to assess patients' satisfaction with community mental health centers. The scale discriminated between clients who offered suggestions and those who did not. It also improved researcher's ability to predict true scores from obtained scores. (DWH)

  14. Returns to Scale and Economies of Scale: Further Observations.

    ERIC Educational Resources Information Center

    Gelles, Gregory M.; Mitchell, Douglas W.

    1996-01-01

    Maintains that most economics textbooks continue to repeat past mistakes concerning returns to scale and economies of scale under assumptions of constant and nonconstant input prices. Provides an adaptation for a calculus-based intermediate microeconomics class that demonstrates the pointwise relationship between returns to scale and economies of…

  15. Validity and Reliability of the Seizure Self-Efficacy Scale for Children with Epilepsy

    PubMed Central

    TUTAR GÜVEN, Şerife; İŞLER, Ayşegül

    2015-01-01

    Introduction This study aims to adapt the Seizure Self-Efficacy Scale for Children (SSES-C) into Turkish and then assess its validity and reliability in children with epilepsy. Methods The study sample consisted of 166 children (aged 9–17 years) with epilepsy who attended Akdeniz University Hospital, Antalya Training and Research Hospital, and Bursa Dortcelik Children’s Hospital Pediatric Neurology Clinics between July 2012 and March 2013. All research data were collected by a researcher in face-to-face interviews using the Child Information Form, the Seizure Self-Efficacy Scale for Children, and the Children’s Depression Scale. The Seizure Self-Efficacy Scale for Children is a 15-item, 5-point Likert scale designed by Caplin et al. (2002). Results The linguistic adaptation and validation of the scale was conducted by seven experts. To evaluate the content validity of the scale, we elicited judgments from a panel of 10 content experts. The expert judgments showed that the correlation between the items on the scale was fairly good (Kendall’s W=0.411, p<0.001, chi-square=57.495). Factor analysis, using a factor-loading cutoff of 0.40, yielded two factors accounting for 49.67% of the total variance. We calculated Cronbach’s alpha coefficient for the internal consistency, and the full-scale score showed good internal consistency (alpha 0.89). Within the reliability studies, correlations varying between 0.74 and 0.98 were found for the two sub-factors of the scale. Test/retest correlation coefficients were significant (p<0.01) and high (r=0.99). In parallel forms reliability, the correlations between the Seizure Self-Efficacy Scale for Children and the Children’s Depression Rating Scale were found to be negative, moderate, and statistically significant (r=−0.58, p<0.001). Conclusion The measurements conducted on the Turkish version of the Seizure Self-Efficacy Scale for Children showed that it is consistent with the original scale

  16. CONSTRUCTION AND VALIDATION OF A MEANING IN LIFE SCALE IN THE TAIWANESE CULTURAL CONTEXT.

    PubMed

    Wang, Ya-Huei; Liao, Hung-Chang

    2015-10-01

    The objective was to construct and validate a Chinese-language Meaning in Life Scale (MiLS) and to assess its psychometric properties. The three most popular scales have some weaknesses and are grounded in a Western cultural context. Consequently, a comprehensive and psychometrically adequate meaning in life scale is needed for use in Asian samples. 500 randomly selected participants from the Taiwanese public provided 476 valid responses to a written questionnaire. The participants' ages ranged from 18 to 63 years (M age = 42.3 yr.; 181 men, 295 women). Exploratory factor analysis reduced the initial 41 items to 33 items, based on a 5-point rating scale. Five factors were extracted: Contented with life (10 items; 33.20% of total variance), Goals in life (5 items; 6.95%), Enthusiasm and commitment (7 items; 6.28%), Understanding (6 items; 5.41%), and Sense or meaning to human existence (5 items; 4.57%). The MiLS showed satisfactory internal consistency and test-retest reliability, and concurrent validity. Therefore, the MiLS was found to be a valid and reliable instrument to measure the subjective sense of a meaning in life in the Taiwanese cultural context.

  17. Global scale precipitation from monthly to centennial scales: empirical space-time scaling analysis, anthropogenic effects

    NASA Astrophysics Data System (ADS)

    de Lima, Isabel; Lovejoy, Shaun

    2016-04-01

    The characterization of precipitation scaling regimes represents a key contribution to the improved understanding of space-time precipitation variability, which is the focus here. We conduct space-time scaling analyses of spectra and Haar fluctuations in precipitation, using three global scale precipitation products (one instrument based, one reanalysis based, one satellite and gauge based), from monthly to centennial scales and from planetary down to several hundred kilometers in spatial scale. Results show the presence - similarly to other atmospheric fields - of an intermediate "macroweather" regime between the familiar weather and climate regimes: we systematically characterize the temporal, spatial, and joint space-time statistics and variability of macroweather precipitation, and the outer scale limit of temporal scaling. These regimes qualitatively and quantitatively alternate in the way fluctuations vary with scale. In the macroweather regime, fluctuations diminish with time scale (this is important for seasonal, annual, and decadal forecasts) while anthropogenic effects increase with time scale. Our approach determines the time scale at which the anthropogenic signal can be detected above the natural variability noise: the critical scale is about 20-40 yr (depending on the product and on the spatial scale). This explains, for example, why studies that use data covering only a few decades do not easily give evidence of anthropogenic changes in precipitation as a consequence of warming: the period is too short. Overall, while showing that precipitation can be modeled with space-time scaling processes, our results clarify the different precipitation scaling regimes and further allow us to quantify the agreement (and lack of agreement) of the precipitation products as a function of space and time scales. Moreover, this work contributes to clarify a basic problem in hydro-climatology, which is to measure precipitation trends at decadal and longer scales and to

  18. Scaling laws in cognitive sciences.

    PubMed

    Kello, Christopher T; Brown, Gordon D A; Ferrer-I-Cancho, Ramon; Holden, John G; Linkenkaer-Hansen, Klaus; Rhodes, Theo; Van Orden, Guy C

    2010-05-01

    Scaling laws are ubiquitous in nature, and they pervade neural, behavioral and linguistic activities. A scaling law suggests the existence of processes or patterns that are repeated across scales of analysis. Although the variables that express a scaling law can vary from one type of activity to the next, the recurrence of scaling laws across so many different systems has prompted a search for unifying principles. In biological systems, scaling laws can reflect adaptive processes of various types and are often linked to complex systems poised near critical points. The same is true for perception, memory, language and other cognitive phenomena. Findings of scaling laws in cognitive science are indicative of scaling invariance in cognitive mechanisms and multiplicative interactions among interdependent components of cognition.

  19. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. 
Useful alerts should
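
    The dual-criterion alert levels quoted in the abstract reduce to two simple threshold lookups, sketched below with the fatality thresholds (1 / 100 / 1,000) and loss thresholds ($1M / $100M / $1B) taken from the text; the function names are hypothetical:

    ```python
    # Sketch of the EIS dual alert criteria described in the abstract.

    def fatality_alert(fatalities: int) -> str:
        """Alert color from estimated fatalities (thresholds 1 / 100 / 1,000)."""
        if fatalities >= 1000:
            return "red"
        if fatalities >= 100:
            return "orange"
        if fatalities >= 1:
            return "yellow"
        return "green"

    def loss_alert(dollars: float) -> str:
        """Alert color from estimated economic losses ($1M / $100M / $1B)."""
        if dollars >= 1e9:
            return "red"
        if dollars >= 1e8:
            return "orange"
        if dollars >= 1e6:
            return "yellow"
        return "green"

    assert fatality_alert(0) == "green"
    assert fatality_alert(150) == "orange"
    assert loss_alert(2e9) == "red"
    ```

    In practice the two criteria can disagree for the same event, which is exactly the rationale the abstract gives for maintaining both.
    
    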

  20. Westside Test Anxiety Scale Validation

    ERIC Educational Resources Information Center

    Driscoll, Richard

    2007-01-01

    The Westside Test Anxiety Scale is a brief, ten item instrument designed to identify students with anxiety impairments who could benefit from an anxiety-reduction intervention. The scale items cover self-assessed anxiety impairment and cognitions which can impair performance. Correlations between anxiety-reduction as measured by the scale and…

  1. The Rapid Induction Susceptibility Scale.

    ERIC Educational Resources Information Center

    Page, Roger A.; Handley, George W.

    1989-01-01

    Developed Rapid Induction Susceptibility Scale using Chiasson induction to produce hypnotic susceptibility scale which is quickly administered and yields scores comparable to the Stanford Hypnotic Susceptibility Scale, Form C (SHSS:C). Found that validation study with college students (N=100) produced a correlation of .88 with the SHSS:C and…

  2. Micromechanical silicon precision scale

    NASA Astrophysics Data System (ADS)

    Oja, Aarne S.; Sillanpaa, Teuvo; Seppae, H.; Kiihamaki, Jyrki; Seppala, P.; Karttunen, Jani; Riski, Kari

    2000-04-01

    A micromachined capacitive silicon scale has been designed and fabricated. It is intended for weighing masses on the order of 1 g at a resolution of about 1 ppm and below. The device consists of a micromachined SOI chip which is anodically bonded to a glass chip. The flexible electrode is formed in the SOI device layer. The other electrode is metallized on the glass and is divided into three sections. The sections are used for detecting tilting of the top electrode due to a possible off-centering of the mass load. The measuring circuit implements electrostatic force feedback and keeps the top electrode at a constant horizontal position irrespective of its mass loading. First measurements have demonstrated stability allowing measurement of 1 g masses at an accuracy of 2-3 ppm.

  3. Indian scales and inventories

    PubMed Central

    Venkatesan, S.

    2010-01-01

    This conceptual, perspective and review paper on Indian scales and inventories begins with clarification on the historical and contemporary meanings of psychometry before linking itself to the burgeoning field of clinimetrics in their applications to the practice of clinical psychology and psychiatry. Clinimetrics is explained as a changing paradigm in the design, administration, and interpretation of quantitative tests, techniques or procedures applied to measurement of clinical variables, traits and processes. As an illustrative sample, this article assembles a bibliographic survey of about 105 out of 2582 research papers (4.07%) scanned through 51 back dated volumes covering 185 issues related to clinimetry as reviewed across a span of over fifty years (1958-2009) in the Indian Journal of Psychiatry. A content analysis of the contributions across distinct categories of mental measurements is explained before linkages are proposed for future directions along these lines. PMID:21836709

  4. Galactic-scale civilization

    NASA Technical Reports Server (NTRS)

    Kuiper, T. B. H.

    1980-01-01

    Evolutionary arguments are presented in favor of the existence of civilization on a galactic scale. Patterns of physical, chemical, biological, social and cultural evolution leading to increasing levels of complexity are pointed out and explained thermodynamically in terms of the maximization of free energy dissipation in the environment of the organized system. The possibility of the evolution of a global and then a galactic human civilization is considered, and probabilities that the galaxy is presently in its colonization state and that life could have evolved to its present state on earth are discussed. Fermi's paradox of the absence of extraterrestrials in light of the probability of their existence is noted, and a variety of possible explanations is indicated. Finally, it is argued that although mankind may be the first occurrence of intelligence in the galaxy, it is unjustified to presume that this is so.

  5. The Children's Loneliness Scale.

    PubMed

    Maes, Marlies; Van den Noortgate, Wim; Vanhalst, Janne; Beyers, Wim; Goossens, Luc

    2017-03-01

    The present study examined the factor structure and construct validity of the Children's Loneliness Scale (CLS), a popular measure of childhood loneliness, in Belgian children. Analyses were conducted on two samples of fifth and sixth graders in Belgium, for a total of 1,069 children. A single-factor structure proved superior to alternative solutions proposed in the literature, when taking item wording into account. Construct validity was shown by substantial associations with related constructs, based on both self-reported (e.g., depressive symptoms and low social self-esteem), and peer-reported variables (e.g., victimization). Furthermore, a significant association was found between the CLS and a peer-reported measure of loneliness. Collectively, these findings provide a solid foundation for the continuing use of the CLS as a measure of childhood loneliness.

  6. In Brief: Scale model

    NASA Astrophysics Data System (ADS)

    Here's one for Guinness or maybe Ripley: The world's largest scale model of the solar system begins at a museum in Peoria, Ill., and extends geographically as far away as Ecuador and the South Pole. In the model, which was developed by the museum's deputy director Sheldon Schafer, 42 feet equal about 1 million miles. The Sun, which is 36 feet wide, is painted on the dome of the Lakeview Museum's planetarium in Peoria. Mercury, which is 1.5 inches across, can be found at a nearby store; Venus sits in a local bank lobby; Earth is lodged at a gas station; and Mars at a radio station. "The idea is that people will encounter a little bit of astronomy in the walks of their daily lives," Schafer says.
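
    The conversion behind the model is simple linear scaling. A minimal sketch, using the article's stated scale of 42 feet per million miles; the planetary distances below are round-number astronomical approximations, not figures from the article:

```python
# Scale used by the Peoria model described above: 42 feet per 1,000,000 miles.
SCALE_FT_PER_MMI = 42.0  # feet of model per million miles of space

# Approximate mean distances from the Sun, in millions of miles
# (standard round-number values, not taken from the article).
distances_mmi = {
    "Mercury": 36.0,
    "Venus": 67.0,
    "Earth": 93.0,
    "Mars": 142.0,
    "Pluto": 3670.0,
}

def model_distance_ft(millions_of_miles):
    """Convert a real distance (in millions of miles) to model feet."""
    return millions_of_miles * SCALE_FT_PER_MMI

for body, d in distances_mmi.items():
    ft = model_distance_ft(d)
    print(f"{body:8s} {ft:10,.0f} ft  ({ft / 5280:6.1f} miles)")
```

    At this scale Earth sits roughly three-quarters of a mile from the planetarium dome, which is why the outer reaches of the model extend so far beyond Peoria.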

  8. Extreme Scale Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Shoemaker, Deirdre

    2009-11-01

    We live in extraordinary times. With increasingly sophisticated observatories opening up new vistas on the universe, astrophysics is becoming more complex and data-driven. Success in understanding astrophysical systems that are inherently multi-physical and nonlinear demands realism in our models of the phenomena. We cannot hope to advance the realism of these models to match the expected sophistication of future observations without extreme-scale computation. Just one example is the advent of gravitational wave astronomy. Detectors like LIGO are about to make the first ever detection of gravitational waves. The gravitational waves are produced during violent events such as the merger of two black holes. The detection of these waves or ripples in the fabric of spacetime is a formidable undertaking, requiring innovative engineering, powerful data analysis tools and careful theoretical modeling. I will discuss the computational and theoretical challenges ahead in our new understanding of physics and astronomy where gravity exhibits its strongest grip on our spacetime.

  9. Full Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Construction of Full Scale Tunnel (FST). In November 1929, Smith DeFrance submitted his recommendations for the general design of the Full Scale Wind Tunnel. The last on his list concerned the division of labor required to build this unusual facility. He believed the job had five parts and described them as follows: 'It is proposed that invitations be sent out for bids on five groups of items. The first would be for one contract on the complete structure; second the same as first, including the erection of the cones but not the fabrication, since this would be more of a shipyard job; third would cover structural steel, cover, sash and doors, but not cones or foundation; fourth, foundations; and fifth, fabrication of cones.' DeFrance's memorandum prompted the NACA to solicit estimates from a large number of companies. Preliminary designs and estimates were prepared and submitted to the Bureau of the Budget and Congress appropriated funds on February 20, 1929. The main construction contract with the J.A. Jones Company of Charlotte, North Carolina was signed one year later on February 12, 1930. It was a peculiar structure as the building's steel framework is visible on the outside of the building. DeFrance described this in NACA TR No. 459: 'The entire equipment is housed in a structure, the outside walls of which serve as the outer walls of the return passages. The over-all length of the tunnel is 434 feet 6 inches, the width 222 feet, and the maximum height 97 feet. The framework is of structural steel....' (pp. 292-293)

  11. Scaling up: Assessing social impacts at the macro-scale

    SciTech Connect

    Schirmer, Jacki

    2011-04-15

    Social impacts occur at various scales, from the micro-scale of the individual to the macro-scale of the community. Identifying the macro-scale social changes that result from an impacting event is a common goal of social impact assessment (SIA), but is challenging as multiple factors simultaneously influence social trends at any given time, and there are usually only a small number of cases available for examination. While some methods have been proposed for establishing the contribution of an impacting event to macro-scale social change, they remain relatively untested. This paper critically reviews methods recommended to assess macro-scale social impacts, and proposes and demonstrates a new approach. The 'scaling up' method involves developing a chain of logic linking change at the individual/site scale to the community scale. It enables a more problematised assessment of the likely contribution of an impacting event to macro-scale social change than previous approaches. The use of this approach in a recent study of change in dairy farming in south east Australia is described.

  12. Solar system to scale

    NASA Astrophysics Data System (ADS)

    Gerwig López, Susanne

    2016-04-01

    One of the most important successes in astronomical observations has been to determine the limits of the Solar System. The first man able to measure the Earth-Sun distance with only a slight error, in the second century BC, is said to have been the Greek astronomer Aristarchus of Samos. Thanks to Newton's law of universal gravitation, it became possible to measure, with a small margin of error, the distances between the Sun and the planets. Twelve-year-old students are very interested in everything related to the universe. However, it seems too difficult for them to imagine and understand the real distances among the different celestial bodies. To teach the differences between the inner and outer planets and how far away the outer ones are, I have my pupils work on the sizes and distances in our solar system by constructing it to scale. The purpose is to reproduce our solar system to scale on a cardboard. The procedure is very easy and simple. Students in the first year of ESO (12 years old) receive the instructions on a sheet of paper (things they need: a black cardboard, a pair of scissors, colored pencils, a ruler, adhesive tape, glue, the photocopies of the planets and satellites, the measurements they have to use). On another photocopy they get pictures of the edge of the Sun, the planets, dwarf planets and some satellites, which they have to color, cut and stick on the cardboard. This activity is planned for both Spanish and bilingual learning students as a science project. Depending on the group, they receive these instructions in Spanish or in English. When the time is over, the students bring their work on the cardboard to class. They obtain a final mark: passing, good or excellent, depending on the accuracy of the measurements, the position of all the celestial bodies, the asteroid belts, personal contributions, etc. If any of the students has not followed the instructions, they get the chance to redo it properly, in order not

  13. Tipping the scales.

    PubMed

    1998-12-01

    In the US, the October 1998 murder of a physician who performed abortions was an outward manifestation of the insidious battle against legal abortion being waged by radical Christian social conservatives seeking to transform the US democracy into a theocracy. This movement has been documented in a publication entitled, "Tipping the Scales: The Christian Right's Legal Crusade Against Choice" produced as a result of a 4-year investigation conducted by The Center for Reproductive Law and Policy. This publication describes how these fundamentalists have used sophisticated legal, lobbying, and communication strategies to further their goals of challenging the separation of church and state, opposing family planning and sexuality education that is not based solely on abstinence, promoting school prayer, and restricting homosexual rights. The movement has resulted in the introduction of more than 300 anti-abortion bills in states, 50 of which have passed in 23 states. Most Christian fundamentalist groups provide free legal representation to abortion clinic terrorists, and some groups solicit women to bring specious malpractice claims against providers. Sophisticated legal tactics are used by these groups to remove the taint of extremism and mask the danger posed to US constitutional principles being posed by "a well-financed and zealous brand of radical lawyers and their supporters."

  14. Turbulent scaling in fluids

    SciTech Connect

    Ecke, R.; Li, Ning; Chen, Shiyi; Liu, Yuanming

    1996-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project was a study of turbulence in fluids that are subject to different body forces and to external temperature gradients. Our focus was on the recent theoretical prediction that the Kolomogorov picture of turbulence may need to be modified for turbulent flows driven by buoyancy and subject to body forces such as rotational accelerations. Models arising from this research are important in global climate modeling, in turbulent transport problems, and in the fundamental understanding of fluid turbulence. Experimentally, we use (1) precision measurements of heat transport and local temperature; (2) flow visualization using digitally- enhanced optical shadowgraphs, particle-image velocimetry, thermochromic liquid-crystal imaging, laser-doppler velocimetry, and photochromic dye imaging; and (3) advanced image- processing techniques. Our numerical simulations employ standard spectral and novel lattice Boltzmann algorithms implemented on parallel Connection Machine computers to simulate turbulent fluid flow. In laboratory experiments on incompressible fluids, we measure probability distribution functions and two-point spatial correlations of temperature T and velocity V (both T-T and V-T correlations) and determine scaling relations for global heat transport with Rayleigh number. We also explore the mechanism for turbulence in thermal convection and the stability of the thermal boundary layer.
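
    The heat-transport scaling mentioned above is conventionally expressed through the Rayleigh and Nusselt numbers. A hedged sketch: the first function computes Ra from its textbook definition, and the Nu ~ 0.2 Ra^(2/7) power law is a classical hard-turbulence estimate used purely for illustration; neither the prefactor nor the fluid parameters below are results of this LDRD project.

```python
# Illustrative only: compute a Rayleigh number for a convection cell and
# estimate heat transport with a classical hard-turbulence scaling,
# Nu ~ 0.2 * Ra**(2/7). Prefactor, exponent, and fluid properties are
# textbook-style assumptions, not values from the report above.

def rayleigh(g, alpha, dT, L, nu, kappa):
    """Ra = g * alpha * dT * L**3 / (nu * kappa)."""
    return g * alpha * dT * L**3 / (nu * kappa)

def nusselt_estimate(Ra, prefactor=0.2, exponent=2.0 / 7.0):
    """Dimensionless heat transport under an assumed power-law scaling."""
    return prefactor * Ra**exponent

# Water-like fluid in a 10 cm cell with a 1 K temperature difference.
Ra = rayleigh(g=9.81, alpha=2.1e-4, dT=1.0, L=0.10,
              nu=1.0e-6, kappa=1.4e-7)
print(f"Ra = {Ra:.3g}, Nu ~ {nusselt_estimate(Ra):.1f}")
```

    Even this modest laboratory cell is already well into the turbulent regime (Ra of order 10^7), which is why precision heat-transport measurements of the kind described above are informative tests of scaling theories.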

  15. Lightning Scaling Laws Revisited

    NASA Technical Reports Server (NTRS)

    Boccippio, D. J.; Arnold, James E. (Technical Monitor)

    2000-01-01

    Scaling laws relating storm electrical generator power (and hence lightning flash rate) to charge transport velocity and storm geometry were originally posed by Vonnegut (1963). These laws were later simplified to yield simple parameterizations for lightning based upon cloud top height, with separate parameterizations derived over land and ocean. It is demonstrated that the most recent ocean parameterization: (1) yields predictions of storm updraft velocity which appear inconsistent with observation, and (2) is formally inconsistent with the theory from which it purports to derive. Revised formulations consistent with Vonnegut's original framework are presented. These demonstrate that Vonnegut's theory is, to first order, consistent with observation. The implications of assuming that flash rate is set by the electrical generator power, rather than the electrical generator current, are examined. The two approaches yield significantly different predictions about the dependence of charge transfer per flash on storm dimensions, which should be empirically testable. The two approaches also differ significantly in their explanation of regional variability in lightning observations.
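
    Cloud-top-height lightning parameterizations of the kind the abstract discusses typically take the power-law form F = a H^b with separate land and ocean constants. The constants below follow the widely cited Price and Rind (1992) fits (H in km, F in flashes per minute); they are quoted for illustration and are not derived from this report.

```python
# Cloud-top-height lightning parameterizations of the kind discussed above.
# Constants follow the Price and Rind (1992) fits: flash rate in flashes
# per minute, cloud-top height H in km. Quoted for illustration only.

def flash_rate_land(H_km):
    return 3.44e-5 * H_km**4.9

def flash_rate_ocean(H_km):
    return 6.4e-4 * H_km**1.73

for H in (8.0, 12.0, 16.0):
    print(f"H = {H:4.1f} km  land: {flash_rate_land(H):7.2f} /min  "
          f"ocean: {flash_rate_ocean(H):5.2f} /min")
```

    The steep land exponent (about 4.9) versus the shallow ocean exponent (about 1.73) is exactly the land/ocean contrast whose physical consistency the report re-examines.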

  16. UltraScale Computing

    NASA Astrophysics Data System (ADS)

    Maynard, Jr.

    1997-08-01

    The Defense Advanced Research Projects Agency Information Technology Office (DARPA/ITO) supports research in technology for defense-critical applications. Defense Applications are always insatiable consumers of computing. Futuristic applications such as automated image interpretation/whole vehicle radar-cross-section/real-time prototyping/faster-than-real-time simulation will require computing capabilities orders-of-magnitude beyond the best performance that can be projected from contemporary scalable parallel processors. To reach beyond the silicon digital paradigm, DARPA has initiated a program in UltraScale Computing to explore the domain of innovative computational models, methods, and mechanisms. The objective is to encourage a complete re-thinking of computing. Novel architectures, program synthesis, and execution environments are needed as well as alternative underlying physical mechanisms including molecular, biological, optical and quantum mechanical processes. Development of these advanced computing technologies will offer spectacular performance and cost improvements beyond the threshold of traditional materials and processes. The talk will focus on novel approaches for employing vastly more computational units than shrinking transistors will enable and exploration of the biological options for solving computationally difficult problems.

  17. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual person's behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed of much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches are dependent both on the specific questions and on the prospective user community. The approaches reach from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
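
    Vehicle-based microsimulations of the sort reviewed here are often built on simple cellular-automaton update rules. A minimal single-lane sketch in the style of the Nagel-Schreckenberg model; the ring-road setup and all parameter values are illustrative choices, not the implementations surveyed in the paper.

```python
import random

def nasch_step(positions, velocities, road_len, v_max=5, p_slow=0.3, rng=None):
    """One parallel update of a single-lane cellular-automaton traffic model
    (Nagel-Schreckenberg style): accelerate, brake to the gap ahead,
    random slowdown, then move. Positions are cells on a ring road."""
    rng = rng or random.Random()
    order = sorted(range(len(positions)), key=lambda i: positions[i])
    new_v = velocities[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % len(order)]
        gap = (positions[ahead] - positions[i] - 1) % road_len
        v = min(velocities[i] + 1, v_max)    # accelerate toward v_max
        v = min(v, gap)                      # brake to avoid collision
        if v > 0 and rng.random() < p_slow:  # random slowdown
            v -= 1
        new_v[i] = v
    new_pos = [(positions[i] + new_v[i]) % road_len
               for i in range(len(positions))]
    return new_pos, new_v

# 10 cars on a 100-cell ring road, evolved for a few update steps.
rng = random.Random(42)
pos = list(range(0, 100, 10))
vel = [0] * 10
for _ in range(20):
    pos, vel = nasch_step(pos, vel, road_len=100, rng=rng)
print("velocities after 20 steps:", vel)
```

    Each update touches every vehicle once, which is what makes the "million particle updates per second" figure above a meaningful unit of computational cost.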

  18. Scaling of structural failure

    SciTech Connect

    Bazant, Z.P.; Chen, Er-Ping

    1997-01-01

    This article attempts to review the progress achieved in the understanding of scaling and size effect in the failure of structures. Particular emphasis is placed on quasibrittle materials for which the size effect is complicated. Attention is focused on three main types of size effects, namely the statistical size effect due to randomness of strength, the energy release size effect, and the possible size effect due to fractality of fracture or microcracks. Definitive conclusions on the applicability of these theories are drawn. Subsequently, the article discusses the application of the known size effect law for the measurement of material fracture properties, and the modeling of the size effect by the cohesive crack model, nonlocal finite element models and discrete element models. Extensions to compression failure and to the rate-dependent material behavior are also outlined. The damage constitutive law needed for describing a microcracked material in the fracture process zone is discussed. Various applications to quasibrittle materials, including concrete, sea ice, fiber composites, rocks and ceramics are presented.
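
    The energy-release (deterministic) size effect reviewed above is commonly summarized by Bazant's size effect law, sigma_N = B f_t / sqrt(1 + D/D0), which interpolates between plastic strength theory for small structures and LEFM's D^(-1/2) decay for large ones. A sketch with placeholder constants; B, f_t, and D0 below are illustrative, not fitted material data.

```python
import math

# Bazant-type size effect law: nominal strength of a quasibrittle
# structure of characteristic size D (mm). B, f_t (MPa), and the
# transitional size D0 (mm) are placeholder values for illustration.

def nominal_strength(D, B=1.5, f_t=3.0, D0=100.0):
    """sigma_N = B * f_t / sqrt(1 + D / D0)."""
    return B * f_t / math.sqrt(1.0 + D / D0)

for D in (10.0, 100.0, 1000.0, 10000.0):
    print(f"D = {D:7.0f} mm  sigma_N = {nominal_strength(D):5.2f} MPa")
```

    For D much smaller than D0 the strength is nearly size-independent, while for D much larger it falls off as D^(-1/2), the LEFM asymptote; fitting measured strengths to this curve is the basis of the fracture-property measurement the article discusses.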

  19. Large scale tracking algorithms

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  20. SPACE BASED INTERCEPTOR SCALING

    SciTech Connect

    G. CANAVAN

    2001-02-01

    Space Based Interceptors (SBI) have ranges that are adequate to address rogue ICBMs. They are not overly sensitive to 30-60 s delay times. Current technologies would support boost phase intercept with about 150 interceptors. Higher acceleration and velocity could reduce that number by about a factor of 3 at the cost of heavier and more expensive Kinetic Kill Vehicles (KKVs). 6g SBI would reduce optimal constellation costs by about 35%; 8g SBI would reduce them another 20%. Interceptor ranges fall rapidly with theater missile range. Constellations increase significantly for ranges under 3,000 km, even with advanced interceptor technology. For distributed launches, these estimates recover earlier strategic scalings, which demonstrate the improved absentee ratio for larger or multiple launch areas. Constellations increase with the number of missiles and the number of interceptors launched at each. The economic estimates above suggest that two SBI per missile with a modest midcourse underlay is appropriate. The SBI KKV technology would appear to be common to space- and surface-based boost phase systems, and could have synergisms with improved midcourse intercept and discrimination systems. While advanced technology could be helpful in reducing costs, particularly for short range theater missiles, current technology appears adequate for pressing rogue ICBM, accidental, and unauthorized launches.

  1. Environmental complexity across scales: mechanism, scaling and the phenomenological fallacy

    NASA Astrophysics Data System (ADS)

    Lovejoy, Shaun

    2015-04-01

    Ever since Van Leeuwenhoek used a microscope to discover "new worlds in a drop of water" we have become used to the idea that "zooming in" - whether in space or in time - will reveal new processes, new phenomena. Yet in the natural environment - geosystems - this is often wrong. For example, in the temporal domain, a recent publication has shown that from hours to hundreds of millions of years the conventional scale-bound view of atmospheric variability was wrong by a factor of over a quadrillion (10^15). Mandelbrot challenged the "scale-bound" ideology and proposed that many natural systems - including many geosystems - were instead better treated as fractal systems in which the same basic mechanism acts over potentially huge ranges of scale. However, in its original form Mandelbrot's isotropic scaling (self-similar) idea turned out to be too naïve: geosystems are typically anisotropic, so that shapes and morphologies (e.g., of clouds or landmasses) are not the same at different resolutions. However, it turns out that the scaling idea often still applies on condition that the notion of scale is generalized appropriately (using the framework of Generalized Scale Invariance). The overall result is that unique processes, unique dynamical mechanisms, may act over huge ranges of scale even though the morphologies systematically change with scale. Therefore the common practice of inferring mechanism from shapes, forms, and morphologies is unjustified: the "phenomenological fallacy". We give examples of the phenomenological fallacy drawn from diverse areas of geoscience.

  2. Cryptic individual scaling relationships and the evolution of morphological scaling.

    PubMed

    Dreyer, Austin P; Saleh Ziabari, Omid; Swanson, Eli M; Chawla, Akshita; Frankino, W Anthony; Shingleton, Alexander W

    2016-08-01

    Morphological scaling relationships between organ and body size-also known as allometries-describe the shape of a species, and the evolution of such scaling relationships is central to the generation of morphological diversity. Despite extensive modeling and empirical tests, however, the modes of selection that generate changes in scaling remain largely unknown. Here, we mathematically model the evolution of the group-level scaling as an emergent property of individual-level variation in the developmental mechanisms that regulate trait and body size. We show that these mechanisms generate a "cryptic individual scaling relationship" unique to each genotype in a population, which determines body and trait size expressed by each individual, depending on developmental nutrition. We find that populations may have identical population-level allometries but very different underlying patterns of cryptic individual scaling relationships. Consequently, two populations with apparently the same morphological scaling relationship may respond very differently to the same form of selection. By focusing on the developmental mechanisms that regulate trait size and the patterns of cryptic individual scaling relationships they produce, our approach reveals the forms of selection that should be most effective in altering morphological scaling, and directs researcher attention on the actual, hitherto overlooked, targets of selection.
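
    Morphological scaling relationships of the kind modeled above are conventionally summarized as power laws, trait = a * body^b, estimated as a straight line on log-log axes. A minimal sketch fitting the scaling exponent on synthetic data; the generating values a = 0.5, b = 0.75 and the noise level are arbitrary illustrations, not values from the study.

```python
import math
import random

def fit_loglog_slope(body_sizes, trait_sizes):
    """Ordinary least-squares slope of log(trait) on log(body),
    i.e. the allometric scaling exponent b in trait = a * body**b."""
    xs = [math.log(x) for x in body_sizes]
    ys = [math.log(y) for y in trait_sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic population: trait = 0.5 * body**0.75 with lognormal noise.
rng = random.Random(0)
body = [rng.uniform(1.0, 100.0) for _ in range(200)]
trait = [0.5 * b**0.75 * math.exp(rng.gauss(0.0, 0.05)) for b in body]
print(f"estimated scaling exponent b = {fit_loglog_slope(body, trait):.3f}")
```

    The paper's point can be read against this sketch: two populations can yield the same fitted group-level slope while their individual genotype-by-nutrition reaction norms (the "cryptic individual scaling relationships") differ substantially.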

  3. The Italian version of the Physical Therapy Patient Satisfaction Questionnaire - [PTPSQ-I(15)]: psychometric properties in a sample of inpatients

    PubMed Central

    2014-01-01

    Background In a previous study we described the translation, cultural adaptation, and validation of the Italian version of the PTPSQ [PTPSQ-I(15)] in outpatients. To the authors' knowledge, the PTPSQ was never studied in a hospital setting. The aims of this study were: (1) to establish the psychometric properties of the Physical Therapy Patient Satisfaction Questionnaire [PTPSQ-I(15)] in a sample of Italian inpatients, and (2) to investigate the relationships between the characteristics of patients and physical therapists and the indicators of satisfaction. Methods The PTPSQ-I(15) was administered to inpatients in a Physical Medicine and Rehabilitation Unit. Reliability of the PTPSQ-I(15) was measured by internal consistency (Cronbach's α) and test-retest stability (ICC 3,1). The internal structure was investigated by factor analysis. Divergent validity was measured by comparing the PTPSQ-I(15) with a Visual Analogue Scale (VAS) for pain and with a 5-point Likert-type scale evaluating the Global Perceived Effect (GPE) of the physical therapy treatment. Results The PTPSQ-I(15) was administered to 148 inpatients, and 73 completed a second administration. The PTPSQ-I(15) showed high internal consistency (α = 0.949) and test-retest stability (ICC = 0.996). Divergent validity was moderate for the GPE (r = −0.502, P < 0.001) and strong for the VAS (r = −0.17, P = 0.07). Factor analysis showed a one-factor structure. Conclusions The administration of the PTPSQ-I(15) to inpatients demonstrated strong psychometric properties, and its use can be recommended with the Italian-speaking population. Further studies are suggested on the concurrent validity and on the psychometric properties of the PTPSQ-I(15) in different hospital settings or with other pathological conditions. PMID:24758356
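
    Cronbach's alpha, the internal-consistency statistic reported for the PTPSQ-I(15), can be computed directly from a respondents-by-items score matrix. A sketch on made-up 5-point Likert-type ratings; the data below are invented purely to exercise the formula and have no connection to the study's sample.

```python
# Cronbach's alpha from a respondents x items score matrix:
# alpha = k/(k-1) * (1 - sum of item variances / variance of totals).

def cronbach_alpha(scores):
    k = len(scores[0])   # number of items
    n = len(scores)      # number of respondents

    def variance(values):
        """Sample variance (n - 1 denominator)."""
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1.0 - sum(item_vars) / total_var)

# Four respondents rating three items on a 1-5 Likert-type scale
# (hypothetical toy data).
ratings = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [1, 2, 1],
]
print(f"alpha = {cronbach_alpha(ratings):.3f}")
```

    Values near the study's reported 0.949 indicate that the items move together closely enough to be summed into a single satisfaction score.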

  4. Full Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Installation of Full Scale Tunnel (FST) power plant. Virginia Public Service Company could not supply adequate electricity to run the wind tunnels being built at Langley. (The Propeller Research Tunnel was powered by two submarine diesel engines.) This led to the consideration of a number of different ideas for generating electric power to drive the fan motors in the FST. The main proposition involved two 3000 hp and two 1000 hp diesel engines with directly connected generators. Another proposition suggested 30 Liberty motors driving 600 hp DC generators in pairs. For a month, engineers at Langley were hopeful they could secure additional diesel engines from decommissioned Navy T-boats but the Navy could not offer a firm commitment regarding the future status of the submarines. By mid-December 1929, Virginia Public Service Company had agreed to supply service to the field at the north end of the King Street Bridge connecting Hampton and Langley Field. Thus, new plans for the FST powerplant and motors were made. Smith DeFrance described the motors in NACA TR No. 459: 'The most commonly used power plant for operating a wind tunnel is a direct-current motor and motor-generator set with Ward Leonard control system. For the FST it was found that alternating current slip-ring induction motors, together with satisfactory control equipment, could be purchased for approximately 30 percent less than the direct-current equipment. Two 4000-horsepower slip-ring induction motors with 24 steps of speed between 75 and 300 r.p.m. were therefore installed.'

  5. Excitable scale free networks

    NASA Astrophysics Data System (ADS)

    Copelli, M.; Campos, P. R. A.

    2007-04-01

    When a simple excitable system is continuously stimulated by a Poissonian external source, the response function (mean activity versus stimulus rate) generally shows a linear saturating shape. This is experimentally verified in some classes of sensory neurons, which accordingly present a small dynamic range (defined as the interval of stimulus intensity which can be appropriately coded by the mean activity of the excitable element), usually about one or two decades only. The brain, on the other hand, can handle a significantly broader range of stimulus intensity, and a collective phenomenon involving the interaction among excitable neurons has been suggested to account for the enhancement of the dynamic range. Since the role of the pattern of such interactions is still unclear, here we investigate the performance of a scale-free (SF) network topology in this dynamic range problem. Specifically, we study the transfer function of disordered SF networks of excitable Greenberg-Hastings cellular automata. We observe that the dynamic range is maximum when the coupling among the elements is critical, corroborating a general reasoning recently proposed. Although the maximum dynamic range yielded by general SF networks is slightly worse than that of random networks, for special SF networks which lack loops the enhancement of the dynamic range can be dramatic, reaching nearly five decades. In order to understand the role of loops on the transfer function we propose a simple model in which the density of loops in the network can be gradually increased, and show that this is accompanied by a gradual decrease of dynamic range.
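
    The excitable elements studied above follow Greenberg-Hastings dynamics: quiescent, excited, and refractory states, with excitation arriving either from an external Poissonian stimulus or from excited neighbors. A toy sketch on a ring; the topology, coupling probability, and stimulus rates are illustrative stand-ins for the paper's scale-free networks.

```python
import random

# Minimal Greenberg-Hastings excitable automaton: each node is quiescent
# (0), excited (1), or refractory (2). A quiescent node fires from an
# external stimulus (per-step probability r) or from an excited neighbor
# (coupling probability p). Ring topology and parameters are toy choices.

def gh_step(states, neighbors, r, p, rng):
    new = []
    for i, s in enumerate(states):
        if s == 1:
            new.append(2)                        # excited -> refractory
        elif s == 2:
            new.append(0)                        # refractory -> quiescent
        else:
            fired = rng.random() < r or any(
                states[j] == 1 and rng.random() < p for j in neighbors[i])
            new.append(1 if fired else 0)
    return new

def mean_activity(n, r, p, steps=500, rng=None):
    """Fraction of node-steps spent in the excited state."""
    rng = rng or random.Random(1)
    neighbors = [((i - 1) % n, (i + 1) % n) for i in range(n)]  # ring
    states = [0] * n
    active = 0
    for _ in range(steps):
        states = gh_step(states, neighbors, r, p, rng)
        active += states.count(1)
    return active / (steps * n)

# Response grows with stimulus rate and saturates (each firing cycle
# occupies three steps, bounding the activity near 1/3).
for r in (0.001, 0.01, 0.1, 1.0):
    print(f"r = {r:6.3f}  mean activity = {mean_activity(100, r, p=0.5):.3f}")
```

    Sweeping the stimulus rate r over several decades and recording the mean activity traces out the saturating response function whose usable width defines the dynamic range discussed in the abstract.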

  6. Contact kinematics of biomimetic scales

    SciTech Connect

    Ghosh, Ranajay; Ebrahimi, Hamid; Vaziri, Ashkan

    2014-12-08

    Dermal scales, prevalent across biological groups, considerably boost survival by providing multifunctional advantages. Here, we investigate the nonlinear mechanical effects of biomimetic scale like attachments on the behavior of an elastic substrate brought about by the contact interaction of scales in pure bending using qualitative experiments, analytical models, and detailed finite element (FE) analysis. Our results reveal the existence of three distinct kinematic phases of operation spanning linear, nonlinear, and rigid behavior driven by kinematic interactions of scales. The response of the modified elastic beam strongly depends on the size and spatial overlap of rigid scales. The nonlinearity is perceptible even in relatively small strain regime and without invoking material level complexities of either the scales or the substrate.

  7. Plague and climate: scales matter.

    PubMed

Ben-Ari, Tamara; Neerinckx, Simon; Gage, Kenneth L; Kreppel, Katharina; Laudisoit, Anne; Leirs, Herwig; Stenseth, Nils Chr

    2011-09-01

    Plague is enzootic in wildlife populations of small mammals in central and eastern Asia, Africa, South and North America, and has been recognized recently as a reemerging threat to humans. Its causative agent Yersinia pestis relies on wild rodent hosts and flea vectors for its maintenance in nature. Climate influences all three components (i.e., bacteria, vectors, and hosts) of the plague system and is a likely factor to explain some of plague's variability from small and regional to large scales. Here, we review effects of climate variables on plague hosts and vectors from individual or population scales to studies on the whole plague system at a large scale. Upscaled versions of small-scale processes are often invoked to explain plague variability in time and space at larger scales, presumably because similar scale-independent mechanisms underlie these relationships. This linearity assumption is discussed in the light of recent research that suggests some of its limitations.

  8. Natural Scales in Geographical Patterns

    PubMed Central

    Menezes, Telmo; Roth, Camille

    2017-01-01

    Human mobility is known to be distributed across several orders of magnitude of physical distances, which makes it generally difficult to endogenously find or define typical and meaningful scales. Relevant analyses, from movements to geographical partitions, seem to be relative to some ad-hoc scale, or no scale at all. Relying on geotagged data collected from photo-sharing social media, we apply community detection to movement networks constrained by increasing percentiles of the distance distribution. Using a simple parameter-free discontinuity detection algorithm, we discover clear phase transitions in the community partition space. The detection of these phases constitutes the first objective method of characterising endogenous, natural scales of human movement. Our study covers nine regions, ranging from cities to countries of various sizes and a transnational area. For all regions, the number of natural scales is remarkably low (2 or 3). Further, our results hint at scale-related behaviours rather than scale-related users. The partitions of the natural scales allow us to draw discrete multi-scale geographical boundaries, potentially capable of providing key insights in fields such as epidemiology or cultural contagion where the introduction of spatial boundaries is pivotal. PMID:28374825
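The percentile-constrained partitioning can be illustrated in miniature. The sketch below uses made-up movement data and plain connected components (via union-find) as a stand-in for the community detection used in the study; the jump in the number of parts as the distance percentile grows is the kind of discontinuity the paper detects:

```python
# Toy movement network: edges (node_a, node_b, distance_km). Two towns with
# short internal moves, linked only by long-range trips (hypothetical data).
edges = [("a1", "a2", 1), ("a2", "a3", 2), ("a1", "a3", 1.5),
         ("b1", "b2", 1), ("b2", "b3", 2), ("b1", "b3", 1.2),
         ("a1", "b1", 120), ("a3", "b2", 150)]

def components(active_edges):
    """Connected-component count via union-find -- a stand-in here for the
    community detection applied in the study."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for a, b, _ in active_edges:
        parent[find(a)] = find(b)
    return len({find(n) for n in list(parent)})

dists = sorted(d for _, _, d in edges)
for pct in (50, 75, 100):
    cut = dists[min(len(dists) - 1, int(len(dists) * pct / 100) - 1)]
    sub = [e for e in edges if e[2] <= cut]
    print(pct, components(sub))   # partition count jumps between 75 and 100
```

The two-town structure survives at the 50th and 75th distance percentiles and collapses to one component only when the long-range trips enter, mimicking a phase transition between two natural scales.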

  9. Contradictory correlations between derived scales.

    PubMed

    Thompson, M J; Hand, D J; Everitt, B S

    1991-08-01

In a clinical trial one scale of pain relief is scored backwards relative to another (high on one corresponding to low on the other), with a consequent large negative correlation. But two derived scales of total pain, obtained by multiplying average pain relief on each scale by duration of pain (common to both pain relief measurements), gave an almost zero correlation. This apparent contradiction is explained by the inverse relationship between the pain relief scales and the large differences in duration of pain experienced by the patients.

  10. Time scale independent signal transmission

    NASA Astrophysics Data System (ADS)

    Faltin, L.

    1980-05-01

    The paper presents a method which permits the conversion of time scale variations occurring during signal transmission into time shifts proportionally related to these variations. It is demonstrated that the method can be used to reject the adverse effects of the time scale variations (such as wow and flutter in magnetic tape recordings) and/or to determine the scale change exactly (such as would be required in Doppler signal processing). Finally, it is noted that since the system performance degrades with rising frequency of the time scale distortions, an upper bound for this frequency is derived.

  11. DARHT Radiographic Grid Scale Correction

    SciTech Connect

    Warthen, Barry J.

    2015-02-13

Recently it became apparent that the radiographic grid which has been used to calibrate the dimensional scale of DARHT radiographs was not centered at the location where the objects have been centered. This offset produced an error of 0.188% in the dimensional scaling of the radiographic images processed using the assumption that the grid and objects had the same center. This paper will show the derivation of the scaling correction, explain how new radiographs are being processed to account for the difference in location, and provide the details of how to correct radiographic images processed with the erroneous scale factor.
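In point-projection radiography, an axial offset between the calibration grid and the object plane yields exactly this kind of fixed percentage error. The sketch below uses hypothetical distances (not DARHT geometry) to show how a small offset maps to a 0.188% scale error:

```python
# Hypothetical point-projection geometry: magnification M = (source-to-
# detector) / (source-to-plane). If the grid sits slightly beyond the object
# plane, calibrating with it mis-scales the object by (d + offset) / d.
src_to_det = 10.0        # m, source-to-detector distance (made-up)
src_to_obj = 2.0         # m, source-to-object-plane distance (made-up)
offset = 0.00376         # m, grid offset beyond the object plane (made-up)

mag_obj = src_to_det / src_to_obj               # true object magnification
mag_grid = src_to_det / (src_to_obj + offset)   # magnification of the grid
scale_error = mag_obj / mag_grid - 1.0          # fractional scale error
print(f"{100 * scale_error:.3f}%")              # → 0.188%
```

Only the ratio offset/src_to_obj matters, which is why the error is a constant percentage for all images processed with the same geometry.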

  12. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-07-01

As the first stage in examining the mechanical reliability of protective surface oxides, the behavior of alumina scales formed on iron-aluminum alloys during high-temperature cyclic oxidation was characterized in terms of damage and spallation tendencies. Scales were thermally grown on specimens of three iron-aluminum compositions using a series of exposures to air at 1000{degrees}C. Gravimetric data and microscopy revealed substantially better integrity and adhesion of the scales grown on an alloy containing zirconium. The use of polished (rather than just ground) specimens resulted in scales that were more suitable for subsequent characterization of mechanical reliability.

  13. Extended scaling in high dimensions

    NASA Astrophysics Data System (ADS)

    Berche, B.; Chatelain, C.; Dhall, C.; Kenna, R.; Low, R.; Walter, J.-C.

    2008-11-01

    We apply and test the recently proposed 'extended scaling' scheme in an analysis of the magnetic susceptibility of Ising systems above the upper critical dimension. The data are obtained by Monte Carlo simulations using both the conventional Wolff cluster algorithm and the Prokof'ev-Svistunov worm algorithm. As already observed for other models, extended scaling is shown to extend the high-temperature critical scaling regime over a range of temperatures much wider than that achieved conventionally. It allows for an accurate determination of leading and sub-leading scaling indices, critical temperatures and amplitudes of the confluent corrections.

  14. INTERIOR VIEW SHOWING BATCH SCALES. SERIES OF FIVE SCALES WITH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW SHOWING BATCH SCALES. SERIES OF FIVE SCALES WITH SIX DIFFERENT MATERIALS. MIX SIFTED DOWN FROM SILOS ABOVE. INGREDIENTS: SAND, SODA ASH, DOLOMITE LIMESTONE, NEPHELINE SYENITE, SALT CAKE. - Chambers-McKee Window Glass Company, Batch Plant, Clay Avenue Extension, Jeannette, Westmoreland County, PA

  15. Validating Large Scale Networks Using Temporary Local Scale Networks

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The USDA NRCS Soil Climate Analysis Network and NOAA Climate Reference Networks are nationwide meteorological and land surface data networks with soil moisture measurements in the top layers of soil. There is considerable interest in scaling these point measurements to larger scales for validating ...

  16. Drift Scale THM Model

    SciTech Connect

    J. Rutqvist

    2004-10-07

This model report documents the drift scale coupled thermal-hydrological-mechanical (THM) processes model development and presents simulations of the THM behavior in fractured rock close to emplacement drifts. The modeling and analyses are used to evaluate the impact of THM processes on permeability and flow in the near-field of the emplacement drifts. The results from this report are used to assess the importance of THM processes on seepage and support in the model reports ''Seepage Model for PA Including Drift Collapse'' and ''Abstraction of Drift Seepage'', and to support arguments for exclusion of features, events, and processes (FEPs) in the analysis reports ''Features, Events, and Processes in Unsaturated Zone Flow and Transport'' and ''Features, Events, and Processes: Disruptive Events''. The total system performance assessment (TSPA) calculations do not use any output from this report. Specifically, the coupled THM process model is applied to simulate the impact of THM processes on hydrologic properties (permeability and capillary strength) and flow in the near-field rock around a heat-releasing emplacement drift. The heat generated by the decay of radioactive waste results in elevated rock temperatures for thousands of years after waste emplacement. Depending on the thermal load, these temperatures are high enough to cause boiling conditions in the rock, resulting in water redistribution and altered flow paths. These temperatures will also cause thermal expansion of the rock, with the potential of opening or closing fractures and thus changing fracture permeability in the near-field. Understanding the THM coupled processes is important for the performance of the repository because the thermally induced permeability changes potentially affect the magnitude and spatial distribution of percolation flux in the vicinity of the drift, and hence the seepage of water into the drift. This is important because a sufficient amount of water must be available within a

  17. Scale invariance vs conformal invariance

    NASA Astrophysics Data System (ADS)

    Nakayama, Yu

    2015-03-01

    In this review article, we discuss the distinction and possible equivalence between scale invariance and conformal invariance in relativistic quantum field theories. Under some technical assumptions, we can prove that scale invariant quantum field theories in d = 2 space-time dimensions necessarily possess the enhanced conformal symmetry. The use of the conformal symmetry is well appreciated in the literature, but the fact that all the scale invariant phenomena in d = 2 space-time dimensions enjoy the conformal property relies on the deep structure of the renormalization group. The outstanding question is whether this feature is specific to d = 2 space-time dimensions or it holds in higher dimensions, too. As of January 2014, our consensus is that there is no known example of scale invariant but non-conformal field theories in d = 4 space-time dimensions under the assumptions of (1) unitarity, (2) Poincaré invariance (causality), (3) discrete spectrum in scaling dimensions, (4) existence of scale current and (5) unbroken scale invariance in the vacuum. We have a perturbative proof of the enhancement of conformal invariance from scale invariance based on the higher dimensional analogue of Zamolodchikov's c-theorem, but the non-perturbative proof is yet to come. As a reference we have tried to collect as many interesting examples of scale invariance in relativistic quantum field theories as possible in this article. We give a complementary holographic argument based on the energy-condition of the gravitational system and the space-time diffeomorphism in order to support the claim of the symmetry enhancement. We believe that the possible enhancement of conformal invariance from scale invariance reveals the sublime nature of the renormalization group and space-time with holography. This review is based on a lecture note on scale invariance vs conformal invariance, on which the author gave lectures at Taiwan Central University for the 5th Taiwan School on Strings and

  18. Disparity Gradients and Depth Scaling

    DTIC Science & Technology

    1989-09-01

stimuli than for points. This depth scaling effect is discussed in a computational framework of stereo based on a Bayesian approach which allows to…

  19. Children's Scale Errors with Tools

    ERIC Educational Resources Information Center

    Casler, Krista; Eshleman, Angelica; Greene, Kimberly; Terziyan, Treysi

    2011-01-01

    Children sometimes make "scale errors," attempting to interact with tiny object replicas as though they were full size. Here, we demonstrate that instrumental tools provide special insight into the origins of scale errors and, moreover, into the broader nature of children's purpose-guided reasoning and behavior with objects. In Study 1, 1.5- to…

  20. The Differentiated Classroom Observation Scale

    ERIC Educational Resources Information Center

    Cassady, Jerrell C.; Neumeister, Kristie L. Speirs; Adams, Cheryll M.; Cross, Tracy L.; Dixon, Felicia A.; Pierce, Rebecca L.

    2004-01-01

    This article presents a new classroom observation scale that was developed to examine the differential learning activities and experiences of gifted children educated in regular classroom settings. The Differentiated Classroom Observation Scale (DCOS) is presented in total, with clarification of the coding practices and strategies. Although the…

  1. Voice, Schooling, Inequality, and Scale

    ERIC Educational Resources Information Center

    Collins, James

    2013-01-01

    The rich studies in this collection show that the investigation of voice requires analysis of "recognition" across layered spatial-temporal and sociolinguistic scales. I argue that the concepts of voice, recognition, and scale provide insight into contemporary educational inequality and that their study benefits, in turn, from paying attention to…

  2. Spiritual Competency Scale: Further Analysis

    ERIC Educational Resources Information Center

    Dailey, Stephanie F.; Robertson, Linda A.; Gill, Carman S.

    2015-01-01

    This article describes a follow-up analysis of the Spiritual Competency Scale, which initially validated ASERVIC's (Association for Spiritual, Ethical and Religious Values in Counseling) spiritual competencies. The study examined whether the factor structure of the Spiritual Competency Scale would be supported by participants (i.e., ASERVIC…

  3. Rating Scale Instruments and Measurement

    ERIC Educational Resources Information Center

    Cavanagh, Robert F.; Romanoski, Joseph T.

    2006-01-01

    The article examines theoretical issues associated with measurement in the human sciences and ensuring data from rating scale instruments are measures. An argument is made that using raw scores from rating scale instruments for subsequent arithmetic operations and applying linear statistics is less preferable than using measures. These theoretical…

  4. The Callier-Azusa Scale.

    ERIC Educational Resources Information Center

    Stillman, Robert D., Ed.

    Presented is the Callier-Azusa Scale designed to aid in the assessment of deaf-blind and multihandicapped children in the areas of motor development, perceptual abilities, daily living skills, language development, and socialization. The scale is said to be predicated on the assumption that given the appropriate environment all children follow the…

  5. A Scale of Mobbing Impacts

    ERIC Educational Resources Information Center

    Yaman, Erkan

    2012-01-01

    The aim of this research was to develop the Mobbing Impacts Scale and to examine its validity and reliability analyses. The sample of study consisted of 509 teachers from Sakarya. In this study construct validity, internal consistency, test-retest reliabilities and item analysis of the scale were examined. As a result of factor analysis for…

  6. Profile Analysis: Multidimensional Scaling Approach.

    ERIC Educational Resources Information Center

    Ding, Cody S.

    2001-01-01

    Outlines an exploratory multidimensional scaling-based approach to profile analysis called Profile Analysis via Multidimensional Scaling (PAMS) (M. Davison, 1994). The PAMS model has the advantages of being applied to samples of any size easily, classifying persons on a continuum, and using person profile index for further hypothesis studies, but…

  7. Evaluation of Behavioral Expectation Scales.

    ERIC Educational Resources Information Center

    Zedeck, Sheldon; Baker, Henry T.

    Behavioral Expectation Scales developed by Smith and Kendall were evaluated. Results indicated slight interrater reliability between Head Nurses and Supervisors, moderate dependence among five performance dimensions, and correlation between two scales and tenure. Results are discussed in terms of procedural problems, critical incident problems,…

  8. Chip-scale microscopy imaging.

    PubMed

    Zheng, Guoan

    2012-08-01

    Chip-scale microscopy imaging platforms are pivotal for improving the efficiency of modern biomedical and bioscience experiments. Their integration with other lab-on-a-chip techniques would allow rapid, reliable and high-throughput sample analysis for applications in diverse disciplines. In typical chip-scale microscopy imaging platforms, the light path can be generalized to the following steps: photons leave the light source, interact with the sample and finally are detected by the sensor. Based on the light path of these platforms, the current review aims to provide some insights on design strategies for chip-scale microscopy. Specifically, we analyze current chip-scale microscopy approaches from three aspects: illumination design, sample manipulation and substrate/imager modification. We also discuss some opportunities for future developments of chip-scale microscopy, such as time multiplexed structured illumination and hydrodynamic focusing for high throughput sample manipulation.

  9. Rasch rating scale analysis of the Attitudes Toward Research Scale.

    PubMed

    Papanastasiou, Elena C; Schumacker, Randall

    2014-01-01

College students may view research methods courses with negative attitudes; however, few studies have investigated this issue due to the lack of instruments that measure the students' attitudes towards research. Therefore, the purpose of this study was to examine the psychometric properties of an Attitudes Toward Research Scale using Rasch rating scale analysis. Assessment of attitudes toward research is essential to determine if students have negative attitudes towards research and assist instructors in better facilitation of learning research methods in their courses. The results of this study have shown that a thirty item Attitudes Toward Research Scale yielded scores with high person and item reliability.
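The Rasch rating scale (Rasch-Andrich) model underlying this analysis has a compact closed form. A minimal sketch, with illustrative threshold values rather than the study's estimates:

```python
import math

def rsm_probs(theta, delta, thresholds):
    """Category probabilities for the Rasch-Andrich rating scale model,
    defined by the adjacent-category logits
        log(P(X=k) / P(X=k-1)) = theta - delta - tau_k,
    where theta is person ability, delta is item difficulty, and tau_k are
    the category thresholds (tau_1..tau_m for an (m+1)-category item)."""
    taus = [0.0] + list(thresholds)
    cum, logits = 0.0, []
    for tau in taus:
        cum += theta - delta - tau     # unnormalised log-probability
        logits.append(cum)
    z = [math.exp(v) for v in logits]
    s = sum(z)
    return [v / s for v in z]

# A respondent 0.5 logits above the item on a 5-category Likert-type item;
# the thresholds here are illustrative only.
probs = rsm_probs(theta=0.5, delta=0.0, thresholds=[-1.5, -0.5, 0.5, 1.5])
print([round(p, 3) for p in probs])   # → [0.018, 0.132, 0.359, 0.359, 0.132]
```

Because all items share one threshold set, the rating scale model is the natural Rasch family member for Likert-type instruments like this one.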

  10. Important Scaling Parameters for Testing Model-Scale Helicopter Rotors

    NASA Technical Reports Server (NTRS)

    Singleton, Jeffrey D.; Yeager, William T., Jr.

    1998-01-01

An investigation into the effects of aerodynamic and aeroelastic scaling parameters on model scale helicopter rotors has been conducted in the NASA Langley Transonic Dynamics Tunnel. The effect of varying Reynolds number, blade Lock number, and structural elasticity on rotor performance has been studied and the performance results are discussed herein for two different rotor blade sets at two rotor advance ratios. One set of rotor blades was rigid and the other set of blades was dynamically scaled to be representative of a main rotor design for a utility class helicopter. The investigation was conducted at several test-medium densities, which permits the acquisition of data for several Reynolds and Lock number combinations.
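The two scaling parameters named above are both simple algebraic functions of the test conditions, which is why varying the test-medium density trades them off. A sketch with made-up model-rotor values (not data from this test), assuming the usual definitions of tip Reynolds number and blade Lock number:

```python
# Made-up model-rotor values, for illustration only.
rho = 1.225        # kg/m^3, test-medium density
mu = 1.81e-5       # kg/(m s), test-medium viscosity
a = 5.7            # per rad, blade-section lift-curve slope
chord = 0.066      # m, blade chord
R = 1.43           # m, rotor radius
omega = 68.0       # rad/s, rotor rotational speed
I_beta = 0.25      # kg m^2, blade flapping inertia

tip_speed = omega * R
reynolds = rho * tip_speed * chord / mu     # tip Reynolds number
lock = rho * a * chord * R**4 / I_beta      # Lock number (aero/inertia ratio)

print(f"Re = {reynolds:.3g}, gamma = {lock:.3g}")
```

Both numbers scale linearly with rho, so raising the test-medium density at fixed rotor speed raises Reynolds and Lock number together; matching them independently requires changing blade inertia or speed as well.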

  11. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  12. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  13. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  14. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  15. Weyl current, scale-invariant inflation, and Planck scale generation

    NASA Astrophysics Data System (ADS)

    Ferreira, Pedro G.; Hill, Christopher T.; Ross, Graham G.

    2017-02-01

    Scalar fields, ϕi, can be coupled nonminimally to curvature and satisfy the general criteria: (i) the theory has no mass input parameters, including MP=0 ; (ii) the ϕi have arbitrary values and gradients, but undergo a general expansion and relaxation to constant values that satisfy a nontrivial constraint, K (ϕi)=constant; (iii) this constraint breaks scale symmetry spontaneously, and the Planck mass is dynamically generated; (iv) there can be adequate inflation associated with slow roll in a scale-invariant potential subject to the constraint; (v) the final vacuum can have a small to vanishing cosmological constant; (vi) large hierarchies in vacuum expectation values can naturally form; (vii) there is a harmless dilaton which naturally eludes the usual constraints on massless scalars. These models are governed by a global Weyl scale symmetry and its conserved current, Kμ. At the quantum level the Weyl scale symmetry can be maintained by an invariant specification of renormalized quantities.

  16. Scale-dependent halo bias from scale-dependent growth

    SciTech Connect

    Parfrey, Kyle; Hui, Lam; Sheth, Ravi K.

    2011-03-15

    We derive a general expression for the large-scale halo bias, in theories with a scale-dependent linear growth, using the excursion set formalism. Such theories include modified-gravity models, and models in which the dark energy clustering is non-negligible. A scale dependence is imprinted in both the formation and evolved biases by the scale-dependent growth. Mergers are accounted for in our derivation, which thus extends earlier work which focused on passive evolution. There is a simple analytic form for the bias for those theories in which the nonlinear collapse of perturbations is approximately the same as in general relativity. As an illustration, we apply our results to a simple Yukawa modification of gravity, and use Sloan Digital Sky Survey measurements of the clustering of luminous red galaxies to constrain the theory's parameters.

  17. Full-Scale Wind Tunnel

    NASA Technical Reports Server (NTRS)

    1931-01-01

Construction of Full-Scale Tunnel (FST) balance. Smith DeFrance described the 6-component type balance in NACA TR No. 459 (which also includes a schematic diagram of the balance and its various parts). 'Ball and socket fittings at the top of each of the struts hold the axles of the airplane to be tested; the tail is attached to the triangular frame. These struts are secured to the turntable, which is attached to the floating frame. This frame rests on the struts (next to the concrete piers on all four corners), which transmit the lift forces to the scales (partially visible on the left). The drag linkage is attached to the floating frame on the center line and, working against a known counterweight, transmits the drag force to the scale (center, face out). The cross-wind force linkages are attached to the floating frame on the front and rear sides at the center line. These linkages, working against known counterweights, transmit the cross-wind force to scales (two front scales, face in). In the above manner the forces in three directions are measured and by combining the forces and the proper lever arms, the pitching, rolling, and yawing moments can be computed. The scales are of the dial type and are provided with solenoid-operated printing devices. When the proper test condition is obtained, a push-button switch is momentarily closed and the readings on all seven scales are recorded simultaneously, eliminating the possibility of personal errors.'

  18. Scaling limits of a model for selection at two scales

    NASA Astrophysics Data System (ADS)

    Luo, Shishi; Mattingly, Jonathan C.

    2017-04-01

The dynamics of a population undergoing selection is a central topic in evolutionary biology. This question is particularly intriguing in the case where selective forces act in opposing directions at two population scales. For example, a fast-replicating virus strain outcompetes slower-replicating strains at the within-host scale. However, if the fast-replicating strain causes host morbidity and is less frequently transmitted, it can be outcompeted by slower-replicating strains at the between-host scale. Here we consider a stochastic ball-and-urn process which models this type of phenomenon. We prove the weak convergence of this process under two natural scalings. The first scaling leads to a deterministic nonlinear integro-partial differential equation on the interval [0,1] with dependence on a single parameter, λ. We show that the fixed points of this differential equation are Beta distributions and that their stability depends on λ and the behavior of the initial data around 1. The second scaling leads to a measure-valued Fleming–Viot process, an infinite dimensional stochastic process that is frequently associated with population genetics.

  19. Geometric scaling as traveling waves.

    PubMed

    Munier, S; Peschanski, R

    2003-12-05

    We show the relevance of the nonlinear Fisher and Kolmogorov-Petrovsky-Piscounov (KPP) equation to the problem of high energy evolution of the QCD amplitudes. We explain how the traveling wave solutions of this equation are related to geometric scaling, a phenomenon observed in deep-inelastic scattering experiments. Geometric scaling is for the first time shown to result from an exact solution of nonlinear QCD evolution equations. Using general results on the KPP equation, we compute the velocity of the wave front, which gives the full high energy dependence of the saturation scale.
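The front-velocity statement can be checked numerically on the Fisher-KPP equation itself (this sketch is not the QCD evolution equations, only the underlying reaction-diffusion model): for u_t = D u_xx + r u(1 - u), a front developing from localized initial data travels at the marginal speed v* = 2*sqrt(rD), up to a slow logarithmic correction in the front position.

```python
import math

# Explicit finite differences for u_t = D u_xx + r u (1 - u), D = r = 1,
# so the marginal front speed is v* = 2.
D, r = 1.0, 1.0
dx, dt = 0.25, 0.02          # stable: dt < dx**2 / (2 * D)
n = 560                      # domain [0, 140]
u = [1.0 if i * dx < 5.0 else 0.0 for i in range(n)]

def front(u):
    """Position where u drops through 0.5 (linear interpolation)."""
    for i in range(n - 1):
        if u[i] >= 0.5 > u[i + 1]:
            return (i + (u[i] - 0.5) / (u[i] - u[i + 1])) * dx
    return 0.0

pos = {}
for step in range(2501):     # integrate to t = 50
    if step in (1000, 2500): # record the front at t = 20 and t = 50
        pos[step] = front(u)
    new = u[:]
    for i in range(1, n - 1):
        lap = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dx ** 2
        new[i] = u[i] + dt * (D * lap + r * u[i] * (1.0 - u[i]))
    u = new                  # endpoints stay pinned at u = 1 and u = 0

v = (pos[2500] - pos[1000]) / 30.0
print(round(v, 2), 2.0 * math.sqrt(r * D))   # measured speed vs v* = 2
```

The measured speed sits slightly below 2 because of the (3/2) ln t correction to the front position, the same pulled-front effect that controls the energy dependence of the saturation scale in the geometric scaling analysis.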

  20. Scaling relations for magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Landeros, P.; Escrig, J.; Altbir, D.; Laroze, D.; D'Albuquerque E Castro, J.; Vargas, P.

    2005-03-01

    A detailed investigation of the scaling relations recently proposed [J. d’Albuquerque e Castro, D. Altbir, J. C. Retamal, and P. Vargas, Phys. Rev. Lett. 88, 237202 (2002)] to study the magnetic properties of nanoparticles is presented. Analytical expressions for the total energy of three characteristic internal configurations of the particles are obtained, in terms of which the behavior of the magnetic phase diagram for those particles upon scaling of the exchange interaction is discussed. The exponent η in scaling relations is shown to be dependent on the geometry of the vortex core, and results for specific cases are presented.

  1. Evaluating the Effectiveness of the 1999-2000 NASA CONNECT Program

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Frank, Kari Lou

    2002-01-01

    NASA CONNECT is a standards-based, integrated mathematics, science, and technology series of 30-minute instructional distance learning (satellite and television) programs for students in grades 6-8. Each of the five programs in the 1999-2000 NASA CONNECT series included a lesson, an educator guide, a student activity or experiment, and a web-based component. In March 2000, a mail (self-reported) survey (booklet) was sent to a randomly selected sample of 1,000 NASA CONNECT registrants. A total of 336 surveys (269 usable) were received by the established cut-off date. Most survey questions employed a 5-point Likert-type response scale. Survey topics included (1) instructional technology and teaching, (2) instructional programming and technology in the classroom, (3) the NASA CONNECT program, (4) classroom use of computer technology, and (5) demographics. About 73% of the respondents were female, about 92% identified "classroom teacher" as their present professional duty, about 90% worked in a public school, and about 62% held a master's degree or master's equivalency. Regarding NASA CONNECT, respondents reported that (1) they used the five programs in the 1999-2000 NASA CONNECT series; (2) the stated objectives for each program were met (4.54); (3) the programs were aligned with the national mathematics, science, and technology standards (4.57); (4) program content was developmentally appropriate for grade level (4.17); and (5) the programs in the 1999-2000 NASA CONNECT series enhanced/enriched the teaching of mathematics, science, and technology (4.51).

  2. Evaluating the Effectiveness of the 1998-1999 NASA CONNECT Program

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Frank, Kari Lou; House, Patricia L.

    2000-01-01

    NASA CONNECT is a standards-based, integrated mathematics, science, and technology series of 30-minute instructional distance learning (satellite and television) programs for students in grades 5-8. Each of the five programs in the 1998-1999 NASA CONNECT series included a lesson, an educator guide, a student activity or experiment, and a web-based component. In March 1999, a mail (self-reported) survey (booklet) was sent to a randomly selected sample of 1,000 NASA CONNECT registrants. A total of 401 surveys (351 usable) were received by the established cut-off date. Most survey questions employed a 5-point Likert-type response scale. Survey topics included: (1) instructional technology and teaching, (2) instructional programming and technology in the classroom, (3) the NASA CONNECT program, (4) classroom use of computer technology, and (5) demographics. About 68% of the respondents were female, about 88% identified "classroom teacher" as their present professional duty, about 75% worked in a public school, and about 67% held a master's degree or master's equivalency. Regarding NASA CONNECT, respondents reported that: (1) they used the five programs in the 1998-1999 NASA CONNECT series; (2) the stated objectives for each program were met (4.49); (3) the programs were aligned with the national mathematics, science, and technology standards (4.61); (4) program content was developmentally appropriate for grade level (4.25); and (5) the programs in the 1998-1999 NASA CONNECT series enhanced/enriched the teaching of mathematics, science, and technology (4.45).

  3. Relation between malodor, ambient hydrogen sulfide, and health in a community bordering a landfill

    PubMed Central

    Heaney, Christopher D.; Wing, Steve; Campbell, Robert L.; Caldwell, David; Hopkins, Barbara; Richardson, David; Yeatts, Karin

    2011-01-01

Background Municipal solid waste landfills are sources of air pollution that may affect the health and quality of life of neighboring communities. Objectives To investigate health and quality of life concerns of neighbors related to landfill air pollution. Methods Landfill neighbors were enrolled and kept twice-daily diaries for 14 d about odor intensity, alteration of daily activities, mood states, and irritant and other physical symptoms between January and November 2009. Concurrently, hydrogen sulfide (H2S) air measurements were recorded every 15 min. Relationships between H2S, odor, and health outcomes were evaluated using conditional fixed effects regression models. Results Twenty-three participants enrolled and completed 878 twice-daily diary entries. H2S measurements were recorded over a period of 80 d and 1-hr average H2S = 0.22 ppb (SD = 0.27; range: 0–2.30 ppb). Landfill odor increased 0.63 points (on a 5-point Likert-type scale) for every 1 ppb increase in hourly average H2S when the wind was blowing from the landfill towards the community (95% confidence interval (CI): 0.29, 0.91). Odor was strongly associated with reports of alteration of daily activities (odds ratio (OR) = 9.0; 95% CI: 3.5, 23.5), negative mood states (OR = 5.2; 95% CI: 2.8, 9.6), mucosal irritation (OR = 3.7; 95% CI = 2.0, 7.1) and upper respiratory symptoms (OR = 3.9; 95% CI: 2.2, 7.0), but not positive mood states (OR = 0.6; 95% CI: 0.2, 1.5) and gastrointestinal (GI) symptoms (OR = 1.0; 95% CI: 0.4, 2.6). Conclusions Results suggest air pollutants from a regional landfill negatively impact the health and quality of life of neighbors. PMID:21679938

  4. Assessing cognitive therapy skills comprehension, acquisition, and use by means of an independent observer version of the Skills of Cognitive Therapy (SoCT-IO).

    PubMed

    Brown, Gregory K; Thase, Michael E; Vittengl, Jeffrey R; Borman, Patricia D; Clark, Lee Anna; Jarrett, Robin B

    2016-02-01

    The purposes of this study were (a) to describe the adaptation and psychometric properties of the Skills of Cognitive Therapy (SoCT) measure for use by an independent observer (SoCT-IO) who rates the cognitive therapy (CT) skill acquisition, comprehension, and use by depressed adults and (b) to compare ratings of CT skill comprehension, acquisition, and use by independent observers to those by patients and therapists. Like the other SoCT versions, the SoCT-IO consists of 8 items that assess patients' comprehension, acquisition, and use of cognitive and behavioral skills for managing depressive symptoms, using a 5-point Likert-type scale. Four experienced raters (2 doctoral-level CT therapists and 2 bachelor-level nontherapists) used the SoCT-IO to rate 80 CT videotapes from both mid and later sessions in acute-phase CT from a randomized controlled trial for outpatients with recurrent major depression. The SoCT-IO ratings showed excellent internal consistency reliability and moderately high interrater reliability. Concurrent validity was demonstrated by convergence of the SoCT-IO with 2 other versions of the SoCT, 1 completed by therapists (SoCT-O) and the other by patients (SoCT-P). SoCT-IO ratings evidenced good predictive validity: independent observers' ratings of patient CT skills midphase in therapy predicted treatment response even when the predictive effects of SoCT ratings by therapists and patients were controlled. The SoCT-IO is a psychometrically sound measure of CT skill comprehension, acquisition, and use for rating outpatients with recurrent depression. The clinical utility and implications for using the SoCT-IO as a measure of CT skills acquisition are discussed.

  5. Quit Smoking Experts’ Opinions toward Quality and Results of Quit Smoking Methods Provided in Tobacco Cessation Services Centers in Iran

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ebn Ahmady, Arezoo; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad B.; Fadaizadeh, Lida

    2015-01-01

    Background: One of the core responsibilities of a health system is to treat tobacco dependence. This treatment includes different methods such as simple medical consultation, medication, and telephone counseling. Objective: To assess physicians’ opinions of the quality and results of the different quit smoking methods provided in tobacco cessation services centers in Iran. Methods: In this cross-sectional and descriptive study, random sampling of all quit centers at the country level was used to obtain a representative sample of 100 physicians. Physicians completed a self-administered questionnaire which contained 10 questions regarding the quality, cost, effect, side effects, and the results of quitting methods using a 5-point Likert-type scale. Percentages, frequencies, means, t-tests, and analyses of variance were computed for all study variables. Results: Most experts preferred combination quit smoking methods, followed by Nicotine Replacement Therapy (NRT), cited by 26 and 23 physicians, respectively. The least used methods were the quit line and some methods without medication, with 3 cases each. The methods which gained the maximum scores were telephone consultation, acupuncture, willpower, Champix, the combined method, and Interactive Voice Response (IVR), with means of 23.3, 23, 22.5, 22, 21.7, and 21.3, respectively. The minimum scores were for the e-cigarette, some methods without medication, and non-NRT medication, with means of 12.3, 15.8, and 16.2, respectively. There were no significant differences in mean scores across cities (P = 0.256). Analysis of variance showed significant differences in the mean scores of the different methods (P < 0.001). Conclusions: According to physicians, acupuncture, personal methods, and Champix are the most effective methods, and they could be much more feasible and cost-effective than other methods. PMID:26425329
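Records like this one summarize 5-point Likert-type ratings as means and compare groups with t-tests. A minimal sketch of that kind of comparison, using hypothetical ratings for two methods and Welch's unequal-variance t statistic (the abstract does not state which t-test variant the study used):

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees of freedom."""
    vx, vy = variance(x) / len(x), variance(y) / len(y)
    t = (mean(x) - mean(y)) / math.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx ** 2 / (len(x) - 1) + vy ** 2 / (len(y) - 1))
    return t, df

# Hypothetical 5-point Likert ratings for two quit-smoking methods.
combined_therapy = [5, 4, 4, 5, 3, 4, 5, 4]
e_cigarette      = [2, 3, 2, 1, 3, 2, 2, 3]
t_stat, df = welch_t(combined_therapy, e_cigarette)
print(f"means: {mean(combined_therapy):.2f} vs {mean(e_cigarette):.2f}, "
      f"t = {t_stat:.2f}, df = {df:.1f}")
```

Treating ordinal Likert responses as interval data (so that means and t-tests apply) is itself a methodological choice, one the Likert-scale literature in this record set debates.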

  6. Download Alert: Understanding Gastroenterology Patients' Perspectives on Health-Related Smartphone Apps

    PubMed Central

    Zia, Jasmine K; Le, Thai; Munson, Sean; Heitkemper, Margaret M; Demiris, George

    2015-01-01

    Objectives: The aims of this study were to understand patients' willingness to use different types of health-related smartphone apps and to explore their attitudes on the overall value, usability, feasibility, credibility, intrusiveness, and obtrusiveness of these apps. Methods: Questionnaires were distributed to adult patients presenting to gastroenterology clinics at an academic medical center. The 25-question survey consisted of 5-point Likert-type scale statements, multiple-choice questions, and open-ended questions. Results: Participants were mainly White (N=94, 78%) and smartphone owners (N=125, 93%). The mean age was 40.8 years (N=121, s.d.=13.2). Participants were willing to use most types of apps unless it monitored their location or social networking activity. Half were less willing to use an app if it required a visible accessory. Most participants were willing to use a health-related app up to 5 min a day indefinitely but unwilling to pay out-of-pocket for it. Participants generally disagreed that an app would be hard to learn how to use, interfere with their daily routine, or be embarrassing to use in public. Overall, participants felt that health-related apps could help them and their doctors better manage their medical problems, but were neutral in trusting their quality. Most worried that personal information used for an app would fall into the wrong hands. Conclusion: Gastroenterology patients were willing to use and valued most types of health-related apps. They perceived this technology as feasible, usable, and relatively unobtrusive unless a visible accessory was required. However, many were concerned about their privacy. PMID:26133109

  7. The impact of gross anatomy laboratory on first year medical students' interest in a surgical career.

    PubMed

    Pulcrano, Marisa E; Malekzadeh, Sonya; Kumar, Anagha

    2016-09-01

    This study sought to determine the impact of gross anatomy laboratory (GA) on first year medical students' (M1) interest in a surgical career. Secondary objectives included identifying other influences in M1s' career decision making. This prospective study included surveys before and after GA. All M1s enrolled in GA were invited to participate. Sixty students completed both the pre- and post-test surveys. A 5-point Likert-type scale surveyed participants' interests, specific personality traits, experience during the course of GA, and likelihood of pursuing a surgical career. Statistical analysis included the Wilcoxon signed-rank test and a polychotomous ordinal logistic regression model. Students' desire to work with their hands increased (50 vs. 33.3%) and enjoyment working with instruments and tools similarly increased (50 vs. 41.7%). Likelihood of pursuing a surgical career after gross anatomy increased in 31.7% of students, decreased in 16.7%, and was unchanged in 51.7%. Over 75% of students with a prior interest in surgery and 21% of those who previously felt neutral agreed that they were likely to pursue a career in surgery at the conclusion of the laboratory. Students with a surgeon family member were 0.1976 times as likely to exhibit a positive change in interest (P = 0.024). Gross anatomy may influence up to a third of the class to consider a surgical career, especially those with a prior interest in surgery and those previously feeling ambivalent. Students with a surgeon family member became less likely to enter a surgical career after gross anatomy. Clin. Anat. 29:691-695, 2016. © 2016 Wiley Periodicals, Inc.

  8. Sources of evidence in HIV/AIDS care: pilot study comparing family physicians and AIDS service organization staff

    PubMed Central

    Stefanski, Kasia E; Tracy, C Shawn; Upshur, Ross EG

    2004-01-01

    Background The improvement of the quality of the evidence used in treatment decision-making is especially important in the case of patients with complicated disease processes such as HIV/AIDS for which multiple treatment strategies exist with conflicting reports of efficacy. Little is known about the perceptions of distinct groups of health care workers regarding various sources of evidence and how these influence the clinical decision-making process. Our objective was to investigate how two groups of treatment information providers for people living with HIV/AIDS perceive the importance of various sources of treatment information. Methods Surveys were distributed to staff at two local AIDS service organizations and to family physicians at three community health centres treating people living with HIV/AIDS. Participants were asked to rate the importance of 10 different sources of evidence for HIV/AIDS treatment information on a 5-point Likert-type scale. Mean rating scores and relative rankings were compared. Results Findings suggest that a discordance exists between the two health information provider groups in terms of their perceptions of the various sources of evidence. Furthermore, AIDS service organization staff ranked health care professionals as the most important source of information whereas physicians deemed AIDS service organizations to be relatively unimportant. The two groups appear to share a common mistrust for information from pharmaceutical industries. Conclusions Discordance exists between medical "experts" from different backgrounds relating to their perceptions of evidence. Further investigation is warranted in order to reveal any effects on the quality of treatment information and implications in the decision-making process. Possible effects on collaboration and working relationships also warrant further exploration. PMID:15245578

  9. Clinical trials involving cats: What factors affect owner participation?

    PubMed Central

    Gruen, Margaret E; Jiamachello, Katrina N; Thomson, Andrea; Lascelles, BDX

    2014-01-01

    Clinical trials are frequently hindered by difficulty recruiting eligible participants, increasing the timeline and limiting generalizability of results. In veterinary medicine, where proxy enrollment is required, no studies have detailed what factors influence owner participation in studies involving cats. We aimed to investigate these factors through a survey of owners at first opinion practices. The survey was designed using feedback from a pilot study and input from clinical researchers. Owners were asked demographic questions and whether they would, would not, or were unsure about participating in a clinical trial with their cat. They then ranked the importance and influence of various factors on participation using a 5-point Likert-type scale, and incentives from most to least encouraging. A total of 413 surveys were distributed to cat owners at four hospitals, two feline-only and two multi-species; 88.6% were completed. Data for importance and influence factors as well as incentive rankings were analyzed overall, by hospital type, location and whether owners would consider participating. The most influential factors were trust in the organization, benefit to the cat and veterinarian recommendation. Importance and influence factors varied by willingness to participate. Ranked incentives were not significantly different across groups, with “Free Services” ranked highest. This study provides a first look at what factors influence participation in clinical trials with cats. Given the importance placed in the recommendation of veterinarians, continued work is needed to determine veterinarian related factors affecting clinical trial participation. The results provide guidance towards improved clinical trial design, promotion and education. PMID:24938313

  10. Validation of NOViSE.

    PubMed

    Korzeniowski, Przemyslaw; Brown, Daniel C; Sodergren, Mikael H; Barrow, Alastair; Bello, Fernando

    2017-02-01

    The goal of this study was to establish face, content, and construct validity of NOViSE-the first force-feedback enabled virtual reality (VR) simulator for natural orifice transluminal endoscopic surgery (NOTES). Fourteen surgeons and surgical trainees performed 3 simulated hybrid transgastric cholecystectomies using a flexible endoscope on NOViSE. Four of them were classified as "NOTES experts" who had independently performed 10 or more simulated or human NOTES procedures. Seven participants were classified as "Novices" and 3 as "Gastroenterologists" with no or minimal NOTES experience. A standardized 5-point Likert-type scale questionnaire was administered to assess the face and content validity. NOViSE showed good overall face and content validity. In 14 out of 15 statements pertaining to face validity (graphical appearance, endoscope and tissue behavior, overall realism), ≥50% of responses were "agree" or "strongly agree." In terms of content validity, 85.7% of participants agreed or strongly agreed that NOViSE is a useful training tool for NOTES and 71.4% that they would recommend it to others. Construct validity was established by comparing a number of performance metrics such as task completion times, path lengths, applied forces, and so on. NOViSE demonstrated early signs of construct validity. Experts were faster and used a shorter endoscopic path length than novices in all but one task. The results indicate that NOViSE authentically recreates a transgastric hybrid cholecystectomy and sets promising foundations for the further development of a VR training curriculum for NOTES without compromising patient safety or requiring expensive animal facilities.

  11. Self-Confidence in and Perceived Utility of the Physical Examination: A Comparison of Medical Students, Residents, and Faculty Internists

    PubMed Central

    Fagan, Mark J.; Reinert, Steven E.; Diaz, Joseph A.

    2007-01-01

    BACKGROUND AND OBJECTIVES Little is known about the differences in attitudes of medical students, Internal Medicine residents, and faculty Internists toward the physical examination. We sought to investigate these groups’ self-confidence in and perceived utility of physical examination skills. DESIGN AND PARTICIPANTS Cross-sectional survey of third- and fourth-year medical students, Internal Medicine residents, and faculty Internists at an academic teaching hospital. MEASUREMENTS Using a 5-point Likert-type scale, respondents indicated their self-confidence in overall physical examination skill, as well as their ability to perform 14 individual skills, and how useful they felt the overall physical examination, and each skill, to be for yielding clinically important information. RESULTS The response rate was 80% (302/376). The skills with overall mean self-confidence ratings less than “neutral” were interpreting a diastolic murmur (2.9), detecting a thyroid nodule (2.8), and the nondilated fundoscopic examination using an ophthalmoscope to assess retinal vasculature (2.5). No skills had a mean utility rating less than neutral. The skills with the greatest numerical differences between mean self-confidence and perceived utility were distinguishing between a mole and melanoma (1.5), detecting a thyroid nodule (1.4), and interpreting a diastolic murmur (1.3). Regarding overall self-confidence, third-year students’ ratings (3.3) were similar to those of first-year residents (3.4; p = .95) but less than those of fourth-year students (3.8; p = .002), upper-level residents (3.7; p = .01), and faculty Internists (3.9; p < .001). CONCLUSIONS Self-confidence in the physical exam does not necessarily increase at each stage of training. The differences found between self-confidence and perceived utility for a number of skills suggest important areas for educational interventions. PMID:17922165

  12. Preference of undergraduate students after first experience on nickel-titanium endodontic instruments

    PubMed Central

    Kwak, Sang Won; Cheung, Gary Shun-Pan; Ha, Jung-Hong; Kim, Sung Kyo; Lee, Hyojin

    2016-01-01

    Objectives This study aimed to compare two nickel-titanium systems (rotary vs. reciprocating) for their acceptance by undergraduate students who experienced nickel-titanium (NiTi) instruments for the first time. Materials and Methods Eighty-one sophomore dental students were first taught manual root canal preparation with stainless-steel files. After that, they were instructed on the use of the ProTaper Universal system (PTU, Dentsply Maillefer), then the WaveOne (WO, Dentsply Maillefer). They practiced with each system on 2 extracted molars, before using those files to shape the buccal or mesial canals of additional first molars. A questionnaire was completed after using each file system, seeking students' perception of the 'Ease of use', 'Flexibility', 'Cutting-efficiency', 'Screwing-effect', 'Feeling-safety', and 'Instrumentation-time' of the NiTi files, relative to stainless-steel instrumentation, on a 5-point Likert-type scale. They were also requested to indicate their preference between the two systems. Data were compared between groups using the t-test, and the chi-square test was used to correlate each perception value with the preferred choice (p = 0.05). Results Among the 81 students, 55 indicated their preferred file system as WO and 22 as PTU. All scores were greater than 4 (better) for both systems, compared with stainless-steel files, except for 'Screwing-effect' for PTU. The scores for WO in the categories of 'Flexibility', 'Screwing-effect', and 'Feeling-safety' were significantly higher than those of PTU. A significant association between the 'Screwing-effect' and students' preference for WO was observed. Conclusions Novice operators preferred nickel-titanium instruments to stainless-steel ones, and the majority opted for the reciprocating file instead of the continuous-rotation system. PMID:27508158

  13. Assessment of Different Quit Smoking Methods Selected by Patients in Tobacco Cessation Centers in Iran

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad B.; Fadaizadeh, Lida

    2015-01-01

    Background: Health systems play key roles in identifying tobacco users and providing evidence-based care to help them quit. This treatment includes different methods such as simple medical consultation, medication, and telephone counseling. Objective: To assess the different quit smoking methods selected by patients in tobacco cessation centers in Iran in order to identify those that are most appropriate for the country health system. Methods: In this cross-sectional and descriptive study, a random sample of all quit centers at the country level was used to obtain a representative sample. Patients completed a self-administered questionnaire which contained 10 questions regarding the quality, cost, effect, side effects, and the results of quitting methods using a 5-point Likert-type scale. Percentages, frequencies, means, t-tests, and analyses of variance were computed for all study variables. Results: A total of 1063 smokers returned completed survey questionnaires. The most frequently used methods were Nicotine Replacement Therapy (NRT) and combination therapy (NRT and counseling), reported by 228 and 163 individuals, respectively. The least used methods were hypnotism (n = 8) and the quit-and-win contest (n = 17). The methods which gained the maximum scores were, respectively, the combined method, personal methods, and Champix, with means of 21.4, 20.4, and 18.4. The minimum scores were for e-cigarettes, hypnotism, and education, with means of 12.8, 11, and 10.8, respectively. There were significant differences in mean scores across cities and across methods. Conclusions: According to smokers’ ratings, combined therapy, personal methods, and Champix are the most effective quit smoking methods, and they merit greater consideration in the country health system. PMID:26442750

  14. Validation of NOViSE

    PubMed Central

    Korzeniowski, Przemyslaw; Brown, Daniel C.; Sodergren, Mikael H.; Barrow, Alastair; Bello, Fernando

    2016-01-01

    The goal of this study was to establish face, content, and construct validity of NOViSE—the first force-feedback enabled virtual reality (VR) simulator for natural orifice transluminal endoscopic surgery (NOTES). Fourteen surgeons and surgical trainees performed 3 simulated hybrid transgastric cholecystectomies using a flexible endoscope on NOViSE. Four of them were classified as “NOTES experts” who had independently performed 10 or more simulated or human NOTES procedures. Seven participants were classified as “Novices” and 3 as “Gastroenterologists” with no or minimal NOTES experience. A standardized 5-point Likert-type scale questionnaire was administered to assess the face and content validity. NOViSE showed good overall face and content validity. In 14 out of 15 statements pertaining to face validity (graphical appearance, endoscope and tissue behavior, overall realism), ≥50% of responses were “agree” or “strongly agree.” In terms of content validity, 85.7% of participants agreed or strongly agreed that NOViSE is a useful training tool for NOTES and 71.4% that they would recommend it to others. Construct validity was established by comparing a number of performance metrics such as task completion times, path lengths, applied forces, and so on. NOViSE demonstrated early signs of construct validity. Experts were faster and used a shorter endoscopic path length than novices in all but one task. The results indicate that NOViSE authentically recreates a transgastric hybrid cholecystectomy and sets promising foundations for the further development of a VR training curriculum for NOTES without compromising patient safety or requiring expensive animal facilities. PMID:27671036

  15. Generic dynamic scaling in kinetic roughening

    PubMed

    Ramasco; Lopez; Rodriguez

    2000-03-06

    We study the dynamic scaling hypothesis in invariant surface growth. We show that the existence of power-law scaling of the correlation functions (scale invariance) does not determine a unique dynamic scaling form of the correlation functions, which leads to the different anomalous forms of scaling recently observed in growth models. We derive all the existing forms of anomalous dynamic scaling from a new generic scaling ansatz. The different scaling forms are subclasses of this generic scaling ansatz associated with bounds on the roughness exponent values. The existence of a new class of anomalous dynamic scaling is predicted and compared with simulations.
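For context (this is the standard background form, not taken from the paper itself), the conventional Family–Vicsek dynamic scaling of the interface width, which the generic ansatz described above generalizes to anomalous forms, can be written as:

```latex
% Family--Vicsek dynamic scaling of the interface width W(L,t);
% the paper's generic ansatz relaxes this single-exponent form to
% admit anomalous local/global roughness exponents.
W(L,t) = L^{\alpha}\, f\!\left(\frac{t}{L^{z}}\right),
\qquad
f(u) \sim
\begin{cases}
u^{\beta}, & u \ll 1,\\[2pt]
\mathrm{const}, & u \gg 1,
\end{cases}
\qquad \beta = \alpha / z .
```

Here \(\alpha\) is the roughness exponent, \(z\) the dynamic exponent, and \(\beta\) the growth exponent; the anomalous scaling classes mentioned in the abstract arise when the local and global roughness exponents decouple.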

  16. A scaling theory for linear systems

    NASA Technical Reports Server (NTRS)

    Brockett, R. W.; Krishnaprasad, P. S.

    1980-01-01

    A theory of scaling for rational (transfer) functions in terms of transformation groups is developed. Two different four-parameter scaling groups which play natural roles in studying linear systems are identified, and the effect of scaling on Fisher information and related statistical measures in system identification is studied. The scalings considered include change of time scale, feedback, exponential scaling, magnitude scaling, etc. The scaling action of the groups studied is tied to the geometry of transfer functions in a rather strong way, as becomes apparent in the examination of the invariants of scaling. As a result, the scaling process also provides new insight into the parameterization question for rational functions.

  17. Fluid dynamics: Swimming across scales

    NASA Astrophysics Data System (ADS)

    Baumgart, Johannes; Friedrich, Benjamin M.

    2014-10-01

    The myriad creatures that inhabit the waters of our planet all swim using different mechanisms. Now, a simple relation links key physical observables of underwater locomotion, on scales ranging from millimetres to tens of metres.

  18. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  19. Scaling behavior of threshold epidemics

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Krapivsky, P. L.

    2012-05-01

    We study the classic Susceptible-Infected-Recovered (SIR) model for the spread of an infectious disease. In this stochastic process, there are two competing mechanisms: infection and recovery. Susceptible individuals may contract the disease from infected individuals, while infected ones recover from the disease at a constant rate and are never infected again. Our focus is the behavior at the epidemic threshold, where the rates of the infection and recovery processes balance. In the infinite population limit, we analytically establish scaling rules for the time-dependent distribution functions that characterize the sizes of the infected and the recovered sub-populations. Using heuristic arguments, we also obtain scaling laws for the size and duration of the epidemic outbreaks as a function of the total population. We perform numerical simulations to verify the scaling predictions and discuss the consequences of these scaling laws for near-threshold epidemic outbreaks.
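The stochastic SIR process described above can be simulated directly. A minimal Gillespie-style sketch at the epidemic threshold (infection and recovery rates balanced), illustrating the finite but broadly distributed outbreaks the paper analyzes; this is an illustrative toy, not the paper's numerics:

```python
import random

def gillespie_sir(n, i0, beta, gamma, seed=0):
    """One stochastic SIR trajectory; returns the final outbreak size
    (total ever infected, i.e. the recovered count once I reaches 0)."""
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    while i > 0:
        rate_inf = beta * s * i / n  # infection: S + I -> 2I
        rate_rec = gamma * i         # recovery:  I -> R
        # Choose the next event in proportion to its rate.
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return r

# At the threshold (beta == gamma) outbreaks die out, but their sizes
# are broadly distributed rather than sharply peaked.
sizes = [gillespie_sir(n=1000, i0=1, beta=1.0, gamma=1.0, seed=k) for k in range(200)]
print(f"mean outbreak size: {sum(sizes) / len(sizes):.1f}, max: {max(sizes)}")
```

Only event order matters for the final size, so waiting times are omitted; a full Gillespie implementation would also draw exponential inter-event times.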

  20. Constructing cities, deconstructing scaling laws.

    PubMed

    Arcaute, Elsa; Hatna, Erez; Ferguson, Peter; Youn, Hyejin; Johansson, Anders; Batty, Michael

    2015-01-06

    Cities can be characterized and modelled through different urban measures. Consistency within these observables is crucial in order to advance towards a science of cities. Bettencourt et al. have proposed that many of these urban measures can be predicted through universal scaling laws. We develop a framework to consistently define cities, using commuting to work and population density thresholds, and construct thousands of realizations of systems of cities with different boundaries for England and Wales. These serve as a laboratory for the scaling analysis of a large set of urban indicators. The analysis shows that population size alone does not provide enough information to describe or predict the state of a city as previously proposed, indicating that the expected scaling laws are not corroborated. We found that most urban indicators scale linearly with city size, regardless of the definition of the urban boundaries. However, when nonlinear correlations are present, the exponent fluctuates considerably.
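Urban scaling analyses of this kind estimate the exponent in Y = a·N^beta by ordinary least squares on logarithms. A self-contained sketch on synthetic data (the exponent 1.15, noise level, and population range are made up for illustration, not taken from the paper):

```python
import math
import random

def fit_scaling_exponent(populations, indicators):
    """OLS slope of log(Y) on log(N): the exponent beta in Y = a * N**beta."""
    xs = [math.log(n) for n in populations]
    ys = [math.log(y) for y in indicators]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic "cities": Y = 2 * N**1.15 with multiplicative log-normal noise.
rng = random.Random(0)
pops = [10 ** rng.uniform(4, 7) for _ in range(500)]
inds = [2 * n ** 1.15 * math.exp(rng.gauss(0, 0.1)) for n in pops]
print(f"estimated scaling exponent: {fit_scaling_exponent(pops, inds):.3f}")
```

The paper's point is precisely that such fitted exponents depend on how city boundaries are drawn: refitting after regrouping the same underlying units into different "cities" can move beta substantially.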

  1. Gallium Arsenide wafer scale integration

    NASA Astrophysics Data System (ADS)

    McDonald, J. F.; Taylor, G.; Steinvorth, R.; Donlan, B.; Bergendahl, A. S.

    1985-08-01

    Gallium Arsenide (GaAs) digital MESFET technology has recently begun to appear in the semiconductor marketplace. The initial commercial offerings are at the small to medium scale integration levels. The high speed of these parts would seem to be very attractive for designers of high performance signal processing equipment. Persistent yield problems, however, have prevented the appearance of large scale integrated circuits. As a result, intrapackage and interpackage signal propagation problems such as coupling, parasitics and delay are likely to negate much of the benefits of the fast MESFET logic devices for large systems constructed with such small scale building blocks. An early packaging concept, Wafer Scale Integration (WSI), which could possibly be used to address some of these limitations is reexamined.

  2. Pilot Scale Advanced Fogging Demonstration

    SciTech Connect

    Demmer, Rick L.; Fox, Don T.; Archibald, Kip E.

    2015-01-01

    Experiments in 2006 developed a useful fog solution using three different chemical constituents. Optimization of the fog recipe and use of commercially available equipment were identified as needs that had not been addressed. During 2012 development work it was noted that low concentrations of the components hampered coverage and drying in the United Kingdom’s National Nuclear Laboratory’s testing much more so than was evident in the 2006 tests. In fiscal year 2014 the Idaho National Laboratory undertook a systematic optimization of the fogging formulation and conducted a non-radioactive, pilot scale demonstration using commercially available fogging equipment. While not as sophisticated as the equipment used in earlier testing, the new approach is much less expensive and readily available for smaller scale operations. Pilot scale testing was important to validate new equipment of an appropriate scale, optimize the chemistry of the fogging solution, and to realize the conceptual approach.

  3. Physical capability scale: psychometric testing.

    PubMed

    Resnick, Barbara; Boltz, Marie; Galik, Elizabeth; Wells, Chris

    2013-02-01

    The purpose of this study was to describe the psychometric testing of the Basic Physical Capability Scale. The study was a secondary data analysis of combined data sets from three studies. Study participants included 93 older adults, recruited from 2 acute-care settings and 110 older adults living in long-term care facilities. Rasch analysis was used for the testing of the measurement model. There was some support for construct validity based on the fit of the items to the scale across both samples. In addition, there was support for hypothesis testing as physical function was significantly associated with physical capability. There was evidence for internal consistency (Alpha coefficients of .77-.83) and interrater reliability based on an intraclass correlation of .81. This study provided preliminary support for the reliability and validity of the Basic Physical Capability Scale, and guidance for scale revisions and continued use.

  4. Scaling of graphene integrated circuits

    NASA Astrophysics Data System (ADS)

    Bianchi, Massimiliano; Guerriero, Erica; Fiocco, Marco; Alberti, Ruggero; Polloni, Laura; Behnam, Ashkan; Carrion, Enrique A.; Pop, Eric; Sordan, Roman

    2015-04-01

    The influence of transistor size reduction (scaling) on the speed of realistic multi-stage integrated circuits (ICs) represents the main performance metric of a given transistor technology. Despite extensive interest in graphene electronics, scaling efforts have so far focused on individual transistors rather than multi-stage ICs. Here we study the scaling of graphene ICs based on transistors from 3.3 to 0.5 μm gate lengths and with different channel widths, access lengths, and lead thicknesses. The shortest gate delay of 31 ps per stage was obtained in sub-micron graphene ROs oscillating at 4.3 GHz, which is the highest oscillation frequency obtained in any strictly low-dimensional material to date. We also derived the fundamental Johnson limit, showing that scaled graphene ICs could be used at high frequencies in applications with small voltage swing.

  5. Interspecies Scaling in Blast Neurotrauma

    DTIC Science & Technology

    2015-08-27

    in vivo animal model research, and the effects of interspecies scaling on current and future in vivo animal model experimentation for blast trauma...and gut. To improve FE modeling capabilities, brain tissue mechanics in common blast TBI animal model species were investigated experimentally and...importance of interspecies scaling for investigation of blast neurotrauma. This work looks at existing in vivo animal model data to derive appropriate

  6. Two-Dimensional Vernier Scale

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.

    1992-01-01

    Modified vernier scale gives accurate two-dimensional coordinates from maps, drawings, or cathode-ray-tube displays. Movable circular overlay rests on fixed rectangular-grid overlay. Pitch of circles nine-tenths that of grid and, for greatest accuracy, radii of circles large compared with pitch of grid. Scale enables user to interpolate between finest divisions of regularly spaced rule simply by observing which mark on auxiliary vernier rule aligns with mark on primary rule.
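The interpolation principle behind such a vernier can be sketched numerically: with an auxiliary pitch of (n-1)/n of the main pitch (nine-tenths here, so n = 10), the index of the aligned auxiliary mark gives the fraction of a main division. A toy illustration (the function name and numbers are ours, purely hypothetical):

```python
def read_vernier(main_reading: float, aligned_mark: int,
                 pitch: float = 1.0, n: int = 10) -> float:
    """Main-scale reading plus the fraction indicated by the aligned vernier mark.

    With vernier pitch (n-1)/n of the main pitch, the k-th vernier mark
    coincides when the true position is k/n of a division past the main mark.
    """
    return main_reading + aligned_mark * pitch / n

# mark 3 aligned -> three-tenths of a division past main-scale reading 5
print(read_vernier(5.0, 3))  # ~5.3
```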

  7. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Full-Scale Tunnel (FST). Construction of balance housing. Smith DeFrance noted the need for this housing in his NACA TR No. 459: 'The entire floating frame and scale assembly is enclosed in a room for protection from air currents and the supporting struts are shielded by streamlined fairings which are secured to the roof of the balance room and free from the balance.'

  8. Inflation in the scaling limit

    SciTech Connect

    Matarrese, S.; Ortolan, A.; Lucchin, F.

    1989-07-15

    We investigate the stochastic dynamics of the inflaton for a wide class of potentials leading either to chaotic or to power-law inflation. At late times the system enters a scaling regime where macroscopic order sets in: the field distribution sharply peaks around the classical slow-rollover configuration and curvature perturbations originate with non-Gaussian scale-invariant statistics.

  9. Fundamental Scaling Laws in Nanophotonics

    PubMed Central

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J.

    2016-01-01

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance, and unanticipated applications. This hypothesis of “smaller-is-better” has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro and nanometer length-scale. We show that optoelectronic device performance scales non-monotonically with device length due to the various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length-scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index-modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors. PMID:27869159

  10. Fundamental Scaling Laws in Nanophotonics

    NASA Astrophysics Data System (ADS)

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J.

    2016-11-01

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance, and unanticipated applications. This hypothesis of “smaller-is-better” has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro and nanometer length-scale. We show that optoelectronic device performance scales non-monotonically with device length due to the various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length-scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index-modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors.

  11. Distributional Scaling in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Polsinelli, J. F.

    2015-12-01

    An investigation is undertaken into the fractal scaling properties of the piezometric head in a heterogeneous unconfined aquifer. The governing equations for the unconfined flow are derived from conservation of mass and the Darcy law. The Dupuit approximation will be used to model the dynamics. The spatially varying nature of the tendency to conduct flow (i.e., the hydraulic conductivity) is represented as a stochastic process. Experimental studies in the literature have indicated that the conductivity belongs to a class of non-stationary stochastic fields, called H-ss fields. The uncertainty in the soil parameters is imparted onto the flow variables; in groundwater investigations the potentiometric head will be a random function. The structure of the head field will be analyzed with an emphasis on the scaling properties. The scaling scheme for the modeling equations and the simulation procedure for the saturated hydraulic conductivity process will be explained, then the method will be validated through numerical experimentation using the USGS Modflow-2005 software. The results of the numerical simulations demonstrate that the head will exhibit multi-fractal scaling if the hydraulic conductivity exhibits multi-fractal scaling and the differential equations for the groundwater equation satisfy a particular set of scale invariance conditions.

  12. Fundamental Scaling Laws in Nanophotonics.

    PubMed

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J

    2016-11-21

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance, and unanticipated applications. This hypothesis of "smaller-is-better" has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro and nanometer length-scale. We show that optoelectronic device performance scales non-monotonically with device length due to the various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length-scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index-modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors.

  13. Chemical Measurement and Fluctuation Scaling.

    PubMed

    Hanley, Quentin S

    2016-12-20

    Fluctuation scaling reports on all processes producing a data set. Some fluctuation scaling relationships, such as the Horwitz curve, follow exponential dispersion models which have useful properties. The mean-variance method applied to Poisson distributed data is a special case of these properties allowing the gain of a system to be measured. Here, a general method is described for investigating gain (G), dispersion (β), and process (α) in any system whose fluctuation scaling follows a simple exponential dispersion model, a segmented exponential dispersion model, or complex scaling following such a model locally. When gain and dispersion cannot be obtained directly, relative parameters, GR and βR, may be used. The method was demonstrated on data sets conforming to simple, segmented, and complex scaling. These included mass, fluorescence intensity, and absorbance measurements and specifications for classes of calibration weights. Changes in gain, dispersion, and process were observed in the scaling of these data sets in response to instrument parameters, photon fluxes, mathematical processing, and calibration weight class. The process parameter which limits the type of statistical process that can be invoked to explain a data set typically exhibited 0 < α < 1, with α > 4 possible. With two exceptions, calibration weight class definitions only affected β. Adjusting photomultiplier voltage while measuring fluorescence intensity changed all three parameters (0 < α < 0.8; 0 < βR < 3; 0 < GR < 4.1). The method provides a framework for calibrating and interpreting uncertainty in chemical measurement allowing robust comparison of specific instruments, conditions, and methods.
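The mean-variance special case described above can be illustrated with synthetic data: for detector output that is a Poisson count scaled by a gain G, variance is proportional to the mean, so a log-log variance-mean fit recovers α = 1 and the gain from the intercept. A minimal sketch (simulated data only, not the paper's method or datasets):

```python
import numpy as np

rng = np.random.default_rng(0)
gain = 2.5                          # assumed detector gain (counts -> output units)
true_means = [10, 50, 200, 1000]    # hypothetical photon-count levels

# detector output = gain * Poisson counts, so var(output) = gain * mean(output)
samples = [gain * rng.poisson(m, 20000) for m in true_means]
m = np.array([s.mean() for s in samples])
v = np.array([s.var() for s in samples])

# fit log(var) = alpha * log(mean) + log(G): slope is alpha, intercept gives G
alpha, log_g = np.polyfit(np.log(m), np.log(v), 1)
print(f"alpha ~ {alpha:.3f}, recovered gain ~ {np.exp(log_g):.2f}")
```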

  14. Mineral Dissolution Rates at the Pore Scale: Scaling Effects

    NASA Astrophysics Data System (ADS)

    Li, L.; Steefel, C. I.; Yang, L.

    2006-12-01

    Mineral dissolution reactions play an important role in various physical, chemical and biological processes in nature. Although rates of these reactions have been extensively studied in laboratories, they have been found to be orders of magnitude faster than those measured in natural systems. This work examines some of the mechanisms that can produce such a discrepancy at the pore scale, while quantifying the conditions under which the discrepancy becomes significant. This work used the reactive transport model CrunchFlow to examine the dissolution rates of three minerals, calcite, labradorite, and iron hydroxide, in a single pore. Pores were assumed to be cylindrical, with axisymmetric flow given by the analytical solution for Poiseuille flow in a cylinder. Mineral dissolution occurs only at the pore wall, with the reactive surface area of the dissolving phase specified geometrically. The average dissolution rates in the pore (R_D) for various flow velocities are determined by the flux-weighted change in concentration over the length of the pore and are compared to the rates that assume complete mixing (R_M). The differences in rates between the two models, quantified by the ratio of R_D over R_M, provide a measure of the scaling effect. The modeling results were validated by a microfluidic reactive flow experiment using a cylindrical pore in calcite. Modeling results show that the scaling effect arises due to the development of large concentration gradients caused by incomplete mixing within a pore when transport and reaction rates are comparable. The magnitude of the scaling effect depends on the reaction kinetics, flow velocity, and pore size. For labradorite and iron hydroxide, the scaling effect is negligible under all conditions due to their slow dissolution rates, thus limiting the development of any intra-pore concentration gradients. For calcite dissolution at low (smaller than 0.1 cm/s) and high (larger than 1000 cm/s) flow velocities the scaling

  15. The Menopause Rating Scale (MRS) scale: A methodological review

    PubMed Central

    Heinemann, Klaas; Ruebig, Alexander; Potthoff, Peter; Schneider, Hermann PG; Strelow, Frank; Heinemann, Lothar AJ; Thai, Do Minh

    2004-01-01

    Background This paper compiles data from different sources to provide a first comprehensive picture of the psychometric and other methodological characteristics of the Menopause Rating Scale (MRS). The scale was designed and standardized as a self-administered scale to (a) assess symptoms/complaints of aging women under different conditions, (b) evaluate the severity of symptoms over time, and (c) measure changes pre- and post-menopause replacement therapy. The scale has become widely used (it is available in 10 languages). Method A large multinational survey (9 countries on 4 continents) from 2001/2002 is the basis for in-depth analyses of the reliability and validity of the MRS. Additional small convenience samples were used to get first impressions of test-retest reliability. The data were centrally analyzed. Data from a postmarketing HRT study were used to estimate discriminative validity. Results Reliability measures (consistency and test-retest stability) were found to be good across countries, although the sample size for test-retest reliability was small. Validity: The internal structure of the MRS was sufficiently similar across countries to conclude that the scale really measures the same phenomenon in symptomatic women. Correlations between the sub-scores and the total score were high (0.7–0.9) but lower among the sub-scales (0.5–0.7), suggesting that the sub-scales are not fully independent. Norm values from different populations were presented, showing that a direct comparison between Europe and North America is possible, but caution is recommended when comparing data from Latin America and Indonesia. This will not, however, affect intra-individual comparisons within clinical trials. The comparison with the Kupperman Index showed sufficiently good correlations, illustrating good criterion-oriented validity. The same is true for the comparison with the generic quality-of-life scale SF-36, where a sufficiently close association has also been shown.

  16. Une version franco-canadienne de la Physiotherapy Evidence Database (PEDro) Scale : L'Échelle PEDro

    PubMed Central

    Laroche, Chantal; Sutton, Anne; Guitard, Paulette; King, Judy; Poitras, Stéphane; Casimiro, Lynn; Tremblay, Manon; Cardinal, Dominique; Cavallo, Sabrina; Laferrière, Lucie; Grisé, Isabelle; Marshall, Lisa; Smith, Jacky R.; Lagacé, Josée; Pharand, Denyse; Galipeau, Roseline; Toupin-April, Karine; Loew, Laurianne; Demers, Catrine; Sauvé-Schenk, Katrine; Paquet, Nicole; Savard, Jacinthe; Tourigny, Jocelyne; Vaillancourt, Véronique

    2015-01-01

    ABSTRACT Purpose: To produce a Canadian French translation of the PEDro scale, under the proposed name Échelle PEDro, and to examine its content validity. Methods: We used a modified version of Vallerand's methodology for cross-cultural validation. A parallel back-translation of the PEDro scale was first carried out by both professional translators and clinician researchers. A first expert panel (P1) then examined the translated versions and created the first experimental version of the Échelle PEDro. This version was evaluated by a second expert panel (P2). Finally, 32 clinical researchers rated this second experimental version of the Échelle PEDro on a 5-point clarity scale and proposed the final modifications. Results: Mean clarity ratings for the items of the final version of the Échelle PEDro were high, ranging from 4.0 to 4.7 out of a maximum score of 5 points. Conclusion: The four rigorous steps of this process produced a valid Canadian French version of the Échelle PEDro. PMID:26839449

  17. Scaling of extreme rainfall areas at a planetary scale.

    PubMed

    Devineni, Naresh; Lall, Upmanu; Xi, Chen; Ward, Philip

    2015-07-01

    Event magnitude and area scaling relationships for rainfall over different regions of the world have been presented in the literature for relatively short durations and over relatively small areas. In this paper, we present the first ever results on a global analysis of the scaling characteristics of extreme rainfall areas for durations ranging from 1 to 30 days. Broken power law models are fit in each case. Past work has focused largely on the time and space scales associated with local and regional convection. The work presented here suggests that power law scaling may also apply to planetary-scale phenomena, such as frontal and monsoonal systems, and their interaction with local moisture recycling. Such features may have persistence over large areas corresponding to extreme rain and regional flood events. As a result, they lead to considerable hazard exposure. A caveat is that methods used for empirical power law identification have difficulties with edge effects due to finite domains. This leads to problems with robust model identification and interpretability of the underlying relationships. We use recent algorithms that aim to address some of these issues in a principled way. Theoretical research that could explain why such results may emerge across the world, as analyzed for the first time in this paper, is needed.

  18. Data-Driven Scale Extrapolation: Application on Continental Scale

    NASA Astrophysics Data System (ADS)

    Gong, L.

    2014-12-01

    Large-scale hydrological models and land surface models are so far the only tools for assessing current and future water resources. Those models estimate discharge with large uncertainties, due to the complex interaction between climate and hydrology, the limited availability and quality of data, as well as model uncertainties. A new purely data-driven scale-extrapolation method has been developed to estimate discharge for a large region solely from selected small sub-basins, which are typically 1-2 orders of magnitude smaller than the large region. When tested in the Baltic Sea drainage basin, the method was able to provide accurate discharge estimation for the gauged area with sub-basins that cover 5% of the gauged area. There exist multiple sets of sub-basins whose climate and hydrology resemble those of the gauged area equally well. Those multiple sets estimate annual discharge for the gauged area consistently well, with 6% average error. The scale-extrapolation method is completely data-driven; therefore it does not force any modelling error into the prediction. The scale-extrapolation method is now being further tested at continental scale in Europe and North America to examine its potential for climate change studies.

  19. Feasibility of scaling from pilot to process scale.

    PubMed

    Ignatova, Svetlana; Wood, Philip; Hawes, David; Janaway, Lee; Keay, David; Sutherland, Ian

    2007-06-01

    The pharmaceutical industry is looking for new technology that is easy to scale up from analytical to process scale and is cheap and reliable to operate. Large scale counter-current chromatography is an emerging technology that could provide this advance, but little was known about the key variables affecting scale-up. This paper investigates two such variables: the rotor radius and the tubing bore. The effect of rotor radius was studied using coils of identical length, beta-value, helix angle and tubing bore on rotors of different radii (50 mm, 110 mm and 300 mm). The effect of bore was studied using coils of identical length, helix angle and mean beta-value on the Maxi-DE centrifuge (R=300 mm). The rotor radius results show that there is very little difference in retention and resolution as rotor radius increases at constant bore. The tubing bore results show that good retention is maintained as bore increases and resolution decreases only slightly, but at the highest bore (17.5 mm) resolution can be maintained at very high flow rates, making it possible for process scale centrifuges to be designed with throughputs exceeding 25 kg/day.

  20. Validation of the breathlessness, cough and sputum scale to predict COPD exacerbation

    PubMed Central

    DeVries, Rebecca; Kriebel, David; Sama, Susan

    2016-01-01

    The breathlessness, cough and sputum scale (BCSS) is a three-item questionnaire rating breathlessness, cough and sputum on a 5-point Likert scale from 0 (no symptoms) to 4 (severe symptoms). Researchers have explored the utility of this tool to quantify the efficacy of treatment following a chronic obstructive pulmonary disease (COPD) exacerbation; however, little work has been done to investigate the ability of the BCSS to predict COPD exacerbation. As part of a prospective case-crossover study among a cohort of 168 COPD patients residing in central Massachusetts, patients were asked standard BCSS questions during exacerbation and during randomly identified non-exacerbation (or healthy) weeks. We found that the BCSS was strongly associated with COPD exacerbation (OR=2.80, 95% CI=2.27–3.45) and that a BCSS sum score of 5.0 identified COPD exacerbation with 83% sensitivity and 68% specificity. These results may be useful in the clinical setting to expedite treatment of exacerbations. PMID:27906157
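The reported cutoff can be illustrated with the usual sensitivity/specificity counts: the BCSS sum (three items, each 0-4, so sums range 0-12) flags a week when it reaches 5.0 or more. A toy sketch with hypothetical scores (not the study's data):

```python
# hypothetical BCSS sum scores for exacerbation and healthy weeks
exacerbation = [6, 9, 5, 7, 4, 8, 10, 5, 6, 3]
healthy      = [2, 4, 1, 5, 3, 0, 2, 6, 1, 3]

def sens_spec(cases, controls, cutoff=5.0):
    """Sensitivity/specificity of a sum-score cutoff (score >= cutoff flags a case)."""
    sens = sum(s >= cutoff for s in cases) / len(cases)
    spec = sum(s < cutoff for s in controls) / len(controls)
    return sens, spec

sens, spec = sens_spec(exacerbation, healthy)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")  # 80% / 80% on this toy data
```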

  1. Copper atomic-scale transistors

    PubMed Central

    Kavalenka, Maryna N; Röger, Moritz; Albrecht, Daniel; Hölscher, Hendrik; Leuthold, Jürgen

    2017-01-01

    We investigated copper as a working material for metallic atomic-scale transistors and confirmed that copper atomic-scale transistors can be fabricated and operated electrochemically in a copper electrolyte (CuSO4 + H2SO4) in bi-distilled water under ambient conditions with three microelectrodes (source, drain and gate). The electrochemical switching-on potential of the atomic-scale transistor is below 350 mV, and the switching-off potential is between 0 and −170 mV. The switching-on current is above 1 μA, which is compatible with semiconductor transistor devices. Both sign and amplitude of the voltage applied across the source and drain electrodes (U_bias) influence the switching rate of the transistor and the copper deposition on the electrodes, and correspondingly shift the electrochemical operation potential. The copper atomic-scale transistors can be switched using a function generator without a computer-controlled feedback switching mechanism. The copper atomic-scale transistors, with only one or two atoms at the narrowest constriction, were realized to switch between 0 and 1 G0 (G0 = 2e²/h, with e the electron charge and h Planck’s constant) or 2 G0 by the function generator. The switching rate can reach up to 10 Hz. The copper atomic-scale transistor demonstrates volatile/non-volatile dual functionalities. Such an optimal merging of the logic with memory may open a perspective for processor-in-memory and logic-in-memory architectures, using copper as an alternative working material besides silver for fully metallic atomic-scale transistors. PMID:28382242
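The conductance quantum G0 = 2e²/h cited above is fixed by the exact 2019 SI values of the elementary charge and the Planck constant; a quick numerical check:

```python
E = 1.602176634e-19   # elementary charge, C (exact since the 2019 SI redefinition)
H = 6.62607015e-34    # Planck constant, J·s (exact since the 2019 SI redefinition)

G0 = 2 * E**2 / H     # conductance quantum
print(f"G0 = {G0:.6e} S  (1/G0 ~ {1 / G0 / 1e3:.3f} kOhm)")  # ~7.748e-5 S, ~12.906 kOhm
```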

  2. The Internet Gaming Disorder Scale.

    PubMed

    Lemmens, Jeroen S; Valkenburg, Patti M; Gentile, Douglas A

    2015-06-01

    Recently, the American Psychiatric Association included Internet gaming disorder (IGD) in the appendix of the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). The main aim of the current study was to test the reliability and validity of 4 survey instruments to measure IGD on the basis of the 9 criteria from the DSM-5: a long (27-item) and short (9-item) polytomous scale and a long (27-item) and short (9-item) dichotomous scale. The psychometric properties of these scales were tested among a representative sample of 2,444 Dutch adolescents and adults, ages 13-40 years. Confirmatory factor analyses demonstrated that the structural validity (i.e., the dimensional structure) of all scales was satisfactory. Both types of assessment (polytomous and dichotomous) were also reliable (i.e., internally consistent) and showed good criterion-related validity, as indicated by positive correlations with time spent playing games, loneliness, and aggression and negative correlations with self-esteem, prosocial behavior, and life satisfaction. The dichotomous 9-item IGD scale showed solid psychometric properties and was the most practical scale for diagnostic purposes. Latent class analysis of this dichotomous scale indicated that 3 groups could be discerned: normal gamers, risky gamers, and disordered gamers. On the basis of the number of people in this last group, the prevalence of IGD among 13- through 40-year-olds in the Netherlands is approximately 4%. If the DSM-5 threshold for diagnosis (experiencing 5 or more criteria) is applied, the prevalence of disordered gamers is more than 5%.
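The DSM-5 threshold applied above (endorsing 5 or more of the 9 dichotomous criteria) is straightforward to encode; a minimal sketch (the function name is ours):

```python
def igd_positive(criteria) -> bool:
    """True if 5 or more of the 9 dichotomous DSM-5 IGD criteria are endorsed."""
    if len(criteria) != 9:
        raise ValueError("expected 9 criteria")
    return sum(bool(c) for c in criteria) >= 5

print(igd_positive([1, 1, 1, 1, 1, 0, 0, 0, 0]))  # 5 endorsed -> True
print(igd_positive([1, 1, 1, 1, 0, 0, 0, 0, 0]))  # 4 endorsed -> False
```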

  3. SETI and astrobiology: The Rio Scale and the London Scale

    NASA Astrophysics Data System (ADS)

    Almár, Iván

    2011-11-01

    The public reaction to a discovery, the character of the corresponding risk communication, as well as the possible impact on science and society all depend on the character of the phenomenon discovered, on the method of discovery, on the distance to the phenomenon and, last but not least, on the reliability of the announcement itself. The Rio Scale - proposed together with Jill Tarter just a decade ago at an IAA symposium in Rio de Janeiro - attempts to quantify the relative importance of such a “low probability, high consequence event”, namely the announcement of an ETI discovery. After the publication of the book “The Eerie Silence” by Paul Davies it is necessary to examine how the recently suggested possible “technosignatures” or “technomarkers” mentioned in this book could be evaluated on the Rio Scale. The new London Scale, proposed at the Royal Society meeting in January 2010 in London, is a similar attempt to quantify the impact of an announcement regarding the discovery of ET life on an analogous ordinal scale between zero and ten. Here again the new concept of a “shadow biosphere” raised in this book deserves special attention, since a “weird form of life” found on Earth would not necessarily have an extraterrestrial origin; nevertheless, it might be an important discovery in itself. Several arguments are presented that the methods, aims and targets of the “search for ET life” and the “search for ET intelligence” are now converging. This raises the new question of whether a unification of these two scales is necessary as a consequence of the convergence of the two subjects. Finally, it is suggested that experts in social sciences should take the structure of the respective scales into consideration when investigating, case by case, the possible effects of such discoveries on society.

  4. On the scaling of small-scale jet noise to large scale

    NASA Astrophysics Data System (ADS)

    Soderman, Paul T.; Allen, Christopher S.

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or perceived noise level (PNL) noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 × 10^6 based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  5. On the scaling of small-scale jet noise to large scale

    NASA Astrophysics Data System (ADS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-05-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or perceived noise level (PNL) noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 x 10(exp 6) based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  6. On the scaling of small-scale jet noise to large scale

    NASA Technical Reports Server (NTRS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-01-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or perceived noise level (PNL) noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 × 10^6 based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  8. Development of the Chinese version of the Oro-facial Esthetic Scale.

    PubMed

    Zhao, Y; He, S L

    2013-09-01

    The aim of this study was to investigate the psychometric properties of the Oro-facial Esthetic Scale among Chinese-speaking patients. The original Oro-facial Esthetic Scale was cross-culturally adapted in accordance with the international standards to develop a Chinese version (OES-C). Unlike the original Oro-facial Esthetic Scale, the version employed in this study used a 5-point Likert scale with items rated from unsatisfactory to most satisfactory. Psychometric evaluation included the reliability and validity of the OES-C. The reliability of the OES-C was determined through internal consistency and test-retest methods. The validity of OES-C was analysed by content validity, discriminative validity, construct validity and convergent validity. The corrected item-total correlation coefficients of the OES-C ranged from 0.859 to 0.910. The inter-item correlation coefficients between each two of the eight items of the OES-C ranged from 0.766 to 0.922. The values of ICC ranged from 0.79 (95% CI = 0.54-0.98) to 0.93 (95% CI = 0.87-0.99), indicating an excellent agreement. Construct validity was proved by the presence of one-factor structure that accounted for 83.507% of the variance and fitted well into the model. Convergent validity was confirmed by the association between OES-C scores and self-reported oral aesthetics and three questions from the Oral Health Impact Profile related to aesthetics (correlation coefficients ranged from -0.830 to -0.702, P < 0.001). OES-C scores discriminated aesthetically impaired patients from healthy controls. This study provides preliminary evidence concerning the reliability and validity of the OES-C. The results show that the OES-C may be a useful tool for assessment of oro-facial esthetics in China.

  9. 30 CFR 57.3202 - Scaling tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Scaling tools. 57.3202 Section 57.3202 Mineral... HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Ground Control Scaling and Support-Surface and Underground § 57.3202 Scaling tools. Where manual scaling is performed, a scaling...

  10. 30 CFR 56.3202 - Scaling tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Scaling tools. 56.3202 Section 56.3202 Mineral... HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Ground Control Scaling and Support § 56.3202 Scaling tools. Where manual scaling is performed, a scaling bar shall be provided. This...

  11. Galaxy clustering on large scales.

    PubMed Central

    Efstathiou, G

    1993-01-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h^-1 Mpc, where the Hubble constant H0 = 100h km s^-1 Mpc^-1; 1 pc = 3.09 x 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Gamma = Omega h ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400
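
The two-point correlation function central to the abstract above measures the excess probability, relative to a random (Poisson) distribution, of finding a pair of galaxies at a given separation. A toy sketch using the simple Peebles-Hauser estimator xi(r) = DD(r)/RR(r) - 1 on 2-D points (real surveys use sky geometry and better estimators such as Landy-Szalay; every value here is illustrative, not from the paper):

```python
# Toy two-point correlation estimate: compare pair counts in the data
# (DD) against pair counts in a random catalogue (RR) in one separation
# bin. For unclustered data, xi should be close to zero.
import random
from itertools import combinations

def pair_separations(points):
    """All pairwise 2-D Euclidean separations."""
    return [((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
            for (ax, ay), (bx, by) in combinations(points, 2)]

def xi_estimate(data, randoms, r_lo, r_hi):
    """Peebles-Hauser estimator DD/RR - 1 in one bin, counts normalised."""
    dd = sum(r_lo <= s < r_hi for s in pair_separations(data))
    rr = sum(r_lo <= s < r_hi for s in pair_separations(randoms))
    n_d, n_r = len(data), len(randoms)
    dd_norm = dd / (n_d * (n_d - 1) / 2)
    rr_norm = rr / (n_r * (n_r - 1) / 2)
    return dd_norm / rr_norm - 1.0

rng = random.Random(42)
# Unclustered (Poisson) points in the unit square:
data = [(rng.random(), rng.random()) for _ in range(400)]
rand = [(rng.random(), rng.random()) for _ in range(400)]
print(f"xi(0.1-0.2) = {xi_estimate(data, rand, 0.1, 0.2):+.3f}")
```

Because both catalogues here are uniform random, the estimate scatters around zero; clustered data would give xi > 0 at small separations.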

  12. Multi-scale brain networks.

    PubMed

    Betzel, Richard F; Bassett, Danielle S

    2016-11-11

    The network architecture of the human brain has become a feature of increasing interest to the neuroscientific community, largely because of its potential to illuminate human cognition, its variation over development and aging, and its alteration in disease or injury. Traditional tools and approaches to study this architecture have largely focused on single scales: topology, time, and space. Expanding beyond this narrow view, we focus this review on pertinent questions and novel methodological advances for the multi-scale brain. We separate our exposition into content related to multi-scale topological structure, multi-scale temporal structure, and multi-scale spatial structure. In each case, we recount empirical evidence for such structures, survey network-based methodological approaches to reveal these structures, and outline current frontiers and open questions. Although predominantly peppered with examples from human neuroimaging, we hope that this account will offer an accessible guide to any neuroscientist aiming to measure, characterize, and understand the full richness of the brain's multiscale network structure, irrespective of species, imaging modality, or spatial resolution.

  13. Scales of Natural Flood Management

    NASA Astrophysics Data System (ADS)

    Nicholson, Alex; Quinn, Paul; Owen, Gareth; Hetherington, David; Piedra Lara, Miguel; O'Donnell, Greg

    2016-04-01

    The scientific field of Natural Flood Management (NFM) is receiving much attention and is now widely seen as a valid solution to sustainably manage flood risk whilst offering significant multiple benefits. However, few examples exist looking at NFM on a large scale (>10 km^2). Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. Although certain NFM interventions may appear distant and disconnected from main-stem waterbodies, they will undoubtedly be contributing to WFD objectives at the catchment waterbody scale. This paper offers examples of NFM and explains how their benefits can be maximised through practical design across many scales (from individual feature up to the whole catchment). New tools assist in the selection of measures and their locations, quantify the flooding benefit at the local catchment scale, and feed a Flood Impact Model that reflects the impacts of local changes further downstream. The tools are discussed in the context of our most recent experiences on NFM projects, including river catchments in the north east of England and in Scotland. This work has encouraged a more integrated approach to flood-management planning that can use both traditional and novel NFM strategies in an effective and convincing way.

  14. Scaling Effect In Trade Network

    NASA Astrophysics Data System (ADS)

    Konar, M.; Lin, X.; Rushforth, R.; Ruddell, B. L.; Reimer, J.

    2015-12-01

    Scaling is an important issue in the physical sciences. Economic trade is increasingly of interest to the scientific community due to the natural resources (e.g. water, carbon, nutrients, etc.) embodied in traded commodities. Trade refers to the spatial and temporal redistribution of commodities, and is typically measured annually between countries. However, commodity exchange networks occur at many different scales, though data availability at finer temporal and spatial resolution is rare. Exchange networks may prove an important adaptation measure to cope with future climate and economic shocks. As such, it is essential to understand how commodity exchange networks scale, so that we can understand opportunities and roadblocks to the spatial and temporal redistribution of goods and services. To this end, we present an empirical analysis of trade systems across three spatial scales: global, sub-national in the United States, and county-scale in the United States. We compare and contrast the network properties, the self-sufficiency ratio, and performance of the gravity model of trade for these three exchange systems.

  15. Allometric scaling in-vitro

    PubMed Central

    Ahluwalia, Arti

    2017-01-01

    About two decades ago, West and coworkers established a model which predicts that metabolic rate follows a three quarter power relationship with the mass of an organism, based on the premise that tissues are supplied nutrients through a fractal distribution network. Quarter power scaling is widely considered a universal law of biology and it is generally accepted that were in-vitro cultures to obey allometric metabolic scaling, they would have more predictive potential and could, for instance, provide a viable substitute for animals in research. This paper outlines a theoretical and computational framework for establishing quarter power scaling in three-dimensional spherical constructs in-vitro, starting where fractal distribution ends. Allometric scaling in non-vascular spherical tissue constructs was assessed using models of Michaelis-Menten oxygen consumption and diffusion. The models demonstrate that physiological scaling is maintained when about 5 to 60% of the construct is exposed to oxygen concentrations less than the Michaelis-Menten constant, with a significant concentration gradient in the sphere. The results have important implications for the design of downscaled in-vitro systems with physiological relevance. PMID:28169362
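
The quarter-power (Kleiber-type) law discussed in the abstract above says whole-body metabolic rate B scales as M^(3/4), so the mass-specific rate B/M falls off as M^(-1/4). A minimal numeric sketch; the prefactor b0 and the masses are arbitrary placeholders, not values from the paper:

```python
# Minimal illustration of quarter-power allometric scaling:
# B = b0 * M**(3/4), so B/M shrinks as mass grows.

def metabolic_rate(mass, b0=1.0, exponent=0.75):
    """Kleiber-type allometric law: B = b0 * M**exponent."""
    return b0 * mass ** exponent

for m in (0.01, 1.0, 100.0):   # arbitrary masses spanning four decades
    b = metabolic_rate(m)
    print(f"M={m:8.2f}  B={b:8.3f}  B/M={b / m:8.3f}")

# A 10,000-fold increase in mass yields only a 1,000-fold increase in
# metabolic rate, since (10**4)**0.75 == 10**3.
```

The same arithmetic is why downscaled in-vitro constructs must reproduce the M^(-1/4) drop in specific metabolic rate to be physiologically representative.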

  16. Featured Invention: Laser Scaling Device

    NASA Technical Reports Server (NTRS)

    Dunn, Carol Anne

    2008-01-01

    In September 2003, NASA signed a nonexclusive license agreement with Armor Forensics, a subsidiary of Armor Holdings, Inc., for the laser scaling device under the Innovative Partnerships Program. Coupled with a measuring program, also developed by NASA, the unit provides crime scene investigators with the ability to shoot photographs at scale without having to physically enter the scene, analyzing details such as bloodspatter patterns and graffiti. This ability keeps the scene's components intact and pristine for the collection of information and evidence. The laser scaling device elegantly solved a pressing problem for NASA's shuttle operations team and also provided industry with a useful tool. For NASA, the laser scaling device is still used to measure divots or damage to the shuttle's external tank and other structures around the launchpad. When the invention also met similar needs within industry, the Innovative Partnerships Program provided information to Armor Forensics for licensing and marketing the laser scaling device. Jeff Kohler, technology transfer agent at Kennedy, added, "We also invited a representative from the FBI's special photography unit to Kennedy to meet with Armor Forensics and the innovator. Eventually the FBI ended up purchasing some units. Armor Forensics is also beginning to receive interest from DoD [Department of Defense] for use in military crime scene investigations overseas."

  17. Visions of Atomic Scale Tomography

    SciTech Connect

    Kelly, T. F.; Miller, Michael K; Rajan, Krishna; Ringer, S. P.

    2012-01-01

    A microscope, by definition, provides structural and analytical information about objects that are too small to see with the unaided eye. From the very first microscope, efforts to improve its capabilities and push them to ever-finer length scales have been pursued. In this context, it would seem that the concept of an ultimate microscope would have received much attention by now; but has it really ever been defined? Human knowledge extends to structures on a scale much finer than atoms, so it might seem that a proton-scale microscope or a quark-scale microscope would be the ultimate. However, we argue that an atomic-scale microscope is the ultimate for the following reason: the smallest building block for either synthetic structures or natural structures is the atom. Indeed, humans and nature both engineer structures with atoms, not quarks. So far as we know, all building blocks (atoms) of a given type are identical; it is the assembly of the building blocks that makes a useful structure. Thus, would a microscope that determines the position and identity of every atom in a structure with high precision and for large volumes be the ultimate microscope? We argue, yes. In this article, we consider how it could be built, and we ponder the answer to the equally important follow-on questions: who would care if it is built, and what could be achieved with it?

  18. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Steam pile driver for foundation of Full-Scale Tunnel (FST). In 1924, George Lewis, Max Munk and Fred Weick began to discuss an idea for a wind tunnel large enough to test a full-scale propeller. Munk sketched out a design for a tunnel with a 20-foot test section. The rough sketches were presented to engineers at Langley for comment. Elliott Reid was especially enthusiastic and he wrote a memorandum in support of the proposed 'Giant Wind Tunnel.' At the end of the memorandum, he appended the recommendation that the tunnel test section should be increased to 30-feet diameter so as to allow full-scale testing of entire airplanes (not just propellers). Reid's idea for a full-scale tunnel excited many at Langley but the funds and support were not available in 1924. Nonetheless, Elliot Reid's idea would eventually become reality. In 1928, NACA engineers began making plans for a full-scale wind tunnel. In February 1929, Congress approved of the idea and appropriated $900,000 for construction. Located just a few feet from the Back River, pilings to support the massive building's foundation had to be driven deep into the earth. This work began in the spring of 1929 and cost $11,293.22

  19. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Pile driving for foundation of Full-Scale Tunnel (FST). In 1924, George Lewis, Max Munk and Fred Weick began to discuss an idea for a wind tunnel large enough to test a full-scale propeller. Munk sketched out a design for a tunnel with a 20-foot test section. The rough sketches were presented to engineers at Langley for comment. Elliott Reid was especially enthusiastic and he wrote a memorandum in support of the proposed 'Giant Wind Tunnel.' At the end of the memorandum, he appended the recommendation that the tunnel test section should be increased to 30-feet diameter so as to allow full-scale testing of entire airplanes (not just propellers). Reid's idea for a full-scale tunnel excited many at Langley but the funds and support were not available in 1924. Nonetheless, Elliot Reid's idea would eventually become reality. In 1928, NACA engineers began making plans for a full-scale wind tunnel. In February 1929, Congress approved of the idea and appropriated $900,000 for construction. Located just a few feet from the Back River, pilings to support the massive building's foundation had to be driven deep into the earth. This work began in the spring of 1929 and cost $11,293.22.

  20. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    General view of concrete column base for Full-Scale Tunnel (FST). In 1924, George Lewis, Max Munk and Fred Weick began to discuss an idea for a wind tunnel large enough to test a full-scale propeller. Munk sketched out a design for a tunnel with a 20-foot test section. The rough sketches were presented to engineers at Langley for comment. Elliott Reid was especially enthusiastic and he wrote a memorandum in support of the proposed 'Giant Wind Tunnel.' At the end of the memorandum, he appended the recommendation that the tunnel test section should be increased to 30-feet diameter so as to allow full-scale testing of entire airplanes (not just propellers). Reid's idea for a full-scale tunnel excited many at Langley but the funds and support were not available in 1924. Nonetheless, Elliot Reid's idea would eventually become reality. In 1928, NACA engineers began making plans for a full-scale wind tunnel. In February 1929, Congress approved of the idea and appropriated $900,000 for construction. Work on the foundation began in the spring of 1929 and cost $11,293.22.

  1. Allometric scaling in-vitro

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Arti

    2017-02-01

    About two decades ago, West and coworkers established a model which predicts that metabolic rate follows a three quarter power relationship with the mass of an organism, based on the premise that tissues are supplied nutrients through a fractal distribution network. Quarter power scaling is widely considered a universal law of biology and it is generally accepted that were in-vitro cultures to obey allometric metabolic scaling, they would have more predictive potential and could, for instance, provide a viable substitute for animals in research. This paper outlines a theoretical and computational framework for establishing quarter power scaling in three-dimensional spherical constructs in-vitro, starting where fractal distribution ends. Allometric scaling in non-vascular spherical tissue constructs was assessed using models of Michaelis-Menten oxygen consumption and diffusion. The models demonstrate that physiological scaling is maintained when about 5 to 60% of the construct is exposed to oxygen concentrations less than the Michaelis-Menten constant, with a significant concentration gradient in the sphere. The results have important implications for the design of downscaled in-vitro systems with physiological relevance.

  2. Urban Transfer Entropy across Scales

    PubMed Central

    Murcio, Roberto

    2015-01-01

    The morphology of urban agglomeration is studied here in the context of information exchange between different spatio-temporal scales. Urban migration to and from cities is characterised as non-random and following non-random pathways. Cities are multidimensional non-linear phenomena, so understanding the relationships and connectivity between scales is important in determining how the interplay of local/regional urban policies may affect the distribution of urban settlements. In order to quantify these relationships, we follow an information theoretic approach using the concept of Transfer Entropy. Our analysis is based on a stochastic urban fractal model, which mimics urban growing settlements and migration waves. The results indicate how different policies could affect urban morphology in terms of the information generated across geographical scales. PMID:26207628
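
Transfer Entropy, the measure used in the abstract above, quantifies how much knowing the past of one process X reduces uncertainty about the next step of another process Y, beyond what Y's own past already tells us. A minimal plug-in estimator for two symbol sequences with lag-1 histories, purely as a sketch (it is not the stochastic urban fractal model of the paper, and it ignores finite-sample bias corrections):

```python
# Plug-in transfer entropy T(X -> Y) in bits, lag-1 histories.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """T(X -> Y) = sum p(y+, y, x) * log2[ p(y+|y,x) / p(y+|y) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_prev, x_prev)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_next, y_prev)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_prev, x_prev)
    singles_y = Counter(y[:-1])                     # y_prev
    n = len(x) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]
        p_cond_hist = pairs_yy[(yn, yp)] / singles_y[yp]
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te

# If Y simply copies X with one step of delay, knowing x_t removes all
# uncertainty about y_{t+1}, so T(X -> Y) equals the empirical
# conditional entropy H(y_{t+1} | y_t) and is strongly positive.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
y = [0] + x[:-1]   # y_{t+1} = x_t
print(f"T(X->Y) = {transfer_entropy(x, y):.3f} bits")
print(f"T(Y->X) = {transfer_entropy(y, x):.3f} bits")
```

The asymmetry of the measure (T(X->Y) generally differs from T(Y->X)) is what lets the paper talk about directed information flow between spatial scales.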

  3. An environmentally friendly scale inhibitor

    SciTech Connect

    Dobbs, J.B.; Brown, J.M.

    1999-11-01

    This paper describes a method of inhibiting the formation of scales such as barium and strontium sulfate in low pH aqueous systems, and calcium carbonate in systems containing high concentrations of dissolved iron. The solution, chemically, involves treating the aqueous system with an inhibitor designed to replace organic phosphonates. Typical low-pH aqueous systems where the inhibitor is particularly useful are oilfield produced water and resin-bed water softeners that form scale during low-pH acid-regeneration operations. Downhole applications are recommended where high concentrations of dissolved iron are present in the produced water. This new approach to inhibition replaces typical organic phosphonates and polymers with a non-toxic, biodegradable scale inhibitor that performs in harsh environments.

  4. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  5. The Satisfaction With Life Scale.

    PubMed

    Diener, E; Emmons, R A; Larsen, R J; Griffin, S

    1985-02-01

    This article reports the development and validation of a scale to measure global life satisfaction, the Satisfaction With Life Scale (SWLS). Among the various components of subjective well-being, the SWLS is narrowly focused to assess global life satisfaction and does not tap related constructs such as positive affect or loneliness. The SWLS is shown to have favorable psychometric properties, including high internal consistency and high temporal reliability. Scores on the SWLS correlate moderately to highly with other measures of subjective well-being, and correlate predictably with specific personality characteristics. It is noted that the SWLS is suited for use with different age groups, and other potential uses of the scale are discussed.

  6. Balthazar Scales of Adaptive Behavior: II. Scales of Social Adaption.

    ERIC Educational Resources Information Center

    Balthazar, Earl E.

    The Balthazar Scales of Adaptive Behavior II (BSAB-II) provides a system for program development and evaluation and for social behavior assessment of profoundly and severely mentally retarded individuals as well as of the younger less retarded and emotionally disturbed individuals. The specimen set consists of six parts: a Manual, a Tally Sheet…

  7. Critical Multicultural Education Competencies Scale: A Scale Development Study

    ERIC Educational Resources Information Center

    Acar-Ciftci, Yasemin

    2016-01-01

    The purpose of this study is to develop a scale in order to identify the critical multicultural education competencies of teachers. For this reason, first of all, drawing on the knowledge in the literature, a new conceptual framework was created with deductive method based on critical theory, critical race theory and critical multicultural…

  8. Fiber optic probe augmented sonic scaling versus conventional sonic scaling.

    PubMed

    Johnson, G K; Reinhardt, R A; Tussing, G J; Krejci, R F

    1989-03-01

    Several factors, including access and visualization problems, make total deposit removal during scaling and root planing procedures extremely difficult. This study examined the effectiveness of a mode of therapy designed to improve access and visualization for sonic scaling compared to closed sonic instrumentation. Teeth with moderate to deep probing depths in six patients scheduled to receive immediate dentures were divided into three experimental groups: Group I, sonic scaling with access augmented by interdental papilla reflection and fiber optic illumination/transillumination (34 surfaces); Group II, closed sonic scaling (34 surfaces); and Group III, untreated controls (35 surfaces). Immediately after treatment the experimental teeth were extracted, stained with toluidine blue, and interproximal areas evaluated for remaining accretions with a microscope-digitizing pad-computer system. Group I had a significantly lower percentage (P < 0.01) of remaining subgingival accretion coverage than Group II (1.30 ± 0.25% vs 6.35 ± 1.08%), and both Group I and II demonstrated significantly (P < 0.01) fewer deposits than the control surfaces (46.61 ± 4.32%). These findings suggest that minimal tissue reflection and fiber optic illumination/transillumination are beneficial adjuncts to deposit removal in moderate to deep periodontal pockets.

  9. IMF Length Scales and Predictability: The Two Length Scale Medium

    NASA Technical Reports Server (NTRS)

    Collier, Michael R.; Szabo, Adam; Slavin, James A.; Lepping, R. P.; Kokubun, S.

    1999-01-01

    We present preliminary results from a systematic study using simultaneous data from three spacecraft, Wind, IMP 8 (Interplanetary Monitoring Platform) and Geotail, to examine interplanetary length scales and their implications on predictability for magnetic field parcels in the typical solar wind. Time periods were selected when the plane formed by the three spacecraft included the GSE (geocentric solar ecliptic) x-direction so that if the parcel fronts were strictly planar, the two adjacent spacecraft pairs would determine the same phase front angles. After correcting for the motion of the Earth relative to the interplanetary medium and deviations in the solar wind flow from radial, we used differences in the measured front angle between the two spacecraft pairs to determine structure radius of curvature. Results indicate that the typical radius of curvature for these IMF parcels is of the order of 100 R_E. This implies that there are two important IMF (Interplanetary Magnetic Field) scale lengths relevant to predictability: (1) the well-established scale length over which correlations observed by two spacecraft decay along a given IMF parcel, of the order of a few tens of Earth radii and (2) the scale length over which two spacecraft are unlikely to even observe the same parcel because of its curvature, of the order of a hundred Earth radii.

  10. Continuously-Variable Vernier Scale

    NASA Technical Reports Server (NTRS)

    Miller, Irvin M.

    1989-01-01

    Easily fabricated device increases precision in reading graphical data. Continuously-variable vernier scale (CV VS) designed to provide greater accuracy to scientists and technologists in reading numerical values from graphical data. Placed on graph and used to interpolate coordinate value of point on curve or plotted point on figure within division on each coordinate axis. Requires neither measurement of line segments where projection of point intersects division nor calculation to quantify projected value. Very flexible device constructed with any kind of scale. Very easy to use, requiring no special equipment of any kind, and saves a considerable amount of time if numerous points are to be evaluated.

  11. TeV-Scale Strings

    NASA Astrophysics Data System (ADS)

    Berenstein, David

    2014-10-01

    This review discusses the status of string physics where the string tension is around the TeV scale. It covers model-building basics for perturbative strings, based on D-brane configurations. The effective low-energy physics description of such string constructions is analyzed: how anomaly cancellation is implemented, how fast proton decay is avoided, and how D-brane models lead to additional Z' particles. This review also discusses direct search bounds for strings at the TeV scale, as well as theoretical issues with model building related to flavor physics and axions.

  12. The Clinical Global Impressions Scale

    PubMed Central

    Targum, Steven D.

    2007-01-01

    Objective: This paper reviews the potential value in daily clinical practice of an easily applied research tool, the Clinical Global Impressions (CGI) Scale, for the nonresearcher clinician to quantify and track patient progress and treatment response over time. Method: The instrument is described and sample patient scenarios are provided with scoring rationales and a practical charting system. Conclusion: The CGI severity and improvement scales offer a readily understood, practical measurement tool that can easily be administered by a clinician in a busy clinical practice setting. PMID:20526405

  13. Cavitation erosion size scale effects

    NASA Technical Reports Server (NTRS)

    Rao, P. V.; Buckley, D. H.

    1984-01-01

    Size scaling in cavitation erosion is a major problem confronting the design engineers of modern high speed machinery. An overview and erosion data analysis presented in this paper indicate that the size scale exponent n in the erosion rate relationship, expressed as a function of the size or diameter, can vary from 1.7 to 4.9 depending on the type of device used. There is, however, a general agreement as to the values of n if the correlations are made with constant cavitation number.
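
The abstract above quotes a size-scale exponent n between 1.7 and 4.9 in an erosion-rate relationship of the form rate ∝ D^n. A short sketch of how strongly the choice of n affects extrapolation from a model to a prototype; the diameters and the 10x scale-up are hypothetical, not from the paper:

```python
# Sensitivity of erosion-rate extrapolation to the size-scale exponent n,
# for the power-law form rate ∝ D**n quoted in the abstract.

def erosion_rate_ratio(d_prototype, d_model, n):
    """Ratio of erosion rates predicted by rate ∝ D**n."""
    return (d_prototype / d_model) ** n

d_model, d_proto = 0.1, 1.0   # assumed 10x geometric scale-up (metres)
for n in (1.7, 3.0, 4.9):
    ratio = erosion_rate_ratio(d_proto, d_model, n)
    print(f"n = {n}: prototype erosion rate is {ratio:.0f}x the model's")
```

For a 10x scale-up, the predicted ratio spans from about 50x (n = 1.7) to about 80,000x (n = 4.9), which is why pinning down n is critical for design.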

  14. Dynamics of convective scale interaction

    NASA Technical Reports Server (NTRS)

    Purdom, James F. W.; Sinclair, Peter C.

    1988-01-01

    Several of the mesoscale dynamic and thermodynamic aspects of convective scale interaction are examined. An explanation is given of how sounding data can be coupled with satellite-observed cumulus development in the warm sector and the arc cloud line's time evolution to develop a short-range forecast of expected convective intensity along an arc cloud line. The formative, mature and dissipating stages of the arc cloud line life cycle are discussed. Specific properties of convective scale interaction are presented and the relationship between arc cloud lines and tornado producing thunderstorms is considered.

  15. Scaling of sand flux over bedforms- experiments to field scale

    NASA Astrophysics Data System (ADS)

    McElroy, B. J.; Mahon, R. C.; Ashley, T.; Alexander, J. S.

    2015-12-01

    Bed forms are one of the few geomorphic phenomena whose field and laboratory geometric scales have significant overlap. This is similarly true for scales of sediment transport. Whether in the lab or field, at low transport stages and high Rouse numbers where suspension is minimal, sand fluxes scale nonlinearly with transport stage. At high transport stages and low Rouse numbers where suspension is substantial, sand transport scales with Rouse number. In intermediate cases, deformation of bed forms is a direct result of the exchange of sediment between the classically suspended and bed load volumes. These parameters are straightforwardly measured in the laboratory. However, practical difficulties and cost ineffectiveness often exclude bed-sediment measurements from studies and monitoring efforts aimed at estimating sediment loads in rivers. An alternative to direct sampling is through the measurement of evolution of bed topography constrained by sediment-mass conservation. Historically, the topographic-evolution approach has been limited to systems with negligible transport of sand in suspension. As was shown decades ago, pure bed load transport is responsible for the mean migration of trains of bed forms when no sediment is exchanged between individual bed forms. In contrast, the component of bed-material load that moves in suspension is responsible for changes in the size, shape, and spacing of evolving bed forms; collectively this is called deformation. The difference between bed-load flux and bed-material-load flux equals the flux of suspended bed material. We give a partial demonstration of this using available field and laboratory data and comparing them across geometric and sediment transport scales.
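
The "mean migration of trains of bed forms" result cited in the abstract above underlies a classical way to estimate bed-load flux from dune geometry and celerity alone: for roughly triangular dunes, q_b = (1 - porosity) * shape_factor * height * celerity, with shape factor about 0.5. A minimal sketch; the porosity, shape factor, and dune values below are assumed illustrative numbers, not data from the study:

```python
# Bed-load flux per unit width inferred from dune migration,
# q_b = (1 - p) * beta * H * Vc, with beta ~ 0.5 for triangular dunes.
# All numeric values are hypothetical.

def bedload_flux(height_m, celerity_m_s, porosity=0.35, shape_factor=0.5):
    """Volumetric bed-load flux per unit width (m^2/s) from dune migration."""
    return (1.0 - porosity) * shape_factor * height_m * celerity_m_s

# An assumed 0.5 m dune migrating at 1 m per hour:
q_b = bedload_flux(0.5, 1.0 / 3600.0)
print(f"q_b = {q_b:.2e} m^2/s")
```

Because this relation captures only the migration component, any bed-material load carried in suspension shows up instead as deformation of the bed forms, which is exactly the residual the abstract proposes to exploit.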

  16. Scaling of pressurized fluidized beds

    SciTech Connect

    Guralnik, S.; Glicksman, L.R.

    1994-10-01

    The project has two primary objectives. The first is to verify a set of hydrodynamic scaling relationships for commercial pressurized fluidized bed combustors (PFBC). The second is to investigate solids mixing in pressurized bubbling fluidized beds. American Electric Power's (AEP) Tidd combined-cycle demonstration plant will provide time-varying pressure drop data to serve as the basis for the scaling verification. The verification will involve demonstrating that a properly scaled cold model and the Tidd PFBC exhibit hydrodynamically similar behavior. An important issue in PFBC design is the spacing of fuel feed ports, which is dictated by the fuel distribution and the mixing characteristics within the bed. After completing the scaling verification, the cold model will be used to study the characteristics of PFBCs. A thermal tracer technique will be used to study mixing both near the fuel feed region and in the far field. The results will allow the coal feed and distributor to be designed for optimal heating.

  17. The Psychological Maltreatment Rating Scales.

    ERIC Educational Resources Information Center

    Brassard, Marla R.; And Others

    1993-01-01

    The Psychological Maltreatment Rating Scales (PMRS) were developed for assessing psychological maltreatment in the mother-child interaction, and were used to rate the videotaped interaction of 49 high-risk mother-child dyads and predict child protective service involvements. The PMRS was found to be a moderately reliable and valid measure.…

  18. Multidimensional Scaling of Video Surrogates.

    ERIC Educational Resources Information Center

    Goodrum, Abby A.

    2001-01-01

    Four types of video surrogates were compared under two tasks. Multidimensional scaling was used to map dimensional dispersions of users' judgments of similarity between videos and surrogates. Congruence between these maps was used to evaluate representativeness of each surrogate type. Congruence was greater for image-based than for text-based…

  19. Primary Childhood School Success Scale.

    ERIC Educational Resources Information Center

    Seagraves, Margaret C.

    The purpose of this research study was to build and pilot a psychometric instrument, the Primary Childhood School Success Scale (PCSSS), to identify behaviors needed for children to be successful in first grade. Fifty-two teacher responses were collected. The instrument had a reliability coefficient (Alpha) of 0.95, a mean of 13.26, and a variance…

  20. Scales of Independent Behavior (SIB).

    ERIC Educational Resources Information Center

    Thomas, Paulette J.

    1990-01-01

    Designed for use with individuals ages 3 months to 44 years, the Scales of Independent Behavior (SIB) measure adaptive behavior and problem behaviors in such areas as motor skills, social interaction, language, personal self-care, punctuality, destructiveness, and inattention. This paper describes the SIB's administration, scoring,…

  1. An Assertiveness Scale for Adolescents.

    ERIC Educational Resources Information Center

    Lee, Dong Yul; And Others

    1985-01-01

    Developed a 33-item, situation-specific instrument that measures assertiveness of adolescents. Based on data from 682 elementary and secondary school students, adequate reliability and validity of the Assertiveness Scale for Adolescents (ASA) were obtained when tested against several variables about which predictions could be made. (BH)

  2. A Feminist Family Therapy Scale.

    ERIC Educational Resources Information Center

    Black, Leora; Piercy, Fred P.

    1991-01-01

    Reports on development and psychometric properties of Feminist Family Therapy Scale (FFTS), a 17-item instrument intended to reflect degree to which family therapists conceptualize process of family therapy from feminist-informed perspective. Found that the instrument discriminated between self-identified feminists and nonfeminists, women and men,…

  3. Tumor detection at multiple scales

    NASA Astrophysics Data System (ADS)

    Strickland, Robin N.; Hahn, Hee I.

    1993-06-01

    We describe detectors capable of locating small tumors of variable size in the highly textured anatomic backgrounds typical of gamma-ray images. The problem of inhomogeneous background noise is solved using a spatially adaptive statistical scaling operation, which effectively pre-whitens the data and leads to a very simple form of adaptive matched filter. Detecting tumors of variable size is accomplished by processing the images formed in a Laplacian pyramid, each of which contains a narrower range of tumor scales. We compare the performance of this pyramid technique with our earlier nonlinear detector, which detects small tumors according to their signature in curvature feature space, where 'curvature' is the local curvature of the image data when viewed as a relief map. Computed curvature values are mapped to a normalized significance space using a windowed t-statistic. The resulting test statistic is thresholded at a chosen level of significance to give a positive detection. Nonuniform anatomic background activity is effectively suppressed. This curvature detector works quite well over a large range of tumor scales, although not as well as the pyramid/adaptive matched filter scheme. None of the multiscale techniques tested perform at the level of the fixed scale detectors. Tests are performed using simulated tumors superimposed on clinical gamma-ray images.

  4. Global scale predictability of floods

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Gijsbers, Peter; Sperna Weiland, Frederiek

    2016-04-01

    Flood (and storm surge) forecasting at the continental and global scale has only become possible in recent years (Emmerton et al., 2016; Verlaan et al., 2015) due to the availability of meteorological forecasts, global-scale precipitation products, and global-scale hydrologic and hydrodynamic models. Deltares has set up GLOFFIS, a research-oriented, multi-model operational flood forecasting system based on Delft-FEWS, in an open experimental ICT facility called Id-Lab. In GLOFFIS both the W3RA and PCRGLOB-WB models are run in ensemble mode using GEFS and ECMWF-EPS (latency 2 days). GLOFFIS will be used for experiments on the predictability of floods (and droughts) and its dependency on initial state estimation, meteorological forcing, and the hydrologic model used. Here we present initial results of verification of the ensemble flood forecasts derived with the GLOFFIS system. Emmerton, R., Stephens, L., Pappenberger, F., Pagano, T., Weerts, A., Wood, A., Salamon, P., Brown, J., Hjerdt, N., Donnelly, C., Cloke, H., Continental and Global Scale Flood Forecasting Systems, WIREs Water (accepted), 2016. Verlaan, M., De Kleermaeker, S., Buckman, L., GLOSSIS: Global storm surge forecasting and information system, Australasian Coasts & Ports Conference, 15-18 September 2015, Auckland, New Zealand.

  5. Hydrodynamic aspects of shark scales

    NASA Technical Reports Server (NTRS)

    Raschi, W. G.; Musick, J. A.

    1986-01-01

    Ridge morphometrics of placoid scales from 12 galeoid shark species were examined in order to evaluate their potential for frictional drag reduction. The geometry of the shark scales is similar to that of longitudinally grooved surfaces (riblets) that have previously been shown to give 8 percent skin-friction reduction in turbulent boundary layers. The present study was undertaken to determine whether the physical dimensions of the ridges on the shark scales are of the right magnitude, based on the previous riblet work, to be used by the sharks for drag reduction. The results indicate that the ridge heights and spacings are normally maintained between the predicted optimal values for voluntary and burst swimming speeds throughout the individual's ontogeny. Moreover, the species that might be considered the faster swimmers possess smaller and more closely spaced ridges, which, based on the riblet work, would suggest greater frictional drag reduction at high swimming speeds compared to their more sluggish counterparts.

  6. Citizen Science Data and Scaling

    NASA Astrophysics Data System (ADS)

    Henderson, S.; Wasser, L. A.

    2013-12-01

    There is rapid growth in the collection of environmental data by non-experts. So-called 'citizen scientists' are collecting data on plant phenology, precipitation patterns, bird migration and winter feeding, mating calls of frogs in the spring, and numerous other topics and phenomena related to environmental science. These data are generally submitted to online programs (e.g., Project BudBurst, COCORaHS, Project Feederwatch, Frogwatch USA) and are freely available to scientists, educators, land managers, and decision makers. While the data are often used to address specific science questions, they also provide the opportunity to explore their utility in the context of ecosystem scaling. Citizen science data are being collected and submitted at an unprecedented rate and at spatial and temporal scales previously not possible. The amount of citizen science data vastly exceeds what scientists or land managers can collect on their own. As such, it provides opportunities to address scaling in the environmental sciences. This presentation will explore data from several citizen science programs in the context of scaling.

  7. Children's Social Relations Interview Scale.

    ERIC Educational Resources Information Center

    Volpe, Richard

    The Children's Social Relations Interview Scale (CSRIS) was developed to assess the role expectations and role behaviors associated with physical disabilities, namely low status and independence. Three traits are assessed: succorance, the seeking of help and support; restraint, physical and social limitation and circumscription by others; and…

  8. Multi-Scale Infrastructure Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s (EPA) multi-scale infrastructure assessment project supports both water resource adaptation to climate change and the rehabilitation of the nation’s aging water infrastructure by providing tools, scientific data and information to progra...

  9. Dynamic scaling in chemical ecology.

    PubMed

    Zimmer, Richard K; Zimmer, Cheryl Ann

    2008-07-01

    Natural rates of chemical production, release, and transport of fluid-borne molecules drive fundamental biological responses to these stimuli. The scaling of the field signaling environment to laboratory conditions recreates essential features of the dynamics and establishes ecological relevance. If appropriately scaled, laboratory simulations of physical regimes, coupled with natural rates of chemical cue/signal emission, facilitate interpretation of field results. From a meta-analysis of papers published in 11 journals over the last 22 years (1984-1986, 1994-1996, 2004-2006), complete dynamic scaling was rare in both field and laboratory studies. Studies in terrestrial systems often involved chemical determinations, but rarely simulated natural aerodynamics in laboratory wind tunnels. Research in aquatic (marine and freshwater) systems seldom scaled either the chemical or physical environments. Moreover, nearly all research, in all environments, focused on organism-level processes without incorporating the effects of individual-based behavior on populations, communities, and ecosystems. As a result, relationships between chemosensory-mediated behavior and ecological function largely remain unexplored. Outstanding exceptions serve as useful examples for guiding future research. Advanced conceptual frameworks and refined techniques offer exciting opportunities for identifying the ecological significance of chemical cues/signals in behavioral interactions and for incorporating individual effects at higher levels of biological organization.

  10. Convergence methods on time scales

    NASA Astrophysics Data System (ADS)

    Turan, Ceylan; Duman, Oktay

    2013-10-01

    In this paper, we introduce the concepts of lacunary statistical convergence and strongly lacunary Cesàro summability of delta measurable functions on time scales and obtain some inclusion results between them. We also display some examples containing discrete and continuous cases.

  11. Nanotribology: Rubbing on Small Scale

    ERIC Educational Resources Information Center

    Dickinson, J. Thomas

    2005-01-01

    Nanometer-scale investigations offer the potential of providing a first-principles understanding of tribo-systems in terms of fundamental intermolecular forces. Some of the basic issues and the motivation for the use of scanning probes in the area of nanotribology are presented.

  12. Optimal scaling in ductile fracture

    NASA Astrophysics Data System (ADS)

    Fokoua Djodom, Landry

    This work is concerned with the derivation of optimal scaling laws, in the sense of matching lower and upper bounds on the energy, for a solid undergoing ductile fracture. The specific problem considered concerns a material sample in the form of an infinite slab of finite thickness subjected to prescribed opening displacements on its two surfaces. The solid is assumed to obey deformation-theory of plasticity and, in order to further simplify the analysis, we assume isotropic rigid-plastic deformations with zero plastic spin. When hardening exponents are given values consistent with observation, the energy is found to exhibit sublinear growth. We regularize the energy through the addition of nonlocal energy terms of the strain-gradient plasticity type. This nonlocal regularization has the effect of introducing an intrinsic length scale into the energy. We also put forth a physical argument that identifies the intrinsic length and suggests a linear growth of the nonlocal energy. Under these assumptions, ductile fracture emerges as the net result of two competing effects: whereas the sublinear growth of the local energy promotes localization of deformation to failure planes, the nonlocal regularization stabilizes this process, thus resulting in an orderly progression towards failure and a well-defined specific fracture energy. The optimal scaling laws derived here show that ductile fracture results from localization of deformations to void sheets, and that it requires a well-defined energy per unit fracture area. In particular, fractal modes of fracture are ruled out under the assumptions of the analysis. The optimal scaling laws additionally show that ductile fracture is cohesive in nature, i.e., it obeys a well-defined relation between tractions and opening displacements. Finally, the scaling laws supply a link between micromechanical properties and macroscopic fracture properties. 
In particular, they reveal the relative roles that surface energy and microplasticity

  13. Structural Similitude and Scaling Laws

    NASA Technical Reports Server (NTRS)

    Simitses, George J.

    1998-01-01

    Aircraft and spacecraft comprise the class of aerospace structures that require efficiency and wisdom in design, sophistication and accuracy in analysis and numerous and careful experimental evaluations of components and prototype, in order to achieve the necessary system reliability, performance and safety. Preliminary and/or concept design entails the assemblage of system mission requirements, system expected performance and identification of components and their connections as well as of manufacturing and system assembly techniques. This is accomplished through experience based on previous similar designs, and through the possible use of models to simulate the entire system characteristics. Detail design is heavily dependent on information and concepts derived from the previous steps. This information identifies critical design areas which need sophisticated analyses, and design and redesign procedures to achieve the expected component performance. This step may require several independent analysis models, which, in many instances, require component testing. The last step in the design process, before going to production, is the verification of the design. This step necessitates the production of large components and prototypes in order to test component and system analytical predictions and verify strength and performance requirements under the worst loading conditions that the system is expected to encounter in service. Clearly then, full-scale testing is in many cases necessary and always very expensive. In the aircraft industry, in addition to full-scale tests, certification and safety necessitate large component static and dynamic testing. Such tests are extremely difficult, time consuming, and absolutely necessary. Clearly, one should not expect that prototype testing will be totally eliminated in the aircraft industry. It is hoped, though, that we can reduce full-scale testing to a minimum. Full-scale large component testing is necessary in

  14. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand, on February 22 and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear: very small differences in the systems now result in very big differences in the future, making forecasting difficult. In spite of this, there are patterns in earthquake data. These patterns often take the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales. This consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of the earthquake catalogs, which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems, with a focus on understanding how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California, designed to extend the observed catalog of earthquakes in California. This simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional

  15. Incorporating Pore-Scale Data in Field-Scale Uncertainty Quantification: A Multi-Scale Bayesian Approach

    NASA Astrophysics Data System (ADS)

    Icardi, M.

    2014-12-01

    Pore-scale modeling has recently become an important tool for a deeper understanding of complex transport phenomena in porous media. However, its direct usage for field-scale processes is still hindered by limited predictive capabilities, owing to large uncertainties in the micro-scale parameters and pore geometries, the limited number of available samples, and numerical errors. These issues are often overlooked because it is usually thought that the computational cost of pore-scale simulation prohibits an extensive uncertainty quantification study with a large number of samples. In this work we propose a computational tool to estimate statistics of pore-scale quantities. The algorithm is based on (i) an efficient automatic CFD solver for pore-scale simulations, (ii) a multi-scale Bayesian theoretical framework, and (iii) a generalized multilevel Monte Carlo method to speed up the statistical computations. Exploiting the variance reduction of the multi-level and multi-scale representation, we demonstrate the feasibility of the forward and inverse uncertainty quantification problems. The former consists in quantifying the effect of micro-scale heterogeneities and parametric uncertainties on macro-scale upscaled quantities. Given some prior information on the pore-scale structures, the latter can be applied to (i) assess the validity and estimate uncertainties of macro-scale models for a wide range of micro-scale properties, and (ii) match macro-scale results with the underlying pore-scale properties.
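The multilevel Monte Carlo idea cited in this abstract can be sketched generically: cheap coarse levels absorb most of the variance, and only a few samples of the expensive fine-level corrections are needed. Below is a minimal toy sketch of the telescoping-sum estimator; the `model` function is a stand-in, not the authors' pore-scale CFD solver.

```python
import random

def model(level, x):
    # Stand-in "simulation": accuracy improves geometrically with level.
    return x + 2.0 ** (-level) * random.gauss(0, 1)

def mlmc_estimate(samples_per_level, seed=0):
    """Telescoping estimator E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]."""
    random.seed(seed)
    total = 0.0
    for level, n in enumerate(samples_per_level):
        correction = 0.0
        for _ in range(n):
            x = random.gauss(0, 1)  # shared random input couples the levels
            fine = model(level, x)
            coarse = model(level - 1, x) if level > 0 else 0.0
            correction += fine - coarse
        total += correction / n
    return total

# Many cheap coarse samples, few expensive fine ones; true mean is 0 here.
estimate = mlmc_estimate([4000, 1000, 250, 60])
```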

  16. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  17. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  18. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  19. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  20. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  1. 27 CFR 19.276 - Package scales.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Package scales. 19.276... Package scales. Proprietors shall ensure the accuracy of scales used for weighing packages of spirits through tests conducted at intervals of not more than 6 months or whenever scales are adjusted or...

  2. 30 CFR 57.3202 - Scaling tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Scaling tools. 57.3202 Section 57.3202 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Support-Surface and Underground § 57.3202 Scaling tools. Where manual scaling is performed, a scaling...

  3. 30 CFR 56.3202 - Scaling tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Scaling tools. 56.3202 Section 56.3202 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... § 56.3202 Scaling tools. Where manual scaling is performed, a scaling bar shall be provided. This...

  4. Simple scale interpolator facilitates reading of graphs

    NASA Technical Reports Server (NTRS)

    Fazio, A.; Henry, B.; Hood, D.

    1966-01-01

    Set of cards with scale divisions and a scale finder permits accurate reading of the coordinates of points on linear or logarithmic graphs plotted on rectangular grids. The set contains 34 different scales for linear plotting and 28 single cycle scales for log plots.
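The card-based interpolation described above has a direct numerical analogue: a measured position between two known gridlines maps to a coordinate value linearly on a linear scale and logarithmically on a single-cycle log scale. A small sketch (the function names are illustrative, not from the record):

```python
def interpolate_linear(pos, pos0, pos1, val0, val1):
    """Coordinate on a linear axis from a position between two known marks."""
    frac = (pos - pos0) / (pos1 - pos0)
    return val0 + frac * (val1 - val0)

def interpolate_log(pos, pos0, pos1, val0, val1):
    """Coordinate on a (single-cycle) log axis: interpolate in log space."""
    frac = (pos - pos0) / (pos1 - pos0)
    return val0 * (val1 / val0) ** frac

# A point halfway between the 1 and 10 gridlines reads 5.5 on a linear
# scale, but 10**0.5 (about 3.16) on a logarithmic scale.
```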

  5. Mokken Scale Analysis Using Hierarchical Clustering Procedures

    ERIC Educational Resources Information Center

    van Abswoude, Alexandra A. H.; Vermunt, Jeroen K.; Hemker, Bas T.; van der Ark, L. Andries

    2004-01-01

    Mokken scale analysis (MSA) can be used to assess and build unidimensional scales from an item pool that is sensitive to multiple dimensions. These scales satisfy a set of scaling conditions, one of which follows from the model of monotone homogeneity. An important drawback of the MSA program is that the sequential item selection and scale…

  6. Stability of Rasch Scales over Time

    ERIC Educational Resources Information Center

    Taylor, Catherine S.; Lee, Yoonsun

    2010-01-01

    Item response theory (IRT) methods are generally used to create score scales for large-scale tests. Research has shown that IRT scales are stable across groups and over time. Most studies have focused on items that are dichotomously scored. Now Rasch and other IRT models are used to create scales for tests that include polytomously scored items.…

  7. Metabolic scaling in solid tumours

    NASA Astrophysics Data System (ADS)

    Milotti, E.; Vyshemirsky, V.; Sega, M.; Stella, S.; Chignola, R.

    2013-06-01

    Tumour metabolism is an outstanding topic of cancer research, as it determines the growth rate and the global activity of tumours. Recently, by combining the diffusion of oxygen, nutrients, and metabolites in the extracellular environment, and the internal motions that mix live and dead cells, we derived a growth law of solid tumours which is linked to parameters at the cellular level. Here we use this growth law to obtain a metabolic scaling law for solid tumours, which is obeyed by tumours of different histotypes both in vitro and in vivo, and we display its relation with the fractal dimension of the distribution of live cells in the tumour mass. The scaling behaviour is related to measurable parameters, with potential applications in the clinical practice.

  8. The scale of cosmic isotropy

    SciTech Connect

    Marinoni, C.; Bel, J.; Buzzi, A. E-mail: Julien.Bel@cpt.univ-mrs.fr

    2012-10-01

    The most fundamental premise of the standard model of the universe states that the large-scale properties of the universe are the same in all directions and at all comoving positions. Demonstrating this hypothesis has proven to be a formidable challenge. The cross-over scale R_iso above which the galaxy distribution becomes statistically isotropic is vaguely defined and poorly (if not at all) quantified. Here we report on a formalism that allows us to provide an unambiguous operational definition and an estimate of R_iso. We apply the method to galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7, finding that R_iso ∼ 150 h⁻¹ Mpc. Besides providing a consistency test of the Copernican principle, this result is in agreement with predictions based on numerical simulations of the spatial distribution of galaxies in cold dark matter dominated cosmological models.

  9. Scale invariance in road networks

    NASA Astrophysics Data System (ADS)

    Kalapala, Vamsi; Sanwalani, Vishal; Clauset, Aaron; Moore, Cristopher

    2006-02-01

    We study the topological and geographic structure of the national road networks of the United States, England, and Denmark. By transforming these networks into their dual representation, where roads are vertices and an edge connects two vertices if the corresponding roads ever intersect, we show that they exhibit both topological and geographic scale invariance. That is, we show that for sufficiently large geographic areas, the dual degree distribution follows a power law with exponent 2.2⩽α⩽2.4 , and that journeys, regardless of their length, have a largely identical structure. To explain these properties, we introduce and analyze a simple fractal model of road placement that reproduces the observed structure, and suggests a testable connection between the scaling exponent α and the fractal dimensions governing the placement of roads and intersections.
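The dual degree distribution reported above, a power law with exponent 2.2 ≤ α ≤ 2.4, can be estimated from degree data with the standard continuous maximum-likelihood (Hill) estimator. The sketch below uses synthetic degrees drawn by inverse-transform sampling, not the paper's road-network data:

```python
import math
import random

def powerlaw_mle_alpha(degrees, k_min=1.0):
    """Continuous MLE (Hill estimator) of alpha for P(k) ~ k^-alpha, k >= k_min:
    alpha = 1 + n / sum(ln(k_i / k_min))."""
    tail = [k for k in degrees if k >= k_min]
    return 1.0 + len(tail) / sum(math.log(k / k_min) for k in tail)

# Synthetic sample with alpha = 2.3: for CDF F(k) = 1 - k^(1-alpha), k >= 1,
# inverse-transform sampling gives k = (1 - u)^(-1/(alpha-1)).
random.seed(1)
alpha_true = 2.3
sample = [(1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50000)]
alpha_hat = powerlaw_mle_alpha(sample)  # recovers roughly 2.3
```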

  10. Emerging universe from scale invariance

    SciTech Connect

    Del Campo, Sergio; Herrera, Ramón; Guendelman, Eduardo I.; Labraña, Pedro E-mail: guendel@bgu.ac.il E-mail: plabrana@ubiobio.cl

    2010-06-01

    We consider a scale-invariant model which includes an R² term in the action and show that a stable 'emerging universe' scenario is possible. The model belongs to the general class of theories in which an integration measure independent of the metric is introduced. To implement scale invariance (S.I.), a dilaton field is introduced. The integration of the equations of motion associated with the new measure gives rise to the spontaneous symmetry breaking (S.S.B.) of S.I. After S.S.B. of S.I. in the model with the R² term (and with the first-order formalism applied), it is found that a nontrivial potential for the dilaton is generated. The dynamics of the scalar field becomes nonlinear, and these nonlinearities are instrumental in the stability of some of the emerging universe solutions, which exist for a parameter range of the theory.

  11. Scaling Aspects of Lymphocyte Trafficking

    PubMed Central

    Perelson, Alan S.; Wiegel, Frederik W.

    2010-01-01

    We consider the long lived pool of B and T cells that recirculate through blood, tissues and the lymphatic system of an animal with body mass M. We derive scaling rules (allometric relations) for: (1) the rate of production of mature lymphocytes; (2) the accumulation of lymphocytes in the tissues; (3) the flux of lymphocytes through the lymphatic system; (4) the number of lymph nodes, (5) the number of lymphocytes per clone within a lymph node, and (6) the total number of lymphocytes within a lymph node. Mass-dependent aspects of immune learning and of the immunological self are shown to be not very significant. Our treatment is somewhat heuristic and aims at a combination of immunological data with recent progress in biological scaling. PMID:19084024

  12. Metabolic scaling in solid tumours

    PubMed Central

    Milotti, E.; Vyshemirsky, V.; Sega, M.; Stella, S.; Chignola, R.

    2013-01-01

    Tumour metabolism is an outstanding topic of cancer research, as it determines the growth rate and the global activity of tumours. Recently, by combining the diffusion of oxygen, nutrients, and metabolites in the extracellular environment, and the internal motions that mix live and dead cells, we derived a growth law of solid tumours which is linked to parameters at the cellular level. Here we use this growth law to obtain a metabolic scaling law for solid tumours, which is obeyed by tumours of different histotypes both in vitro and in vivo, and we display its relation with the fractal dimension of the distribution of live cells in the tumour mass. The scaling behaviour is related to measurable parameters, with potential applications in the clinical practice. PMID:23727729

  13. Latest Developments in SLD Scaling

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Anderson, David N.

    2006-01-01

    Scaling methods have previously been shown to work well for supercooled large droplet (SLD) main ice shapes. However, feather sizes for some conditions have not been well represented by scale tests. To determine whether there are fundamental differences between the development of feathers for Appendix C and SLD conditions, this study used time-sequenced photographs, viewing along the span of the model during icing sprays. An airspeed of 100 kt, cloud water drop MVDs of 30 and 140 microns, and stagnation freezing fractions of 0.30 and 0.50 were tested in the NASA Glenn Icing Research Tunnel using an unswept 91-cm-chord NACA 0012 airfoil model mounted at 0° AOA. The photos indicated that the feathers that developed in a distinct region downstream of the leading-edge ice determined the horn location and angle. The angle at which feathers grew from the surface was also measured; results are shown for an airspeed of 150 kt, an MVD of 30 microns, and stagnation freezing fractions of 0.30 to 0.60. Feather angles were found to depend strongly on the stagnation freezing fraction and were independent of both chordwise position on the model and time into the spray. Feather angles also correlated well with horn angles. For these tests, there did not appear to be fundamental differences between the physics of SLD and Appendix C icing; therefore, for these conditions the similarity parameters used for Appendix C scaling appear to be valid for SLD scaling as well. Further investigation into the cause of the large feather structures observed for some SLD conditions will continue.

  14. Small-Scale-Field Dynamo

    SciTech Connect

    Gruzinov, A.; Cowley, S.; Sudan, R. ||

    1996-11-01

    Generation of magnetic field energy, without mean field generation, is studied. Isotropic mirror-symmetric turbulence of a conducting fluid amplifies the energy of small-scale magnetic perturbations if the magnetic Reynolds number is high and the dimensionality of space d satisfies 2.103 < d < 8.765. The result does not depend on the model of turbulence, incompressibility and isotropy being the only requirements. © 1996 The American Physical Society.

  15. Scaling properties of lithographic VCSELs

    NASA Astrophysics Data System (ADS)

    Demir, Abdullah; Zhao, Guowei; Freisem, Sabine; Liu, Xiaohang; Deppe, Dennis G.

    2011-03-01

    Data are presented demonstrating lithographic vertical-cavity surface-emitting lasers (VCSELs) and their scaling properties. Lithographic VCSELs have simultaneous mode and current confinement defined only by lithography and epitaxial crystal growth. The lithographic process yields uniform device sizes across a wafer and easy scaling to very small lasers. The semiconductor's high thermal conductivity enables the small lithographic VCSEL to have lower thermal resistance than an oxide-aperture VCSEL, while the lithographic fabrication produces high VCSEL uniformity even at small size. Very dense packing is also possible. Devices of 3 μm to 20 μm diameters are fabricated and their scaling properties characterized. 3 μm lithographic VCSELs produce an output power of 4.1 mW, with a threshold current of 260 μA and a slope efficiency of 0.76 W/A at an emission wavelength of ~980 nm. These VCSELs also lase in a single mode and single polarization without the use of a surface grating, and have >25 dB side-mode suppression ratio up to 1 mW of output power. Lifetime tests demonstrate that the 3 μm VCSEL operates for hundreds of hours at a high injection current level of 85 kA/cm2 with 3.7 mW output power without degradation. The scaling properties and low thermal resistance of lithographic VCSELs can extend VCSEL technology to manufacturable and reliable small-size lasers and densely packed arrays with long device lifetime.

  16. Source Code Analysis Laboratory (SCALe)

    DTIC Science & Technology

    2012-04-01

    revenue. Among respondents to the IAAR survey, 86% of companies certified in quality management realized a positive return on investment (ROI). ... SCALe undertakes. Testing and calibration laboratories that comply with ISO/IEC 17025 also operate in accordance with ISO 9001. NIST National ... 17025:2005 accredited and ISO 9001:2008 registered. 4.3 SAIC Accreditation and Certification Services. SAIC (Science Applications International

  17. Noisy scale-free networks

    NASA Astrophysics Data System (ADS)

    Scholz, Jan; Dejori, Mathäus; Stetter, Martin; Greiner, Martin

    2005-05-01

    The impact of observational noise on the analysis of scale-free networks is studied. Various noise sources are modeled as random link removal, random link exchange and random link addition. Emphasis is on the resulting modifications for the node-degree distribution and for a functional ranking based on betweenness centrality. The implications for estimated gene-expressed networks for childhood acute lymphoblastic leukemia are discussed.
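The noise sources named above are simple random operations on an edge list; a minimal sketch of random link removal and its effect on node degrees, using a toy star graph rather than an estimated gene-expression network:

```python
import random
from collections import Counter

def degree_counts(edges):
    # node -> degree for an undirected edge list
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

def remove_links(edges, fraction, rng):
    # random link removal: drop each edge independently with probability `fraction`
    return [e for e in edges if rng.random() >= fraction]

# toy star graph: hub node 0 connected to 5 leaves
edges = [(0, i) for i in range(1, 6)]
noisy = remove_links(edges, 0.4, random.Random(42))
```

Comparing `degree_counts(edges)` with `degree_counts(noisy)` shows how the hub's apparent degree, and hence any betweenness-style ranking, degrades under this noise model; link exchange and addition can be coded analogously.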

  18. Shift and Scale Invariant Preprocessor.

    DTIC Science & Technology

    1981-12-01

    Thesis: Shift and Scale Invariant Preprocessor, by Norman E. Huston, Jr., Naval Postgraduate School, December 1981. Thesis Advisor: L. A. Wilson. Approved for public release. ... large range of problems/disciplines. Fields where it is particularly common include optical imagery, acoustic signal processing, radiology, radio

  19. Multi-Scale Autoregressive Processes

    DTIC Science & Technology

    1989-06-01

    [11] "...rationnelles et leurs langages" ("...rational [series] and their languages"), Masson, 1984, collection "Etudes et Recherches en Informatique". [12] J.L. Dunau, "Etude d'une classe de marches..." ("Study of a class of walks..."). June 1989, LIDS-P-1880. Multi-Scale Autoregressive Processes, by Michele Basseville and Albert Benveniste, Institut de Recherche en Informatique et Systemes Aleatoires (IRISA); M.B. is also with the Centre National de la Recherche Scientifique (CNRS), and A.B. is also with the Institut National de Recherche en Informatique et en Automatique (INRIA).

  20. Development of a Facebook Addiction Scale.

    PubMed

    Andreassen, Cecilie Schou; Torsheim, Torbjørn; Brunborg, Geir Scott; Pallesen, Ståle

    2012-04-01

    The Bergen Facebook Addiction Scale (BFAS), initially a pool of 18 items, three reflecting each of the six core elements of addiction (salience, mood modification, tolerance, withdrawal, conflict, and relapse), was constructed and administered to 423 students together with several other standardized self-report scales (Addictive Tendencies Scale, Online Sociability Scale, Facebook Attitude Scale, NEO-FFI, BIS/BAS scales, and Sleep questions). The item within each of the six addiction elements with the highest corrected item-total correlation was retained in the final scale. The factor structure of the scale was good (RMSEA = .046, CFI = .99) and coefficient alpha was .83. The 3-week test-retest reliability coefficient was .82. The scores converged with scores for other scales of Facebook activity. Also, they were positively related to Neuroticism and Extraversion, and negatively related to Conscientiousness. High scores on the new scale were associated with delayed bedtimes and rising times.
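The item-selection rule described (keep, per addiction element, the item with the highest corrected item-total correlation) can be sketched in a few lines; the response matrix and facet layout below are illustrative, not the BFAS dataset:

```python
def pearson(x, y):
    # plain Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def corrected_item_total(responses, item):
    # correlation of one item with the total score of all *other* items
    item_scores = [row[item] for row in responses]
    rest_totals = [sum(row) - row[item] for row in responses]
    return pearson(item_scores, rest_totals)

def select_items(responses, facets):
    # per facet (list of item column indices), keep the item with the
    # highest corrected item-total correlation
    return {name: max(items, key=lambda i: corrected_item_total(responses, i))
            for name, items in facets.items()}

# illustrative data: 5 respondents x 4 items; item 0 is noise, item 1 tracks the rest
responses = [[1, 1, 1, 1], [5, 2, 2, 2], [1, 3, 3, 3], [5, 4, 4, 4], [1, 5, 5, 5]]
facets = {"salience": [0, 1]}
```

Here `select_items(responses, facets)` keeps item 1 for the "salience" facet, since the noisy item 0 barely correlates with the remainder of the pool.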

  1. An investigation of ride quality rating scales

    NASA Technical Reports Server (NTRS)

    Dempsey, T. K.; Coates, G. D.; Leatherwood, J. D.

    1977-01-01

    An experimental investigation was conducted for the combined purposes of determining the relative merits of various category scales for the prediction of human discomfort response to vibration and of determining the mathematical relationships whereby subjective data are transformed from one scale to other scales. Sixteen category scales were analyzed, representing various parametric combinations of polarity (unipolar vs. bipolar), scale type, and number of scalar points. Results indicated that unipolar continuous-type scales containing either seven or nine scalar points provide the greatest reliability and discriminability. Transformations of subjective data between category scales were found to be feasible, with unipolar scales having a larger number of scalar points providing the greatest accuracy of transformation. The results contain coefficients for transformation of subjective data between the category scales investigated. A result of particular interest was that the comfort half of a bipolar scale was seldom used by subjects to describe their subjective reaction to vibration.
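Transformation coefficients between two category scales of the kind reported can be obtained by fitting paired ratings; a minimal sketch using ordinary least squares on synthetic data (the study's actual coefficients are not reproduced here):

```python
def fit_transform(x, y):
    """Least-squares coefficients (a, b) mapping ratings on scale x
    to scale y via y ~ a + b*x (closed-form simple linear regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# illustrative paired ratings: a 7-point scale vs. a wider scale
seven_point = [1, 2, 3, 4, 5, 6, 7]
wide_scale = [3, 5, 7, 9, 11, 13, 15]
a, b = fit_transform(seven_point, wide_scale)  # a = 1.0, b = 2.0 for this data
```

A rating of 4 on the first scale then maps to `a + b * 4`; with more scalar points on the source scale, such fits lose less information, consistent with the finding quoted above.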

  2. A Lab-Scale CELSS

    NASA Technical Reports Server (NTRS)

    Flynn, Mark E.; Finn, Cory K.; Srinivasan, Venkatesh; Sun, Sidney; Harper, Lynn D. (Technical Monitor)

    1994-01-01

    It has been shown that prohibitive resupply costs for extended-duration manned space flight missions will demand that a high degree of recycling and in situ food production be implemented. A prime candidate for in situ food production is the growth of higher level plants. Research in the area of plant physiology is currently underway at many institutions. This research is aimed at the characterization and optimization of gas exchange, transpiration and food production of higher plants in order to support human life in space. However, there are a number of unresolved issues involved in making plant chambers an integral part of a closed life support system. For example, issues pertaining to the integration of tightly coupled, non-linear systems with small buffer volumes will need to be better understood in order to ensure successful long term operation of a Controlled Ecological Life Support System (CELSS). The Advanced Life Support Division at NASA Ames Research Center has embarked on a program to explore some of these issues and demonstrate the feasibility of the CELSS concept. The primary goal of the Laboratory Scale CELSS Project is to develop a fully-functioning integrated CELSS on a laboratory scale in order to provide insight, knowledge and experience applicable to the design of human-rated CELSS facilities. Phase I of this program involves the integration of a plant chamber with a solid waste processor. This paper will describe the requirements, design and some experimental results from Phase I of the Laboratory Scale CELSS Program.

  3. Flavor from the electroweak scale

    DOE PAGES

    Bauer, Martin; Carena, Marcela; Gemmler, Katrin

    2015-11-04

    We discuss the possibility that flavor hierarchies arise from the electroweak scale in a two Higgs doublet model, in which the two Higgs doublets jointly act as the flavon. Quark masses and mixing angles are explained by effective Yukawa couplings, generated by higher dimensional operators involving quarks and Higgs doublets. Modified Higgs couplings yield important effects on the production cross sections and decay rates of the light Standard Model like Higgs. In addition, flavor changing neutral currents arise at tree-level and lead to strong constraints from meson-antimeson mixing. Remarkably, flavor constraints turn out to prefer a region in parameter space that is in excellent agreement with the one preferred by recent Higgs precision measurements at the Large Hadron Collider (LHC). Direct searches for extra scalars at the LHC lead to further constraints. Precise predictions for the production and decay modes of the additional Higgs bosons are derived, and we present benchmark scenarios for searches at the LHC Run II. As a result, flavor breaking at the electroweak scale as well as strong coupling effects demand a UV completion at the scale of a few TeV, possibly within the reach of the LHC.

  4. Temporal scaling in information propagation

    NASA Astrophysics Data System (ADS)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.
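The reported power-law decay of propagation probability with interaction latency can be estimated from data by a log-log fit; a minimal sketch with synthetic latencies rather than the social-media dataset (the exponent here is illustrative):

```python
import math

def powerlaw_exponent(latencies, probs):
    """Slope of log(prob) vs. log(latency): for p(t) ~ t**(-alpha),
    returns the estimate of alpha via simple linear regression."""
    xs = [math.log(t) for t in latencies]
    ys = [math.log(p) for p in probs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# synthetic data exactly following p(t) = t**-1.5
latencies = [1, 2, 4, 8]
probs = [t ** -1.5 for t in latencies]
alpha = powerlaw_exponent(latencies, probs)  # ~1.5
```

With a fitted `alpha`, future propagation probabilities between a pair of individuals can be extrapolated from the time since their latest interaction, which is the idea behind the temporal model the abstract describes.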

  5. Flavor from the electroweak scale

    SciTech Connect

    Bauer, Martin; Carena, Marcela; Gemmler, Katrin

    2015-11-04

    We discuss the possibility that flavor hierarchies arise from the electroweak scale in a two Higgs doublet model, in which the two Higgs doublets jointly act as the flavon. Quark masses and mixing angles are explained by effective Yukawa couplings, generated by higher dimensional operators involving quarks and Higgs doublets. Modified Higgs couplings yield important effects on the production cross sections and decay rates of the light Standard Model like Higgs. In addition, flavor changing neutral currents arise at tree-level and lead to strong constraints from meson-antimeson mixing. Remarkably, flavor constraints turn out to prefer a region in parameter space that is in excellent agreement with the one preferred by recent Higgs precision measurements at the Large Hadron Collider (LHC). Direct searches for extra scalars at the LHC lead to further constraints. Precise predictions for the production and decay modes of the additional Higgs bosons are derived, and we present benchmark scenarios for searches at the LHC Run II. As a result, flavor breaking at the electroweak scale as well as strong coupling effects demand a UV completion at the scale of a few TeV, possibly within the reach of the LHC.

  6. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of several thousand processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  7. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-06-01

    In many high-temperature fossil energy systems, corrosion and deleterious environmental effects arising from reactions with reactive gases and condensible products often compromise materials performance and, as a consequence, degrade operating efficiencies. Protection of materials from such reactions is best afforded by the formation of stable surface oxides (either as deposited coatings or thermally grown scales) that are slowly reacting, continuous, dense, and adherent to the substrate. However, the ability of normally brittle ceramic films and coatings to provide such protection has long been problematical, particularly for applications involving numerous or severe high-temperature thermal cycles or very aggressive (for example, sulfidizing) environments. A satisfactory understanding of how scale and coating integrity and adherence are improved by compositional, microstructural, and processing modifications is lacking. Therefore, to address this issue, the present work is intended to define the relationships between substrate characteristics (composition, microstructure, and mechanical behavior) and the structure and protective properties of deposited oxide coatings and/or thermally grown scales. Such information is crucial to the optimization of the chemical, interfacial, and mechanical properties of the protective oxides on high-temperature materials through control of processing and composition and directly supports the development of corrosion-resistant, high-temperature materials for improved energy and environmental control systems.

  8. Transition physics and scaling overview

    SciTech Connect

    Carlstrom, T.N.

    1995-12-01

    This paper presents an overview of recent experimental progress towards understanding H-mode transition physics and scaling. Terminology and techniques for studying H-mode are reviewed and discussed. The model of shear E x B flow stabilization of edge fluctuations at the L-H transition is gaining wide acceptance and is further supported by observations of edge rotation on a number of new devices. Observations of poloidal asymmetries of edge fluctuations and dephasing of density and potential fluctuations after the transition pose interesting challenges for understanding H-mode physics. Dedicated scans to determine the scaling of the power threshold have now been performed on many machines. A clear B_t dependence is universally observed, but the dependence on the line-averaged density is complicated. Other dependencies are also reported. Studies of the effect of neutrals and error fields on the power threshold are under investigation. The ITER threshold database has matured and offers guidance on the power-threshold scaling issues relevant to next-step devices.

  9. Temporal scaling in information propagation.

    PubMed

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-18

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.

  10. The scaling of secondary craters

    NASA Technical Reports Server (NTRS)

    Croft, Steven K.

    1991-01-01

    Secondary craters are common features around fresh planetary-scale primary impact craters throughout most of the Solar System. They derive from the ejection phase of crater formation, thus secondary scaling relations provide constraints on parameters affecting ejection processes. Secondary crater fields typically begin at the edge of the continuous ejecta blankets (CEB) and extend out several crater radii. Secondaries tend to have rounded rims and bilateral symmetry about an axis through the primary crater's center. Prominent secondary chains can extend inward across the CEB close to the rim. A simple method for comparing secondary crater fields was employed: averaging the diameters and ranges from the center of the primary crater of the five largest craters in a secondary crater field. While not as much information is obtained about individual crater fields by this method as in more complete secondary field mapping, it facilitates rapid comparison of many secondary fields. Also, by quantifying a few specific aspects of the secondary crater field, this method can be used to construct scaling relations for secondary craters.
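The comparison method described (average the diameters and ranges of the five largest secondaries) is straightforward to express; a sketch with hypothetical crater measurements, ranges normalized by the primary's radius:

```python
def five_largest_summary(craters, primary_radius):
    """craters: list of (diameter, range_from_primary_center) pairs in
    consistent units. Returns (mean diameter of the five largest,
    mean range of those five in units of primary radii)."""
    top5 = sorted(craters, key=lambda c: c[0], reverse=True)[:5]
    mean_d = sum(d for d, _ in top5) / len(top5)
    mean_r = sum(r for _, r in top5) / len(top5)
    return mean_d, mean_r / primary_radius

# hypothetical secondary field: diameters 1..7 km at ranges 10..70 km
craters = [(d, 10 * d) for d in range(1, 8)]
summary = five_largest_summary(craters, primary_radius=10)
```

Applying the same reduction to many secondary fields gives one (diameter, range) point per primary, from which scaling relations across primary sizes can be regressed, as the abstract suggests.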

  11. Dystonia rating scales: critique and recommendations

    PubMed Central

    Albanese, Alberto; Sorbo, Francesca Del; Comella, Cynthia; Jinnah, H.A.; Mink, Jonathan W.; Post, Bart; Vidailhet, Marie; Volkmann, Jens; Warner, Thomas T.; Leentjens, Albert F.G.; Martinez-Martin, Pablo; Stebbins, Glenn T.; Goetz, Christopher G.; Schrag, Anette

    2014-01-01

    Background: Many rating scales have been applied to the evaluation of dystonia, but only a few have been assessed for clinimetric properties. The Movement Disorders Society commissioned this task force to critique existing dystonia rating scales and place them in the clinical and clinimetric context. Methods: A systematic literature review was conducted to identify rating scales that have either been validated or used in dystonia. Results: Thirty-six potential scales were identified. Eight were excluded because they did not meet review criteria, leaving twenty-eight scales that were critiqued and rated by the task force. Seven scales were found to meet criteria to be “recommended”: the Blepharospasm Disability Index is recommended for rating blepharospasm; the Cervical Dystonia Impact Scale and the Toronto Western Spasmodic Torticollis Rating Scale for rating cervical dystonia; the Craniocervical Dystonia Questionnaire for blepharospasm and cervical dystonia; the Voice Handicap Index (VHI) and the Vocal Performance Questionnaire (VPQ) for laryngeal dystonia; and the Fahn-Marsden Dystonia Rating Scale for rating generalized dystonia. Two “recommended” scales (VHI and VPQ) are generic scales validated on few patients with laryngeal dystonia, whereas the others are disease-specific scales. Twelve scales met criteria for “suggested” and seven scales met criteria for “listed”. All the scales are individually reviewed in the online appendix. Conclusion: The task force recommends five specific dystonia scales and suggests further validation in dystonia of the two recommended generic voice-disorder scales. Existing scales for oromandibular, arm and task-specific dystonia should be refined and fully assessed. Scales should be developed for body regions where no scales are available, such as lower limbs and trunk. PMID:23893443

  12. Patch scales in coastal ecosystems

    NASA Astrophysics Data System (ADS)

    Broitman, Bernardo R.

    Quantifying the spatial and temporal scales over which ecological processes are coupled to environmental variability is a major challenge for ecologists. Here, I assimilate patterns of oceanographic variability with ecological field studies in an attempt to quantify spatial and temporal scales of coupling. Using coastal time series of chlorophyll-a concentration from remote sensing, the first chapter examines the alongshore extent of coastal regions subject to similar temporal patterns of oceanographic variability in Western North America (WNA) and North-Central Chile (Chile). I found striking interhemispherical differences in the length of coastal sections under similar oceanographic regimes, with the Chile region showing longshore coherency over much smaller spatial scales (˜60 km) than on the coast of WNA (˜140 km). Through a spatial analysis of coastal orientation I suggest that the characteristic length scales may be traced to the geomorphologic character of the ocean margins. The second chapter examines spatial patterns of primary production through long-term means of coastal chlorophyll-a concentration and kelp (Macrocystis pyrifera) cover and explores their relationship with coastal geomorphology and sea surface temperature (SST). Spatial analyses showed a striking match in length scales around 180--250 km. Strong anticorrelations at small spatial lags and positive correlations at longer distances suggest little overlap between patches of kelp and coastal chlorophyll-a. In agreement with findings from the previous chapter, I found that coastal patches could be traced back to spatial patterns of coastal geomorphology. Through SST time series and long-term datasets of larval recruitment in Santa Cruz Island, California, the third chapter examines temporal patterns of oceanographic variability as determinants of ecological patterns. SST time series from sites experiencing low larval recruitment rates were dominated by strong temporal variability. 
These sites

  13. Evaluating the impact of farm scale innovation at catchment scale

    NASA Astrophysics Data System (ADS)

    van Breda, Phelia; De Clercq, Willem; Vlok, Pieter; Querner, Erik

    2014-05-01

    Hydrological modelling lends itself to other disciplines very well, normally as a process based system that acts as a catalogue of events taking place. These hydrological models are spatial-temporal in their design and are generally well suited for what-if situations in other disciplines. Scaling should therefore be a function of the purpose of the modelling. Process is always linked with scale or support but the temporal resolution can affect the results if the spatial scale is not suitable. The use of hydrological response units tends to lump area around physical features but disregards farm boundaries. Farm boundaries are often the more crucial uppermost resolution needed to gain more value from hydrological modelling. In the Letaba Catchment of South Africa, we find a generous portion of landuses, different models of ownership, different farming systems ranging from large commercial farms to small subsistence farming. All of these have the same basic right to water but water distribution in the catchment is somewhat of a problem. Since water quantity is also a problem, the water supply systems need to take into account that valuable production areas not be left without water. Clearly hydrological modelling should therefore be sensitive to specific landuse. As a measure of productivity, a system of small farmer production evaluation was designed. This activity presents a dynamic system outside hydrological modelling that is generally not being considered inside hydrological modelling but depends on hydrological modelling. For sustainable development, a number of important concepts needed to be aligned with activities in this region, and the regulatory actions also need to be adhered to. This study aimed at aligning the activities in a region to the vision and objectives of the regulatory authorities. South Africa's system of socio-economic development planning is complex and mostly ineffective. 
There are many regulatory authorities involved, often with unclear

  14. Derivation of physically motivated wind speed scales

    NASA Astrophysics Data System (ADS)

    Dotzek, Nikolai

    A class of new wind speed scales is proposed in which the relevant scaling factors are derived from physical quantities like mass flux density, energy density (pressure), or energy flux density. Hence, they are called Energy- or E-scales, and can be applied to wind speeds of any intensity. It is shown that the Mach scale is a special case of an E-scale. Aside from its foundation in physical quantities which allow for a calibration of the scales, the E-scale concept can help to overcome the present plethora of scales for winds in the range from gale to hurricane intensity. A procedure to convert existing data based on the Fujita-scale or other scales (Saffir-Simpson, TORRO, Beaufort) to their corresponding E-scales is outlined. Even for the large US tornado record, the workload of conversion in case of an adoption of the E-scale would in principle remain manageable (if the necessary metadata to do so were available), as primarily the F5 events would have to be re-rated. Compared to damage scales like the "Enhanced Fujita" or EF-scale concept recently implemented in the USA, the E-scales are based on first principles. They can consistently be applied all over the world for the purpose of climatological homogeneity. To account for international variations in building characteristics, one should not adapt wind speed scale thresholds to certain national building characteristics. Instead, one worldwide applicable wind speed scale based on physical principles should rather be complemented by nationally-adapted damage descriptions. The E-scale concept can provide the basis for such a standardised wind speed scale.
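The abstract does not reproduce the paper's E-scale formulas, but the calibration quantities it names scale simply with wind speed v: mass flux density goes as v, energy density (dynamic pressure) as v^2, and energy flux density as v^3. An illustrative sketch (the sea-level air density value is an assumption, not from the paper):

```python
RHO_AIR = 1.225  # kg/m^3, sea-level air density (illustrative assumption)

def mass_flux_density(v):
    # kg m^-2 s^-1, proportional to v
    return RHO_AIR * v

def energy_density(v):
    # Pa (dynamic pressure), proportional to v^2
    return 0.5 * RHO_AIR * v ** 2

def energy_flux_density(v):
    # W m^-2, proportional to v^3
    return 0.5 * RHO_AIR * v ** 3
```

Doubling the wind speed quadruples the energy density and multiplies the energy flux density by eight, which is why scale factors derived from these physical quantities, rather than from damage descriptions, yield thresholds that can be calibrated and applied worldwide.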

  15. Preliminary Scaling Estimate for Select Small Scale Mixing Demonstration Tests

    SciTech Connect

    Wells, Beric E.; Fort, James A.; Gauglitz, Phillip A.; Rector, David R.; Schonewill, Philip P.

    2013-09-12

    The Hanford Site double-shell tank (DST) system provides the staging location for waste that will be transferred to the Hanford Tank Waste Treatment and Immobilization Plant (WTP). Specific WTP acceptance criteria for waste feed delivery describe the physical and chemical characteristics of the waste that must be met before the waste is transferred from the DSTs to the WTP. One of the more challenging requirements relates to the sampling and characterization of the undissolved solids (UDS) in a waste feed DST because the waste contains solid particles that settle and their concentration and relative proportion can change during the transfer of the waste in individual batches. A key uncertainty in the waste feed delivery system is the potential variation in UDS transferred in individual batches in comparison to an initial sample used for evaluating the acceptance criteria. To address this uncertainty, a number of small-scale mixing tests have been conducted as part of Washington River Protection Solutions’ Small Scale Mixing Demonstration (SSMD) project to determine the performance of the DST mixing and sampling systems.

  16. Global scale deposition of radioactivity from a large scale exchange

    SciTech Connect

    Knox, J.B.

    1983-10-01

    The global impact of radioactivity pertains to the continental-scale and planetary-scale deposition of the radioactivity in a delayed mode; it affects all peoples. Global deposition is distinct and separate from close-in fallout. Close-in fallout is delivered in a matter of a few days or less and is much studied in the literature of civilian defense, but the matter of global deposition is much less studied. The global deposition of radioactivity from the reference strategic exchange (5300 MT) leads to an estimated average whole-body, total integrated dose of 20 rem for the latitudes of 30 to 50 deg in the Northern Hemisphere. Hotspots of deposited radioactivity can occur with doses of about 70 rem (winter) to 40 to 110 rem (summer) in regions like Europe, western Asia, the western North Pacific, the southeastern US, the northeastern US, and Canada. The neighboring countries within a few hundred kilometers of areas under strategic nuclear attack can be impacted by the normal (termed close-in) fallout due to gravitational sedimentation, with lethal radiation doses to unsheltered populations. In regard to the strategic scenario, about 40% of the megatonnage is assumed to be in a surface-burst mode and the rest in the free-air-burst mode.

  17. Computational Analysis of the Large Scale Low-Boom Supersonic Inlet

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.

    2011-01-01

    This presentation describes two computational fluid dynamic (CFD) analyses done in support of a supersonic inlet test performed at NASA Glenn Research Center in the fall of 2010. The large-scale-low-boom supersonic inlet was designed for a small supersonic aircraft that would cruise at a Mach number of 1.6. It uses an axisymmetric, external compression spike to reduce the Mach number to 0.65 at the fan face. The inlet was tested in the 8x6 supersonic wind tunnel at NASA GRC using conventional pressure probes, pressure sensitive paint, and high-speed schlieren. Two CFD analyses of the inlet were performed before the test, and compared to the experimental data afterwards. Both analyses used the WIND-US code. First, an axisymmetric analysis of the inlet, diffuser, cold pipe, and mass flow plug was performed to predict the performance of the entire system in the wind tunnel. Then a 3-D analysis of the inlet with all its interior struts was performed to predict details of the flow field and effects of angle of attack. Test results showed that the inlet had excellent performance, with a peak total pressure recovery of 96 percent, and a buzz point far outside the engine operating range. The computations agreed very well with the data, with predicted recoveries within 0.3 - 0.5 points of the measurements.

  18. Mineral scale in gravel packed wells

    SciTech Connect

    Schmidt, T.; Soereide, F.

    1994-12-31

    Mineral scales of barium, strontium and calcium sulphate are well known to the oil industry. The most common scale is calcium carbonate. However, carbonate, unlike the three other scales mentioned, is acid soluble, and it is perhaps the sulphate scales that give the greatest problems. One additional feature of the sulphate scales is that they very often coprecipitate radium sulphate, which is radioactive, difficult to dispose of, and troublesome to work with from a health and safety standpoint. This paper presents the production history of gravel packed wells which have experienced the deposition and removal of mainly strontium sulphate (SrSO4) scale. A scale prediction program is used to analyze the scale tendencies under both equilibrium and kinetically controlled conditions. The flow and scale characteristics of gravel packed and naturally completed wells are compared.
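
    At its simplest, the scale-tendency calculation that prediction programs like the one described here perform reduces to a saturation-ratio check. The sketch below is illustrative only: the Ksp value is a nominal 25 °C figure, activity coefficients are taken as unity, and the brine concentrations are made up; real programs add temperature, pressure, and ionic-strength corrections.

```python
def saturation_ratio(m_cation, m_anion, ksp):
    """Saturation ratio SR = ion concentration product / solubility product.

    SR > 1 indicates a thermodynamic driving force for scale deposition.
    (Activity coefficients are assumed to be 1 here; field brines need
    ionic-strength corrections.)
    """
    return (m_cation * m_anion) / ksp

# Nominal SrSO4 solubility product at 25 C, mol^2/L^2 (illustrative value)
KSP_SRSO4 = 3.4e-7

# Hypothetical brine: 5 mM Sr2+ mixing with 2 mM sulphate
sr = saturation_ratio(5e-3, 2e-3, KSP_SRSO4)
print(sr > 1)  # True: supersaturated, so a SrSO4 scaling tendency exists
```

Kinetic control, which the paper also considers, would additionally ask how fast that supersaturation is relieved, not just whether it exists.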

  19. Dimensional Review of Scales for Forensic Photography.

    PubMed

    Ferrucci, Massimiliano; Doiron, Theodore D; Thompson, Robert M; Jones, John P; Freeman, Adam J; Neiman, Janice A

    2016-03-01

    Scales for photography provide a geometrical reference in the photographic documentation of a crime scene, pattern, or item of evidence. The ABFO No. 2 Standard Reference Scale (1) is used by the forensic science community as an accurate reference scale. We investigated the overall accuracy of the major centimeter graduations, internal/external diameters of the circles, error in placement of the circle centers, and leg perpendicularity. Four vendors were selected for the scales, and the features were measured on a vision-based coordinate measurement system. The scales were well within the specified tolerance for the length graduations. After 4 years, the same scales were measured to determine what change could be measured. The scales demonstrated acceptable stability in the scale length and center-to-center measurements; however, the perpendicularity exhibited change. The study results indicate that scale quality checks using certified metal rulers are good practice.

  20. Scaling on a limestone flooring

    NASA Astrophysics Data System (ADS)

    Carmona-Quiroga, P. M.; Blanco-Varela, M. T.; Martínez-Ramírez, S.

    2012-04-01

    Natural stone can be used on nearly every surface, inside and outside buildings, but decay is more commonly reported for stone exposed to aggressive outdoor conditions. This study, instead, is an example of limestone weathering of uncertain origin in the interior of a residential building. The stone, used as flooring, started to exhibit loss of material in the form of scaling. This damage was observed before the building, located in the south of Spain (Málaga), was inhabited. Moreover, according to the company, the limestone satisfies the European standards for floorings UNE-EN 1341:2002, UNE-EN 1343:2003 and UNE-EN 12058:2004. Under these circumstances the main objective of this study was to assess the causes of this phenomenon. For this reason the composition of the mortar was determined and the stone was characterized from a mineralogical and petrological point of view. The latter material, a fossiliferous limestone from Egypt with natural fissure lines, is mainly composed of calcite, with quartz, kaolinite and apatite as minor phases. Samples of weathered tiles, taken directly from the building, and of unweathered limestone tiles were examined with spectroscopic and microscopic techniques (FTIR, micro-Raman, SEM-EDX, etc.), and a new mineralogical phase, trona, was identified in the scaled areas, which are connected with the natural veins of the stone. In fact, BSE mapping detected the presence of sodium in these veins. This soluble sodium carbonate would have been dissolved in the natural waters from which the limestone was precipitated, migrated with rising capillary moisture, and crystallized near the surface of the stone, starting the scaling phenomenon, which in historic masonry could be very damaging. The weathering of the limestone would therefore be related to the hygroscopic behaviour of this salt, not to the construction methods used. This makes the limestone unsuitable for use in restoration.

  1. Optimal Scaling of Digital Transcriptomes

    PubMed Central

    Glusman, Gustavo; Caballero, Juan; Robinson, Max; Kutlu, Burak; Hood, Leroy

    2013-01-01

    Deep sequencing of transcriptomes has become an indispensable tool for biology, enabling expression levels for thousands of genes to be compared across multiple samples. Since transcript counts scale with sequencing depth, counts from different samples must be normalized to a common scale prior to comparison. We analyzed fifteen existing and novel algorithms for normalizing transcript counts, and evaluated the effectiveness of the resulting normalizations. For this purpose we defined two novel and mutually independent metrics: (1) the number of “uniform” genes (genes whose normalized expression levels have a sufficiently low coefficient of variation), and (2) low Spearman correlation between normalized expression profiles of gene pairs. We also defined four novel algorithms, one of which explicitly maximizes the number of uniform genes, and compared the performance of all fifteen algorithms. The two most commonly used methods (scaling to a fixed total value, or equalizing the expression of certain ‘housekeeping’ genes) yielded particularly poor results, surpassed even by normalization based on randomly selected gene sets. Conversely, seven of the algorithms approached what appears to be optimal normalization. Three of these algorithms rely on the identification of “ubiquitous” genes: genes expressed in all the samples studied, but never at very high or very low levels. We demonstrate that these include a “core” of genes expressed in many tissues in a mutually consistent pattern, which is suitable for use as an internal normalization guide. The new methods yield robustly normalized expression values, which is a prerequisite for the identification of differentially expressed and tissue-specific genes as potential biomarkers. PMID:24223126
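
    The paper's first evaluation metric — counting "uniform" genes after normalization — can be sketched as follows. The fixed-total normalization shown is the simple method the authors found suboptimal; it serves only to feed the metric. The CV threshold and the toy count matrix are assumptions for illustration, not values from the paper.

```python
import numpy as np

def normalize_total(counts):
    """Scale each sample (column) so its counts sum to 1e6 (counts per million)."""
    return counts / counts.sum(axis=0, keepdims=True) * 1e6

def n_uniform_genes(normed, cv_threshold=0.25):
    """Count 'uniform' genes: genes whose normalized expression has a
    coefficient of variation (std/mean) below the threshold across samples."""
    mean = normed.mean(axis=1)
    std = normed.std(axis=1)
    cv = np.divide(std, mean, out=np.full_like(mean, np.inf), where=mean > 0)
    return int((cv < cv_threshold).sum())

# Toy matrix: 5 genes x 3 samples. Rows 0, 1, 2, 4 track sequencing depth;
# row 3 varies erratically and should fail the uniformity test.
counts = np.array([
    [100, 200, 150],
    [ 10,  20,  15],
    [500, 990, 760],
    [  0,   5,   1],
    [ 50, 100,  75],
], dtype=float)

normed = normalize_total(counts)
print(n_uniform_genes(normed))  # 4: all genes but row 3 are 'uniform'
```

A normalization is judged better when it makes more genes uniform; the authors' best algorithms effectively search for the scaling factors that maximize this count.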

  2. Proposing a tornado watch scale

    NASA Astrophysics Data System (ADS)

    Mason, Jonathan Brock

    This thesis provides an overview of the language used in tornado safety recommendations from various sources, develops a rubric for scaled tornado safety recommendations, and then develops and tests a tornado watch scale. The rubric is used to evaluate tornado refuge/shelter adequacy responses of Tuscaloosa residents gathered following the April 27, 2011 Tuscaloosa, Alabama EF4 tornado. There was a significant difference in the counts of refuge adequacy for Tuscaloosa residents when holding the locations during the April 27th tornado constant and comparing adequacy ratings for weak (EF0-EF1), strong (EF2-EF3), and violent (EF4-EF5) tornadoes. There was also a significant difference when comparing future tornado refuge plans of those same participants to the adequacy ratings for weak, strong, and violent tornadoes. The tornado refuge rubric is then revised into a six-class, hierarchical Tornado Watch Scale (TWS) from Level 0 to Level 5 based on the likelihood of high-impact or low-impact severe weather events containing weak, strong, or violent tornadoes. These levels represent maximum expected tornado intensity and include tornado safety recommendations from the tornado refuge rubric. Audio recordings similar to those used in current National Oceanic and Atmospheric Administration (NOAA) weather radio communications were developed to correspond to three levels of the TWS, a current Storm Prediction Center (SPC) tornado watch, and a particularly dangerous situation (PDS) tornado watch. These were then used in interviews of Alabama residents to determine how changes to the information contained in the watch statements would affect each participant's tornado safety actions and perception of event danger. Results from interview participants (n=38) indicate a strong preference (97.37%) for the TWS when compared to current tornado watch and PDS tornado watch statements. Results also show that the TWS elicits more adequate safety decisions from participants.

  3. Time Scales in Particulate Systems

    NASA Astrophysics Data System (ADS)

    Zhang, Duan

    2013-06-01

    While there is much interest in studying the interactions of individual particles, the macroscopic collective behavior of particles is our main interest in many practical applications. In this talk, I will give a brief overview of the multiscale methods connecting the physics of individual particles to macroscopic quantities and averaged equations. The emphasis will be on dense dissipative particulate systems, such as powders. Unlike conservative particle systems, such as molecular systems, in a dissipative particle system the concept of thermodynamic equilibrium is not very useful except in very special cases, because the only true thermodynamic equilibrium state in these systems is the state in which nothing moves. Outside of idealized simple systems, mesoscale structures are common and important in many practical systems, especially dissipative ones. Spatial correlations of these mesoscale structures, such as force chains in dense granular systems and particle clusters and streamers in fluidized beds, have received some recent attention, partly because they can be visualized. This talk will emphasize the effects of time correlations related to the mesoscale structures. To consider time correlations and history information of the system, I will introduce the mathematical foundation of the Liouville equation, its applicability and limitations. I will derive the generalized Liouville equations for particulate systems with and without interstitial fluids, and then use them to study averaged transport equations and related closures. Interactions among the time scale of particle interactions, the time scale of the mesoscale structures, and the time scale of the physical problem as represented by the strain rate will be discussed. The effect of these interactions on the closure relations will be illustrated. I will also discuss possible numerical methods of solving the averaged equations, and multiscale numerical algorithms bridging the particle level calculations to

  4. Metastability at the nanometer scale

    SciTech Connect

    Desre, P.J.

    1996-12-31

    Under constraints and at the nanometer scale, transitory metastable states can be generated in multicomponent materials. Examples illustrating such specific states are presented. They concern (1) the crystalline nucleation in a growing undercooled liquid droplet formed from a liquid parent phase; (2) the suppression of intermetallic nucleation in solid solutions or glasses subjected to sharp concentration gradients; (3) the nanocrystalline transitory state preceding amorphization by ball milling. In connection with this latter example, a thermodynamic model for the nanocrystal-to-glass transition, based on a hypothesis of topological disorder wetting at the nanograin boundaries, is proposed.

  5. Scale-free convection theory

    NASA Astrophysics Data System (ADS)

    Pasetto, Stefano; Chiosi, Cesare; Cropper, Mark; Grebel, Eva K.

    2015-08-01

    Convection is one of the fundamental mechanisms to transport energy, e.g., in planetology, oceanography, as well as in astrophysics where stellar structure is customarily described by the mixing-length theory, which makes use of the mixing-length scale parameter to express the convective flux, velocity, and temperature gradients of the convective elements and stellar medium. The mixing-length scale is taken to be proportional to the local pressure scale height of the star, and the proportionality factor (the mixing-length parameter) must be determined by comparing the stellar models to some calibrator, usually the Sun. No strong arguments exist to claim that the mixing-length parameter is the same in all stars and all evolutionary phases. Because of this, all stellar models in the literature are hampered by this basic uncertainty. In a recent paper (Pasetto et al. 2014) we presented the first fully analytical scale-free theory of convection that does not require the mixing-length parameter. Our self-consistent analytical formulation of convection determines all the properties of convection as a function of the physical behaviour of the convective elements themselves and the surrounding medium (be it a star, an ocean, or a primordial planet). The new theory of convection is formulated starting from a conventional solution of the Navier-Stokes/Euler equations, i.e. the Bernoulli equation for a perfect fluid, but expressed in a non-inertial reference frame co-moving with the convective elements. In our formalism, the motion of convective cells inside convective-unstable layers is fully determined by a new system of equations for convection in a non-local and time dependent formalism. We obtained an analytical, non-local, time-dependent solution for the convective energy transport that does not depend on any free parameter. The predictions of the new theory in astrophysical environment are compared with those from the standard mixing-length paradigm in stars with

  6. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1929-01-01

    Modified propeller and spinner in Full-Scale Tunnel (FST) model. On June 26, 1929, Elton W. Miller wrote to George W. Lewis proposing the construction of a model of the full-scale tunnel. 'The excellent energy ratio obtained in the new wind tunnel of the California Institute of Technology suggests that before proceeding with our full scale tunnel design, we ought to investigate the effect on energy ratio of such factors as: 1. small included angle for the exit cone; 2. carefully designed return passages of circular section as far as possible, without sudden changes in cross sections; 3. tightness of walls. It is believed that much useful information can be obtained by building a model of about 1/16 scale, that is, having a closed throat of 2 ft. by 4 ft. The outside dimensions would be about 12 ft. by 25 ft. in plan and the height 4 ft. Two propellers will be required about 28 in. in diameter, each to be driven by direct current motor at a maximum speed of 4500 R.P.M. Provision can be made for altering the length of certain portions, particularly the exit cone, and possibly for the application of boundary layer control in order to effect satisfactory air flow. This model can be constructed in a comparatively short time, using 2 by 4 framing with matched sheathing inside, and where circular sections are desired they can be obtained by nailing sheet metal to wooden ribs, which can be cut on the band saw. It is estimated that three months will be required for the construction and testing of such a model and that the cost will be approximately three thousand dollars, one thousand dollars of which will be for the motors. No suitable location appears to exist in any of our present buildings, and it may be necessary to build it outside and cover it with a roof.' George Lewis responded immediately (June 27) granting the authority to proceed. He urged Langley to expedite construction and to employ extra carpenters if necessary. 
Funds for the model came from the FST project

  7. Outer scale of atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Lukin, Vladimir P.

    2005-10-01

    In the early 1970s, scientists in Italy (A. Consortini, M. Bertolotti, L. Ronchi), the USA (R. Buser, Ochs, S. Clifford) and the USSR (V. Pokasov, V. Lukin) almost simultaneously discovered the phenomenon of deviation from the power law and the effect of saturation for the structure phase function. Over a period of 35 years we have successively investigated the effect of the low-frequency spectral range of atmospheric turbulence on optical characteristics. The influence of turbulence models, as well as of the outer scale of turbulence, on the characteristics of telescopes and laser beam formation systems has also been determined.

  8. The NIST Length Scale Interferometer

    PubMed Central

    Beers, John S.; Penzes, William B.

    1999-01-01

    The National Institute of Standards and Technology (NIST) interferometer for measuring graduated length scales has been in use since 1965. It was developed in response to the redefinition of the meter in 1960 from the prototype platinum-iridium bar to the wavelength of light. The history of the interferometer is recalled, and its design and operation described. A continuous program of modernization by making physical modifications, measurement procedure changes and computational revisions is described, and the effects of these changes are evaluated. Results of a long-term measurement assurance program, the primary control on the measurement process, are presented, and improvements in measurement uncertainty are documented.

  9. New Scalings in Nuclear Fragmentation

    SciTech Connect

    Bonnet, E.; Bougault, R.; Galichet, E.; Gagnon-Moisan, F.; Guinet, D.; Lautesse, P.; Marini, P.; Parlog, M.

    2010-10-01

    Fragment partitions of fragmenting hot nuclei produced in central and semiperipheral collisions have been compared in the excitation energy region 4-10 MeV per nucleon where radial collective expansion takes place. It is shown that, for a given total excitation energy per nucleon, the amount of radial collective energy fixes the mean fragment multiplicity. It is also shown that, at a given total excitation energy per nucleon, the different properties of fragment partitions are completely determined by the reduced fragment multiplicity (i.e., normalized to the source size). Freeze-out volumes seem to play a role in the scalings observed.

  10. Universality and scaling in metamaterials

    NASA Astrophysics Data System (ADS)

    Felbacq, Didier

    2016-09-01

    It has been demonstrated by many theoretical and experimental works that Mie resonances are at the heart of the effective properties of dielectric metamaterials. These resonances indeed allow for the onset of tailorable macroscopic magnetic properties. They were shown to provide a convenient way to study the transition between photonic crystals and metamaterials. In the present work, we show that the band structure linked to these resonances is largely scale invariant and also, to some extent, robust with regard to disorder. These results do not rely heavily on a specific type of wave, suggesting that the same kind of results can be obtained for acoustic or gravity waves.

  11. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Modification of entrance cone Full-Scale Tunnel (FST). Smith DeFrance describes the entrance cone in NACA TR 459 as follows: 'The entrance cone is 75 feet in length and in this distance the cross section changes from a rectangle 72 by 110 feet to a 30 by 60 foot elliptic section. The area reduction in the entrance cone is slightly less than 5:1. The shape of the entrance cone was chosen to give as far as possible a constant acceleration to the air stream and to retain a 9-foot length of nozzle for directing the flow.' (p. 293)

  12. Drift-Scale Radionuclide Transport

    SciTech Connect

    P.R. Dixon

    2004-02-17

    The purpose of this Model Report is to document two models for drift-scale radionuclide transport. It has been developed in accordance with ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]), which includes planning documents for the technical work scope, content, and management of this Model Report in Section 1.15, Work Package AUZM11, ''Drift-Scale Radionuclide Transport.'' The technical work scope for this Model Report calls for development of a process-level model and an abstraction model representing diffusive release from the invert to the rocks, partitioned between fracture and matrix, as compared to the fracture-release approach used in the Site Recommendation. The invert is the structure constructed in a drift to provide the floor of that drift. The plan for validation of the models documented in this Model Report is given in Section I-5 of Attachment I in BSC (2002 [160819]). Note that the model validation presented in Section 7 deviates from the technical work plan (BSC 2002 [160819], Section I-5) in that an independent technical review specifically for model validation has not been conducted, nor has publication in a peer-reviewed journal taken place. Model validation presented in Section 7 is based on corroboration with alternative mathematical models, which is also called out by the technical work plan (BSC 2002 [160819], Section I-5), and is sufficient based on the requirements of AP-SIII.10Q for model validation. See Section 7 for additional discussion. The phenomena of flow and transport in the vicinity of the waste emplacement drift are evaluated in this model report under ambient thermal, chemical, and mechanical conditions. This includes the effects of water diversion around an emplacement drift and the flow and transport behavior expected in fractured rock below the drift.
The reason for a separate assessment of drift-scale transport is that the effects of waste emplacement drifts on flow

  13. Scale-free convection theory

    NASA Astrophysics Data System (ADS)

    Pasetto, Stefano; Chiosi, Cesare; Cropper, Mark; Grebel, Eva K.

    Convection is one of the fundamental mechanisms to transport energy, e.g., in planetology, oceanography, as well as in astrophysics where stellar structure is customarily described by the mixing-length theory, which makes use of the mixing-length scale parameter to express the convective flux, velocity, and temperature gradients of the convective elements and stellar medium. The mixing-length scale is taken to be proportional to the local pressure scale height of the star, and the proportionality factor (the mixing-length parameter) must be determined by comparing the stellar models to some calibrator, usually the Sun. No strong arguments exist to claim that the mixing-length parameter is the same in all stars and all evolutionary phases. Because of this, all stellar models in the literature are hampered by this basic uncertainty. In a recent paper (Pasetto et al. 2014) we presented the first fully analytical scale-free theory of convection that does not require the mixing-length parameter. Our self-consistent analytical formulation of convection determines all the properties of convection as a function of the physical behaviour of the convective elements themselves and the surrounding medium (be it a star, an ocean, or a primordial planet). The new theory of convection is formulated starting from a conventional solution of the Navier-Stokes/Euler equations, i.e. the Bernoulli equation for a perfect fluid, but expressed in a non-inertial reference frame co-moving with the convective elements. In our formalism, the motion of convective cells inside convective-unstable layers is fully determined by a new system of equations for convection in a non-local and time dependent formalism. We obtained an analytical, non-local, time-dependent solution for the convective energy transport that does not depend on any free parameter. The predictions of the new theory in astrophysical environment are compared with those from the standard mixing-length paradigm in stars with

  14. Drift-Scale Radionuclide Transport

    SciTech Connect

    J. Houseworth

    2004-09-22

    The purpose of this model report is to document the drift scale radionuclide transport model, taking into account the effects of emplacement drifts on flow and transport in the vicinity of the drift, which are not captured in the mountain-scale unsaturated zone (UZ) flow and transport models ''UZ Flow Models and Submodels'' (BSC 2004 [DIRS 169861]), ''Radionuclide Transport Models Under Ambient Conditions'' (BSC 2004 [DIRS 164500]), and ''Particle Tracking Model and Abstraction of Transport Process'' (BSC 2004 [DIRS 170041]). The drift scale radionuclide transport model is intended to be used as an alternative model for comparison with the engineered barrier system (EBS) radionuclide transport model ''EBS Radionuclide Transport Abstraction'' (BSC 2004 [DIRS 169868]). For that purpose, two alternative models have been developed for drift-scale radionuclide transport. One of the alternative models is a dual continuum flow and transport model called the drift shadow model. The effects of variations in the flow field and fracture-matrix interaction in the vicinity of a waste emplacement drift are investigated through sensitivity studies using the drift shadow model (Houseworth et al. 2003 [DIRS 164394]). In this model, the flow is significantly perturbed (reduced) beneath the waste emplacement drifts. However, comparisons of transport in this perturbed flow field with transport in an unperturbed flow field show similar results if the transport is initiated in the rock matrix. This has led to a second alternative model, called the fracture-matrix partitioning model, that focuses on the partitioning of radionuclide transport between the fractures and matrix upon exiting the waste emplacement drift. The fracture-matrix partitioning model computes the partitioning, between fractures and matrix, of diffusive radionuclide transport from the invert (for drifts without seepage) into the rock water. 
The invert is the structure constructed in a drift to provide the floor of the

  15. Identifying characteristic scales in the human genome

    NASA Astrophysics Data System (ADS)

    Carpena, P.; Bernaola-Galván, P.; Coronado, A. V.; Hackenberg, M.; Oliver, J. L.

    2007-03-01

    The scale-free, long-range correlations detected in DNA sequences contrast with characteristic lengths of genomic elements, being particularly incompatible with the isochores (long, homogeneous DNA segments). By computing the local behavior of the scaling exponent α of detrended fluctuation analysis (DFA), we discriminate between sequences with and without true scaling, and we find that no single scaling exists in the human genome. Instead, human chromosomes show a common compositional structure with two characteristic scales, the large one corresponding to the isochores and the other to small and medium scale genomic elements.
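
    Detrended fluctuation analysis, whose scaling exponent α underlies the result above, can be sketched in a few lines: integrate the mean-removed series, detrend it in windows of each size, and read α as the log-log slope of the fluctuation function. The window sizes and first-order (linear) detrending below are conventional choices, not taken from the paper.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Estimate the DFA scaling exponent alpha of a 1-D series."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())        # integrated, mean-removed series
    fluct = []
    for n in scales:
        m = len(profile) // n                # number of non-overlapping windows
        segs = profile[:m * n].reshape(m, n)
        t = np.arange(n)
        msq = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # local linear trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(msq)))  # F(n)
    # alpha is the slope of log F(n) vs log n
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(0)
white = rng.standard_normal(10_000)
print(dfa_alpha(white, [16, 32, 64, 128, 256]))  # ~0.5 for uncorrelated noise
```

Uncorrelated noise gives α ≈ 0.5 and long-range correlated series give α > 0.5; the authors' point is that a *local* α(n) exposes the two characteristic scales that a single global fit would hide.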

  16. Family health climate scale (FHC-scale): development and validation

    PubMed Central

    2014-01-01

    Background The family environment is important for explaining individual health behaviour. While previous research mostly focused on influences among family members and dyadic interactions (parent-child), the purpose of this study was to develop a new measure, the Family Health Climate Scale (FHC-Scale), using a family-based approach. The FHC is an attribute of the whole family and describes an aspect of the family environment that is related to health and health behaviour. Specifically, a questionnaire measuring the FHC (a) for nutrition (FHC-NU) and (b) for activity behaviour (FHC-PA) was developed and validated. Methods In Study 1 (N = 787) the FHC scales were refined and validated. The sample was randomly divided into two subsamples. With random sample I exploratory factor analyses were conducted and items were selected according to their psychometric quality. In a second step, confirmatory factor analyses were conducted using the random sample II. In Study 2 (N = 210 parental couples) the construct validity was tested by correlating the FHC to self-determined motivation of healthy eating and physical activity as well as the families’ food environment and joint physical activities. Results Exploratory factor analyses with random sample I (Study 1) revealed a four (FHC-NU) and a three (FHC-PA) factor model. These models were cross-validated with random sample II and demonstrated an acceptable fit [FHC-PA: χ2 = 222.69, df = 74, p < .01; χ2/df = 3.01; CFI = .96; SRMR = .04; RMSEA = .07, CI .06/.08; FHC-NU: χ2 = 278.30, df = 113, p < .01, χ2/df = 2.46, CFI = .96; SRMR = .04; RMSEA = .06, CI .05/.07]. The perception of FHC correlated (p < .01) with the intrinsic motivation of healthy eating (r = .42) and physical activity (r = .56). Moreover, parental perceptions of FHC-NU correlated with household soft drink availability (r = -.31) and perceptions of FHC-PA with the frequency of

  17. Scaling analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Bu, Luping; Shang, Pengjian

    2014-06-01

    In this paper, we apply detrended fluctuation analysis (DFA), local scaling detrended fluctuation analysis (LSDFA), and detrended cross-correlation analysis (DCCA) to investigate correlations in several stock markets. The DFA method detects long-range correlations in time series. The LSDFA method reveals more local properties by using local scale exponents. The DCCA method quantifies the cross-correlation of two non-stationary time series. We report the auto-correlation and cross-correlation behaviors of three Western and three Chinese stock markets in the periods 2004-2006 (before the global financial crisis), 2007-2009 (during the global financial crisis), and 2010-2012 (after the global financial crisis) using the DFA, LSDFA, and DCCA methods. We find that the correlations of stocks are influenced by the economic systems of different countries and by the financial crisis. The results indicate stronger auto-correlations in Chinese stocks than in Western stocks in every period, and stronger auto-correlations after the global financial crisis for every stock except Shen Cheng; the LSDFA shows more comprehensive and detailed features than the traditional DFA method, and the integration of China and the world economy after the global financial crisis. Turning to cross-correlations, the six stock markets show different properties, while the three Chinese stocks reach their weakest cross-correlations during the global financial crisis.
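
    DCCA output is often summarized as a detrended cross-correlation coefficient at a given window size. The sketch below is a minimal version under common conventions (non-overlapping windows, linear detrending); the synthetic series and window size are illustrative, not the paper's data or parameters.

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA at window size n.

    Both series are integrated, split into non-overlapping windows of
    length n, and linearly detrended per window; rho is the detrended
    covariance normalized by the two detrended variances.
    """
    px = np.cumsum(x - np.mean(x))
    py = np.cumsum(y - np.mean(y))
    m = len(px) // n
    t = np.arange(n)
    cov = varx = vary = 0.0
    for i in range(m):
        sx = px[i * n:(i + 1) * n]
        sy = py[i * n:(i + 1) * n]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)  # detrended residuals
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        cov += np.mean(rx * ry)
        varx += np.mean(rx ** 2)
        vary += np.mean(ry ** 2)
    return cov / np.sqrt(varx * vary)

# Two series sharing a common driver should show strong cross-correlation
rng = np.random.default_rng(1)
common = rng.standard_normal(5000)
x = common + 0.5 * rng.standard_normal(5000)
y = common + 0.5 * rng.standard_normal(5000)
print(dcca_coefficient(x, y, 64))  # strongly positive for a shared driver
```

Scanning rho over several window sizes, as the authors do across market periods, shows at which time scales two markets co-move.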

  18. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Wing and nacelle set-up in Full-Scale Tunnel (FST). The NACA conducted drag tests in 1931 on a P3M-1 nacelle which were presented in a special report to the Navy. Smith DeFrance described this work in the report's introduction: 'Tests were conducted in the full-scale wind tunnel on a five to four geared Pratt and Whitney Wasp engine mounted in a P3M-1 nacelle. In order to simulate the flight conditions the nacelle was assembled on a 15-foot span of wing from the same airplane. The purpose of the tests was to improve the cooling of the engine and to reduce the drag of the nacelle combination. Thermocouples were installed at various points on the cylinders and temperature readings were obtained from these by the power plants division. These results will be reported in a memorandum by that division. The drag results, which are covered by this memorandum, were obtained with the original nacelle condition as received from the Navy with the tail of the nacelle modified, with the nose section of the nacelle modified, with a Curtiss anti-drag ring attached to the engine, with a Type G ring developed by the N.A.C.A., and with a Type D cowling which was also developed by the N.A.C.A.' (p. 1)

  19. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  20. The Autonomy Over Smoking Scale.

    PubMed

    DiFranza, Joseph R; Wellman, Robert J; Ursprung, W W Sanouri A; Sabiston, Catherine

    2009-12-01

    Our goal was to create an instrument that can be used to study how smokers lose autonomy over smoking and regain it after quitting. The Autonomy Over Smoking Scale was produced through a process involving item generation, focus-group evaluation, testing in adults to winnow items, field testing with adults and adolescents, and head-to-head comparisons with other measures. The final 12-item scale shows excellent reliability (alphas = .91-.97), with a one-factor solution explaining 59% of the variance in adults and 61%-74% of the variance in adolescents. Concurrent validity was supported by associations with age of smoking initiation, lifetime use, smoking frequency, daily cigarette consumption, history of failed cessation, Hooked on Nicotine Checklist scores, and Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000) nicotine dependence criteria. Potentially useful features of this new instrument include (a) it assesses tobacco withdrawal, cue-induced craving, and psychological dependence on cigarettes; (b) it measures symptom intensity; and (c) it asks about current symptoms only, so it could be administered to quitting smokers to track the resolution of symptoms.
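
    The reliability coefficients reported above (alphas = .91-.97) are Cronbach's alpha values. As an illustration of how such a coefficient is computed from raw item responses, here is a minimal Python sketch; the response matrix below is hypothetical, not data from the study:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
responses = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 5, 5],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(responses), 2))  # → 0.96
```

    Alpha rises when items covary strongly relative to their individual variances, which is why a 12-item scale with a single dominant factor can reach the high values reported here.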

  1. Engineering scale electrostatic enclosure demonstration

    SciTech Connect

    Meyer, L.C.

    1993-09-01

    This report presents results from an engineering scale electrostatic enclosure demonstration test. The electrostatic enclosure is part of an overall in-depth contamination control strategy for transuranic (TRU) waste recovery operations. TRU contaminants include small particles of plutonium compounds associated with defense-related waste recovery operations. Demonstration test items consisted of an outer Perma-con enclosure, an inner tent enclosure, and a ventilation system test section for testing electrostatic curtain devices. Three interchangeable test fixtures that could remove plutonium from the contaminated dust were tested in the test section. These were an electret filter, a CRT as an electrostatic field source, and an electrically charged parallel plate separator. Enclosure materials tested included polyethylene, anti-static construction fabric, and stainless steel. The soil size distribution was determined using an eight stage cascade impactor. Photographs of particles containing plutonium were obtained with a scanning electron microscope (SEM). The SEM also provided a second method of getting the size distribution. The amount of plutonium removed from the aerosol by the electrostatic devices was determined by radiochemistry from input and output aerosol samplers. The inner and outer enclosures performed adequately for plutonium handling operations and could be used for full scale operations.

  2. Scaling device for photographic images

    NASA Technical Reports Server (NTRS)

    Rivera, Jorge E. (Inventor); Youngquist, Robert C. (Inventor); Cox, Robert B. (Inventor); Haskell, William D. (Inventor); Stevenson, Charles G. (Inventor)

    2005-01-01

    A scaling device projects a known optical pattern into the field of view of a camera, which can be employed as a reference scale in a resulting photograph of a remote object, for example. The device comprises an optical beam projector that projects two or more spaced, parallel optical beams onto a surface of a remotely located object to be photographed. The resulting beam spots or lines on the object are spaced from one another by a known, predetermined distance. As a result, the size of other objects or features in the photograph can be determined through comparison of their size to the known distance between the beam spots. Preferably, the device is a small, battery-powered device that can be attached to a camera and employs one or more laser light sources and associated optics to generate the parallel light beams. In a first embodiment of the invention, a single laser light source is employed, but multiple parallel beams are generated thereby through use of beam splitting optics. In another embodiment, multiple individual laser light sources are employed that are mounted in the device parallel to one another to generate the multiple parallel beams.
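
    The geometry behind the device is simple: because the projected beams are parallel, the spots stay a fixed, known distance apart regardless of how far away the surface is, so the spot spacing in the image gives a direct pixel-to-length conversion. A minimal sketch with hypothetical numbers (not values from the patent):

```python
# Hypothetical example: two parallel laser spots, projected a known
# 100 mm apart, appear 250 px apart in the photograph; a feature in
# the same image plane spans 180 px.
BEAM_SPACING_MM = 100.0
beam_spacing_px = 250.0
feature_px = 180.0

mm_per_px = BEAM_SPACING_MM / beam_spacing_px   # 0.4 mm per pixel
feature_mm = feature_px * mm_per_px
print(feature_mm)  # → 72.0
```

    The conversion is only exact for features lying in the same plane as the beam spots; objects nearer or farther than that plane would need a separate reference.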

  3. The weak scale from BBN

    NASA Astrophysics Data System (ADS)

    Hall, Lawrence J.; Pinner, David; Ruderman, Joshua T.

    2014-12-01

    The measured values of the weak scale, v, and the first generation masses, m_{u,d,e}, are simultaneously explained in the multiverse, with all these parameters scanning independently. At the same time, several remarkable coincidences are understood. Small variations in these parameters away from their measured values lead to the instability of hydrogen, the instability of heavy nuclei, and either a hydrogen or a helium dominated universe from Big Bang Nucleosynthesis. In the 4d parameter space of (m_u, m_d, m_e, v), catastrophic boundaries are reached by separately increasing each parameter above its measured value by a factor of (1.4, 1.3, 2.5, ~5), respectively. The fine-tuning problem of the weak scale in the Standard Model is solved: as v is increased beyond the observed value, it is impossible to maintain a significant cosmological hydrogen abundance for any values of m_{u,d,e} that yield both hydrogen and heavy nuclei stability.

  4. Goethite Bench-scale and Large-scale Preparation Tests

    SciTech Connect

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

    The Hanford Waste Treatment and Immobilization Plant (WTP) is the keystone for cleanup of high-level radioactive waste from our nation's nuclear defense program. The WTP will process high-level waste from the Hanford tanks and produce immobilized high-level waste glass for disposal at a national repository, low activity waste (LAW) glass, and liquid effluent from the vitrification off-gas scrubbers. The liquid effluent will be stabilized into a secondary waste form (e.g. grout-like material) and disposed on the Hanford site in the Integrated Disposal Facility (IDF) along with the low-activity waste glass. The major long-term environmental impact at Hanford results from technetium that volatilizes from the WTP melters and finally resides in the secondary waste. Laboratory studies have indicated that pertechnetate ({sup 99}TcO{sub 4}{sup -}) can be reduced and captured into a solid solution of {alpha}-FeOOH, goethite (Um 2010). Goethite is a stable mineral and can significantly retard the release of technetium to the environment from the IDF. The laboratory studies were conducted using reaction times of many days, which is typical of environmental subsurface reactions that were the genesis of this new process. This study was the first step in considering adaptation of the slow laboratory steps to a larger-scale and faster process that could be conducted either within the WTP or within the effluent treatment facility (ETF). Two levels of scale-up tests were conducted (25x and 400x). The largest scale-up produced slurries of Fe-rich precipitates that contained rhenium as a nonradioactive surrogate for {sup 99}Tc. The slurries were used in melter tests at Vitreous State Laboratory (VSL) to determine whether captured rhenium was less volatile in the vitrification process than rhenium in an unmodified feed. 
A critical step in the technetium immobilization process is to chemically reduce Tc(VII) in the pertechnetate (TcO{sub 4}{sup -}) to Tc(IV) by reaction with the ferrous

  5. Building Bridges from Micro-Scale to Macro-Scale

    NASA Astrophysics Data System (ADS)

    Harlow, Francis

    2002-04-01

    A major focus of research at the Los Alamos National Laboratory since its inception in 1943 has been to characterize very complex small-scale processes in terms of bulk constitutive relations that capture the essence of the collective behavior. Examples include the development of equations of state, the investigation of material mix at an unstable interface, the examination of metal pore growth with strong tensile stress, and characterization of the response of a polymeric foam to large-strain-rate insults. Bridging techniques include transport for probability-distribution-function evolution, and the use of Reynolds decomposition with moment closure. The research described in this presentation combines theoretical and experimental activities with model building for scientific and engineering computer codes.

  6. Scale Model Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.

    1997-01-01

    NASA Lewis Research Center's Icing Research Tunnel (IRT) is the world's largest refrigerated wind tunnel and one of only three icing wind tunnel facilities in the United States. The IRT was constructed in the 1940's and has been operated continually since it was built. In this facility, natural icing conditions are duplicated to test the effects of inflight icing on actual aircraft components as well as on models of airplanes and helicopters. IRT tests have been used successfully to reduce flight test hours for the certification of ice-detection instrumentation and ice protection systems. To ensure that the IRT will remain the world's premier icing facility well into the next century, Lewis is making some renovations and is planning others. These improvements include modernizing the control room, replacing the fan blades with new ones to increase the test section maximum velocity to 430 mph, installing new spray bars to increase the size and uniformity of the artificial icing cloud, and replacing the facility heat exchanger. Most of the improvements will have a first-order effect on the IRT's airflow quality. To help us understand these effects and evaluate potential improvements to the flow characteristics of the IRT, we built a modular 1/10th-scale aerodynamic model of the facility. This closed-loop scale-model pilot tunnel was fabricated onsite in the various shops of Lewis' Fabrication Support Division. The tunnel's rectangular sections are composed of acrylic walls supported by an aluminum angle framework. Its turning vanes are made of tubing machined to the contour of the IRT turning vanes. The fan leg of the tunnel, which transitions from rectangular to circular and back to rectangular cross sections, is fabricated of fiberglass sections. The contraction section of the tunnel is constructed from sheet aluminum. A 12-bladed aluminum fan is coupled to a turbine powered by high-pressure air capable of driving the maximum test section velocity to 550 ft

  7. Lagrange L4/L5 points and the origin of our Moon and Saturn's moons and rings.

    PubMed

    Gott, J Richard

    2005-12-01

    The current standard theory of the origin of the Moon is that the Earth was hit by a giant impactor the size of Mars causing ejection of debris from its mantle that coalesced to form the moon; but where did this Mars-sized impactor come from? Isotopic evidence suggests that it came from 1 AU radius in the solar nebula, and computer simulations are consistent with its approaching Earth on a zero-energy parabolic trajectory. How could such a large object form at 1 AU in a quiescent disk of planetesimals without having already collided with the Earth at an earlier epoch before having the chance to grow large? Belbruno and Gott propose that the giant impactor could have formed in a stable orbit from debris at the Earth's Lagrange point L(5) (or L(4)). It would grow quietly by accretion at L(5) (or L(4)), but eventually gravitational perturbations by other growing planetesimals would kick it out into a horseshoe orbit and finally into a chaotic creeping orbit, which Belbruno and Gott show would, with high probability, hit the Earth on a near zero-energy parabolic trajectory. We can see other examples of this phenomenon occurring in the solar system. Asteroid 2002AA29 is in a horseshoe orbit relative to the Earth that looks exactly like the horseshoe orbits that Belbruno and Gott found for objects that had been perturbed from L(4)/L(5). The regular moons of Saturn are made of ice and have the same albedo as the ring particles (ice chunks, plus some dust). We (J. R. Gott, R. Vanderbei, and E. Belbruno) propose that the regular icy moons of Saturn (out to the orbit of Titan), which are all in nearly circular orbits, formed out of a thin disk of planetesimals (ice chunks) rather like the rings of Saturn today only larger in extent. In such a situation formation of objects at L(4)/L(5) might be expected. 
Indeed, Saturn's moon Dione is accompanied by moons (Helene and Polydeuces) at both L(4) and L(5) Lagrange points, and Saturn's moon Tethys is also accompanied by moons (Telesto and Calypso) at both L(4) and L(5) Lagrange points. Epimetheus is in a horseshoe orbit relative to Janus that is exactly like the horseshoe orbit expected for an object that has been perturbed from a location at L(4)/L(5). We propose that the rings of Saturn visible today are all that remains of this original disk; they lie inside the Roche limit where tidal forces have simply prevented the formation of large moons by accretion. Further out, the icy particles have accumulated into icy moons. Objects in external solar systems on horseshoe orbits (like those of Epimetheus relative to Janus) could be detected by a slow sinusoidal variation with time of the calculated mass of a planet from radial velocity measurements.

  8. Regional-Scale Salt Tectonics Modelling: Bench-Scale Validation and Extension to Field-Scale

    NASA Astrophysics Data System (ADS)

    Crook, A. J. L.; Yu, J. G.; Thornton, D. A.

    2010-05-01

    The role of salt in the evolution of the West African continental margin, and in particular its impact on hydrocarbon migration and trap formation, is an important research topic. It has attracted many researchers who have based their research on bench-scale experiments, numerical models and seismic observations. This research has shown that the evolution is very complex. For example, regional analogue bench-scale models of the Angolan margin (Fort et al., 2004) indicate a complex system with an upslope extensional domain with sealed tilted blocks, growth fault and rollover systems and extensional diapirs, and a downslope contractional domain with squeezed diapirs, polyharmonic folds and thrust faults, and late-stage folding and thrusting. Numerical models have the potential to provide additional insight into the evolution of these salt-driven passive margins. The longer-term aim is to calibrate regional-scale evolution models, and then to evaluate the effect of the depositional history on the current-day geomechanical and hydrogeologic state in potential target hydrocarbon reservoir formations adjacent to individual salt bodies. To achieve this goal the burial and deformational history of the sediment must be modelled from initial deposition to the current-day state, while also accounting for the reaction and transport processes occurring in the margin. Accurate forward modelling is, however, complex, and necessitates advanced procedures for the prediction of fault formation and evolution, representation of the extreme deformations in the salt, and for coupling the geomechanical, fluid flow and temperature fields. The evolution of the sediment due to a combination of mechanical compaction, chemical compaction and creep relaxation must also be represented. In this paper ongoing research on a computational approach for forward modelling complex structural evolution, with particular reference to passive margins driven by salt tectonics is presented. 
The approach is an

  9. Scaling ansatz for the jamming transition

    PubMed Central

    Goodrich, Carl P.; Liu, Andrea J.; Sethna, James P.

    2016-01-01

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming. PMID:27512041
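
    A "Widom-like" ansatz of this kind follows the standard pattern of scaling hypotheses, in which a single homogeneity assumption ties many exponents together. As a generic illustration (the symbols below are the textbook Widom form for a free energy f with reduced temperature t and field h, not the jamming-specific variables of this paper):

```latex
f(t, h) = b^{-d}\, f\!\left(b^{y_t}\, t,\; b^{y_h}\, h\right)
\;\;\Longrightarrow\;\;
\alpha = 2 - \frac{d}{y_t}, \qquad
\beta = \frac{d - y_h}{y_t}, \qquad
\gamma = \frac{2 y_h - d}{y_t}
```

    In the textbook case two independent exponents (y_t, y_h) generate all the others; the jamming ansatz described in the abstract plays the same role with three.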

  10. The Adaptive Multi-scale Simulation Infrastructure

    SciTech Connect

    Tobin, William R.

    2015-09-01

    The Adaptive Multi-scale Simulation Infrastructure (AMSI) is a set of libraries and tools developed to support the development, implementation, and execution of general multimodel simulations. Using a minimal set of simulation meta-data AMSI allows for minimally intrusive work to adapt existent single-scale simulations for use in multi-scale simulations. Support for dynamic runtime operations such as single- and multi-scale adaptive properties is a key focus of AMSI. Particular focus has been spent on the development on scale-sensitive load balancing operations to allow single-scale simulations incorporated into a multi-scale simulation using AMSI to use standard load-balancing operations without affecting the integrity of the overall multi-scale simulation.

  11. Kernel method for corrections to scaling.

    PubMed

    Harada, Kenji

    2015-07-01

    Scaling analysis, in which one infers scaling exponents and a scaling function in a scaling law from given data, is a powerful tool for determining universal properties of critical phenomena in many fields of science. However, there are corrections to scaling in many cases, and then the inference problem becomes ill-posed by an uncontrollable irrelevant scaling variable. We propose a new kernel method based on Gaussian process regression to fix this problem generally. We test the performance of the new kernel method for some example cases. In all cases, when the precision of the example data increases, inference results of the new kernel method correctly converge. Because there is no limitation in the new kernel method for the scaling function even with corrections to scaling, unlike in the conventional method, the new kernel method can be widely applied to real data in critical phenomena.

  12. Scaling ansatz for the jamming transition.

    PubMed

    Goodrich, Carl P; Liu, Andrea J; Sethna, James P

    2016-08-30

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming.

  13. Small-Scale Rocket Motor Test

    NASA Video Gallery

    Engineers at NASA's Marshall Space Flight Center in Huntsville, Ala. successfully tested a sub-scale solid rocket motor on May 27. Testing a sub-scale version of a rocket motor is a cost-effective ...

  14. Scaling ansatz for the jamming transition

    NASA Astrophysics Data System (ADS)

    Goodrich, Carl P.; Liu, Andrea J.; Sethna, James P.

    2016-08-01

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming.

  15. Large-scale dynamics of magnetic helicity

    NASA Astrophysics Data System (ADS)

    Linkmann, Moritz; Dallas, Vassilios

    2016-11-01

    In this paper we investigate the dynamics of magnetic helicity in magnetohydrodynamic (MHD) turbulent flows focusing at scales larger than the forcing scale. Our results show a nonlocal inverse cascade of magnetic helicity, which occurs directly from the forcing scale into the largest scales of the magnetic field. We also observe that no magnetic helicity and no energy are transferred to an intermediate range of scales sufficiently smaller than the container size and larger than the forcing scale. Thus, the statistical properties of this range of scales, which increases with scale separation, are shown to be described to a large extent by the zero-flux solutions of the absolute statistical equilibrium theory exhibited by the truncated ideal MHD equations.

  16. Scale Interaction in a California precipitation event

    SciTech Connect

    Leach, M. J., LLNL

    1997-09-01

    Heavy rains and severe flooding frequently plague California. The heavy rains are most often associated with large scale cyclonic and frontal systems, where large scale dynamics and large moisture influx from the tropical Pacific interact. However, the complex topography along the west coast also interacts with the large scale influences, producing local areas with heavier precipitation. In this paper, we look at some of the local interactions with the large scale.

  17. The Suitability of Gray-Scale Electronic Readers for Dermatology Journals

    PubMed Central

    Choi, Jae Eun; Kim, Dai Hyun; Seo, Soo Hong; Kye, Young Chul

    2014-01-01

    Background The rapid development of information and communication technology has replaced traditional books by electronic versions. Most print dermatology journals have been replaced with electronic journals (e-journals), which are readily used by clinicians and medical students. Objective The objectives of this study were to determine whether e-readers are appropriate for reading dermatology journals, to conduct an attitude study of both medical personnel and students, and to find a way of improving e-book use in the field of dermatology. Methods All articles in the Korean Journal of Dermatology published from January 2010 to December 2010 were utilized in this study. Dermatology house officers, student trainees in their fourth year of medical school, and interns at Korea University Medical Center participated in the study. After reading the articles with Kindle 2, their impressions and evaluations were recorded using a questionnaire with a 5-point Likert scale. Results The results demonstrated that gray-scale e-readers might not be suitable for reading dermatology journals, especially for case reports compared to the original articles. Only three of the thirty-one respondents preferred e-readers to printed papers. The most common suggestions from respondents to encourage usage of e-books in the field of dermatology were the introduction of a color display, followed by the use of a touch screen system, a cheaper price, and ready-to-print capabilities. Conclusion In conclusion, our study demonstrated that current e-readers might not be suitable for reading dermatology journals. However, they may be utilized in selected situations according to the type and topic of the papers. PMID:25473221

  18. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Scale tanks. 19.183... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this..., the tank must be mounted on scales and the contents of the tank must be determined by weight....

  19. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Scale tanks. 19.183... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this..., the tank must be mounted on scales and the contents of the tank must be determined by weight....

  20. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Scale tanks. 19.183... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this..., the tank must be mounted on scales and the contents of the tank must be determined by weight....

  1. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Scale tanks. 19.183... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this..., the tank must be mounted on scales and the contents of the tank must be determined by weight....

  2. Scaling properties of foreign exchange volatility

    NASA Astrophysics Data System (ADS)

    Gençay, Ramazan; Selçuk, Faruk; Whitcher, Brandon

    2001-01-01

    In this paper, we investigate the scaling properties of foreign exchange volatility. Our methodology is based on a wavelet multi-scaling approach which decomposes the variance of a time series and the covariance between two time series on a scale by scale basis through the application of a discrete wavelet transformation. It is shown that foreign exchange rate volatilities follow different scaling laws at different horizons. Particularly, there is a smaller degree of persistence in intra-day volatility as compared to volatility at one day and higher scales. Therefore, a common practice in the risk management industry to convert risk measures calculated at shorter horizons into longer horizons through a global scaling parameter may not be appropriate. This paper also demonstrates that correlation between the foreign exchange volatilities is the lowest at the intra-day scales but exhibits a gradual increase up to a daily scale. The correlation coefficient stabilizes at scales one day and higher. Therefore, the benefit of currency diversification is the greatest at the intra-day scales and diminishes gradually at higher scales (lower frequencies). The wavelet cross-correlation analysis also indicates that the association between two volatilities is stronger at lower frequencies.
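
    The scale-by-scale decomposition of variance described here rests on the energy-preserving property of an orthonormal discrete wavelet transform. A minimal sketch using the Haar wavelet (illustrative only; the paper's analysis of exchange-rate data is more elaborate):

```python
import numpy as np

def haar_dwt(x):
    """Orthonormal Haar DWT: detail coefficients per scale plus final approximation."""
    details, a = [], np.asarray(x, dtype=float)
    while a.size > 1:
        even, odd = a[0::2], a[1::2]
        details.append((even - odd) / np.sqrt(2))  # detail (wavelet) coefficients
        a = (even + odd) / np.sqrt(2)              # coarser approximation
    return details, a

rng = np.random.default_rng(1)
x = rng.normal(size=256)
details, approx = haar_dwt(x)

# Parseval: total energy splits exactly across the 8 dyadic scales,
# which is what lets variance be attributed scale by scale.
energy_by_scale = [float(np.sum(d**2)) for d in details]
total = sum(energy_by_scale) + float(approx[0] ** 2)
print(np.isclose(total, np.sum(x**2)))  # → True
```

    Comparing `energy_by_scale` across horizons is the wavelet analogue of the scaling comparison the abstract describes between intra-day and daily-and-longer volatility.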

  3. 76 FR 50881 - Required Scale Tests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ..., Packers and Stockyards Administration 9 CFR Part 201 RIN 0580-AB10 Required Scale Tests AGENCY: Grain... January 20, 2011, and on April 4, 2011, concerning required scale tests. Those documents defined ``limited...), concerning required scale tests. Those documents incorrectly defined limited seasonal basis in Sec....

  4. 76 FR 3485 - Required Scale Tests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-20

    ..., Packers and Stockyards Administration 9 CFR Part 201 RIN 0580-AB10 Required Scale Tests AGENCY: Grain... first of the two scale tests between January 1 and June 30 of the calendar year. The remaining scale test must be completed between July 1 and December 31 of the calendar year. In addition, a...

  5. 76 FR 18348 - Required Scale Tests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... Tests AGENCY: Grain Inspection, Packers and Stockyards Administration. ACTION: Correcting amendments... Register on January 20, 2011 (76 FR 3485), defining required scale tests. That document incorrectly defined... packer using such scales may use the scales within a 6-month period following each test. * * * * * Alan...

  6. Developing a Sense of Scale: Looking Backward

    ERIC Educational Resources Information Center

    Jones, M. Gail; Taylor, Amy R.

    2009-01-01

    Although scale has been identified as one of four major interdisciplinary themes that cut across the science domains by the American Association for the Advancement of Science (1989), we are only beginning to understand how students learn and apply scale concepts. Early research on learning scale tended to focus on perceptions of linear distances,…

  7. Small Scale High Speed Turbomachinery

    NASA Technical Reports Server (NTRS)

    London, Adam P. (Inventor); Droppers, Lloyd J. (Inventor); Lehman, Matthew K. (Inventor); Mehra, Amitav (Inventor)

    2015-01-01

    A small scale, high speed turbomachine is described, as well as a process for manufacturing the turbomachine. The turbomachine is manufactured by diffusion bonding stacked sheets of metal foil, each of which has been pre-formed to correspond to a cross section of the turbomachine structure. The turbomachines include rotating elements as well as static structures. Using this process, turbomachines may be manufactured with rotating elements that have outer diameters of less than four inches in size, and/or blading heights of less than 0.1 inches. The rotating elements of the turbomachines are capable of rotating at speeds in excess of 150 feet per second. In addition, cooling features may be added internally to blading to facilitate cooling in high temperature operations.

  8. Hypoallometric scaling in international collaborations

    NASA Astrophysics Data System (ADS)

    Hsiehchen, David; Espinoza, Magdalena; Hsieh, Antony

    2016-02-01

    Collaboration is a vital process and dominant theme in knowledge production, although the effectiveness of policies directed at promoting multinational research remains ambiguous. We examined approximately 24 million research articles published over four decades and demonstrated that the scaling of international publications to research productivity for each country obeys a universal and conserved sublinear power law. Inefficient mechanisms in transborder team dynamics or organization as well as increasing opportunity costs may contribute to the disproportionate growth of international collaboration rates with increasing productivity among nations. Given the constrained growth of international relationships, our findings advocate a greater emphasis on the qualitative aspects of collaborations, such as with whom partnerships are forged, particularly when assessing research and policy outcomes.
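
    A sublinear ("hypoallometric") power law I = c * P**beta with beta < 1 is typically estimated by least squares on log-transformed data. A minimal sketch with synthetic counts (the coefficient and exponent below are illustrative, not the paper's fitted values):

```python
import numpy as np

# Synthetic counts: total papers P and international papers intl per country,
# generated to follow intl = 0.5 * P**0.8 (sublinear, beta < 1)
P = np.array([100, 500, 1_000, 5_000, 20_000, 100_000], dtype=float)
intl = 0.5 * P**0.8

# Fit log(intl) = log(c) + beta * log(P) by least squares
beta, log_c = np.polyfit(np.log(P), np.log(intl), 1)
print(round(beta, 3), round(np.exp(log_c), 3))  # → 0.8 0.5
```

    A fitted beta below 1 means international collaborations grow more slowly than total output as countries become more productive, which is the disproportionality the abstract reports.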

  9. Bacterial Communities: Interactions to Scale

    PubMed Central

    Stubbendieck, Reed M.; Vargas-Bautista, Carol; Straight, Paul D.

    2016-01-01

    In the environment, bacteria live in complex multispecies communities. These communities span in scale from small, multicellular aggregates to billions or trillions of cells within the gastrointestinal tract of animals. The dynamics of bacterial communities are determined by pairwise interactions that occur between different species in the community. Though interactions occur between a few cells at a time, the outcomes of these interchanges have ramifications that ripple through many orders of magnitude, and ultimately affect the macroscopic world including the health of host organisms. In this review we cover how bacterial competition influences the structures of bacterial communities. We also emphasize methods and insights garnered from culture-dependent pairwise interaction studies, metagenomic analyses, and modeling experiments. Finally, we argue that the integration of multiple approaches will be instrumental to future understanding of the underlying dynamics of bacterial communities. PMID:27551280

  10. The Scaled Thermal Explosion Experiment

    SciTech Connect

    Wardell, J F; Maienschein, J L

    2002-07-05

    We have developed the Scaled Thermal Explosion Experiment (STEX) to provide a database of reaction violence from thermal explosion for explosives of interest. Such data are needed to develop, calibrate, and validate predictive capability for thermal explosions using simulation computer codes. A cylinder of explosive 25, 50 or 100 mm in diameter, is confined in a steel cylinder with heavy end caps, and heated under controlled conditions until reaction. Reaction violence is quantified through non-contact micropower impulse radar measurements of the cylinder wall velocity and by strain gauge data at reaction onset. Here we describe the test concept, design and diagnostic recording, and report results with HMX- and RDX-based energetic materials.

  11. Validation of the Metacomprehension Scale

    PubMed

    Moore; Zabrucky; Commander

    1997-10-01

    Evidence for the factorial, convergent and discriminant, and criterion-related validity of the Metacomprehension Scale (MCS) was examined in a sample of 237 young adults. The instrument was factorially heterogeneous but exhibited homogeneity within each of the seven subscales. Evidence for the convergent and discriminant validity of the MCS was examined by correlating the subscales from the MCS with subscales from metacognitive questionnaires measuring similar constructs from related domains. In general, correlations within constructs were larger than correlations between constructs, providing preliminary evidence of the convergent and discriminant validity of the MCS. The criterion-related validity of the MCS relative to other metacognitive measures was examined by using the metacognitive measures and the MCS to predict comprehension performance. The MCS predicted performance better than the other measures of metacognition and accounted for variance in performance not accounted for by the other measures. These results show promise for the value of self-assessments of metacomprehension. Copyright 1997 Academic Press.
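
    The convergent/discriminant pattern described (within-construct correlations exceeding between-construct correlations) can be illustrated with simulated subscale scores. A sketch on synthetic data, not the study's data; the noise levels are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 237                              # sample size, as in the study
trait = rng.standard_normal(n)       # shared underlying construct

# Two subscales measuring the same construct, one measuring another
same1 = trait + 0.5 * rng.standard_normal(n)
same2 = trait + 0.5 * rng.standard_normal(n)
other = rng.standard_normal(n)

r_within = np.corrcoef(same1, same2)[0, 1]    # convergent evidence
r_between = np.corrcoef(same1, other)[0, 1]   # discriminant evidence
print(r_within > abs(r_between))   # expected True for n this large
```

With 237 cases the sampling error of a correlation is roughly 1/sqrt(n) ≈ 0.065, so the within-construct correlation (population value 0.8 here) reliably dominates the between-construct one.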

  12. Gradient scaling for nonuniform meshes

    SciTech Connect

    Margolin, L.G.; Ruppel, H.M.; Demuth, R.B.

    1985-01-01

    This paper is concerned with the effect of nonuniform meshes on the accuracy of finite-difference calculations of fluid flow. In particular, when a simple shock propagates through a nonuniform mesh, one may fail to model the jump conditions across the shock even when the equations are differenced in manifestly conservative fashion. We develop an approximate dispersion analysis of the numerical equations and identify the source of the mesh dependency with the form of the artificial viscosity. We then derive an algebraic correction to the numerical equations - a scaling factor for the pressure gradient - to essentially eliminate the mesh dependency. We present several calculations to illustrate our theory. We conclude with an alternate interpretation of our results. 14 refs., 5 figs.
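
    The mesh-dependency issue appears even in the simplest setting: on a nonuniform mesh the naive central difference drops to first-order accuracy, which a reweighting of the stencil restores. This sketch illustrates the general phenomenon only; the paper's pressure-gradient scaling factor is specific to its artificial-viscosity scheme:

```python
import numpy as np

# Nonuniform three-point mesh; differentiate f(x) = x**2 at x = 0.1,
# where the exact derivative is 0.2.
x = np.array([0.0, 0.1, 0.3])
f = x**2

# Naive central difference ignores the unequal spacing: first-order only.
naive = (f[2] - f[0]) / (x[2] - x[0])

# Second-order formula weighted by the two cell widths h1, h2.
h1, h2 = x[1] - x[0], x[2] - x[1]
weighted = (h1**2 * f[2] + (h2**2 - h1**2) * f[1] - h2**2 * f[0]) / (
    h1 * h2 * (h1 + h2))

print(round(naive, 3), round(weighted, 3))   # -> 0.3 0.2
```

The weighted form is exact for quadratics, so it recovers 0.2; the naive form overshoots by 50% on this mesh.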

  13. [Virginia Apgar and her scale].

    PubMed

    van Gijn, Jan; Gijselhart, Joost P

    2012-01-01

    Virginia Apgar (1909-1974), born in New Jersey, managed to stay in medical school despite the financial crisis of 1929, trained briefly in surgery, and subsequently became one of the first specialists in anaesthesiology. In 1949 she was appointed to a professorship, the first woman to reach this rank at Columbia University in New York. She then dedicated herself to obstetric anaesthesiology and devised the well-known scale for the initial assessment of newborn babies, according to 5 criteria. From 1959 she worked for the National Foundation for Infantile Paralysis (now March of Dimes) to expand its activities from prevention of poliomyelitis to other aspects of preventive child care, such as rubella vaccination and testing for rhesus antagonism. She remained single; in her private life she enjoyed fly fishing, took lessons in aviation and was an accomplished violinist.
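
    The scale assigns each of its five criteria (appearance, pulse, grimace, activity, respiration) a rating of 0, 1, or 2 and sums them to a total out of 10. A minimal sketch:

```python
def apgar_score(appearance, pulse, grimace, activity, respiration):
    """Sum the five Apgar criteria, each rated 0, 1, or 2 (total 0-10)."""
    scores = (appearance, pulse, grimace, activity, respiration)
    if any(s not in (0, 1, 2) for s in scores):
        raise ValueError("each criterion must be rated 0, 1, or 2")
    return sum(scores)

print(apgar_score(2, 2, 1, 2, 2))   # -> 9
```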

  14. Significant Scales in Community Structure

    PubMed Central

    Traag, V. A.; Krings, G.; Van Dooren, P.

    2013-01-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect the scale at which a partition is significant. This problem is most evident in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of “significance” of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so it could also be applied to other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine “good” resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role. PMID:24121597
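
    The paper's significance measure is based on subgraph probabilities; as a simpler illustration of the multi-resolution quantity that such methods scan, here is a resolution-parametrized Newman modularity computed directly from an adjacency matrix (a sketch, not the authors' method):

```python
import numpy as np

def modularity(adj, labels, gamma=1.0):
    """Newman modularity with resolution parameter gamma for an
    undirected graph given as a symmetric adjacency matrix."""
    adj = np.asarray(adj, dtype=float)
    m = adj.sum() / 2.0                  # total edge weight
    deg = adj.sum(axis=1)
    q = 0.0
    for c in set(labels):
        idx = [i for i, lab in enumerate(labels) if lab == c]
        e_c = adj[np.ix_(idx, idx)].sum() / 2.0   # intra-community weight
        d_c = deg[idx].sum()                      # community degree sum
        q += e_c / m - gamma * (d_c / (2.0 * m)) ** 2
    return q

# Two triangles joined by a single bridge edge: the two-community
# partition scores well at resolution gamma = 1.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
labels = [0, 0, 0, 1, 1, 1]
print(round(modularity(A, labels, gamma=1.0), 4))   # -> 0.3571
```

Scanning gamma and recording where a partition stays optimal is the multi-resolution procedure whose significant scales the paper aims to identify.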

  15. Quantitative Scaling of Magnetic Avalanches

    NASA Astrophysics Data System (ADS)

    Durin, G.; Bohn, F.; Corrêa, M. A.; Sommer, R. L.; Le Doussal, P.; Wiese, K. J.

    2016-08-01

    We provide the first quantitative comparison between Barkhausen noise experiments and recent predictions from the theory of avalanches for pinned interfaces, both in and beyond mean field. We study different classes of soft magnetic materials, including polycrystals and amorphous samples—which are characterized by long-range and short-range elasticity, respectively—both for thick and thin samples, i.e., with and without eddy currents. The temporal avalanche shape at fixed size as well as observables related to the joint distribution of sizes and durations are analyzed in detail. Both long-range and short-range samples with no eddy currents are fitted extremely well by the theoretical predictions. In particular, the short-range samples provide the first reliable test of the theory beyond mean field. The thick samples show systematic deviations from the scaling theory, providing unambiguous signatures for the presence of eddy currents.

  16. Enabling department-scale supercomputing

    SciTech Connect

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  17. Atomic time scales and pulsars

    NASA Astrophysics Data System (ADS)

    Petit, G.

    2014-12-01

    I review the atomic time scales generated by the BIPM, International Atomic Time TAI and the realization of Terrestrial Time TT(BIPM). TT(BIPM) is shown to be now accurate to within a few parts in 10^16 in relative frequency, and the performances of TAI and TT(BIPM) are compared. Millisecond pulsars have a very regular period of rotation, and data from several pulsars may be used to realize an ensemble pulsar timescale. It is shown that a pulsar timescale may detect past instabilities in TAI. However, TT(BIPM) is much more stable than TAI and should be used as the reference in pulsar analysis. Since the beginning of regular millisecond pulsar observations in the 1980s, primary standards and atomic time have gained one order of magnitude in accuracy every ~ 12 years, and this trend should continue for some time.

  18. Quantitative Scaling of Magnetic Avalanches.

    PubMed

    Durin, G; Bohn, F; Corrêa, M A; Sommer, R L; Le Doussal, P; Wiese, K J

    2016-08-19

    We provide the first quantitative comparison between Barkhausen noise experiments and recent predictions from the theory of avalanches for pinned interfaces, both in and beyond mean field. We study different classes of soft magnetic materials, including polycrystals and amorphous samples-which are characterized by long-range and short-range elasticity, respectively-both for thick and thin samples, i.e., with and without eddy currents. The temporal avalanche shape at fixed size as well as observables related to the joint distribution of sizes and durations are analyzed in detail. Both long-range and short-range samples with no eddy currents are fitted extremely well by the theoretical predictions. In particular, the short-range samples provide the first reliable test of the theory beyond mean field. The thick samples show systematic deviations from the scaling theory, providing unambiguous signatures for the presence of eddy currents.

  19. Approach toward Linear Scaling QMC

    NASA Astrophysics Data System (ADS)

    Clark, Bryan; Ceperley, David; de Sturler, Eric

    2007-03-01

    Quantum Monte Carlo simulations of fermions are currently done for relatively small system sizes, e.g., fewer than one thousand fermions. The most time-consuming part of the code for larger systems depends critically on the speed with which the ratio of a wavefunction for two different configurations can be evaluated. Most of the time goes into calculating the ratio of two determinants; this scales naively as O(n^3) operations. Work by Williamson et al. [2] has improved the procedure for evaluating the elements of the Slater matrix, so it can be done in linear time. Our work involves developing methods to evaluate the ratio of these Slater determinants quickly. We compare a number of methods, including work involving iterative techniques, sparse approximate inverses, and faster matrix updating. [2] A. J. Williamson, R. Q. Hood, and J. C. Grossman, Phys. Rev. Lett. 87, 246406 (2001).
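
    The O(n^3) determinant-ratio bottleneck mentioned above is conventionally reduced with the matrix determinant lemma: if the inverse of the Slater matrix is kept in memory, the ratio after moving one particle (replacing one row) costs only O(n). A sketch in NumPy, with random matrices standing in for Slater matrices (this illustrates the standard identity, not the specific methods compared in the abstract):

```python
import numpy as np

def det_ratio_row_update(A_inv, new_row, k):
    """O(n) ratio det(A') / det(A) when row k of A is replaced by
    new_row, given the precomputed inverse A_inv.  Follows from the
    matrix determinant lemma: row k of A dotted with column k of
    A_inv equals 1, so the ratio reduces to new_row @ A_inv[:, k]."""
    return float(new_row @ A_inv[:, k])

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))      # stand-in for a Slater matrix
A_inv = np.linalg.inv(A)
v = rng.standard_normal(5)           # new orbital values for particle 2

A_new = A.copy()
A_new[2] = v
fast = det_ratio_row_update(A_inv, v, 2)          # O(n)
slow = np.linalg.det(A_new) / np.linalg.det(A)    # O(n^3) check
print(np.isclose(fast, slow))   # -> True
```

After an accepted move, the stored inverse is itself updated in O(n^2) via the Sherman-Morrison formula, so no O(n^3) factorization is ever repeated.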

  20. Chip Scale Package Implementation Challenges

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    1998-01-01

    The JPL-led MicrotypeBGA Consortium of enterprises representing government agencies and private companies has joined together to pool in-kind resources for developing the quality and reliability of chip scale packages (CSPs) for a variety of projects. In the process of building the Consortium CSP test vehicles, many challenges were identified regarding various aspects of technology implementation. This paper presents our experience with these implementation challenges, including designing and building both standard and microvia boards, and assembling two types of test vehicles. We also discuss the most recent package isothermal aging to 2,000 hours at 100 °C and 125 °C, and thermal cycling test results to 1,700 cycles in the range of -30 to 100 °C.