Science.gov

Sample records for 6-point likert scale

  1. Phrase Completions: An Alternative to Likert Scales.

    ERIC Educational Resources Information Center

    Hodge, David R.; Gillespie, David

    2003-01-01

    Delineates the problems with Likert scales, with a particular emphasis on multidimensionality and coarse response categories, and proposes a new measurement method called "phrase completions," which has been designed to circumvent the problems inherent in Likert scales. Also conducts an exploratory test, in which Likert items were adapted to…

  2. A New Likert Scale Based on Fuzzy Sets Theory

    ERIC Educational Resources Information Center

    Li, Cheryl Qing

    2010-01-01

    In social science research, the Likert method is commonly used as a psychometric scale to measure responses. This measurement scale has a procedure that facilitates survey construction and administration, and data coding and analysis. However, there are some problems with Likert scaling. This dissertation addresses the information distortion and…

  3. Using Likert-Type Scales in the Social Sciences

    ERIC Educational Resources Information Center

    Croasmun, James T.; Ostrom, Lee

    2011-01-01

    Likert scales are useful in social science and attitude research projects. The General Self-Efficacy Exam is a test used to determine whether factors in educational settings affect participants' learning self-efficacy. The original instrument had 10 efficacy items and used a 4-point Likert scale. The Cronbach's alphas for the original test ranged…

  4. Response-Order Effects in Likert-Type Scales.

    ERIC Educational Resources Information Center

    Chan, Jason C.

    1991-01-01

    A study involving 102 high school students (49 males and 53 females) from Taiwan revealed that the order of response scale labels had a primacy effect on subjects' choices of the alternatives in Likert-type attitude scales. Practical implications of the response-order effects for measurement are discussed. (SLD)

  5. Response-Order Effect in Likert-Type Scales.

    ERIC Educational Resources Information Center

    Chan, Jason C.

    The importance of the presentation order of items on Likert-type scales was studied. It was proposed that subjects tend to choose the first alternative acceptable to them from among the response categories, so that a primacy effect can be predicted. The effects of reversing the order of the response scale on the latent factor structure underlying…

  6. Dependability of Anchoring Labels of Likert-Type Scales.

    ERIC Educational Resources Information Center

    Chang, Lei

    This study uses generalizability theory to examine the dependability of anchoring labels of Likert-type scales. Variance components associated with labeling were estimated in two samples using a two-facet random effect generalizability-study design. In one sample, 173 graduate students in education were administered 7 items measuring attitudes…

  7. Moderated regression analysis and Likert scales: too coarse for comfort.

    PubMed

    Russell, C J; Bobko, P

    1992-06-01

    One of the most commonly accepted models of relationships among three variables in applied industrial and organizational psychology is the simple moderator effect. However, many authors have expressed concern over the general lack of empirical support for interaction effects reported in the literature. We demonstrate in the current sample that use of a continuous dependent-response scale instead of a discrete, Likert-type scale causes moderated regression analysis effect sizes to increase by an average of 93%. We suggest that use of relatively coarse Likert scales to measure fine dependent responses causes information loss that, although varying widely across subjects, greatly reduces the probability of detecting true interaction effects. Specific recommendations for alternate research strategies are made. PMID:1601825
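
    The abstract above argues that a coarse Likert-type dependent scale attenuates moderated-regression effect sizes relative to a continuous response. The following sketch is purely illustrative and not the authors' method: it simulates data with a true interaction (all parameter values are invented) and compares the incremental R-squared of the interaction term for a continuous versus a 5-point coarsened dependent variable.

        # Illustrative simulation (assumed parameters, not the authors' data):
        # compare the incremental R^2 of an X1*X2 interaction when the dependent
        # variable is continuous versus coarsened onto a 5-point scale.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 2000
        x1 = rng.normal(size=n)
        x2 = rng.normal(size=n)
        y = 0.4 * x1 + 0.4 * x2 + 0.3 * x1 * x2 + rng.normal(size=n)

        def r_squared(X, dv):
            """R^2 of an ordinary least-squares fit (intercept added here)."""
            X = np.column_stack([np.ones(len(dv)), X])
            beta, *_ = np.linalg.lstsq(X, dv, rcond=None)
            resid = dv - X @ beta
            return 1 - resid.var() / dv.var()

        def incremental_r2(dv):
            """R^2 gain from adding the interaction to the main-effects model."""
            main = r_squared(np.column_stack([x1, x2]), dv)
            full = r_squared(np.column_stack([x1, x2, x1 * x2]), dv)
            return full - main

        cuts = np.quantile(y, [0.2, 0.4, 0.6, 0.8])
        y_likert = (np.digitize(y, cuts) + 1).astype(float)   # coarsened to 1..5

        print("delta R^2, continuous DV:", round(incremental_r2(y), 4))
        print("delta R^2, 5-point DV:   ", round(incremental_r2(y_likert), 4))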

  8. Psychological Distance between Categories in the Likert Scale: Comparing Different Numbers of Options

    ERIC Educational Resources Information Center

    Wakita, Takafumi; Ueshima, Natsumi; Noguchi, Hiroyuki

    2012-01-01

    This study examined whether the number of options in the Likert scale influences the psychological distance between categories. The most important assumption when using the Likert scale is that the psychological distance between options is equal. The authors proposed a new algorithm for calculating the scale values of options by applying item…

  9. The RAM Scale: Development and Validation of the Revised Scale in Likert Format.

    ERIC Educational Resources Information Center

    Wright, Claudia R.; And Others

    1983-01-01

    The development and validation of a revised form of the RAM Scale in Likert format are described. The RAM Scale measures student philosophical orientations in terms of relative, absolute, or mixed biases or preferences toward issues of knowledge, methods, and values. (Author/PN)

  10. Validation and Findings Comparing VAS vs. Likert Scales for Psychosocial Measurements

    ERIC Educational Resources Information Center

    Hasson, Dan; Arnetz, Bengt B.

    2005-01-01

    Context: Psychosocial exposures commonly show large variation over time and are usually assessed using multi-item Likert indices. A construct requiring a five-item Likert index could possibly be replaced by a single visual analogue scale (VAS). Objective: To: a) evaluate validity and relative reliability of a single VAS compared to previously…

  11. Effect of Items Direction (Positive or Negative) on the Reliability in Likert Scale. Paper-11

    ERIC Educational Resources Information Center

    Gul, Showkeen Bilal Ahmad; Qasem, Mamun Ali Naji; Bhat, Mehraj Ahmad

    2015-01-01

    In this paper an attempt was made to analyze the effect of item direction (positive or negative) on the Cronbach's alpha reliability coefficient and the split-half reliability coefficient in Likert scales. The descriptive survey research method was used for the study, and a sample of 510 undergraduate students was selected by random sampling…

  12. Improving the Factor Structure of Psychological Scales: The Expanded Format as an Alternative to the Likert Scale Format

    ERIC Educational Resources Information Center

    Zhang, Xijuan; Savalei, Victoria

    2016-01-01

    Many psychological scales written in the Likert format include reverse worded (RW) items in order to control acquiescence bias. However, studies have shown that RW items often contaminate the factor structure of the scale by creating one or more method factors. The present study examines an alternative scale format, called the Expanded format,…

  13. Compressed ultrasound video image-quality evaluation using a Likert scale and Kappa statistical analysis

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Carter, Stephen J.; Langer, Steven G.; Andrew, Rex K.

    1998-06-01

    Experiments using NASA's Advanced Communications Technology Satellite were conducted to provide an estimate of the compressed video quality required for preservation of clinically relevant features for the detection of trauma. Bandwidth rates of 128, 256 and 384 kbps were used. A five-point Likert scale (1 = no useful information and 5 = good diagnostic quality) was used for a subjective preference questionnaire to evaluate the quality of the compressed ultrasound imagery at the three compression rates for several anatomical regions of interest. At 384 kbps the Likert scores (mean ± SD) were abdomen (4.45 ± 0.71), carotid artery (4.70 ± 0.36), kidney (5.0 ± 0.0), liver (4.67 ± 0.58) and thyroid (4.03 ± 0.74). Due to the volatile nature of the H.320 compressed digital video stream, no statistically significant results can be derived through this methodology. As the MPEG standard has at its roots many of the same intraframe and motion-vector compression algorithms as H.261 (such as that used in the previous ACTS/AMT experiments), we are using the MPEG compressed video sequences to best gauge what minimum bandwidths are necessary for preservation of clinically relevant features for the detection of trauma. We have been using an MPEG codec board to collect losslessly compressed video clips from high-quality S-VHS tapes and through direct digitization of S-video. Due to the large number of videoclips and questions to be presented to the radiologists, and for ease of application, we have developed a web browser interface for this video visual perception study. Due to the large numbers of observations required to reach statistical significance in most ROC studies, Kappa statistical analysis is used to analyze the degree of agreement between observers and between viewing assessments. If the degree of agreement amongst readers is high, then there is a possibility that the ratings (i…
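
    The record above relies on Kappa statistics to quantify agreement between observers rating compressed video on a Likert scale. As a generic illustration (the ratings below are invented, not the study's data), Cohen's kappa for two raters can be computed from observed versus chance agreement:

        # From-scratch sketch of Cohen's kappa for two raters' Likert ratings.
        # Ratings are invented for illustration only.
        import numpy as np

        def cohens_kappa(r1, r2, categories):
            r1, r2 = np.asarray(r1), np.asarray(r2)
            p_observed = np.mean(r1 == r2)
            # Chance agreement from the product of the raters' marginal rates.
            p_expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
            return (p_observed - p_expected) / (1 - p_expected)

        rater1 = [5, 4, 4, 3, 5, 5, 2, 4, 5, 3]
        rater2 = [5, 4, 3, 3, 5, 4, 2, 4, 5, 3]
        print("kappa =", round(cohens_kappa(rater1, rater2, range(1, 6)), 3))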

  14. Effect of Items Direction (Positive or Negative) on the Factorial Construction and Criterion Related Validity in Likert Scale

    ERIC Educational Resources Information Center

    Naji Qasem, Mamun Ali; Ahmad Gul, Showkeen Bilal

    2014-01-01

    The study was conducted to examine the effect of item direction (positive or negative) on the factorial construction and criterion-related validity in Likert scales. The descriptive survey research method was used for the study, and the sample consisted of 510 undergraduate students selected by random sampling. A scale developed by…

  15. Detecting information processing bias toward psychopathology: Interpreting Likert scales at intake assessment.

    PubMed

    Flückiger, Christoph; Znoj, Hansjörg; Vîslă, Andreea

    2016-09-01

    Since the programmatic Rosenhan study, there has been a broad discussion of how clinical realities are actively constructed from both "insane" and "sane" perspectives. Informing patients about the output of the psychometric questionnaires assessed at intake is a required task in many clinical routines. Information processing bias toward psychopathology may impact many clinical communications and thus lead to clinical errors. Based on an output of the commonly used Symptom Check List 90, case examples demonstrate various grades of balanced and unbalanced alternatives of how to consider the psychopathological as well as the unproblematic poles of Likert scales in discussing psychometric questionnaires at Session 1. We provide one clinical error related to client information at intake assessments and offer four therapeutic tasks that can serve as observable quality indicators of how to facilitate a balanced picture of the patients' burdens and capabilities: (1) validate individual problems, (2) isolate individual problems, (3) validate individual strengths, and (4) break through black-and-white thinking. (PsycINFO Database Record) PMID:27631856

  16. The Analysis of Likert Scales Using State Multipoles: An Application of Quantum Methods to Behavioral Sciences Data

    ERIC Educational Resources Information Center

    Camparo, James; Camparo, Lorinda B.

    2013-01-01

    Though ubiquitous, Likert scaling's traditional mode of analysis is often unable to uncover all of the valid information in a data set. Here, the authors discuss a solution to this problem based on methodology developed by quantum physicists: the state multipole method. The authors demonstrate the relative ease and value of this method by…

  17. How Differences among Data Collectors Are Reflected in the Reliability and Validity of Data Collected by Likert-Type Scales?

    ERIC Educational Resources Information Center

    Köksal, Mustafa Serdar; Ertekin, Pelin; Çolakoglu, Özgür Murat

    2014-01-01

    The purpose of this study is to investigate association of data collectors' differences with the differences in reliability and validity of scores regarding affective variables (motivation toward science learning and science attitude) that are measured by Likert-type scales. Four researchers trained in data collection and seven science…

  18. Effect of personality item writing on psychometric properties of ideal-point and likert scales.

    PubMed

    Huang, Jialin; Mead, Alan D

    2014-12-01

    The present study was designed to investigate personality item-writing practices and their effect on the psychometric properties of personality items and scales. Personality items were developed based on ideal-point and dominance models, analyzed using the generalized graded unfolding model, and empirically classified as having an ideal-point or dominance form. Results suggested that writing dominance items was slightly easier (more successful) than writing ideal-point items, but this varied slightly by personality dimension. Of 3 ideal-point item-writing tactics, the "neutral" tactic was least successful; success writing "double-barreled" and "average" ideal-point items was comparable to that of dominance items. Three personality scales were then constructed using successful ideal-point and dominance items. Scales constructed using ideal-point items had substantially inferior psychometric properties, including lower score reliability, lower correlations with important criteria, and mixed test information results. However, lower predictive validity of ideal-point scale scores may be due to lower reliability of the scores. Practical and methodological implications were also discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. Evidence of Factorial Validity of Parental Knowledge, Control and Solicitation, and Adolescent Disclosure Scales: When the Ordered Nature of Likert Scales Matters

    PubMed Central

    Lionetti, Francesca; Keijsers, Loes; Dellagiulia, Antonio; Pastore, Massimiliano

    2016-01-01

    For evaluating monitoring and parent-adolescent communication, a set of scales addressing parental knowledge, control and solicitation, and adolescent disclosure was proposed by Kerr and Stattin (2000). Although these scales have been widely disseminated, their psychometric properties have often been found to be unsatisfactory, raising questions about their validity. The current study examines whether their poor psychometric properties, which are mainly attributed to the relatively poor conceptual quality of the items, could have been caused by the use of less-than-optimal analytical estimation methods. A cross-validation approach is used on a sample of 1071 adolescents. Maximum likelihood (ML) is compared with the diagonal weighted least squares (DWLS) method, which is suitable for Likert scales. The results of the DWLS approach lead to a more optimal fit than that obtained using ML estimation. The DWLS methodology may represent a useful option for researchers using these scales because it corrects for their unreliability. PMID:27445909

  20. Internal consistency, test-retest reliability, and predictive validity for a Likert-based version of the Sources of occupational stress-14 (SOOS-14) scale.

    PubMed

    Kimbrel, Nathan A; Flynn, Elisa J; Carpenter, Grace Stephanie J; Cammarata, Claire M; Leto, Frank; Ostiguy, William J; Kamholz, Barbara W; Zimering, Rose T; Gulliver, Suzy B

    2015-08-30

    This study examined the psychometric properties of a Likert-based version of the Sources of Occupational Stress-14 (SOOS-14) scale. Internal consistency for the SOOS-14 ranged from 0.78 to 0.84, whereas three-month test-retest reliability was 0.51. In addition, SOOS-14 scores were prospectively associated with symptoms of PTSD and depression at a three-month follow-up assessment. PMID:26073282

  1. The Effects of Various Configurations of Likert, Ordered Categorical, or Rating Scale Data on the Ordinal Logistic Regression Pseudo R-Squared Measure of Fit: The Case of the Cumulative Logit Model.

    ERIC Educational Resources Information Center

    Zumbo, Bruno D.; Ochieng, Charles O.

    Many measures found in educational research are ordered categorical response variables that are empirical realizations of an underlying normally distributed variate. These ordered categorical variables are commonly referred to as Likert or rating scale data. Regression models are commonly fit using these ordered categorical variables as the…

  2. Development and Validation of the POSITIVES Scale (Postsecondary Information Technology Initiative Scale)

    ERIC Educational Resources Information Center

    Fichten, Catherine S.; Asuncion, Jennison V.; Nguyen, Mai N.; Wolforth, Joan; Budd, Jillian; Barile, Maria; Gaulin, Chris; Martiniello, Natalie; Tibbs, Anthony; Ferraro, Vittoria; Amsel, Rhonda

    2009-01-01

    Data on how well information and communication technology (ICT) needs of 1354 Canadian college and university students with disabilities are met on and off campus were collected using the newly developed Positives Scale (Postsecondary Information Technology Initiative Scale). The measure contains 26 items which use a 6-point Likert scale (1 =…

  3. Estimating Ordinal Reliability for Likert-Type and Ordinal Item Response Data: A Conceptual, Empirical, and Practical Guide

    ERIC Educational Resources Information Center

    Gadermann, Anne M.; Guhn, Martin; Zumbo, Bruno D.

    2012-01-01

    This paper provides a conceptual, empirical, and practical guide for estimating ordinal reliability coefficients for ordinal item response data (also referred to as Likert, Likert-type, ordered categorical, or rating scale item responses). Conventionally, reliability coefficients, such as Cronbach's alpha, are calculated using a Pearson…
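
    The guide above concerns ordinal reliability coefficients computed from polychoric rather than Pearson correlations. For reference, the conventional Cronbach's alpha it is contrasted with can be computed directly from raw item scores, as in the sketch below (invented data; estimating the polychoric matrix needed for ordinal alpha is not shown):

        # Conventional Cronbach's alpha from raw Likert item scores (invented data).
        # The ordinal alpha discussed above would instead use a polychoric
        # correlation matrix, which this sketch does not estimate.
        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) array of item scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(1)
        latent = rng.normal(size=500)
        raw = latent[:, None] + rng.normal(size=(500, 5))      # 5 related items
        items = np.clip(np.round(2 * raw + 3), 1, 5)           # discretized to 1..5

        print("Cronbach's alpha:", round(cronbach_alpha(items), 3))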

  4. Likert Survey Primacy Effect in the Absence or Presence of Negatively-Worded Items.

    ERIC Educational Resources Information Center

    Barnette, J. Jackson

    2001-01-01

    Studied the primacy effect (tendency to select items closer to the left side of the response scale) in Likert scales worded from "Strongly Disagree" to "Strongly Agree" and in the opposite direction. Findings for 386 high school and college students show no primacy effect, although negatively worded stems had an effect on Cronbach's alpha. (SLD)

  5. Controlling for Rater Effects When Comparing Survey Items with Incomplete Likert Data. ACT Research Report Series.

    ERIC Educational Resources Information Center

    Schulz, E. Matthew; Sun, Anji

    This study was concerned with ranking items with Likert scale data when the items are subject to systematic (nonrandom) patterns of nonresponse. Researchers applied the rating scale model of D. Andrich (1978) to data from a survey in which the item response rate varied from less than 1% to over 90%. Data were from a section of the Student Opinion…

  6. A Two-Decision Model for Responses to Likert-Type Items

    ERIC Educational Resources Information Center

    Thissen-Roe, Anne; Thissen, David

    2013-01-01

    Extreme response set, the tendency to prefer the lowest or highest response option when confronted with a Likert-type response scale, can lead to misfit of item response models such as the generalized partial credit model. Recently, a series of intrinsically multidimensional item response models have been hypothesized, wherein tendency toward…

  7. Use and Misuse of the Likert Item Responses and Other Ordinal Measures

    PubMed Central

    BISHOP, PHILLIP A.; HERRON, ROBERT L.

    2015-01-01

    Likert, Likert-type, and ordinal-scale responses are very popular psychometric item scoring schemes for attempting to quantify people’s opinions, interests, or perceived efficacy of an intervention and are used extensively in Physical Education and Exercise Science research. However, these numbered measures are generally considered ordinal and violate some statistical assumptions needed to evaluate them as normally distributed, parametric data. This is an issue because parametric statistics are generally perceived as being more statistically powerful than non-parametric statistics. To avoid possible misinterpretation, care must be taken in analyzing these types of data. The use of visual analog scales may be equally efficacious and provide somewhat better data for analysis with parametric statistics. PMID:27182418

  8. A mixed-binomial model for Likert-type personality measures

    PubMed Central

    Allik, Jüri

    2014-01-01

    Personality measurement is based on the idea that values on an unobservable latent variable determine the distribution of answers on a manifest response scale. Typically, it is assumed in the Item Response Theory (IRT) that latent variables are related to the observed responses through continuous normal or logistic functions, determining the probability with which one of the ordered response alternatives on a Likert-scale item is chosen. Based on an analysis of 1731 self- and other-rated responses on the 240 NEO PI-3 questionnaire items, it was proposed that a viable alternative is a finite number of latent events which are related to manifest responses through a binomial function which has only one parameter—the probability with which a given statement is approved. For the majority of items, the best fit was obtained with a mixed-binomial distribution, which assumes two different subpopulations who endorse items with two different probabilities. It was shown that the fit of the binomial IRT model can be improved by assuming that about 10% of random noise is contained in the answers and by taking into account response biases toward one of the response categories. It was concluded that the binomial response model for the measurement of personality traits may be a workable alternative to the more habitual normal and logistic IRT models. PMID:24847291
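
    The model described above treats a k-category Likert response as a binomial count governed by a single approval probability, with a two-component mixture for most items. The sketch below writes out such a mixture distribution for a 5-category item; all parameter values are invented, and this is not the fitted NEO PI-3 model.

        # Two-component mixed-binomial distribution for a 5-category Likert item
        # coded 0..4 (m = 4 "trials"). Parameter values are invented.
        import numpy as np
        from scipy.stats import binom

        m = 4                 # highest response category
        w = 0.6               # mixing weight of the first subpopulation
        p1, p2 = 0.25, 0.75   # approval probabilities of the two subpopulations

        k = np.arange(m + 1)
        pmf = w * binom.pmf(k, m, p1) + (1 - w) * binom.pmf(k, m, p2)

        for cat, prob in zip(k, pmf):
            print(f"P(response = {cat}) = {prob:.3f}")
        print("total =", round(pmf.sum(), 6))   # mixture probabilities sum to 1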

  9. Appropriate Statistical Analysis for Two Independent Groups of Likert-Type Data

    ERIC Educational Resources Information Center

    Warachan, Boonyasit

    2011-01-01

    The objective of this research was to determine the robustness and statistical power of three different methods for testing the hypothesis that ordinal samples of five and seven Likert categories come from equal populations. The three methods are the two sample t-test with equal variances, the Mann-Whitney test, and the Kolmogorov-Smirnov test. In…
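
    The three tests compared in the dissertation above are all available in SciPy; the sketch below simply runs them on invented 5-point responses from two independent groups (it does not reproduce the study's power analysis).

        # The three tests compared above, applied to invented 5-point responses.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group_a = rng.integers(1, 6, size=60)   # responses 1..5
        group_b = rng.integers(2, 6, size=60)   # slightly shifted group

        t = stats.ttest_ind(group_a, group_b, equal_var=True)
        u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
        ks = stats.ks_2samp(group_a, group_b)

        print("t-test (equal variances): p =", round(t.pvalue, 4))
        print("Mann-Whitney U:           p =", round(u.pvalue, 4))
        print("Kolmogorov-Smirnov:       p =", round(ks.pvalue, 4))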

  10. A 6-Point TACS Score Predicts In-Hospital Mortality Following Total Anterior Circulation Stroke

    PubMed Central

    Wood, Adrian D; Gollop, Nicholas D; Bettencourt-Silva, Joao H; Clark, Allan B; Metcalf, Anthony K; Bowles, Kristian M; Flather, Marcus D; Potter, John F

    2016-01-01

    Background and Purpose Little is known about the factors associated with in-hospital mortality following total anterior circulation stroke (TACS). We examined the characteristics and comorbidity data for TACS patients in relation to in-hospital mortality with the aim of developing a simple clinical rule for predicting the acute mortality outcome in TACS. Methods A routine data registry of one regional hospital in the UK was analyzed. The subjects were 2,971 stroke patients with TACS (82% ischemic; median age=81 years, interquartile age range=74–86 years) admitted between 1996 and 2012. Uni- and multivariate regression models were used to estimate in-hospital mortality odds ratios for the study covariates. A 6-point TACS scoring system was developed from regression analyses to predict in-hospital mortality as the outcome. Results Factors associated with in-hospital mortality of TACS were male sex [adjusted odds ratio (AOR)=1.19], age (AOR=4.96 for ≥85 years vs. <65 years), hemorrhagic subtype (AOR=1.70), nonlateralization (AOR=1.75), prestroke disability (AOR=1.73 for moderate disability vs. no symptoms), and congestive heart failure (CHF) (AOR=1.61). Risk stratification using the 6-point TACS Score [T=type (hemorrhage=1 point) and territory (nonlateralization=1 point), A=age (65–84 years=1 point, ≥85 years=2 points), C=CHF (if present=1 point), S=status before stroke (prestroke modified Rankin Scale score of 4 or 5=1 point)] reliably predicted a mortality outcome: score=0, 29.4% mortality; score=1, 46.2% mortality [negative predictive value (NPV)=70.6%, positive predictive value (PPV)=46.2%]; score=2, 64.1% mortality (NPV=70.6, PPV=64.1%); score=3, 73.7% mortality (NPV=70.6%, PPV=73.7%); and score=4 or 5, 81.2% mortality (NPV=70.6%, PPV=81.2%). Conclusions We have identified the key determinants of in-hospital mortality following TACS and derived a 6-point TACS Score that can be used to predict the prognosis of particular patients.
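
    The scoring rule in the abstract above is explicit enough to restate as a small function. The sketch below is a direct transcription for illustration only; the names and input format are assumptions, and it is not a validated clinical tool.

        # Illustrative restatement of the 6-point TACS Score described above.
        # Not a clinical tool; argument names are assumptions.
        def tacs_score(hemorrhagic, nonlateralized, age, chf, prestroke_mrs):
            score = 0
            if hemorrhagic:                # T: hemorrhagic subtype = 1 point
                score += 1
            if nonlateralized:             # T: nonlateralizing territory = 1 point
                score += 1
            if 65 <= age <= 84:            # A: age 65-84 years = 1 point
                score += 1
            elif age >= 85:                #    age >= 85 years = 2 points
                score += 2
            if chf:                        # C: congestive heart failure = 1 point
                score += 1
            if prestroke_mrs in (4, 5):    # S: prestroke mRS of 4 or 5 = 1 point
                score += 1
            return score

        # Example: 86-year-old, ischemic and lateralized, with CHF, prestroke mRS 3.
        print(tacs_score(False, False, 86, True, 3))   # -> 3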

  11. A Comparison of Anchor-Item Designs for the Concurrent Calibration of Large Banks of Likert-Type Items

    ERIC Educational Resources Information Center

    Garcia-Perez, Miguel A.; Alcala-Quintana, Rocio; Garcia-Cueto, Eduardo

    2010-01-01

    Current interest in measuring quality of life is generating interest in the construction of computerized adaptive tests (CATs) with Likert-type items. Calibration of an item bank for use in CAT requires collecting responses to a large number of candidate items. However, the number is usually too large to administer to each subject in the…

  12. Taylor, Graicunas, Worthy, Likert, and Thayer: Span of Control and Organizational Structure--Where They Fit on the "Leadership Continuum."

    ERIC Educational Resources Information Center

    Corder, Lloyd E.

    The "Leadership Continuum" model developed in 1961 by R. Tannenbaum, I. Weschler, and F. Massarik clearly illustrates the ideas that management scholars like Frederick Taylor, V. A. Graicunas, James Worthy, Rensis Likert, and Frederick Thayer have posited concerning span of control and organizational structure. Each of these scholars fits at some…

  13. Optimum Response Categories for the Religious Motivation Scale

    ERIC Educational Resources Information Center

    Kraska, Chad

    2011-01-01

    Likert response scales are widely used in the social sciences, typically to measure attitudes and personality. This study seeks to understand the optimal number of categories to include in a Likert scale measuring attitudes. Therefore, the author examined four versions of an attitude measure, the Religious Motivation Scale, ranging from a 4- to…

  14. Longitudinal Measurement Invariance of Likert-Type Learning Strategy Scales: Are We Using the Same Ruler at Each Wave?

    ERIC Educational Resources Information Center

    Coertjens, Liesje; Donche, Vincent; De Maeyer, Sven; Vanthournout, Gert; Van Petegem, Peter

    2012-01-01

    Whether or not learning strategies change during the course of higher education is an important topic in the Student Approaches to Learning field. However, there is a dearth of any empirical evaluations in the literature as to whether or not the instruments in this research domain measure equivalently over time. Therefore, this study details the…

  15. A Mathematical Approach in Evaluating Biotechnology Attitude Scale: Rough Set Data Analysis

    ERIC Educational Resources Information Center

    Narli, Serkan; Sinan, Olcay

    2011-01-01

    Individuals' thoughts and attitudes towards biotechnology have been investigated in many countries. A Likert-type scale is the most commonly used scale to measure attitude. However, a weakness of a Likert-type scale is that different responses may produce the same score. The rough set method has been proposed to address this shortcoming. A…

  16. A Monte Carlo Examination of the Sensitivity of the Differential Functioning of Items and Tests Framework for Tests of Measurement Invariance with Likert Data

    ERIC Educational Resources Information Center

    Meade, Adam W.; Lautenschlager, Gary J.; Johnson, Emily C.

    2007-01-01

    This article highlights issues associated with the use of the differential functioning of items and tests (DFIT) methodology for assessing measurement invariance (or differential functioning) with Likert-type data. Monte Carlo analyses indicate relatively low sensitivity of the DFIT methodology for identifying differential item functioning (DIF)…

  17. Lifestyle Alternatives: Development and Evaluation of an Attitude Scale.

    ERIC Educational Resources Information Center

    Pestle, Ruth E.; And Others

    1982-01-01

    Describes the construction and testing of an instrument to assess attitudes toward voluntary simplicity of lifestyle. The scale consists of 120 Likert items concerning such behaviors as recycling, home production of goods, conservation of resources, and sharing of skills. (SK)

  18. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  19. [Development and psychometric validation of the Brief Smartphone Addiction Scale (BSAS) with schoolchildren].

    PubMed

    Csibi, Sándor; Demetrovics, Zsolt; Szabó, Attila

    2016-01-01

    Smartphone use among children increases continuously. A growing range of stimulating applications may increase the risk of addiction. The aim of this study was to develop a brief, easy-to-use and easy-to-score tool for screening children at risk of smartphone addiction. A 6-item agree-disagree Likert scale (6-point range) was developed on the basis of the 'components' model of addiction (Griffiths, 2005). The brief tool was administered to 441 Hungarian-speaking schoolchildren (mean age=13.4 years, SD=2.22) along with the 26-item Smartphone Addiction Inventory (SPAI; Lin et al., 2014). Principal components analysis yielded a single component for the 6-item tool, which accounted for 52.38% of the total variance. The internal reliability of the scale was good (Cronbach's alpha=0.82). Content validity was confirmed by statistically significant differences between heavy and light users (p <.001). The brief tool correlated positively and significantly with the 26-item SPAI (r = 0.67, p <.001), which justified its congruent validity. Younger children (9-13 years) scored higher on the scale than their older (14-18 years) peers (p <.001). The Hungarian version of the brief smartphone addiction inventory appears to be a valid and reliable tool for screening for mobile phone addiction among schoolchildren. PMID:27091924

  1. Improving the Factor Structure of Psychological Scales

    PubMed Central

    Zhang, Xijuan; Savalei, Victoria

    2015-01-01

    Many psychological scales written in the Likert format include reverse worded (RW) items in order to control acquiescence bias. However, studies have shown that RW items often contaminate the factor structure of the scale by creating one or more method factors. The present study examines an alternative scale format, called the Expanded format, which replaces each response option in the Likert scale with a full sentence. We hypothesized that this format would result in a cleaner factor structure as compared with the Likert format. We tested this hypothesis on three popular psychological scales: the Rosenberg Self-Esteem scale, the Conscientiousness subscale of the Big Five Inventory, and the Beck Depression Inventory II. Scales in both formats showed comparable reliabilities. However, scales in the Expanded format had better (i.e., lower and more theoretically defensible) dimensionalities than scales in the Likert format, as assessed by both exploratory factor analyses and confirmatory factor analyses. We encourage further study and wider use of the Expanded format, particularly when a scale’s dimensionality is of theoretical interest. PMID:27182074

  2. Designing the Nuclear Energy Attitude Scale.

    ERIC Educational Resources Information Center

    Calhoun, Lawrence; And Others

    1988-01-01

    Presents a refined method for designing a valid and reliable Likert-type scale to test attitudes toward the generation of electricity from nuclear energy. Discusses various tests of validity that were used on the nuclear energy scale. Reports results of administration and concludes that the test is both reliable and valid. (CW)

  3. Scales

    MedlinePlus

    Scales are a visible peeling or flaking of outer skin layers. These layers are called the stratum ... Scales may be caused by dry skin, certain inflammatory skin conditions, or infections. Eczema , ringworm , and psoriasis ...

  4. The Multidisciplinary Hemodialysis Patient Satisfaction Scale: Reliability, Validity, and Scale Development.

    ERIC Educational Resources Information Center

    Martin, Pamela Davis; Brantley, Philip J.; McKnight, G. Tipton; Jones, Glenn N.; Springer, Annette

    1997-01-01

    The development and preliminary reliability and validity studies are reported for the Multidisciplinary Hemodialysis Patient Satisfaction Scale, a 110-item Likert scale that assesses satisfaction with team health care services. The methods used to construct subscales may have implications for other psychometric studies of satisfaction and quality…

  5. Reflective Thinking Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Basol, Gulsah; Evin Gencel, Ilke

    2013-01-01

    The purpose of this study was to adapt the Reflective Thinking Scale to Turkish and investigate its validity and reliability with a sample of Turkish university students. The Reflective Thinking Scale (RTS) is a 5-point Likert scale (1 = Agree Completely, 3 = Neutral, 5 = Not Agree Completely), intended to measure…

  6. A Study on Emotional Literacy Scale Development

    ERIC Educational Resources Information Center

    Akbag, Müge; Küçüktepe, Seval Eminoglu; Özmercan, Esra Eminoglu

    2016-01-01

    Emotional literacy is described as being aware of our own feelings in order to improve our personal power and quality of life, as well as the quality of life of people around us. In this study, the aim is to develop a Likert scale that measures people's emotional literacy, for use in both descriptive and experimental research. Related literature…

  7. Scale

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2009-01-01

    The common approach to scaling, according to Christopher Dede, a professor of learning technologies at the Harvard Graduate School of Education, is to jump in and say, "Let's go out and find more money, recruit more participants, hire more people. Let's just keep doing the same thing, bigger and bigger." That, he observes, "tends to fail, and fail…

  8. Reliability and Validity of the Transracial Adoption Parenting Scale

    ERIC Educational Resources Information Center

    Massatti, Richard R.; Vonk, M. Elizabeth; Gregoire, Thomas K.

    2004-01-01

    The present study provides information on the reliability and validity of the Transracial Adoption Parenting Scale (TAPS), a multidimensional 36-item Likert-type scale that measures cultural competence among transracial adoptive (TRA) parents. The TAPS was theoretically developed and refined through feedback from experts in TRA adoption. A…

  9. Scales

    SciTech Connect

    Murray Gibson

    2007-04-27

    Musical scales involve notes that, sounded simultaneously (chords), sound good together. The result is the left brain meeting the right brain — a Pythagorean interval of overlapping notes. This synergy would suggest less difference between the working of the right brain and the left brain than common wisdom would dictate. The pleasing sound of harmony comes when two notes share a common harmonic, meaning that their frequencies are in simple integer ratios, such as 3/2 (G/C) or 5/4 (E/C).

  10. Scales

    ScienceCinema

    Murray Gibson

    2016-07-12

    Musical scales involve notes that, sounded simultaneously (chords), sound good together. The result is the left brain meeting the right brain — a Pythagorean interval of overlapping notes. This synergy would suggest less difference between the working of the right brain and the left brain than common wisdom would dictate. The pleasing sound of harmony comes when two notes share a common harmonic, meaning that their frequencies are in simple integer ratios, such as 3/2 (G/C) or 5/4 (E/C).
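
    The two records above describe consonant intervals as simple integer frequency ratios; a one-line check of the arithmetic (taking middle C as roughly 261.63 Hz, an assumed reference):

        # Just-intonation frequencies of G (3/2) and E (5/4) relative to middle C.
        c = 261.63                      # assumed reference pitch for middle C, Hz
        g, e = 3 / 2 * c, 5 / 4 * c     # perfect fifth and major third above C
        print(f"C = {c:.2f} Hz, G = {g:.2f} Hz, E = {e:.2f} Hz")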

  11. Factor Analysis of the Omega Scale: A Scale Designed To Measure the Attitudes of College Students toward Their Own Deaths and the Disposition of Their Bodies.

    ERIC Educational Resources Information Center

    Staik, Irene M.

    A study was undertaken to provide a factor analysis of the Omega Scale, a 25-item, Likert-type scale developed in 1984 to assess attitudes toward death and funerals and other body disposition practices. The Omega Scale was administered to 250 students enrolled in introductory psychology classes at two higher education institutions in Alabama.…

  12. Value-Eroding Teacher Behaviors Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Arseven, Zeynep; Kiliç, Abdurrahman; Sahin, Seyma

    2016-01-01

    In the present study, the aim is to develop a valid and reliable scale for determining value-eroding behaviors of teachers, hence their values of judgment. The items of the "Value-eroding Teacher Behaviors Scale" were designed in the form of a 5-point Likert-type rating scale. The exploratory factor analysis (EFA) was conducted to…

  13. Preparing Attitude Scale to Define Students' Attitudes about Environment, Recycling, Plastic and Plastic Waste

    ERIC Educational Resources Information Center

    Avan, Cagri; Aydinli, Bahattin; Bakar, Fatma; Alboga, Yunus

    2011-01-01

    The aim of this study is to introduce an attitude scale in order to define students' attitudes about the environment, recycling, plastics, and plastic waste. In this study, 80 attitude sentences on a 5-point Likert-type scale were prepared and administered to 492 6th-grade students in the Kastamonu city center of Turkey. The scale consists of…

  14. The Use of Different Happiness Rating Scales: Bias and Comparison Problem?

    ERIC Educational Resources Information Center

    Lim, Hock-Eam

    2008-01-01

    This paper uses data on reported happiness measured on 4-, 5-, 7- and 11-point Likert scales. A group of 137 respondents was selected to study the comparison problem for estimated mean happiness using direct rescaling. It is found that the 11-point scale's estimated mean happiness is significantly higher than that of the 4- and 7-point scales. This…

  15. Developing a Scale on the Usage of Learner Control Strategy

    ERIC Educational Resources Information Center

    Kutlu, M. Oguz

    2012-01-01

    The aim of this study was to develop a Likert-like scale in order to measure teachers' usage level of learner control strategy. This study was carried out with 219 State primary school teachers who were class teachers, Turkish teachers, English teachers, Mathematics teachers, Science teachers, Social Sciences teachers, Religion and Moral teachers…

  16. The Intuitive Eating Scale: Development and Preliminary Validation

    ERIC Educational Resources Information Center

    Hawks, Steven; Merrill, Ray M.; Madanat, Hala N.

    2004-01-01

    This article describes the development and validation of an instrument designed to measure the concept of intuitive eating. To ensure face and content validity for items used in the Likert-type Intuitive Eating Scale (IES), content domain was clearly specified and a panel of experts assessed the validity of each item. Based on responses from 391…

  17. Acquiescent Responding in Balanced Multidimensional Scales and Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Lorenzo-Seva, Urbano; Rodriguez-Fornells, Antoni

    2006-01-01

    Personality tests often consist of a set of dichotomous or Likert items. These response formats are known to be susceptible to an agreeing-response bias called acquiescence. The common assumption in balanced scales is that the sum of appropriately reversed responses should be reasonably free of acquiescence. However, inter-item correlation (or…

  18. Developing a Scale for Quality of Using Learning Strategies

    ERIC Educational Resources Information Center

    Tasci, Guntay; Yurdugul, Halil

    2016-01-01

    This study aims to develop a measurement tool to measure the quality of using learning strategies. First, the quality of using learning strategies was described based on the literature. The 32 items in the 5-point Likert scale were then administered to 320 prospective teachers, and they were analysed with exploratory factor analysis using…

  19. Construct Validation and a More Parsimonious Mathematics Beliefs Scales.

    ERIC Educational Resources Information Center

    Capraro, Mary Margaret

    Teacher beliefs are instrumental in defining teacher pedagogical and content tasks and for processing information relevant to those tasks. In this study, a Likert-type instrument, Mathematics Beliefs Scales (E. Fennema, T. Carpenter, and M. Loef, 1990), was used to measure the mathematical beliefs of teachers. This instrument was designed with…

  20. The Development of an Objective Scale to Measure a Transpersonal Orientation to Learning.

    ERIC Educational Resources Information Center

    Shapiro, Stewart B.; Fitzgerald, Louise F.

    1989-01-01

    A 40-item Likert scale, Transpersonal Orientation to Learning, was developed to investigate transpersonal (spiritual or mystical) orientation. The scale was validated via administration to 166 graduate students in education to determine their beliefs about the development of spiritual potential in learning environments. Satisfactory reliability…

  1. Determination of Reliability and Validity for Myself as a Teacher Scale.

    ERIC Educational Resources Information Center

    Handley, Herbert M.; Thomson, James R., Jr.

    The reliability and validity of the Myself as a Teacher Scale (MTS), developed to assess the self-concept of teachers, were studied. Materials developed by David P. Butts and Robert Howe were used to construct a 62-item Likert-type scale asking individuals to rate themselves on certain criteria. After a pilot study with 92 preservice teachers and…

  2. Women's Liberation Scale (WLS): A Measure of Attitudes Toward Positions Advocated by Women's Groups.

    ERIC Educational Resources Information Center

    Goldberg, Carlos

    The Women's Liberation Scale (WLS) is a 14-item, Likert-type scale designed to measure attitudes toward positions advocated by women's groups. The WLS and its four-alternative response schema are presented, along with descriptive statistics of scores based on male and female college samples. Reliability and validity measures are reported, and the…

  3. The Effects of Changes in the Order of Verbal Labels and Numerical Values on Children's Scores on Attitude and Rating Scales

    ERIC Educational Resources Information Center

    Betts, Lucy; Hartley, James

    2012-01-01

    Research with adults has shown that variations in verbal labels and numerical scale values on rating scales can affect the responses given. However, few studies have been conducted with children. The study aimed to examine potential differences in children's responses to Likert-type rating scales according to their anchor points and scale…

  4. The Development of Will Perception Scale and Practice in a Psycho-Education Program with Its Validity and Reliability

    ERIC Educational Resources Information Center

    Yener, Özen

    2014-01-01

    In this research, we aim to develop a 5-point Likert scale, establish its validity and reliability, and use it in an experimental application to measure the will perception of teenagers and adults. To this end, items were first drawn, either unchanged or modified, from various scales, and an item pool including 61 items…

  5. A Scale for E-Content Preparation Skills: Development, Validity and Reliability

    ERIC Educational Resources Information Center

    Tekin, Ahmet; Polat, Ebru

    2016-01-01

    Problem Statement: For an effective teaching and learning process it is critical to provide support for teachers in the development of e-content, and teachers should play an active role in this development. Purpose of the Study: The purpose of this study is to develop a valid and reliable Likert-type scale that will determine pre-service teachers'…

  6. In Search of the Optimal Number of Response Categories in a Rating Scale

    ERIC Educational Resources Information Center

    Lee, Jihyun; Paek, Insu

    2014-01-01

    Likert-type rating scales are still the most widely used method when measuring psychoeducational constructs. The present study investigates a long-standing issue of identifying the optimal number of response categories. A special emphasis is given to categorical data, which were generated by the Item Response Theory (IRT) Graded-Response Modeling…
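
    The study above generates its categorical data under the IRT graded response model. A minimal sketch of that model's category probabilities for one 5-category item is given below; the discrimination and threshold values are invented.

        # Graded response model (GRM) category probabilities for one 5-category
        # item. Discrimination (a) and thresholds (b) are invented values.
        import numpy as np

        def grm_category_probs(theta, a, b):
            """P(X = k | theta) for k = 0..len(b) under Samejima's graded model."""
            b = np.asarray(b, dtype=float)
            # Cumulative P(X >= k); P(X >= 0) = 1 and P(X >= len(b)+1) = 0.
            p_ge = np.concatenate(([1.0], 1 / (1 + np.exp(-a * (theta - b))), [0.0]))
            return p_ge[:-1] - p_ge[1:]

        a = 1.5
        b = [-1.5, -0.5, 0.5, 1.5]
        for theta in (-2.0, 0.0, 2.0):
            print(f"theta = {theta:+.1f}:", np.round(grm_category_probs(theta, a, b), 3))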

  7. Development of an Acculturative Stress Scale for International Students: preliminary findings.

    PubMed

    Sandhu, D S; Asrabadi, B R

    1994-08-01

    Description of the development and testing of a new 36-item scale in Likert format, designed to assess the acculturative stress of international students, includes perceived discrimination, homesickness, fear, guilt, perceived hatred, and stress due to change (cultural shock), identified as major contributing factors. The psychometric properties of this instrument and implications for use by mental health practitioners are discussed. PMID:7809315

  8. The Development of a Competence Scale for Learning Science: Inquiry and Communication

    ERIC Educational Resources Information Center

    Chang, Huey-Por; Chen, Chin-Chang; Guo, Gwo-Jen; Cheng, Yeong-Jin; Lin, Chen-Yung; Jen, Tsung-Hau

    2011-01-01

    The objective of this study was to develop an instrument to measure school students' competence in learning science as part of a large research project in Taiwan. The instrument consisted of 29 self-report, Likert-type items divided into 2 scales: Competence in Scientific Inquiry and Competence in Communication. The Competence in Scientific…

  9. Measuring Pretest-Posttest Change with a Rasch Rating Scale Model.

    ERIC Educational Resources Information Center

    Wolfe, Edward W.; Chiu, Chris W. T.

    1999-01-01

    Describes a method for disentangling changes in persons from changes in the interpretation of Likert-type questionnaire items and the use of rating scales. The procedure relies on anchoring strategies to create a common frame of reference for interpreting measures taken at different times. Illustrates the use of these procedures using the FACETS…

  10. Exploratory Factor Analysis Study for the Scale of High School Students' Attitudes towards Chemistry

    ERIC Educational Resources Information Center

    Demircioglu, Gökhan; Aslan, Aysegül; Yadigaroglu, Mustafa

    2014-01-01

    It is important to develop students' positive attitudes toward chemistry lessons in school because research has suggested that attitudes are linked with academic achievement. Therefore, how to evaluate attitudes is an important topic in education. The purpose of this study was to develop a Likert-type scale that could measure high school students'…

  11. Evaluation of Social Cognitive Scaling Response Options in the Physical Activity Domain

    ERIC Educational Resources Information Center

    Rhodes, Ryan E.; Matheson, Deborah Hunt; Mark, Rachel

    2010-01-01

    The purpose of this study was to compare the reliability, variability, and predictive validity of two common scaling response formats (semantic differential, Likert-type) and two numbers of response options (5-point, 7-point) in the physical activity domain. Constructs of the theory of planned behavior were chosen in this analysis based on its…

  12. Development of the perceptions of racism scale.

    PubMed

    Green, N L

    1995-01-01

    Racism may be a factor in low-birth-weight (LBW) and preterm delivery in African American childbearing women. Because no satisfactory measure of racism existed, the Perception of Racism Scale (PRS) was developed. The PRS was pilot tested on 109 participants from churches and community organizations. The scale was then used in a study of 136 childbearing women to investigate LBW and preterm delivery. Twenty items rated on a 4-point Likert-type scale were scored with 1 as the lowest and 4 as the highest perception of racism. Alpha reliabilities were .88 for the pilot and .91 for the study. Content validity was strengthened by expert panel critique. Reliability, content validity, and construct validity were demonstrated and no undue participant burden was observed. The scale is an effective instrument to measure perceptions of racism by African American women.

  13. Using Visual Analogue Scales in eHealth: Non-Response Effects in a Lifestyle Intervention

    PubMed Central

    Reips, Ulf-Dietrich; Wienert, Julian; Lippke, Sonia

    2016-01-01

    Background Visual analogue scales (VASs) have been shown to be valid measurement instruments and a better alternative to Likert-type scales in Internet-based research, both empirically and theoretically [1,2]. Upsides include more differentiated responses, better measurement level, and less error. Their feasibility and properties in the context of eHealth, however, have not been examined so far. Objective The present study examined VASs in the context of a lifestyle study conducted online, measuring the impact of VASs on distributional properties and non-response. Method A sample of 446 participants with a mean age of 52.4 years (standard deviation (SD) = 12.1) took part in the study. The study was carried out as a randomized controlled trial, aimed at supporting participants over 8 weeks with an additional follow-up measurement. In addition to the randomized questionnaire, participants were further randomly assigned to either a Likert-type or VAS response scale version of the measures. Results Results showed that SDs were lower for items answered via VASs, 2P (Y ≥ 47 | n=55, P=.5) < .001. Means did not differ across versions. Participants in the VAS version showed lower dropout rates than participants in the Likert version, odds ratio = 0.75, 90% CI (0.58-0.98), P=.04. Number of missing values did not differ between questionnaire versions. Conclusions The VAS is shown to be a valid instrument in the eHealth context, offering advantages over Likert-type scales. The results of the study provide further support for the use of VASs in Internet-based research, extending the scope to senior samples in the health context. Trial Registration Clinicaltrials.gov NCT01909349; https://clinicaltrials.gov/ct2/show/NCT01909349 (Archived by WebCite at http://www.webcitation.org/6h88sLw2Y) PMID:27334562
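
    The statistic quoted above, 2P(Y ≥ 47 | n = 55, P = .5) < .001, appears to be a two-sided sign test over 55 paired items, 47 of which had a lower SD under the VAS version. Its tail probability can be recomputed as a quick check:

        # Two-sided binomial (sign-test) probability quoted in the record above.
        from scipy.stats import binom

        n, k = 55, 47
        p_two_sided = 2 * binom.sf(k - 1, n, 0.5)   # sf(k-1) = P(Y >= k)
        print(f"2 * P(Y >= {k} | n = {n}, p = 0.5) = {p_two_sided:.2e}")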

  14. The assessment of dyspnea during the vigorous intensity exercise by three Dyspnea Rating Scales in inactive medical personnel.

    PubMed

    Intarakamhang, Patrawut; Wangjongmeechaikul, Piyathida

    2013-07-24

    It is well recognized that exercise is good for health, especially as it is known to prevent metabolic syndromes such as diabetes, hypertension and heart disease. To reap the benefits of exercise, the most appropriate level of intensity must be determined, ranging from low, through low to moderate, to hard (vigorous). This study aimed (1) to investigate and evaluate three subjective rating scales, the Borg scale, the combined Numerical Rating Scale (NRS) + FACES Dyspnea Rating Scale (FACES), and the Likert scale, during hard (vigorous) exercise, and (2) to compare the effectiveness of the Borg scale and the combined NRS + FACES scale during hard (vigorous) intensity exercise. This study uses a descriptive methodology. The sample group was 73 medical personnel leading an inactive lifestyle, recruited as volunteers from Phramongkutklao Hospital. Participants were randomly divided into three groups: group 1 reported using the Borg scale, group 2 using the NRS + FACES, and group 3 subjectively assessed the intensity of the exercise using the Likert scale, during a treadmill Exercise Stress Test (EST) following the Bruce protocol. The upper limit of intensity in the study was equal to 85% of the maximal heart rate of all participants. The subjective report of the experienced level of dyspnea was made immediately after the completion of exercise. The average age of participants was 23.37 years. The 26 participants reporting using the Borg scale had a mean Borg scale score of 13.46 ± 1.77 and a mode score of 15. The 24 participants reporting intensity levels through the NRS + FACES had a mean NRS + FACES score of 6.83 ± 1.09 and a mode on the NRS + FACES scale equal to 7. The Likert scale group comprised 23 participants with a mean Likert scale score of 2.74; that is, those choosing levels 2 and 3 numbered 6 (26.9%) and 17 participants (73.95%), respectively. Comparing the two groups with the Borg scale at equal to or greater than 15…

  15. Acupuncture at “Zusanli” (St.36) and “Sanyinjiao” (SP.6) Points on the Gastrointestinal Tract: A Study of the Bioavailability of 99mTc-Sodium Pertechnetate in Rats

    PubMed Central

    Senna-Fernandes, Vasco; França, Daisy L. M.; de Souza, Deise; Santos, Kelly C. M.; Sousa, Rafael S.; Manoel, Cristiano V.; Santos-Filho, Sebastião D.; Cortez, Célia M.; Bernardo-Filho, Mario; Guimarães, Marco Antonio M.

    2011-01-01

    The objective of this study is to investigate the differences of acupuncture effect between the Zusanli (St.36) and Sanyinjiao (SP.6) points on the gastrointestinal-tract (GIT) segment performed by the bioavailability of 99mTc-sodium-pertechnetate (Na99mTcO4) in rats. Male Wistar rats (n = 21) were allocated into three groups of seven each. Group 1 was treated by acupuncture bilaterally at St.36; Group 2 at SP.6; and Group 3 was untreated (control). After 10 min of needle insertion in anesthetized rats, 0.3 mL of Na99mTcO4 (7.4 MBq) was injected via ocular-plexus. After 20 min, the exitus of animals was induced by cervical-dislocation and GIT organs isolated. However, immediately before the exitus procedure, blood was collected by cardiac-puncture for blood radio-labeling (BRL). The radioactivity uptake of the blood constituents was calculated together with the GIT organs by a well gamma counter. The percentage of injected dose per gram of tissue (%ID/g) of Na99mTcO4 was calculated for each GIT organs, while BRL was calculated in %ID. According to the one-way ANOVA, the stomach, jejunum, ileum from the treated groups (Group 1 and Group 2) had significant differences compared to the controls (Group 3). However, between the treated groups (Group 1 and Group 2), there were significant differences (P < .05) in the stomach, jejunum, ileum, cecum, transverse and rectum. In BRL analysis, Group 2 showed significant increase and decrease of the insoluble and soluble fractions of the blood cells, respectively (P < .0001). The authors suggest that St.36 may have a tendency of up-regulation effect on GIT, whereas SP.6, down-regulation effect. However, further rigorous experimental studies to examine the effectiveness of acupuncture in either acupuncture points need to be carried out. PMID:19213853

  16. Measuring pretest-posttest change with a Rasch Rating Scale Model.

    PubMed

    Wolfe, E W; Chiu, C W

    1999-01-01

    When measures are taken on the same individual over time, it is difficult to determine whether observed differences are the result of changes in the person or changes in other facets of the measurement situation (e.g., interpretation of items or use of the rating scale). This paper describes a method for disentangling changes in persons from changes in the interpretation of Likert-type questionnaire items and the use of rating scales (Wright, 1996a). The procedure relies on anchoring strategies to create a common frame of reference for interpreting measures taken at different times, and the paper provides a detailed illustration of how to implement these procedures using FACETS.
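
    A minimal sketch of the anchoring idea, under strong simplifying assumptions: the paper's procedure uses the Rasch Rating Scale Model in FACETS, whereas the toy example below simulates dichotomous items, calibrates item difficulties at time 1 with a crude centred-logit (PROX-like) approximation, and then holds those difficulties fixed (anchored) when estimating time-2 person measures, so that pretest and posttest measures share one frame of reference.

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)
    n_persons, n_items = 200, 10
    true_b = np.linspace(-1.5, 1.5, n_items)       # item difficulties (logits)
    theta_t1 = rng.normal(0.0, 1.0, n_persons)     # pretest abilities
    theta_t2 = theta_t1 + 0.5                      # true change of +0.5 logits

    def simulate(theta, b):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        return (rng.random(p.shape) < p).astype(int)

    x_t1, x_t2 = simulate(theta_t1, true_b), simulate(theta_t2, true_b)

    # Step 1: anchor values = centred logits of the time-1 item p-values.
    p_items = x_t1.mean(axis=0).clip(0.01, 0.99)
    b_anchor = -np.log(p_items / (1.0 - p_items))
    b_anchor -= b_anchor.mean()

    # Step 2: maximum-likelihood person measures with item difficulties held fixed.
    def mle_theta(resp, b):
        def nll(theta):
            p = 1.0 / (1.0 + np.exp(-(theta - b)))
            return -np.sum(resp * np.log(p) + (1.0 - resp) * np.log(1.0 - p))
        return minimize_scalar(nll, bounds=(-6, 6), method="bounded").x

    theta_hat_t1 = np.array([mle_theta(r, b_anchor) for r in x_t1])
    theta_hat_t2 = np.array([mle_theta(r, b_anchor) for r in x_t2])
    print("estimated mean pre-post change (logits):",
          round(float((theta_hat_t2 - theta_hat_t1).mean()), 2))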

  17. Psychometric Properties of Measures of Team Diversity with Likert Data

    ERIC Educational Resources Information Center

    Deng, Lifang; Marcoulides, George A.; Yuan, Ke-Hai

    2015-01-01

    Certain diversity among team members is beneficial to the growth of an organization. Multiple measures have been proposed to quantify diversity, although little is known about their psychometric properties. This article proposes several methods to evaluate the unidimensionality and reliability of three measures of diversity. To approximate the…

  18. A Measurement Model for Likert Responses that Incorporates Response Time

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Lorenzo-Seva, Urbano

    2007-01-01

    This article describes a model for response times that is proposed as a supplement to the usual factor-analytic model for responses to graded or more continuous typical-response items. The use of the proposed model together with the factor model provides additional information about the respondent and can potentially increase the accuracy of the…

  19. Validation of the self-confidence scale of nursing care in urinary retention

    PubMed Central

    Mazzo, Alessandra; Martins, José Carlos Amado; Jorge, Beatriz Maria; Batista, Rui Carlos Negrão; Almeida, Rodrigo Guimarães dos Santos; Henriques, Fernando Manuel Dias; Coutinho, Verónica Rita Dias; Mendes, Isabel Amélia Costa

    2015-01-01

    Objective: to validate an instrument to measure self-confidence in nursing care for urinary retention. Methods: methodological research study, carried out after ethical approval. A Likert-like scale of 32 items related to nursing care in urinary retention was applied to students of the graduate nursing course. For instrument validation, sample adequacy analysis, principal component analysis, Varimax orthogonal rotation and internal consistency analyses were performed. Results: in a sample of 305 students, there was a high correlation of all items with the total scale and a Cronbach's alpha of 0.949. The scale items were divided into five factors, each with internal consistency: Factor 1 (0.890), Factor 2 (0.874), Factor 3 (0.868), Factor 4 (0.814) and Factor 5 (0.773), respectively. Conclusion: the scale meets the validity requirements, demonstrating potential for use in evaluation and research. PMID:26487130

  20. Development of the Adolescent Cancer Suffering Scale

    PubMed Central

    Khadra, Christelle; Le May, Sylvie; Tremblay, Isabelle; Dupuis, France; Cara, Chantal; Mercier, Geneviève; Vachon, Marie France; Fiola, Jacinthe Lachance

    2015-01-01

    BACKGROUND: While mortality due to pediatric cancer has decreased, suffering has increased due to complex and lengthy treatments. Cancer in adolescence has repercussions on personal and physical development. Although suffering can impede recovery, there is no validated scale in French or English to measure suffering in adolescents with cancer. OBJECTIVE: To develop an objective scale to measure suffering in adolescents with cancer. METHODS: A methodological design for instrument development was used. Following a MEDLINE search, semistructured interviews were conducted with adolescents 12 to 19 years of age who had undergone four to six weeks of cancer treatment, and with a multidisciplinary cohort of health care professionals. Adolescents with advanced terminal cancer or cognitive impairment were excluded. Enrollment proceeded from the hematology-oncology department/clinic in Montreal, Quebec, from December 2011 to March 2012. Content validity was assessed by five health care professionals and four adolescents with cancer. RESULTS: Interviews with 19 adolescents and 16 health care professionals identified six realms of suffering: physical, psychological, spiritual, social, cognitive and global. Through iterative feedback, the Adolescent Cancer Suffering Scale (ACSS) was developed, comprising 41 questions on a four-point Likert scale and one open-ended question. Content validity was 0.98, and inter-rater agreement among professionals was 88% for relevance and 86% for clarity. Adolescents considered the scale to be representative of their suffering. CONCLUSIONS: The ACSS is the first questionnaire to measure suffering in adolescents with cancer. In future research, the questionnaire should be validated extensively and interventions developed. Once validated, the ACSS will help promote a holistic approach to health with appropriate intervention or referral. PMID:26252665

  1. Internal consistency of a five-item form of the Francis Scale of Attitude Toward Christianity among adolescent students.

    PubMed

    Campo-Arias, Adalberto; Oviedo, Heidi Celina; Cogollo, Zuleima

    2009-04-01

    The short form of the Francis Scale of Attitude Toward Christianity (L. J. Francis, 1992) is a 7-item Likert-type scale that shows high homogeneity among adolescents. The psychometric performance of a shorter version of this scale has not been explored. The authors aimed to determine the internal consistency of a 5-item form of the Francis Scale of Attitude Toward Christianity among 405 students from a school in Cartagena, Colombia. The authors computed Cronbach's alpha coefficient for the 5 items with the highest corrected item-total score correlations. The version without Items 2 and 7 showed internal consistency of .87. The 5-item version of the Francis Scale of Attitude Toward Christianity exhibited higher internal consistency than did the 7-item version. Future researchers should corroborate this finding. PMID:19425361
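
    For readers who want to reproduce this kind of item analysis, the sketch below (Python, on simulated 5-point responses rather than the study's data) computes Cronbach's alpha and corrected item-total correlations, then re-computes alpha for the five items with the highest corrected correlations.

    import numpy as np

    def cronbach_alpha(items):                     # items: (n_respondents, k_items)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def corrected_item_total(items):
        """Correlation of each item with the sum of the remaining items."""
        out = []
        for j in range(items.shape[1]):
            rest = items.sum(axis=1) - items[:, j]
            out.append(np.corrcoef(items[:, j], rest)[0, 1])
        return np.array(out)

    rng = np.random.default_rng(0)
    n = 405
    latent = rng.normal(size=(n, 1))
    strong = 3 + latent + rng.normal(0, 0.8, size=(n, 5))        # 5 items tied to the trait
    weak = 3 + 0.2 * latent + rng.normal(0, 1.2, size=(n, 2))    # 2 weakly related items
    responses = np.clip(np.round(np.hstack([strong, weak])), 1, 5)

    r_it = corrected_item_total(responses)
    best5 = np.argsort(r_it)[-5:]                  # keep the 5 most discriminating items
    print("alpha, 7 items:", round(cronbach_alpha(responses), 2))
    print("alpha, best 5 items:", round(cronbach_alpha(responses[:, best5]), 2))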

  2. Thurstonian Scaling of Compositional Questionnaire Data.

    PubMed

    Brown, Anna

    2016-01-01

    To prevent response bias, personality questionnaires may use comparative response formats. These include forced choice, where respondents choose among a number of items, and quantitative comparisons, where respondents indicate the extent to which items are preferred to each other. The present article extends Thurstonian modeling of binary choice data to "proportion-of-total" (compositional) formats. Following the seminal work of Aitchison, compositional item data are transformed into log ratios, conceptualized as differences of latent item utilities. The mean and covariance structure of the log ratios is modeled using confirmatory factor analysis (CFA), where the item utilities are first-order factors, and personal attributes measured by a questionnaire are second-order factors. A simulation study with two sample sizes, N = 300 and N = 1,000, shows that the method provides very good recovery of true parameters and near-nominal rejection rates. The approach is illustrated with empirical data from N = 317 students, comparing model parameters obtained with compositional and Likert-scale versions of a Big Five measure. The results show that the proposed model successfully captures the latent structures and person scores on the measured traits. PMID:27054408
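
    The log-ratio step described above can be illustrated with a short sketch (Python, with hypothetical point allocations; the subsequent CFA stage is not shown): proportion-of-total responses within an item block are converted to additive log-ratios against a reference item, following Aitchison.

    import numpy as np

    def additive_log_ratio(compositions, ref=-1, eps=1e-6):
        """alr_j = log(x_j / x_ref) for every non-reference item in the block."""
        x = np.clip(np.asarray(compositions, dtype=float), eps, None)
        x = x / x.sum(axis=1, keepdims=True)          # re-close to proportions of total
        ref_col = x[:, ref][:, None]
        keep = [j for j in range(x.shape[1]) if j != (ref % x.shape[1])]
        return np.log(x[:, keep] / ref_col)

    # Two respondents distributing 100 points over a block of four items.
    block = [[40, 30, 20, 10],
             [10, 20, 30, 40]]
    print(additive_log_ratio(block))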

  3. Scale and scaling in soils

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scale is recognized as a central concept in the description of the hierarchical organization of our world. Pressing environmental and societal problems require an understanding of how processes operate at different scales, and how they can be linked across scales. Soil science, as many other dis…

  4. What Should Constitute a Health Related Quality of Life Scale for Parkinson’s Disease?

    PubMed Central

    Aggarwal, Rajeev; Goyal, Vinay; Pandey, Ravindra Mohan; Kumar, Nand; Singh, Sumit; Shukla, Garima

    2016-01-01

    Introduction Health Related Quality of Life (HRQoL) in Parkinson’s Disease (PD) lacks a universally agreed definition and set of components. A conceptual framework helps in understanding the essential domains and their inter-relationships while developing a patient-reported outcome measure. Aim To construct a conceptual framework for developing an HRQoL scale in PD. Materials and Methods A panel of 7 experts extracted 6 major domains for measuring HRQoL in PD from a literature review covering 8 disease-specific scales for PD, 2 books on quality of life, 5 websites and relevant articles, and from content analysis of semi-structured interviews of stakeholders (28 persons with PD, 6 caregivers and 9 clinicians). The extracted domains were subjected to stakeholder consensus (7 persons with PD, 7 caregivers and 7 clinicians) on a 7-point Likert scale. The panel constructed a conceptual framework and a definition of HRQoL in PD in the context of available guidelines for developing patient-reported outcome measures. Results The extracted domains were the physical, non-motor symptom, psychological, family/social, finance and treatment domains. The median of all six domains on the 7-point Likert scale was 7 and the inter-quartile distance was <1, indicating consensus agreement. The conceptual framework consisted of indicator domains and causal domains. Indicator domains (physical, psychological, and social and family) estimate the influence of causal domains (motor symptoms, non-motor symptoms, finance and treatment) on quality of life. The definition emphasizes the person’s perception of their symptoms and their impact on their lives. Conclusion This study defined and developed a conceptual framework for an HRQoL scale for PD. PMID:27790491

  5. Millimeter Scale.

    ERIC Educational Resources Information Center

    Harvill, Leo M.

    This absolute scale contains nine items, each of which consists of a 100-millimeter vertical line with small division marks every 25 millimeters and the words "high" at the top and "low" at the bottom of the line. Above each of the vertical lines is a word or phrase. For the second-grade scale these words are: arithmetic, counting, adding,…

  6. Activity Scale.

    ERIC Educational Resources Information Center

    Kerpelman, Larry C.; Weiner, Michael J.

    This twenty-four item scale assesses students' actual and desired political-social activism in terms of physical participation, communication activities, and information-gathering activities. About ten minutes are required to complete the instrument. The scale is divided into two subscales. The first twelve items (ACT-A) question respondents on…

  7. Development of The Viking Speech Scale to classify the speech of children with cerebral palsy.

    PubMed

    Pennington, Lindsay; Virella, Daniel; Mjøen, Tone; da Graça Andrada, Maria; Murray, Janice; Colver, Allan; Himmelmann, Kate; Rackauskaite, Gija; Greitane, Andra; Prasauskiene, Audrone; Andersen, Guro; de la Cruz, Javier

    2013-10-01

    Surveillance registers monitor the prevalence of cerebral palsy and the severity of resulting impairments across time and place. The motor disorders of cerebral palsy can affect children's speech production and limit their intelligibility. We describe the development of a scale to classify children's speech performance for use in cerebral palsy surveillance registers, and its reliability across raters and across time. Speech and language therapists, other healthcare professionals and parents classified the speech of 139 children with cerebral palsy (85 boys, 54 girls; mean age 6.03 years, SD 1.09) from observation and previous knowledge of the children. Another group of health professionals rated children's speech from information in their medical notes. With the exception of parents, raters reclassified children's speech at least four weeks after their initial classification. Raters were asked to rate how easy the scale was to use and how well the scale described the child's speech production using Likert scales. Inter-rater reliability was moderate to substantial (k>.58 for all comparisons). Test-retest reliability was substantial to almost perfect for all groups (k>.68). Over 74% of raters found the scale easy or very easy to use; 66% of parents and over 70% of health care professionals judged the scale to describe children's speech well or very well. We conclude that the Viking Speech Scale is a reliable tool to describe the speech performance of children with cerebral palsy, which can be applied through direct observation of children or through case note review.
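
    The agreement statistics quoted above are kappa values; the sketch below (Python, with invented ratings rather than the study's data) shows how such inter-rater agreement can be computed, including a weighted variant that credits near-misses on an ordered scale such as this one.

    from sklearn.metrics import cohen_kappa_score

    # Two raters classifying the same children on the 4-level scale (hypothetical).
    rater_a = [1, 2, 2, 3, 4, 1, 2, 3, 3, 4, 1, 2]
    rater_b = [1, 2, 3, 3, 4, 1, 2, 3, 2, 4, 1, 2]

    print("unweighted kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))
    print("linearly weighted kappa:",
          round(cohen_kappa_score(rater_a, rater_b, weights="linear"), 2))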

  8. Scale interactions

    NASA Astrophysics Data System (ADS)

    Snow, John T.

    Since the time of the first world war, investigation of synoptic processes has been a major focus of atmospheric research. These are the physical processes that drive the continuously evolving pattern of high and low pressure centers and attendant frontal boundaries that are to be seen on continental-scale weather maps. This effort has been motivated both by a spirit of scientific inquiry and by a desire to improve operational weather forecasting by national meteorological services. These national services in turn have supported the development of a global observational network that provides the data required for both operational and research purposes. As a consequence of this research, there now exists a reasonable physical understanding of many of the phenomena found at this synoptic scale. This understanding is reflected in the numerical weather forecast models used by the national services. These have shown significant skill in predicting the evolution of synoptic-scale features for periods extending out to five days.

  9. Scaling Rules!

    NASA Astrophysics Data System (ADS)

    Malkinson, Dan; Wittenberg, Lea

    2015-04-01

    Scaling is a fundamental issue in any spatially or temporally hierarchical system. Defining domains and identifying the boundaries of the hierarchical levels may be a challenging task. Hierarchical systems may be broadly classified to two categories: compartmental and continuous ones. Examples of compartmental systems include: governments, companies, computerized networks, biological taxonomy and others. In such systems the compartments, and hence the various levels and their constituents are easily delineated. In contrast, in continuous systems, such as geomorphological, ecological or climatological ones, detecting the boundaries of the various levels may be difficult. We propose that in continuous hierarchical systems a transition from one functional scale to another is associated with increased system variance. Crossing from a domain of one scale to the domain of another is associated with a transition or substitution of the dominant drivers operating in the system. Accordingly we suggest that crossing this boundary is characterized by increased variance, or a "variance leap", which stabilizes, until crossing to the next domain or hierarchy level. To assess this we compiled sediment yield data from studies conducted at various spatial scales and from different environments. The studies were partitioned to ones conducted in undisturbed environments, and those conducted in disturbed environments, specifically by wildfires. The studies were conducted in plots as small as 1 m2, and watersheds larger than 555000 ha. Regressing sediment yield against plot size, and incrementally calculating the variance in the systems, enabled us to detect domains where variance values were exceedingly high. We propose that at these domains scale-crossing occurs, and the systems transition from one hierarchical level to another. Moreover, the degree of the "variance leaps" characterizes the degree of connectivity among the scales.

  10. Scaling satan.

    PubMed

    Wilson, K M; Huff, J L

    2001-05-01

    The influence on social behavior of beliefs in Satan and the nature of evil has received little empirical study. Elaine Pagels (1995) in her book, The Origin of Satan, argued that Christians' intolerance toward others is due to their belief in an active Satan. In this study, more than 200 college undergraduates completed the Manitoba Prejudice Scale and the Attitudes Toward Homosexuals Scale (B. Altemeyer, 1988), as well as the Belief in an Active Satan Scale, developed by the authors. The Belief in an Active Satan Scale demonstrated good internal consistency and temporal stability. Correlational analyses revealed that for the female participants, belief in an active Satan was directly related to intolerance toward lesbians and gay men and intolerance toward ethnic minorities. For the male participants, belief in an active Satan was directly related to intolerance toward lesbians and gay men but was not significantly related to intolerance toward ethnic minorities. Results of this research showed that it is possible to meaningfully measure belief in an active Satan and that such beliefs may encourage intolerance toward others. PMID:11577971

  12. [Satisfaction scales with health care to cardiovascular diseases: CARDIOSATIS--patient and team].

    PubMed

    Cardoso, Clareci Silva; Bandeira, Marina; Ribeiro, Antonio Luiz Pinho; Oliveira, Graziella Lage; Caiaffa, Waleska Teixeira

    2011-01-01

    Satisfaction is an important measure of quality of care, adherence to treatment and adequate use of health services. The objective here was to build two scales that evaluate team and patient satisfaction with cardiovascular disease treatment provided through a distance telecardiology project. The procedure followed international standards for the development of measurement instruments, including the operational definition of the satisfaction construct and its areas of evaluation, item definition, pre-testing and a pilot study. A literature review, focus groups and discussion with specialists delimited the domains to be included in the scales and guided the elaboration of their items. The CARDIOSATIS-Team included 15 items and the CARDIOSATIS-Patient included 11. Satisfaction was measured on a five-point Likert scale. The scales' items covered satisfaction with physical structure, human resources, capacity of resolution, the attention and care offered by the service, and satisfaction with the care received/given. The scales also included open questions. The CARDIOSATIS scales proved to be easy and accessible instruments, very well accepted by the medical team and patients. Preliminary results showed good characteristics of validity and reliability. PMID:21503491

  13. Nuclear scales

    SciTech Connect

    Friar, J.L.

    1998-12-01

    Nuclear scales are discussed from the nuclear physics viewpoint. The conventional nuclear potential is characterized as a black box that interpolates nucleon-nucleon (NN) data, while being constrained by the best possible theoretical input. The latter consists of the longer-range parts of the NN force (e.g., OPEP, TPEP, the π-γ force), which can be calculated using chiral perturbation theory and gauged using modern phase-shift analyses. The shorter-range parts of the force are effectively parameterized by moments of the interaction that are independent of the details of the force model, in analogy to chiral perturbation theory. Results of GFMC calculations in light nuclei are interpreted in terms of fundamental scales, which are in good agreement with expectations from chiral effective field theories. Problems with spin-orbit-type observables are noted.

  14. The development of a sense of control scale

    PubMed Central

    Dong, Mia Y.; Sandberg, Kristian; Bibby, Bo M.; Pedersen, Michael N.; Overgaard, Morten

    2015-01-01

    In the past decades, sense of control (the feeling that one is in control of one’s actions) has gained much scientific interest. Various scales have been used to measure sense of control in previous studies, yet no study has allowed participants to create a scale for rating their control experiences, despite advances in the neighboring field of conscious vision being linked to this approach. Here, we examined how participants preferred to rate sense of control during a simple motor control task by asking them to create a scale to describe their sense of control experience during the task. A scale with six steps was the most frequently created. Even though some variability was observed in the number of preferred scale steps, descriptions were highly similar across all participants when scales were converted to the same continuum. When we divided participants into groups based on their number of preferred scale steps, mean task performance and sense of control could be described as sigmoid functions of the noise level, and the function parameters were equivalent across groups. We also showed that task performance increased exponentially as a function of control rating, and that, again, the function parameters were equivalent for all groups. In summary, the present study established a participant-generated 6-point sense of control rating scale for simple computerized motor control tasks that can be empirically tested against other measures of control in future studies. PMID:26594195
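
    A sketch of the curve fitting described (sense of control as a sigmoid function of noise level), using non-linear least squares on hypothetical noise levels and ratings; the parameter names and starting values are illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(x, lower, upper, midpoint, slope):
        return lower + (upper - lower) / (1.0 + np.exp(slope * (x - midpoint)))

    noise = np.linspace(0, 1, 11)
    control_rating = np.array([5.8, 5.7, 5.5, 5.1, 4.3, 3.4, 2.5, 1.9, 1.5, 1.3, 1.2])

    params, _ = curve_fit(sigmoid, noise, control_rating, p0=[1.0, 6.0, 0.5, 8.0])
    print("fitted [lower, upper, midpoint, slope]:", np.round(params, 2))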

  15. The VAGUS insight into psychosis scale – Self-report & clinician-rated versions

    PubMed Central

    Gerretsen, Philip; Remington, Gary; Borlido, Carol; Quilty, Lena; Hassan, Sabrina; Polsinelli, Gina; Teo, Celine; Mar, Wanna; Simon, Regina; Menon, Mahesh; Pothier, David D.; Nakajima, Shinichiro; Caravaggio, Fernando; Mamo, David C.; Rajji, Tarek K.; Mulsant, Benoit H.; Deluca, Vincenzo; Ganguli, Rohan; Pollock, Bruce G.; Graff-Guerrero, Ariel

    2015-01-01

    The aim of this study was to develop self-report and clinician-rated versions of an insight scale that would be easy to administer, sensitive to small changes, and inclusive of the core dimensions of clinical insight into psychosis. Ten-item self-report (VAGUS-SR) and five-item clinician-rated (VAGUS-CR) scales were designed to measure the dimensions of insight into psychosis and evaluated in 215 and 140 participants, respectively (www.vagusonline.com). Tests of reliability and validity were performed. Both the VAGUS-SR and VAGUS-CR showed good internal consistency and reliability. They demonstrated good convergent and discriminant validity. Both versions were strongly correlated with one another and with the Schedule for the Assessment of Insight and Birchwood Insight Scale. Exploratory factor analyses identified three possible latent components of insight. The VAGUS-CR and VAGUS-SR are valid, reliable and easy to administer. They build on previous insight scales with separate clinician-rated and self-report versions. The VAGUS-SR exhibited a multidimensional factor structure. Using a 10-point Likert scale for each item, the VAGUS has the capacity to detect small, temporally sensitive changes in insight, which is essential for intervention studies with neurostimulation or rapidly acting medications. PMID:25246410

  16. Development and Validation of a Scale to Assess Students' Attitude towards Animal Welfare

    NASA Astrophysics Data System (ADS)

    Mazas, Beatriz; Rosario Fernández Manzanal, Mª; Zarza, Francisco Javier; Adolfo María, Gustavo

    2013-07-01

    This work presents the development of a scale of attitudes of secondary-school and university students towards animal welfare. A questionnaire was drawn up following a Likert-type scale attitude assessment model. Four components or factors, which globally measure animal welfare, are proposed to define the object of the attitude. The components are animal abuse for pleasure or due to ignorance (C1), leisure with animals (C2), farm animals (C3) and animal abandonment (C4). The final version of the questionnaire contains 29 items that are evenly distributed among the four components indicated, guaranteeing that each component is one-dimensional. A sample of 329 students was used to validate the scale. These students were aged between 11 and 25, and were from secondary schools in Aragon and the University in Zaragoza (Aragon's main and largest city, located in NE Spain). The scale shows good internal reliability, with a Cronbach's alpha value of 0.74. The questionnaire was later given to 1,007 students of similar levels and ages to the sample used in the validation, the results of which are presented in this study. The most relevant results show significant differences in gender and level of education in some of the components of the scale, observing that women and university students rate animal welfare more highly.

  17. Development and psychometric evaluation of the Nurses Professional Values Scale--Revised.

    PubMed

    Weis, Darlene; Schank, Mary Jane

    2009-01-01

    The Nurses Professional Values Scale--Revised (NPVS-R) is an instrument derived from the American Nurses Association Code of Ethics for Nurses designed to measure nurses' professional values. The purpose of this study was to examine the psychometric properties of the NPVS-R in a random sample of baccalaureate and master's students and practicing nurses. The NPVS-R, a 26-item Likert-scale format instrument, was tested on 782 subjects. Responses to the NPVS-R were subjected to exploratory and confirmatory factor analysis. Principal components analysis with varimax rotation and Kaiser normalization resulted in a five-factor solution explaining 56.7% of the common variance. Findings supported internal consistency reliability of five factors with alpha coefficients from .70 to .85 and a total scale alpha coefficient of .92. Construct validity was supported with an overall factor loading range of .46 to .79 across the five factors labeled Caring, Activism, Trust, Professionalism, and Justice. The NPVS-R is a psychometrically sound instrument for measuring professional nurses' values and enhancing professional socialization. PMID:20069950

  18. Psychometric properties of the Scale for Quality Evaluation of the Bachelor Degree in Nursing Version 2 (QBN 2).

    PubMed

    Macale, Loreana; Scialò, Gennaro; Di Sarra, Luca; De Marinis, Maria Grazia; Rocco, Gennaro; Vellone, Ercole; Alvaro, Rosaria

    2014-03-01

    To evaluate all the variables that affect nursing education, it is important for nursing educators to have valid and reliable instruments that can measure the perceived quality of the Bachelor Degree in Nursing. This study tested the Scale for Quality Evaluation of the Bachelor Degree in Nursing and its psychometric properties using a descriptive design. Participants were first-, second- and third-year students of the Bachelor Degree in Nursing Science from three Italian universities. The Scale for Quality Evaluation of the Bachelor Degree in Nursing consists of 65 items rated on a 4-point Likert scale ranging from "strongly disagree" to "strongly agree". The instrument derives from a prior 41-item version whose items were modified and supplemented with 24 new items to improve reliability. Six hundred and fifty questionnaires were completed and considered for the present study. The mean age of the students was 24.63 years, and 65.5% were female. The scale showed very high reliability, with a Cronbach's alpha of 0.96. Construct validity was tested with a factor analysis that yielded 7 factors. The Scale for Quality Evaluation of the Bachelor Degree in Nursing, although requiring further study, represents a useful instrument to measure the quality of the Bachelor Nursing Degree. PMID:23810577

  19. Likert Response Alternative Direction: SA to SD or SD to SA: Does It Make a Difference?

    ERIC Educational Resources Information Center

    Barnette, J. Jackson

    A 20-item survey was designed in 4 forms with response set direction as "strongly disagree" (SD) to "strongly agree" (SA) and SA to SD crossed with the absence or presence of negatively worded item stems. The primary research question related to finding a primacy effect when comparing the two response direction formats. Surveys were administered,…

  20. A Survey of the Management System at St. Petersburg Junior College Using Likert's Profile.

    ERIC Educational Resources Information Center

    Pesuth, F. X.

    A study was conducted at St. Petersburg Junior College (Florida) to determine the level of congruency of current perceptions and future expectations regarding organizational climate, of faculty, professional career personnel, and upper-level supervisors. Also surveyed were first-level supervisors' beliefs of their subordinates' perceptions of…

  1. The Impact of Outliers on Cronbach's Coefficient Alpha Estimate of Reliability: Ordinal/Rating Scale Item Responses

    ERIC Educational Resources Information Center

    Liu, Yan; Wu, Amery D.; Zumbo, Bruno D.

    2010-01-01

    In a recent Monte Carlo simulation study, Liu and Zumbo showed that outliers can severely inflate the estimates of Cronbach's coefficient alpha for continuous item response data--visual analogue response format. Little, however, is known about the effect of outliers for ordinal item response data--also commonly referred to as Likert, Likert-type,…
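
    The effect described can be sketched with a small simulation (Python; the sample size, item count and contamination pattern are arbitrary assumptions, not those of the original studies): a handful of respondents answering uniformly at the scale floor or ceiling adds shared variance across items and pushes coefficient alpha upward.

    import numpy as np

    def cronbach_alpha(items):
        k = items.shape[1]
        return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                                / items.sum(axis=1).var(ddof=1))

    rng = np.random.default_rng(42)
    n, k = 200, 6
    latent = rng.normal(size=(n, 1))
    clean = np.clip(np.round(3 + 0.4 * latent + rng.normal(0, 1.1, (n, k))), 1, 5)

    contaminated = clean.copy()
    contaminated[:5, :] = 1        # 5 outliers answering uniformly at the scale floor
    contaminated[5:10, :] = 5      # 5 outliers answering uniformly at the ceiling

    print("alpha, clean data:      ", round(cronbach_alpha(clean), 2))
    print("alpha, with 10 outliers:", round(cronbach_alpha(contaminated), 2))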

  2. Reliability and validity of the Student Perceptions of School Cohesion Scale in a sample of Salvadoran secondary school students

    PubMed Central

    2009-01-01

    Background Despite a growing body of research from the United States and other industrialized countries on the inverse association between supportive social relationships in the school and youth risk behavior engagement, research on the measurement of supportive school social relationships in Central America is limited. We examined the psychometric properties of the Student Perceptions of School Cohesion (SPSC) scale, a 10-item scale that asks students to rate with a 5-point Likert-type response scale their perceptions of the school social environment, in a sample of public secondary school students (mean age = 15 years) living in central El Salvador. Methods Students (n = 982) completed a self-administered questionnaire that included the SPSC scale along with measures of youth health risk behaviors based on the Center for Disease Control and Prevention's Youth Risk Behavior Survey. Exploratory factor analysis was used to assess the factor structure of the scale, and two internal consistency estimates of reliability were computed. Construct validity was assessed by examining whether students who reported low school cohesion were significantly more likely to report physical fighting and illicit drug use. Results Results indicated that the SPSC scale has three latent factors, which explained 61.6% of the variance: supportive school relationships, student-school connectedness, and student-teacher connectedness. The full scale and three subscales had good internal consistency (rs = .87 and α = .84 for the full scale; rs and α between .71 and .75 for the three subscales). Significant associations were found between the full scale and all three subscales with physical fighting (p ≤ .001) and illicit drug use (p < .05). Conclusion Findings provide evidence of reliability and validity of the SPSC for the measurement of supportive school relationships in Latino adolescents living in El Salvador. These findings provide a foundation for further research on school cohesion

  3. Mental Imagery Scale: a new measurement tool to assess structural features of mental representations

    NASA Astrophysics Data System (ADS)

    D'Ercole, Martina; Castelli, Paolo; Giannini, Anna Maria; Sbrilli, Antonella

    2010-05-01

    Mental imagery is a quasi-perceptual experience which resembles perceptual experience, but occurring without (appropriate) external stimuli. It is a form of mental representation and is often considered centrally involved in visuo-spatial reasoning and inventive and creative thought. Although imagery ability is assumed to be functionally independent of verbal systems, it is still considered to interact with verbal representations, enabling objects to be named and names to evoke images. In literature, most measurement tools for evaluating imagery capacity are self-report instruments focusing on differences in individuals. In the present work, we applied a Mental Imagery Scale (MIS) to mental images derived from verbal descriptions in order to assess the structural features of such mental representations. This is a key theme for those disciplines which need to turn objects and representations into words and vice versa, such as art or architectural didactics. To this aim, an MIS questionnaire was administered to 262 participants. The questionnaire, originally consisting of a 33-item 5-step Likert scale, was reduced to 28 items covering six areas: (1) Image Formation Speed, (2) Permanence/Stability, (3) Dimensions, (4) Level of Detail/Grain, (5) Distance and (6) Depth of Field or Perspective. Factor analysis confirmed our six-factor hypothesis underlying the 28 items.

  4. Preliminary Study of the Autism Self-Efficacy Scale for Teachers (ASSET)

    PubMed Central

    Ruble, Lisa A.; Toland, Michael D.; Birdwhistell, Jessica L.; McGrew, John H.; Usher, Ellen L.

    2013-01-01

    The purpose of the current study was to evaluate a new measure, the Autism Self-Efficacy Scale for Teachers (ASSET) for its dimensionality, internal consistency, and construct validity derived in a sample of special education teachers (N = 44) of students with autism. Results indicate that all items reflect one dominant factor, teachers’ responses to items were internally consistent within the sample, and compared to a 100-point scale, a 6-point response scale is adequate. ASSET scores were found to be negatively correlated with scores on two subscale measures of teacher stress (i.e., self-doubt/need for support and disruption of the teaching process) but uncorrelated with teacher burnout scores. The ASSET is a promising tool that requires replication with larger samples. PMID:23976899

  5. Educational Scale-Making

    ERIC Educational Resources Information Center

    Nespor, Jan

    2004-01-01

    The article explores the complexities of educational scale-making. "Educational scales" are defined as the spatial and temporal orders generated as pupils and teachers move and are moved through educational systems; scales are "envelopes of spacetime" into which certain schoolbased identities (and not others) can be folded. Scale is thus both an…

  6. Raters & Rating Scales.

    ERIC Educational Resources Information Center

    Lopez, Winifred A.; Stone, Mark H.

    1998-01-01

    The first article in this section, "Rating Scales and Shared Meaning," by Winifred A. Lopez, discusses the analysis of rating scale data. The second article, "Rating Scale Categories: Dichotomy, Double Dichotomy, and the Number Two," by Mark H. Stone, argues that dichotomies in rating scales are more useful than multiple ratings. (SLD)

  7. Scale development to measure attitudes toward unauthorized migration into a foreign country.

    PubMed

    VAN DER Veer, Kees; Ommundsen, Reidar; Krumov, Krum; VAN LE, Hao; Larsen, Knud S

    2008-08-01

    This study reports on the development and cross-national utility of a Likert-type scale measuring attitudes toward unauthorized migration into a foreign country in two samples from "migrant-sending" nations. In the first phase a pool of 86 attitude statements was administered to a sample of 505 undergraduate students in Bulgaria (22.5% male; M age = 23, SD = 4.8). Exploratory factor analysis resulted in six factors and a reduction to 34 items. The results yielded an overall alpha of 0.92 and subscale alphas ranging from 0.70 to 0.89. In the second phase the 34-item scale was administered in a survey of 180 undergraduates from Sofia University in Bulgaria (16.7% male, M age = 23, SD = 4.8), plus 150 undergraduates from Hanoi State University in Vietnam (14.7% male, M age = 19, SD = 1.8). Results yielded a 19-item scale with no gender differences, and satisfactory alpha coefficients of 0.87 and 0.89 for the Vietnamese and Bulgarian samples respectively. This scale, equally applicable in both samples, includes items that reflect salient topics of the concept of attitudes toward unauthorized migration. An exploratory principal component analysis of the Bulgarian and Vietnamese versions of the 19-item scale yielded three factors accounting for 54% and 47% of the variance respectively. A procrustes analysis indicated high conceptual equivalence between the two samples for factors 1 and 2, and moderate equivalence for factor 3. This study lends support to the idea that, despite different cultural meanings, there is a common meaning space in culturally diverse societies.

  8. Occupational Cohort Time Scales

    PubMed Central

    Roth, H. Daniel

    2015-01-01

    Purpose: This study explores how highly correlated time variables (occupational cohort time scales) contribute to confounding and ambiguity of interpretation. Methods: Occupational cohort time scales were identified and organized through simple equations of three time scales (relational triads) and the connections between these triads (time scale web). The behavior of the time scales was examined when constraints were imposed on variable ranges and interrelationships. Results: Constraints on a time scale in a triad create high correlations between the other two time scales. These correlations combine with the connections between relational triads to produce association paths. High correlation between time scales leads to ambiguity of interpretation. Conclusions: Understanding the properties of occupational cohort time scales, their relational triads, and the time scale web is helpful in understanding the origins of otherwise obscure confounding bias and ambiguity of interpretation. PMID:25647318

  9. A preliminary study to measure and develop job satisfaction scale for medical teachers

    PubMed Central

    Bhatnagar, Kavita; Srivastava, Kalpana; Singh, Amarjit; Jadav, S.L.

    2011-01-01

    Background: Job satisfaction of medical teachers has an impact on the quality of medical education and patient care. Against this background, the study was planned to develop a scale and measure the job satisfaction of medical teachers. Materials and Methods: To generate items for the job satisfaction scale, closed-ended and open-ended questionnaires were administered to medical professionals. The job satisfaction questionnaire was developed and rated on a Likert-type rating scale. Both quantitative and qualitative methods were used to ascertain job satisfaction among 245 health science faculty of an autonomous educational institution. Factor loadings were calculated and the final items with strong loadings were selected. Data were statistically evaluated. Results: The average job satisfaction score was 53.97 on a scale of 1–100. The Cronbach's alpha reliability coefficient was 0.918 for the entire set of items. There was a statistically significant difference in job satisfaction across age groups (P = 0.0358), showing a U-shaped pattern, and between fresh entrants and reemployed faculty (P = 0.0188), with the former showing lower satisfaction. Opportunity for self-development was the biggest satisfier, followed by work, opportunity for promotion, and job security. Factors contributing toward job dissatisfaction were poor utilization of skills, poor promotional prospects, inadequate pay and allowances, work conditions, and work atmosphere. Conclusion: Tertiary care teaching hospitals in autonomous educational institutions need to build infrastructure and create opportunities for their medical professionals. Job satisfaction of young entrants needs to be raised further by improving their work environment. This will pave the way for effective delivery of health care. PMID:23271862

  10. Small Scale Organic Techniques

    ERIC Educational Resources Information Center

    Horak, V.; Crist, DeLanson R.

    1975-01-01

    Discusses the advantages of using small scale experimentation in the undergraduate organic chemistry laboratory. Describes small scale filtration techniques as an example of a semi-micro method applied to small quantities of material. (MLH)

  11. Weak scale supersymmetry

    SciTech Connect

    Hall, L.J. (California Univ., Berkeley, CA, Dept. of Physics)

    1990-11-12

    An introduction to the ideas and current state of weak scale supersymmetry is given. It is shown that LEP data on Z decays has already excluded two of the most elegant models of weak scale supersymmetry. 14 refs.

  12. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  13. Cross-scale morphology

    USGS Publications Warehouse

    Allen, Craig R.; Holling, Crawford S.; Garmestani, Ahjond S.; El-Shaarawi, Abdel H.; Piegorsch, Walter W.

    2013-01-01

    The scaling of physical, biological, ecological and social phenomena is a major focus of efforts to develop simple representations of complex systems. Much of the attention has been on discovering universal scaling laws that emerge from simple physical and geometric processes. However, there are regular patterns of departures both from those scaling laws and from continuous distributions of attributes of systems. Those departures often demonstrate the development of self-organized interactions between living systems and physical processes over narrower ranges of scale.

  14. Reading Graduated Scales.

    ERIC Educational Resources Information Center

    Hall, Lucien T., Jr.

    1982-01-01

    Ways of teaching students to read scales are presented as process instructions that are probably overlooked or taken for granted by most instructors. Scales on such devices as thermometers, rulers, spring scales, speedometers, and thirty-meter tape are discussed. (MP)

  15. The Positivity Scale

    ERIC Educational Resources Information Center

    Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John

    2012-01-01

    Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of college…

  16. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  17. Belt scales user's guide

    SciTech Connect

    Rosenberg, N.I.

    1993-02-01

    A conveyor-belt scale provides a means of obtaining accurate weights of dry bulk materials without delaying other plant operations. In addition, for many applications a belt scale is the most cost-effective alternative among many choices for a weighing system. But a number of users are not comfortable with the accuracy of their belt scales. In cases of unsatisfactory scale performance, it is often possible to correct problems and achieve the accuracy that was expected. To have a belt scale system that is accurate, precise, and cost effective, practical experience has shown that certain basic requisites must be satisfied. These requisites include matching the scale capability to the needs of the application, selecting durable scale equipment and conveyor idlers, adopting improved conveyor support methods, employing superior scale installation and alignment techniques, and establishing and practicing an effective scale testing and performance monitoring program. The goal of the Belt Scale Users' Guide is to enable utilities to reap the benefits of consistently accurate output from their new or upgraded belt scale installations. Such benefits include eliminating incorrect payments for coal receipts, improving coal pile inventory data, providing better heat rate results to enhance plant efficiency and yield more economical power dispatch, and satisfying regulatory agencies. All these benefits can reduce the cost of power generation.

  18. Manual of Scaling Methods

    NASA Technical Reports Server (NTRS)

    Bond, Thomas H. (Technical Monitor); Anderson, David N.

    2004-01-01

    This manual reviews the derivation of the similitude relationships believed to be important to ice accretion and examines ice-accretion data to evaluate their importance. Both size scaling and test-condition scaling methods employing the resulting similarity parameters are described, and experimental icing tests performed to evaluate scaling methods are reviewed with results. The material included applies primarily to unprotected, unswept geometries, but some discussion of how to approach other situations is included as well. The studies given here and scaling methods considered are applicable only to Appendix-C icing conditions. Nearly all of the experimental results presented have been obtained in sea-level tunnels. Recommendations are given regarding which scaling methods to use for both size scaling and test-condition scaling, and icing test results are described to support those recommendations. Facility limitations and size-scaling restrictions are discussed. Finally, appendices summarize the air, water and ice properties used in NASA scaling studies, give expressions for each of the similarity parameters used and provide sample calculations for the size-scaling and test-condition scaling methods advocated.

  19. Scaled models, scaled frequencies, and model fitting

    NASA Astrophysics Data System (ADS)

    Roxburgh, Ian W.

    2015-12-01

    I show that given a model star of mass M, radius R, and density profile ρ(x) [x = r/R], there exists a two-parameter family of models with masses M_k, radii R_k, density profile ρ_k(x) = λ ρ(x) and frequencies ν_k,nℓ = λ^(1/2) ν_nℓ, where λ and R_k/R are the scaling factors. These models have different internal structures, but all have the same values of the separation ratios calculated at given radial orders n, and all exactly satisfy a frequency matching algorithm with an offset function determined as part of the fitting procedure. But they do not satisfy ratio matching at given frequencies nor phase shift matching. This illustrates that erroneous results may be obtained when model fitting with ratios at given n values or frequency matching. I give examples from scaled models and from non-scaled evolutionary models.
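
    For clarity, the quoted frequency relation follows from standard homologous scaling: with ρ_k(x) = λ ρ(x) the mass scales as M_k = λ (R_k/R)^3 M, and oscillation frequencies scale with the square root of the mean density. The derivation below is added here for illustration and is not part of the abstract.

    \[
      M_k = \int_0^{R_k} 4\pi r^{2}\,\rho_k(r/R_k)\,dr
          = \lambda \left(\frac{R_k}{R}\right)^{3} M ,
      \qquad
      \nu \propto \sqrt{\frac{G M}{R^{3}}}
      \;\Longrightarrow\;
      \nu_{k,n\ell} = \sqrt{\frac{M_k/R_k^{3}}{M/R^{3}}}\;\nu_{n\ell}
                    = \lambda^{1/2}\,\nu_{n\ell}.
    \]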

  20. Salzburger State Reactance Scale (SSR Scale)

    PubMed Central

    2015-01-01

    Abstract. This paper describes the construction and empirical evaluation of an instrument for measuring state reactance, the Salzburger State Reactance (SSR) Scale. The results of a confirmatory factor analysis supported a hypothesized three-factor structure: experience of reactance, aggressive behavioral intentions, and negative attitudes. Correlations with divergent and convergent measures support the validity of this structure. The SSR Subscales were strongly related to the other state reactance measures. Moreover, the SSR Subscales showed modest positive correlations with trait measures of reactance. The SSR Subscales correlated only slightly or not at all with neighboring constructs (e.g., autonomy, experience of control). The only exception was fairness scales, which showed moderate correlations with the SSR Subscales. Furthermore, a retest analysis confirmed the temporal stability of the scale. Suggestions for further validation of this questionnaire are discussed. PMID:27453806

  1. Scale and scaling in agronomy and environmental sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scale is of paramount importance in environmental studies, engineering, and design. The unique course covers the following topics: scale and scaling, methods and theories, scaling in soils and other porous media, scaling in plants and crops; scaling in landscapes and watersheds, and scaling in agro...

  2. Development and Evaluation of the Brief Sexual Openness Scale-A Construal Level Theory Based Approach.

    PubMed

    Chen, Xinguang; Wang, Yan; Li, Fang; Gong, Jie; Yan, Yaqiong

    2015-01-01

    Obtaining reliable and valid data on sensitive questions represents a longstanding challenge for public health, particularly HIV research. To overcome the challenge, we assessed a construal level theory (CLT)-based novel method. The method was previously established and pilot-tested using the Brief Sexual Openness Scale (BSOS). This scale consists of five items assessing attitudes toward premarital sex, multiple sexual partners, homosexuality, extramarital sex, and commercial sex, all rated on a standard 5-point Likert scale. In addition to self-assessment, the participants were asked to assess rural residents, urban residents, and foreigners. The self-assessment plus the assessment of the three other groups were all used as subconstructs of one latent construct: sexual openness. The method was validated with data from 1,132 rural-to-urban migrants (mean age = 32.5, SD = 7.9; 49.6% female) recruited in China. Consistent with CLT, the Cronbach alpha of the BSOS as a conventional tool increased with social distance, from .81 for self-assessment to .97 for assessing foreigners. In addition to a satisfactory fit of the data to a one-factor model (CFI = .94, TLI = .93, RMSEA = .08), a common factor was separated from the four perspective factors (i.e., migrants' self-perspective and their perspectives of rural residents, urban residents and foreigners) through a trifactor modeling analysis (CFI = .95, TLI = .94, RMSEA = .08). Relative to its conventional form, the CLT-based BSOS was more reliable (alpha: .96 vs .81) and valid in predicting sexual desire, frequency of dating, age of first sex, multiple sexual partners and STD history. This novel technique can be used to assess sexual openness, and possibly other sensitive questions among Chinese domestic migrants. PMID:26308336

  4. Identification of Response Options to Artisanal and Small-Scale Gold Mining (ASGM) in Ghana via the Delphi Process

    PubMed Central

    Basu, Avik; Phipps, Sean; Long, Rachel; Essegbey, George; Basu, Niladri

    2015-01-01

    The Delphi technique is a means of facilitating discussion among experts in order to develop consensus, and can be used for policy formulation. This article describes a modified Delphi approach in which 27 multi-disciplinary academics and 22 stakeholders from Ghana and North America were polled about ways to address negative effects of small-scale gold mining (ASGM) in Ghana. In early 2014, the academics, working in disciplinary groups, synthesized 17 response options based on data aggregated during an Integrated Assessment of ASGM in Ghana. The researchers participated in two rounds of Delphi polling in March and April 2014, during which 17 options were condensed into 12. Response options were rated via a 4-point Likert scale in terms of benefit (economic, environmental, and benefit to people) and feasibility (economic, social/cultural, political, and implementation). The six highest-scoring options populated a third Delphi poll, which 22 stakeholders from diverse sectors completed in April 2015. The academics and stakeholders also prioritized the response options using ranking exercises. The technique successfully gauged expert opinion on ASGM, and helped identify potential responses, policies and solutions for the sector. This is timely given that improvement to the ASGM sector is an important component within the UN Minamata Convention. PMID:26378557

  5. The Medical Student Expectation Scale (MSES): a device for measuring students' expectations of each others' values and behaviours.

    PubMed

    Singleton, A F; Chen, S

    1996-05-01

    A study assessing the differences between institutional and matriculants' expectations of students' attitudes and behaviour was undertaken in 1992 at the Drew/UCLA Medical Education Program (DUMEP) in Los Angeles, California. Responding to a 33-item questionnaire utilizing 5-point Likert scales were 113/122 students in the classes of 1992 through to 1996. Factor analysis yielded two factors accounting for 61% of the total variance. Two subscales (Personal Trait Subscale and Drew Mission Subscale) containing a total of nine items comprise the Medical Student Expectation Scale (MSES). The alphas and standardized item alphas of these two subscales were 0.7531 and 0.8287 (Personal Trait Subscale) and 0.7304 and 0.7406 (Drew Mission Subscale), indicating good reliability. Correlation coefficients for continuous variables were calculated in order to determine subgroup responses to the subscales and their component items. While the students' overall responses indicated commitment to the values of the Charles R Drew University of Medicine and Science, subgroup responses varied. The strongest supporters of the University's values were older students, blacks, and those having better undergraduate performance in non-science areas. Least likely to agree with University values were students having better performances in the sciences (grade-point average and MCAT scores) and those of Mexican-American ethnicity. The scores of participating classes documented a secular trend away from endorsement of the values of the Drew University. Following further study, the MSES may be useful in student selection and curriculum design.

  6. Defining the Minimal Important Difference for the Visual Analogue Scale Assessing Dyspnea in Patients with Malignant Pleural Effusions

    PubMed Central

    Mishra, Eleanor K.; Corcoran, John P.; Hallifax, Robert J.; Stradling, John; Maskell, Nicholas A.; Rahman, Najib M.

    2015-01-01

    Background The minimal important difference (MID) is essential for interpreting the results of randomised controlled trials (RCTs). Despite a number of RCTs in patients with malignant pleural effusions (MPEs) which use the visual analogue scale for dyspnea (VASD) as an outcome measure, the MID has not been established. Methods Patients with suspected MPE undergoing a pleural procedure recorded their baseline VASD and their post-procedure VASD (24 hours after the pleural drainage), and in parallel assessed their breathlessness on a 7-point Likert scale. Findings The mean decrease in VASD in patients with an MPE reporting a 'small but just worthwhile decrease' in their dyspnea (i.e. equivalent to the MID) was 19 mm (95% CI 14-24 mm). The mean drainage volume required to produce a change in VASD of 19 mm was 760 ml. Interpretation The mean MID for the VASD in patients with an MPE undergoing a pleural procedure is 19 mm (95% CI 14-24 mm). Choosing an improvement of 19 mm in the VASD would therefore be justifiable in the design and analysis of future MPE studies. PMID:25874452
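
    The MID here is anchor-based: the VASD change is averaged over the patients whose Likert anchor response was 'small but just worthwhile decrease'. The sketch below illustrates that calculation with hypothetical paired readings; the numbers and labels are invented for illustration and do not come from the study.

    ```python
    import numpy as np

    # Hypothetical paired VASD readings (mm) and anchor responses from a 7-point
    # Likert breathlessness question; none of these values come from the study.
    baseline_vasd = np.array([72, 65, 80, 55, 90, 60])
    post_vasd     = np.array([50, 48, 62, 35, 70, 44])
    anchor        = np.array(["small_worthwhile", "large", "small_worthwhile",
                              "small_worthwhile", "large", "small_worthwhile"])

    change = baseline_vasd - post_vasd        # decrease in dyspnea score
    mask = anchor == "small_worthwhile"       # patients who define the MID
    mid = change[mask].mean()
    sem = change[mask].std(ddof=1) / np.sqrt(mask.sum())
    print(f"MID estimate: {mid:.1f} mm (95% CI half-width ~ {1.96 * sem:.1f} mm)")
    ```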

  7. How are personality judgments made? A cognitive model of reference group effects, personality scale responses, and behavioral reactions.

    PubMed

    Wood, Alex M; Brown, Gordon D A; Maltby, John; Watkinson, Pat

    2012-10-01

    This article suggests that personality judgments are wholly relative, being the outcome of a comparison of a given individual to a reference group of others. The underlying comparison processes are the same as those used to judge psychophysical stimuli (as outlined by range frequency theory and decision by sampling accounts). Five experimental studies show that the same person's personality is rated differently depending on how his or her behavior (a) ranks within a reference group and (b) falls within the overall range of behavior shown by other reference group members. Results were invariant across stimulus type and response options (7-point Likert scale, 990-point allocation task, or dichotomous choice). Simulated occupational scenarios led participants to give different-sized bonuses and employ different people as a function of context. Future research should note that personality judgments (as in self-report personality scales) only represent perceived standing relative to others or alternatively should measure personality through behavior or biological reactivity. Personality judgments cannot be used to compare different populations when the population participants have different reference groups (as in cross-cultural research).
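
    The reference-group account above draws on psychophysical judgment models such as range frequency theory. A minimal sketch of that idea, using the standard Parducci-style formulation (an assumption on my part, not code from the article), shows how the same behaviour earns different ratings against different reference groups; all numbers are invented.

    ```python
    import numpy as np

    def range_frequency_judgment(stimulus: float, context, w: float = 0.5) -> float:
        """Blend of where the stimulus sits in the context's range (range value)
        and its rank within the context (frequency value)."""
        context = np.sort(np.asarray(context, dtype=float))
        range_value = (stimulus - context.min()) / (context.max() - context.min())
        rank = np.searchsorted(context, stimulus, side="right")
        frequency_value = (rank - 1) / (len(context) - 1)
        return w * range_value + (1 - w) * frequency_value

    # The same behaviour (say, 6 social outings a week) judged against an
    # introverted vs. an extraverted reference group (made-up figures).
    print(range_frequency_judgment(6, [1, 2, 3, 4, 5, 6, 7]))     # rated as high
    print(range_frequency_judgment(6, [5, 6, 7, 8, 9, 10, 11]))   # rated as low
    ```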

  8. Identification of Response Options to Artisanal and Small-Scale Gold Mining (ASGM) in Ghana via the Delphi Process.

    PubMed

    Basu, Avik; Phipps, Sean; Long, Rachel; Essegbey, George; Basu, Niladri

    2015-09-10

    The Delphi technique is a means of facilitating discussion among experts in order to develop consensus, and can be used for policy formulation. This article describes a modified Delphi approach in which 27 multi-disciplinary academics and 22 stakeholders from Ghana and North America were polled about ways to address negative effects of small-scale gold mining (ASGM) in Ghana. In early 2014, the academics, working in disciplinary groups, synthesized 17 response options based on data aggregated during an Integrated Assessment of ASGM in Ghana. The researchers participated in two rounds of Delphi polling in March and April 2014, during which 17 options were condensed into 12. Response options were rated via a 4-point Likert scale in terms of benefit (economic, environmental, and benefit to people) and feasibility (economic, social/cultural, political, and implementation). The six highest-scoring options populated a third Delphi poll, which 22 stakeholders from diverse sectors completed in April 2015. The academics and stakeholders also prioritized the response options using ranking exercises. The technique successfully gauged expert opinion on ASGM, and helped identify potential responses, policies and solutions for the sector. This is timely given that improvement to the ASGM sector is an important component within the UN Minamata Convention.

  9. The Medical Student Expectation Scale (MSES): a device for measuring students' expectations of each others' values and behaviours.

    PubMed

    Singleton, A F; Chen, S

    1996-05-01

    A study assessing the differences between institutional and matriculants' expectations of students' attitudes and behaviour was undertaken in 1992 at the Drew/UCLA Medical Education Program (DUMEP) in Los Angeles, California. Responding to a 33-item questionnaire utilizing 5-point Likert scales were 113/122 students in the classes of 1992 through to 1996. Factor analysis yielded two factors accounting for 61% of the total variance. Two subscales (Personal Trait Subscale and Drew Mission Subscale) containing a total of nine items comprise the Medical Student Expectation Scale (MSES). The alphas and standardized item alphas of these two subscales were 0.7531 and 0.8287 (Personal Trait Subscale) and 0.7304 and 0.7406 (Drew Mission Subscale), indicating good reliability. Correlation coefficients for continuous variables were calculated in order to determine subgroup responses to the subscales and their component items. While the students' overall responses indicated commitment to the values of the Charles R Drew University of Medicine and Science, subgroup responses varied. The strongest supporters of the University's values were older students, blacks, and those having better undergraduate performance in non-science areas. Least likely to agree with University values were students having better performances in the sciences (grade-point average and MCAT scores) and those of Mexican-American ethnicity. The scores of participating classes documented a secular trend away from endorsement of the values of the Drew University. Following further study, the MSES may be useful in student selection and curriculum design. PMID:8949552

  10. Parabolic scaling beams.

    PubMed

    Gao, Nan; Xie, Changqing

    2014-06-15

    We generalize the concept of diffraction free beams to parabolic scaling beams (PSBs), whose normalized intensity scales parabolically during propagation. These beams are nondiffracting in the circular parabolic coordinate systems, and all the diffraction free beams of Durnin's type have counterparts as PSBs. Parabolic scaling Bessel beams with Gaussian apodization are investigated in detail, their nonparaxial extrapolations are derived, and experimental results agree well with theoretical predictions.

  11. Multi-scale renormalization

    NASA Astrophysics Data System (ADS)

    Ford, C.; Wiesendanger, C.

    1997-02-01

    The standard MS renormalization prescription is inadequate for dealing with multi-scale problems. To illustrate this we consider the computation of the effective potential in the Higgs-Yukawa model. It is argued that it is natural to employ a two-scale renormalization group. We give a modified version of a two-scale scheme introduced by Einhorn and Jones. In such schemes the beta functions necessarily contain potentially large logarithms of the RG scale ratios. For credible perturbation theory one must resum these large logarithms in the beta functions themselves. We show how the integrability condition for the two RG equations allows one to perform this resummation.

  12. The Family Constellation Scale.

    ERIC Educational Resources Information Center

    Lemire, David

    The Family Constellation Scale (FC Scale) is an instrument that assesses perceived birth order in families. It can be used in counseling to help initiate conversations about various traits and assumptions that tend to characterize first-born, middle-born children, youngest-born, and only children. It provides both counselors and clients insights…

  13. INL Laboratory Scale Atomizer

    SciTech Connect

    C.R. Clark; G.C. Knighton; R.S. Fielding; N.P. Hallinan

    2010-01-01

    A laboratory scale atomizer has been built at the Idaho National Laboratory. This has proven useful for laboratory scale tests and has been used to fabricate fuel used in the RERTR miniplate experiments. This instrument evolved over time with various improvements being made ‘on the fly’ in a trial and error process.

  14. Scaling up as Catachresis

    ERIC Educational Resources Information Center

    Tobin, Joseph

    2005-01-01

    The metaphor of scaling up is the wrong one to use for describing and prescribing educational change. Many of the strategies being employed to achieve scaling up are counter-productive: they conceive of practitioners as delivery agents or consumers, rather than as co-constructors of change. An approach to educational innovation based on the…

  15. Thoughts on Scale

    ERIC Educational Resources Information Center

    Schoenfeld, Alan H.

    2015-01-01

    This essay reflects on the challenges of thinking about scale--of making sense of phenomena such as continuous professional development (CPD) at the system level, while holding on to detail at the finer grain size(s) of implementation. The stimuli for my reflections are three diverse studies of attempts at scale--an attempt to use ideas related to…

  16. Premarital Attitude Scale.

    ERIC Educational Resources Information Center

    Vancouver Board of School Trustees (British Columbia). Dept. of Planning and Evaluation.

    The thirty-one item questionnaire was developed to measure how prepared high school students are for marriage. The students are directed to read each statement and to select a response on a five point scale ranging from agreeing strongly to disagreeing strongly. The scale is scored to indicate three factors which are considered important for a…

  17. Everyday Scale Errors

    ERIC Educational Resources Information Center

    Ware, Elizabeth A.; Uttal, David H.; DeLoache, Judy S.

    2010-01-01

    Young children occasionally make "scale errors"--they attempt to fit their bodies into extremely small objects or attempt to fit a larger object into another, tiny, object. For example, a child might try to sit in a dollhouse-sized chair or try to stuff a large doll into it. Scale error research was originally motivated by parents' and…

  18. Teaching Satisfaction Scale

    ERIC Educational Resources Information Center

    Ho, Chung-Lim; Au, Wing-Tung

    2006-01-01

    The present study proposes a teaching satisfaction measure and examines the validity of its scores. The measure is based on the Life Satisfaction Scale (LSS). Scores on the five-item Teaching Satisfaction Scale (TSS) were validated on a sample of 202 primary and secondary school teachers and favorable psychometric properties were found. As…

  19. Teacher Observation Scales.

    ERIC Educational Resources Information Center

    Purdue Univ., Lafayette, IN. Educational Research Center.

    The Teacher Observation Scales include four instruments: Observer Rating Scale (ORS), Reading Strategies Check List, Arithmetic Strategies Check List, and Classroom Description. These instruments utilize trained observers to describe the teaching behavior, instructional strategies and physical characteristics in each classroom. On the ORS, teacher…

  20. New scale factor measure

    NASA Astrophysics Data System (ADS)

    Bousso, Raphael

    2012-07-01

    The computation of probabilities in an eternally inflating universe requires a regulator or “measure.” The scale factor time measure truncates the Universe when a congruence of timelike geodesics has expanded by a fixed volume factor. This definition breaks down if the generating congruence is contracting—a serious limitation that excludes from consideration gravitationally bound regions such as our own. Here we propose a closely related regulator which is well defined in the entire spacetime. The new scale factor cutoff restricts to events with a scale factor below a given value. Since the scale factor vanishes at caustics and crunches, this cutoff always includes an infinite number of disconnected future regions. We show that this does not lead to divergences. The resulting measure combines desirable features of the old scale factor cutoff and of the light-cone time cutoff, while eliminating some of the disadvantages of each.

  1. The inflationary energy scale

    NASA Astrophysics Data System (ADS)

    Liddle, Andrew R.

    1994-01-01

    The energy scale of inflation is of much interest, as it suggests the scale of grand unified physics, governs whether cosmological events such as topological defect formation can occur after inflation, and also determines the amplitude of gravitational waves which may be detectable using interferometers. The COBE results are used to limit the energy scale of inflation at the time large scale perturbations were imprinted. An exact dynamical treatment based on the Hamilton-Jacobi equations is then used to translate this into limits on the energy scale at the end of inflation. General constraints are given, and then tighter constraints based on physically motivated assumptions regarding the allowed forms of density perturbation and gravitational wave spectra. These are also compared with the values of familiar models.

  2. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  3. Composite rating scales.

    PubMed

    Martinez-Martin, Pablo

    2010-02-15

    Rating scales are instruments that are very frequently used by clinicians to perform patient assessments. Typically, rating scales grade the attribute on an ordinal level of measurement, i.e., a rank ordering, meaning that the numbers assigned to the different ranks (item scores) do not represent 'real numbers' or 'physical magnitudes'. Single-item scales have some advantages, such as simplicity and low respondent burden, but they may also suffer from disadvantages, such as ambiguous score meanings and low responsiveness. Multi-item scales, in contrast, seem more adequate for assessment of complex constructs, allowing for detailed evaluation. Total scores representing the value of the construct may be quite precise and thus the responsiveness of the scale may be high. The most common strategy for obtaining the total score is the sum of the item scores, a strategy that constitutes one of the most important problems with these types of scales: a summative score of ordinal figures is not a 'real magnitude' and may make little sense. This paper is a review of the theoretical frameworks of the main theories used to develop rating scales (Classical Test Theory and Item Response Theory). Bearing in mind that no alternative is perfect, additional research in this field and judicious decisions are called for.

  4. Allometric Scaling in Biology

    NASA Astrophysics Data System (ADS)

    Banavar, Jayanth

    2009-03-01

    The unity of life is expressed not only in the universal basis of inheritance and energetics at the molecular level, but also in the pervasive scaling of traits with body size at the whole-organism level. More than 75 years ago, Kleiber and Brody and Proctor independently showed that the metabolic rates, B, of mammals and birds scale as the three-quarter power of their mass, M. Subsequent studies showed that most biological rates and times scale as M^-1/4 and M^1/4, respectively, and that these so-called quarter-power scaling relations hold for a variety of organisms, from unicellular prokaryotes and eukaryotes to trees and mammals. The wide applicability of Kleiber's law, across the 22 orders of magnitude of body mass from minute bacteria to giant whales and sequoias, raises the hope that there is some simple general explanation that underlies the incredible diversity of form and function. We will present a general theoretical framework for understanding the relationship between metabolic rate, B, and body mass, M. We show how the pervasive quarter-power biological scaling relations arise naturally from optimal directed resource supply systems. This framework robustly predicts that: 1) whole organism power and resource supply rate, B, scale as M^3/4; 2) most other rates, such as heart rate and maximal population growth rate, scale as M^-1/4; 3) most biological times, such as blood circulation time and lifespan, scale as M^1/4; and 4) the average velocity of flow through the network, v, such as the speed of blood and oxygen delivery, scales as M^1/12. Our framework is valid even when there is no underlying network. Our theory is applicable to unicellular organisms as well as to large animals and plants. This work was carried out in collaboration with Amos Maritan along with Jim Brown, John Damuth, Melanie Moses, Andrea Rinaldo, and Geoff West.
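
    As a quick numerical illustration of the quarter-power exponents listed above, the snippet below evaluates the relative scaling across a few body masses; the prefactors are arbitrary and only the exponents come from the abstract.

    ```python
    import numpy as np

    # Quarter-power allometric scaling laws summarised in the abstract:
    #   metabolic rate       B ~ M^( 3/4)
    #   biological rates     r ~ M^(-1/4)   (e.g., heart rate)
    #   biological times     t ~ M^( 1/4)   (e.g., lifespan)
    #   network flow speed   v ~ M^( 1/12)
    # Only the exponents carry the physics; absolute values here are relative.
    masses_kg = np.array([0.03, 3.0, 70.0, 4000.0])   # mouse, cat, human, elephant

    for m in masses_kg:
        print(f"M = {m:8.2f} kg | B ~ {m**0.75:9.2f} | rate ~ {m**-0.25:6.3f} "
              f"| time ~ {m**0.25:6.3f} | v ~ {m**(1/12):5.3f}")
    ```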

  5. Sulfate scale dissolution

    SciTech Connect

    Morris, R.L.; Paul, J.M.

    1992-01-28

    This patent describes a method for removing barium sulfate scale. It comprises contacting the scale with an aqueous solution having a pH of about 8 to about 14 and consisting essentially of a chelating agent comprising a polyaminopolycarboxylic acid or salt of such an acid in a concentration of 0.1 to 1.0 M, and anions of a monocarboxylic acid selected from mercaptoacetic acid, hydroxyacetic acid, aminoacetic acid, or salicylic acid in a concentration of 0.1 to 1.0 M and which is soluble in the solution under the selected pH conditions, to dissolve the scale.

  6. Development of an Attitude Scale to Assess K-12 Teachers' Attitudes toward Nanotechnology

    NASA Astrophysics Data System (ADS)

    Lan, Yu-Ling

    2012-05-01

    To maximize the contributions of nanotechnology to society, at least 60 countries have put efforts into this field. In Taiwan, a government-funded K-12 Nanotechnology Programme was established to train K-12 teachers with adequate nanotechnology literacy, in order to foster the next generation of Taiwanese people with sufficient knowledge in nanotechnology. In the present study, the Nanotechnology Attitude Scale for K-12 teachers (NAS-T) was developed to assess K-12 teachers' attitudes toward nanotechnology. The NAS-T included 23 Likert-scale items that can be grouped into three components: importance of nanotechnology, affective tendencies in science teaching, and behavioural tendencies to teach nanotechnology. A sample of 233 K-12 teachers who had participated in the K-12 Nanotechnology Programme was included in the present study to investigate the psychometric properties of the NAS-T. The exploratory factor analysis of this teacher sample suggested that the NAS-T was a three-factor model that explained 64.11% of the total variance. This model was also confirmed by the confirmatory factor analysis to validate the factor structure of the NAS-T. The Cronbach's alpha values of the three NAS-T subscales ranged from 0.89 to 0.95. Moderate to strong correlations among teachers' NAS-T domain scores, self-perception of their own nanoscience knowledge, and their science-teaching efficacy demonstrated good convergent validity of the NAS-T. As a whole, the psychometric properties of the NAS-T indicated that it is an effective instrument for assessing K-12 teachers' attitudes toward nanotechnology. The NAS-T will serve as a valuable tool to evaluate teachers' attitude changes after participating in the K-12 Nanotechnology Programme.

  7. The Suitability of Gray-Scale Electronic Readers for Dermatology Journals

    PubMed Central

    Choi, Jae Eun; Kim, Dai Hyun; Seo, Soo Hong; Kye, Young Chul

    2014-01-01

    Background The rapid development of information and communication technology has replaced traditional books by electronic versions. Most print dermatology journals have been replaced with electronic journals (e-journals), which are readily used by clinicians and medical students. Objective The objectives of this study were to determine whether e-readers are appropriate for reading dermatology journals, to conduct an attitude study of both medical personnel and students, and to find a way of improving e-book use in the field of dermatology. Methods All articles in the Korean Journal of Dermatology published from January 2010 to December 2010 were utilized in this study. Dermatology house officers, student trainees in their fourth year of medical school, and interns at Korea University Medical Center participated in the study. After reading the articles with Kindle 2, their impressions and evaluations were recorded using a questionnaire with a 5-point Likert scale. Results The results demonstrated that gray-scale e-readers might not be suitable for reading dermatology journals, especially for case reports compared to the original articles. Only three of the thirty-one respondents preferred e-readers to printed papers. The most common suggestions from respondents to encourage usage of e-books in the field of dermatology were the introduction of a color display, followed by the use of a touch screen system, a cheaper price, and ready-to-print capabilities. Conclusion In conclusion, our study demonstrated that current e-readers might not be suitable for reading dermatology journals. However, they may be utilized in selected situations according to the type and topic of the papers. PMID:25473221

  8. Scaling in sensitivity analysis

    USGS Publications Warehouse

    Link, W.A.; Doherty, P.F.

    2002-01-01

    Population matrix models allow sets of demographic parameters to be summarized by a single value λ, the finite rate of population increase. The consequences of change in individual demographic parameters are naturally measured by the corresponding changes in λ; sensitivity analyses compare demographic parameters on the basis of these changes. These comparisons are complicated by issues of scale. Elasticity analysis attempts to deal with issues of scale by comparing the effects of proportional changes in demographic parameters, but leads to inconsistencies in evaluating demographic rates. We discuss this and other problems of scaling in sensitivity analysis, and suggest a simple criterion for choosing appropriate scales. We apply our suggestions to data for the killer whale, Orcinus orca.
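
    For context, λ is the dominant eigenvalue of the projection matrix, and sensitivities and elasticities follow from its left and right eigenvectors (the standard textbook formulation, not code from this paper). A minimal sketch with an invented 3-stage matrix:

    ```python
    import numpy as np

    # Illustrative 3-stage population projection matrix (made-up vital rates).
    A = np.array([[0.0, 1.5, 2.0],
                  [0.5, 0.0, 0.0],
                  [0.0, 0.7, 0.8]])

    vals, right = np.linalg.eig(A)
    i = np.argmax(vals.real)
    lam = vals.real[i]                              # finite rate of increase, lambda
    w = np.abs(right[:, i].real)                    # stable stage distribution
    valsT, left = np.linalg.eig(A.T)
    v = np.abs(left[:, np.argmax(valsT.real)].real) # reproductive values

    sensitivity = np.outer(v, w) / (v @ w)          # d(lambda) / d(a_ij)
    elasticity = (A / lam) * sensitivity            # proportional (scale-free) version
    print(f"lambda = {lam:.3f}")
    print(np.round(elasticity, 3))                  # elasticities sum to 1
    ```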

  9. Lifshitz scale anomalies

    NASA Astrophysics Data System (ADS)

    Arav, Igal; Chapman, Shira; Oz, Yaron

    2015-02-01

    We analyse scale anomalies in Lifshitz field theories, formulated as the relative cohomology of the scaling operator with respect to foliation preserving diffeomorphisms. We construct a detailed framework that enables us to calculate the anomalies for any number of spatial dimensions, and for any value of the dynamical exponent. We derive selection rules, and establish the anomaly structure in diverse universal sectors. We present the complete cohomologies for various examples in one, two and three space dimensions for several values of the dynamical exponent. Our calculations indicate that all the Lifshitz scale anomalies are trivial descents, called B-type in the terminology of conformal anomalies. However, not all the trivial descents are cohomologically non-trivial. We compare the conformal anomalies to Lifshitz scale anomalies with a dynamical exponent equal to one.

  10. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  11. Reconsidering earthquake scaling

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Wech, A.; Creager, K.; Obara, K.; Agnew, D.

    2016-06-01

    The relationship (scaling) between scalar moment, M0, and duration, T, potentially provides key constraints on the physics governing fault slip. The prevailing interpretation of M0-T observations proposes different scaling for fast (earthquakes) and slow (mostly aseismic) slip populations and thus fundamentally different driving mechanisms. We show that a single model of slip events within bounded slip zones may explain nearly all fast and slow slip M0-T observations, and both slip populations have a change in scaling, where the slip area growth changes from 2-D when too small to sense the boundaries to 1-D when large enough to be bounded. We present new fast and slow slip M0-T observations that sample the change in scaling in each population, which are consistent with our interpretation. We suggest that a continuous but bimodal distribution of slip modes exists and M0-T observations alone may not imply a fundamental difference between fast and slow slip.

  12. Scaling in Columnar Joints

    NASA Astrophysics Data System (ADS)

    Morris, Stephen

    2007-03-01

    Columnar jointing is a fracture pattern common in igneous rocks in which cracks self-organize into a roughly hexagonal arrangement, leaving behind an ordered colonnade. We report observations of columnar jointing in a laboratory analog system, desiccated corn starch slurries. Using measurements of moisture density, evaporation rates, and fracture advance rates, we suggest an advective-diffusive system is responsible for the rough scaling behavior of columnar joints. This theory explains the order of magnitude difference in scales between jointing in lavas and in starches. We investigated the scaling of average columnar cross-sectional areas in experiments where the evaporation rate was fixed using feedback methods. Our results suggest that the column area at a particular depth is related to both the current conditions, and hysteretically to the geometry of the pattern at previous depths. We argue that there exists a range of stable column scales allowed for any particular evaporation rate.

  13. Digital scale converter

    DOEpatents

    Upton, Richard G.

    1978-01-01

    A digital scale converter is provided for binary coded decimal (BCD) conversion. The converter may be programmed to convert a BCD value of a first scale to the equivalent value of a second scale according to a known ratio. The value to be converted is loaded into a first BCD counter and counted down to zero while a second BCD counter registers counts from zero or an offset value depending upon the conversion. Programmable rate multipliers are used to generate pulses at selected rates to the counters for the proper conversion ratio. The value present in the second counter at the time the first counter is counted to the zero count is the equivalent value of the second scale. This value may be read out and displayed on a conventional seven-segment digital display.
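
    A rough software analogue of the counter scheme described above (count the input value down while pulsing the output counter at a programmed ratio) is sketched below; it is only an illustration of the idea, not the patented circuit, and the pounds-to-kilograms ratio is an arbitrary example.

    ```python
    from fractions import Fraction

    def convert_scale(value: int, ratio: Fraction, offset: int = 0) -> int:
        """Count 'value' down to zero while advancing an output counter at the
        programmed rate ratio, starting from an optional offset."""
        acc = Fraction(0)
        out = offset
        for _ in range(value):      # each decrement of the first (input) counter...
            acc += ratio            # ...sends pulses to the second at rate 'ratio'
            while acc >= 1:
                out += 1
                acc -= 1
        return out

    # Example: pounds to kilograms with ratio ~0.4536 (illustrative only).
    print(convert_scale(150, Fraction(4536, 10000)))   # -> 68
    ```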

  14. Magnetron injection gun scaling

    NASA Astrophysics Data System (ADS)

    Lawson, W.

    1988-04-01

    A set of tradeoff equations was simplified to obtain scaling laws for magnetron injection guns (MIGs). The constraints are chosen to examine the maximum-peak-power capabilities of MIGs. The scaling laws are compared with exact solutions of the design equations and are supported by MIG simulations in which each MIG is designed to double the beam power of an existing design by adjusting one of the four fundamental parameters.

  15. Ensemble Pulsar Time Scale

    NASA Astrophysics Data System (ADS)

    Yin, D. S.; Gao, Y. P.; Zhao, S. H.

    2016-05-01

    Millisecond pulsars can generate another type of time scale that is totally independent of the atomic time scale, because the physical mechanisms of the pulsar time scale and the atomic time scale are quite different from each other. Usually the pulsar timing observational data are not evenly sampled, and the intervals between data points range from several hours to more than half a month. Moreover, these data sets are sparse, all of which makes it difficult to generate an ensemble pulsar time scale. Hence, a new algorithm to calculate the ensemble pulsar time scale is proposed. Firstly, we use cubic spline interpolation to densify the data set and make the intervals between data points even. Then, we employ the Vondrak filter to smooth the data set and remove high-frequency noise, and finally we adopt the weighted average method to generate the ensemble pulsar time scale. The pulsar timing residuals represent the clock difference between the pulsar time and atomic time, and high-precision pulsar timing data provide this clock difference measurement with a high signal-to-noise ratio, which is fundamental to generating pulsar time. We use the latest released NANOGRAV (North American Nanohertz Observatory for Gravitational Waves) 9-year data set to generate the ensemble pulsar time scale. This data set is from the newest NANOGRAV data release, which includes 9 years of observational data for 37 millisecond pulsars obtained with the 100-meter Green Bank telescope and the 305-meter Arecibo telescope. We find that the algorithm used in this paper can lower the influence of noise in the timing residuals and improve the long-term stability of the pulsar time. Results show that the long-term (> 1 yr) frequency stability of the pulsar time is better than 3.4×10^-15.
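
    The three-step pipeline described above (densify, smooth, weighted-average) can be sketched as follows. This is only a schematic under stated substitutions: a simple moving average stands in for the Vondrak filter, and the residuals and weights are invented.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def ensemble_pulsar_time(times, residuals, weights, grid):
        """Sketch: interpolate each pulsar's unevenly sampled timing residuals onto
        an even grid, smooth them, then combine with a weighted average."""
        smoothed = []
        for t, r in zip(times, residuals):
            dense = CubicSpline(t, r)(grid)                  # step 1: even sampling
            kernel = np.ones(5) / 5.0                        # step 2: smoothing
            smoothed.append(np.convolve(dense, kernel, mode="same"))
        smoothed = np.array(smoothed)
        w = np.asarray(weights, dtype=float)[:, None]
        return (w * smoothed).sum(axis=0) / w.sum()          # step 3: weighted mean

    # Two hypothetical pulsars with unevenly sampled residuals (arbitrary units).
    rng = np.random.default_rng(1)
    t1, t2 = np.sort(rng.uniform(0, 9, 40)), np.sort(rng.uniform(0, 9, 30))
    grid = np.linspace(1, 8, 200)
    ept = ensemble_pulsar_time([t1, t2], [np.sin(t1) * 0.1, np.cos(t2) * 0.1],
                               weights=[0.6, 0.4], grid=grid)
    print(ept[:5])
    ```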

  16. The Improbability scale

    SciTech Connect

    Ritchie, David J.; /Fermilab

    2005-03-01

    The Improbability Scale (IS) is proposed as a way of communicating to the general public the improbability (and by implication, the probability) of events predicted as the result of scientific research. Through the use of the Improbability Scale, the public will be able to evaluate more easily the relative risks of predicted events and draw proper conclusions when asked to support governmental and public policy decisions arising from that research.

  17. Scaling, Universality, and Geomorphology

    NASA Astrophysics Data System (ADS)

    Dodds, Peter Sheridan; Rothman, Daniel H.

    Theories of scaling apply wherever similarity exists across many scales. This similarity may be found in geometry and in dynamical processes. Universality arises when the qualitative character of a system is sufficient to quantitatively predict its essential features, such as the exponents that characterize scaling laws. Within geomorphology, two areas where the concepts of scaling and universality have found application are the geometry of river networks and the statistical structure of topography. We begin this review with a pedagogical presentation of scaling and universality. We then describe recent progress made in applying these ideas to networks and topography. This overview leads to a synthesis that attempts a classification of surface and network properties based on generic mechanisms and geometric constraints. We also briefly review how scaling and universality have been applied to related problems in sedimentology-specifically, the origin of stromatolites and the relation of the statistical properties of submarine-canyon topography to the size distribution of turbidite deposits. Throughout the review, our intention is to elucidate not only the problems that can be solved using these concepts, but also those that cannot.

  18. Profiling medical school learning environments in Malaysia: a validation study of the Johns Hopkins Learning Environment Scale

    PubMed Central

    Tackett, Sean; Bakar, Hamidah Abu; Shilkofski, Nicole A.; Coady, Niamh; Rampal, Krishna; Wright, Scott

    2015-01-01

    Purpose: While a strong learning environment is critical to medical student education, the assessment of medical school learning environments has confounded researchers. Our goal was to assess the validity and utility of the Johns Hopkins Learning Environment Scale (JHLES) for preclinical students at three Malaysian medical schools with distinct educational and institutional models. Two schools were new international partnerships, and the third was a school-leaver program established without an international partnership. Methods: First- and second-year students responded anonymously to surveys at the end of the academic year. The surveys included the JHLES, a 28-item instrument using five-point Likert scale response options; the Dundee Ready Educational Environment Measure (DREEM), the most widely used method to assess learning environments internationally; a personal growth scale; and single-item global learning environment assessment variables. Results: The overall response rate was 369/429 (86%). After adjusting for the medical school year, gender, and ethnicity of the respondents, the JHLES detected differences across institutions in four out of seven domains (57%), with each school having a unique domain profile. The DREEM detected differences in one out of five categories (20%). The JHLES was more strongly correlated than the DREEM with two thirds of the single-item variables and the personal growth scale. The JHLES showed high internal reliability for the total score (α=0.92) and the seven domains (α, 0.56-0.85). Conclusion: The JHLES detected variation between learning environment domains across three educational settings, thereby creating unique learning environment profiles. Interpretation of these profiles may allow schools to understand how they are currently supporting trainees and identify areas needing attention. PMID:26165949

  19. Earthquake Scaling Relations

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Boettcher, M.; Richardson, E.

    2002-12-01

    Using scaling relations to understand nonlinear geosystems has been an enduring theme of Don Turcotte's research. In particular, his studies of scaling in active fault systems have led to a series of insights about the underlying physics of earthquakes. This presentation will review some recent progress in developing scaling relations for several key aspects of earthquake behavior, including the inner and outer scales of dynamic fault rupture and the energetics of the rupture process. The proximate observations of mining-induced, friction-controlled events obtained from in-mine seismic networks have revealed a lower seismicity cutoff at a seismic moment Mmin near 10^9 N m and a corresponding upper frequency cutoff near 200 Hz, which we interpret in terms of a critical slip distance for frictional drop of about 10^-4 m. Above this cutoff, the apparent stress scales as M^1/6 up to magnitudes of 4-5, consistent with other near-source studies in this magnitude range (see special session S07, this meeting). Such a relationship suggests a damage model in which apparent fracture energy scales with the stress intensity factor at the crack tip. Under the assumption of constant stress drop, this model implies an increase in rupture velocity with seismic moment, which successfully predicts the observed variation in corner frequency and maximum particle velocity. Global observations of oceanic transform faults (OTFs) allow us to investigate a situation where the outer scale of earthquake size may be controlled by dynamics (as opposed to geologic heterogeneity). The seismicity data imply that the effective area for OTF moment release, AE, depends on the thermal state of the fault but is otherwise independent of the fault's average slip rate; i.e., AE ~ AT^β, where AT is the area above a reference isotherm. The data are consistent with β = 1/2 below an upper cutoff moment Mmax that increases with AT and yield the interesting scaling relation Amax ~ AT^1/2. Taken together, the OTF

  20. Fast ignition breakeven scaling.

    SciTech Connect

    Slutz, Stephen A.; Vesey, Roger Alan

    2005-01-01

    A series of numerical simulations have been performed to determine scaling laws for fast ignition break even of a hot spot formed by energetic particles created by a short pulse laser. Hot spot break even is defined to be when the fusion yield is equal to the total energy deposited in the hot spot through both the initial compression and the subsequent heating. In these simulations, only a small portion of a previously compressed mass of deuterium-tritium fuel is heated on a short time scale, i.e., the hot spot is tamped by the cold dense fuel which surrounds it. The hot spot tamping reduces the minimum energy required to obtain break even as compared to the situation where the entire fuel mass is heated, as was assumed in a previous study [S. A. Slutz, R. A. Vesey, I. Shoemaker, T. A. Mehlhorn, and K. Cochrane, Phys. Plasmas 7, 3483 (2004)]. The minimum energy required to obtain hot spot break even is given approximately by the scaling law E_T = 7.5(ρ/100)^-1.87 kJ for tamped hot spots, as compared to the previously reported scaling of E_UT = 15.3(ρ/100)^-1.5 kJ for untamped hot spots. The size of the compressed fuel mass and the focusability of the particles generated by the short pulse laser determine which scaling law to use for an experiment designed to achieve hot spot break even.
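
    For a quick read of what these fits imply, the snippet below simply evaluates both scaling laws at a few compressed fuel densities (ρ in g/cm^3, E in kJ); it adds nothing beyond the quoted formulas.

    ```python
    # Hot-spot break-even scaling laws quoted in the abstract (E in kJ, rho in g/cm^3).
    def breakeven_energy_kj(rho_g_cc: float, tamped: bool = True) -> float:
        x = rho_g_cc / 100.0
        return 7.5 * x**-1.87 if tamped else 15.3 * x**-1.5

    for rho in (100.0, 300.0, 500.0):
        print(f"rho = {rho:5.0f} g/cm^3: "
              f"tamped {breakeven_energy_kj(rho):6.2f} kJ, "
              f"untamped {breakeven_energy_kj(rho, tamped=False):6.2f} kJ")
    ```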

  1. Scaling of Thermoacoustic Refrigerators

    NASA Astrophysics Data System (ADS)

    Li, Y.; Zeegers, J. C. H.; ter Brake, H. J. M.

    2008-03-01

    The possibility of scaling down thermoacoustic refrigerators is investigated theoretically. Standing-wave systems are considered as well as traveling-wave systems. In the former case, a reference system is taken that consists of a resonator tube (50 cm) with a closed end and a PVC stack (length 5 cm). Helium is used at a mean pressure of 10 bar and an amplitude of 1 bar. The resulting operating frequency is 1 kHz. The variation of the performance of the refrigerator when scaled down in size is computed under the constraint that the temperature drop over the stack, the energy flux, or the energy-flux density is held fixed. The analytical results show that there is a limitation on scaling down a standing-wave thermoacoustic refrigerator due to heat conduction. Similar scaling trends are considered in traveling-wave refrigerators. The traveling-wave reference system consists of a feedback inertance tube 0.567 m long with an inside diameter of 78 mm, a compliance volume of 2830 cm^3, and a 24 cm thermal buffer tube. The regenerator is sandwiched between two heat exchangers. The system is operated at 125 Hz and filled with 30 bar helium gas. Again, thermal conductance forms a practical limitation on down-scaling.

  2. Universities scale like cities.

    PubMed

    van Raan, Anthony F J

    2013-01-01

    Recent studies of urban scaling show that important socioeconomic city characteristics such as wealth and innovation capacity exhibit a nonlinear, in particular a power law, scaling with population size. These nonlinear effects are common to all cities, with similar power law exponents. These findings mean that the larger the city, the more disproportionately it is a place of wealth and innovation. Local properties of cities cause deviations from the expected behavior as predicted by the power law scaling. In this paper we demonstrate that universities show a behavior similar to that of cities in the distribution of 'gross university income', in terms of total number of citations, over 'size', in terms of total number of publications. Moreover, the power law exponents for university scaling are comparable to those for urban scaling. We find that deviations from the expected behavior can indeed be explained by specific local properties of universities, particularly the field-specific composition of a university and its quality in terms of field-normalized citation impact. By studying both the set of the 500 largest universities worldwide and a specific subset of these 500 universities--the top-100 European universities--we are also able to distinguish between properties of universities with and without selection on one specific local property, the quality of a university in terms of its average field-normalized citation impact. The analysis also reveals an interesting observation concerning the working of a crucial property of networked systems: preferential attachment.
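
    Scaling exponents of this kind are usually estimated as the slope of a log-log regression of 'income' on 'size'. Below is a minimal sketch with simulated data (not the actual university dataset); the exponent value is invented purely to exercise the fit.

    ```python
    import numpy as np

    # Fit a power law  citations ~ a * publications^beta  on log-log axes, the
    # standard way urban/university scaling exponents are estimated (toy data).
    rng = np.random.default_rng(42)
    publications = rng.integers(2_000, 60_000, size=200)
    true_beta = 1.15                              # super-linear, as in urban scaling
    citations = 3.0 * publications**true_beta * rng.lognormal(0, 0.2, size=200)

    beta, log_a = np.polyfit(np.log(publications), np.log(citations), deg=1)
    print(f"estimated exponent beta = {beta:.2f} (data generated with {true_beta})")
    ```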

  3. Fire toxicity scaling

    SciTech Connect

    Braun, E.; Levin, B.C.; Paabo, M.; Gurman, J.; Holt, T.

    1987-02-01

    The toxicity of the thermal-decomposition products from two flexible polyurethane foams (with and without a fire retardant) and a cotton upholstery fabric was evaluated in a series of small-scale tests and large-scale single mock-up upholstery chair tests during smoldering or flaming decomposition. In addition, other fire-property data such as rates of heat release, effective heats of combustion, specific gas species yields, and smoke obscuration were measured. The degree of toxicity observed during and following the flaming tests (both large-scale room burns and the NBS Toxicity Tests) could be explained by a 3-Gas Model which includes the combined toxicological effects of CO, CO2, and HCN. Essentially no animal deaths were noted during the thirty-minute exposures to the non-flaming or smoldering combustion products produced in the NBS Toxicity Test Method or the large-scale room test. In the large-scale room tests, little toxicological difference was noted between decomposition products from the burn room and a second room 12 meters away.

  4. Full Scale Tunnel model

    NASA Technical Reports Server (NTRS)

    1929-01-01

    Interior view of Full-Scale Tunnel (FST) model. (Small human figures have been added for scale.) On June 26, 1929, Elton W. Miller wrote to George W. Lewis proposing the construction of a model of the full-scale tunnel. 'The excellent energy ratio obtained in the new wind tunnel of the California Institute of Technology suggests that before proceeding with our full scale tunnel design, we ought to investigate the effect on energy ratio of such factors as: 1. small included angle for the exit cone; 2. carefully designed return passages of circular section as far as possible, without sudden changes in cross sections; 3. tightness of walls. It is believed that much useful information can be obtained by building a model of about 1/16 scale, that is, having a closed throat of 2 ft. by 4 ft. The outside dimensions would be about 12 ft. by 25 ft. in plan and the height 4 ft. Two propellers will be required about 28 in. in diameter, each to be driven by direct current motor at a maximum speed of 4500 R.P.M. Provision can be made for altering the length of certain portions, particularly the exit cone, and possibly for the application of boundary layer control in order to effect satisfactory air flow.'

  5. Atomic Scale Plasmonic Switch.

    PubMed

    Emboras, Alexandros; Niegemann, Jens; Ma, Ping; Haffner, Christian; Pedersen, Andreas; Luisier, Mathieu; Hafner, Christian; Schimmel, Thomas; Leuthold, Juerg

    2016-01-13

    The atom sets an ultimate scaling limit to Moore's law in the electronics industry. While electronics research already explores atomic-scale devices, photonics research still deals with devices at the micrometer scale. Here we demonstrate that photonic scaling, similar to electronics, is limited only by the atom. More precisely, we introduce an electrically controlled plasmonic switch operating at the atomic scale. The switch allows for fast and reproducible switching by means of the relocation of an individual or, at most, a few atoms in a plasmonic cavity. Depending on the location of the atom, either of two distinct plasmonic cavity resonance states is supported. Experimental results show reversible digital optical switching with an extinction ratio of 9.2 dB and operation at room temperature up to MHz rates with femtojoule (fJ) power consumption for a single switch operation. This demonstration of an integrated quantum device that allows photons to be controlled at the atomic level opens intriguing perspectives for a fully integrated and highly scalable chip platform, a platform where optics, electronics, and memory may be controlled at the single-atom level.

  6. Scale adaptive compressive tracking.

    PubMed

    Zhao, Pengpeng; Cui, Shaohui; Gao, Min; Fang, Dan

    2016-01-01

    Recently, the compressive tracking (CT) method (Zhang et al. in Proceedings of European conference on computer vision, pp 864-877, 2012) has attracted much attention due to its high efficiency, but it cannot deal well with objects whose scale changes because its tracking box is fixed. To address this issue, in this paper we propose a scale adaptive CT approach, which adaptively adjusts the scale of the tracking box as the size of the object varies. Our method improves CT in three respects. Firstly, the scale of the tracking box is adaptively adjusted according to the size of the object. Secondly, in the CT method all compressive features are assumed to be independent and to contribute equally to the classifier; in reality, different compressive features have different confidence coefficients. In our proposed method, the confidence coefficients of the features are computed and used to weight their contributions to the classifier. Finally, in the CT method the learning parameter λ is constant, which results in large tracking drift when the object is occluded or its appearance changes substantially. In our proposed method, a variable learning parameter λ is adopted, adjusted according to the rate at which the object's appearance varies. Extensive experiments on the CVPR2013 tracking benchmark demonstrate the superior performance of the proposed method compared to state-of-the-art tracking algorithms. PMID:27386298

  7. No-scale inflation

    NASA Astrophysics Data System (ADS)

    Ellis, John; Garcia, Marcos A. G.; Nanopoulos, Dimitri V.; Olive, Keith A.

    2016-05-01

    Supersymmetry is the most natural framework for physics above the TeV scale, and the corresponding framework for early-Universe cosmology, including inflation, is supergravity. No-scale supergravity emerges from generic string compactifications and yields a non-negative potential, and is therefore a plausible framework for constructing models of inflation. No-scale inflation naturally yields predictions similar to those of the Starobinsky model based on R + R^2 gravity, with a tilted spectrum of scalar perturbations, n_s ≈ 0.96, and small values of the tensor-to-scalar perturbation ratio, r < 0.1, as favoured by Planck and other data on the cosmic microwave background (CMB). Detailed measurements of the CMB may provide insights into the embedding of inflation within string theory as well as its links to collider physics.

  8. Scales of rock permeability

    NASA Astrophysics Data System (ADS)

    Guéguen, Y.; Gavrilenko, P.; Le Ravalec, M.

    1996-05-01

    Permeability is a transport property that is commonly measured in darcy units. Although this unit is very convenient for most purposes, its use prevents one from recognizing that permeability has units of length squared. Physically, the square root of permeability can thus be seen as a characteristic length or a characteristic pore size. At the laboratory scale, the identification of this characteristic length is a good example of how experimental measurements and theoretical modelling can be integrated. Three distinct identifications are in current use, relying on three different techniques: image analysis of thin sections, mercury porosimetry, and nitrogen adsorption. In each case, one or several theoretical models allow us to derive permeability from the experimental data (equivalent channel models, statistical models, effective medium models, percolation and network models). Permeability varies with pressure and temperature, and this is a decisive point for any extrapolation to crustal conditions. As far as pressure is concerned, most of the effect is due to cracks, and a model which does not incorporate this fact will miss its goal. Temperature-induced modifications can be the result of several processes: thermal cracking (due to thermal expansion mismatch and anisotropy, or to fluid pressure build-up) and pressure solution are the two main ones. Experimental data on pressure and temperature effects are difficult to obtain, but they are urgently needed. Finally, an important issue is: to what extent are these small-scale data and models relevant when considering formations at the oil reservoir scale, or at the crust scale? At larger scales, the identification of the characteristic scale is also a major goal, which is examined here.
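
    As a back-of-the-envelope illustration of permeability as a length squared (using the standard conversion 1 darcy ≈ 9.87 × 10^-13 m^2, which is not stated in the abstract), the characteristic pore size of a 1-darcy rock comes out near one micrometre:

    ```python
    import math

    DARCY_IN_M2 = 9.869e-13      # 1 darcy expressed in m^2 (standard conversion)

    def characteristic_pore_size_m(k_darcy: float) -> float:
        """Square root of permeability, read as a characteristic pore length."""
        return math.sqrt(k_darcy * DARCY_IN_M2)

    print(f"{characteristic_pore_size_m(1.0) * 1e6:.2f} um")    # ~1 um  (1 darcy)
    print(f"{characteristic_pore_size_m(1e-3) * 1e6:.3f} um")   # ~0.03 um (1 mD)
    ```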

  9. [Standardization of the Greek version of Zung's Self-rating Anxiety Scale (SAS)].

    PubMed

    Samakouri, M; Bouhos, G; Kadoglou, M; Giantzelidou, A; Tsolaki, K; Livaditis, M

    2012-01-01

    The Self-rating Anxiety Scale (SAS), introduced by Zung, has been widely used in research and in clinical practice for the detection of anxiety. The present study aims at standardizing the Greek version of the SAS. The SAS consists of 20 items rated on a 1-4 Likert-type scale. The total SAS score may vary from 20 (no anxiety at all) to 80 (severe anxiety). Two hundred and fifty-four participants (114 male and 140 female), comprising psychiatric patients, physically ill individuals, and members of the general population, aged 45.40±11.35 years, completed the following: (a) a demographic characteristics questionnaire, (b) the Greek version of the SAS, (c) Spielberger's Modified Greek State-Trait Anxiety Scale (STAI-Gr.-X), and (d) the Zung Depression Rating Scale (ZDRS). Seventy-six participants answered the SAS twice within a median period of 12 days. The following parameters were calculated: (a) the internal consistency of the SAS in terms of Cronbach's α coefficient, (b) its test-retest reliability in terms of the intraclass correlation coefficient (ICC), and (c) its concurrent and convergent validities through Spearman's rho correlations of its score with both the state and trait subscales of the STAI-Gr.-X and with the ZDRS. In addition, in order to evaluate the discriminant validity of the SAS, the scale scores of the three groups of participants (psychiatric patients, physically ill, and general population individuals) were compared with each other in terms of Kruskal-Wallis and Mann-Whitney U tests. The SAS Cronbach's alpha equals 0.897, while the ICC for its test-retest reliability equals 0.913. Spearman's rho concerning validity: (a) when the SAS is compared to the STAI-Gr.-X (state), it equals 0.767; (b) when the SAS is compared to the STAI-Gr.-X (trait), it equals 0.802; and (c) when the SAS is compared to the ZDRS, it equals 0.835. The mentally ill scored significantly higher on the SAS than both the healthy and the general population. In conclusion, the Greek version of the SAS presents very satisfactory psychometric properties regarding

  10. Angular Scaling In Jets

    SciTech Connect

    Jankowiak, Martin; Larkoski, Andrew J.; /SLAC

    2012-02-17

    We introduce a jet shape observable defined for an ensemble of jets in terms of two-particle angular correlations and a resolution parameter R. This quantity is infrared and collinear safe and can be interpreted as a scaling exponent for the angular distribution of mass inside the jet. For small R it is close to the value 2 as a consequence of the approximately scale invariant QCD dynamics. For large R it is sensitive to non-perturbative effects. We describe the use of this correlation function for tests of QCD, for studying underlying event and pile-up effects, and for tuning Monte Carlo event generators.

  11. Scale invariance in biophysics

    NASA Astrophysics Data System (ADS)

    Stanley, H. Eugene

    2000-06-01

    In this general talk, we offer an overview of some problems of interest to biophysicists, medical physicists, and econophysicists. These include DNA sequences, brain plaques in Alzheimer patients, heartbeat intervals, and time series giving price fluctuations in economics. These problems have the common feature that they exhibit features that appear to be scale invariant. Particularly vexing is the problem that some of these scale invariant phenomena are not stationary: their statistical properties vary from one time interval to the next or from one position to the next. We will discuss methods, such as wavelet methods and multifractal methods, to cope with these problems.

  12. xi-scaling

    SciTech Connect

    Gunion, J.F.

    1980-04-01

    A class of purely kinematical corrections to xi-scaling is exposed. These corrections are inevitably present in any realistic hadron model with spin and gauge invariance and lead to phenomenologically important M_hadron^2/Q^2 corrections to Nachtmann moments.

  13. Scale, Composition, and Technology

    ERIC Educational Resources Information Center

    Victor, Peter A.

    2009-01-01

    Scale (gross domestic product), composition (goods and services), and technology (impacts per unit of goods and services) in combination are the proximate determinants in an economy of the resources used, wastes generated, and land transformed. In this article, we examine relationships among these determinants to understand better the contribution…

  14. Scaling the Salary Heights.

    ERIC Educational Resources Information Center

    McNamee, Mike

    1986-01-01

    Federal cutbacks have created new demand for fund-raisers everywhere. Educational fund-raisers are thinking about "pay for performance"--incentive-based pay plans that can help them retain, reward, and motivate talented fund raisers within the tight pay scales common at colleges and universities. (MLW)

  15. Build an Interplanetary Scale.

    ERIC Educational Resources Information Center

    Matthews, Catherine; And Others

    1997-01-01

    Describes an activity in which students use a bathroom scale and a long board to see how their weight changes on other planets and the moon. Materials list, procedures, tables of planet radii, comparative values, and gravitational ratios are provided. (DDR)

  16. Fundamentals of Zoological Scaling.

    ERIC Educational Resources Information Center

    Lin, Herbert

    1982-01-01

    The following animal characteristics are considered to determine how properties and characteristics of various systems change with system size (scaling): skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing-flapping, and maximum sizes of flying and hovering…

  17. Allometric scaling of countries

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Yu, Tongkui

    2010-11-01

    As huge complex systems consisting of geographic regions, natural resources, people, and economic entities, countries follow the allometric scaling law that is ubiquitous in ecological and urban systems. We systematically investigated the allometric scaling relationships between a large number of macroscopic properties and the geographic (area), demographic (population), and economic (GDP, gross domestic product) sizes of countries, respectively. We found that most of the economic, trade, energy consumption, and communication related properties have significant super-linear (the exponent is larger than 1) or nearly linear allometric scaling relations with GDP. Meanwhile, the geographic (arable area, natural resources, etc.), demographic (labor force, military age population, etc.) and transportation-related properties (road length, airports) have significant and sub-linear (the exponent is smaller than 1) allometric scaling relations with area. Several differences between countries and cities in the power law relations with respect to population were pointed out. First, population increases sub-linearly with area in countries. Second, GDP increases linearly with population in countries, not super-linearly as in cities. Finally, electricity or oil consumption per capita increases with population faster in countries than in cities.

  18. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  19. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  20. The Infant Rating Scale.

    ERIC Educational Resources Information Center

    Lindsay, G. A.

    1980-01-01

    A study was made of the usefulness of the Infant Rating Scale (IRS) in the early identification of learning difficulties. Thirteen hundred five-year-olds were rated by their teachers after one term in school. The structure of the IRS, its reliability, and predictive validity are examined. (Author/SJL)

  1. The Spiritual Competency Scale

    ERIC Educational Resources Information Center

    Robertson, Linda A.

    2010-01-01

    This study describes the development of the Spiritual Competency Scale, which was based on the Association for Spiritual, Ethical and Religious Values in Counseling's original Spiritual Competencies. Participants were 662 counseling students from religiously based and secular universities nationwide. Exploratory factor analysis revealed a 22-item,…

  2. Scales of mantle heterogeneity

    NASA Astrophysics Data System (ADS)

    Moore, J. C.; Akber-Knutson, S.; Konter, J.; Kellogg, J.; Hart, S.; Kellogg, L. H.; Romanowicz, B.

    2004-12-01

    A long-standing question in mantle dynamics concerns the scale of heterogeneity in the mantle. Mantle convection tends to both destroy (through stirring) and create (through melt extraction and subduction) heterogeneity in bulk and trace element composition. Over time, these competing processes create variations in geochemical composition along mid-oceanic ridges and among oceanic islands, spanning a range of scales from extremely long wavelength (for example, the DUPAL anomaly) to very small scale (for example, variations amongst melt inclusions). While geochemical data and seismic observations can be used to constrain the length scales of mantle heterogeneity, dynamical mixing calculations can illustrate the processes and timescales involved in stirring and mixing. At the Summer 2004 CIDER workshop on Relating Geochemical and Seismological Heterogeneity in the Earth's Mantle, an interdisciplinary group evaluated scales of heterogeneity in the Earth's mantle using a combined analysis of geochemical data, seismological data and results of numerical models of mixing. We mined the PetDB database for isotopic data from glass and whole rock analyses for the Mid-Atlantic Ridge (MAR) and the East Pacific Rise (EPR), projecting them along the ridge length. We examined Sr isotope variability along the East Pacific rise by looking at the difference in Sr ratio between adjacent samples as a function of distance between the samples. The East Pacific Rise exhibits an overall bowl shape of normal MORB characteristics, with higher values in the higher latitudes (there is, however, an unfortunate gap in sampling, roughly 2000 km long). These background characteristics are punctuated with spikes in values at various locations, some, but not all of which are associated with off-axis volcanism. A Lomb-Scargle periodogram for unevenly spaced data was utilized to construct a power spectrum of the scale lengths of heterogeneity along both ridges. Using the same isotopic systems (Sr, Nd
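
    The record above mentions a Lomb-Scargle periodogram for unevenly spaced along-ridge samples. A minimal sketch of that step is given below using scipy.signal.lombscargle; the along-ridge positions and isotope-like values are synthetic stand-ins, not PetDB data, and the 800 km heterogeneity wavelength is an arbitrary choice for the example.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Synthetic stand-in for unevenly spaced Sr-isotope ratios along a ridge (km).
    rng = np.random.default_rng(0)
    distance_km = np.sort(rng.uniform(0, 5000, 300))          # uneven sampling
    signal = 0.001 * np.sin(2 * np.pi * distance_km / 800.0)  # ~800 km heterogeneity
    values = 0.7025 + signal + rng.normal(0, 2e-4, distance_km.size)

    # Angular frequencies corresponding to wavelengths of 100-2500 km.
    wavelengths = np.linspace(100, 2500, 500)
    omega = 2 * np.pi / wavelengths

    # Pre-center the series and compute the periodogram over the chosen band.
    power = lombscargle(distance_km, values - values.mean(), omega)
    print(f"dominant wavelength ~ {wavelengths[np.argmax(power)]:.0f} km")
    ```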

  3. Scaling Applications in hydrology

    NASA Astrophysics Data System (ADS)

    Gebremichael, Mekonnen

    2010-05-01

    Besides downscaling applications, scaling properties of hydrological fields can be used to address a variety of research questions. In this presentation, we will use scaling properties to address questions related to satellite evapotranspiration algorithms, precipitation-streamflow relationships, and hydrological model calibration. Most of the existing satellite-based evapotranspiration (ET) algorithms have been developed using fine-resolution Landsat TM and ASTER data. However, these algorithms are often applied to coarse-resolution MODIS data. Our results show that applying the satellite-based algorithms, which are developed at ASTER resolution, to MODIS resolution leads to ET estimates that (1) preserve the overall spatial pattern (spatial correlation in excess of 0.90), (2) increase the spatial standard deviation and maximum value, (3) have modest conditional bias: underestimate low ET rates (< 1 mm/day) and overestimate high ET rates; the overestimation is within 20%. The results emphasize the need for exploring alternatives for estimation of ET from MODIS. Understanding the relationship between the scaling properties of precipitation and streamflow is important in a number of applications. We present the results of a detailed river flow fluctuation analysis on daily records from 14 stations in the Flint River basin in Georgia in the United States with focus on effect of watershed area on long memory of river flow fluctuations. The areas of the watersheds draining to the stations range from 22 km2 to 19,606 km2. Results show that large watersheds have more persistent flow fluctuations and stronger long-term (time greater than scale break point) memory than small watersheds while precipitation time series shows weak long-term correlation. We conclude that a watershed acts as a 'filter' for a 'white noise' precipitation with more significant filtering in case of large watersheds. Finally, we compare the scaling properties of simulated and observed spatial soil

  4. Scaling macroscopic aquatic locomotion

    NASA Astrophysics Data System (ADS)

    Gazzola, Mattia; Argentina, Médéric; Mahadevan, L.

    2014-10-01

    Inertial aquatic swimmers that use undulatory gaits range in length L from a few millimetres to 30 metres, across a wide array of biological taxa. Using elementary hydrodynamic arguments, we uncover a unifying mechanistic principle characterizing their locomotion by deriving a scaling relation that links swimming speed U to body kinematics (tail beat amplitude A and frequency ω) and fluid properties (kinematic viscosity ν). This principle can be simply couched as the power law Re ~ Sw^α, where Re = UL/ν >> 1 and Sw = ωAL/ν, with α = 4/3 for laminar flows, and α = 1 for turbulent flows. Existing data from over 1,000 measurements on fish, amphibians, larvae, reptiles, mammals and birds, as well as direct numerical simulations are consistent with our scaling. We interpret our results as the consequence of the convergence of aquatic gaits to the performance limits imposed by hydrodynamics.

  5. Scaling macroscopic aquatic locomotion

    NASA Astrophysics Data System (ADS)

    Gazzola, Mattia; Argentina, Mederic; Mahadevan, Lakshminarayanan

    2014-11-01

    Inertial aquatic swimmers that use undulatory gaits range in length L from a few millimeters to 30 meters, across a wide array of biological taxa. Using elementary hydrodynamic arguments, we uncover a unifying mechanistic principle characterizing their locomotion by deriving a scaling relation that links swimming speed U to body kinematics (tail beat amplitude A and frequency ω) and fluid properties (kinematic viscosity ν). This principle can be simply couched as the power law Re ~ Sw^α, where Re = UL/ν >> 1 and Sw = ωAL/ν, with α = 4/3 for laminar flows, and α = 1 for turbulent flows. Existing data from over 1000 measurements on fish, amphibians, larvae, reptiles, mammals and birds, as well as direct numerical simulations are consistent with our scaling. We interpret our results as the consequence of the convergence of aquatic gaits to the performance limits imposed by hydrodynamics.
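
    A small sketch of the dimensionless numbers defined in the two records above (Re = UL/ν and Sw = ωAL/ν) follows; the swimmer values and the laminar/turbulent crossover used here are purely illustrative and not taken from the paper.

    ```python
    def reynolds_and_swim_numbers(U, L, A, omega, nu=1.0e-6):
        """Re = U*L/nu and Sw = omega*A*L/nu, as defined in the abstract
        (nu defaults to the kinematic viscosity of water in m**2/s)."""
        return U * L / nu, omega * A * L / nu

    # Illustrative, roughly fish-like values: U in m/s, L and A in m, omega in rad/s.
    Re, Sw = reynolds_and_swim_numbers(U=1.0, L=0.3, A=0.06, omega=12.0)
    alpha = 4.0 / 3.0 if Re < 3.0e3 else 1.0   # crossover value chosen for illustration only
    print(f"Re = {Re:.0f}, Sw = {Sw:.0f}, assumed scaling Re ~ Sw**{alpha:.2f}")
    ```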

  6. ELECTRONIC PULSE SCALING CIRCUITS

    DOEpatents

    Cooke-Yarborough, E.H.

    1958-11-18

    Electronic pulse scaling circuits of the kind comprising a series of bi-stable elements connected in sequence, usually in the form of a ring so as to be cyclically repetitive at the highest scaling factor, are described. The scaling circuit comprises a ring system of bi-stable elements, each arranged on turn-off to cause a succeeding element of the ring to be turned on, and one being arranged on turn-off to cause a further element of the ring to be turned on. In addition, separate means are provided for applying a turn-off pulse to all the elements simultaneously, and for resetting the elements to a starting condition at the end of each cycle.

  7. An elastica arm scale.

    PubMed

    Bosi, F; Misseroni, D; Dal Corso, F; Bigoni, D

    2014-09-01

    The concept of a 'deformable arm scale' (completely different from a traditional rigid arm balance) is theoretically introduced and experimentally validated. The idea is not intuitive, but is the result of nonlinear equilibrium kinematics of rods inducing configurational forces, so that deflection of the arms becomes necessary for equilibrium, which would be impossible for a rigid system. In particular, the rigid arms of usual scales are replaced by a flexible elastic lamina, free to slide in a frictionless and inclined sliding sleeve, which can reach a unique equilibrium configuration when two vertical dead loads are applied. Prototypes designed to demonstrate the feasibility of the system show a high accuracy in the measurement of load within a certain range of use. Finally, we show that the presented results are strongly related to snaking of confined beams, with implications for locomotion of serpents, plumbing and smart oil drilling. PMID:25197248

  8. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.

  9. The Extragalactic Distance Scale

    NASA Astrophysics Data System (ADS)

    Livio, Mario; Donahue, Megan; Panagia, Nino

    1997-07-01

    Participants; Preface; Foreword; Early history of the distance scale problem, S. van den Bergh; Cosmology: From Hubble to HST, M. S. Turner; Age constraints from nucleocosmochronology, J. Truran; The ages of globular clusters, P. Demarque; The linearity of the Hubble flow, M. Postman; Gravitational lensing and the extragalactic distance scale, R. D. Blandford and T. Kundic; Using the cosmic microwave background to constrain the Hubble constant, A. Lasenby and M. Jones; Cepheids as distance indicators, N. R. Tanvir; The I-band Tully-Fisher relation and the Hubble constant, R. Giovanelli; The calibration of type Ia supernovae as standard candles, A. Saha; Focusing in on the Hubble constant, G. A. Tammann & M. Federspiel; Interim report on the calibration of the Tully-Fisher relation in the HST Key Project to measure the Hubble constant, J. Mould et al.; Hubble Space Telescope Key Project on the extragalactic distance scale, W. L. Freedman, B. F. Madore and R. C. Kennicutt; Novae as distance indicators, M. Livio; Verifying the planetary nebula luminosity function method, G. H. Jacoby; On the possible use of radio supernovae for distance determinations, K. W. Weiler et al.; Post-AGB stars as standard candles, H. Bond; Helium core flash at the tip of the red giant branch: a population II distance indicator, B. F. Madore, W. L. Freedman and S. Sakai; Globular clusters as distance indicators, B. C. Whitmore; Detached eclipsing binaries as primary distance and age indicators, B. Paczynski; Light echoes: geometric measurement of galaxy distances, W. B. Sparks; The SBF survey of galaxy distances, J. L. Tonry; Extragalactic distance scales: The long and short of it, V. Trimble.

  10. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Mayeda, K.; Ruppert, S.

    2002-12-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by analyzing aftershock sequences in the Western U.S. and Turkey using two different techniques. First we examine the observed regional S-wave spectra by fitting with a parametric model (Walter and Taylor, 2002) with and without variable stress drop scaling. Because the aftershock sequences have common stations and paths we can examine the S-wave spectra of events by size to determine what type of apparent stress scaling, if any, is most consistent with the data. Second we use regional coda envelope techniques (e.g. Mayeda and Walter, 1996; Mayeda et al., 2002) on the same events to directly measure energy and moment. The coda technique corrects for path and site effects using an empirical Green function technique and independent calibration with surface wave derived moments. Our hope is that by carefully analyzing a very large number of events in a consistent manner using two different techniques we can start to resolve this apparent stress scaling issue. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.

  11. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Mayeda, K.; Walter, W. R.

    2003-04-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by applying the same methodology to a series of datasets that spans roughly 10 orders in seismic moment, M0. We will summarize recent results using the coda envelope methodology of Mayeda et al. (2003), which provides the most stable source spectral estimates to date. This methodology eliminates the complicating effects of lateral path heterogeneity, source radiation pattern, directivity, and site response (e.g., amplification, f-max and kappa). We find that in tectonically active continental crustal areas the total radiated energy scales as M0^0.25 whereas in regions of relatively younger oceanic crust, the stress drop is generally lower and exhibits a 1-to-1 scaling with moment. In addition to answering a fundamental question in earthquake source dynamics, this study addresses how one would scale small earthquakes in a particular region up to a future, more damaging earthquake. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
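
    Apparent stress is conventionally defined as μ·E_R/M0 (rigidity times radiated energy over moment); the sketch below shows how the M0^0.25 energy scaling quoted above would translate into apparent stress growing with moment. The rigidity and energy prefactor used here are representative assumptions, not results from these studies.

    ```python
    import numpy as np

    MU = 3.0e10  # representative crustal rigidity in Pa (assumption, not from the abstract)

    def apparent_stress(radiated_energy_j, seismic_moment_nm, mu=MU):
        """Apparent stress = mu * E_R / M0 (standard definition)."""
        return mu * radiated_energy_j / seismic_moment_nm

    # If E_R scales as M0**1.25, apparent stress grows as M0**0.25.
    m0 = np.logspace(13, 20, 8)          # seismic moment in N*m, small to large events
    energy = 1.0e-8 * m0**1.25           # prefactor purely illustrative
    for m, e in zip(m0, energy):
        print(f"M0 = {m:.1e} N*m -> apparent stress ~ {apparent_stress(e, m):.2e} Pa")
    ```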

  12. Extreme Scale Visual Analytics

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Pullum, Laura L; Ramanathan, Arvind; Shipman, Galen M; Thornton, Peter E; Potok, Thomas E

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  13. Beyond the Planck Scale

    SciTech Connect

    Giddings, Steven B.

    2009-12-15

    I outline motivations for believing that important quantum gravity effects lie beyond the Planck scale at both higher energies and longer distances and times. These motivations arise in part from the study of ultra-high energy scattering, and also from considerations in cosmology. I briefly summarize some inferences about such ultra-planckian physics, and clues we might pursue towards the principles of a more fundamental theory addressing the known puzzles and paradoxes of quantum gravity.

  14. Is this scaling nonlinear?

    PubMed

    Leitão, J C; Miotto, J M; Gerlach, M; Altmann, E G

    2016-07-01

    One of the most celebrated findings in complex systems in the last decade is that different indexes y (e.g. patents) scale nonlinearly with the population x of the cities in which they appear, i.e. y ∼ x^β, β ≠ 1. More recently, the generality of this finding has been questioned in studies that used new databases and different definitions of city boundaries. In this paper, we investigate the existence of nonlinear scaling, using a probabilistic framework in which fluctuations are accounted for explicitly. In particular, we show that this allows not only to (i) estimate β and confidence intervals, but also to (ii) quantify the evidence in favour of β≠1 and (iii) test the hypothesis that the observations are compatible with the nonlinear scaling. We employ this framework to compare five different models to 15 different datasets and we find that the answers to points (i)-(iii) crucially depend on the fluctuations contained in the data, on how they are modelled, and on the fact that the city sizes are heavy-tailed distributed. PMID:27493764

  15. Is this scaling nonlinear?

    PubMed Central

    2016-01-01

    One of the most celebrated findings in complex systems in the last decade is that different indexes y (e.g. patents) scale nonlinearly with the population x of the cities in which they appear, i.e. y ∼ x^β, β ≠ 1. More recently, the generality of this finding has been questioned in studies that used new databases and different definitions of city boundaries. In this paper, we investigate the existence of nonlinear scaling, using a probabilistic framework in which fluctuations are accounted for explicitly. In particular, we show that this allows not only to (i) estimate β and confidence intervals, but also to (ii) quantify the evidence in favour of β≠1 and (iii) test the hypothesis that the observations are compatible with the nonlinear scaling. We employ this framework to compare five different models to 15 different datasets and we find that the answers to points (i)–(iii) crucially depend on the fluctuations contained in the data, on how they are modelled, and on the fact that the city sizes are heavy-tailed distributed. PMID:27493764
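
    The two records above test whether β in y ∼ x^β differs from 1. The sketch below is only a plain log-log least-squares baseline with a normal-approximation confidence interval for β, not the probabilistic framework the paper develops, and the city data are synthetic.

    ```python
    import numpy as np

    def fit_beta_loglog(x, y):
        """Baseline estimate of beta in y ~ x**beta via log-log least squares,
        with an approximate 95% confidence interval (normal approximation)."""
        lx, ly = np.log(x), np.log(y)
        A = np.vstack([lx, np.ones_like(lx)]).T
        coef, res, *_ = np.linalg.lstsq(A, ly, rcond=None)
        beta, n = coef[0], lx.size
        sigma2 = res[0] / (n - 2)
        se_beta = np.sqrt(sigma2 / np.sum((lx - lx.mean()) ** 2))
        return beta, (beta - 1.96 * se_beta, beta + 1.96 * se_beta)

    # Hypothetical city data: population versus patent counts with lognormal scatter.
    rng = np.random.default_rng(1)
    pop = np.exp(rng.uniform(np.log(5e4), np.log(5e6), 200))
    patents = 1e-3 * pop**1.15 * np.exp(rng.normal(0, 0.4, pop.size))
    beta, ci = fit_beta_loglog(pop, patents)
    print(f"beta = {beta:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}); nonlinear if CI excludes 1")
    ```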

  16. Scaling body size fluctuations

    PubMed Central

    Giometto, Andrea; Altermatt, Florian; Carrara, Francesco; Maritan, Amos; Rinaldo, Andrea

    2013-01-01

    The size of an organism matters for its metabolic, growth, mortality, and other vital rates. Scale-free community size spectra (i.e., size distributions regardless of species) are routinely observed in natural ecosystems and are the product of intra- and interspecies regulation of the relative abundance of organisms of different sizes. Intra- and interspecies distributions of body sizes are thus major determinants of ecosystems’ structure and function. We show experimentally that single-species mass distributions of unicellular eukaryotes covering different phyla exhibit both characteristic sizes and universal features over more than four orders of magnitude in mass. Remarkably, we find that the mean size of a species is sufficient to characterize its size distribution fully and that the latter has a universal form across all species. We show that an analytical physiological model accounts for the observed universality, which can be synthesized in a log-normal form for the intraspecies size distributions. We also propose how ecological and physiological processes should interact to produce scale-invariant community size spectra and discuss the implications of our results on allometric scaling laws involving body mass. PMID:23487793

  17. Urban scaling in Europe.

    PubMed

    Bettencourt, Luís M A; Lobo, José

    2016-03-01

    Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system. PMID:26984190

  18. Urban scaling in Europe

    PubMed Central

    Bettencourt, Luís M. A.; Lobo, José

    2016-01-01

    Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system. PMID:26984190

  19. Mechanism for salt scaling

    NASA Astrophysics Data System (ADS)

    Valenza, John J., II

    Salt scaling is superficial damage caused by freezing a saline solution on the surface of a cementitious body. The damage consists of the removal of small chips or flakes of binder. The discovery of this phenomenon in the early 1950's prompted hundreds of experimental studies, which clearly elucidated the characteristics of this damage. In particular it was shown that a pessimum salt concentration exists, where a moderate salt concentration (~3%) results in the most damage. Despite the numerous studies, the mechanism responsible for salt scaling has not been identified. In this work it is shown that salt scaling is a result of the large thermal expansion mismatch between ice and the cementitious body, and that the mechanism responsible for damage is analogous to glue-spalling. When ice forms on a cementitious body a bi-material composite is formed. The thermal expansion coefficient of the ice is ~5 times that of the underlying body, so when the temperature of the composite is lowered below the melting point, the ice goes into tension. Once this stress exceeds the strength of the ice, cracks initiate in the ice and propagate into the surface of the cementitious body, removing a flake of material. The glue-spall mechanism accounts for all of the characteristics of salt scaling. In particular, a theoretical analysis is presented which shows that the pessimum concentration is a consequence of the effect of brine pockets on the mechanical properties of ice, and that the damage morphology is accounted for by fracture mechanics. Finally, empirical evidence is presented that proves that the glue-spall mechanism is the primary cause of salt scaling. The primary experimental tool used in this study is a novel warping experiment, where a pool of liquid is formed on top of a thin (~3 mm) plate of cement paste. Stresses in the plate, including thermal expansion mismatch, result in warping of the plate, which is easily detected. This technique revealed the existence of
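
    As a back-of-envelope check on the glue-spall argument above, the sketch below evaluates the standard equibiaxial thermal-mismatch stress σ = E·Δα·ΔT/(1 − ν) for an ice layer bonded to cement paste; the material constants are representative literature values, not numbers taken from this study.

    ```python
    def ice_thermal_mismatch_stress(delta_T, E_ice=9e9, nu_ice=0.33,
                                    alpha_ice=50e-6, alpha_paste=10e-6):
        """Equibiaxial tensile stress in an ice layer bonded to cement paste when the
        composite is cooled by delta_T (K). Thin-film mismatch formula; the elastic
        and expansion constants are representative assumptions."""
        return E_ice * (alpha_ice - alpha_paste) * delta_T / (1.0 - nu_ice)

    for dT in (5, 10, 20):
        sigma = ice_thermal_mismatch_stress(dT)
        print(f"cooling by {dT:2d} K -> ice stress ~ {sigma / 1e6:.1f} MPa "
              "(ice tensile strength is of order 1 MPa)")
    ```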

  20. The Practicality of Behavioral Observation Scales, Behavioral Expectation Scales, and Trait Scales.

    ERIC Educational Resources Information Center

    Wiersma, Uco; Latham, Gary P.

    1986-01-01

    The practicality of three appraisal instruments was measured in terms of user preference, namely, behavioral observation scales (BOS), behavioral expectation scales (BES), and trait scales. In all instances, BOS were preferred to BES, and in all but two instances, BOS were viewed as superior to trait scales. (Author/ABB)

  1. Comparing the theoretical versions of the Beaufort scale, the T-Scale and the Fujita scale

    NASA Astrophysics Data System (ADS)

    Meaden, G. Terence; Kochev, S.; Kolendowicz, L.; Kosa-Kiss, A.; Marcinoniene, Izolda; Sioutas, Michalis; Tooming, Heino; Tyrrell, John

    2007-02-01

    2005 is the bicentenary of the Beaufort Scale and its wind-speed codes: the marine version in 1805 and the land version later. In the 1920s when anemometers had come into general use, the Beaufort Scale was quantified by a formula based on experiment. In the early 1970s two tornado wind-speed scales were proposed: (1) an International T-Scale based on the Beaufort Scale; and (2) Fujita's damage scale developed for North America. The International Beaufort Scale and the T-Scale share a common root in having an integral theoretical relationship with an established scientific basis, whereas Fujita's Scale introduces criteria that make its intensities non-integral with Beaufort. Forces on the T-Scale, where T stands for Tornado force, span the range 0 to 10 which is highly useful world wide. The shorter range of Fujita's Scale (0 to 5) is acceptable for American use but less convenient elsewhere. To illustrate the simplicity of the decimal T-Scale, mean hurricane wind speed of Beaufort 12 is T2 on the T-Scale but F1.121 on the F-Scale; while a tornado wind speed of T9 (= B26) becomes F4.761. However, the three wind scales can be unified by either making F-Scale numbers exactly half the magnitude of T-Scale numbers [i.e. F'half = T / 2 = (B / 4) - 2] or by doubling the numbers of this revised version to give integral equivalence with the T-Scale. The result is a decimal formula F'double = T = (B / 2) - 4 named the TF-Scale where TF stands for Tornado Force. This harmonious 10-digit scale has all the criteria needed for world-wide practical effectiveness.
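
    The conversions quoted in the abstract are simple linear relations; a small sketch follows, using T = (B/2) − 4 and the two proposed F conventions. The function names are ours, introduced only for illustration.

    ```python
    def beaufort_to_t(b):
        """T-Scale force from Beaufort force: T = B/2 - 4 (relation quoted in the abstract)."""
        return b / 2.0 - 4.0

    def t_to_tf(t):
        """Proposed TF-Scale: TF = T, i.e. F'double = (B/2) - 4."""
        return t

    def t_to_f_half(t):
        """Alternative 'half' convention: F'half = T/2."""
        return t / 2.0

    # Worked examples from the abstract: Beaufort 12 (mean hurricane wind) -> T2,
    # and a T9 tornado wind corresponds to Beaufort 26.
    print(beaufort_to_t(12))   # 2.0
    print(beaufort_to_t(26))   # 9.0
    ```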

  2. Small-scale strength

    SciTech Connect

    Anderson, J.L.

    1995-11-01

    In the world of power project development there is a market for smaller scale cogeneration projects in the range of 1MW to 10MW. In the European Union alone, this range will account for about $25 Billion in value over the next 10 years. By adding the potential that exists in Eastern Europe, the numbers are even more impressive. In Europe, only about 7 percent of needed electrical power is currently produced through cogeneration installations; this is expected to change to around 15 percent by the year 2000. Less than one year ago, two equipment manufacturers formed Dutch Power Partners (DPP) to focus on the market for industrial cogeneration throughout Europe.

  3. Reconsidering Fault Slip Scaling

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Wech, A.; Creager, K. C.; Obara, K.; Agnew, D. C.

    2015-12-01

    The scaling of fault slip events given by the relationship between the scalar moment M0 and duration T potentially provides key constraints on the underlying physics controlling slip. Many studies have suggested that measurements of M0 and T are related as M0 = Kf T^3 for 'fast' slip events (earthquakes) and M0 = Ks T for 'slow' slip events, in which Kf and Ks are proportionality constants, although some studies have inferred intermediate relations. Here 'slow' and 'fast' refer to slip front propagation velocities, either so slow that seismic radiation is too small or long period to be measurable or fast enough that dynamic processes may be important for the slip process and measurable seismic waves radiate. Numerous models have been proposed to explain the differing M0-T scaling relations. We show that a single, simple dislocation model of slip events within a bounded slip zone may explain nearly all M0-T observations. Rather than different scaling for fast and slow populations, we suggest that within each population the scaling changes from M0 proportional to T^3 to M0 proportional to T when the slipping area reaches the slip zone boundaries and transitions from unbounded, 2-dimensional to bounded, 1-dimensional growth. This transition has not been apparent previously for slow events because data have sampled only the bounded regime and may be obscured for earthquakes when observations from multiple tectonic regions are combined. We have attempted to sample the expected transition between bounded and unbounded regimes for the slow slip population, measuring tremor cluster parameters from catalogs for Japan and Cascadia and using them as proxies for small slow slip event characteristics. For fast events we employed published earthquake slip models. Observations corroborate our hypothesis, but highlight observational difficulties. We find that M0-T observations for both slow and fast slip events, spanning 12 orders of magnitude in M0, are consistent with a single model based on dislocation
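
    A minimal sketch of the two growth regimes discussed above (M0 ∝ T^3 while slip grows in two dimensions, M0 ∝ T once it is confined to a bounded zone) is given below; the proportionality constants are hypothetical, since the study estimates such quantities from data.

    ```python
    import numpy as np

    def moment_unbounded(T, K):
        """M0 = K * T**3: growth while the slipping area is unbounded (2-D growth)."""
        return K * T**3

    def moment_bounded(T, K):
        """M0 = K * T: growth after slip fills the bounded zone (1-D growth)."""
        return K * T

    # Illustrative proportionality constants (hypothetical values).
    Kf, Ks = 1.0e16, 1.0e13            # N*m / s**3 and N*m / s
    durations = np.logspace(-1, 6, 8)  # seconds, from small earthquakes to slow-slip events
    for T in durations:
        print(f"T = {T:9.1f} s : M0 ~ {moment_unbounded(T, Kf):.1e} (T^3 regime) "
              f"or {moment_bounded(T, Ks):.1e} (T regime) N*m")
    ```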

  4. Soil organic carbon across scales.

    PubMed

    O'Rourke, Sharon M; Angers, Denis A; Holden, Nicholas M; McBratney, Alex B

    2015-10-01

    Mechanistic understanding of scale effects is important for interpreting the processes that control the global carbon cycle. Greater attention should be given to scale in soil organic carbon (SOC) science so that we can devise better policy to protect/enhance existing SOC stocks and ensure sustainable use of soils. Global issues such as climate change require consideration of SOC stock changes at the global and biosphere scale, but human interaction occurs at the landscape scale, with consequences at the pedon, aggregate and particle scales. This review evaluates our understanding of SOC across all these scales in the context of the processes involved in SOC cycling at each scale and with emphasis on stabilizing SOC. Current synergy between science and policy is explored at each scale to determine how well each is represented in the management of SOC. An outline of how SOC might be integrated into a framework of soil security is examined. We conclude that SOC processes at the biosphere to biome scales are not well understood. Instead, SOC has come to be viewed as a large-scale pool subject to carbon flux. Better understanding exists for SOC processes operating at the scales of the pedon, aggregate and particle. At the landscape scale, the influence of large- and small-scale processes has the greatest interaction and is exposed to the greatest modification through agricultural management. Policy implemented at regional or national scale tends to focus at the landscape scale without due consideration of the larger scale factors controlling SOC or the impacts of policy for SOC at the smaller SOC scales. What is required is a framework that can be integrated across a continuum of scales to optimize SOC management.

  5. Soil organic carbon across scales.

    PubMed

    O'Rourke, Sharon M; Angers, Denis A; Holden, Nicholas M; McBratney, Alex B

    2015-10-01

    Mechanistic understanding of scale effects is important for interpreting the processes that control the global carbon cycle. Greater attention should be given to scale in soil organic carbon (SOC) science so that we can devise better policy to protect/enhance existing SOC stocks and ensure sustainable use of soils. Global issues such as climate change require consideration of SOC stock changes at the global and biosphere scale, but human interaction occurs at the landscape scale, with consequences at the pedon, aggregate and particle scales. This review evaluates our understanding of SOC across all these scales in the context of the processes involved in SOC cycling at each scale and with emphasis on stabilizing SOC. Current synergy between science and policy is explored at each scale to determine how well each is represented in the management of SOC. An outline of how SOC might be integrated into a framework of soil security is examined. We conclude that SOC processes at the biosphere to biome scales are not well understood. Instead, SOC has come to be viewed as a large-scale pool subject to carbon flux. Better understanding exists for SOC processes operating at the scales of the pedon, aggregate and particle. At the landscape scale, the influence of large- and small-scale processes has the greatest interaction and is exposed to the greatest modification through agricultural management. Policy implemented at regional or national scale tends to focus at the landscape scale without due consideration of the larger scale factors controlling SOC or the impacts of policy for SOC at the smaller SOC scales. What is required is a framework that can be integrated across a continuum of scales to optimize SOC management. PMID:25918852

  6. Scaling in Transportation Networks

    PubMed Central

    Louf, Rémi; Roth, Camille; Barthelemy, Marc

    2014-01-01

    Subway systems span most large cities, and railway networks most countries in the world. These networks are fundamental in the development of countries and their cities, and it is therefore crucial to understand their formation and evolution. However, if the topological properties of these networks are fairly well understood, how they relate to population and socio-economical properties remains an open question. We propose here a general coarse-grained approach, based on a cost-benefit analysis that accounts for the scaling properties of the main quantities characterizing these systems (the number of stations, the total length, and the ridership) with the substrate's population, area and wealth. More precisely, we show that the length, number of stations and ridership of subways and rail networks can be estimated knowing the area, population and wealth of the underlying region. These predictions are in good agreement with data gathered for about subway systems and more than railway networks in the world. We also show that train networks and subway systems can be described within the same framework, but with a fundamental difference: while the interstation distance seems to be constant and determined by the typical walking distance for subways, the interstation distance for railways scales with the number of stations. PMID:25029528

  7. Static Scale Conversion (SSC)

    2007-01-19

    The Static Scale Conversion (SSC) software is a unique enhancement to the AIMVEE system. It enables an SSC to weigh and measure vehicles and cargo dynamically (i.e., as they pass over the large scale). Included in the software is the AIMVEE computer code base. The SSC and AIMVEE computer system electronically continue to retrieve deployment information, identify vehicles automatically and determine total weight, individual axle weights, axle spacing and center-of-balance for any wheeled vehicle in motion. The AIMVEE computer code system can also perform these functions statically for both wheeled vehicles and cargo with information. The AIMVEE computer code system incorporates digital images and applies cubing algorithms to determine length, width, and height for cubic dimensions of both vehicle and cargo. Once all this information is stored, it electronically links to data collection and dissemination systems to provide “actual” weight and measurement information for planning, deployment, and in-transit visibility.

  8. Static Scale Conversion (SSC)

    SciTech Connect

    2007-01-19

    The Static Scale Conversion (SSC) software is a unique enhancement to the AIMVEE system. It enables an SSC to weigh and measure vehicles and cargo dynamically (i.e., as they pass over the large scale). Included in the software is the AIMVEE computer code base. The SSC and AIMVEE computer system electronically continue to retrieve deployment information, identify vehicles automatically and determine total weight, individual axle weights, axle spacing and center-of-balance for any wheeled vehicle in motion. The AIMVEE computer code system can also perform these functions statically for both wheeled vehicles and cargo with information. The AIMVEE computer code system incorporates digital images and applies cubing algorithms to determine length, width, and height for cubic dimensions of both vehicle and cargo. Once all this information is stored, it electronically links to data collection and dissemination systems to provide “actual” weight and measurement information for planning, deployment, and in-transit visibility.

  9. Aging scaled Brownian motion.

    PubMed

    Safdari, Hadiseh; Chechkin, Aleksei V; Jafari, Gholamreza R; Metzler, Ralf

    2015-04-01

    Scaled Brownian motion (SBM) is widely used to model anomalous diffusion of passive tracers in complex and biological systems. It is a highly nonstationary process governed by the Langevin equation for Brownian motion, however, with a power-law time dependence of the noise strength. Here we study the aging properties of SBM for both unconfined and confined motion. Specifically, we derive the ensemble and time averaged mean squared displacements and analyze their behavior in the regimes of weak, intermediate, and strong aging. A very rich behavior is revealed for confined aging SBM depending on different aging times and whether the process is sub- or superdiffusive. We demonstrate that the information on the aging factorizes with respect to the lag time and exhibits a functional form that is identical to the aging behavior of scale-free continuous time random walk processes. While SBM exhibits a disparity between ensemble and time averaged observables and is thus weakly nonergodic, strong aging is shown to effect a convergence of the ensemble and time averaged mean squared displacement. Finally, we derive the density of first passage times in the semi-infinite domain that features a crossover defined by the aging time. PMID:25974439
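
    A small Euler-Maruyama sketch of the SBM process itself follows (time-dependent diffusivity D(t) ∝ t^(α−1), so the ensemble mean squared displacement grows as t^α). It does not reproduce the paper's aging analysis, and all parameters are illustrative.

    ```python
    import numpy as np

    def simulate_sbm(alpha, n_steps=10_000, dt=1e-2, D0=1.0, rng=None):
        """1D scaled Brownian motion: D(t) = alpha * D0 * t**(alpha - 1),
        giving an ensemble MSD of 2 * D0 * t**alpha (sub- or superdiffusive)."""
        if rng is None:
            rng = np.random.default_rng()
        t = dt * np.arange(1, n_steps + 1)
        D = alpha * D0 * t**(alpha - 1.0)
        steps = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_steps)
        return t, np.cumsum(steps)

    # Crude check of the ensemble MSD exponent for subdiffusion (alpha = 0.5).
    rng = np.random.default_rng(2)
    ensemble = np.array([simulate_sbm(0.5, rng=rng)[1] for _ in range(500)])
    t = 1e-2 * np.arange(1, ensemble.shape[1] + 1)
    msd = (ensemble**2).mean(axis=0)
    slope = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)[0]
    print(f"fitted MSD exponent ~ {slope:.2f} (expected ~0.5)")
    ```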

  10. Returns to Scale and Economies of Scale: Further Observations.

    ERIC Educational Resources Information Center

    Gelles, Gregory M.; Mitchell, Douglas W.

    1996-01-01

    Maintains that most economics textbooks continue to repeat past mistakes concerning returns to scale and economies of scale under assumptions of constant and nonconstant input prices. Provides an adaptation for a calculus-based intermediate microeconomics class that demonstrates the pointwise relationship between returns to scale and economies of…

  11. Global scale precipitation from monthly to centennial scales: empirical space-time scaling analysis, anthropogenic effects

    NASA Astrophysics Data System (ADS)

    de Lima, Isabel; Lovejoy, Shaun

    2016-04-01

    The characterization of precipitation scaling regimes represents a key contribution to the improved understanding of space-time precipitation variability, which is the focus here. We conduct space-time scaling analyses of spectra and Haar fluctuations in precipitation, using three global scale precipitation products (one instrument based, one reanalysis based, one satellite and gauge based), from monthly to centennial scales and planetary down to several hundred kilometers in spatial scale. Results show the presence - similarly to other atmospheric fields - of an intermediate "macroweather" regime between the familiar weather and climate regimes: we systematically characterize the temporal, spatial and joint space-time statistics and variability of macroweather precipitation, and the outer scale limit of temporal scaling. These regimes qualitatively and quantitatively alternate in the way fluctuations vary with scale. In the macroweather regime, the fluctuations diminish with time scale (this is important for seasonal, annual, and decadal forecasts) while anthropogenic effects increase with time scale. Our approach determines the time scale at which the anthropogenic signal can be detected above the natural variability noise: the critical scale is about 20-40 yrs (depending on the product and on the spatial scale). This explains, for example, why studies that use data covering only a few decades do not easily give evidence of anthropogenic changes in precipitation, as a consequence of warming: the period is too short. Overall, while showing that precipitation can be modeled with space-time scaling processes, our results clarify the different precipitation scaling regimes and further allow us to quantify the agreement (and lack of agreement) of the precipitation products as a function of space and time scales. Moreover, this work contributes to clarifying a basic problem in hydro-climatology, which is to measure precipitation trends at decadal and longer scales and to
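
    The Haar fluctuation at a given lag is essentially the difference between the means of the second and first halves of each window of that length; a minimal sketch is below (conventions differ by a calibration constant), applied to a synthetic stand-in series rather than the precipitation products named above.

    ```python
    import numpy as np

    def haar_fluctuation(series, scale):
        """Mean absolute Haar fluctuation at lag `scale` (in samples): the difference
        between the means of the second and first halves of each window of length
        `scale`. Conventions differ by a calibration constant."""
        half = scale // 2
        flucts = []
        for start in range(0, len(series) - scale + 1, half):
            window = series[start:start + scale]
            flucts.append(abs(window[half:].mean() - window[:half].mean()))
        return float(np.mean(flucts))

    # Synthetic monthly anomaly stand-in: a weak random walk plus white noise.
    rng = np.random.default_rng(3)
    x = np.cumsum(rng.normal(size=1200)) * 0.01 + rng.normal(size=1200)
    for s in (2, 4, 8, 16, 32, 64, 128):
        print(f"lag {s:4d} months: Haar fluctuation = {haar_fluctuation(x, s):.3f}")
    ```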

  12. Validation of Scale of Commitment to Democratic Values among Secondary Students

    ERIC Educational Resources Information Center

    Gafoor, K. Abdul

    2015-01-01

    This study reports development of a reliable and valid instrument for assessing the commitment to democratic values among secondary school students in Kerala from 57 Likert-type statements originally developed in 2007 by Gafoor and Thushara to assess commitment to nine values avowed in the Indian Constitution. Nine separate maximum likelihood…

  13. Absolute flux scale for radioastronomy

    SciTech Connect

    Ivanov, V.P.; Stankevich, K.S.

    1986-07-01

    The authors propose and provide support for a new absolute flux scale for radio astronomy, which is not encumbered with the inadequacies of the previous scales. In constructing it, the method of relative spectra was used (a powerful tool for choosing reference spectra). A review is given of previous flux scales. The authors compare the AIS scale with the scale they propose. Both scales are based on absolute measurements by the ''artificial moon'' method, and they are practically coincident in the range from 0.96 to 6 GHz. At frequencies above 6 GHz and below 0.96 GHz, the AIS scale is overestimated because of incorrect extrapolation of the spectra of the primary and secondary standards. The major results which have emerged from this review of absolute scales in radio astronomy are summarized.

  14. [Research progress on hydrological scaling].

    PubMed

    Liu, Jianmei; Pei, Tiefan

    2003-12-01

    With the development of hydrology and the growing impact of mankind on the environment, the scale issue has become a great challenge to many hydrologists due to the stochasticity and complexity of hydrological phenomena and natural catchments. More and more attention has been given to scaling issues, i.e., inferring a large-scale (or small-scale) hydrological characteristic from certain known catchments, but the problem has not been solved successfully. The first part of this paper introduced some concepts about hydrological scale, scale issue and scaling. The key problem is the spatial heterogeneity of catchments and the temporal and spatial variability of hydrological fluxes. Three approaches to scaling were put forward in the third part: distributed modeling, fractal theory and statistical self-similarity analyses. Existing problems and future research directions were proposed in the last part.

  15. MULTIPLE SCALES FOR SUSTAINABLE RESULTS

    EPA Science Inventory

    This session will highlight recent research that incorporates the use of multiple scales and innovative environmental accounting to better inform decisions that affect sustainability, resilience, and vulnerability at all scales. Effective decision-making involves assessment at mu...

  16. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
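
    A small sketch of the alert logic implied by the thresholds quoted above (fatalities of 1/100/1,000 and losses of $1M/$100M/$1B for yellow/orange/red) follows. How the two criteria are combined here (taking the more severe of the two) is our assumption, not part of the EIS definition.

    ```python
    def eis_style_alert(estimated_fatalities=None, estimated_loss_usd=None):
        """Alert color from the thresholds quoted in the abstract; combining the two
        criteria by taking the more severe level is an assumption for illustration."""
        def level(value, thresholds):
            if value is None:
                return 0
            return sum(value >= t for t in thresholds)

        severity = max(level(estimated_fatalities, (1, 100, 1000)),
                       level(estimated_loss_usd, (1e6, 1e8, 1e9)))
        return ["green", "yellow", "orange", "red"][severity]

    print(eis_style_alert(estimated_fatalities=0, estimated_loss_usd=5e6))   # yellow
    print(eis_style_alert(estimated_fatalities=250))                         # orange
    ```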

  17. Scaling aircraft noise perception.

    NASA Technical Reports Server (NTRS)

    Ollerhead, J. B.

    1973-01-01

    Following a brief review of the background to the study, an extensive experiment is described which was undertaken to assess the practical differences between numerous alternative methods for calculating the perceived levels of individual aircraft flyover sounds. One hundred and twenty recorded sounds, including jets, turboprops, piston aircraft and helicopters were rated by a panel of subjects in a pair comparison test. The results were analyzed to evaluate a number of noise rating procedures, in terms of their ability to accurately estimate both relative and absolute perceived noise levels over a wider dynamic range (84-115 dB SPL) than had generally been used in previous experiments. Performances of the different scales were examined in detail for different aircraft categories, and the merits of different band level summation procedures, frequency weighting functions, duration and tone corrections were investigated.

  18. Indian scales and inventories

    PubMed Central

    Venkatesan, S.

    2010-01-01

    This conceptual, perspective and review paper on Indian scales and inventories begins with clarification on the historical and contemporary meanings of psychometry before linking itself to the burgeoning field of clinimetrics in their applications to the practice of clinical psychology and psychiatry. Clinimetrics is explained as a changing paradigm in the design, administration, and interpretation of quantitative tests, techniques or procedures applied to measurement of clinical variables, traits and processes. As an illustrative sample, this article assembles a bibliographic survey of about 105 out of 2582 research papers (4.07%) scanned through 51 back dated volumes covering 185 issues related to clinimetry as reviewed across a span of over fifty years (1958-2009) in the Indian Journal of Psychiatry. A content analysis of the contributions across distinct categories of mental measurements is explained before linkages are proposed for future directions along these lines. PMID:21836709

  19. Biological scaling and physics.

    PubMed

    Rau, A R P

    2002-09-01

    Kleiber's law in biology states that the specific metabolic rate (metabolic rate per unit mass) scales as M^(-1/4) in terms of the mass M of the organism. A long-standing puzzle is the (-1/4) power in place of the usual expectation of (-1/3) based on the surface to volume ratio in three dimensions. While recent papers by physicists have focused exclusively on geometry in attempting to explain the puzzle, we consider here a specific law of physics that governs fluid flow to show how the (-1/4) power arises under certain conditions. More generally, such a line of approach that identifies a specific physical law as involved and then examines the implications of a power law may illuminate better the role of physics in biology.
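
    Kleiber's law as stated above means the whole-body rate scales as M^(3/4) and the per-unit-mass rate as M^(-1/4); the sketch below simply evaluates that relation. The prefactor B0 is an illustrative normalization, not a value from the paper.

    ```python
    def specific_metabolic_rate(mass_kg, B0=3.4):
        """Kleiber-type scaling: whole-body rate B = B0 * M**(3/4), so the specific
        (per-unit-mass) rate scales as M**(-1/4). B0 is an illustrative prefactor,
        roughly of the right order for mammals if expressed in W * kg**(-3/4)."""
        return B0 * mass_kg**(-0.25)

    for m in (0.02, 1.0, 70.0, 4000.0):   # mouse-ish to elephant-ish masses, kg
        print(f"M = {m:7.2f} kg -> specific rate ~ {specific_metabolic_rate(m):.2f} W/kg")
    ```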

  20. Galactic-scale civilization

    NASA Technical Reports Server (NTRS)

    Kuiper, T. B. H.

    1980-01-01

    Evolutionary arguments are presented in favor of the existence of civilization on a galactic scale. Patterns of physical, chemical, biological, social and cultural evolution leading to increasing levels of complexity are pointed out and explained thermodynamically in terms of the maximization of free energy dissipation in the environment of the organized system. The possibility of the evolution of a global and then a galactic human civilization is considered, and probabilities that the galaxy is presently in its colonization state and that life could have evolved to its present state on earth are discussed. Fermi's paradox of the absence of extraterrestrials in light of the probability of their existence is noted, and a variety of possible explanations is indicated. Finally, it is argued that although mankind may be the first occurrence of intelligence in the galaxy, it is unjustified to presume that this is so.

  1. L-Scaling: An Update.

    ERIC Educational Resources Information Center

    Blankmeyer, Eric

    L-scaling is introduced as a technique for determining the weights in weighted averages or scaled scores for T joint observations on K variables. The technique is so named because of its formal resemblance to the Leontief matrix of mathematical economics. L-scaling is compared to several widely-used procedures for data reduction, and the…

  2. The Gains from Vertical Scaling

    ERIC Educational Resources Information Center

    Briggs, Derek C.; Domingue, Ben

    2013-01-01

    It is often assumed that a vertical scale is necessary when value-added models depend upon the gain scores of students across two or more points in time. This article examines the conditions under which the scale transformations associated with the vertical scaling process would be expected to have a significant impact on normative interpretations…

  3. Westside Test Anxiety Scale Validation

    ERIC Educational Resources Information Center

    Driscoll, Richard

    2007-01-01

    The Westside Test Anxiety Scale is a brief, ten item instrument designed to identify students with anxiety impairments who could benefit from an anxiety-reduction intervention. The scale items cover self-assessed anxiety impairment and cognitions which can impair performance. Correlations between anxiety-reduction as measured by the scale and…

  4. Full Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Construction of Full Scale Tunnel (FST). In November 1929, Smith DeFrance submitted his recommendations for the general design of the Full Scale Wind Tunnel. The last on his list concerned the division of labor required to build this unusual facility. He believed the job had five parts and described them as follows: 'It is proposed that invitations be sent out for bids on five groups of items. The first would be for one contract on the complete structure; second the same as first, including the erection of the cones but not the fabrication, since this would be more of a shipyard job; third would cover structural steel, cover, sash and doors, but not cones or foundation; fourth, foundations; and fifth, fabrication of cones.' DeFrance's memorandum prompted the NACA to solicit estimates from a large number of companies. Preliminary designs and estimates were prepared and submitted to the Bureau of the Budget and Congress appropriated funds on February 20, 1929. The main construction contract with the J.A. Jones Company of Charlotte, North Carolina was signed one year later on February 12, 1930. It was a peculiar structure as the building's steel framework is visible on the outside of the building. DeFrance described this in NACA TR No. 459: 'The entire equipment is housed in a structure, the outside walls of which serve as the outer walls of the return passages. The over-all length of the tunnel is 434 feet 6 inches, the width 222 feet, and the maximum height 97 feet. The framework is of structural steel....' (pp. 292-293)

  5. Full Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Construction of Full-Scale Tunnel (FST). In November 1929, Smith DeFrance submitted his recommendations for the general design of the Full Scale Wind Tunnel. The last on his list concerned the division of labor required to build this unusual facility. He believed the job had five parts and described them as follows: 'It is proposed that invitations be sent out for bids on five groups of items. The first would be for one contract on the complete structure; second the same as first, including the erection of the cones but not the fabrication, since this would be more of a shipyard job; third would cover structural steel, cover, sash and doors, but not cones or foundation; fourth, foundations; and fifth, fabrication of cones.' DeFrance's memorandum prompted the NACA to solicit estimates from a large number of companies. Preliminary designs and estimates were prepared and submitted to the Bureau of the Budget and Congress appropriated funds on February 20, 1929. The main construction contract with the J.A. Jones Company of Charlotte, North Carolina was signed one year later on February 12, 1930. It was a peculiar structure as the building's steel framework is visible on the outside of the building. DeFrance described this in NACA TR No. 459: 'The entire equipment is housed in a structure, the outside walls of which serve as the outer walls of the return passages. The over-all length of the tunnel is 434 feet 6 inches, the width 222 feet, and the maximum height 97 feet. The framework is of structural steel....' (pp. 292-293).

  6. Scaling up: Assessing social impacts at the macro-scale

    SciTech Connect

    Schirmer, Jacki

    2011-04-15

    Social impacts occur at various scales, from the micro-scale of the individual to the macro-scale of the community. Identifying the macro-scale social changes that result from an impacting event is a common goal of social impact assessment (SIA), but is challenging as multiple factors simultaneously influence social trends at any given time, and there are usually only a small number of cases available for examination. While some methods have been proposed for establishing the contribution of an impacting event to macro-scale social change, they remain relatively untested. This paper critically reviews methods recommended to assess macro-scale social impacts, and proposes and demonstrates a new approach. The 'scaling up' method involves developing a chain of logic linking change at the individual/site scale to the community scale. It enables a more problematised assessment of the likely contribution of an impacting event to macro-scale social change than previous approaches. The use of this approach in a recent study of change in dairy farming in south east Australia is described.

  7. The Cross-Scale Mission

    SciTech Connect

    Baumjohann, W.; Nakamura, R.; Horbury, T.; Schwartz, S.; Canu, P.; Roux, A.; Vaivads, A.

    2009-06-16

    Collisionless space plasmas exhibit complex behavior on many scales. Fortunately, one can identify a small number of processes and phenomena, essentially shocks, reconnection and turbulence that play a predominant role in the dynamics of a plasma. These processes act to transfer energy between locations, scales and modes, a transfer characterized by variability and three-dimensional structure on at least three scales: electron kinetic, ion kinetic and fluid scale. The nonlinear interaction between physical processes at these scales is the key to understanding these phenomena. Current and upcoming multi-spacecraft missions such as Cluster, THEMIS, and MMS only study three-dimensional variations on one scale at any given time, but one needs to measure the three scales simultaneously to understand the energy transfer processes and the coupling and interaction between the different scales. A mission called Cross-Scale would comprise three nested groups, each consisting of up to four spacecraft. Each group would have a different spacecraft separation, at approximately the electron and ion gyro radii, and at the larger magnetohydrodynamic or fluid scale. One would therefore be able to measure simultaneously variations on all three important physical scales, for the first time. With the spacecraft traversing key regions of near-Earth space, namely solar wind, bow shock, magnetosheath, magnetopause and magnetotail, all three aforementioned processes can be studied.

  8. Composite Health Plan Quality Scales

    PubMed Central

    Caldis, Todd

    2007-01-01

    This study employs exploratory factor analysis and scale construction methods with commercial Health Plan Employers Data Information Set (HEDIS®) process of care and outcome measures from 1999 to uncover evidence for a unidimensional composite health maintenance organization (HMO) quality scale. Summated scales by category of care are created and then used in a factor analysis that yields a single-factor solution. The category-of-care scales are then combined into a summated composite scale, which exhibits strong evidence of internal consistency (alpha = 0.90). External validity of the composite quality scale was checked by regressing the composite scale on Consumer Assessment of Healthcare Providers and Systems (CAHPS®) survey results for 1999. PMID:17645158
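
    As an illustration of the scale-construction steps described above, the following minimal Python sketch (synthetic data and hypothetical subscale names, not the study's HEDIS code) builds a summated composite from category-of-care subscale scores and checks internal consistency with Cronbach's alpha:

      import numpy as np
      import pandas as pd

      def cronbach_alpha(items: pd.DataFrame) -> float:
          """Cronbach's alpha for a set of subscale (or item) scores."""
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1)
          total_variance = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

      # Hypothetical plan-level category-of-care subscale scores (one row per HMO).
      rng = np.random.default_rng(0)
      common = rng.normal(size=200)
      plans = pd.DataFrame({
          "preventive_care": common + rng.normal(scale=0.5, size=200),
          "chronic_care": common + rng.normal(scale=0.5, size=200),
          "member_outcomes": common + rng.normal(scale=0.5, size=200),
      })

      composite = plans.sum(axis=1)   # summated composite quality scale
      print(f"alpha = {cronbach_alpha(plans):.2f}")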

  9. Solar system to scale

    NASA Astrophysics Data System (ADS)

    Gerwig López, Susanne

    2016-04-01

    One of the most important achievements of astronomical observation has been to determine the extent of the Solar System. The first person able to measure the Earth-Sun distance with only a small error, in the second century BC, is said to have been the Greek astronomer Aristarco de Samos (Aristarchus of Samos). Thanks to Newton's law of universal gravitation, it later became possible to measure, with a small margin of error, the distances between the Sun and the planets. Twelve-year-old students are very interested in everything related to the universe, but they find it difficult to imagine and understand the real distances between the different celestial bodies. To teach the differences between the inner and outer planets and how far away the outer ones are, I have my pupils work on the sizes and distances in our solar system by constructing it to scale. The purpose is to reproduce our solar system to scale on a piece of cardboard, and the procedure is simple. Students in the first year of ESO (12 years old) receive the instructions on a sheet of paper (materials needed: a black cardboard, a pair of scissors, colored pencils, a ruler, adhesive tape, glue, photocopies of the planets and satellites, and the measurements they have to use). On another photocopy they get pictures of the edge of the Sun, the planets, the dwarf planets and some satellites, which they have to color, cut out and stick on the cardboard. The activity is planned as a science project for both Spanish-language and bilingual groups; depending on the group, the instructions are given in Spanish or in English. When the time is up, the students bring their cardboard models to class. They receive a final mark of passing, good or excellent, depending on the accuracy of the measurements, the position of all the celestial bodies, the asteroid belts, personal contributions, etc. Any student who has not followed the instructions gets the chance to redo the work properly, in order not…
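
    The arithmetic behind the activity can be sketched in a few lines of Python; the cardboard length and the rounded mean orbital distances below are assumptions for illustration, not the values on the lesson handout:

      # Convert mean orbital distances (AU) to positions on a cardboard of chosen length.
      mean_distance_au = {
          "Mercury": 0.39, "Venus": 0.72, "Earth": 1.00, "Mars": 1.52,
          "Jupiter": 5.20, "Saturn": 9.58, "Uranus": 19.2, "Neptune": 30.1,
      }

      cardboard_cm = 100.0                                        # usable cardboard length
      cm_per_au = cardboard_cm / max(mean_distance_au.values())   # scale factor

      for planet, d_au in mean_distance_au.items():
          print(f"{planet:8s} {d_au:5.2f} AU -> {d_au * cm_per_au:6.1f} cm from the Sun")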

  10. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  11. Tipping the scales.

    PubMed

    1998-12-01

    In the US, the October 1998 murder of a physician who performed abortions was an outward manifestation of the insidious battle against legal abortion being waged by radical Christian social conservatives seeking to transform the US democracy into a theocracy. This movement has been documented in a publication entitled "Tipping the Scales: The Christian Right's Legal Crusade Against Choice" produced as a result of a 4-year investigation conducted by The Center for Reproductive Law and Policy. This publication describes how these fundamentalists have used sophisticated legal, lobbying, and communication strategies to further their goals of challenging the separation of church and state, opposing family planning and sexuality education that is not based solely on abstinence, promoting school prayer, and restricting homosexual rights. The movement has resulted in the introduction of more than 300 anti-abortion bills in states, 50 of which have passed in 23 states. Most Christian fundamentalist groups provide free legal representation to abortion clinic terrorists, and some groups solicit women to bring specious malpractice claims against providers. Sophisticated legal tactics are used by these groups to remove the taint of extremism and mask the danger posed to US constitutional principles by "a well-financed and zealous brand of radical lawyers and their supporters." PMID:12294553

  12. Scaling of structural failure

    SciTech Connect

    Bazant, Z.P.; Chen, Er-Ping

    1997-01-01

    This article attempts to review the progress achieved in the understanding of scaling and size effect in the failure of structures. Particular emphasis is placed on quasibrittle materials for which the size effect is complicated. Attention is focused on three main types of size effects, namely the statistical size effect due to randomness of strength, the energy release size effect, and the possible size effect due to fractality of fracture or microcracks. Definitive conclusions on the applicability of these theories are drawn. Subsequently, the article discusses the application of the known size effect law for the measurement of material fracture properties, and the modeling of the size effect by the cohesive crack model, nonlocal finite element models and discrete element models. Extensions to compression failure and to the rate-dependent material behavior are also outlined. The damage constitutive law needed for describing a microcracked material in the fracture process zone is discussed. Various applications to quasibrittle materials, including concrete, sea ice, fiber composites, rocks and ceramics are presented.

  13. SPACE BASED INTERCEPTOR SCALING

    SciTech Connect

    G. CANAVAN

    2001-02-01

    Space Based Interceptors (SBI) have ranges that are adequate to address rogue ICBMs. They are not overly sensitive to 30-60 s delay times. Current technologies would support boost phase intercept with about 150 interceptors. Higher acceleration and velocity could reduce that number by about a factor of 3 at the cost of heavier and more expensive Kinetic Kill Vehicles (KKVs). 6g SBI would reduce optimal constellation costs by about 35%; 8g SBI would reduce them another 20%. Interceptor ranges fall rapidly with theater missile range. Constellations increase significantly for ranges under 3,000 km, even with advanced interceptor technology. For distributed launches, these estimates recover earlier strategic scalings, which demonstrate the improved absentee ratio for larger or multiple launch areas. Constellations increase with the number of missiles and the number of interceptors launched at each. The economic estimates above suggest that two SBI per missile with a modest midcourse underlay is appropriate. The SBI KKV technology would appear to be common for space- and surface-based boost phase systems, and could have synergisms with improved midcourse intercept and discrimination systems. While advanced technology could be helpful in reducing costs, particularly for short range theater missiles, current technology appears adequate for pressing rogue ICBM, accidental, and unauthorized launches.

  14. Scaling and Urban Growth

    NASA Astrophysics Data System (ADS)

    Benguigui, L.; Czamanski, D.; Marinov, M.

    This paper presents an analysis of the growth of towns in the Tel Aviv metropolis. It indicates a similarity in the variation of populations so that the population functions can be scaled and superposed one onto the other. This is a strong indication that the growth mechanism for all these towns is the same. Two different models are presented to interpret the population growth: one is an analytic model while the other is a computer simulation. In the dynamic analytic model, we introduced the concept of characteristic time. The growth has two parts: in the first, the derivative is an increasing function: the town is very attractive and there is a short delay between the decision to build and the completion of the process. At this time, there is no shortage of land. However, around a specific time, the delay begins to increase and there is a lack of available land. The rate of population variation then decreases until saturation. The two models give a good quantitative description.
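
    The qualitative behaviour described (accelerating growth, then land scarcity slowing growth toward saturation) can be illustrated with a logistic curve; this is only a hedged sketch with arbitrary parameters, not the authors' analytic model, and t0 here merely plays the role of the characteristic time:

      import numpy as np

      def logistic_population(t, p_max=200_000, growth_rate=0.15, t0=1970):
          """Population that accelerates before t0 and saturates toward p_max after it."""
          return p_max / (1.0 + np.exp(-growth_rate * (t - t0)))

      years = np.arange(1950, 2011, 10)
      for year, pop in zip(years, logistic_population(years)):
          print(year, round(float(pop)))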

  15. Investigation of psychometric properties of the Falls Efficacy Scale using Rasch analysis in patients with hemiplegic stroke.

    PubMed

    Park, Eun Young; Choi, Yoo Im

    2015-09-01

    [Purpose] The purpose of this study was to investigate the psychometric properties of the Falls Efficacy Scale using Rasch analysis in patients with hemiplegic stroke. [Subjects] Fifty-five community-dwelling hemiplegic stroke patients were selected as participants. [Methods] Data were analyzed using the Winsteps program (version 3.62) with the Rasch model to confirm the unidimensionality through item fit, reliability, and appropriateness of the rating scale. [Results] There were no misfit persons or items. Furthermore, infit and outfit statistics appeared adjacent. The person separation value was 3.07, and the reliability coefficient was 0.90. The reliability of all items was at an acceptable level for patients with hemiplegic stroke. [Conclusion] This was the first study to investigate the psychometric properties of the Falls Efficacy Scale using Rasch analysis. The results of this study suggest that the 6-point Falls Efficacy Scale is an appropriate tool for measuring the self-perceived fear of falling in patients with hemiplegic stroke.

  16. Investigation of psychometric properties of the Falls Efficacy Scale using Rasch analysis in patients with hemiplegic stroke

    PubMed Central

    Park, Eun Young; Choi, Yoo Im

    2015-01-01

    [Purpose] The purpose of this study was to investigate the psychometric properties of the Falls Efficacy Scale using Rasch analysis in patients with hemiplegic stroke. [Subjects] Fifty-five community-dwelling hemiplegic stroke patients were selected as participants. [Methods] Data were analyzed using the Winsteps program (version 3.62) with the Rasch model to confirm the unidimensionality through item fit, reliability, and appropriateness of the rating scale. [Results] There were no misfit persons or items. Furthermore, infit and outfit statistics appeared adjacent. The person separation value was 3.07, and the reliability coefficient was 0.90. The reliability of all items was at an acceptable level for patients with hemiplegic stroke. [Conclusion] This was the first study to investigate the psychometric properties of the Falls Efficacy Scale using Rasch analysis. The results of this study suggest that the 6-point Falls Efficacy Scale is an appropriate tool for measuring the self-perceived fear of falling in patients with hemiplegic stroke. PMID:26504303
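
    The person separation index and reliability reported above are related by a standard identity (reliability = G^2 / (1 + G^2), so G = 3.07 corresponds to roughly 0.90). A minimal Python sketch with assumed person measures and standard errors, not the study's Winsteps output, shows the computation:

      import numpy as np

      def person_separation(measures, std_errors):
          """Winsteps-style person separation index and separation reliability."""
          obs_var = np.var(measures, ddof=1)        # observed variance of person measures (logits)
          mse = np.mean(np.square(std_errors))      # mean-square measurement error
          true_var = max(obs_var - mse, 0.0)        # error-corrected variance
          separation = np.sqrt(true_var / mse)      # separation index G
          reliability = true_var / obs_var          # equals G**2 / (1 + G**2)
          return separation, reliability

      rng = np.random.default_rng(1)
      measures = rng.normal(loc=0.0, scale=1.6, size=55)   # hypothetical person measures
      std_errors = np.full(55, 0.5)                        # hypothetical standard errors
      G, R = person_separation(measures, std_errors)
      print(f"separation = {G:.2f}, reliability = {R:.2f}")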

  17. A concept for major incident triage: full-scaled simulation feasibility study

    PubMed Central

    2010-01-01

    Background Efficient management of major incidents involves triage, treatment and transport. In the absence of a standardised interdisciplinary major incident management approach, the Norwegian Air Ambulance Foundation developed the Interdisciplinary Emergency Service Cooperation Course (TAS). The TAS program was established in 1998, and by 2009 approximately 15 500 emergency service professionals had participated in one of more than 500 no-cost courses. The TAS-triage concept is based on the established triage Sieve and Paediatric Triage Tape models but modified with slap-wrap reflective triage tags and paediatric triage stretchers. We evaluated the feasibility and accuracy of the TAS-triage concept in full-scale simulated major incidents. Methods The learners participated in two standardised bus crash simulations: first without and then with TAS-triage training and access to TAS-triage equipment. The instructors calculated triage accuracy and measured time consumption while the learners participated in a self-reported before-after study. Each question was scored on a 7-point Likert scale with points labelled "Did not work" (1) through "Worked excellent" (7). Results Among the 93 (85%) participating emergency service professionals, 48% confirmed the existence of a major incident triage system in their service, whereas 27% had access to triage tags. The simulations without TAS-triage resulted in a mean over- and undertriage of 12%. When TAS-Triage was used, no mistriage was found. The average time from "scene secured to all patients triaged" was 22 minutes (range 15-32) without TAS-triage vs. 10 minutes (range 5-21) with TAS-triage. The participants replied to "How did interdisciplinary cooperation of triage work?" with mean 4.9 (95% CI 4.7-5.2) before the course vs. mean 5.8 (95% CI 5.6-6.0) after the course, p < 0.001. Conclusions Our modified triage Sieve tool is feasible, time-efficient and accurate in allocating priority during simulated bus accidents and may serve as…
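
    A minimal Python sketch of the before-after Likert comparison reported above (made-up 7-point ratings, not the study data; the authors may well have used a different test than the paired t-test shown here):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      before = rng.integers(3, 7, size=93).astype(float)            # hypothetical 7-point ratings
      after = np.clip(before + rng.integers(0, 3, size=93), 1, 7)

      def mean_ci(x, level=0.95):
          """Mean and two-sided confidence interval via the t distribution."""
          half = stats.t.ppf((1 + level) / 2, df=len(x) - 1) * stats.sem(x)
          return x.mean(), x.mean() - half, x.mean() + half

      print("before:", mean_ci(before))
      print("after :", mean_ci(after))
      print("paired t-test:", stats.ttest_rel(after, before))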

  18. Scale Development for Measuring and Predicting Adolescents’ Leisure Time Physical Activity Behavior

    PubMed Central

    Ries, Francis; Romero Granados, Santiago; Arribas Galarraga, Silvia

    2009-01-01

    … Rephrasing the items and scoring them on a Likert-type scale greatly enhanced the subscales' reliability. An identical factorial structure was extracted for both culturally different samples. The obtained factors, namely perceived physical competence, parents' physical activity, perceived resources support, attitude toward physical activity and perceived parental support, matched those hypothesized for the original TPB constructs. PMID:24149606

  19. Industrial scale gene synthesis.

    PubMed

    Notka, Frank; Liss, Michael; Wagner, Ralf

    2011-01-01

    The most recent developments in the area of deep DNA sequencing and downstream quantitative and functional analysis are rapidly adding a new dimension to understanding biochemical pathways and metabolic interdependencies. These increasing insights pave the way to designing new strategies that address public needs, including environmental applications and therapeutic inventions, or novel cell factories for sustainable and reconcilable energy or chemicals sources. Adding yet another level is building upon nonnaturally occurring networks and pathways. Recent developments in synthetic biology have created economic and reliable options for designing and synthesizing genes, operons, and eventually complete genomes. Meanwhile, high-throughput design and synthesis of extremely comprehensive DNA sequences have evolved into an enabling technology already indispensable in various life science sectors today. Here, we describe the industrial perspective of modern gene synthesis and its relationship with synthetic biology. Gene synthesis contributed significantly to the emergence of synthetic biology by not only providing the genetic material in high quality and quantity but also enabling its assembly, according to engineering design principles, in a standardized format. Synthetic biology on the other hand, added the need for assembling complex circuits and large complexes, thus fostering the development of appropriate methods and expanding the scope of applications. Synthetic biology has also stimulated interdisciplinary collaboration as well as integration of the broader public by addressing socioeconomic, philosophical, ethical, political, and legal opportunities and concerns. The demand-driven technological achievements of gene synthesis and the implemented processes are exemplified by an industrial setting of large-scale gene synthesis, describing production from order to delivery.

  20. Validation of the Nurses' Perception of Patient Rounding Scale: An Exploratory Study of the Influence of Shift Work on Nurses' Perception of Patient Rounding.

    PubMed

    Neville, Kathleen; DiBona, Courtney; Mahler, Maureen

    2016-01-01

    Hourly rounds have re-emerged as standard practice among nurses in acute care settings, and there is a need to identify nurses' perceptions regarding this practice. Additional use of the Nurses' Perception of Patient Rounding Scale (NPPRS) is needed to further validate this new instrument. In addition, there exists a dearth of literature that examines the impact of hours worked and shift on nurses' perceptions of patient rounding. The purpose of this descriptive study was to explore nurses' perception of the required practice of patient rounding, to examine the influence of nurses' shift on nurses' perception of rounding practice, and to provide additional psychometric support for the NPPRS. The NPPRS, a 42-item scale in 5-point Likert format, and a demographic information sheet were used in the study. The NPPRS yields three subscales: communication, patient benefits, and nurse benefits. Using a convenience sample of anonymous nurse participants, 76 nurses from five medical-surgical units at a medical center in the northeast corridor of the United States participated in the study. Further psychometric support for the NPPRS was demonstrated. Excellent reliability coefficients via Cronbach's alpha for the total scale (0.91) and each of the subscales were obtained. A statistically significant difference was noted among nurses working 8 hours versus 12 hours or combined 8- and 12-hour workloads. Perceptions of nurse benefits were statistically significantly higher for nurses working 8 hours. In addition, results indicated that nurses perceived rounding to be more beneficial to their own practice than to patients. Analyses revealed that leadership support was instrumental in successful rounding practice. Further support for the NPPRS was obtained through this study. Strong nursing leadership, supportive of rounding, is essential for successful rounding. Further research should examine the efficacy of nurse rounding-developed protocols specific to the shift and unit of…
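
    A minimal Python sketch of the group comparison described above (hypothetical subscale scores for the 8-hour, 12-hour and combined workload groups, not the NPPRS data), using a one-way ANOVA:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      eight_hour = rng.normal(4.2, 0.5, size=25)     # hypothetical nurse-benefits subscale means
      twelve_hour = rng.normal(3.8, 0.5, size=30)
      combined = rng.normal(3.9, 0.5, size=21)

      f_stat, p_value = stats.f_oneway(eight_hour, twelve_hour, combined)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")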

  2. Plague and Climate: Scales Matter

    PubMed Central

    Ben Ari, Tamara; Neerinckx, Simon; Gage, Kenneth L.; Kreppel, Katharina; Laudisoit, Anne; Leirs, Herwig; Stenseth, Nils Chr.

    2011-01-01

    Plague is enzootic in wildlife populations of small mammals in central and eastern Asia, Africa, South and North America, and has been recognized recently as a reemerging threat to humans. Its causative agent Yersinia pestis relies on wild rodent hosts and flea vectors for its maintenance in nature. Climate influences all three components (i.e., bacteria, vectors, and hosts) of the plague system and is a likely factor to explain some of plague's variability from small and regional to large scales. Here, we review effects of climate variables on plague hosts and vectors from individual or population scales to studies on the whole plague system at a large scale. Upscaled versions of small-scale processes are often invoked to explain plague variability in time and space at larger scales, presumably because similar scale-independent mechanisms underlie these relationships. This linearity assumption is discussed in the light of recent research that suggests some of its limitations. PMID:21949648

  3. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-07-01

    As the first stage in examining the mechanical reliability of protective surface oxides, the behavior of alumina scales formed on iron-aluminum alloys during high-temperature cyclic oxidation was characterized in terms of damage and spallation tendencies. Scales were thermally grown on specimens of three iron-aluminum compositions using a series of exposures to air at 1000°C. Gravimetric data and microscopy revealed substantially better integrity and adhesion of the scales grown on an alloy containing zirconium. The use of polished (rather than just ground) specimens resulted in scales that were more suitable for subsequent characterization of mechanical reliability.

  4. DARHT Radiographic Grid Scale Correction

    SciTech Connect

    Warthen, Barry J.

    2015-02-13

    Recently it became apparent that the radiographic grid which has been used to calibrate the dimensional scale of DARHT radiographs was not centered at the location where the objects have been centered. This offset produced an error of 0.188% in the dimensional scaling of the radiographic images processed using the assumption that the grid and objects had the same center. This paper will show the derivation of the scaling correction, explain how new radiographs are being processed to account for the difference in location, and provide the details of how to correct radiographic images processed with the erroneous scale factor.
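
    A minimal sketch of what such a correction looks like in practice (the 0.188% figure is taken from the abstract; the direction of the correction depends on the geometry, so treat the sign here as an assumption):

      SCALE_ERROR_FRACTION = 0.00188   # 0.188% error introduced by the offset calibration grid

      def corrected_length(measured_length_mm: float) -> float:
          """Rescale a length obtained with the erroneous (offset-grid) scale factor."""
          return measured_length_mm / (1.0 + SCALE_ERROR_FRACTION)

      print(corrected_length(100.0))   # about 99.81 mm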

  5. Limitations in thermal scale modeling

    NASA Technical Reports Server (NTRS)

    Macgregor, R. K.

    1971-01-01

    Thermal scale modeling limitations for radiation-conduction systems of unmanned spacecraft, discussing material thermal properties, model dimensions, instrumentation effects, and environment simulation.

  6. Mixed scale joint graphical lasso.

    PubMed

    Pircalabelu, Eugen; Claeskens, Gerda; Waldorp, Lourens J

    2016-10-01

    We have developed a method for estimating brain networks from fMRI datasets that have not all been measured using the same set of brain regions. Some of the coarse-scale regions have been split into smaller subregions. The proposed penalized estimation procedure selects undirected graphical models with similar structures that combine information from several subjects and several coarseness scales. Both within-scale edges and between-scale edges that identify possible connections between a large region and its subregions are estimated. PMID:27324414
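
    The base idea, selecting an undirected graph through a sparse precision matrix, can be sketched with the ordinary single-subject, single-scale graphical lasso in scikit-learn; the paper's mixed-scale joint penalty is not available there, so this is only an illustration of the underlying estimator:

      import numpy as np
      from sklearn.covariance import GraphicalLasso

      rng = np.random.default_rng(0)
      n_timepoints, n_regions = 200, 10
      X = rng.normal(size=(n_timepoints, n_regions))   # stand-in for region-averaged fMRI signals

      model = GraphicalLasso(alpha=0.1).fit(X)         # alpha controls sparsity of the graph
      precision = model.precision_                     # near-zero off-diagonal entries = no edge
      edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
      print(f"{len(edges)} edges selected")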

  7. Validating Large Scale Networks Using Temporary Local Scale Networks

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The USDA NRCS Soil Climate Analysis Network and NOAA Climate Reference Networks are nationwide meteorological and land surface data networks with soil moisture measurements in the top layers of soil. There is considerable interest in scaling these point measurements to larger scales for validating ...

  8. INTERIOR VIEW SHOWING BATCH SCALES. SERIES OF FIVE SCALES WITH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW SHOWING BATCH SCALES. SERIES OF FIVE SCALES WITH SIX DIFFERENT MATERIALS. MIX SIFTED DOWN FROM SILOS ABOVE. INGREDIENTS: SAND, SODA ASH, DOLOMITE LIMESTONE, NEPHELINE SYENITE, SALT CAKE. - Chambers-McKee Window Glass Company, Batch Plant, Clay Avenue Extension, Jeannette, Westmoreland County, PA

  9. Drift Scale THM Model

    SciTech Connect

    J. Rutqvist

    2004-10-07

    This model report documents the drift scale coupled thermal-hydrological-mechanical (THM) processes model development and presents simulations of the THM behavior in fractured rock close to emplacement drifts. The modeling and analyses are used to evaluate the impact of THM processes on permeability and flow in the near-field of the emplacement drifts. The results from this report are used to assess the importance of THM processes on seepage and support in the model reports ''Seepage Model for PA Including Drift Collapse'' and ''Abstraction of Drift Seepage'', and to support arguments for exclusion of features, events, and processes (FEPs) in the analysis reports ''Features, Events, and Processes in Unsaturated Zone Flow and Transport'' and ''Features, Events, and Processes: Disruptive Events''. The total system performance assessment (TSPA) calculations do not use any output from this report. Specifically, the coupled THM process model is applied to simulate the impact of THM processes on hydrologic properties (permeability and capillary strength) and flow in the near-field rock around a heat-releasing emplacement drift. The heat generated by the decay of radioactive waste results in elevated rock temperatures for thousands of years after waste emplacement. Depending on the thermal load, these temperatures are high enough to cause boiling conditions in the rock, resulting in water redistribution and altered flow paths. These temperatures will also cause thermal expansion of the rock, with the potential of opening or closing fractures and thus changing fracture permeability in the near-field. Understanding the THM coupled processes is important for the performance of the repository because the thermally induced permeability changes potentially affect the magnitude and spatial distribution of percolation flux in the vicinity of the drift, and hence the seepage of water into the drift. This is important because a sufficient amount of water must be available within a…

  10. Nonrelativistic scale anomaly, and composite operators with complex scaling dimensions

    SciTech Connect

    Moroz, Sergej

    2011-05-15

    Research Highlights: Nonrelativistic scale anomaly leads to operators with complex scaling dimensions. We study an operator O = ψψ in quantum mechanics with a 1/r² potential. The propagator of the composite operator is analytically computed. - Abstract: It is demonstrated that a nonrelativistic quantum scale anomaly manifests itself in the appearance of composite operators with complex scaling dimensions. In particular, we study nonrelativistic quantum mechanics with an inverse square potential and consider a composite s-wave operator O = ψψ. We analytically compute the scaling dimension of this operator and determine the propagator ⟨0|T O O^+|0⟩. The operator O represents an infinite tower of bound states with a geometric energy spectrum. Operators with higher angular momenta are briefly discussed.
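
    For orientation, the textbook quantum-mechanical origin of such complex exponents can be recalled; this is the standard s-wave analysis of an attractive 1/r² potential, not the paper's field-theoretic derivation:

      % Radial s-wave equation with V(r) = -alpha/r^2; near the origin u(r) ~ r^s.
      \[
        u''(r) + \frac{2m\alpha}{\hbar^2}\,\frac{u(r)}{r^2} = -\frac{2mE}{\hbar^2}\,u(r),
        \qquad s(s-1) = -\frac{2m\alpha}{\hbar^2},
        \qquad s_{\pm} = \tfrac{1}{2} \pm \sqrt{\tfrac{1}{4} - \frac{2m\alpha}{\hbar^2}} .
      \]
      % For 2m*alpha/hbar^2 > 1/4 the exponents become complex, the short-distance behaviour
      % is log-periodic, and the bound states form a geometric tower,
      % E_{n+1}/E_n = exp(-2*pi/nu) with nu = sqrt(2m*alpha/hbar^2 - 1/4).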

  11. Scale invariance vs conformal invariance

    NASA Astrophysics Data System (ADS)

    Nakayama, Yu

    2015-03-01

    In this review article, we discuss the distinction and possible equivalence between scale invariance and conformal invariance in relativistic quantum field theories. Under some technical assumptions, we can prove that scale invariant quantum field theories in d = 2 space-time dimensions necessarily possess the enhanced conformal symmetry. The use of the conformal symmetry is well appreciated in the literature, but the fact that all the scale invariant phenomena in d = 2 space-time dimensions enjoy the conformal property relies on the deep structure of the renormalization group. The outstanding question is whether this feature is specific to d = 2 space-time dimensions or it holds in higher dimensions, too. As of January 2014, our consensus is that there is no known example of scale invariant but non-conformal field theories in d = 4 space-time dimensions under the assumptions of (1) unitarity, (2) Poincaré invariance (causality), (3) discrete spectrum in scaling dimensions, (4) existence of scale current and (5) unbroken scale invariance in the vacuum. We have a perturbative proof of the enhancement of conformal invariance from scale invariance based on the higher dimensional analogue of Zamolodchikov's c-theorem, but the non-perturbative proof is yet to come. As a reference we have tried to collect as many interesting examples of scale invariance in relativistic quantum field theories as possible in this article. We give a complementary holographic argument based on the energy-condition of the gravitational system and the space-time diffeomorphism in order to support the claim of the symmetry enhancement. We believe that the possible enhancement of conformal invariance from scale invariance reveals the sublime nature of the renormalization group and space-time with holography. This review is based on a lecture note on scale invariance vs conformal invariance, on which the author gave lectures at Taiwan Central University for the 5th Taiwan School on Strings and

  12. The Callier-Azusa Scale.

    ERIC Educational Resources Information Center

    Stillman, Robert D., Ed.

    Presented is the Callier-Azusa Scale designed to aid in the assessment of deaf-blind and multihandicapped children in the areas of motor development, perceptual abilities, daily living skills, language development, and socialization. The scale is said to be predicated on the assumption that given the appropriate environment all children follow the…

  13. A Scale of Mobbing Impacts

    ERIC Educational Resources Information Center

    Yaman, Erkan

    2012-01-01

    The aim of this research was to develop the Mobbing Impacts Scale and to examine its validity and reliability analyses. The sample of study consisted of 509 teachers from Sakarya. In this study construct validity, internal consistency, test-retest reliabilities and item analysis of the scale were examined. As a result of factor analysis for…

  14. Rating scales for musician's dystonia

    PubMed Central

    Berque, Patrice; Jabusch, Hans-Christian; Altenmüller, Eckart; Frucht, Steven J.

    2013-01-01

    Musician's dystonia (MD) is a focal adult-onset dystonia most commonly involving the hand. It has much greater relative prevalence than non-musician’s focal hand dystonias, exhibits task specificity at the level of specific musical passages, and is a particularly difficult form of dystonia to treat. For most MD patients, the diagnosis confirms the end of their music performance careers. Research on treatments and pathophysiology is contingent upon measures of motor function abnormalities. In this review, we comprehensively survey the literature to identify the rating scales used in MD and the distribution of their use. We also summarize the extent to which the scales have been evaluated for their clinical utility, including reliability, validity, sensitivity, specificity to MD, and practicality for a clinical setting. Out of 135 publications, almost half (62) included no quantitative measures of motor function. The remaining 73 studies used a variety of choices from among 10 major rating scales. Most used subjective scales involving either patient or clinician ratings. Only 25% (18) of the studies used objective scales. None of the scales has been completely and rigorously evaluated for clinical utility. Whether studies involved treatments or pathophysiologic assays, there was a heterogeneous choice of rating scales used with no clear standard. As a result, the collective interpretive value of those studies is limited because the results are confounded by measurement effects. We suggest that the development and widespread adoption of a new clinically useful rating scale is critical for accelerating basic and clinical research in MD. PMID:23884039

  15. Voice, Schooling, Inequality, and Scale

    ERIC Educational Resources Information Center

    Collins, James

    2013-01-01

    The rich studies in this collection show that the investigation of voice requires analysis of "recognition" across layered spatial-temporal and sociolinguistic scales. I argue that the concepts of voice, recognition, and scale provide insight into contemporary educational inequality and that their study benefits, in turn, from paying attention to…

  16. Contrast Analysis for Scale Differences.

    ERIC Educational Resources Information Center

    Olejnik, Stephen F.; And Others

    Research on tests for scale equality has focused exclusively on an overall test statistic and has not examined procedures for identifying specific differences in multiple group designs. The present study compares four contrast analysis procedures for scale differences in the single factor four-group design: (1) Tukey HSD; (2) Kramer-Tukey; (3)…

  17. The Differentiated Classroom Observation Scale

    ERIC Educational Resources Information Center

    Cassady, Jerrell C.; Neumeister, Kristie L. Speirs; Adams, Cheryll M.; Cross, Tracy L.; Dixon, Felicia A.; Pierce, Rebecca L.

    2004-01-01

    This article presents a new classroom observation scale that was developed to examine the differential learning activities and experiences of gifted children educated in regular classroom settings. The Differentiated Classroom Observation Scale (DCOS) is presented in total, with clarification of the coding practices and strategies. Although the…

  18. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  19. Scale Shrinkage in Vertical Equating.

    ERIC Educational Resources Information Center

    Camilli, Gregory; And Others

    1993-01-01

    Three potential causes of scale shrinkage (measurement error, restriction of range, and multidimensionality) in item response theory vertical equating are discussed, and a more comprehensive model-based approach to establishing vertical scales is described. Test data from the National Assessment of Educational Progress are used to illustrate the…

  20. Multi-scale Material Appearance

    NASA Astrophysics Data System (ADS)

    Wu, Hongzhi

    Modeling and rendering the appearance of materials is important for a diverse range of applications of computer graphics, from automobile design to movies and cultural heritage. The appearance of materials varies considerably at different scales, posing significant challenges due to the sheer complexity of the data, as well as the need to maintain inter-scale consistency constraints. This thesis presents a series of studies around the modeling, rendering and editing of multi-scale material appearance. To efficiently render material appearance at multiple scales, we develop an object-space precomputed adaptive sampling method, which precomputes a hierarchy of view-independent points that preserve multi-level appearance. To support bi-scale material appearance design, we propose a novel reflectance filtering algorithm, which rapidly computes the large-scale appearance from small-scale details, by exploiting the low-rank structures of Bidirectional Visible Normal Distribution Functions and pre-rotated Bidirectional Reflectance Distribution Functions in the matrix formulation of the rendering algorithm. This approach can guide the physical realization of appearance, as well as the modeling of real-world materials using very sparse measurements. Finally, we present a bi-scale-inspired high-quality general representation for material appearance described by Bidirectional Texture Functions. Our representation is at once compact, easily editable, and amenable to efficient rendering.

  1. Evaluation of Behavioral Expectation Scales.

    ERIC Educational Resources Information Center

    Zedeck, Sheldon; Baker, Henry T.

    Behavioral Expectation Scales developed by Smith and Kendall were evaluated. Results indicated slight interrater reliability between Head Nurses and Supervisors, moderate dependence among five performance dimensions, and correlation between two scales and tenure. Results are discussed in terms of procedural problems, critical incident problems,…

  2. OCCUPATIONAL ASPIRATION SCALE FOR FEMALES.

    ERIC Educational Resources Information Center

    JEFFS, GEORGE A.

    OCCUPATIONAL TITLES USABLE IN ASSESSING OCCUPATIONAL GOALS OF SENIOR HIGH SCHOOL FEMALES WERE SELECTED AS THE FIRST STEP IN ESTABLISHING AN OCCUPATIONAL ASPIRATION SCALE FOR FEMALES. A LIST OF 117 OCCUPATIONAL TITLES, COMPILED FROM THREE PREVIOUS STUDIES AND "THE DICTIONARY OF OCCUPATIONAL TITLES," WAS RATED ON A SIX-LEVEL SCALE AS TO ITS GENERAL…

  3. Children's Scale Errors with Tools

    ERIC Educational Resources Information Center

    Casler, Krista; Eshleman, Angelica; Greene, Kimberly; Terziyan, Treysi

    2011-01-01

    Children sometimes make "scale errors," attempting to interact with tiny object replicas as though they were full size. Here, we demonstrate that instrumental tools provide special insight into the origins of scale errors and, moreover, into the broader nature of children's purpose-guided reasoning and behavior with objects. In Study 1, 1.5- to…

  4. Spiritual Competency Scale: Further Analysis

    ERIC Educational Resources Information Center

    Dailey, Stephanie F.; Robertson, Linda A.; Gill, Carman S.

    2015-01-01

    This article describes a follow-up analysis of the Spiritual Competency Scale, which initially validated ASERVIC's (Association for Spiritual, Ethical and Religious Values in Counseling) spiritual competencies. The study examined whether the factor structure of the Spiritual Competency Scale would be supported by participants (i.e., ASERVIC…

  5. Convergent Validity of Four Innovativeness Scales.

    ERIC Educational Resources Information Center

    Goldsmith, Ronald E.

    1986-01-01

    Four scales of innovativeness were administered to two samples of undergraduate students: the Open Processing Scale, Innovativeness Scale, innovation subscale of the Jackson Personality Inventory, and Kirton Adaption-Innovation Inventory. Intercorrelations indicated the scales generally exhibited convergent validity. (GDC)

  6. Important Scaling Parameters for Testing Model-Scale Helicopter Rotors

    NASA Technical Reports Server (NTRS)

    Singleton, Jeffrey D.; Yeager, William T., Jr.

    1998-01-01

    An investigation into the effects of aerodynamic and aeroelastic scaling parameters on model-scale helicopter rotors has been conducted in the NASA Langley Transonic Dynamics Tunnel. The effect of varying Reynolds number, blade Lock number, and structural elasticity on rotor performance has been studied, and the performance results are discussed herein for two different rotor blade sets at two rotor advance ratios. One set of rotor blades was rigid, and the other set was dynamically scaled to be representative of a main rotor design for a utility-class helicopter. The investigation was conducted in a variable-density test medium, which permits the acquisition of data for several Reynolds number and Lock number combinations.
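
    The two aeroelastic similarity parameters named above can be written down directly; the numbers below are illustrative model-scale values chosen for this sketch, not the test conditions of this investigation:

      def lock_number(rho, lift_curve_slope, chord, radius, flap_inertia):
          """Lock number: ratio of aerodynamic to inertial flapping moments on a blade."""
          return rho * lift_curve_slope * chord * radius**4 / flap_inertia

      def tip_reynolds_number(rho, tip_speed, chord, dynamic_viscosity):
          """Reynolds number based on blade chord and rotor tip speed."""
          return rho * tip_speed * chord / dynamic_viscosity

      rho = 1.225                    # kg/m^3, sea-level air (a heavy-gas medium would be denser)
      a_slope = 5.7                  # per rad, typical blade-section lift-curve slope
      chord, radius = 0.066, 1.37    # m, hypothetical model-scale blade dimensions
      flap_inertia = 0.17            # kg*m^2, hypothetical blade flapping inertia
      mu = 1.81e-5                   # Pa*s, dynamic viscosity of air

      print("Lock number :", lock_number(rho, a_slope, chord, radius, flap_inertia))
      print("Tip Reynolds:", tip_reynolds_number(rho, 190.0, chord, mu))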

  7. Scale effect on overland flow connectivity at the plot scale

    NASA Astrophysics Data System (ADS)

    Peñuela, A.; Javaux, M.; Bielders, C. L.

    2013-01-01

    A major challenge in present-day hydrological sciences is to enhance the performance of existing distributed hydrological models through a better description of subgrid processes, in particular the subgrid connectivity of flow paths. The Relative Surface Connection (RSC) function was proposed by Antoine et al. (2009) as a functional indicator of runoff flow connectivity. For a given area, it expresses the percentage of the surface connected to the outflow boundary (C) as a function of the degree of filling of the depression storage. This function explicitly integrates the flow network at the soil surface and hence provides essential information regarding the flow paths' connectivity. It has been shown that this function could help improve the modeling of the hydrograph at the square meter scale, yet it is unknown how the scale affects the RSC function, and whether and how it can be extrapolated to other scales. The main objective of this research is to study the scale effect on overland flow connectivity (RSC function). For this purpose, digital elevation data of a real field (9 × 3 m) and three synthetic fields (6 × 6 m) with contrasting hydrological responses were used, and the RSC function was calculated at different scales by changing the length (l) or width (w) of the field. To different extents depending on the microtopography, border effects were observed for the smaller scales when decreasing l or w, which resulted in a strong decrease or increase of the maximum depression storage, respectively. There was no scale effect on the RSC function when changing w, but a remarkable scale effect was observed in the RSC function when changing l. In general, for a given degree of filling of the depression storage, C decreased as l increased, the change in C being inversely proportional to the change in l. However, this observation applied only up to approx. 50-70% (depending on the hydrological response of the field) of filling of depression storage, after which no

  8. Scale effect on overland flow connectivity, at the interill scale

    NASA Astrophysics Data System (ADS)

    Penuela Fernandez, A.; Bielders, C.; Javaux, M.

    2012-04-01

    The relative surface connection function (RSC) was proposed by Antoine et al. (2009) as a functional indicator of runoff flow connectivity. For a given area, it expresses the percentage of the surface connected to the outlet (C) as a function of the degree of filling of the depression storage. This function explicitly integrates the flow network at the soil surface and hence provides essential information regarding the flow paths' connectivity. It has been shown that this function could help improve the modeling of the hydrograph at the square meter scale, yet it is unknown how the scale affects the RSC function, and whether and how it can be extrapolated to other scales. The main objective of this research is to study the scale effect on overland flow connectivity (RSC function). For this purpose, digital elevation data of a real field (9 x 3 m) and three synthetic fields (6 x 6 m) with contrasting hydrological responses were used, and the RSC function was calculated at different scales by changing the length (L) or width (l) of the field. Border effects were observed for the smaller scales. In most cases, for L or l smaller than 750 mm, increasing L or l resulted in a strong increase or decrease of the maximum depression storage, respectively. There was no scale effect on the RSC function when changing l. On the contrary, a remarkable scale effect was observed in the RSC function when changing L. In general, for a given degree of filling of the depression storage, C decreased as L increased. This change in C was inversely proportional to the change in L. This observation applied only up to approx. 50-70% (depending on the hydrological response of the field) of filling of depression storage, after which no correlation was found between C and L. The results of this study help identify the critical scale to study overland flow connectivity. At scales larger than the critical scale, the RSC function showed a great potential to be extrapolated to other scales.

  9. Scale effect on overland flow connectivity at the plot scale

    NASA Astrophysics Data System (ADS)

    Peñuela, A.; Javaux, M.; Bielders, C. L.

    2012-06-01

    A major challenge in present-day hydrological sciences is to enhance the performance of existing distributed hydrological models through a better description of subgrid processes, in particular the subgrid connectivity of flow paths. The relative surface connection function (RSC) was proposed by Antoine et al. (2009) as a functional indicator of runoff flow connectivity. For a given area, it expresses the percentage of the surface connected to the outflow boundary (C) as a function of the degree of filling of the depression storage. This function explicitly integrates the flow network at the soil surface and hence provides essential information regarding the flow paths' connectivity. It has been shown that this function could help improve the modeling of the hydrograph at the square meter scale, yet it is unknown how the scale affects the RSC function, and whether and how it can be extrapolated to other scales. The main objective of this research is to study the scale effect on overland flow connectivity (RSC function). For this purpose, digital elevation data of a real field (9 × 3 m) and three synthetic fields (6 × 6 m) with contrasting hydrological responses were used, and the RSC function was calculated at different scales by changing the length (l) or width (w) of the field. Border effects, at different extents depending on the microtopography, were observed for the smaller scales, when decreasing l or w, which resulted in a strong decrease or increase of the maximum depression storage, respectively. There was no scale effect on the RSC function when changing w. On the contrary, a remarkable scale effect was observed in the RSC function when changing l. In general, for a given degree of filling of the depression storage, C decreased as l increased. This change in C was inversely proportional to the change in l. This observation applied only up to approx. 50-70% (depending on the hydrological response of the field) of filling of depression storage, after which
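
    The general idea of a connectivity-versus-filling curve like the RSC function described in the preceding records can be sketched as follows; this is a deliberately simplified stand-in (level-set flooding of a synthetic DEM with connectivity to the downslope edge), not the RSC computation of Antoine et al. (2009):

      import numpy as np
      from scipy import ndimage

      def connectivity_curve(dem, levels):
          """Fraction of cells connected to the outflow (last) row at each water level."""
          fractions = []
          for level in levels:
              wet = dem <= level                      # cells inundated at this level
              labels, _ = ndimage.label(wet)          # connected wet patches
              outlet = labels[-1, :]                  # labels touching the outflow boundary
              outlet_ids = np.unique(outlet[outlet > 0])
              connected = np.isin(labels, outlet_ids)
              fractions.append(connected.sum() / dem.size)
          return np.array(fractions)

      rng = np.random.default_rng(0)
      microtopography = rng.random((60, 20)) * 0.05    # random roughness (m)
      slope = np.linspace(0.3, 0.0, 60)[:, None]       # gentle slope toward the last row
      dem = slope + microtopography

      levels = np.linspace(dem.min(), dem.max(), 20)
      print(np.round(connectivity_curve(dem, levels), 2))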

  10. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  11. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  12. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  13. 27 CFR 19.186 - Package scales.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Package scales. 19.186... Package Scale and Pipeline Requirements § 19.186 Package scales. Proprietors must ensure that scales used.... However, if a scale is not used during a 6-month period, it is only necessary to test the scale prior...

  14. Scale-dependent halo bias from scale-dependent growth

    SciTech Connect

    Parfrey, Kyle; Hui, Lam; Sheth, Ravi K.

    2011-03-15

    We derive a general expression for the large-scale halo bias, in theories with a scale-dependent linear growth, using the excursion set formalism. Such theories include modified-gravity models, and models in which the dark energy clustering is non-negligible. A scale dependence is imprinted in both the formation and evolved biases by the scale-dependent growth. Mergers are accounted for in our derivation, which thus extends earlier work which focused on passive evolution. There is a simple analytic form for the bias for those theories in which the nonlinear collapse of perturbations is approximately the same as in general relativity. As an illustration, we apply our results to a simple Yukawa modification of gravity, and use Sloan Digital Sky Survey measurements of the clustering of luminous red galaxies to constrain the theory's parameters.
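
    For reference, the scale-independent limit that such expressions reduce to (the standard excursion-set / peak-background-split bias, recovered when the linear growth is scale-independent as in general relativity) is:

      \[
        b_{\mathrm{E}}(M) \;=\; 1 + b_{\mathrm{L}}(M) \;=\; 1 + \frac{\nu^{2} - 1}{\delta_c},
        \qquad \nu \equiv \frac{\delta_c}{\sigma(M)},
      \]
      % with \delta_c \approx 1.686 the spherical-collapse threshold and \sigma(M) the rms linear
      % density fluctuation on mass scale M; a scale-dependent linear growth makes the mapping from
      % the initial to the evolved density field k-dependent, which imprints scale dependence on the bias.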

  15. Evaluation of properties of the Vestibular Disorders Activities of Daily Living Scale (Brazilian version) in an elderly population

    PubMed Central

    Ricci, Natalia A.; Aratani, Mayra C.; Caovilla, Heloisa H.; Cohen, Helen S.; Ganança, Fernando F.

    2014-01-01

    Background The Vestibular Disorders Activities of Daily Living Scale (VADL) is considered an important subjective assessment to evaluate patients suffering from dizziness and imbalance. Although frequently used, its metric characteristics still require further investigation. Objective This paper aims to analyze the psychometric properties of the Brazilian version of the VADL in an elderly population. Method The sample comprises patients (≥65 years old) with chronic dizziness resulting from vestibular disorders. For discriminant analysis, patients were compared to healthy subjects. All subjects answered the VADL-Brazil by interview. To examine the VADL validity, patients filled out the Dizziness Handicap Inventory (DHI) and the ABC scale and were tested on the Dynamic Gait Index (DGI). To evaluate the VADL responsiveness, 20 patients were submitted to rehabilitation. Results Patients (n=140) had a VADL total score of 4.1±1.6 points. Healthy subjects scored significantly less than patients in all the subscales and in the VADL total score. The VADL-Brazil was weakly correlated with the DHI and moderately to the ABC scale and the DGI. Instead of the original 3 subscales, factor analysis resulted in 6 factors. The VADL was capable of detecting changes after rehabilitation, which means that the instrument has responsiveness. Conclusions This study provided more data about the psychometric properties and usefulness of the VADL-Brazil. The use of such a reliable and valid instrument increases the knowledge about disability in patients with vestibular disorders. PMID:24676704

  16. Concordance among anticholinergic burden scales

    PubMed Central

    Naples, Jennifer G.; Marcum, Zachary A.; Perera, Subashan; Gray, Shelly L.; Newman, Anne B.; Simonsick, Eleanor M.; Yaffe, Kristine; Shorr, Ronald I.; Hanlon, Joseph T.

    2015-01-01

    Background There is no gold standard to assess potential anticholinergic burden of medications. Objectives To evaluate concordance among five commonly used anticholinergic scales. Design Cross-sectional secondary analysis. Setting Pittsburgh, PA, and Memphis, TN. Participants 3,055 community-dwelling older adults aged 70–79 with baseline medication data from the Health, Aging, and Body Composition study. Measurements Any use, weighted scores, and total standardized daily dosage were calculated using five anticholinergic measures (i.e., Anticholinergic Cognitive Burden [ACB] Scale, Anticholinergic Drug Scale [ADS], Anticholinergic Risk Scale [ARS], Drug Burden Index anticholinergic component [DBI-ACh], and Summated Anticholinergic Medications Scale [SAMS]). Concordance was evaluated with kappa statistics and Spearman rank correlations. Results Any anticholinergic use in rank order was 51% for the ACB, 43% for the ADS, 29% for the DBI-ACh, 23% for the ARS, and 16% for the SAMS. Kappa statistics for all pairwise use comparisons ranged from 0.33 to 0.68. Similarly, concordance as measured by weighted kappa statistics ranged from 0.54 to 0.70 among the three scales not incorporating dosage (ADS, ARS, and ACB). Spearman rank correlation between the DBI-ACh and SAMS was 0.50. Conclusions Only low to moderate concordance was found among the five anticholinergic scales. Future research is needed to examine how these differences in measurement impact their predictive validity with respect to clinically relevant outcomes, such as cognitive impairment. PMID:26480974
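
    A minimal Python sketch of the concordance statistics named above (synthetic classifications, not the Health ABC medication data): Cohen's kappa for any-use agreement, a weighted kappa for ordinal burden scores, and a Spearman rank correlation for the dose-based indices.

      import numpy as np
      from scipy import stats
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(0)
      n = 300
      acb_any = rng.integers(0, 2, size=n)                            # any use per scale A
      ads_any = np.where(rng.random(n) < 0.8, acb_any, 1 - acb_any)   # mostly agrees with A

      acb_score = rng.integers(0, 4, size=n)                          # ordinal burden scores
      ads_score = np.clip(acb_score + rng.integers(-1, 2, size=n), 0, 3)

      dbi = rng.random(n)                                             # dose-based indices
      sams = 0.7 * dbi + 0.3 * rng.random(n)

      rho, p = stats.spearmanr(dbi, sams)
      print("kappa (any use):", cohen_kappa_score(acb_any, ads_any))
      print("weighted kappa :", cohen_kappa_score(acb_score, ads_score, weights="linear"))
      print("Spearman rho   :", rho)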

  17. Estimating Plot Scale Impacts on Watershed Scale Management

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Fleckenstein, J. H.; Tenhunen, J. D.; Peiffer, S.; Huwe, B.

    2010-12-01

    Over recent decades, land and resource use as well as climate change have been implicated in reduced ecosystem services (i.e., high-quality water yield, biodiversity, and agricultural and forest products). The prediction of ecosystem services expected under future land use decisions and changing climate conditions has become increasingly important. Complex policy and management decisions require the integration of physical, economic, and social data over several scales to assess effects on water resources and ecology. Field-based meteorology, hydrology, biology, soil physics, plant production, solute and sediment transport, economic, and social behavior data were measured in a catchment of South Korea. A variety of models (Erosion-3D, HBV-Light, VS2DH, Hydrus, PIXGRO, DNDC, and Hydrogeosphere) are being used to simulate plot and field scale measurements within the catchment. Results from each of the local-scale models identify sensitive local-scale parameters, which are then used as inputs to a large-scale watershed model. The experimental field data throughout the catchment were integrated with the spatially distributed SWAT2005 model. Typically, macroscopic homogeneity and average effective model parameters are assumed when upscaling local-scale heterogeneous measurements to the watershed. Our approach is to use the range of local-scale model parameter results to define the sensitivity and uncertainty of the large-scale watershed model. The field-based and modeling framework described is being used to develop scenarios that examine spatial and temporal changes in land use practices and climatic effects on water quantity, water quality, and sediment transport. Development of accurate modeling scenarios requires understanding the social relationship between individual and policy-driven land management practices and the value of sustainable resources.

  18. SCALING PROPERTIES OF SMALL-SCALE FLUCTUATIONS IN MAGNETOHYDRODYNAMIC TURBULENCE

    SciTech Connect

    Perez, Jean Carlos; Mason, Joanne; Boldyrev, Stanislav; Cattaneo, Fausto E-mail: j.mason@exeter.ac.uk E-mail: cattaneo@flash.uchicago.edu

    2014-09-20

    Magnetohydrodynamic (MHD) turbulence in the majority of natural systems, including the interstellar medium, the solar corona, and the solar wind, has Reynolds numbers far exceeding the Reynolds numbers achievable in numerical experiments. Much attention is therefore drawn to the universal scaling properties of small-scale fluctuations, which can be reliably measured in the simulations and then extrapolated to astrophysical scales. However, in contrast with hydrodynamic turbulence, where the universal structure of the inertial and dissipation intervals is described by the Kolmogorov self-similarity, the scaling for MHD turbulence cannot be established based solely on dimensional arguments due to the presence of an intrinsic velocity scale—the Alfvén velocity. In this Letter, we demonstrate that the Kolmogorov first self-similarity hypothesis cannot be formulated for MHD turbulence in the same way it is formulated for the hydrodynamic case. Besides profound consequences for the analytical consideration, this also imposes stringent conditions on numerical studies of MHD turbulence. In contrast with the hydrodynamic case, the discretization scale in numerical simulations of MHD turbulence should decrease faster than the dissipation scale, in order for the simulations to remain resolved as the Reynolds number increases.

  19. High performance oilfield scale inhibitors

    SciTech Connect

    Duccini, Y.; Dufour, A.; Hann, W.M.; Sanders, T.W.; Weinstein, B.

    1997-08-01

    Sea water often reacts with the formation water in offshore fields to produce barium, calcium, and strontium sulfate deposits that hinder oil production. Newer fields often have scale problems that are more difficult to control than those of older fields, and current-technology scale inhibitors cannot control the deposits as well as needed. In addition, ever more stringent regulations designed to minimize the impact of inhibitors on the environment are being enacted. Three new inhibitors are presented that overcome many of the problems of older-technology scale inhibitors.

  20. Semi-scaling cosmic strings

    SciTech Connect

    Vanchurin, Vitaly

    2010-11-01

    We develop a model of string dynamics with back-reaction from both scaling and non-scaling loops taken into account. The evolution of a string network is described by the distribution functions of coherence segments and kinks. We derive two non-linear equations which govern the evolution of the two distributions and solve them analytically in the limit of late times. We also show that the correlation function is an exponential, and solve the dynamics for the corresponding spectrum of scaling loops.

  1. Scaling of graphene integrated circuits.

    PubMed

    Bianchi, Massimiliano; Guerriero, Erica; Fiocco, Marco; Alberti, Ruggero; Polloni, Laura; Behnam, Ashkan; Carrion, Enrique A; Pop, Eric; Sordan, Roman

    2015-05-01

    The influence of transistor size reduction (scaling) on the speed of realistic multi-stage integrated circuits (ICs) represents the main performance metric of a given transistor technology. Despite extensive interest in graphene electronics, scaling efforts have so far focused on individual transistors rather than multi-stage ICs. Here we study the scaling of graphene ICs based on transistors from 3.3 to 0.5 μm gate lengths and with different channel widths, access lengths, and lead thicknesses. The shortest gate delay of 31 ps per stage was obtained in sub-micron graphene ring oscillators (ROs) oscillating at 4.3 GHz, which is the highest oscillation frequency obtained in any strictly low-dimensional material to date. We also derived the fundamental Johnson limit, showing that scaled graphene ICs could be used at high frequencies in applications with small voltage swing. PMID:25873359

  2. Constructing cities, deconstructing scaling laws

    PubMed Central

    Arcaute, Elsa; Hatna, Erez; Ferguson, Peter; Youn, Hyejin; Johansson, Anders; Batty, Michael

    2015-01-01

    Cities can be characterized and modelled through different urban measures. Consistency within these observables is crucial in order to advance towards a science of cities. Bettencourt et al. have proposed that many of these urban measures can be predicted through universal scaling laws. We develop a framework to consistently define cities, using commuting to work and population density thresholds, and construct thousands of realizations of systems of cities with different boundaries for England and Wales. These serve as a laboratory for the scaling analysis of a large set of urban indicators. The analysis shows that population size alone does not provide enough information to describe or predict the state of a city as previously proposed, indicating that the expected scaling laws are not corroborated. We found that most urban indicators scale linearly with city size, regardless of the definition of the urban boundaries. However, when nonlinear correlations are present, the exponent fluctuates considerably. PMID:25411405

  3. Scale locality of magnetohydrodynamic turbulence.

    PubMed

    Aluie, Hussein; Eyink, Gregory L

    2010-02-26

    We investigate the scale locality of cascades of conserved invariants at high kinetic and magnetic Reynolds numbers in the "inertial-inductive range" of magnetohydrodynamic (MHD) turbulence, where velocity and magnetic field increments exhibit suitable power-law scaling. We prove that fluxes of total energy and cross helicity (or, equivalently, fluxes of Elsässer energies) are dominated by the contributions of local triads. Flux of magnetic helicity may be dominated by nonlocal triads. The magnetic stretching term may also be dominated by nonlocal triads, but we prove that it can convert energy only between velocity and magnetic modes at comparable scales. We explain the disagreement with numerical studies that have claimed conversion nonlocally between disparate scales. We present supporting data from a 1024^3 simulation of forced MHD turbulence.

  4. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  5. Pilot Scale Advanced Fogging Demonstration

    SciTech Connect

    Demmer, Rick L.; Fox, Don T.; Archiblad, Kip E.

    2015-01-01

    Experiments in 2006 developed a useful fog solution using three different chemical constituents. Optimization of the fog recipe and use of commercially available equipment were identified as needs that had not been addressed. During 2012 development work it was noted that low concentrations of the components hampered coverage and drying in the United Kingdom’s National Nuclear Laboratory’s testing much more so than was evident in the 2006 tests. In fiscal year 2014 the Idaho National Laboratory undertook a systematic optimization of the fogging formulation and conducted a non-radioactive, pilot scale demonstration using commercially available fogging equipment. While not as sophisticated as the equipment used in earlier testing, the new approach is much less expensive and readily available for smaller scale operations. Pilot scale testing was important to validate new equipment of an appropriate scale, optimize the chemistry of the fogging solution, and to realize the conceptual approach.

  6. Inflation in the scaling limit

    SciTech Connect

    Matarrese, S.; Ortolan, A.; Lucchin, F.

    1989-07-15

    We investigate the stochastic dynamics of the inflaton for a wide class of potentials leading either to chaotic or to power-law inflation. At late times the system enters a scaling regime where macroscopic order sets in: the field distribution sharply peaks around the classical slow-rollover configuration and curvature perturbations originate with non-Gaussian, scale-invariant statistics.

  7. Distributional Scaling in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Polsinelli, J. F.

    2015-12-01

    An investigation is undertaken into the fractal scaling properties of the piezometric head in a heterogeneous unconfined aquifer. The governing equations for the unconfined flow are derived from conservation of mass and the Darcy law. The Dupuit approximation is used to model the dynamics. The spatially varying tendency to conduct flow (i.e., the hydraulic conductivity) is represented as a stochastic process. Experimental studies in the literature have indicated that the conductivity belongs to a class of non-stationary stochastic fields, called H-ss fields. The uncertainty in the soil parameters is imparted onto the flow variables; in groundwater investigations the potentiometric head will be a random function. The structure of the head field is analyzed with an emphasis on its scaling properties. The scaling scheme for the modeling equations and the simulation procedure for the saturated hydraulic conductivity process are explained, and the method is validated through numerical experimentation using the USGS Modflow-2005 software. The results of the numerical simulations demonstrate that the head will exhibit multi-fractal scaling if the hydraulic conductivity exhibits multi-fractal scaling and the governing groundwater flow equations satisfy a particular set of scale-invariance conditions.

  8. Scale-free primordial cosmology

    NASA Astrophysics Data System (ADS)

    Ijjas, Anna; Steinhardt, Paul J.; Loeb, Abraham

    2014-01-01

    The large-scale structure of the Universe suggests that the physics underlying its early evolution is scale-free. This was the historic motivation for the Harrison-Zel'dovich-Peebles spectrum and for inflation. Based on a hydrodynamical approach, we identify scale-free forms for the background equation of state for both inflationary and cyclic scenarios and use these forms to derive predictions for the spectral tilt and tensor-to-scalar ratio of primordial density perturbations. For the case of inflation, we find three classes of scale-free models with distinct predictions. Including all classes, we show that scale-free inflation predicts tensor-to-scalar ratio r > 10^-4. We show that the observationally favored class is theoretically disfavored because it suffers from an initial conditions problem and the hydrodynamical form of an unlikeliness problem similar to that identified recently for certain inflaton potentials. We contrast these results with those for scale-free cyclic models.

  9. Color constancy and hue scaling.

    PubMed

    Schultz, Sven; Doerschner, Katja; Maloney, Laurence T

    2006-01-01

    In this study, we used a hue scaling technique to examine human color constancy performance in simulated three-dimensional scenes. These scenes contained objects of various shapes and materials and a matte test patch at the center of the scene. Hue scaling settings were made for test patches under five different illuminations. Results show that subjects had nearly stable hue scalings for a given test surface across different illuminants. In a control experiment, only the test surfaces that belonged to one illumination condition were presented, blocked in front of a black background. Surprisingly, the hue scalings of the subjects in the blocked control experiment were not simply determined by the color codes of the test surface. Rather, they depended on the sequence of previously presented test stimuli. In contrast, subjects' hue scalings in a second control experiment (with order of presentations randomized) were completely determined by the color codes of the test surface. Our results show that hue scaling is a useful technique to investigate color constancy in a more phenomenological sense. Furthermore, the results from the blocked control experiment underline the important role of slow chromatic adaptation for color constancy.

  10. Construct Validity of the Dutch Version of the 12-Item Partners in Health Scale: Measuring Patient Self-Management Behaviour and Knowledge in Patients with Chronic Obstructive Pulmonary Disease

    PubMed Central

    Lenferink, Anke; Effing, Tanja; Harvey, Peter; Battersby, Malcolm; Frith, Peter; van Beurden, Wendy; van der Palen, Job; Paap, Muirne C. S.

    2016-01-01

    Objective The 12-item Partners in Health scale (PIH) was developed in Australia to measure self-management behaviour and knowledge in patients with chronic diseases, and has undergone several changes. Our aim was to assess the construct validity and reliability of the latest PIH version in Dutch COPD patients. Methods The 12 items of the PIH, scored on a self-rated 9-point Likert scale, are used to calculate total and subscale scores (knowledge; coping; recognition and management of symptoms; and adherence to treatment). We used forward-backward translation of the latest version of the Australian PIH to define a Dutch PIH (PIH(Du)). Mokken Scale Analysis and common factor analysis were performed on data from a Dutch COPD sample to investigate the psychometric properties of the Dutch PIH and to determine whether the four-subscale solution previously found for the original Australian PIH could be replicated for the Dutch PIH. Results Two subscales were found for the Dutch PIH data (n = 118): 1) knowledge and coping; 2) recognition and management of symptoms, and adherence to treatment. The correlation between the two Dutch subscales was 0.43. The lower bound of the reliability of the total scale equalled 0.84. Factor analysis indicated that the first two factors explained a larger percentage of common variance (39.4% and 19.9%) than could be expected when using random data (17.5% and 15.1%). Conclusion We recommend using two PIH subscale scores when assessing self-management in Dutch COPD patients. Our results did not support the four-subscale structure as previously reported for the original Australian PIH. PMID:27564410
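
    The comparison against random data mentioned in the results is, in spirit, a parallel analysis. A rough sketch of that idea, with simulated 9-point Likert responses rather than the study's actual Mokken/factor-analysis workflow, is shown below.

```python
# A rough sketch of a random-data comparison (parallel analysis) using
# eigenvalues of the item correlation matrix. The 9-point Likert responses
# are simulated; this is not the study's actual analysis.
import numpy as np

rng = np.random.default_rng(1)
n_respondents, n_items = 118, 12

# Hypothetical PIH responses on a 1..9 scale.
data = rng.integers(1, 10, size=(n_respondents, n_items)).astype(float)

def eigvals_of_corr(x):
    """Eigenvalues of the item correlation matrix, largest first."""
    corr = np.corrcoef(x, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

observed = eigvals_of_corr(data)

# Average eigenvalues over many random datasets of the same shape.
n_sims = 200
random_eigs = np.mean(
    [eigvals_of_corr(rng.normal(size=(n_respondents, n_items))) for _ in range(n_sims)],
    axis=0,
)

# Retain factors whose observed eigenvalue exceeds the random benchmark.
n_factors = int(np.sum(observed > random_eigs))
print("observed eigenvalues:", np.round(observed[:4], 2))
print("random benchmark:    ", np.round(random_eigs[:4], 2))
print("factors to retain:   ", n_factors)
```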

  11. A Retrospective Study on Students' and Teachers' Perceptions of the Reflective Ability Clinical Assessment.

    PubMed

    Tsingos-Lucas, Cherie; Bosnic-Anticevich, Sinthia; Smith, Lorraine

    2016-08-25

    Objective. To evaluate student and teacher perceptions of the utility of the Reflective Ability Clinical Assessment (RACA) in an undergraduate pharmacy curriculum at an Australian university. Methods. A mixed-method study comprising the administration of a 7-item student survey on a 6-point Likert-type scale and a 45-minute focus group/phone interview with teachers. Results. Student (n=199) and teaching staff respondents (n=3) provided their perceptions of the implementation of the new educational tool. Student responses showed significant positive correlations between self-directed learning, counseling skills, relevance to future practice, and performance in an oral examination. Seven key themes emerged from the teacher interviews. Conclusion. The study revealed both students and teachers perceive the RACA as an effective educational tool that may enhance skill development for future clinical practice. PMID:27667838

  12. Consumer Satisfaction with Telerehabilitation Service Provision of Alternative Computer Access and Augmentative and Alternative Communication

    PubMed Central

    LOPRESTI, EDMUND F.; JINKS, ANDREW; SIMPSON, RICHARD C.

    2015-01-01

    Telerehabilitation (TR) services for assistive technology evaluation and training have the potential to reduce travel demands for consumers and assistive technology professionals while allowing evaluation in more familiar, salient environments for the consumer. Sixty-five consumers received TR services for augmentative and alternative communication or alternative computer access, and consumer satisfaction was compared with twenty-eight consumers who received exclusively in-person services. TR recipients rated their TR services at a median of 6 on a 6-point Likert scale TR satisfaction questionnaire, although individual responses did indicate room for improvement in the technology. Overall satisfaction with AT services was rated highly by both in-person (100% satisfaction) and TR (99% satisfaction) service recipients. PMID:27563382

  13. A Retrospective Study on Students’ and Teachers’ Perceptions of the Reflective Ability Clinical Assessment

    PubMed Central

    Bosnic-Anticevich, Sinthia; Smith, Lorraine

    2016-01-01

    Objective. To evaluate student and teacher perceptions of the utility of the Reflective Ability Clinical Assessment (RACA) in an undergraduate pharmacy curriculum at an Australian university. Methods. A mixed-method study comprising the administration of a 7-item student survey on a 6-point Likert-type scale and a 45-minute focus group/phone interview with teachers. Results. Student (n=199) and teaching staff respondents (n=3) provided their perceptions of the implementation of the new educational tool. Student responses showed significant positive correlations between self-directed learning, counseling skills, relevance to future practice, and performance in an oral examination. Seven key themes emerged from the teacher interviews. Conclusion. The study revealed both students and teachers perceive the RACA as an effective educational tool that may enhance skill development for future clinical practice. PMID:27667838

  14. Consumer Satisfaction with Telerehabilitation Service Provision of Alternative Computer Access and Augmentative and Alternative Communication.

    PubMed

    Lopresti, Edmund F; Jinks, Andrew; Simpson, Richard C

    2015-01-01

    Telerehabilitation (TR) services for assistive technology evaluation and training have the potential to reduce travel demands for consumers and assistive technology professionals while allowing evaluation in more familiar, salient environments for the consumer. Sixty-five consumers received TR services for augmentative and alternative communication or alternative computer access, and consumer satisfaction was compared with twenty-eight consumers who received exclusively in-person services. TR recipients rated their TR services at a median of 6 on a 6-point Likert scale TR satisfaction questionnaire, although individual responses did indicate room for improvement in the technology. Overall satisfaction with AT services was rated highly by both in-person (100% satisfaction) and TR (99% satisfaction) service recipients. PMID:27563382

  15. A Retrospective Study on Students’ and Teachers’ Perceptions of the Reflective Ability Clinical Assessment

    PubMed Central

    Bosnic-Anticevich, Sinthia; Smith, Lorraine

    2016-01-01

    Objective. To evaluate student and teacher perceptions of the utility of the Reflective Ability Clinical Assessment (RACA) in an undergraduate pharmacy curriculum at an Australian university. Methods. A mixed-method study comprising the administration of a 7-item student survey on a 6-point Likert-type scale and a 45-minute focus group/phone interview with teachers. Results. Student (n=199) and teaching staff respondents (n=3) provided their perceptions of the implementation of the new educational tool. Student responses showed significant positive correlations between self-directed learning, counseling skills, relevance to future practice, and performance in an oral examination. Seven key themes emerged from the teacher interviews. Conclusion. The study revealed both students and teachers perceive the RACA as an effective educational tool that may enhance skill development for future clinical practice.

  16. Scale-Dependent Dispersivity Explained Without Scale-Dependent Heterogeneity

    NASA Astrophysics Data System (ADS)

    Dhaliwal, P.; Engdahl, N. B.; Fogg, G. E.

    2011-12-01

    The observed scale-dependence of dispersivity has often been attributed to the scale-dependence of porous media heterogeneity. However, mass transfer between areas of high and low hydraulic conductivity and preferential solute migration may provide an alternative explanation for this phenomenon. To illustrate this point, we used geostatistical models representing the heterogeneity and interconnectedness of a typical aquifer system and plume modeling via a highly accurate random walk particle tracking method. The apparent dispersivity values were calculated using the statistical moments of the plumes. Apparent dispersivity was seen to grow from 0.01 m to 100 m over length scales of 0.06 m to 500 m, even though heterogeneity scales and facies proportions were stationary and invariant with scale in the simulations. The results suggest that the increase in dispersivity was due solely to stretching of the plume by two mechanisms: diffusion of solute into areas of low conductivity, and movement of solute through well-connected, high-conductivity channels. Under such conditions, an "asymptotic dispersivity" may never be reached.
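
    A moment-based estimate of apparent dispersivity from particle-tracking output can be sketched as follows. The particle positions are synthetic and the simple formula (plume variance divided by twice the mean travel distance) is only one common convention, not necessarily the exact calculation used by the authors.

```python
# A minimal sketch of estimating apparent longitudinal dispersivity from the
# spatial moments of a particle plume, as produced by random-walk particle
# tracking. Particle positions here are synthetic.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical longitudinal positions (m) of 10,000 particles at one snapshot.
x = rng.normal(loc=120.0, scale=8.0, size=10_000)

mean_x = x.mean()                 # first spatial moment (center of mass)
var_x = x.var()                   # second central moment (plume spread)

alpha_L = var_x / (2.0 * mean_x)  # apparent longitudinal dispersivity (m)
print(f"mean travel distance = {mean_x:.1f} m")
print(f"apparent dispersivity = {alpha_L:.3f} m")
```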

  17. Scaling of extreme rainfall areas at a planetary scale.

    PubMed

    Devineni, Naresh; Lall, Upmanu; Xi, Chen; Ward, Philip

    2015-07-01

    Event magnitude and area scaling relationships for rainfall over different regions of the world have been presented in the literature for relatively short durations and over relatively small areas. In this paper, we present the first results of a global analysis of the scaling characteristics of extreme rainfall areas for durations ranging from 1 to 30 days. Broken power law models are fit in each case. Past work has focused largely on the time and space scales associated with local and regional convection. The work presented here suggests that power law scaling may also apply to planetary-scale phenomena, such as frontal and monsoonal systems, and their interaction with local moisture recycling. Such features may persist over large areas corresponding to extreme rain and regional flood events. As a result, they lead to considerable hazard exposure. A caveat is that methods used for empirical power law identification have difficulties with edge effects due to finite domains. This leads to problems with robust model identification and interpretability of the underlying relationships. We use recent algorithms that aim to address some of these issues in a principled way. Theoretical research that could explain why such results emerge across the world, as analyzed for the first time in this paper, is needed. PMID:26232980
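
    A broken power law can be illustrated as a continuous two-segment linear model in log-log coordinates. The sketch below uses synthetic data and a plain least-squares fit; it does not address the edge-effect and robust-identification issues the abstract highlights, nor does it reproduce the authors' algorithms.

```python
# A schematic two-segment ("broken") power-law fit in log-log space.
# Data are synthetic; robust estimation and edge effects are not handled here.
import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(logx, logx_break, loga, slope1, slope2):
    """Continuous piecewise-linear model in log-log coordinates."""
    return np.where(
        logx < logx_break,
        loga + slope1 * (logx - logx_break),
        loga + slope2 * (logx - logx_break),
    )

rng = np.random.default_rng(5)
area = np.logspace(2, 6, 200)                        # event areas (arbitrary units)
logx = np.log10(area)
true = broken_power_law(logx, 4.0, 1.0, -0.5, -1.5)  # break at 10^4
logy = true + rng.normal(0.0, 0.05, size=logx.size)  # noisy "observed" values

popt, _ = curve_fit(broken_power_law, logx, logy, p0=[3.5, 0.5, -1.0, -1.0])
print("break location (log10):", round(popt[0], 2))
print("slopes below/above break:", round(popt[2], 2), round(popt[3], 2))
```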

  18. Identifying Undifferentiating Response Sets and Assessing Their Effects on the Measurement of Items.

    ERIC Educational Resources Information Center

    Schulz, E. Matthew; Sun, Anji

    Undifferentiating response sets, defined as "overuse" of any category of a Likert scale, were identified using a combination of simple criteria, such as whether a single-category response set involved more than four items, and statistical criteria based on D. Andrich's (1978) measurement model for Likert scales (the Rating Scale model). Data were…
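
    One of the simple criteria described, flagging respondents who give the same Likert category for more than four items in a row, can be illustrated as below. The data and the run-length rule are hypothetical stand-ins, not Schulz and Sun's actual procedure or the Rating Scale model analysis.

```python
# A small illustrative check for a possible undifferentiating response set:
# flag a respondent whose longest run of identical Likert responses exceeds
# four items. Data here are hypothetical.
import numpy as np

def longest_run(responses):
    """Length of the longest run of identical consecutive responses."""
    best = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

rng = np.random.default_rng(3)
survey = rng.integers(1, 6, size=(50, 20))  # 50 respondents, 20 five-point items

flagged = [i for i, row in enumerate(survey) if longest_run(list(row)) > 4]
print("respondents flagged for a possible undifferentiating response set:", flagged)
```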

  19. SETI and astrobiology: The Rio Scale and the London Scale

    NASA Astrophysics Data System (ADS)

    Almár, Iván

    2011-11-01

    The public reaction to a discovery, the character of the corresponding risk communication, as well as the possible impact on science and society all depend on the character of the phenomenon discovered, on the method of discovery, on the distance to the phenomenon and, last but not least, on the reliability of the announcement itself. The Rio Scale, proposed together with Jill Tarter just a decade ago at an IAA symposium in Rio de Janeiro, attempts to quantify the relative importance of such a “low probability, high consequence event”, namely the announcement of an ETI discovery. Following the publication of Paul Davies's book “The Eerie Silence”, it is necessary to examine how the possible “technosignatures” or “technomarkers” suggested in that book could be evaluated by the Rio Scale. The new London Scale, proposed at the Royal Society meeting in January 2010 in London, is a similar attempt to quantify, on an analogous ordinal scale between zero and ten, the impact of an announcement regarding the discovery of ET life. Here again the concept of a “shadow biosphere” raised in the book deserves special attention, since a “weird form of life” found on Earth would not necessarily have an extraterrestrial origin, yet it might be an important discovery in itself. Several arguments are presented that the methods, aims, and targets of the search for ET life and the search for ET intelligence are converging. This raises the question of whether the two scales should be unified as a consequence of this convergence. Finally, it is suggested that social scientists should take the structure of the respective scales into consideration when investigating, case by case, the possible effects of such discoveries on society.

  20. On the scaling of small-scale jet noise to large scale

    NASA Astrophysics Data System (ADS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-05-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or perceived noise level (PNL) noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 x 10(exp 6) based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  1. On the scaling of small-scale jet noise to large scale

    NASA Astrophysics Data System (ADS)

    Soderman, Paul T.; Allen, Christopher S.

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or PNL noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 x 10 exp 6 based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  2. On the scaling of small-scale jet noise to large scale

    NASA Technical Reports Server (NTRS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-01-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or PNL noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 x 10 exp 6 based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  3. On the scaling of small-scale jet noise to large scale

    NASA Technical Reports Server (NTRS)

    Soderman, Paul T.; Allen, Christopher S.

    1992-01-01

    An examination was made of several published jet noise studies for the purpose of evaluating scale effects important to the simulation of jet aeroacoustics. Several studies confirmed that small conical jets, one as small as 59 mm diameter, could be used to correctly simulate the overall or perceived noise level (PNL) noise of large jets dominated by mixing noise. However, the detailed acoustic spectra of large jets are more difficult to simulate because of the lack of broad-band turbulence spectra in small jets. One study indicated that a jet Reynolds number of 5 x 10(exp 6) based on exhaust diameter enabled the generation of broad-band noise representative of large jet mixing noise. Jet suppressor aeroacoustics is even more difficult to simulate at small scale because of the small mixer nozzles with flows sensitive to Reynolds number. Likewise, one study showed incorrect ejector mixing and entrainment using a small-scale, short ejector that led to poor acoustic scaling. Conversely, fairly good results were found with a longer ejector and, in a different study, with a 32-chute suppressor nozzle. Finally, it was found that small-scale aeroacoustic resonance produced by jets impacting ground boards does not reproduce at large scale.

  4. Hidden scale invariance of metals

    NASA Astrophysics Data System (ADS)

    Hummel, Felix; Kresse, Georg; Dyre, Jeppe C.; Pedersen, Ulf R.

    2015-11-01

    Density functional theory (DFT) calculations of 58 liquid elements at their triple point show that most metals exhibit near proportionality between the thermal fluctuations of the virial and the potential energy in the isochoric ensemble. This demonstrates a general "hidden" scale invariance of metals making the condensed part of the thermodynamic phase diagram effectively one dimensional with respect to structure and dynamics. DFT computed density scaling exponents, related to the Grüneisen parameter, are in good agreement with experimental values for the 16 elements where reliable data were available. Hidden scale invariance is demonstrated in detail for magnesium by showing invariance of structure and dynamics. Computed melting curves of period three metals follow curves with invariance (isomorphs). The experimental structure factor of magnesium is predicted by assuming scale invariant inverse power-law (IPL) pair interactions. However, crystal packings of several transition metals (V, Cr, Mn, Fe, Nb, Mo, Ta, W, and Hg), most post-transition metals (Ga, In, Sn, and Tl), and the metalloids Si and Ge cannot be explained by the IPL assumption. The virial-energy correlation coefficients of iron and phosphorous are shown to increase at elevated pressures. Finally, we discuss how scale invariance explains the Grüneisen equation of state and a number of well-known empirical melting and freezing rules.

  5. Strength Scaling in Fiber Composites

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Morton, John

    1990-01-01

    A research program was initiated to study and isolate the factors responsible for scale effects in the tensile strength of graphite/epoxy composite laminates. Four layups were chosen with appropriate stacking sequences so as to highlight individual and interacting failure modes. Four scale sizes were selected for investigation: full scale, 3/4, 2/4, and 1/4, with n = 4, 3, 2, and 1, respectively. The full-scale specimens were 32 plies thick, compared to 24, 16, and 8 plies for the 3/4, 2/4, and 1/4 sizes, respectively. Results were obtained in the form of tensile strength, stress-strain curves, and damage development. Problems associated with strength degradation with increasing specimen size are isolated and discussed. Inconsistencies associated with strain measurements were also identified. Enhanced X-ray radiography was employed for damage evaluation following step loading. It was shown that fiber-dominated layups were less sensitive to scaling effects than matrix-dominated layups.

  6. Featured Invention: Laser Scaling Device

    NASA Technical Reports Server (NTRS)

    Dunn, Carol Anne

    2008-01-01

    In September 2003, NASA signed a nonexclusive license agreement with Armor Forensics, a subsidiary of Armor Holdings, Inc., for the laser scaling device under the Innovative Partnerships Program. Coupled with a measuring program, also developed by NASA, the unit provides crime scene investigators with the ability to shoot photographs at scale without having to physically enter the scene, analyzing details such as bloodspatter patterns and graffiti. This ability keeps the scene's components intact and pristine for the collection of information and evidence. The laser scaling device elegantly solved a pressing problem for NASA's shuttle operations team and also provided industry with a useful tool. For NASA, the laser scaling device is still used to measure divots or damage to the shuttle's external tank and other structures around the launchpad. When the invention also met similar needs within industry, the Innovative Partnerships Program provided information to Armor Forensics for licensing and marketing the laser scaling device. Jeff Kohler, technology transfer agent at Kennedy, added, "We also invited a representative from the FBI's special photography unit to Kennedy to meet with Armor Forensics and the innovator. Eventually the FBI ended up purchasing some units. Armor Forensics is also beginning to receive interest from DoD [Department of Defense] for use in military crime scene investigations overseas."

  7. Scaling Effect In Trade Network

    NASA Astrophysics Data System (ADS)

    Konar, M.; Lin, X.; Rushforth, R.; Ruddell, B. L.; Reimer, J.

    2015-12-01

    Scaling is an important issue in the physical sciences. Economic trade is increasingly of interest to the scientific community due to the natural resources (e.g. water, carbon, nutrients, etc.) embodied in traded commodities. Trade refers to the spatial and temporal redistribution of commodities, and is typically measured annually between countries. However, commodity exchange networks occur at many different scales, though data availability at finer temporal and spatial resolution is rare. Exchange networks may prove an important adaptation measure to cope with future climate and economic shocks. As such, it is essential to understand how commodity exchange networks scale, so that we can understand opportunities and roadblocks to the spatial and temporal redistribution of goods and services. To this end, we present an empirical analysis of trade systems across three spatial scales: global, sub-national in the United States, and county-scale in the United States. We compare and contrast the network properties, the self-sufficiency ratio, and performance of the gravity model of trade for these three exchange systems.

  8. Visions of Atomic Scale Tomography

    SciTech Connect

    Kelly, T. F.; Miller, Michael K; Rajan, Krishna; Ringer, S. P.

    2012-01-01

    A microscope, by definition, provides structural and analytical information about objects that are too small to see with the unaided eye. From the very first microscope, efforts to improve its capabilities and push them to ever-finer length scales have been pursued. In this context, it would seem that the concept of an ultimate microscope would have received much attention by now; but has it really ever been defined? Human knowledge extends to structures on a scale much finer than atoms, so it might seem that a proton-scale microscope or a quark-scale microscope would be the ultimate. However, we argue that an atomic-scale microscope is the ultimate for the following reason: the smallest building block for either synthetic structures or natural structures is the atom. Indeed, humans and nature both engineer structures with atoms, not quarks. So far as we know, all building blocks (atoms) of a given type are identical; it is the assembly of the building blocks that makes a useful structure. Thus, would a microscope that determines the position and identity of every atom in a structure with high precision and for large volumes be the ultimate microscope? We argue, yes. In this article, we consider how it could be built, and we ponder the answer to the equally important follow-on questions: who would care if it is built, and what could be achieved with it?

  9. Definition of a nucleophilicity scale.

    PubMed

    Jaramillo, Paula; Pérez, Patricia; Contreras, Renato; Tiznado, William; Fuentealba, Patricio

    2006-07-01

    This work explores some empirical scales of nucleophilicity. We started by evaluating the experimental indices of nucleophilicity proposed by Legon and Millen on the basis of force constants derived from vibrational frequencies using a probe dipole H-X (X = F, CN). The correlation of several theoretical parameters with this experimental scale has been evaluated. The theoretical parameters were chosen as the minimum of the electrostatic potential V(min), the binding energy (BE) between the nucleophile and the H-X dipole, and the electrostatic potential measured at the position of the hydrogen atom V(H) when the nucleophile-dipole complex is in its equilibrium geometry. All of them correlate well with the experimental nucleophilicity scale. In addition, the BEs of the nucleophiles with two other Lewis acids (one hard, BF(3), and the other soft, BH(3)) have been evaluated. The results suggest that the Legon and Millen nucleophilicity scale and the electrostatic-potential-derived scales describe, to a good approximation, the reactivity order of the nucleophiles only when the interaction with a probe electrophile is of the hard-hard type. For a covalent interaction that is orbital controlled, a new nucleophilicity index using information from the frontier orbitals of both the nucleophile and the electrophile has been proposed.

  10. The scaling of attention networks

    NASA Astrophysics Data System (ADS)

    Wang, Cheng-Jun; Wu, Lingfei

    2016-04-01

    We use clicks as a proxy of collective attention and construct networks to study the temporal dynamics of attention. In particular we collect the browsing records of millions of users on 1000 Web forums in two months. In the constructed networks, nodes are threads and edges represent the switch of users between threads in an hour. The investigated network properties include the number of threads N, the number of users UV, and the number of clicks PV. We find scaling functions PV ∼ UV^θ1, PV ∼ N^θ3, and UV ∼ N^θ2, in which the scaling exponents are always greater than 1. This means that (1) the studied networks maintain a self-similar flow structure in time, i.e., large networks are simply the scale-up versions of small networks; and (2) large networks are more "productive", in the sense that an average user would generate more clicks in the larger systems. We propose a revised version of Zipf's law to quantify the time-invariant flow structure of attention networks and relate it to the observed scaling properties. We also demonstrate the applied consequences of our research: forum classification based on scaling properties.
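
    A scaling exponent such as θ1 in PV ∼ UV^θ1 is commonly estimated by least squares on log-transformed data. The sketch below uses synthetic forum snapshots with a known exponent; it is only an illustration of the fitting idea, not the authors' estimation procedure.

```python
# A brief sketch of estimating a scaling exponent theta_1 in PV ~ UV^theta_1
# via ordinary least squares on log-transformed data. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical hourly snapshots: unique visitors (UV) and clicks (PV).
uv = rng.integers(50, 5000, size=1000).astype(float)
pv = 2.0 * uv ** 1.2 * rng.lognormal(0.0, 0.1, size=uv.size)  # true exponent 1.2

slope, intercept = np.polyfit(np.log(uv), np.log(pv), 1)
print(f"estimated scaling exponent theta_1 = {slope:.2f}")  # close to 1.2
print(f"prefactor = {np.exp(intercept):.2f}")
```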

  11. SCALE FORMATION IN CHRYSOPHYCEAN ALGAE

    PubMed Central

    Brown, R. Malcolm; Franke, Werner W.; Kleinig, Hans; Falk, Heinz; Sitte, Peter

    1970-01-01

    The cell wall of the marine chrysophycean alga Pleurochrysis scherfellii is composed of distinct wall fragments embedded in a gelatinous mass. The latter is a polysaccharide of pectic character which is rich in galactose and ribose. These wall fragments are identified as scales. They have been isolated and purified from the vegetative mother cell walls after zoospore formation. Their ultrastructure is described in an electron microscope study combining sectioning, freeze-etch, and negative staining techniques. The scales consist of a layer of concentrically arranged microfibrils (ribbons with cross-sections of 12 to 25 x 25 to 40 A) and underlying radial fibrils of similar dimensions. Such a network-plate is densely coated with particles which are assumed to be identical to the pectic component. The microfibrils are resistant to strong alkaline treatment and have been identified as cellulose by different methods, including sugar analysis after total hydrolysis, proton resonance spectroscopical examination (NMR spectroscopy) of the benzoylated product, and diverse histochemical tests. The formation and secretion of the scales can be followed along the maturing Golgi cisternae starting from a pronounced dilated "polymerization center" as a completely intracisternal process which ends in the exocytotic extrusion of the scales. The scales reveal the very same ultrastructure within the Golgi cisternae as they do in the cell wall. The present finding represents the first evidence on cellulose formation by the Golgi apparatus and is discussed in relation to a basic scheme for cellulose synthesis in plant cells in general. PMID:5513606

  12. Scales of Natural Flood Management

    NASA Astrophysics Data System (ADS)

    Nicholson, Alex; Quinn, Paul; Owen, Gareth; Hetherington, David; Piedra Lara, Miguel; O'Donnell, Greg

    2016-04-01

    The scientific field of Natural Flood Management (NFM) is receiving much attention and is now widely seen as a valid solution for sustainably managing flood risk whilst offering significant multiple benefits. However, few examples exist looking at NFM on a large scale (>10 km2). Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. Although certain NFM interventions may appear distant and disconnected from main-stem waterbodies, they will undoubtedly be contributing to WFD objectives at the catchment waterbody scale. This paper offers examples of NFM and explains how they can be maximised through practical design across many scales (from individual feature up to the whole catchment). New tools are presented to assist in the selection of measures and their locations, to quantify the flooding benefit at the local catchment scale, and to provide a Flood Impact Model that can best reflect the impacts of local changes further downstream. The tools are discussed in the context of our most recent experiences on NFM projects, including river catchments in the north east of England and in Scotland. This work has encouraged a more integrated approach to flood management planning that can use both traditional and novel NFM strategies in an effective and convincing way.

  13. Coping with Multiple Sclerosis Scale

    PubMed Central

    Parkerson, Holly A.; Kehler, Melissa D.; Sharpe, Donald

    2016-01-01

    Background: The Coping with Multiple Sclerosis Scale (CMSS) was developed to assess coping strategies specific to multiple sclerosis (MS). Despite its wide application in MS research, psychometric support for the CMSS remains limited to the initial factor analytic investigation by Pakenham in 2001. Methods: The current investigation assessed the factor structure and construct validity of the CMSS. Participants with MS (N = 453) completed the CMSS, as well as measures of disability related to MS (Multiple Sclerosis Impact Scale), quality of life (World Health Organization Quality of Life Brief Scale), and anxiety and depression (Hospital Anxiety and Depression Scale). Results: The original factor structure reported by Pakenham was a poor fit to the data. An alternate seven-factor structure was identified using exploratory factor analysis. Although there were some similarities with the existing CMSS subscales, differences in factor content and item loadings were found. Relationships between the revised CMSS subscales and additional measures were assessed, and the findings were consistent with previous research. Conclusions: Refinement of the CMSS is suggested, especially for subscales related to acceptance and avoidance strategies. Until further research is conducted on the revised CMSS, it is recommended that the original CMSS continue to be administered. Clinicians and researchers should be mindful of lack of support for the acceptance and avoidance subscales and should seek additional scales to assess these areas. PMID:27551244

  14. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h^-1 Mpc, where the Hubble constant H0 = 100h km s^-1 Mpc^-1; 1 pc = 3.09 x 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe.
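
    For readers unfamiliar with the two-point correlation function, a toy version of the natural estimator ξ(r) = DD(r)/RR(r) - 1 on synthetic points is sketched below. Real survey analyses use much larger catalogues, survey masks, and more careful estimators; none of that is represented here.

```python
# A toy estimate of the two-point correlation function using the natural
# estimator xi(r) = DD(r)/RR(r) - 1 on synthetic points in a unit box.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(6)
n_points = 2000

data = rng.random((n_points, 3))     # stand-in "galaxy" positions
randoms = rng.random((n_points, 3))  # unclustered comparison catalogue

bins = np.linspace(0.01, 0.25, 13)
dd, _ = np.histogram(pdist(data), bins=bins)
rr, _ = np.histogram(pdist(randoms), bins=bins)

xi = dd / np.maximum(rr, 1) - 1.0    # natural estimator per separation bin
r_mid = 0.5 * (bins[1:] + bins[:-1])
for r, x in zip(r_mid, xi):
    print(f"r = {r:.3f}  xi = {x:+.3f}")
```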

  15. Comparison of numerical and verbal rating scales to measure pain exacerbations in patients with chronic cancer pain

    PubMed Central

    2010-01-01

    Background Numerical rating scales (NRS) and verbal rating scales (VRS) have been shown to be reliable and valid tools for subjective cancer pain measurement, but neither has consistently proved superior to the other. The aim of the present study is to compare NRS and VRS performance in assessing breakthrough or episodic pain (BP-EP) exacerbations. Methods In a cross-sectional multicentre study carried out on a sample of 240 advanced cancer patients with pain, background pain and BP-EP intensity in the last 24 hours were measured using both a 6-point VRS and a 0-10 NRS. In order to evaluate the reproducibility of the two scales, a subsample of 60 patients was randomly selected and the questionnaire was administered a second time three to four hours later. The proportion of "inconsistent" evaluations (background pain intensity higher than or equal to peak pain intensity) was calculated to compare the two scales' capability of discriminating between background and peak pain intensity, and Cohen's K was calculated to compare their reproducibility. Results The NRS revealed higher discriminatory capability than the VRS in distinguishing between background and peak pain intensity, with a lower proportion of patients giving inconsistent evaluations (14% vs. 25%). The NRS also showed higher reproducibility when measuring pain exacerbations (Cohen's K of 0.86 for NRS vs. 0.53 for VRS), while the reproducibility of the two scales in evaluating background pain was similar (Cohen's K of 0.80 vs. 0.77). Conclusions Our results suggest that, in the measurement of cancer pain exacerbations, patients use the NRS more appropriately than the VRS, and as such the NRS should be preferred to the VRS in this patient population. PMID:20412579
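
    The two quantities compared above, the proportion of inconsistent evaluations and Cohen's K for test-retest reproducibility, can be sketched as follows with made-up ratings; this is not the study's data or analysis code.

```python
# A minimal sketch (with made-up ratings) of the two quantities compared above:
# the proportion of "inconsistent" evaluations (background pain >= peak pain)
# and Cohen's kappa for test-retest reproducibility of a 6-point VRS.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(7)
n = 240

background = rng.integers(0, 6, n)             # 6-point VRS, hypothetical
peak = np.minimum(background + rng.integers(0, 4, n), 5)

inconsistent = np.mean(background >= peak)
print(f"proportion of inconsistent evaluations = {inconsistent:.2f}")

# Test-retest on a subsample of 60 patients, retested a few hours later.
first = rng.integers(0, 6, 60)
retest = np.clip(first + rng.integers(-1, 2, 60), 0, 5)
print(f"Cohen's kappa (reproducibility) = {cohen_kappa_score(first, retest):.2f}")
```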

  16. An environmentally friendly scale inhibitor

    SciTech Connect

    Dobbs, J.B.; Brown, J.M.

    1999-11-01

    This paper describes a method of inhibiting the formation of scales such as barium and strontium sulfate in low-pH aqueous systems, and calcium carbonate in systems containing high concentrations of dissolved iron. Chemically, the solution involves treating the aqueous system with an inhibitor designed to replace organic phosphonates. Typical low-pH aqueous systems where the inhibitor is particularly useful are oilfield produced water and resin-bed water softeners that form scale during low-pH acid regeneration operations. Downhole applications are recommended where high concentrations of dissolved iron are present in the produced water. This new approach to inhibition replaces typical organic phosphonates and polymers with a non-toxic, biodegradable scale inhibitor that performs in harsh environments.

  17. Softness Correlations Across Length Scales

    NASA Astrophysics Data System (ADS)

    Ivancic, Robert; Shavit, Amit; Rieser, Jennifer; Schoenholz, Samuel; Cubuk, Ekin; Durian, Douglas; Liu, Andrea; Riggleman, Robert

    In disordered systems, it is believed that mechanical failure begins with localized particle rearrangements. Recently, a machine learning method has been introduced to identify how likely a particle is to rearrange given its local structural environment, quantified by softness. We calculate the softness of particles in simulations of atomic Lennard-Jones mixtures, molecular Lennard-Jones oligomers, colloidal systems, and granular systems. In each case, we find that the length scale characterizing spatial correlations of softness is approximately a particle diameter. These results provide a rationale for why localized rearrangements, whose size is presumably set by the scale of softness correlations, might occur in disordered systems across many length scales. Supported by DOE DE-FG02-05ER46199.

  18. Urban Transfer Entropy across Scales

    PubMed Central

    Murcio, Roberto

    2015-01-01

    The morphology of urban agglomeration is studied here in the context of information exchange between different spatio-temporal scales. Urban migration to and from cities is characterised as non-random and following non-random pathways. Cities are multidimensional non-linear phenomena, so understanding the relationships and connectivity between scales is important in determining how the interplay of local/regional urban policies may affect the distribution of urban settlements. In order to quantify these relationships, we follow an information theoretic approach using the concept of Transfer Entropy. Our analysis is based on a stochastic urban fractal model, which mimics urban growing settlements and migration waves. The results indicate how different policies could affect urban morphology in terms of the information generated across geographical scales. PMID:26207628

  19. Flavor hierarchies from dynamical scales

    NASA Astrophysics Data System (ADS)

    Panico, Giuliano; Pomarol, Alex

    2016-07-01

    One main obstacle for any beyond the SM (BSM) scenario solving the hierarchy problem is its potentially large contributions to electric dipole moments. An elegant way to avoid this problem is to have the light SM fermions couple to the BSM sector only through bilinears, f̄f. This possibility can be neatly implemented in composite Higgs models. We study the implications of dynamically generating the fermion Yukawa couplings at different scales, relating larger scales to lighter SM fermions. We show that all flavor and CP-violating constraints can be easily accommodated for a BSM scale of a few TeV, without requiring any extra symmetry. Contributions to B physics are mainly mediated by the top, giving a predictive pattern of deviations in ΔF = 2 and ΔF = 1 flavor observables that could be seen in future experiments.

  20. Critical Multicultural Education Competencies Scale: A Scale Development Study

    ERIC Educational Resources Information Center

    Acar-Ciftci, Yasemin

    2016-01-01

    The purpose of this study is to develop a scale to identify the critical multicultural education competencies of teachers. To this end, drawing on the literature, a new conceptual framework was first created with a deductive method based on critical theory, critical race theory and critical multicultural…

  1. Bath County Computer Attitude Scale: A Reliability and Validity Scale.

    ERIC Educational Resources Information Center

    Moroz, Pauline A.; Nash, John B.

    The Bath County Computer Attitude Scale (BCCAS) has received limited attention concerning its reliability and validity with a U.S. adult population. As developed by G. G. Bear, H. C. Richards, and P. Lancaster in 1987, the instrument assessed attitudes toward computers in areas of computer use, computer-aided instruction, programming and technical…

  2. Childhood Career Development Scale: Scale Construction and Psychometric Properties

    ERIC Educational Resources Information Center

    Schultheiss, Donna E. Palladino; Stead, Graham B.

    2004-01-01

    The purpose of this investigation was to construct a theoretically driven and psychometrically sound childhood career development scale to measure career progress in fourth-through sixth-grade children. Super's nine dimensions (i.e., curiosity, exploration, information, key figures, interests, locus of control, time perspective, self-concept, and…

  3. Cavitation erosion size scale effects

    NASA Technical Reports Server (NTRS)

    Rao, P. V.; Buckley, D. H.

    1984-01-01

    Size scaling in cavitation erosion is a major problem confronting the design engineers of modern high speed machinery. An overview and erosion data analysis presented in this paper indicate that the size scale exponent n in the erosion rate relationship as a function of the size or diameter can vary from 1.7 to 4.9 depending on the type of device used. There is, however, a general agreement as to the values of n if the correlations are made with constant cavitation number.

  4. Scaling Properties of Universal Tetramers

    SciTech Connect

    Hadizadeh, M. R.; Yamashita, M. T.; Tomio, Lauro; Delfino, A.; Frederico, T.

    2011-09-23

    We demonstrate the existence of a universal correlation between the binding energies of successive four-boson bound states (tetramers), for large two-body scattering lengths (a), related to an additional scale not constrained by three-body Efimov physics. Relevant to ultracold atom experiments, the atom-trimer relaxation peaks for |a| → ∞ when the ratio between the tetramer and trimer energies is ≈ 4.6 and a new tetramer is formed. The new scale is also revealed for a < 0 by the prediction of a correlation between the positions of two successive peaks in the four-atom recombination process.

  5. Inflation at the electroweak scale

    NASA Technical Reports Server (NTRS)

    Knox, Lloyd; Turner, Michael S.

    1993-01-01

    We present a model for slow-rollover inflation where the vacuum energy that drives inflation is of the order of G_F^(-2); unlike most models, the conversion of vacuum energy to radiation ('reheating') is moderately efficient. The scalar field responsible for inflation is a standard-model singlet, develops a vacuum expectation value of 4 × 10^6 GeV, has a mass of about 1 GeV, and can play a role in electroweak phenomena. We also discuss models where the energy scale of inflation is somewhat larger, but still well below the unification scale.

  6. Global scale predictability of floods

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Gijsbers, Peter; Sperna Weiland, Frederiek

    2016-04-01

    Flood (and storm surge) forecasting at the continental and global scale has only become possible in recent years (Emmerton et al., 2016; Verlaan et al., 2015) due to the availability of meteorological forecasts, global scale precipitation products and global scale hydrologic and hydrodynamic models. Deltares has set up GLOFFIS, a research-oriented multi-model operational flood forecasting system based on Delft-FEWS in an open experimental ICT facility called Id-Lab. In GLOFFIS both the W3RA and PCRGLOB-WB models are run in ensemble mode using GEFS and ECMWF-EPS (latency 2 days). GLOFFIS will be used for experiments into the predictability of floods (and droughts) and their dependency on initial state estimation, meteorological forcing and the hydrologic model used. Here we present initial results of verification of the ensemble flood forecasts derived with the GLOFFIS system. Emmerton, R., Stephens, L., Pappenberger, F., Pagano, T., Weerts, A., Wood, A., Salamon, P., Brown, J., Hjerdt, N., Donnelly, C., Cloke, H. Continental and Global Scale Flood Forecasting Systems, WIREs Water (accepted), 2016. Verlaan M, De Kleermaeker S, Buckman L. GLOSSIS: Global storm surge forecasting and information system 2015, Australasian Coasts & Ports Conference, 15-18 September 2015, Auckland, New Zealand.

  7. The Maternal Behavior Rating Scale.

    ERIC Educational Resources Information Center

    Mahoney, Gerald; And Others

    1986-01-01

    Independent ratings of videotaped sessions in which mothers (N=60) interacted with their mentally retarded children (ages 1-3) suggested that potentially important components of maternal behavior (child orientedness/pleasure and control) may be assessed with the seven-item short form of the Maternal Behavior Rating Scale. (JW)

  8. The Creative Processes Rating Scale.

    ERIC Educational Resources Information Center

    Kulp, Margaret; Tarter, Barbara J.

    1986-01-01

    Developed from research about and teacher experience with children and creativity, the Creative Processes Rating Scale was tested with 100 sixth graders and found to be an effective instrument (which can be used by teachers with no experience in art) for assessing the creative processes of children in the visual arts. (Author/CB)

  9. Nanotribology: Rubbing on Small Scale

    ERIC Educational Resources Information Center

    Dickinson, J. Thomas

    2005-01-01

    Nanometer-scale investigations offer the potential of providing first-principles understanding of tribo-systems in terms of fundamental intermolecular forces. Some of the basic issues and motivation for the use of scanning probes in the area of nanotribology are presented.

  10. SCALING: Wind Tunnel to Flight

    NASA Astrophysics Data System (ADS)

    Bushnell, Dennis M.

    2006-01-01

    Wind tunnels have wide-ranging functionality, including many applications beyond aeronautics, and historically have been the major source of information for technological aerodynamics/aeronautical applications. There are a myriad of scaling issues/differences from flight to wind tunnel, and their study and impacts are uneven and a function of the particular type of extant flow phenomena. Typically, the most serious discrepancies are associated with flow separation. The tremendous ongoing increases in numerical simulation capability are changing and in many aspects have changed the function of the wind tunnel from a (scaled) "predictor" to a source of computational calibration/validation information with the computation then utilized as the flight prediction/scaling tool. Numerical simulations can increasingly include the influences of the various scaling issues. This wind tunnel role change has been occurring for decades as computational capability improves in all aspects. Additional issues driving this trend are the increasing cost (and time) disparity between physical experiments and computations, and increasingly stringent accuracy requirements.

  11. The Adolescent Drug Involvement Scale.

    ERIC Educational Resources Information Center

    Moberg, D. Paul; Hahn, Lori

    1991-01-01

    Developed Adolescent Drug Involvement Scale (ADIS) to measure level of drug involvement, considered as continuum ranging from no use to severe dependency, in adolescents. Administered ADIS to 453 adolescents referred for treatment. Results indicated acceptable internal consistency and provide preliminary evidence of validity. Scores correlated…

  12. Hydrodynamic aspects of shark scales

    NASA Astrophysics Data System (ADS)

    Raschi, W. G.; Musick, J. A.

    1986-03-01

    Ridge morphometrics on placoid scales from 12 galeoid shark species were examined in order to evaluate their potential value for frictional drag reduction. The geometry of the shark scales is similar to longitudinal grooved surfaces (riblets) that have been previously shown to give 8 percent skin-friction reduction for turbulent boundary layers. The present study of the shark scales was undertaken to determine if the physical dimensions of the ridges on the shark scales are of the right magnitude to be used by the sharks for drag reduction based on previous riblet work. The results indicate that the ridge heights and spacings are normally maintained between the predicted optimal values proposed for voluntary and burst swimming speeds throughout the individual's ontogeny. Moreover, the species which might be considered to be faster possess smaller and more closely spaced ridges that, based on the riblet work, would suggest a greater frictional drag reduction value at high swimming speeds, as compared to their more sluggish counterparts.

  13. Scale invariance and superfluid turbulence

    NASA Astrophysics Data System (ADS)

    Sen, Siddhartha; Ray, Koushik

    2013-11-01

    We construct a Schroedinger field theory invariant under local spatial scaling. It is shown to provide an effective theory of superfluid turbulence by deriving, analytically, the observed Kolmogorov 5/3 law and by leading to a Biot-Savart interaction between the observed filament excitations of the system.

  14. Citizen Science Data and Scaling

    NASA Astrophysics Data System (ADS)

    Henderson, S.; Wasser, L. A.

    2013-12-01

    There is rapid growth in the collection of environmental data by non-experts. So-called 'citizen scientists' are collecting data on plant phenology, precipitation patterns, bird migration and winter feeding, mating calls of frogs in the spring, and numerous other topics and phenomena related to environmental science. This data is generally submitted to online programs (e.g., Project BudBurst, COCORaHS, Project Feederwatch, Frogwatch USA, etc.) and is freely available to scientists, educators, land managers, and decision makers. While the data is often used to address specific science questions, it also provides the opportunity to explore its utility in the context of ecosystem scaling. Citizen science data is being collected and submitted at an unprecedented rate and is of a spatial and temporal scale previously not possible. The amount of citizen science data vastly exceeds what scientists or land managers can collect on their own. As such, it provides opportunities to address scaling in the environmental sciences. This presentation will explore data from several citizen science programs in the context of scaling.

  15. Animal coloration: sexy spider scales.

    PubMed

    Taylor, Lisa A; McGraw, Kevin J

    2007-08-01

    Many male jumping spiders display vibrant colors that are used in visual communication. A recent microscopic study on a jumping spider from Singapore shows that three-layered 'scale sandwiches' of chitin and air are responsible for producing their brilliant iridescent body coloration.

  16. Newspaper Scale and Newspaper Expenditures.

    ERIC Educational Resources Information Center

    Blankenburg, William B.

    Employing data from the 1986 Inland Daily Newspaper Association Cost and Revenue study, the effects of scale on the costs of various factors in newspaper production were examined. Despite several limitations of the Inland Survey, the following tentative conclusions were reached: total expenses rise faster than circulation, and total revenues rise…

  17. Animal coloration: sexy spider scales.

    PubMed

    Taylor, Lisa A; McGraw, Kevin J

    2007-08-01

    Many male jumping spiders display vibrant colors that are used in visual communication. A recent microscopic study on a jumping spider from Singapore shows that three-layered 'scale sandwiches' of chitin and air are responsible for producing their brilliant iridescent body coloration. PMID:17686428

  18. Secondary School Burnout Scale (SSBS)

    ERIC Educational Resources Information Center

    Aypay, Ayse

    2012-01-01

    The purpose of this study is to develop "Secondary School Burnout Scale." Study group included 728 students out of 14 schools in four cities in Turkey. Both Exploratory Factor Analysis and Confirmatory Factor Analysis were conducted on the data. A seven-factor solution emerged. The seven factors explained 61% of the total variance. The model…

  19. Scaling up of renewable chemicals.

    PubMed

    Sanford, Karl; Chotani, Gopal; Danielson, Nathan; Zahn, James A

    2016-04-01

    The transition of promising technologies for production of renewable chemicals from a laboratory scale to commercial scale is often difficult and expensive. As a result, the timeframe estimated for commercialization is typically underestimated, resulting in much slower penetration of these promising new methods and products into the chemical industries. The theme of 'sugar is the next oil' connects biological, chemical, and thermochemical conversions of renewable feedstocks to products that are drop-in replacements for petroleum-derived chemicals or are new-to-market chemicals/materials. The latter typically offer a functionality advantage and can command higher prices that result in less severe scale-up challenges. However, for drop-in replacements, price is of paramount importance and competitive capital and operating expenditures are a prerequisite for success. Hence, scale-up of relevant technologies must be interfaced with effective and efficient management of both cell and steel factories. Details involved in all aspects of manufacturing, such as utilities, sterility, product recovery and purification, regulatory requirements, and emissions must be managed successfully. PMID:26874264

  20. Multi-Scale Infrastructure Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s (EPA) multi-scale infrastructure assessment project supports both water resource adaptation to climate change and the rehabilitation of the nation’s aging water infrastructure by providing tools, scientific data and information to progra...

  1. Hydrodynamic aspects of shark scales

    NASA Technical Reports Server (NTRS)

    Raschi, W. G.; Musick, J. A.

    1986-01-01

    Ridge morphometrics on placoid scales from 12 galeoid shark species were examined in order to evaluate their potential value for frictional drag reduction. The geometry of the shark scales is similar to longitudinal grooved surfaces (riblets) that have been previously shown to give 8 percent skin-friction reduction for turbulent boundary layers. The present study of the shark scales was undertaken to determine if the physical dimensions of the ridges on the shark scales are of the right magnitude to be used by the sharks for drag reduction based on previous riblet work. The results indicate that the ridge heights and spacings are normally maintained between the predicted optimal values proposed for voluntary and burst swimming speeds throughout the individual's ontogeny. Moreover, the species which might be considered to be faster possess smaller and more closely spaced ridges that, based on the riblet work, would suggest a greater frictional drag reduction value at high swimming speeds, as compared to their more sluggish counterparts.

  2. Scaling up of renewable chemicals.

    PubMed

    Sanford, Karl; Chotani, Gopal; Danielson, Nathan; Zahn, James A

    2016-04-01

    The transition of promising technologies for production of renewable chemicals from a laboratory scale to commercial scale is often difficult and expensive. As a result, the timeframe estimated for commercialization is typically underestimated, resulting in much slower penetration of these promising new methods and products into the chemical industries. The theme of 'sugar is the next oil' connects biological, chemical, and thermochemical conversions of renewable feedstocks to products that are drop-in replacements for petroleum-derived chemicals or are new-to-market chemicals/materials. The latter typically offer a functionality advantage and can command higher prices that result in less severe scale-up challenges. However, for drop-in replacements, price is of paramount importance and competitive capital and operating expenditures are a prerequisite for success. Hence, scale-up of relevant technologies must be interfaced with effective and efficient management of both cell and steel factories. Details involved in all aspects of manufacturing, such as utilities, sterility, product recovery and purification, regulatory requirements, and emissions must be managed successfully.

  3. Time scales of Magmatic Processes

    NASA Astrophysics Data System (ADS)

    Hawkesworth, C. J.

    2002-05-01

    Knowledge of the rates of natural processes is critical to the development of physically realistic models. For magmatic processes, rates are increasingly well determined from short lived isotopes, and from diffusion modified element profiles, on time scales that vary from 10s of 1000s of years to a few years. Our understanding of the melting processes beneath MOR has been revolutionised by the application of U-series isotopes, because they include isotopes with half lives similar to the time scales of melt generation and extraction. For island arcs there is much discussion of how to incorporate suggestions that Ra and Ba are transferred from the slab in a few 1000 years, and yet significantly more time is required to generate the excess Pa isotopes. Once in the crust, crystallisation and differentiation may be driven by cooling, degassing and decompression, and these should be characterised by different time scales. Crystals preserve rich high-resolution records of changing magma compositions, but the time scales of those changes are difficult to establish. Isotope studies have shown that more evolved rock types tend to contain more old crystals that may be 10s of 1000s of years old at the time of eruption. Whether these are xenocrysts, or evidence for long term crystallisation histories, remains controversial. Moreover, diffusion modified element profiles, and crystal size distributions, suggest that crystals are often less than 100 years old. An alternative approach is to consider U-series isotope ratios in the magma, and how these may change with degree of magma evolution. These suggest that differentiation time scales may be up to 200 ky for magmas at the base of the crust, but for magmas that crystallise at shallower levels the time scales are much shorter. In some cases these are in weeks and months, and crystallisation is likely to be due to decompression and degassing. One consequence of the short crystallisation times is that there may be insufficient

  4. Optimal scaling in ductile fracture

    NASA Astrophysics Data System (ADS)

    Fokoua Djodom, Landry

    This work is concerned with the derivation of optimal scaling laws, in the sense of matching lower and upper bounds on the energy, for a solid undergoing ductile fracture. The specific problem considered concerns a material sample in the form of an infinite slab of finite thickness subjected to prescribed opening displacements on its two surfaces. The solid is assumed to obey deformation-theory of plasticity and, in order to further simplify the analysis, we assume isotropic rigid-plastic deformations with zero plastic spin. When hardening exponents are given values consistent with observation, the energy is found to exhibit sublinear growth. We regularize the energy through the addition of nonlocal energy terms of the strain-gradient plasticity type. This nonlocal regularization has the effect of introducing an intrinsic length scale into the energy. We also put forth a physical argument that identifies the intrinsic length and suggests a linear growth of the nonlocal energy. Under these assumptions, ductile fracture emerges as the net result of two competing effects: whereas the sublinear growth of the local energy promotes localization of deformation to failure planes, the nonlocal regularization stabilizes this process, thus resulting in an orderly progression towards failure and a well-defined specific fracture energy. The optimal scaling laws derived here show that ductile fracture results from localization of deformations to void sheets, and that it requires a well-defined energy per unit fracture area. In particular, fractal modes of fracture are ruled out under the assumptions of the analysis. The optimal scaling laws additionally show that ductile fracture is cohesive in nature, i.e., it obeys a well-defined relation between tractions and opening displacements. Finally, the scaling laws supply a link between micromechanical properties and macroscopic fracture properties. In particular, they reveal the relative roles that surface energy and microplasticity

  5. Structural Similitude and Scaling Laws

    NASA Technical Reports Server (NTRS)

    Simitses, George J.

    1998-01-01

    Aircraft and spacecraft comprise the class of aerospace structures that require efficiency and wisdom in design, sophistication and accuracy in analysis and numerous and careful experimental evaluations of components and prototype, in order to achieve the necessary system reliability, performance and safety. Preliminary and/or concept design entails the assemblage of system mission requirements, system expected performance and identification of components and their connections as well as of manufacturing and system assembly techniques. This is accomplished through experience based on previous similar designs, and through the possible use of models to simulate the entire system characteristics. Detail design is heavily dependent on information and concepts derived from the previous steps. This information identifies critical design areas which need sophisticated analyses, and design and redesign procedures to achieve the expected component performance. This step may require several independent analysis models, which, in many instances, require component testing. The last step in the design process, before going to production, is the verification of the design. This step necessitates the production of large components and prototypes in order to test component and system analytical predictions and verify strength and performance requirements under the worst loading conditions that the system is expected to encounter in service. Clearly then, full-scale testing is in many cases necessary and always very expensive. In the aircraft industry, in addition to full-scale tests, certification and safety necessitate large component static and dynamic testing. Such tests are extremely difficult, time-consuming and absolutely necessary. Clearly, one should not expect that prototype testing will be totally eliminated in the aircraft industry. It is hoped, though, that we can reduce full-scale testing to a minimum. Full-scale large component testing is necessary in

  6. Time scales in cognitive neuroscience

    PubMed Central

    Papo, David

    2013-01-01

    Cognitive neuroscience boils down to describing the ways in which cognitive function results from brain activity. In turn, brain activity shows complex fluctuations, with structure at many spatio-temporal scales. Exactly how cognitive function inherits the physical dimensions of neural activity, though, is highly non-trivial, and so are generally the corresponding dimensions of cognitive phenomena. As for any physical phenomenon, when studying cognitive function, the first conceptual step should be that of establishing its dimensions. Here, we provide a systematic presentation of the temporal aspects of task-related brain activity, from the smallest scale of the brain imaging technique's resolution, to the observation time of a given experiment, through the characteristic time scales of the process under study. We first review some standard assumptions on the temporal scales of cognitive function. In spite of their general use, these assumptions hold true to a high degree of approximation for many cognitive (viz. fast perceptual) processes, but have their limitations for other ones (e.g., thinking or reasoning). We define in a rigorous way the temporal quantifiers of cognition at all scales, and illustrate how they qualitatively vary as a function of the properties of the cognitive process under study. We propose that each phenomenon should be approached with its own set of theoretical, methodological and analytical tools. In particular, we show that when treating cognitive processes such as thinking or reasoning, complex properties of ongoing brain activity, which can be drastically simplified when considering fast (e.g., perceptual) processes, start playing a major role, and not only characterize the temporal properties of task-related brain activity, but also determine the conditions for proper observation of the phenomena. Finally, some implications on the design of experiments, data analyses, and the choice of recording parameters are discussed. PMID:23626578

  7. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand on February 22, and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear. The result is that very small differences in the systems now result in very big differences in the future, making forecasting difficult. In spite of this, there are patterns that exist in earthquake data. These patterns are often in the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales. This consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of the earthquake catalogs which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems. The focus of this area is to understand how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California designed to extend the observed catalog of earthquakes in California. This simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional

  8. Evaluating the impact of farm scale innovation at catchment scale

    NASA Astrophysics Data System (ADS)

    van Breda, Phelia; De Clercq, Willem; Vlok, Pieter; Querner, Erik

    2014-05-01

    Hydrological modelling lends itself to other disciplines very well, normally as a process-based system that acts as a catalogue of events taking place. These hydrological models are spatial-temporal in their design and are generally well suited for what-if situations in other disciplines. Scaling should therefore be a function of the purpose of the modelling. Process is always linked with scale or support, but the temporal resolution can affect the results if the spatial scale is not suitable. The use of hydrological response units tends to lump area around physical features but disregards farm boundaries. Farm boundaries are often the more crucial uppermost resolution needed to gain more value from hydrological modelling. In the Letaba Catchment of South Africa, we find a generous portion of land uses, different models of ownership, and different farming systems ranging from large commercial farms to small subsistence farming. All of these have the same basic right to water, but water distribution in the catchment is somewhat of a problem. Since water quantity is also a problem, the water supply systems need to ensure that valuable production areas are not left without water. Clearly, hydrological modelling should therefore be sensitive to specific land use. As a measure of productivity, a system of small farmer production evaluation was designed. This activity presents a dynamic system outside hydrological modelling that is generally not considered inside hydrological modelling but depends on it. For sustainable development, a number of important concepts needed to be aligned with activities in this region, and the regulatory actions also need to be adhered to. This study aimed at aligning the activities in a region to the vision and objectives of the regulatory authorities. South Africa's system of socio-economic development planning is complex and mostly ineffective. There are many regulatory authorities involved, often with unclear

  9. Mokken Scale Analysis Using Hierarchical Clustering Procedures

    ERIC Educational Resources Information Center

    van Abswoude, Alexandra A. H.; Vermunt, Jeroen K.; Hemker, Bas T.; van der Ark, L. Andries

    2004-01-01

    Mokken scale analysis (MSA) can be used to assess and build unidimensional scales from an item pool that is sensitive to multiple dimensions. These scales satisfy a set of scaling conditions, one of which follows from the model of monotone homogeneity. An important drawback of the MSA program is that the sequential item selection and scale…

  10. Stability of Rasch Scales over Time

    ERIC Educational Resources Information Center

    Taylor, Catherine S.; Lee, Yoonsun

    2010-01-01

    Item response theory (IRT) methods are generally used to create score scales for large-scale tests. Research has shown that IRT scales are stable across groups and over time. Most studies have focused on items that are dichotomously scored. Now Rasch and other IRT models are used to create scales for tests that include polytomously scored items.…

  11. 27 CFR 19.276 - Package scales.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    § 19.276 Package scales. Proprietors shall ensure the accuracy of scales used for weighing packages of... Scales used to weigh packages designed to hold 10 wine gallons or less shall indicate weight in ounces or...

  12. 30 CFR 57.3202 - Scaling tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Mine Safety and Health Administration, Department of Labor; Metal and Nonmetal Mine Safety and Health Standards - Underground Metal and Nonmetal Mines; Ground Control, Scaling and Support - Surface and Underground. § 57.3202 Scaling tools. Where manual scaling is performed, a scaling...

  13. Simple scale interpolator facilitates reading of graphs

    NASA Technical Reports Server (NTRS)

    Fazio, A.; Henry, B.; Hood, D.

    1966-01-01

    Set of cards with scale divisions and a scale finder permits accurate reading of the coordinates of points on linear or logarithmic graphs plotted on rectangular grids. The set contains 34 different scales for linear plotting and 28 single cycle scales for log plots.

  14. Fidelity of implementation to instructional strategies as a moderator of curriculum unit effectiveness in a large-scale middle school science quasi-experiment

    NASA Astrophysics Data System (ADS)

    O'Donnell, Carol Lynn

    This study examined whether fidelity of implementation to reform-based instructional strategies embedded in a middle school physical science curriculum unit developed by the Harvard-Smithsonian Center for Astrophysics moderated the causal relationship between curriculum condition and classroom mean achievement in a quasi-experiment testing the effectiveness of the unit. The study sample included 48 6th grade science classrooms selected randomly from 8 Montgomery County Public Schools middle schools, assigned randomly to either the treatment or comparison condition in the Scaling up Curriculum for Achievement, Learning, and Equity Project (SCALE-uP) quasi-experiment of The George Washington University. This dissertation was a secondary analysis of SCALE-uP's 2005-2006 fidelity of implementation data collected with the Instructional Strategies Classroom Observation Protocol (ISCOP), which captured whether the Project 2061 instructional strategies rated Satisfactory or Excellent in the ARIES: Exploring Motion and Forces (M&F) treatment unit were present during implementation in treatment and comparison classrooms. ISCOP Likert-like scores for each classroom were subjected to Rasch analysis; rating scale diagnostics, category collapsing, and fit statistics were used to develop a reliable continuous fidelity of implementation measure for each classroom. Results from hierarchical multiple regression analysis performed on the fidelity of implementation measures indicated that when controlling for prior knowledge, fidelity of implementation to the Project 2061 instructional strategies rated Satisfactory or Excellent in M&F moderated the causal relationship between science curriculum condition and classroom mean achievement. Follow-up post hoc analyses at two select fidelity measures indicated that treatment classrooms with High Fidelity were predicted to have higher classroom mean achievement than comparison classrooms with High Fidelity to the same set of instructional
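
    The Rasch step described above, which converts Likert-like ISCOP ratings into a continuous measure, rests on a polytomous Rasch model; the sketch below shows category probabilities under Andrich's rating scale model, a standard member of that family for Likert-type ratings. The parameter values are made up for illustration only; this is not the study's calibration or software.

    ```python
    # Category probabilities under a Rasch rating scale model (Andrich-type).
    # A minimal sketch with hypothetical parameter values, not the cited study's
    # calibrated item difficulties or thresholds.
    import numpy as np

    def rating_scale_probs(theta, delta, taus):
        """P(X = k), k = 0..m, for person measure theta, item difficulty delta,
        and m threshold parameters taus (tau_1..tau_m)."""
        steps = theta - delta - np.asarray(taus)       # step "advantages" theta - delta - tau_j
        cumulative = np.concatenate(([0.0], np.cumsum(steps)))
        expo = np.exp(cumulative - cumulative.max())   # numerically stabilised exponentials
        return expo / expo.sum()

    # Example: a 4-category (0-3) item with thresholds spread around its difficulty.
    probs = rating_scale_probs(theta=0.5, delta=0.0, taus=[-1.2, 0.1, 1.3])
    print(probs, probs.sum())  # four category probabilities, summing to 1
    ```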

  15. Emerging universe from scale invariance

    SciTech Connect

    Del Campo, Sergio; Herrera, Ramón; Guendelman, Eduardo I.; Labraña, Pedro E-mail: guendel@bgu.ac.il E-mail: plabrana@ubiobio.cl

    2010-06-01

    We consider a scale invariant model which includes an R² term in the action and show that a stable 'emerging universe' scenario is possible. The model belongs to the general class of theories where an integration measure independent of the metric is introduced. To implement scale invariance (S.I.), a dilaton field is introduced. The integration of the equations of motion associated with the new measure gives rise to the spontaneous symmetry breaking (S.S.B.) of S.I. After S.S.B. of S.I. in the model with the R² term (and first order formalism applied), it is found that a non-trivial potential for the dilaton is generated. The dynamics of the scalar field becomes non-linear and these non-linearities are instrumental in the stability of some of the emerging universe solutions, which exist for a parameter range of the theory.

  16. The scale of cosmic isotropy

    SciTech Connect

    Marinoni, C.; Bel, J.; Buzzi, A. E-mail: Julien.Bel@cpt.univ-mrs.fr

    2012-10-01

    The most fundamental premise to the standard model of the universe states that the large-scale properties of the universe are the same in all directions and at all comoving positions. Demonstrating this hypothesis has proven to be a formidable challenge. The cross-over scale R_iso above which the galaxy distribution becomes statistically isotropic is vaguely defined and poorly (if not at all) quantified. Here we report on a formalism that allows us to provide an unambiguous operational definition and an estimate of R_iso. We apply the method to galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7, finding that R_iso ∼ 150 h⁻¹ Mpc. Besides providing a consistency test of the Copernican principle, this result is in agreement with predictions based on numerical simulations of the spatial distribution of galaxies in cold dark matter dominated cosmological models.

  17. SENSATION SEEKING SCALE: INDIAN ADAPTATION

    PubMed Central

    Basu, Debasish; Verma, Vijoy K.; Malhotra, Savita; Malhotra, Anil

    1993-01-01

    Sensation seeking refers to a biologically based personality dimension defined as the need for varied, novel and complex sensations and experiences, and the willingness to take physical and social risks for the sake of such experiences. Although researched worldwide for nearly three decades now, there is to date no published Indian study utilizing the concept of sensation seeking. This paper describes the adaptation of the Sensation Seeking Scale for the Indian population. After due modification of the scale, its reliability, internal consistency and discriminant validity were established. Norms were developed for a defined segment of the general population. This study may be seen as the beginning of research in India on the subject of sensation seeking. PMID:21743627

  18. Thermodynamics from a scaling Hamiltonian

    NASA Astrophysics Data System (ADS)

    Del Pino, L. A.; Troncoso, P.; Curilef, S.

    2007-11-01

    There are problems with defining the thermodynamic limit of systems with long-range interactions; as a result, the thermodynamic behavior of these types of systems is anomalous. In the present work, we review some concepts from both extensive and nonextensive thermodynamic perspectives. We use a model, whose Hamiltonian takes into account spins ferromagnetically coupled in a chain via a power law that decays at large interparticle distance r as 1/r^α for α ⩾ 0. Here, we review old nonextensive scaling. In addition, we propose a Hamiltonian scaled by 2((N/2)^(1-α) - 1)/(1-α) that explicitly includes symmetry of the lattice and dependence on the size N of the system. The approach enabled us to improve upon previous results. A numerical test is conducted through Monte Carlo simulations. In the model, periodic boundary conditions are adopted to eliminate surface effects.
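
    The role of the quoted scaling factor can be seen numerically: for α < 1 the coupling sum diverges with system size, and dividing by 2((N/2)^(1-α) - 1)/(1-α) keeps the energy per spin bounded. The sketch below checks this for the fully aligned state of a ring of ferromagnetically coupled spins; it is a rough illustration under those assumptions, not the authors' Monte Carlo code.

    ```python
    # Energy per spin of a fully aligned ferromagnetic ring with 1/r^alpha couplings,
    # with and without the N-dependent scaling factor quoted in the abstract.
    # Illustrative sketch only (no Monte Carlo, no temperature).
    import numpy as np

    def energy_per_spin(N, alpha, scaled=True):
        d = np.arange(1, N)
        r = np.minimum(d, N - d)                 # distances on a ring (periodic boundaries)
        e = -0.5 * np.sum(1.0 / r**alpha)        # all spins aligned, coupling J = 1
        if scaled:
            kappa = 2.0 * ((N / 2.0)**(1.0 - alpha) - 1.0) / (1.0 - alpha)
            e /= kappa                           # rescaled Hamiltonian
        return e

    # For alpha < 1 the unscaled energy per spin grows with N; the scaled one saturates.
    for N in (100, 1000, 10000):
        print(N, energy_per_spin(N, 0.5, scaled=False), energy_per_spin(N, 0.5, scaled=True))
    ```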

  19. Allometric scaling laws of metabolism

    NASA Astrophysics Data System (ADS)

    da Silva, Jafferson Kamphorst Leal; Garcia, Guilherme J. M.; Barbosa, Lauro A.

    2006-12-01

    One of the most pervasive laws in biology is allometric scaling, whereby a biological variable Y is related to the mass M of the organism by a power law, Y = Y₀M^b, where b is the so-called allometric exponent. The origin of these power laws is still a matter of dispute mainly because biological laws, in general, do not follow from physical ones in a simple manner. In this work, we review the interspecific allometry of metabolic rates, where recent progress in the understanding of the interplay between geometrical, physical and biological constraints has been achieved. For many years, it was a universal belief that the basal metabolic rate (BMR) of all organisms is described by Kleiber's law (allometric exponent b = 3/4). A few years ago, a theoretical basis for this law was proposed, based on a resource distribution network common to all organisms. Nevertheless, the 3/4-law has been questioned recently. First, there is an ongoing debate as to whether the empirical value of b is 3/4 or 2/3, or even nonuniversal. Second, some mathematical and conceptual errors were found in these network models, weakening the proposed theoretical arguments. Another pertinent observation is that the maximal aerobically sustained metabolic rate of endotherms scales with an exponent larger than that of BMR. Here we present a critical discussion of the theoretical models proposed to explain the scaling of metabolic rates, and compare the predicted exponents with a review of the experimental literature. Our main conclusion is that although there is not a universal exponent, it should be possible to develop a unified theory for the common origin of the allometric scaling laws of metabolism.
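
    The power law Y = Y₀M^b is linear in log-log coordinates, log Y = log Y₀ + b log M, so the allometric exponent can be estimated by ordinary least squares on log-transformed data. A minimal sketch on synthetic data follows; the Kleiber-like b = 0.75 and the noise level are assumptions made purely for illustration.

    ```python
    # Estimate an allometric exponent b from Y = Y0 * M**b via log-log regression.
    # Synthetic data with an assumed exponent of 0.75 -- illustrative only.
    import numpy as np

    rng = np.random.default_rng(1)
    mass = 10 ** rng.uniform(0, 6, 200)                           # body masses over 6 decades
    rate = 3.0 * mass ** 0.75 * np.exp(rng.normal(0, 0.1, 200))   # noisy "metabolic rate"

    b_hat, log_y0 = np.polyfit(np.log(mass), np.log(rate), 1)     # slope = b, intercept = log Y0
    print(f"estimated exponent b = {b_hat:.3f}, Y0 = {np.exp(log_y0):.3f}")
    ```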

  20. Global scale groundwater flow model

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; de Graaf, Inge; van Beek, Ludovicus; Bierkens, Marc

    2013-04-01

    As the world's largest accessible source of freshwater, groundwater plays a vital role in satisfying the basic needs of human society. It serves as a primary source of drinking water and supplies water for agricultural and industrial activities. During times of drought, groundwater sustains water flows in streams, rivers, lakes and wetlands, and thus supports ecosystem habitat and biodiversity, while its large natural storage provides a buffer against water shortages. Yet, the current generation of global scale hydrological models does not include a groundwater flow component that is a crucial part of the hydrological cycle and allows the simulation of groundwater head dynamics. In this study we present a steady-state MODFLOW (McDonald and Harbaugh, 1988) groundwater model on the global scale at 5 arc-minutes resolution. Aquifer schematization and properties of this groundwater model were developed from available global lithological models (e.g. Dürr et al., 2005; Gleeson et al., 2010; Hartmann and Moorsdorff, in press). We force the groundwater model with the output from the large-scale hydrological model PCR-GLOBWB (van Beek et al., 2011), specifically the long-term net groundwater recharge and average surface water levels derived from routed channel discharge. We validated calculated groundwater heads and depths with available head observations from different regions, including North and South America and Western Europe. Our results show that it is feasible to build a relatively simple global scale groundwater model using existing information, and estimate water table depths within acceptable accuracy in many parts of the world.

  1. Quantum critical scaling in graphene.

    PubMed

    Sheehy, Daniel E; Schmalian, Jörg

    2007-11-30

    We show that the emergent relativistic symmetry of electrons in graphene near its quantum critical point (QCP) implies a crucial importance of the Coulomb interaction. We derive scaling laws, valid near the QCP, that dictate the nontrivial magnetic and charge response of interacting graphene. Our analysis yields numerous predictions for how the Coulomb interaction will be manifested in experimental observables such as the diamagnetic response and electronic compressibility. PMID:18233313

  2. Scaling Exponents in Financial Markets

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Kim, Cheol-Hyun; Kim, Soo Yong

    2007-03-01

    We study the dynamical behavior of four exchange rates in foreign exchange markets. A detrended fluctuation analysis (DFA) is applied to detect the long-range correlation embedded in the non-stationary time series. For our case, it is found that there exists a persistent long-range correlation in volatilities, which implies a deviation from the efficient market hypothesis. In particular, a crossover is shown to exist in the scaling behaviors of the volatilities.
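
    DFA itself is a short algorithm: integrate the mean-subtracted series, split the resulting profile into windows of length n, detrend each window with a local polynomial fit, and read the scaling exponent from the slope of log F(n) versus log n. The sketch below is a minimal first-order DFA, not the authors' implementation; the white-noise test series is synthetic.

    ```python
    # Minimal detrended fluctuation analysis (DFA-1) sketch. Illustrative only.
    import numpy as np

    def dfa_exponent(x, scales):
        y = np.cumsum(x - np.mean(x))                 # integrated (profile) series
        flucts = []
        for n in scales:
            n_seg = len(y) // n
            segs = y[:n_seg * n].reshape(n_seg, n)
            t = np.arange(n)
            f2 = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)          # local linear trend
                f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))       # RMS fluctuation F(n)
        slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return slope                                  # DFA scaling exponent alpha

    # White noise should give alpha close to 0.5 (no long-range correlation).
    rng = np.random.default_rng(2)
    print(dfa_exponent(rng.normal(size=20000), scales=[16, 32, 64, 128, 256, 512]))
    ```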

  3. Latest Developments in SLD Scaling

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Anderson, David N.

    2006-01-01

    Scaling methods have been shown previously to work well for supercooled large droplet (SLD) main ice shapes. However, feather sizes for some conditions have not been well represented by scale tests. To determine if there are fundamental differences between the development of feathers for appendix C and SLD conditions, this study used time-sequenced photographs, viewing along the span of the model during icing sprays. An airspeed of 100 kt, cloud water drop MVDs of 30 and 140 microns, and stagnation freezing fractions of 0.30 and 0.50 were tested in the NASA Glenn Icing Research Tunnel using an unswept 91-cm-chord NACA0012 airfoil model mounted at 0 deg AOA. The photos indicated that the feathers that developed in a distinct region downstream of the leading-edge ice determined the horn location and angle. The angles at which feathers grew from the surface were also measured; results are shown for an airspeed of 150 kt, an MVD of 30 microns, and stagnation freezing fractions of 0.30 to 0.60. Feather angles were found to depend strongly on the stagnation freezing fraction, and were independent of either chordwise position on the model or time into the spray. Feather angles also correlated well with horn angles. For these tests, there did not appear to be fundamental differences between the physics of SLD and appendix C icing; therefore, for these conditions similarity parameters used for appendix C scaling appear to be valid for SLD scaling as well. Further investigation into the cause for the large feather structures observed for some SLD conditions will continue.

  4. An investigation of ride quality rating scales

    NASA Technical Reports Server (NTRS)

    Dempsey, T. K.; Coates, G. D.; Leatherwood, J. D.

    1977-01-01

    An experimental investigation was conducted for the combined purposes of determining the relative merits of various category scales for the prediction of human discomfort response to vibration and for determining the mathematical relationships whereby subjective data are transformed from one scale to other scales. There were 16 category scales analyzed representing various parametric combinations of polarity, that is, unipolar and bipolar, scale type, and number of scalar points. Results indicated that unipolar continuous-type scales containing either seven or nine scalar points provide the greatest reliability and discriminability. Transformations of subjective data between category scales were found to be feasible with unipolar scales of a larger number of scalar points providing the greatest accuracy of transformation. The results contain coefficients for transformation of subjective data between the category scales investigated. A result of particular interest was that the comfort half of a bipolar scale was seldom used by subjects to describe their subjective reaction to vibration.

  5. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed. (2) To compare and record experiences gained with such tools. (3) To produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP. (4) To identify and connect groups with similar interest within HENP and the larger clustering community.

  6. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-06-01

    In many high-temperature fossil energy systems, corrosion and deleterious environmental effects arising from reactions with reactive gases and condensible products often compromise materials performance and, as a consequence, degrade operating efficiencies. Protection of materials from such reactions is best afforded by the formation of stable surface oxides (either as deposited coatings or thermally grown scales) that are slowly reacting, continuous, dense, and adherent to the substrate. However, the ability of normally brittle ceramic films and coatings to provide such protection has long been problematical, particularly for applications involving numerous or severe high-temperature thermal cycles or very aggressive (for example, sulfidizing) environments. A satisfactory understanding of how scale and coating integrity and adherence are improved by compositional, microstructural, and processing modifications is lacking. Therefore, to address this issue, the present work is intended to define the relationships between substrate characteristics (composition, microstructure, and mechanical behavior) and the structure and protective properties of deposited oxide coatings and/or thermally grown scales. Such information is crucial to the optimization of the chemical, interfacial, and mechanical properties of the protective oxides on high-temperature materials through control of processing and composition and directly supports the development of corrosion-resistant, high-temperature materials for improved energy and environmental control systems.

  7. Transition physics and scaling overview

    SciTech Connect

    Carlstrom, T.N.

    1995-12-01

    This paper presents an overview of recent experimental progress towards understanding H-mode transition physics and scaling. Terminology and techniques for studying H-mode are reviewed and discussed. The model of shear E x B flow stabilization of edge fluctuations at the L-H transition is gaining wide acceptance and is further supported by observations of edge rotation on a number of new devices. Observations of poloidal asymmetries of edge fluctuations and dephasing of density and potential fluctuations after the transition pose interesting challenges for understanding H-mode physics. Dedicated scans to determine the scaling of the power threshold have now been performed on many machines. A clear B_t dependence is universally observed, but the dependence on the line-averaged density is complicated. Other dependencies are also reported. Studies of the effect of neutrals and error fields on the power threshold are under investigation. The ITER threshold database has matured and offers guidance to the power threshold scaling issues relevant to next-step devices.

  8. A Lab-Scale CELSS

    NASA Technical Reports Server (NTRS)

    Flynn, Mark E.; Finn, Cory K.; Srinivasan, Venkatesh; Sun, Sidney; Harper, Lynn D. (Technical Monitor)

    1994-01-01

    It has been shown that prohibitive resupply costs for extended-duration manned space flight missions will demand that a high degree of recycling and in situ food production be implemented. A prime candidate for in situ food production is the growth of higher level plants. Research in the area of plant physiology is currently underway at many institutions. This research is aimed at the characterization and optimization of gas exchange, transpiration and food production of higher plants in order to support human life in space. However, there are a number of unresolved issues involved in making plant chambers an integral part of a closed life support system. For example, issues pertaining to the integration of tightly coupled, non-linear systems with small buffer volumes will need to be better understood in order to ensure successful long term operation of a Controlled Ecological Life Support System (CELSS). The Advanced Life Support Division at NASA Ames Research Center has embarked on a program to explore some of these issues and demonstrate the feasibility of the CELSS concept. The primary goal of the Laboratory Scale CELSS Project is to develop a fully-functioning integrated CELSS on a laboratory scale in order to provide insight, knowledge and experience applicable to the design of human-rated CELSS facilities. Phase I of this program involves the integration of a plant chamber with a solid waste processor. This paper will describe the requirements, design and some experimental results from Phase I of the Laboratory Scale CELSS Program.

  9. Temporal scaling in information propagation.

    PubMed

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-01-01

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers. PMID:24939414
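
    The reported power-law decay can be recovered from pairwise records by binning latencies and fitting the empirical propagation rate per bin on log-log axes. The sketch below does this on synthetic outcomes with an assumed decay exponent of 0.8; it is an illustration of the scaling estimate only, not the paper's model, prediction pipeline, or dataset.

    ```python
    # Recover a power-law decay exponent of a propagation probability p(dt) ~ dt**(-gamma)
    # from (latency, propagated?) records. Synthetic data, assumed gamma = 0.8.
    import numpy as np

    rng = np.random.default_rng(3)
    latency = rng.integers(1, 1000, 100000).astype(float)    # time since last interaction
    p_true = np.clip(0.5 * latency ** -0.8, 0, 1)
    propagated = rng.random(latency.size) < p_true           # observed binary outcomes

    bins = np.logspace(0, 3, 20)                             # logarithmic latency bins
    idx = np.digitize(latency, bins)
    rates, centers = [], []
    for k in range(1, len(bins)):
        mask = idx == k
        if mask.sum() > 50 and propagated[mask].mean() > 0:
            rates.append(propagated[mask].mean())            # empirical propagation rate
            centers.append(latency[mask].mean())             # representative latency

    gamma, _ = np.polyfit(np.log(centers), np.log(rates), 1) # log-log slope
    print(f"estimated decay exponent: {-gamma:.2f}")         # should be near 0.8
    ```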

  10. Low on the London Scale

    NASA Astrophysics Data System (ADS)

    Webb, S.

    2013-09-01

    Until relatively recently, many authors have assumed that if extraterrestrial life is discovered it will be via the discovery of extraterrestrial intelligence: we can best try to detect life by adopting the SETI approach of trying to detect beacons or artefacts. The Rio Scale, proposed by Almár and Tarter in 2000, is a tool for quantifying the potential significance for society of any such reported detection. However, improvements in technology and advances in astrobiology raise the possibility that the discovery of extraterrestrial life will instead be via the detection of atmospheric biosignatures. The London Scale, proposed by Almár in 2010, attempts to quantify the potential significance of the discovery of extraterrestrial life rather than extraterrestrial intelligence. What might be the consequences of the announcement of a discovery that ranks low on the London Scale? In other words, what might be society's reaction if 'first contact' is via the remote sensing of the byproducts of unicellular organisms rather than with the products of high intelligence? Here, I examine some possible reactions to that question; in particular, I discuss how such an announcement might affect our views of life here on Earth and of humanity's place in the universe.

  11. Flavor from the electroweak scale

    SciTech Connect

    Bauer, Martin; Carena, Marcela; Gemmler, Katrin

    2015-11-04

    We discuss the possibility that flavor hierarchies arise from the electroweak scale in a two Higgs doublet model, in which the two Higgs doublets jointly act as the flavon. Quark masses and mixing angles are explained by effective Yukawa couplings, generated by higher dimensional operators involving quarks and Higgs doublets. Modified Higgs couplings yield important effects on the production cross sections and decay rates of the light Standard Model like Higgs. In addition, flavor changing neutral currents arise at tree-level and lead to strong constraints from meson-antimeson mixing. Remarkably, flavor constraints turn out to prefer a region in parameter space that is in excellent agreement with the one preferred by recent Higgs precision measurements at the Large Hadron Collider (LHC). Direct searches for extra scalars at the LHC lead to further constraints. Precise predictions for the production and decay modes of the additional Higgs bosons are derived, and we present benchmark scenarios for searches at the LHC Run II. As a result, flavor breaking at the electroweak scale as well as strong coupling effects demand a UV completion at the scale of a few TeV, possibly within the reach of the LHC.

  12. Non-relativistic scale anomalies

    NASA Astrophysics Data System (ADS)

    Arav, Igal; Chapman, Shira; Oz, Yaron

    2016-06-01

    We extend the cohomological analysis in arXiv:1410.5831 of anisotropic Lifshitz scale anomalies. We consider non-relativistic theories with a dynamical critical exponent z = 2 with or without non-relativistic boosts and a particle number symmetry. We distinguish between cases depending on whether the time direction does or does not induce a foliation structure. We analyse both 1 + 1 and 2 + 1 spacetime dimensions. In 1 + 1 dimensions we find no scale anomalies with Galilean boost symmetries. The anomalies in 2 + 1 dimensions with Galilean boosts and a foliation structure are all B-type and are identical to the Lifshitz case in the purely spatial sector. With Galilean boosts and without a foliation structure we find also an A-type scale anomaly. There is an infinite ladder of B-type anomalies in the absence of a foliation structure with or without Galilean boosts. We discuss the relation between the existence of a foliation structure and the causality of the field theory.

  13. Flavor from the electroweak scale

    DOE PAGES

    Bauer, Martin; Carena, Marcela; Gemmler, Katrin

    2015-11-04

    We discuss the possibility that flavor hierarchies arise from the electroweak scale in a two Higgs doublet model, in which the two Higgs doublets jointly act as the flavon. Quark masses and mixing angles are explained by effective Yukawa couplings, generated by higher dimensional operators involving quarks and Higgs doublets. Modified Higgs couplings yield important effects on the production cross sections and decay rates of the light Standard Model like Higgs. In addition, flavor changing neutral currents arise at tree-level and lead to strong constraints from meson-antimeson mixing. Remarkably, flavor constraints turn out to prefer a region in parameter space that is in excellent agreement with the one preferred by recent Higgs precision measurements at the Large Hadron Collider (LHC). Direct searches for extra scalars at the LHC lead to further constraints. Precise predictions for the production and decay modes of the additional Higgs bosons are derived, and we present benchmark scenarios for searches at the LHC Run II. As a result, flavor breaking at the electroweak scale as well as strong coupling effects demand a UV completion at the scale of a few TeV, possibly within the reach of the LHC.

  14. Temporal scaling in information propagation

    NASA Astrophysics Data System (ADS)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.
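
    A minimal sketch of the power-law decay described above, not the authors' model or code; the baseline probability, decay exponent, and latency cutoff are illustrative assumptions.

```python
import numpy as np

def propagation_probability(p_base, latency_days, alpha=1.0, t_min=1.0):
    """Illustrative power-law decay of a pairwise propagation probability.

    p_base       -- baseline probability right after the latest interaction
    latency_days -- time since the latest interaction between the two users
    alpha        -- decay exponent (illustrative value, not from the paper)
    t_min        -- cutoff below which no decay is applied
    """
    t = np.maximum(np.asarray(latency_days, dtype=float), t_min)
    return p_base * (t / t_min) ** (-alpha)

# Example: probability that a message forwarded today reaches a contact
# last interacted with 1, 7, and 30 days ago.
print(propagation_probability(0.2, [1, 7, 30]))
```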

  15. Analytic theories of allometric scaling.

    PubMed

    Agutter, Paul S; Tuszynski, Jack A

    2011-04-01

    During the 13 years since it was first advanced, the fractal network theory (FNT), an analytic theory of allometric scaling, has been subjected to a wide range of methodological, mathematical and empirical criticisms, not all of which have been answered satisfactorily. FNT presumes a two-variable power-law relationship between metabolic rate and body mass. This assumption has been widely accepted in the past, but a growing body of evidence during the past quarter century has raised questions about its general validity. There is now a need for alternative theories of metabolic scaling that are consistent with empirical observations over a broad range of biological applications. In this article, we briefly review the limitations of FNT, examine the evidence that the two-variable power-law assumption is invalid, and outline alternative perspectives. In particular, we discuss quantum metabolism (QM), an analytic theory based on molecular-cellular processes. QM predicts the large variations in scaling exponent that are found empirically and also predicts the temperature dependence of the proportionality constant, issues that have eluded models such as FNT that are based on macroscopic and network properties of organisms.
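
    A small illustration of the two-variable power law B = a*M^b that FNT presumes, fitted in log-log space on synthetic data; the exponent, prefactor, and noise level here are made up for the example, not empirical values.

```python
import numpy as np

# Synthetic data following B = a * M**b with multiplicative scatter.
rng = np.random.default_rng(0)
mass = np.logspace(-3, 3, 200)                                 # body mass, kg
rate = 3.0 * mass**0.75 * rng.lognormal(0.0, 0.1, mass.size)   # metabolic rate

# Fitting a straight line in log-log space recovers the scaling exponent b;
# the large scatter seen in real data is what motivates alternatives such as QM.
b, log_a = np.polyfit(np.log(mass), np.log(rate), 1)
print(f"fitted exponent b = {b:.3f}, prefactor a = {np.exp(log_a):.3f}")
```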

  16. Developmental College Student Self-Regulation: Results from Two Measures

    ERIC Educational Resources Information Center

    Young, Dawn; Ley, Kathryn

    2005-01-01

    This study compared 34 lower-achieving (developmental) first-time college students' self-reported self-regulation strategies from a Likert scale to those they reported in structured interviews. Likert scales have offered convenient administration and evaluation and have been used to identify what and how learners study. The reported study activity…

  17. Profiles of Students' Interest in Science Issues around the World: Analysis of Data from PISA 2006

    ERIC Educational Resources Information Center

    Olsen, Rolf Vegar; Lie, Svein

    2011-01-01

    The Programme for International Student Assessment in 2006 included several measures of students' interest in science. These measures were constructed by combining information from several items where students are asked to respond to statements along Likert scale categories. Since there is evidence for Likert scales providing culturally biased…

  18. Dystonia rating scales: critique and recommendations

    PubMed Central

    Albanese, Alberto; Sorbo, Francesca Del; Comella, Cynthia; Jinnah, H.A.; Mink, Jonathan W.; Post, Bart; Vidailhet, Marie; Volkmann, Jens; Warner, Thomas T.; Leentjens, Albert F.G.; Martinez-Martin, Pablo; Stebbins, Glenn T.; Goetz, Christopher G.; Schrag, Anette

    2014-01-01

    Background Many rating scales have been applied to the evaluation of dystonia, but only a few have been assessed for clinimetric properties. The Movement Disorders Society commissioned this task force to critique existing dystonia rating scales and place them in the clinical and clinimetric context. Methods A systematic literature review was conducted to identify rating scales that have either been validated or used in dystonia. Results Thirty-six potential scales were identified. Eight were excluded because they did not meet review criteria, leaving twenty-eight scales that were critiqued and rated by the task force. Seven scales were found to meet criteria to be “recommended”: the Blepharospasm Disability Index is recommended for rating blepharospasm; the Cervical Dystonia Impact Scale and the Toronto Western Spasmodic Torticollis Rating Scale for rating cervical dystonia; the Craniocervical Dystonia Questionnaire for blepharospasm and cervical dystonia; the Voice Handicap Index (VHI) and the Vocal Performance Questionnaire (VPQ) for laryngeal dystonia; and the Fahn-Marsden Dystonia Rating Scale for rating generalized dystonia. Two “recommended” scales (VHI and VPQ) are generic scales validated on few patients with laryngeal dystonia, whereas the others are disease-specific scales. Twelve scales met criteria for “suggested” and seven scales met criteria for “listed”. All the scales are individually reviewed in the online appendix. Conclusion The task force recommends five specific dystonia scales and suggests that the two recommended generic voice-disorder scales be further validated in dystonia. Existing scales for oromandibular, arm and task-specific dystonia should be refined and fully assessed. Scales should be developed for body regions where no scales are available, such as lower limbs and trunk. PMID:23893443

  19. Coupled length scales in eroding landscapes

    SciTech Connect

    Chan, Kelvin K.; Rothman, Daniel H.

    2001-05-01

    We report results from an empirical study of the anisotropic structure of eroding landscapes. By constructing a novel correlation function, we show quantitatively that small-scale channel-like features of landscapes are coupled to the large-scale structure of drainage basins. We show additionally that this two-scale interaction is scale-dependent. The latter observation suggests that a commonly applied effective equation for erosive transport may itself depend on scale.

  20. Evaluating the impact of farm scale innovation at catchment scale

    NASA Astrophysics Data System (ADS)

    van Breda, Phelia; De Clercq, Willem; Vlok, Pieter; Querner, Erik

    2014-05-01

    Hydrological modelling lends itself to other disciplines very well, normally as a process-based system that acts as a catalogue of events taking place. These hydrological models are spatial-temporal in their design and are generally well suited for what-if situations in other disciplines. Scaling should therefore be a function of the purpose of the modelling. Process is always linked with scale or support, but the temporal resolution can affect the results if the spatial scale is not suitable. The use of hydrological response units tends to lump areas around physical features but disregards farm boundaries. Farm boundaries are often the crucial uppermost resolution needed to gain more value from hydrological modelling. In the Letaba Catchment of South Africa we find a wide variety of land uses, different models of ownership, and different farming systems ranging from large commercial farms to small-scale subsistence farming. All of these have the same basic right to water, but water distribution in the catchment is somewhat of a problem. Since water quantity is also a problem, the water supply systems need to ensure that valuable production areas are not left without water. Clearly, hydrological modelling should therefore be sensitive to specific land use. As a measure of productivity, a system of small-farmer production evaluation was designed. This activity presents a dynamic system outside hydrological modelling that is generally not considered inside hydrological modelling but depends on it. For sustainable development, a number of important concepts needed to be aligned with activities in this region, and the regulatory actions also need to be adhered to. This study aimed at aligning the activities in a region with the vision and objectives of the regulatory authorities. South Africa's system of socio-economic development planning is complex and mostly ineffective. There are many regulatory authorities involved, often with unclear

  1. Scale-up considerations: Pilot to commercial scale

    NASA Astrophysics Data System (ADS)

    Weisiger, Dan

    1996-01-01

    The success of Photovoltaic (PV) technology as a viable business enterprise depends largely on its ability to provide a competitive advantage over other current energy technologies in meeting the customers' needs. Successful commercialization of PV technology therefore requires, in part, an efficient and effective manufacturing strategy in order to ensure a superior-quality, low-cost product. Several key design considerations for process scale-up, associated with GPI's PV module manufacturing expansion project completed in spring 1994, were examined. Particular emphasis was given to product specification, process specification, process engineering design, site location selection, environmental/health/safety (EHS) factors, and plant maintenance.

  2. Acceptability of Adaptations for Struggling Writers: A National Survey with Primary-Grade Teachers

    ERIC Educational Resources Information Center

    Graham, Steve; Harris, Karen R.; Bartlett, Brendan J.; Popadopoulou, Eleni; Santoro, Julia

    2016-01-01

    One hundred twenty-five primary-grade teachers randomly selected from across the United States indicated how frequently they made 20 instructional adaptations for the struggling writers in their classroom. The measure of frequency ranged from never, several times a year, monthly, weekly, several times a week, and daily. Using a 6-point Likert-type…

  3. Derivation of physically motivated wind speed scales

    NASA Astrophysics Data System (ADS)

    Dotzek, Nikolai

    A class of new wind speed scales is proposed in which the relevant scaling factors are derived from physical quantities like mass flux density, energy density (pressure), or energy flux density. Hence, they are called Energy- or E-scales, and can be applied to wind speeds of any intensity. It is shown that the Mach scale is a special case of an E-scale. Aside from its foundation in physical quantities which allow for a calibration of the scales, the E-scale concept can help to overcome the present plethora of scales for winds in the range from gale to hurricane intensity. A procedure to convert existing data based on the Fujita-scale or other scales (Saffir-Simpson, TORRO, Beaufort) to their corresponding E-scales is outlined. Even for the large US tornado record, the workload of conversion in case of an adoption of the E-scale would in principle remain manageable (if the necessary metadata to do so were available), as primarily the F5 events would have to be re-rated. Compared to damage scales like the "Enhanced Fujita" or EF-scale concept recently implemented in the USA, the E-scales are based on first principles. They can consistently be applied all over the world for the purpose of climatological homogeneity. To account for international variations in building characteristics, one should not adapt wind speed scale thresholds to certain national building characteristics. Instead, one worldwide applicable wind speed scale based on physical principles should rather be complemented by nationally-adapted damage descriptions. The E-scale concept can provide the basis for such a standardised wind speed scale.
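
    A small sketch of the physical quantities from which the proposed E-scales are built (mass flux density, energy density, energy flux density); the sea-level air density and the example wind speeds are assumptions, and the paper's actual class boundaries are not reproduced here.

```python
# Illustrative computation of the scaling quantities underlying E-scales.
RHO_AIR = 1.225  # kg/m^3, standard-atmosphere sea-level value (assumption)

def wind_quantities(v):
    """Return mass flux density, energy density (dynamic pressure), and
    energy flux density for a wind speed v in m/s."""
    mass_flux   = RHO_AIR * v           # kg m^-2 s^-1
    energy_dens = 0.5 * RHO_AIR * v**2  # J m^-3 (equivalently Pa)
    energy_flux = 0.5 * RHO_AIR * v**3  # W m^-2
    return mass_flux, energy_dens, energy_flux

for v in (25, 50, 100):  # roughly gale, hurricane, and violent-tornado speeds
    print(v, wind_quantities(v))
```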

  4. Preliminary Scaling Estimate for Select Small Scale Mixing Demonstration Tests

    SciTech Connect

    Wells, Beric E.; Fort, James A.; Gauglitz, Phillip A.; Rector, David R.; Schonewill, Philip P.

    2013-09-12

    The Hanford Site double-shell tank (DST) system provides the staging location for waste that will be transferred to the Hanford Tank Waste Treatment and Immobilization Plant (WTP). Specific WTP acceptance criteria for waste feed delivery describe the physical and chemical characteristics of the waste that must be met before the waste is transferred from the DSTs to the WTP. One of the more challenging requirements relates to the sampling and characterization of the undissolved solids (UDS) in a waste feed DST because the waste contains solid particles that settle and their concentration and relative proportion can change during the transfer of the waste in individual batches. A key uncertainty in the waste feed delivery system is the potential variation in UDS transferred in individual batches in comparison to an initial sample used for evaluating the acceptance criteria. To address this uncertainty, a number of small-scale mixing tests have been conducted as part of Washington River Protection Solutions’ Small Scale Mixing Demonstration (SSMD) project to determine the performance of the DST mixing and sampling systems.

  5. Dimensional Review of Scales for Forensic Photography.

    PubMed

    Ferrucci, Massimiliano; Doiron, Theodore D; Thompson, Robert M; Jones, John P; Freeman, Adam J; Neiman, Janice A

    2016-03-01

    Scales for photography provide a geometrical reference in the photographic documentation of a crime scene, pattern, or item of evidence. The ABFO No. 2 Standard Reference Scale (1) is used by the forensic science community as an accurate reference scale. We investigated the overall accuracy of the major centimeter graduations, internal/external diameters of the circles, error in placement of the circle centers, and leg perpendicularity. Four vendors were selected for the scales, and the features were measured on a vision-based coordinate measurement system. The scales were well within the specified tolerance for the length graduations. After 4 years, the same scales were measured to determine what change could be measured. The scales demonstrated acceptable stability in the scale length and center-to-center measurements; however, the perpendicularity exhibited change. The study results indicate that scale quality checks using certified metal rulers are good practice. PMID:27404626

  6. Proposing a tornado watch scale

    NASA Astrophysics Data System (ADS)

    Mason, Jonathan Brock

    This thesis provides an overview of language used in tornado safety recommendations from various sources, along with developing a rubric for scaled tornado safety recommendations, and subsequent development and testing of a tornado watch scale. The rubric is used to evaluate tornado refuge/shelter adequacy responses of Tuscaloosa residents gathered following the April 27, 2011 Tuscaloosa, Alabama EF4 tornado. There was a significant difference in the counts of refuge adequacy for Tuscaloosa residents when holding the locations during the April 27th tornado constant and comparing adequacy ratings for weak (EF0-EF1), strong (EF2-EF3) and violent (EF4-EF5) tornadoes. There was also a significant difference when comparing future tornado refuge plans of those same participants to the adequacy ratings for weak, strong and violent tornadoes. The tornado refuge rubric is then revised into a six-class, hierarchical Tornado Watch Scale (TWS) from Level 0 to Level 5 based on the likelihood of high-impact or low-impact severe weather events containing weak, strong or violent tornadoes. These levels represent maximum expected tornado intensity and include tornado safety recommendations from the tornado refuge rubric. Audio recordings similar to those used in current National Oceanic and Atmospheric Administration (NOAA) weather radio communications were developed to correspond to three levels of the TWS, a current Storm Prediction Center (SPC) tornado watch and a particularly dangerous situation (PDS) tornado watch. These were then used in interviews of Alabama residents to determine how changes to the information contained in the watch statements would affect each participant's tornado safety actions and perception of event danger. Results from interview participants (n=38) indicate a strong preference (97.37%) for the TWS when compared to current tornado watch and PDS tornado watch statements. Results also show the TWS elicits more adequate safety decisions from participants

  7. Scaling on a limestone flooring

    NASA Astrophysics Data System (ADS)

    Carmona-Quiroga, P. M.; Blanco-Varela, M. T.; Martínez-Ramírez, S.

    2012-04-01

    Natural stone can be used on nearly every surface, inside and outside buildings, but decay is more commonly reported for stones exposed to aggressive outdoor conditions. This study, instead, is an example of limestone weathering of uncertain origin in the interior of a residential building. The stone, used as flooring, started to exhibit loss of material in the form of scaling. These damages were observed before the building, located in the south of Spain (Málaga), was inhabited. Moreover, according to the company, the limestone satisfies the European standards for floorings UNE-EN 1341:2002, UNE-EN 1343:2003 and UNE-EN 12058:2004. Under these circumstances the main objective of this study was to assess the causes of this phenomenon. For this reason the composition of the mortar was determined and the stone was characterized from a mineralogical and petrological point of view. The stone, a fossiliferous limestone from Egypt with natural fissure lines, is mainly composed of calcite, with quartz, kaolinite and apatite as minor phases. Samples of weathered tiles, taken directly from the building, and of unweathered tiles were examined with different spectroscopic and microscopic techniques (FTIR, micro-Raman, SEM-EDX, etc.), and a new mineralogical phase, trona, was identified in scaled areas connected with the natural veins of the stone. In fact, BSE mapping detected the presence of sodium in these veins. This soluble sodium carbonate would have been dissolved in the natural waters from which the limestone precipitated; it would then migrate with rising capillary moisture and crystallize near the surface of the stone, starting the scaling phenomenon, which in historic masonry could be very damaging. Therefore, the weathering of the limestone would be related to the hygroscopic behaviour of this salt, not to the construction methods used. This makes the limestone unsuitable for use in restoration

  8. Adopted: A practical salinity scale

    NASA Astrophysics Data System (ADS)

    The Unesco/ICES/SCOR/IAPSO Joint Panel on Oceanographic Tables and Standards has recommended the adoption of a Practical Salinity Scale, 1978, and a corresponding new International Equation of State of Seawater, 1980. A full account of the research leading to their recommendation is available in the series Unesco Technical Papers in Marine Science.The parent organizations have accepted the panel's recommendations and have set January 1, 1982, as the date when the new procedures, formulae, and tables should replace those now in use.

  9. Scale-free convection theory

    NASA Astrophysics Data System (ADS)

    Pasetto, Stefano; Chiosi, Cesare; Cropper, Mark; Grebel, Eva K.

    2015-08-01

    Convection is one of the fundamental mechanisms for transporting energy, e.g., in planetology and oceanography as well as in astrophysics, where stellar structure is customarily described by the mixing-length theory, which makes use of the mixing-length scale parameter to express the convective flux, velocity, and temperature gradients of the convective elements and stellar medium. The mixing-length scale is taken to be proportional to the local pressure scale height of the star, and the proportionality factor (the mixing-length parameter) must be determined by comparing the stellar models to some calibrator, usually the Sun. No strong arguments exist to claim that the mixing-length parameter is the same in all stars and all evolutionary phases. Because of this, all stellar models in the literature are hampered by this basic uncertainty. In a recent paper (Pasetto et al. 2014) we presented the first fully analytical scale-free theory of convection that does not require the mixing-length parameter. Our self-consistent analytical formulation of convection determines all the properties of convection as a function of the physical behaviour of the convective elements themselves and the surrounding medium (be it a star, an ocean, or a primordial planet). The new theory of convection is formulated starting from a conventional solution of the Navier-Stokes/Euler equations, i.e. the Bernoulli equation for a perfect fluid, but expressed in a non-inertial reference frame co-moving with the convective elements. In our formalism, the motion of convective cells inside convective-unstable layers is fully determined by a new system of equations for convection in a non-local and time-dependent formalism. We obtained an analytical, non-local, time-dependent solution for the convective energy transport that does not depend on any free parameter. The predictions of the new theory in astrophysical environments are compared with those from the standard mixing-length paradigm in stars with

  10. The NIST Length Scale Interferometer

    PubMed Central

    Beers, John S.; Penzes, William B.

    1999-01-01

    The National Institute of Standards and Technology (NIST) interferometer for measuring graduated length scales has been in use since 1965. It was developed in response to the redefinition of the meter in 1960 from the prototype platinum-iridium bar to the wavelength of light. The history of the interferometer is recalled, and its design and operation described. A continuous program of modernization by making physical modifications, measurement procedure changes and computational revisions is described, and the effects of these changes are evaluated. Results of a long-term measurement assurance program, the primary control on the measurement process, are presented, and improvements in measurement uncertainty are documented.

  11. Scale problem in wormhole physics

    SciTech Connect

    Kim, J. E.; Lee, K.

    1989-07-03

    Wormhole physics from the quantum theory of gravity coupled to the second-rank antisymmetric tensor or Goldstone-boson fields leads to an effective potential for these fields. The cosmological energy-density bound is shown to put an upper bound on the cosmological constant which wormhole physics can make zero. This upper bound, of order 10^11 GeV, is far smaller than the Planck scale and barely compatible with the possible cosmological constant arising from grand unified theories. In addition, the effect of wormholes on the axion for the strong CP problem is discussed.

  12. Supergroups and economies of scale.

    PubMed

    Schlossberg, Steven

    2009-02-01

    With the changing environment for medical practice, physician practice models will continue to evolve. These "supergroups" create economies of scale, but their advantage is not only in the traditional economic sense. Practices with enough size are able to better meet the challenges of medical practice with increasing regulatory demands, explosion of clinical knowledge, quality and information technology initiatives, and an increasingly tight labor market. Smaller practices can adapt some of these strategies selectively. Depending on the topic, smaller practices should think differently about how to approach the challenges of practice.

  13. New Scalings in Nuclear Fragmentation

    SciTech Connect

    Bonnet, E.; Bougault, R.; Galichet, E.; Gagnon-Moisan, F.; Guinet, D.; Lautesse, P.; Marini, P.; Parlog, M.

    2010-10-01

    Fragment partitions of fragmenting hot nuclei produced in central and semiperipheral collisions have been compared in the excitation energy region 4-10 MeV per nucleon where radial collective expansion takes place. It is shown that, for a given total excitation energy per nucleon, the amount of radial collective energy fixes the mean fragment multiplicity. It is also shown that, at a given total excitation energy per nucleon, the different properties of fragment partitions are completely determined by the reduced fragment multiplicity (i.e., normalized to the source size). Freeze-out volumes seem to play a role in the scalings observed.

  14. Drift-Scale Radionuclide Transport

    SciTech Connect

    J. Houseworth

    2004-09-22

    The purpose of this model report is to document the drift scale radionuclide transport model, taking into account the effects of emplacement drifts on flow and transport in the vicinity of the drift, which are not captured in the mountain-scale unsaturated zone (UZ) flow and transport models ''UZ Flow Models and Submodels'' (BSC 2004 [DIRS 169861]), ''Radionuclide Transport Models Under Ambient Conditions'' (BSC 2004 [DIRS 164500]), and ''Particle Tracking Model and Abstraction of Transport Process'' (BSC 2004 [DIRS 170041]). The drift scale radionuclide transport model is intended to be used as an alternative model for comparison with the engineered barrier system (EBS) radionuclide transport model ''EBS Radionuclide Transport Abstraction'' (BSC 2004 [DIRS 169868]). For that purpose, two alternative models have been developed for drift-scale radionuclide transport. One of the alternative models is a dual continuum flow and transport model called the drift shadow model. The effects of variations in the flow field and fracture-matrix interaction in the vicinity of a waste emplacement drift are investigated through sensitivity studies using the drift shadow model (Houseworth et al. 2003 [DIRS 164394]). In this model, the flow is significantly perturbed (reduced) beneath the waste emplacement drifts. However, comparisons of transport in this perturbed flow field with transport in an unperturbed flow field show similar results if the transport is initiated in the rock matrix. This has led to a second alternative model, called the fracture-matrix partitioning model, that focuses on the partitioning of radionuclide transport between the fractures and matrix upon exiting the waste emplacement drift. The fracture-matrix partitioning model computes the partitioning, between fractures and matrix, of diffusive radionuclide transport from the invert (for drifts without seepage) into the rock water. The invert is the structure constructed in a drift to provide the floor of the

  15. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1929-01-01

    Modified propeller and spinner in Full-Scale Tunnel (FST) model. On June 26, 1929, Elton W. Miller wrote to George W. Lewis proposing the construction of a model of the full-scale tunnel. 'The excellent energy ratio obtained in the new wind tunnel of the California Institute of Technology suggests that before proceeding with our full scale tunnel design, we ought to investigate the effect on energy ratio of such factors as: 1. small included angle for the exit cone; 2. carefully designed return passages of circular section as far as possible, without sudden changes in cross sections; 3. tightness of walls. It is believed that much useful information can be obtained by building a model of about 1/16 scale, that is, having a closed throat of 2 ft. by 4 ft. The outside dimensions would be about 12 ft. by 25 ft. in plan and the height 4 ft. Two propellers will be required about 28 in. in diameter, each to be driven by direct current motor at a maximum speed of 4500 R.P.M. Provision can be made for altering the length of certain portions, particularly the exit cone, and possibly for the application of boundary layer control in order to effect satisfactory air flow. This model can be constructed in a comparatively short time, using 2 by 4 framing with matched sheathing inside, and where circular sections are desired they can be obtained by nailing sheet metal to wooden ribs, which can be cut on the band saw. It is estimated that three months will be required for the construction and testing of such a model and that the cost will be approximately three thousand dollars, one thousand dollars of which will be for the motors. No suitable location appears to exist in any of our present buildings, and it may be necessary to build it outside and cover it with a roof.' George Lewis responded immediately (June 27) granting the authority to proceed. He urged Langley to expedite construction and to employ extra carpenters if necessary. Funds for the model came from the FST project

  16. Multi-scale Shock Technique

    2009-08-01

    The code to be released is a new addition to the LAMMPS molecular dynamics code. LAMMPS is developed and maintained by Sandia, is publicly available, and is used widely by both national laboratories and academics. The new addition to be released enables LAMMPS to perform molecular dynamics simulations of shock waves using the Multi-scale Shock Simulation Technique (MSST), which we have developed and which has been previously published. This technique enables molecular dynamics simulations of shock waves in materials for orders of magnitude longer timescales than the direct, commonly employed approach.
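
    A rough sketch of how a shock run using the MSST fix might be driven from the LAMMPS Python wrapper; the structure file, potential, shock velocity and fix parameters below are placeholders for illustration, not a validated input deck.

```python
# Driving an MSST shock simulation through the LAMMPS Python wrapper.
# All file names, the potential, and the numeric parameters are placeholders.
from lammps import lammps

lmp = lammps()
lmp.command("units metal")
lmp.command("read_data sample.data")          # hypothetical input structure
lmp.command("pair_style tersoff")
lmp.command("pair_coeff * * Si.tersoff Si")   # hypothetical potential file
lmp.command("velocity all create 300.0 12345")
# MSST: follow the shock Hugoniot for a wave along z at 8 km/s (80 A/ps in
# metal units); q, mu, and tscale values are illustrative only.
lmp.command("fix shock all msst z 80.0 q 200 mu 300 tscale 0.01")
lmp.command("timestep 0.001")
lmp.command("run 10000")
```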

  17. Hydrological Modeling of Continental-Scale Basins

    NASA Astrophysics Data System (ADS)

    Wood, Eric F.; Lettenmaier, Dennis; Liang, Xu; Nijssen, Bart; Wetzel, Suzanne W.

    Hydrological models at continental scales are traditionally used for water resources planning. However, continental-scale hydrological models may be useful in assessing the impacts from future climate change on catchment hydrology and water resources or from human activity on hydrology and biogeochemical cycles at large scales. Development of regional-scale terrestrial hydrological models will further our understanding of the Earth's water cycle. Continental scales allow for better understanding of the geographic distribution of land-atmospheric moisture fluxes, improved water management at continental scales, better quantification of the impact of human activity and climate change on the water cycle, and improved simulation of weather and climate.

  18. Identifying characteristic scales in the human genome

    NASA Astrophysics Data System (ADS)

    Carpena, P.; Bernaola-Galván, P.; Coronado, A. V.; Hackenberg, M.; Oliver, J. L.

    2007-03-01

    The scale-free, long-range correlations detected in DNA sequences contrast with characteristic lengths of genomic elements, being particularly incompatible with the isochores (long, homogeneous DNA segments). By computing the local behavior of the scaling exponent α of detrended fluctuation analysis (DFA), we discriminate between sequences with and without true scaling, and we find that no single scaling exists in the human genome. Instead, human chromosomes show a common compositional structure with two characteristic scales, the large one corresponding to the isochores and the other to small and medium scale genomic elements.
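
    A minimal illustration of the kind of DFA exponent estimate the abstract refers to (standard order-1 DFA yielding a single global α, not the authors' local, windowed variant); mapping a DNA sequence to a numeric ±1 walk is an assumption made here for the toy input.

```python
import numpy as np

def dfa_exponent(x, scales):
    """Standard detrended fluctuation analysis with linear detrending.

    Returns the global scaling exponent alpha from a least-squares fit of
    log F(s) against log s.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    F = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # detrend each segment with a linear fit and collect rms residuals
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
               for seg in segs]
        F.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

# Toy input: an uncorrelated purine/pyrimidine (+1/-1) walk gives alpha near
# 0.5; long-range correlated sequences give alpha > 0.5.
rng = np.random.default_rng(1)
seq = rng.choice([-1, 1], size=20000)
print(dfa_exponent(seq, scales=[16, 32, 64, 128, 256, 512]))
```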

  19. How resilient are resilience scales? The Big Five scales outperform resilience scales in predicting adjustment in adolescents.

    PubMed

    Waaktaar, Trine; Torgersen, Svenn

    2010-04-01

    This study's aim was to determine whether resilience scales could predict adjustment over and above that predicted by the five-factor model (FFM). A sample of 1,345 adolescents completed paper-and-pencil scales on FFM personality (Hierarchical Personality Inventory for Children), resilience (Ego-Resiliency Scale [ER89] by Block & Kremen, the Resilience Scale [RS] by Wagnild & Young) and adaptive behaviors (California Healthy Kids Survey, UCLA Loneliness Scale and three measures of school adaptation). The results showed that the FFM scales accounted for the highest proportion of variance in disturbance. For adaptation, the resilience scales contributed as much as the FFM. In no case did the resilience scales outperform the FFM by increasing the explained variance. The results challenge the validity of the resilience concept as an indicator of human adaptation and avoidance of disturbance, although the concept may have heuristic value in combining favorable aspects of a person's personality endowment.
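
    A minimal sketch of the incremental-validity comparison the study reports (does a resilience score add explained variance beyond the five FFM traits?), run on synthetic data; all variable names, coefficients, and sample values are assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic illustration of a two-step hierarchical regression.
rng = np.random.default_rng(2)
n = 1345
ffm = rng.normal(size=(n, 5))                                  # five FFM traits
resilience = ffm @ [0.3, 0.2, -0.1, 0.4, 0.1] + rng.normal(scale=0.5, size=n)
adjustment = ffm @ [0.5, 0.3, -0.2, 0.4, 0.2] + rng.normal(size=n)

X1 = sm.add_constant(ffm)                                       # step 1: FFM only
X2 = sm.add_constant(np.column_stack([ffm, resilience]))        # step 2: + resilience
r2_ffm = sm.OLS(adjustment, X1).fit().rsquared
r2_full = sm.OLS(adjustment, X2).fit().rsquared
print(f"R2 (FFM) = {r2_ffm:.3f}, delta R2 from resilience = {r2_full - r2_ffm:.3f}")
```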

  20. Scaling analysis of stock markets.

    PubMed

    Bu, Luping; Shang, Pengjian

    2014-06-01

    In this paper, we apply detrended fluctuation analysis (DFA), local scaling detrended fluctuation analysis (LSDFA), and detrended cross-correlation analysis (DCCA) to investigate correlations in several stock markets. DFA detects long-range correlations in time series. LSDFA reveals more local properties by using local scaling exponents. DCCA is a method developed to quantify the cross-correlation of two non-stationary time series. We report the auto-correlation and cross-correlation behaviors of three Western and three Chinese stock markets in the periods 2004-2006 (before the global financial crisis), 2007-2009 (during the global financial crisis), and 2010-2012 (after the global financial crisis) using the DFA, LSDFA, and DCCA methods. The findings are that correlations of stocks are influenced by the economic systems of different countries and by the financial crisis. The results indicate that there are stronger auto-correlations in Chinese stocks than in Western stocks in every period, and stronger auto-correlations after the global financial crisis for every stock except Shen Cheng. The LSDFA shows more comprehensive and detailed features than the traditional DFA method, as well as the integration of China with the world economy after the global financial crisis. As for cross-correlations, the six stock markets show different properties, while the three Chinese stocks reach their weakest cross-correlations during the global financial crisis.
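
    A compact sketch of the standard DCCA fluctuation function (linear detrending in non-overlapping boxes), not the authors' implementation; the toy return series and box sizes are assumptions.

```python
import numpy as np

def dcca_fluctuation(x, y, s):
    """Detrended cross-correlation fluctuation F_DCCA(s) for box size s."""
    px = np.cumsum(x - np.mean(x))
    py = np.cumsum(y - np.mean(y))
    n_seg = len(px) // s
    t = np.arange(s)
    cov = []
    for i in range(n_seg):
        sx = px[i * s:(i + 1) * s]
        sy = py[i * s:(i + 1) * s]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)   # detrended residuals
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        cov.append(np.mean(rx * ry))
    return np.sqrt(np.abs(np.mean(cov)))

# Toy example: two return series sharing a common component; the slope of
# log F_DCCA(s) vs log s estimates the cross-correlation scaling exponent.
rng = np.random.default_rng(3)
common = rng.normal(size=10000)
x = common + rng.normal(size=10000)
y = common + rng.normal(size=10000)
scales = [16, 32, 64, 128, 256]
F = [dcca_fluctuation(x, y, s) for s in scales]
print(np.polyfit(np.log(scales), np.log(F), 1)[0])
```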

  1. Impedance Scaling and Impedance Control

    NASA Astrophysics Data System (ADS)

    Chou, W.; Griffin, J.

    1997-05-01

    When a machine becomes really large, such as the Really Large Hadron Collider (RLHC) (G. W. Foster and E. Malamud, Fermilab-TM-1976, June 1996), whose circumference could reach the order of megameters, beam instability could be an essential bottleneck. This paper studies the scaling of the instability threshold vs. machine size when the coupling impedance scales in a "normal" way. It is shown that the beam would be intrinsically unstable for the RLHC. As a possible solution to this problem, it is proposed to introduce local impedance inserts for controlling the machine impedance. In the longitudinal plane, this could be done by using a heavily detuned rf cavity (e.g., a biconical structure), which could provide large imaginary impedance with the right sign (i.e., inductive or capacitive) while keeping the real part small. In the transverse direction, a carefully designed variation of the cross section of a beam pipe could generate negative impedance that would partially compensate the transverse impedance in one plane.

  2. Engineering scale electrostatic enclosure demonstration

    SciTech Connect

    Meyer, L.C.

    1993-09-01

    This report presents results from an engineering scale electrostatic enclosure demonstration test. The electrostatic enclosure is part of an overall in-depth contamination control strategy for transuranic (TRU) waste recovery operations. TRU contaminants include small particles of plutonium compounds associated with defense-related waste recovery operations. Demonstration test items consisted of an outer Perma-con enclosure, an inner tent enclosure, and a ventilation system test section for testing electrostatic curtain devices. Three interchangeable test fixtures that could remove plutonium from the contaminated dust were tested in the test section. These were an electret filter, a CRT as an electrostatic field source, and an electrically charged parallel plate separator. Enclosure materials tested included polyethylene, anti-static construction fabric, and stainless steel. The soil size distribution was determined using an eight stage cascade impactor. Photographs of particles containing plutonium were obtained with a scanning electron microscope (SEM). The SEM also provided a second method of getting the size distribution. The amount of plutonium removed from the aerosol by the electrostatic devices was determined by radiochemistry from input and output aerosol samplers. The inner and outer enclosures performed adequately for plutonium handling operations and could be used for full scale operations.

  3. The Kirby-Desai Scale

    PubMed Central

    Desai, Alpesh; Desai, Tejas; Kartono, Francisca; Geeta, Patel

    2009-01-01

    Background: As tattoos have become increasingly popular in the Western world, tattoo-removal requests have also increased, as patients’ personal identities advance. Laser tattoo removal is the current treatment of choice given its safety and efficacy. However, due to varying types of tattoos, it has been difficult to quantify the number of laser treatments required with certainty when discussing laser tattoo removal with our patients. Objective: To propose a practical numerical scale to assess the number of laser tattoo-removal treatments necessary to achieve satisfactory results. Methods and materials: A retrospective chart review was performed on 100 clinic patients who presented for laser tattoo removal. An algorithm was proposed to assign a numerical score to each tattoo across six different categories (skin type, location, color, amount of ink, scarring, and layering). The cumulative score (Kirby-Desai score) is proposed to correlate with the number of treatment sessions required for satisfactory tattoo removal. Results: A correlation coefficient of 0.757 was achieved, with satisfactory tattoo removal in all subjects (N=100, p<0.001). Conclusion: We propose the Kirby-Desai scale as a practical tool to assess the number of laser tattoo-removal sessions required, which will translate into a more certain cost calculation for the patient. PMID:20729941
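
    A minimal sketch of the scoring structure described above: one numeric rating per category, summed into a cumulative score that tracks the expected number of removal sessions. The point values in the example are hypothetical placeholders, not the published Kirby-Desai assignments.

```python
# Kirby-Desai-style cumulative score: the six category ratings are summed.
# The example ratings below are hypothetical, NOT the published point values.
def cumulative_score(skin_type, location, color, ink_amount, scarring, layering):
    return skin_type + location + color + ink_amount + scarring + layering

# Hypothetical patient: each argument is the rating already assigned for that
# category on the clinic's worksheet.
score = cumulative_score(skin_type=2, location=1, color=2.5,
                         ink_amount=2, scarring=1, layering=0)
print(f"cumulative score (proxy for expected sessions): {score}")
```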

  4. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Wing and nacelle set-up in Full-Scale Tunnel (FST). The NACA conducted drag tests in 1931 on a P3M-1 nacelle which were presented in a special report to the Navy. Smith DeFrance described this work in the report's introduction: 'Tests were conducted in the full-scale wind tunnel on a five to four geared Pratt and Whitney Wasp engine mounted in a P3M-1 nacelle. In order to simulate the flight conditions the nacelle was assembled on a 15-foot span of wing from the same airplane. The purpose of the tests was to improve the cooling of the engine and to reduce the drag of the nacelle combination. Thermocouples were installed at various points on the cylinders and temperature readings were obtained from these by the power plants division. These results will be reported in a memorandum by that division. The drag results, which are covered by this memorandum, were obtained with the original nacelle condition as received from the Navy with the tail of the nacelle modified, with the nose section of the nacelle modified, with a Curtiss anti-drag ring attached to the engine, with a Type G ring developed by the N.A.C.A., and with a Type D cowling which was also developed by the N.A.C.A.' (p. 1)

  5. Scaling analysis of stock markets.

    PubMed

    Bu, Luping; Shang, Pengjian

    2014-06-01

    In this paper, we apply detrended fluctuation analysis (DFA), local scaling detrended fluctuation analysis (LSDFA), and detrended cross-correlation analysis (DCCA) to investigate correlations in several stock markets. DFA detects long-range correlations in time series. LSDFA reveals more local properties by using local scaling exponents. DCCA is a method developed to quantify the cross-correlation of two non-stationary time series. We report the auto-correlation and cross-correlation behaviors of three Western and three Chinese stock markets in the periods 2004-2006 (before the global financial crisis), 2007-2009 (during the global financial crisis), and 2010-2012 (after the global financial crisis) using the DFA, LSDFA, and DCCA methods. The findings are that correlations of stocks are influenced by the economic systems of different countries and by the financial crisis. The results indicate that there are stronger auto-correlations in Chinese stocks than in Western stocks in every period, and stronger auto-correlations after the global financial crisis for every stock except Shen Cheng. The LSDFA shows more comprehensive and detailed features than the traditional DFA method, as well as the integration of China with the world economy after the global financial crisis. As for cross-correlations, the six stock markets show different properties, while the three Chinese stocks reach their weakest cross-correlations during the global financial crisis. PMID:24985421

  6. Scaling analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Bu, Luping; Shang, Pengjian

    2014-06-01

    In this paper, we apply detrended fluctuation analysis (DFA), local scaling detrended fluctuation analysis (LSDFA), and detrended cross-correlation analysis (DCCA) to investigate correlations in several stock markets. DFA detects long-range correlations in time series. LSDFA reveals more local properties by using local scaling exponents. DCCA is a method developed to quantify the cross-correlation of two non-stationary time series. We report the auto-correlation and cross-correlation behaviors of three Western and three Chinese stock markets in the periods 2004-2006 (before the global financial crisis), 2007-2009 (during the global financial crisis), and 2010-2012 (after the global financial crisis) using the DFA, LSDFA, and DCCA methods. The findings are that correlations of stocks are influenced by the economic systems of different countries and by the financial crisis. The results indicate that there are stronger auto-correlations in Chinese stocks than in Western stocks in every period, and stronger auto-correlations after the global financial crisis for every stock except Shen Cheng. The LSDFA shows more comprehensive and detailed features than the traditional DFA method, as well as the integration of China with the world economy after the global financial crisis. As for cross-correlations, the six stock markets show different properties, while the three Chinese stocks reach their weakest cross-correlations during the global financial crisis.

  7. The weak scale from BBN

    NASA Astrophysics Data System (ADS)

    Hall, Lawrence J.; Pinner, David; Ruderman, Joshua T.

    2014-12-01

    The measured values of the weak scale, v, and the first-generation masses, m_{u,d,e}, are simultaneously explained in the multiverse, with all these parameters scanning independently. At the same time, several remarkable coincidences are understood. Small variations in these parameters away from their measured values lead to the instability of hydrogen, the instability of heavy nuclei, and either a hydrogen- or a helium-dominated universe from Big Bang Nucleosynthesis. In the 4d parameter space of (m_u, m_d, m_e, v), catastrophic boundaries are reached by separately increasing each parameter above its measured value by a factor of (1.4, 1.3, 2.5, ~5), respectively. The fine-tuning problem of the weak scale in the Standard Model is solved: as v is increased beyond the observed value, it is impossible to maintain a significant cosmological hydrogen abundance for any values of m_{u,d,e} that yield both hydrogen and heavy nuclei stability.

  8. The Autonomy Over Smoking Scale.

    PubMed

    DiFranza, Joseph R; Wellman, Robert J; Ursprung, W W Sanouri A; Sabiston, Catherine

    2009-12-01

    Our goal was to create an instrument that can be used to study how smokers lose autonomy over smoking and regain it after quitting. The Autonomy Over Smoking Scale was produced through a process involving item generation, focus-group evaluation, testing in adults to winnow items, field testing with adults and adolescents, and head-to-head comparisons with other measures. The final 12-item scale shows excellent reliability (alphas = .91-.97), with a one-factor solution explaining 59% of the variance in adults and 61%-74% of the variance in adolescents. Concurrent validity was supported by associations with age of smoking initiation, lifetime use, smoking frequency, daily cigarette consumption, history of failed cessation, Hooked on Nicotine Checklist scores, and Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000) nicotine dependence criteria. Potentially useful features of this new instrument include (a) it assesses tobacco withdrawal, cue-induced craving, and psychological dependence on cigarettes; (b) it measures symptom intensity; and (c) it asks about current symptoms only, so it could be administered to quitting smokers to track the resolution of symptoms.

  9. Scaling characteristics of topographic depressions

    NASA Astrophysics Data System (ADS)

    Le, P. V.; Kumar, P.

    2013-12-01

    Topographic depressions, areas of no lateral surface flow, are a ubiquitous characteristic of the land surface and control many ecosystem and biogeochemical processes. Landscapes with a high density of depressions have a larger surface storage capacity, whereas a lower depression density increases runoff, thus influencing soil moisture states, hydrologic connectivity and climate-soil-vegetation interactions. With the widespread availability of high-resolution LiDAR-based digital elevation model (lDEM) data, it is now possible to identify and characterize the structure of the spatial distribution of topographic depressions for incorporation in ecohydrologic and biogeochemical studies. Here we use lDEM data to document the prevalence and patterns of topographic depressions across five different landscapes in the United States and to quantitatively characterize the distribution of attributes such as surface area, storage volume, and the distance to the nearest neighbor. Through the use of a depression identification algorithm, we show that these distribution attributes follow scaling laws indicative of a fractal structure, in which a large fraction of the land surface can consist of a high number of topographic depressions, accounting for 4 to 200 mm of depression storage. This implies that the impacts of small-scale topographic depressions in these fractal landscapes on the redistribution of surface energy fluxes, evaporation, and hydrologic connectivity are quite significant.
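
    A sketch of one way to identify and measure depressions on a gridded DEM (morphological reconstruction to fill pits, then labelling); this is not the authors' algorithm, and the toy DEM, cell size, and units are assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import reconstruction

def depression_stats(dem):
    """Identify depressions as fill-minus-original depth, then label and
    measure them. Cell area is treated as 1 unit here (assumption)."""
    # Fill depressions: reconstruct by erosion from a seed raised everywhere
    # except the border, so "water" can only drain through the DEM edges.
    seed = dem.copy()
    seed[1:-1, 1:-1] = dem.max()
    filled = reconstruction(seed, dem, method='erosion')
    depth = filled - dem                          # ponded depth per cell
    labels, n = ndimage.label(depth > 0)          # one label per depression
    areas = ndimage.sum(np.ones_like(depth), labels, index=range(1, n + 1))
    volumes = ndimage.sum(depth, labels, index=range(1, n + 1))
    return n, areas, volumes

# Toy DEM: background noise creates many small depressions, plus two larger
# synthetic pits; real use would load a LiDAR-derived DEM instead.
rng = np.random.default_rng(4)
dem = rng.normal(100.0, 0.1, size=(200, 200))
dem[50:55, 50:55] -= 2.0
dem[120:140, 80:100] -= 1.0
n, areas, volumes = depression_stats(dem)
print(n, areas[:5], volumes[:5])
```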

  10. Scaling device for photographic images

    NASA Technical Reports Server (NTRS)

    Rivera, Jorge E. (Inventor); Youngquist, Robert C. (Inventor); Cox, Robert B. (Inventor); Haskell, William D. (Inventor); Stevenson, Charles G. (Inventor)

    2005-01-01

    A scaling device projects a known optical pattern into the field of view of a camera, which can be employed as a reference scale in a resulting photograph of a remote object, for example. The device comprises an optical beam projector that projects two or more spaced, parallel optical beams onto a surface of a remotely located object to be photographed. The resulting beam spots or lines on the object are spaced from one another by a known, predetermined distance. As a result, the size of other objects or features in the photograph can be determined through comparison of their size to the known distance between the beam spots. Preferably, the device is a small, battery-powered device that can be attached to a camera and employs one or more laser light sources and associated optics to generate the parallel light beams. In a first embodiment of the invention, a single laser light source is employed, but multiple parallel beams are generated thereby through use of beam splitting optics. In another embodiment, multiple individual laser light sources are employed that are mounted in the device parallel to one another to generate the multiple parallel beams.
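
    The measurement principle reduces to a simple proportion, sketched below; the 50 mm beam spacing and the pixel values are illustrative assumptions, not the device's specification.

```python
# Two projected laser spots a known distance apart serve as a reference scale:
# features in the same plane are measured by proportion.
def measure(feature_px, spot_separation_px, spot_separation_mm=50.0):
    """Estimate a feature's real size from its length in pixels, given the
    pixel distance between the two beam spots and their known physical
    separation (50 mm is an assumed spacing)."""
    mm_per_px = spot_separation_mm / spot_separation_px
    return feature_px * mm_per_px

# Example: spots 212 px apart in the photo; a feature spanning 530 px is
# therefore about 125 mm long.
print(measure(feature_px=530, spot_separation_px=212))
```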

  11. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  12. Goethite Bench-scale and Large-scale Preparation Tests

    SciTech Connect

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

    The Hanford Waste Treatment and Immobilization Plant (WTP) is the keystone for cleanup of high-level radioactive waste from our nation's nuclear defense program. The WTP will process high-level waste from the Hanford tanks and produce immobilized high-level waste glass for disposal at a national repository, low activity waste (LAW) glass, and liquid effluent from the vitrification off-gas scrubbers. The liquid effluent will be stabilized into a secondary waste form (e.g. grout-like material) and disposed on the Hanford site in the Integrated Disposal Facility (IDF) along with the low-activity waste glass. The major long-term environmental impact at Hanford results from technetium that volatilizes from the WTP melters and finally resides in the secondary waste. Laboratory studies have indicated that pertechnetate (⁹⁹TcO₄⁻) can be reduced and captured into a solid solution of α-FeOOH, goethite (Um 2010). Goethite is a stable mineral and can significantly retard the release of technetium to the environment from the IDF. The laboratory studies were conducted using reaction times of many days, which is typical of environmental subsurface reactions that were the genesis of this new process. This study was the first step in considering adaptation of the slow laboratory steps to a larger-scale and faster process that could be conducted either within the WTP or within the effluent treatment facility (ETF). Two levels of scale-up tests were conducted (25x and 400x). The largest scale-up produced slurries of Fe-rich precipitates that contained rhenium as a nonradioactive surrogate for ⁹⁹Tc. The slurries were used in melter tests at Vitreous State Laboratory (VSL) to determine whether captured rhenium was less volatile in the vitrification process than rhenium in an unmodified feed. A critical step in the technetium immobilization process is to chemically reduce Tc(VII) in the pertechnetate (TcO₄⁻) to Tc(IV) by reaction with the ferrous

  13. The positive mental health instrument: development and validation of a culturally relevant scale in a multi-ethnic asian population

    PubMed Central

    2011-01-01

    Background: Instruments to measure mental health and well-being are largely developed and often used within Western populations, and this compromises their validity in other cultures. A previous qualitative study in Singapore demonstrated the relevance of spiritual and religious practices to mental health, a dimension currently not included in existing multi-dimensional measures. The objective of this study was to develop a self-administered measure that covers all key and culturally appropriate domains of mental health and can be applied to compare levels of mental health across different age, gender and ethnic groups. We present the item reduction and validation of the Positive Mental Health (PMH) instrument in a community-based adult sample in Singapore. Methods: Surveys were conducted among adult (21-65 years) residents belonging to Chinese, Malay and Indian ethnicities. Exploratory and confirmatory factor analyses (EFA, CFA) were conducted, and items were reduced using item response theory (IRT) tests. The final version of the PMH instrument was tested for internal consistency and criterion validity. Items were tested for differential item functioning (DIF) to check whether items functioned in the same way across all subgroups. Results: EFA and CFA identified a structure of six first-order factors (General coping, Personal growth and autonomy, Spirituality, Interpersonal skills, Emotional support, and Global affect) under one higher-order dimension of Positive Mental Health (RMSEA = 0.05, CFI = 0.96, TLI = 0.96). A 47-item self-administered multi-dimensional instrument with a six-point Likert response scale was constructed. The slope estimates (the strength of the relation to theta) for all items in each of the six PMH subscales were high (range: 1.39 to 5.69), suggesting good discrimination properties. The threshold estimates for the instrument ranged from -3.45 to 1.61, indicating that the instrument covers the entire spectrum of the six dimensions. The instrument demonstrated
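
    A minimal sketch of the exploratory step (a six-factor solution with varimax rotation) on synthetic item responses; only the 47-item length, the 6-point coding, and the six-factor target follow the abstract, everything else is made up.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Fake respondents x items matrix with an underlying six-factor structure.
rng = np.random.default_rng(5)
n_respondents, n_items = 500, 47
latent = rng.normal(size=(n_respondents, 6))
loadings = rng.normal(scale=0.6, size=(6, n_items))
responses = latent @ loadings + rng.normal(size=(n_respondents, n_items))
responses = np.clip(np.round(responses + 3.5), 1, 6)   # fake 1-6 Likert codes

# Extract six rotated factors; fa.components_ is the (6, 47) loading matrix.
fa = FactorAnalysis(n_components=6, rotation="varimax")
fa.fit(responses)
print(fa.components_.shape)
```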

  14. SCALE-UP OF RAPID SMALL-SCALE ADSORPTION TESTS TO FIELD-SCALE ADSORBERS: THEORETICAL AND EXPERIMENTAL BASIS

    EPA Science Inventory

    Design of full-scale adsorption systems typically includes expensive and time-consuming pilot studies to simulate full-scale adsorber performance. Accordingly, the rapid small-scale column test (RSSCT) was developed and evaluated experimentally. The RSSCT can simulate months of f...

  15. THE TRANSLATION, VALIDATION AND CULTURAL ADAPTATION OF FUNCTIONAL ASSESSMENT OF CHRONIC ILLNESS THERAPY - SPIRITUAL WELL-BEING 12 (FACIT-SP12) SCALE IN GREEK LANGUAGE

    PubMed Central

    Fradelos, Evangelos C.; Tzavella, Foteini; Koukia, Evmorfia; Tsaras, Konstantinos; Papathanasiou, Ioanna V.; Aroni, Adamantia; Alikari, Victoria; Ralli, Maria; Bredle, Jason; Zyga, Sofia

    2016-01-01

    Background: According to the World Health Organization (WHO), spirituality is an important domain of quality of life, especially in terminal, life-threatening chronic diseases. For many people spirituality and religion are not just very important dimensions of their existence, but also a source of support that contributes to wellbeing and coping with the everyday difficulties of life. Aim: The aim of the study was to translate the FACIT Spiritual Well-Being Scale (FACIT-Sp12) into Greek and to validate the scale for the Greek population. Material and Methods: The FACIT-Sp12 is an anonymous self-administered questionnaire that contains twelve closed questions, each answered on a five-point Likert scale (0=Not at all, 1=A little bit, 2=Somewhat, 3=Quite a bit, 4=Very much). The questionnaire was translated into Greek and then back-translated into English in order to check for any inconsistencies. The study sample was 183 chronic kidney disease patients undergoing hemodialysis. Exploratory factor analysis with principal components analysis and Varimax rotation was performed to check the construct validity of the questionnaire. The test–retest reliability and the internal consistency were also examined. Statistical analysis was performed with SPSS 21.0. The statistical significance level was set at p=0.05. Results: The final Greek version of the questionnaire includes all twelve questions. The mean age of the participants was 61.81±13.9 years. Three factors were extracted from the statistical analysis. The Cronbach's α coefficient was 0.77 for the total questionnaire; for the subscales it was 0.70 for “meaning”, 0.73 for “peace” and 0.87 for “faith”. Among the three subscales, “meaning” had the highest score (mean 12.49, SD=2.865). Conclusions: The FACIT Spiritual Well-Being Scale (FACIT-Sp12) is a valuable and reliable three-dimensional questionnaire that can be used for assessing spirituality and spiritual wellbeing
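
    A small sketch of subscale scoring and Cronbach's alpha of the kind reported above; the item-to-subscale mapping and the random responses are assumptions, so the printed alphas will be near zero rather than the published values.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical FACIT-Sp12-style data: 12 items scored 0-4, with the three
# 4-item subscales assumed to be items 0-3, 4-7, 8-11 (an assumption made
# here; random responses give near-zero alphas, unlike real correlated items).
rng = np.random.default_rng(6)
data = rng.integers(0, 5, size=(183, 12))
for name, idx in [("meaning", slice(0, 4)), ("peace", slice(4, 8)),
                  ("faith", slice(8, 12))]:
    sub = data[:, idx]
    print(name, "mean score:", sub.sum(axis=1).mean(),
          "alpha:", round(cronbach_alpha(sub), 2))
print("total alpha:", round(cronbach_alpha(data), 2))
```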

  16. The Adaptive Multi-scale Simulation Infrastructure

    SciTech Connect

    Tobin, William R.

    2015-09-01

    The Adaptive Multi-scale Simulation Infrastructure (AMSI) is a set of libraries and tools developed to support the development, implementation, and execution of general multimodel simulations. Using a minimal set of simulation meta-data, AMSI allows existing single-scale simulations to be adapted for use in multi-scale simulations with minimally intrusive changes. Support for dynamic runtime operations, such as single- and multi-scale adaptive properties, is a key focus of AMSI. Particular effort has been spent on the development of scale-sensitive load-balancing operations, which allow single-scale simulations incorporated into a multi-scale simulation through AMSI to use standard load-balancing operations without affecting the integrity of the overall multi-scale simulation.

  17. A Three Component Cancer Attitude Scale.

    ERIC Educational Resources Information Center

    Torabi, Mohammad R.; Seffrin, John R.

    1986-01-01

    A scale was developed to measure college students' attitudes toward cancer and cancer prevention. The three components of attitude were feeling (affective), belief (cognitive), and intention to act (conative). Development of the scale is discussed. (DF)

  18. Scaling ansatz for the jamming transition.

    PubMed

    Goodrich, Carl P; Liu, Andrea J; Sethna, James P

    2016-08-30

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming. PMID:27512041
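
    For readers unfamiliar with a Widom-like ansatz, the schematic form below shows the general idea: a singular quantity is written as a power of one scaling field times a universal function of a rescaled second field. The fields, exponents, and notation here are generic placeholders and are not the specific ansatz used by Goodrich, Liu, and Sethna.

```latex
% Schematic two-field Widom scaling form: f depends on fields x and y only
% through one exponent alpha, one crossover exponent Delta, and a single
% scaling function F.
\[
  f(x, y) \;=\; |x|^{\alpha}\,
  \mathcal{F}\!\left( \frac{y}{|x|^{\Delta}} \right)
\]
% Data-collapse test: plotting f / |x|^{alpha} against y / |x|^{Delta} should
% place measurements at many (x, y) onto a single curve if the ansatz holds.
```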

  19. Scaling ansatz for the jamming transition

    NASA Astrophysics Data System (ADS)

    Goodrich, Carl P.; Liu, Andrea J.; Sethna, James P.

    2016-08-01

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming.

  20. Small-Scale Rocket Motor Test

    NASA Video Gallery

    Engineers at NASA's Marshall Space Flight Center in Huntsville, Ala. successfully tested a sub-scale solid rocket motor on May 27. Testing a sub-scale version of a rocket motor is a cost-effective ...

  1. Scaling ansatz for the jamming transition

    PubMed Central

    Goodrich, Carl P.; Liu, Andrea J.; Sethna, James P.

    2016-01-01

    We propose a Widom-like scaling ansatz for the critical jamming transition. Our ansatz for the elastic energy shows that the scaling of the energy, compressive strain, shear strain, system size, pressure, shear stress, bulk modulus, and shear modulus are all related to each other via scaling relations, with only three independent scaling exponents. We extract the values of these exponents from already known numerical or theoretical results, and we numerically verify the resulting predictions of the scaling theory for the energy and residual shear stress. We also derive a scaling relation between pressure and residual shear stress that yields insight into why the shear and bulk moduli scale differently. Our theory shows that the jamming transition exhibits an emergent scale invariance, setting the stage for the potential development of a renormalization group theory for jamming. PMID:27512041

  2. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  3. Scale Interaction in a California precipitation event

    SciTech Connect

    Leach, M. J., LLNL

    1997-09-01

    Heavy rains and severe flooding frequently plague California. The heavy rains are most often associated with large-scale cyclonic and frontal systems, where large-scale dynamics and a large moisture influx from the tropical Pacific interact. However, the complex topography along the west coast also interacts with the large-scale influences, producing local areas of heavier precipitation. In this paper, we look at some of the local interactions with the large scale.

  4. Strongly scale-dependent non-Gaussianity

    SciTech Connect

    Riotto, Antonio; Sloth, Martin S.

    2011-02-15

    We discuss models of primordial density perturbations where the non-Gaussianity is strongly scale dependent. In particular, the non-Gaussianity may have a sharp cutoff and be very suppressed on large cosmological scales, but sizable on small scales. This may have an impact on probes of non-Gaussianity in the large-scale structure and in the cosmic microwave background radiation anisotropies.

  5. The Scaled Thermal Explosion Experiment

    SciTech Connect

    Wardell, J F; Maienschein, J L

    2002-07-05

    We have developed the Scaled Thermal Explosion Experiment (STEX) to provide a database of reaction violence from thermal explosion for explosives of interest. Such data are needed to develop, calibrate, and validate predictive capability for thermal explosions using simulation computer codes. A cylinder of explosive 25, 50 or 100 mm in diameter, is confined in a steel cylinder with heavy end caps, and heated under controlled conditions until reaction. Reaction violence is quantified through non-contact micropower impulse radar measurements of the cylinder wall velocity and by strain gauge data at reaction onset. Here we describe the test concept, design and diagnostic recording, and report results with HMX- and RDX-based energetic materials.

  6. Hypoallometric scaling in international collaborations

    NASA Astrophysics Data System (ADS)

    Hsiehchen, David; Espinoza, Magdalena; Hsieh, Antony

    2016-02-01

    Collaboration is a vital process and dominant theme in knowledge production, although the effectiveness of policies directed at promoting multinational research remains ambiguous. We examined approximately 24 million research articles published over four decades and demonstrated that the scaling of international publications to research productivity for each country obeys a universal and conserved sublinear power law. Inefficient mechanisms in transborder team dynamics or organization as well as increasing opportunity costs may contribute to the disproportionate growth of international collaboration rates with increasing productivity among nations. Given the constrained growth of international relationships, our findings advocate a greater emphasis on the qualitative aspects of collaborations, such as with whom partnerships are forged, particularly when assessing research and policy outcomes.

  7. Enabling department-scale supercomputing

    SciTech Connect

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  8. Bacterial Communities: Interactions to Scale.

    PubMed

    Stubbendieck, Reed M; Vargas-Bautista, Carol; Straight, Paul D

    2016-01-01

    In the environment, bacteria live in complex multispecies communities. These communities span in scale from small, multicellular aggregates to billions or trillions of cells within the gastrointestinal tract of animals. The dynamics of bacterial communities are determined by pairwise interactions that occur between different species in the community. Though interactions occur between a few cells at a time, the outcomes of these interchanges have ramifications that ripple through many orders of magnitude, and ultimately affect the macroscopic world including the health of host organisms. In this review we cover how bacterial competition influences the structures of bacterial communities. We also emphasize methods and insights garnered from culture-dependent pairwise interaction studies, metagenomic analyses, and modeling experiments. Finally, we argue that the integration of multiple approaches will be instrumental to future understanding of the underlying dynamics of bacterial communities. PMID:27551280

  9. Modeling biosilicification at subcellular scales.

    PubMed

    Javaheri, Narjes; Cronemberger, Carolina M; Kaandorp, Jaap A

    2013-01-01

    Biosilicification occurs in many organisms; sponges and diatoms are major examples. In this chapter, we introduce a modeling approach that describes several biological mechanisms controlling silicification. Modeling biosilicification is a typical multiscale problem where processes at very different temporal and spatial scales need to be coupled: processes at the molecular level, physiological processes at the subcellular and cellular level, etc. In biosilicification morphology plays a fundamental role, and a spatiotemporal model is required. In the case of sponges, a particle simulation based on diffusion-limited aggregation is presented here. This model can describe the fractal properties of silica aggregates in the first steps of deposition on an organic template. In the case of diatoms, a reaction-diffusion model is introduced which can describe the concentrations of chemical components and can include a chain of polymerization reactions. PMID:24420712

  10. Small Scale High Speed Turbomachinery

    NASA Technical Reports Server (NTRS)

    London, Adam P. (Inventor); Droppers, Lloyd J. (Inventor); Lehman, Matthew K. (Inventor); Mehra, Amitav (Inventor)

    2015-01-01

    A small scale, high speed turbomachine is described, as well as a process for manufacturing the turbomachine. The turbomachine is manufactured by diffusion bonding stacked sheets of metal foil, each of which has been pre-formed to correspond to a cross section of the turbomachine structure. The turbomachines include rotating elements as well as static structures. Using this process, turbomachines may be manufactured with rotating elements that have outer diameters of less than four inches in size, and/or blading heights of less than 0.1 inches. The rotating elements of the turbomachines are capable of rotating at speeds in excess of 150 feet per second. In addition, cooling features may be added internally to blading to facilitate cooling in high temperature operations.

  11. Anisotropic scaling of magnetohydrodynamic turbulence.

    PubMed

    Horbury, Timothy S; Forman, Miriam; Oughton, Sean

    2008-10-24

    We present a quantitative estimate of the anisotropic power and scaling of magnetic field fluctuations in inertial range magnetohydrodynamic turbulence, using a novel wavelet technique applied to spacecraft measurements in the solar wind. We show for the first time that, when the local magnetic field direction is parallel to the flow, the spacecraft-frame spectrum has a spectral index near 2. This can be interpreted as the signature of a population of fluctuations in field-parallel wave numbers with a k∥^(-2) spectrum, but is also consistent with the presence of a "critical balance" style turbulent cascade. We also find, in common with previous studies, that most of the power is contained in wave vectors at large angles to the local magnetic field and that this component of the turbulence has a spectral index of 5/3.

  12. Size Scaling of Static Friction

    NASA Astrophysics Data System (ADS)

    Braun, O. M.; Manini, Nicola; Tosatti, Erio

    2013-02-01

    Sliding friction across a thin soft lubricant film typically occurs by stick slip, the lubricant fully solidifying at stick, yielding and flowing at slip. The static friction force per unit area preceding slip is known from molecular dynamics (MD) simulations to decrease with increasing contact area. That makes the large-size fate of stick slip unclear and unknown; its possible vanishing is important as it would herald smooth sliding with a dramatic drop of kinetic friction at large size. Here we formulate a scaling law of the static friction force, which for a soft lubricant is predicted to decrease as f_m + Δf/A^γ for increasing contact area A, with γ > 0. Our main finding is that the value of f_m, controlling the survival of stick slip at large size, can be evaluated by simulations of comparably small size. MD simulations of soft lubricant sliding are presented, which verify this theory.
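
    A minimal sketch of how the quoted scaling form could be fitted to data: nonlinear least squares for f(A) = f_m + Δf/A^γ. The friction values and areas below are invented placeholders, not the MD results from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def static_friction(A, f_m, delta_f, gamma):
    """Scaling form quoted in the abstract: f(A) = f_m + delta_f / A**gamma."""
    return f_m + delta_f / A**gamma

# Hypothetical per-area static friction values at several contact areas.
A = np.array([1e2, 3e2, 1e3, 3e3, 1e4, 3e4])
f = np.array([0.42, 0.35, 0.30, 0.27, 0.25, 0.24])

popt, _ = curve_fit(static_friction, A, f, p0=[0.2, 1.0, 0.3])
f_m, delta_f, gamma = popt
print(f"f_m ~ {f_m:.3f}, delta_f ~ {delta_f:.2f}, gamma ~ {gamma:.2f}")
```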

  13. Scale control in urea solutions

    SciTech Connect

    Dubin, L.; Diep, D.V.

    1997-08-01

    Legislation to control NOx emissions, one cause of acid rain and ozone-induced smog, has created an impetus to reduce such emissions. Selective Non-Catalytic Reduction (SNCR) using urea chemistry is utilized to control NOx emissions from boilers, municipal waste incinerators, refinery furnaces, recovery boilers, utilities and other stationary combustion sources. Control requires injecting urea-based solutions into the flue gas at specified temperatures. Urea solutions accelerate CaCO3 precipitation in industrial waters used for dilution, and thereby interfere with proper application of the urea solution. The negative effect of urea solutions on hardness stability is discussed, as well as how CaCO3 precipitation in urea solution can be controlled by suitable scale inhibitors.

  14. Bacterial Communities: Interactions to Scale

    PubMed Central

    Stubbendieck, Reed M.; Vargas-Bautista, Carol; Straight, Paul D.

    2016-01-01

    In the environment, bacteria live in complex multispecies communities. These communities span in scale from small, multicellular aggregates to billions or trillions of cells within the gastrointestinal tract of animals. The dynamics of bacterial communities are determined by pairwise interactions that occur between different species in the community. Though interactions occur between a few cells at a time, the outcomes of these interchanges have ramifications that ripple through many orders of magnitude, and ultimately affect the macroscopic world including the health of host organisms. In this review we cover how bacterial competition influences the structures of bacterial communities. We also emphasize methods and insights garnered from culture-dependent pairwise interaction studies, metagenomic analyses, and modeling experiments. Finally, we argue that the integration of multiple approaches will be instrumental to future understanding of the underlying dynamics of bacterial communities. PMID:27551280

  15. Quantitative Scaling of Magnetic Avalanches.

    PubMed

    Durin, G; Bohn, F; Corrêa, M A; Sommer, R L; Le Doussal, P; Wiese, K J

    2016-08-19

    We provide the first quantitative comparison between Barkhausen noise experiments and recent predictions from the theory of avalanches for pinned interfaces, both in and beyond mean field. We study different classes of soft magnetic materials, including polycrystals and amorphous samples-which are characterized by long-range and short-range elasticity, respectively-both for thick and thin samples, i.e., with and without eddy currents. The temporal avalanche shape at fixed size as well as observables related to the joint distribution of sizes and durations are analyzed in detail. Both long-range and short-range samples with no eddy currents are fitted extremely well by the theoretical predictions. In particular, the short-range samples provide the first reliable test of the theory beyond mean field. The thick samples show systematic deviations from the scaling theory, providing unambiguous signatures for the presence of eddy currents.

  16. Quantitative Scaling of Magnetic Avalanches

    NASA Astrophysics Data System (ADS)

    Durin, G.; Bohn, F.; Corrêa, M. A.; Sommer, R. L.; Le Doussal, P.; Wiese, K. J.

    2016-08-01

    We provide the first quantitative comparison between Barkhausen noise experiments and recent predictions from the theory of avalanches for pinned interfaces, both in and beyond mean field. We study different classes of soft magnetic materials, including polycrystals and amorphous samples—which are characterized by long-range and short-range elasticity, respectively—both for thick and thin samples, i.e., with and without eddy currents. The temporal avalanche shape at fixed size as well as observables related to the joint distribution of sizes and durations are analyzed in detail. Both long-range and short-range samples with no eddy currents are fitted extremely well by the theoretical predictions. In particular, the short-range samples provide the first reliable test of the theory beyond mean field. The thick samples show systematic deviations from the scaling theory, providing unambiguous signatures for the presence of eddy currents.

  17. Chip Scale Package Implementation Challenges

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    1998-01-01

    The JPL-led MicrotypeBGA Consortium of enterprises representing government agencies and private companies has joined together to pool in-kind resources for developing the quality and reliability of chip scale packages (CSPs) for a variety of projects. In the process of building the Consortium CSP test vehicles, many challenges were identified regarding various aspects of technology implementation. This paper will present our experience with these technology implementation challenges, including designing and building both standard and microvia boards, and assembling two types of test vehicles. We also discuss the most current package isothermal aging results to 2,000 hours at 100 C and 125 C, and thermal cycling test results to 1,700 cycles in the range of -30 to 100 C.

  18. Quantitative Scaling of Magnetic Avalanches.

    PubMed

    Durin, G; Bohn, F; Corrêa, M A; Sommer, R L; Le Doussal, P; Wiese, K J

    2016-08-19

    We provide the first quantitative comparison between Barkhausen noise experiments and recent predictions from the theory of avalanches for pinned interfaces, both in and beyond mean field. We study different classes of soft magnetic materials, including polycrystals and amorphous samples-which are characterized by long-range and short-range elasticity, respectively-both for thick and thin samples, i.e., with and without eddy currents. The temporal avalanche shape at fixed size as well as observables related to the joint distribution of sizes and durations are analyzed in detail. Both long-range and short-range samples with no eddy currents are fitted extremely well by the theoretical predictions. In particular, the short-range samples provide the first reliable test of the theory beyond mean field. The thick samples show systematic deviations from the scaling theory, providing unambiguous signatures for the presence of eddy currents. PMID:27588876

  19. Quasistatic scale-free networks

    NASA Astrophysics Data System (ADS)

    Mukherjee, G.; Manna, S. S.

    2003-01-01

    A network is formed using the N sites of a one-dimensional lattice in the shape of a ring as nodes, each node having the initial degree k_in = 2. N links are then introduced to this network; each link starts from a distinct node, with the other end connected to any other node of degree k, randomly selected with an attachment probability proportional to k^α. Tuning the control parameter α, we observe a transition where the average degree of the largest node changes its variation from N^0 to N at a specific transition point α_c. The network is scale free, i.e., the nodal degree distribution has a power-law decay for α ≥ α_c.
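
    The construction in this abstract is easy to prototype. The sketch below grows the degree sequence of such a network for a few values of α, choosing each extra link's target with probability proportional to (degree)^α; it ignores duplicate-edge bookkeeping and any other details not stated in the abstract.

```python
import numpy as np

def grow_quasistatic_network(N: int, alpha: float, seed: int = 0) -> np.ndarray:
    """Toy version of the construction in the abstract: N ring nodes start with
    degree 2; each node then adds one extra link whose target is chosen with
    probability proportional to (current degree)**alpha."""
    rng = np.random.default_rng(seed)
    degree = np.full(N, 2.0)                 # the ring gives every node degree 2
    for source in range(N):
        weights = degree ** alpha
        weights[source] = 0.0                # no self-links
        target = rng.choice(N, p=weights / weights.sum())
        degree[source] += 1
        degree[target] += 1
    return degree

for alpha in (0.5, 1.0, 2.0):
    deg = grow_quasistatic_network(2000, alpha, seed=42)
    print(f"alpha={alpha}: max degree {int(deg.max())} of N=2000")
```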

  20. L-scaling. Working Paper No. 26.

    ERIC Educational Resources Information Center

    Blankmeyer, Eric

    Given "T" joint observations on "K" variables, it is frequently useful to consider the weighted average or scaled score. L-scaling is introduced as a technique for determining the weights. The technique is so named because of its resemblance to the Leontief matrix of mathematical economics. L-scaling is compared to two widely-used procedures for…

  1. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... transducers, electronic signal amplification, conditioning and display equipment. (b) Classification. Class I... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is...

  2. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this... quickly and accurately determined. (b) The requirement to mount tanks on scales does not apply to tanks... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Scale tanks....

  3. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this... quickly and accurately determined. (b) The requirement to mount tanks on scales does not apply to tanks... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Scale tanks....

  4. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... transducers, electronic signal amplification, conditioning and display equipment. (b) Classification. Class I... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is...

  5. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this... quickly and accurately determined. (b) The requirement to mount tanks on scales does not apply to tanks... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Scale tanks....

  6. 21 CFR 880.2720 - Patient scale.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... transducers, electronic signal amplification, conditioning and display equipment. (b) Classification. Class I... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Patient scale. 880.2720 Section 880.2720 Food and... Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is...

  7. 27 CFR 19.183 - Scale tanks.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Tank Requirements § 19.183 Scale tanks. (a) Except as otherwise provided in paragraph (b) of this... quickly and accurately determined. (b) The requirement to mount tanks on scales does not apply to tanks... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Scale tanks....

  8. Researching Developmental Careers: The Career Conformity Scale.

    ERIC Educational Resources Information Center

    White, James M.

    1987-01-01

    Developed interval scale to measure deviation from the normative sequencing of first job, marriage, and birth of first child. Scale scores had predictive utility for marital instability and work interruptions. Use of scale score as independent variable provided more information than that provided by Hogan's (1978) temporal sequence categories as…

  9. Behavioral Observation Scales for Performance Appraisal Purposes

    ERIC Educational Resources Information Center

    Latham, Gary P.; Wexley, Kenneth N.

    1977-01-01

    This research attempts to determine whether Behavioral Observation Scales (BOS) could be improved by developing them through quantitative methods. The underlying assumption was that developing composite scales with greater internal consistency might improve their generalizability as evidenced by the cross-validation coefficients of scales based on…

  10. Automatic scale selection for medical image segmentation

    NASA Astrophysics Data System (ADS)

    Bayram, Ersin; Wyatt, Christopher L.; Ge, Yaorong

    2001-07-01

    The scale of interesting structures in medical images is space variant because of partial volume effects, spatial dependence of resolution in many imaging modalities, and differences in tissue properties. Existing segmentation methods either apply a single scale to the entire image or try fine-to-coarse/coarse-to-fine tracking of structures over multiple scales. While single-scale approaches fail to fully recover the perceptually important structures, multi-scale methods have problems in providing reliable means of selecting proper scales and in integrating information over multiple scales. A recent approach proposed by Elder and Zucker addresses the scale-selection problem by computing a minimal reliable scale for each image pixel. The basic premise of this approach is that, while the scale of structures within an image varies spatially, the imaging system is fixed. Hence, sensor noise statistics can be calculated. Based on a model of the edges to be detected, and the operators to be used for detection, one can locally compute a unique minimal reliable scale at which the likelihood of error due to sensor noise is less than or equal to a predetermined threshold. In this paper, we improve the segmentation method based on minimal reliable scale selection and evaluate its effectiveness with both simulated and actual medical data.
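
    A minimal sketch of the per-pixel scale-selection idea described here: at each pixel, take the smallest Gaussian-derivative scale whose gradient response exceeds a noise-dependent threshold. The threshold formula and parameter values are simplifications assumed for illustration, not the exact reliability criterion of Elder and Zucker.

```python
import numpy as np
from scipy import ndimage

def minimal_reliable_scale(image, sigma_noise, scales=(1, 2, 4, 8), k=3.0):
    """For each pixel, pick the smallest Gaussian-derivative scale at which the
    gradient magnitude exceeds a noise-dependent threshold; pixels that never
    pass keep the coarsest scale."""
    image = image.astype(float)
    chosen = np.full(image.shape, scales[-1], dtype=float)
    decided = np.zeros(image.shape, dtype=bool)
    for s in scales:
        gx = ndimage.gaussian_filter(image, s, order=(0, 1))
        gy = ndimage.gaussian_filter(image, s, order=(1, 0))
        grad = np.hypot(gx, gy)
        # Noise in the gradient response falls with scale; this threshold is a
        # simplification of the reliability criterion in the paper.
        threshold = k * sigma_noise / (s ** 2)
        newly = (~decided) & (grad > threshold)
        chosen[newly] = s
        decided |= newly
    return chosen

# Hypothetical test image: a blurred step edge plus sensor noise.
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[:, 32:] = 100.0
img = ndimage.gaussian_filter(img, 3) + rng.normal(scale=5.0, size=img.shape)
print(np.unique(minimal_reliable_scale(img, sigma_noise=5.0)))
```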

  11. 76 FR 18348 - Required Scale Tests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... Register on January 20, 2011 (76 FR 3485), defining required scale tests. That document incorrectly defined... Tests AGENCY: Grain Inspection, Packers and Stockyards Administration. ACTION: Correcting amendments... packer using such scales may use the scales within a 6-month period following each test. * * * * * Alan...

  12. The Attitudes toward Multiracial Children Scale.

    ERIC Educational Resources Information Center

    Jackman, Charmain F.; Wagner, William G.; Johnson, J. T.

    2001-01-01

    Two studies evaluated items developed for the Attitudes Toward Multiracial Children Scale. Researchers administered the scale to diverse college students, revised it, then administered it again. The scale's psychometric properties were such that the instrument could be used to research adults' attitudes regarding psychosocial development of…

  13. Scale and corrosion inhibition by thermal polyaspartates

    SciTech Connect

    Bains, D.I.; Fan, G.; Fan, J.; Ross, R.J.

    1999-11-01

    Organic polymers have found wide spread use as inhibitors for the prevention of mineral scales in heat transfer equipment. Recently a biodegradable organic polymer has been developed which provides both scale and corrosion control. The development of the polymeric inhibitor and laboratory evaluations of scale and corrosion inhibition is discussed together with its potential application in open recirculating cooling systems.

  14. An Aesthetic Value Scale of the Rorschach.

    ERIC Educational Resources Information Center

    Insua, Ana Maria

    1981-01-01

    An aesthetic value scale of the Rorschach cards was built by the successive interval method. This scale was compared with the ratings obtained by means of the Semantic Differential Scales and was found to successfully differentiate sexes in their judgment of card attractiveness. (Author)

  15. Development of Capstone Project Attitude Scales

    ERIC Educational Resources Information Center

    Bringula, Rex P.

    2015-01-01

    This study attempted to develop valid and reliable Capstone Project Attitude Scales (CPAS). Among the scales reviewed, the Modified Fennema-Shermann Mathematics Attitude Scales was adapted in the construction of the CPAS. Usefulness, Confidence, and Gender View were the three subscales of the CPAS. Four hundred sixty-three students answered the…

  16. Developing a Sense of Scale: Looking Backward

    ERIC Educational Resources Information Center

    Jones, M. Gail; Taylor, Amy R.

    2009-01-01

    Although scale has been identified as one of four major interdisciplinary themes that cut across the science domains by the American Association for the Advancement of Science (1989), we are only beginning to understand how students learn and apply scale concepts. Early research on learning scale tended to focus on perceptions of linear distances,…

  17. Why Online Education Will Attain Full Scale

    ERIC Educational Resources Information Center

    Sener, John

    2010-01-01

    Online higher education has attained scale and is poised to take the next step in its growth. Although significant obstacles to a full scale adoption of online education remain, we will see full scale adoption of online higher education within the next five to ten years. Practically all higher education students will experience online education in…

  18. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Construction of Full-Scale Tunnel (FST). In November 1929, Smith DeFrance submitted his recommendations for the general design of the Full Scale Wind Tunnel. The last on his list concerned the division of labor required to build this unusual facility. He believed the job had five parts and described them as follows: 'It is proposed that invitations be sent out for bids on five groups of items. The first would be for one contract on the complete structure; second the same as first, including the erection of the cones but not the fabrication, since this would be more of a shipyard job; third would cover structural steel, cover, sash and doors, but not cones or foundation; fourth, foundations; and fifth, fabrication of cones.' DeFrance's memorandum prompted the NACA to solicit estimates from a large number of companies. Preliminary designs and estimates were prepared and submitted to the Bureau of the Budget and Congress appropriated funds on February 20, 1929. The main construction contract with the J.A. Jones Company of Charlotte, North Carolina was signed one year later on February 12, 1930. It was a peculiar structure as the building's steel framework is visible on the outside of the building. DeFrance described this in NACA TR No. 459: 'The entire equipment is housed in a structure, the outside walls of which serve as the outer walls of the return passages. The over-all length of the tunnel is 434 feet 6 inches, the width 222 feet, and the maximum height 97 feet. The framework is of structural steel....' (pp. 292-293).

  19. Full-Scale Tunnel (FST)

    NASA Technical Reports Server (NTRS)

    1930-01-01

    Construction of Full-Scale Tunnel (FST): 120-Foot Truss hoisting, one and two point suspension. In November 1929, Smith DeFrance submitted his recommendations for the general design of the Full Scale Wind Tunnel. The last on his list concerned the division of labor required to build this unusual facility. He believed the job had five parts and described them as follows: 'It is proposed that invitations be sent out for bids on five groups of items. The first would be for one contract on the complete structure; second the same as first, including the erection of the cones but not the fabrication, since this would be more of a shipyard job; third would cover structural steel, cover, sash and doors, but not cones or foundation; fourth, foundations; and fifth, fabrication of cones.' DeFrance's memorandum prompted the NACA to solicit estimates from a large number of companies. Preliminary designs and estimates were prepared and submitted to the Bureau of the Budget and Congress appropriated funds on February 20, 1929. The main construction contract with the J.A. Jones Company of Charlotte, North Carolina was signed one year later on February 12, 1930. It was a peculiar structure as the building's steel framework is visible on the outside of the building. DeFrance described this in NACA TR No. 459: 'The entire equipment is housed in a structure, the outside walls of which serve as the outer walls of the return passages. The over-all length of the tunnel is 434 feet 6 inches, the width 222 feet, and the maximum height 97 feet. The framework is of structural steel....' (pp. 292-293)

  20. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
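
    The actuators described here are parallel-plate electrostatic devices, so a first-order sizing estimate follows from the standard parallel-plate force formula. The geometry and drive voltage below are invented for illustration and are not the parameters of the LLNL device.

```python
# Back-of-the-envelope parallel-plate electrostatic actuator estimate, the kind
# of first-principles relation used when sizing deformable-mirror actuators.
EPS0 = 8.854e-12                     # vacuum permittivity, F/m

def plate_force(voltage, area, gap):
    """Attractive force between parallel plates: F = eps0 * A * V^2 / (2 * d^2)."""
    return EPS0 * area * voltage**2 / (2 * gap**2)

V = 100.0                            # volts (assumed)
A = (500e-6) ** 2                    # 500 um x 500 um electrode (assumed)
d = 5e-6                             # 5 um gap (assumed)
F = plate_force(V, A, d)
print(f"force ~ {F * 1e6:.1f} uN for V={V:.0f} V, gap={d * 1e6:.0f} um")
```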

  1. Small-scale field experiments accurately scale up to predict density dependence in reef fish populations at large scales.

    PubMed

    Steele, Mark A; Forrester, Graham E

    2005-09-20

    Field experiments provide rigorous tests of ecological hypotheses but are usually limited to small spatial scales. It is thus unclear whether these findings extrapolate to larger scales relevant to conservation and management. We show that the results of experiments detecting density-dependent mortality of reef fish on small habitat patches scale up to have similar effects on much larger entire reefs that are the size of small marine reserves and approach the scale at which some reef fisheries operate. We suggest that accurate scaling is due to the type of species interaction causing local density dependence and the fact that localized events can be aggregated to describe larger-scale interactions with minimal distortion. Careful extrapolation from small-scale experiments identifying species interactions and their effects should improve our ability to predict the outcomes of alternative management strategies for coral reef fishes and their habitats.

  2. Selection, Optimization, and Compensation: The Structure, Reliability, and Validity of Forced-Choice versus Likert-Type Measures in a Sample of Late Adolescents

    ERIC Educational Resources Information Center

    Geldhof, G. John; Gestsdottir, Steinunn; Stefansson, Kristjan; Johnson, Sara K.; Bowers, Edmond P.; Lerner, Richard M.

    2015-01-01

    Intentional self-regulation (ISR) undergoes significant development across the life span. However, our understanding of ISR's development and function remains incomplete, in part because the field's conceptualization and measurement of ISR vary greatly. A key sample case involves how Baltes and colleagues' Selection, Optimization,…

  3. Scaling regions for food web properties

    PubMed Central

    Bersier, Louis-Félix; Sugihara, George

    1997-01-01

    The robustness of eight common food web properties is examined with respect to web size. We show that the current controversy concerning the scale dependence or scale invariance of these properties can be resolved by accounting for scaling constraints introduced by webs of very small size. We demonstrate statistically that the most robust way to view these properties is not to lump webs of all sizes, but to divide them into two distinct categories. For the present data set, small webs containing 12 or fewer species exhibit scale dependence, and larger webs containing more than 12 species exhibit scale invariance. PMID:11038600

  4. Fluctuation scaling, Taylor's law, and crime.

    PubMed

    Hanley, Quentin S; Khatun, Suniya; Yosef, Amal; Dyer, Rachel-May

    2014-01-01

    Fluctuation scaling relationships have been observed in a wide range of processes ranging from internet router traffic to measles cases. Taylor's law is one such scaling relationship and has been widely applied in ecology to understand communities including trees, birds, human populations, and insects. We show that monthly crime reports in the UK show complex fluctuation scaling which can be approximated by Taylor's law relationships corresponding to local policing neighborhoods and larger regional and countrywide scales. Regression models applied to local-scale data from Derbyshire and Nottinghamshire found that different categories of crime exhibited different scaling exponents, with no significant difference between the two regions. On this scale, violence reports were close to a Poisson distribution (α = 1.057 ± 0.026) while burglary exhibited a greater exponent (α = 1.292 ± 0.029) indicative of temporal clustering. These two regions exhibited significantly different pre-exponential factors for the categories of anti-social behavior and burglary, indicating that local variations in crime reports can be assessed using fluctuation scaling methods. At regional and countrywide scales, all categories exhibited scaling behavior indicative of temporal clustering, evidenced by Taylor's law exponents from 1.43 ± 0.12 (Drugs) to 2.094 ± 0.081 (Other Crimes). Investigating crime behavior via fluctuation scaling gives insight beyond that of raw numbers, is unique in reporting on all processes contributing to the observed variance, and is either robust to or exhibits signs of many types of data manipulation.
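
    Taylor's law, variance = a·mean^α, is usually fitted as a straight line in log-log space. The sketch below does exactly that for a synthetic units-by-months count matrix; the Poisson-generated data are placeholders, so the fitted exponent should land near 1 rather than reproduce the crime values quoted above.

```python
import numpy as np

def taylor_exponent(counts: np.ndarray):
    """Fit Taylor's law, variance = a * mean**alpha, by least squares on the
    log-transformed per-unit means and variances (counts: units x time periods)."""
    means = counts.mean(axis=1)
    variances = counts.var(axis=1, ddof=1)
    keep = (means > 0) & (variances > 0)
    slope, intercept = np.polyfit(np.log(means[keep]), np.log(variances[keep]), 1)
    return slope, np.exp(intercept)   # alpha, pre-exponential factor a

# Hypothetical monthly report counts for 50 neighbourhoods over 36 months,
# drawn from a Poisson model, so alpha should sit near 1.
rng = np.random.default_rng(3)
rates = rng.uniform(2, 40, size=50)
counts = rng.poisson(rates[:, None], size=(50, 36))
alpha, a = taylor_exponent(counts)
print(f"alpha ~ {alpha:.2f}, a ~ {a:.2f}")
```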

  5. Effects of scale on internal blast measurements

    NASA Astrophysics Data System (ADS)

    Granholm, R.; Sandusky, H.; Lee, R.

    2014-05-01

    This paper presents a comparative study between large and small-scale internal blast experiments with the goal of using the small-scale analog for energetic performance evaluation. In the small-scale experiment, highly confined explosive samples <0.5 g were subjected to the output from a PETN detonator while enclosed in a 3-liter chamber. Large-scale tests up to 23 kg were unconfined and released in a chamber with a factor of 60,000 increase in volume. The comparative metric in these experiments is peak quasi-static overpressure, with the explosive sample expressed as sample energy/chamber volume, which normalizes measured pressures across scale. Small-scale measured pressures were always lower than the large-scale measurements, because of heat-loss to the high confinement inherent in the small-scale apparatus. This heat-loss can be quantified and used to correct the small-scale pressure measurements. In some cases the heat-loss was large enough to quench reaction of lower energy samples. These results suggest that small-scale internal blast tests do correlate with their large-scale counterparts, provided that heat-loss to confinement can be measured, and that less reactive or lower energy samples are not quenched by heat-loss.
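
    The normalization used in this record (sample energy per chamber volume) can be connected to peak quasi-static overpressure with a constant-volume ideal-gas estimate, ΔP ≈ (γ − 1)E/V. The sketch below applies that estimate to round numbers loosely inspired by the abstract; the energy density, chamber volumes, and the neglect of heat loss are all assumptions for illustration.

```python
# Rough quasi-static overpressure estimate used to normalize internal blast data:
# releasing energy E into a closed chamber of volume V of ideal gas gives
# roughly dP = (gamma - 1) * E / V at constant volume.
GAMMA = 1.4                          # ratio of specific heats for air

def quasi_static_overpressure(energy_j, volume_m3, gamma=GAMMA):
    return (gamma - 1.0) * energy_j / volume_m3

# Assumed energy density ~4.2 MJ/kg; a 0.5 g sample in a 3 L chamber versus a
# 23 kg charge in a chamber 60,000 times larger (illustrative numbers only).
small = quasi_static_overpressure(0.5e-3 * 4.2e6, 3e-3)
large = quasi_static_overpressure(23 * 4.2e6, 180.0)
print(f"small-scale ~ {small / 1e3:.0f} kPa, large-scale ~ {large / 1e3:.0f} kPa")
```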

  6. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  7. Mechanisms of scaling in pattern formation

    PubMed Central

    Umulis, David M.; Othmer, Hans G.

    2013-01-01

    Many organisms and their constituent tissues and organs vary substantially in size but differ little in morphology; they appear to be scaled versions of a common template or pattern. Such scaling involves adjusting the intrinsic scale of spatial patterns of gene expression that are set up during development to the size of the system. Identifying the mechanisms that regulate scaling of patterns at the tissue, organ and organism level during development is a longstanding challenge in biology, but recent molecular-level data and mathematical modeling have shed light on scaling mechanisms in several systems, including Drosophila and Xenopus. Here, we investigate the underlying principles needed for understanding the mechanisms that can produce scale invariance in spatial pattern formation and discuss examples of systems that scale during development. PMID:24301464

  8. The development of a forgiveness scale.

    PubMed

    Hargrave, T D; Sells, J N

    1997-01-01

    This paper reports on the development, validity, and reliability of a self-report instrument designed to assess a respondent's perspective of pain resulting from relational violations and work toward relational forgiveness based on a framework proposed by Hargrave (1994a). Presented here is the five-stage procedure used in the development of the Interpersonal Relationship Resolution Scale. Construct validity and reliability were determined from an initial sample of 164 subjects. Concurrent validity of the scale was supported by another sample of 35 respondents who took the Interpersonal Relationship Resolution Scale, the Personal Authority in the Family System Questionnaire, the Relational Ethics Scale, the Fundamental Interpersonal Relations Orientation-Behavior scale, and the Burns Depression Checklist. Finally, a predictive validity study of the scale was performed with a clinical and nonclinical sample of 98 volunteers. Data are presented that support the validity and reliability of the instrument, as well as the final version of the scale.

  9. Invariant relationships deriving from classical scaling transformations

    SciTech Connect

    Bludman, Sidney; Kennedy, Dallas C.

    2011-04-15

    Because scaling symmetries of the Euler-Lagrange equations are generally not variational symmetries of the action, they do not lead to conservation laws. Instead, an extension of Noether's theorem reduces the equations of motion to evolutionary laws that prove useful, even if the transformations are not symmetries of the equations of motion. In the case of scaling, symmetry leads to a scaling evolutionary law, a first-order equation in terms of scale invariants, linearly relating kinematic and dynamic degrees of freedom. This scaling evolutionary law appears in dynamical and in static systems. Applied to dynamical central-force systems, the scaling evolutionary equation leads to generalized virial laws, which linearly connect the kinetic and potential energies. Applied to barotropic hydrostatic spheres, the scaling evolutionary equation linearly connects the gravitational and internal energy densities. This implies well-known properties of polytropes, describing degenerate stars and chemically homogeneous nondegenerate stellar cores.
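
    As a concrete special case of the generalized virial laws mentioned here, the standard virial theorem for a homogeneous potential is shown below. The statement is textbook material rather than anything specific to this article, and the notation is generic.

```latex
% If the potential is homogeneous of degree k, V(\lambda q) = \lambda^{k} V(q),
% then for bounded motion the time-averaged kinetic and potential energies obey
\[
  2\,\langle T \rangle \;=\; k\,\langle V \rangle ,
\]
% e.g. k = -1 (Kepler) gives 2<T> = -<V>, and k = 2 (harmonic) gives <T> = <V>.
```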

  10. Collaboration and nested environmental governance: Scale dependency, scale framing, and cross-scale interactions in collaborative conservation.

    PubMed

    Wyborn, Carina; Bixler, R Patrick

    2013-07-15

    The problem of fit between social institutions and ecological systems is an enduring challenge in natural resource management and conservation. Developments in the science of conservation biology encourage the management of landscapes at increasingly larger scales. In contrast, sociological approaches to conservation emphasize the importance of ownership, collaboration and stewardship at scales relevant to the individual or local community. Despite the proliferation of initiatives seeking to work with local communities to undertake conservation across large landscapes, there is an inherent tension between these scales of operation. Consequently, questions about the changing nature of effective conservation across scales abound. Through an analysis of three nested cases working in a semiautonomous fashion in the Northern Rocky Mountains in North America, this paper makes an empirical contribution to the literature on nested governance, collaboration and communication across scales. Despite different scales of operation, constituencies and scale frames, we demonstrate a surprising similarity in organizational structure and an implicit dependency between these initiatives. This paper examines the different capacities and capabilities of collaborative conservation from the local to regional to supra regional. We draw on the underexplored concept of 'scale-dependent comparative advantage' (Cash and Moser, 2000), to gain insight into what activities take place at which scale and what those activities contribute to nested governance and collaborative conservation. The comparison of these semiautonomous cases provides fruitful territory to draw lessons for understanding the roles and relationships of organizations operating at different scales in more connected networks of nested governance.

  11. Anticipated adaptation or scale recalibration?

    PubMed Central

    2013-01-01

    Background The aim of our study was to investigate anticipated adaptation among patients in the subacute phase of Spinal Cord Injury (SCI). Methods We used an observational longitudinal design. Patients with SCI (N = 44) rated their actual, previous and expected future Quality of Life (QoL) at three time points: within two weeks of admission to the rehabilitation center (RC), a few weeks before discharge from the RC, and at least three months after discharge. We compared the expected future rating at the second time point with the actual ratings at the third time point, using Student's t-tests. To gain insight into scale recalibration we also compared actual and previous ratings. Results At the group level, patients overpredicted their improvement on the VAS. Actual health at T3 (M = 0.65, SD = 0.20) was significantly lower than the health at T3 predicted at T1 (M = 0.76, SD = 0.10; t(43) = 3.24, p < 0.01) and predicted at T2 (M = 0.75, SD = 0.13; t(43) = 3.44, p < 0.001). Similarly, the health at T2 as recalled at T3 (M = 0.59, SD = 0.18) was significantly lower than the actual health at T2 (M = 0.67, SD = 0.15; t(43) = 3.26, p < 0.01). Patients rated their future and past health inaccurately compared to their actual ratings on the VAS. In contrast, on the TTO patients gave accurate estimates of their future and previous health, and they also accurately valued their previous health. Looking at individual ratings, the numbers of respondents with accurate estimates of their future and previous health were similar between the VAS and TTO. However, the Bland-Altman plots show that the deviation from accuracy is larger for the TTO than for the VAS; that is, the accuracy of 95% of the respondents was lower in the TTO than in the VAS. Conclusions Patients at the onset of a disability were able to anticipate adaptation. Valuations given on the VAS seem to be biased by scale recalibration. PMID:24139246
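
    The group-level comparison reported here is a paired (repeated-measures) t-test. The sketch below runs such a test on simulated ratings for 44 patients that merely mimic the direction of the reported effect; the numbers are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired ratings on a 0-1 VAS-like scale: the health at T3 predicted
# at T2 versus the actual health reported at T3 (simulated placeholders).
rng = np.random.default_rng(7)
predicted_t3 = np.clip(rng.normal(0.75, 0.13, size=44), 0, 1)
actual_t3 = np.clip(predicted_t3 - rng.normal(0.10, 0.08, size=44), 0, 1)

t_stat, p_value = stats.ttest_rel(predicted_t3, actual_t3)
print(f"t(43) = {t_stat:.2f}, p = {p_value:.4f}")
```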

  12. Advances in time-scale algorithms

    NASA Technical Reports Server (NTRS)

    Stein, S. R.

    1993-01-01

    The term clock is usually used to refer to a device that counts a nearly periodic signal. A group of clocks, called an ensemble, is often used for time keeping in mission-critical applications that cannot tolerate loss of time due to the failure of a single clock. The time generated by the ensemble of clocks is called a time scale. The question arises how to combine the times of the individual clocks to form the time scale. One might naively be tempted to suggest the expedient of averaging the times of the individual clocks, but a simple thought experiment demonstrates the inadequacy of this approach. Suppose a time scale is composed of two noiseless clocks having equal and opposite frequencies. The mean time scale has zero frequency. However, if either clock fails, the time-scale frequency immediately changes to the frequency of the remaining clock. This performance is generally unacceptable, and simple mean time scales are not used. First, previous time-scale developments are reviewed and then some new methods that result in enhanced performance are presented. The historical perspective is based upon several time scales: the AT1 and TA time scales of the National Institute of Standards and Technology (NIST), the A.1(MEAN) time scale of the U.S. Naval Observatory (USNO), the TAI time scale of the Bureau International des Poids et Mesures (BIPM), and the KAS-1 time scale of the Naval Research Laboratory (NRL). The new method was incorporated in the KAS-2 time scale recently developed by Timing Solutions Corporation. The goal is to present time-scale concepts in a nonmathematical form with as few equations as possible. Many other papers and texts discuss the details of the optimal estimation techniques that may be used to implement these concepts.
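
    A toy weighted-average ensemble makes the thought experiment in this abstract concrete: two noiseless clocks with equal and opposite frequency offsets average to a perfect time scale until one of them is removed. This is only the naive scheme the abstract criticizes, not the AT1, TAI, or KAS-2 algorithms.

```python
import numpy as np

def ensemble_time(clock_readings, weights):
    """Weighted-average paper clock: readings and weights are per-clock arrays;
    clocks that have failed carry weight 0."""
    return np.average(clock_readings, weights=np.asarray(weights, dtype=float))

# Two noiseless clocks with equal and opposite frequency offsets: the
# equal-weight mean tracks true time perfectly until one clock fails.
t = np.arange(0.0, 10.0)             # seconds of elapsed true time
clock_a = t * (1 + 1e-6)             # fast clock
clock_b = t * (1 - 1e-6)             # slow clock
for ti, a, b in zip(t, clock_a, clock_b):
    alive = [1.0, 1.0] if ti < 5 else [1.0, 0.0]   # clock B "fails" at t = 5
    print(f"t={ti:4.1f}  ensemble={ensemble_time([a, b], alive):.7f}")
```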

  13. Scales and scaling in turbulent ocean sciences; physics-biology coupling

    NASA Astrophysics Data System (ADS)

    Schmitt, Francois

    2015-04-01

    Geophysical fields possess huge fluctuations over many spatial and temporal scales. In the ocean, this property at smaller scales is closely linked to marine turbulence. The velocity field varies from large scales down to the Kolmogorov scale (mm), and scalar fields from large scales down to the Batchelor scale, which is often much smaller. As a consequence, it is not always simple to determine at which scale a process should be considered. The scale question is hence fundamental in marine sciences, especially when dealing with physics-biology coupling. For example, marine dynamical models typically have a grid size of a hundred meters or more, which is more than 10^5 times larger than the smallest turbulence scale (the Kolmogorov scale). Such a scale is fine for the dynamics of a whale (around 100 m), but for a fish larva (1 cm) or a copepod (1 mm) a description at smaller scales is needed, due to the nonlinear nature of turbulence. The same holds for biogeochemical fields such as passive and active tracers (oxygen, fluorescence, nutrients, pH, turbidity, temperature, salinity...). In this framework, we will discuss the scale problem in ocean turbulence modeling, and the relation of the Kolmogorov and Batchelor scales of ocean turbulence to the size of marine animals. We will also consider scaling laws for organism-particle Reynolds numbers (from whales to bacteria), and possible scaling laws for organisms' accelerations.
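
    For reference, the two turbulence scales named in this record have standard definitions, reproduced below; they are textbook relations rather than results of this work.

```latex
% Kolmogorov length eta (nu: kinematic viscosity, epsilon: dissipation rate)
% and Batchelor length for a scalar with Schmidt number Sc = nu / D.
\[
  \eta = \left(\frac{\nu^{3}}{\varepsilon}\right)^{1/4},
  \qquad
  \lambda_{B} = \frac{\eta}{\sqrt{Sc}} .
\]
% For Sc >> 1 (most dissolved scalars in seawater) the Batchelor scale is much
% smaller than the Kolmogorov scale, as noted in the abstract.
```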

  14. Effect of Violating Unidimensional Item Response Theory Vertical Scaling Assumptions on Developmental Score Scales

    ERIC Educational Resources Information Center

    Topczewski, Anna Marie

    2013-01-01

    Developmental score scales represent the performance of students along a continuum, where as students learn more they move higher along that continuum. Unidimensional item response theory (UIRT) vertical scaling has become a commonly used method to create developmental score scales. Research has shown that UIRT vertical scaling methods can be…

  15. Toward Increasing Fairness in Score Scale Calibrations Employed in International Large-Scale Assessments

    ERIC Educational Resources Information Center

    Oliveri, Maria Elena; von Davier, Matthias

    2014-01-01

    In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…

  16. Mirages in galaxy scaling relations

    NASA Astrophysics Data System (ADS)

    Mosenkov, A. V.; Sotnikova, N. Ya.; Reshetnikov, V. P.

    2014-06-01

    We analysed several basic correlations between structural parameters of galaxies. The data were taken from various samples in different passbands which are available in the literature. We discuss disc scaling relations as well as some debatable issues concerning the so-called Photometric Plane for bulges and elliptical galaxies in different forms, and various versions of the famous Kormendy relation. We show that some of the correlations under discussion are artificial (self-correlations), while others truly reveal some new essential details of the structural properties of galaxies. Our main results are as follows. At present, we cannot conclude that faint stellar discs are, on average, thinner than discs in high surface brightness galaxies. The 'central surface brightness-thickness' correlation appears only as a consequence of using the transparent exponential disc model to describe real galaxy discs. The Photometric Plane appears to have no independent physical sense. Various forms of this plane are merely sophisticated versions of the Kormendy relation or of the self-relation involving the central surface brightness of a bulge/elliptical galaxy and the Sérsic index n. The Kormendy relation is a physical correlation presumably reflecting the difference in the origin of bright and faint ellipticals and bulges. We present arguments that involve creating artificial samples to prove our main idea.

  17. Scaling in nonstationary voltammetry representations.

    PubMed

    Anastassiou, Costas A; Parker, Kim H; O'Hare, Danny

    2007-12-20

    Despite the widespread use of voltammetry for a range of chemical, biological, environmental, and industrial applications, there is still a lack of understanding regarding the functionality between the applied voltage and the resulting patterns in the current response. This is due to the highly nonlinear relation between the applied voltage and the nonstationary current response, which makes a direct association nonintuitive. In this Article, we focus on large-amplitude/high-frequency ac voltammetry, a technique that has been shown to offer increased voltammetric detail compared to alternative methods, to study heterogeneous electrochemical reaction-diffusion cases using a nonstationary time-series analysis, the Hilbert transform, and symmetry considerations. We show that application of this signal processing technique minimizes the significant capacitance contribution associated with rapid voltammetric measurements. From a series of numerical simulations conducted for different voltage excitation parameters as well as kinetic, thermodynamic, and mass transport parameters, a number of scaling laws arise that are related to the underlying parameters/dynamics of the process. Under certain conditions, these observations allow the determination of all underlying parameters very rapidly, with experiment duration typically
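    The Hilbert-transform step described above can be illustrated with a generic sketch (this is not the authors' code; the synthetic current record and all parameters are assumptions). scipy.signal.hilbert returns the analytic signal, from which the instantaneous envelope and phase of a nonstationary response can be extracted.

      import numpy as np
      from scipy.signal import hilbert

      # Synthetic nonstationary "current" response: an amplitude-modulated carrier
      # standing in for a large-amplitude ac voltammetric measurement (assumed signal).
      fs = 10_000.0                          # sampling rate, Hz
      t = np.arange(0.0, 1.0, 1.0 / fs)      # 1 s record
      carrier_hz = 50.0                      # ac excitation frequency (assumed)
      envelope_true = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)   # slow modulation
      i_t = envelope_true * np.sin(2 * np.pi * carrier_hz * t)

      # Analytic signal via the Hilbert transform
      analytic = hilbert(i_t)
      envelope = np.abs(analytic)                    # instantaneous amplitude
      phase = np.unwrap(np.angle(analytic))          # instantaneous phase
      inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz

      print("mean instantaneous frequency ~", inst_freq.mean(), "Hz")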

  18. Scaling laws for iceberg calving

    NASA Astrophysics Data System (ADS)

    Åström, Jan; Moore, John

    2014-05-01

    Over the next century, most additional ocean water will come from ice sheets and glaciers, primarily through calving of ice into the oceans. Calving fluxes are prone to rapid and non-linear variability and have therefore proven difficult to include in models forced by evolving climatic variables. Theoretical and first-principles simulation fracture models are applied to investigate iceberg calving. We demonstrate that calving originates from the general behaviour of unstable cracks in elastic media. Cracks in ice trigger calving events that have a striking statistical similarity to avalanches in Abelian sand-pile models: both the calving mass distribution and the inter-event waiting times are similar to those of sand-pile models. The theoretical results are confirmed by a first-principles simulation model and by field observations spanning 12 orders of magnitude in calving size. This suggests that calving termini are self-organized critical systems, hence the difficulty of parameterizing calving in large-scale models. Subtle deviation from the critical point towards higher stability will lead to subcritical calving - small and infrequent calving events associated with glacier advance - while subtle deviation towards higher instability will lead to supercritical calving - larger and more frequent events associated with rapid retreat. Such behaviour is consistent with recent worldwide observations of ice shelf disintegration and irreversible tidewater glacier retreat in response to climate warming.

  19. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains growing around this ecosystem would be a good choice; the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine the challenges encountered in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in the scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  20. Electroweak-scale resonant leptogenesis

    SciTech Connect

    Pilaftsis, Apostolos; Underwood, Thomas E.J.

    2005-12-01

    We study minimal scenarios of resonant leptogenesis near the electroweak phase transition. These models offer a number of testable phenomenological signatures for low-energy experiments and future high-energy colliders. Our study extends previous analyses of the relevant network of Boltzmann equations, consistently taking into account effects from out-of-equilibrium sphalerons and single lepton flavors. We show that the effects from single lepton flavors become very important in variants of resonant leptogenesis, where the observed baryon asymmetry in the Universe is created by lepton-to-baryon conversion of an individual lepton number, for example, that of the {tau}-lepton. The predictions of such resonant {tau}-leptogenesis models for the final baryon asymmetry are almost independent of the initial lepton-number and heavy-neutrino abundances. These models accommodate the current neutrino data and have a number of testable phenomenological implications. They contain electroweak-scale heavy Majorana neutrinos with appreciable couplings to electrons and muons, which can be probed at future e{sup +}e{sup -} and {mu}{sup +}{mu}{sup -} high-energy colliders. In particular, resonant {tau}-leptogenesis models predict sizable 0{nu}{beta}{beta} decay, as well as e- and {mu}-number-violating processes, such as {mu}{yields}e{gamma} and {mu}{yields}e conversion in nuclei, with rates that are within reach of the experiments proposed by the MEG and MECO collaborations.

  1. Challenges to Scaling CIGS Photovoltaics

    NASA Astrophysics Data System (ADS)

    Stanbery, B. J.

    2011-03-01

    The challenges of scaling any photovoltaic technology to terawatts of global capacity are arguably more economic than technological or resource constraints. All commercial thin-film PV technologies are based on direct-bandgap semiconductors whose absorption coefficient and bandgap alignment with the solar spectrum enable micron-thick coatings in lieu of the hundreds of microns required using indirect-bandgap c-Si. Although thin-film PV reduces semiconductor materials cost, its manufacture is more capital intensive than c-Si production, and proportional to deposition rate. Only when combined with sufficient efficiency and cost of capital does this tradeoff yield lower manufacturing cost. CIGS has the potential to become the first thin-film technology to achieve the terawatt benchmark because of its superior conversion efficiency, which makes it the only commercial thin-film technology that demonstrably delivers performance comparable to the dominant incumbent, c-Si. Since module performance leverages total systems cost, this competitive advantage bears directly on CIGS' potential to displace c-Si and attract the requisite capital to finance the tens of gigawatts of annual production capacity needed to manufacture terawatts of PV modules apace with global demand growth.

  2. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware has been built and testing is complete. This paper will discuss the potential applications of the technology, give an overview of the as-built actuator design, describe problems that were uncovered during development testing, review the test data and evaluate weaknesses of the design, and discuss areas for improvement for future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring 440 to 1500 newtons of load.

  3. Universal scaling in sports ranking

    NASA Astrophysics Data System (ADS)

    Deng, Weibing; Li, Wei; Cai, Xu; Bulou, Alain; Wang, Qiuping A.

    2012-09-01

    Ranking is a ubiquitous phenomenon in human society. On the web pages of Forbes, one may find all kinds of rankings, such as the world's most powerful people, the world's richest people, the highest-earning tennis players, and so on and so forth. Herewith, we study a specific kind—sports ranking systems in which players' scores and/or prize money are accrued based on their performances in different matches. By investigating 40 data samples which span 12 different sports, we find that the distributions of scores and/or prize money follow universal power laws, with exponents nearly identical for most sports. In order to understand the origin of this universal scaling we focus on the tennis ranking systems. By checking the data we find that, for any pair of players, the probability that the higher-ranked player tops the lower-ranked opponent is proportional to the rank difference between the pair. Such a dependence can be well fitted to a sigmoidal function. By using this feature, we propose a simple toy model which can simulate the competition of players in different matches. The simulations yield results consistent with the empirical findings. Extensive simulation studies indicate that the model is quite robust with respect to the modifications of some parameters.
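    A minimal Monte Carlo sketch in the spirit of the toy model described above (the exact sigmoid form, its steepness, and the match counts are assumptions, not the authors' specification): the probability that the better-ranked player wins grows sigmoidally with the rank difference, and points accumulate over many random pairings.

      import random
      import math

      N_PLAYERS = 100
      N_MATCHES = 50_000
      STEEPNESS = 0.05      # assumed sigmoid steepness parameter

      def p_higher_wins(rank_diff):
          """Probability that the better-ranked player wins, sigmoidal in rank difference."""
          return 1.0 / (1.0 + math.exp(-STEEPNESS * rank_diff))

      scores = [0] * N_PLAYERS   # player i has "true" rank i (0 = best)
      for _ in range(N_MATCHES):
          a, b = random.sample(range(N_PLAYERS), 2)
          better, worse = (a, b) if a < b else (b, a)
          winner = better if random.random() < p_higher_wins(abs(a - b)) else worse
          scores[winner] += 1    # one point (or unit of prize money) per win

      # Sort accumulated scores to inspect the resulting score distribution
      ranked = sorted(scores, reverse=True)
      print("top five scores:", ranked[:5])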

  4. Quantifying scale relationships in snow distributions

    NASA Astrophysics Data System (ADS)

    Deems, Jeffrey S.

    2007-12-01

    Spatial distributions of snow in mountain environments represent the time integration of accumulation and ablation processes, and are strongly and dynamically linked to mountain hydrologic, ecologic, and climatic systems. Accurate measurement and modeling of the spatial distribution and variability of the seasonal mountain snowpack at different scales are imperative for water supply and hydropower decision-making, for investigations of land-atmosphere interaction or biogeochemical cycling, and for accurate simulation of earth system processes and feedbacks. Assessment and prediction of snow distributions in complex terrain are heavily dependent on scale effects, as the pattern and magnitude of variability in snow distributions depends on the scale of observation. Measurement and model scales are usually different from process scales, and thereby introduce a scale bias to the estimate or prediction. To quantify this bias, or to properly design measurement schemes and model applications, the process scale must be known or estimated. Airborne Light Detection And Ranging (lidar) products provide high-resolution, broad-extent altimetry data for terrain and snowpack mapping, and allow an application of variogram fractal analysis techniques to characterize snow depth scaling properties over lag distances from 1 to 1000 meters. Snow depth patterns as measured by lidar at three Colorado mountain sites exhibit fractal (power law) scaling patterns over two distinct scale ranges, separated by a distinct break at the 15-40 m lag distance, depending on the site. Each fractal range represents a range of separation distances over which snow depth processes remain consistent. The scale break between fractal regions is a characteristic scale at which snow depth process relationships change fundamentally. Similar scale break distances in vegetation topography datasets suggest that the snow depth scale break represents a change in wind redistribution processes from wind
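    The variogram-based fractal analysis mentioned above can be sketched generically (this is not the study's processing chain; the synthetic one-dimensional snow-depth transect and the lag range are assumptions): compute a sample semivariogram over a range of lags and estimate the power-law exponent from a log-log fit.

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic 1-D "snow depth" transect sampled every 1 m (assumed stand-in for lidar data)
      n = 2000
      depth = np.cumsum(rng.normal(0.0, 0.02, n))   # a random-walk-like surface

      def semivariogram(z, lags):
          """Sample semivariance gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2] for integer lags h."""
          return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

      lags = np.arange(1, 200)          # lag distances of 1-199 m (assumed range of interest)
      gamma = semivariogram(depth, lags)

      # Power-law (fractal) scaling: gamma(h) ~ h^beta, estimated from a log-log fit
      beta, log_c = np.polyfit(np.log(lags), np.log(gamma), 1)
      print(f"fitted scaling exponent beta = {beta:.2f}")

    A break in slope of the log-log variogram at some lag distance would mark the kind of scale break described above; here the synthetic surface has a single scaling regime by construction.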

  5. Meso-scale machining capabilities and issues

    SciTech Connect

    BENAVIDES,GILBERT L.; ADAMS,DAVID P.; YANG,PIN

    2000-05-15

    Meso-scale manufacturing processes are bridging the gap between silicon-based MEMS processes and conventional miniature machining. These processes can fabricate two- and three-dimensional parts having micron-size features in traditional materials such as stainless steels, rare earth magnets, ceramics, and glass. Meso-scale processes that are currently available include focused ion beam sputtering, micro-milling, micro-turning, excimer laser ablation, femtosecond laser ablation, and micro electro discharge machining. These meso-scale processes employ subtractive machining technologies (i.e., material removal), unlike LIGA, which is an additive meso-scale process. Meso-scale processes have different material capabilities and machining performance specifications. Machining performance specifications of interest include minimum feature size, feature tolerance, feature location accuracy, surface finish, and material removal rate. Sandia National Laboratories is developing meso-scale electro-mechanical components, which require meso-scale parts that move relative to one another. The meso-scale parts fabricated by subtractive meso-scale manufacturing processes have unique tribology issues because of the variety of materials and the surface conditions produced by the different meso-scale manufacturing processes.

  6. Aerosols and Convection: Global scale, MJO Scale and Regional Scale Analyses

    NASA Astrophysics Data System (ADS)

    Rutledge, S. A.

    2014-12-01

    We have investigated interactions between atmospheric thermodynamics, boundary layer aerosol (CCN) concentrations, convective intensity, and lightning flash rates (from the TRMM LIS and the Vaisala GLD360 global network) on three distinct scales: the global tropical oceans and land masses; the Madden-Julian Oscillation genesis region over the central Indian Ocean (CIO); and four regions in the U.S. (Washington D.C., northern Alabama, central Oklahoma, and eastern Colorado), each supported by a VHF Lightning Mapping Array. Total lightning density is shown to increase by a factor of 2-3 as a function of CCN concentration over tropical land and ocean regions. The greatest sensitivity in the lightning vs. aerosol relationship was found in more unstable environments and where warm-cloud depth was intermediate over land and deep over ocean. The maximum height of 30 dBZ echo tops in lightning-producing convective features was found to be insensitive to changes in CCN concentration. However, the vertical profile of radar reflectivity (VPRR) showed a consistent increase of 2-4 dBZ for convective features that developed in more polluted environments, suggesting that aerosols may act to intensify the convection, but not necessarily make the convection deeper. These findings are consistent with the hypothesis that aerosols act to invigorate convection by influencing the evolution of a cloud's hydrometeor populations. For the regional scale analysis, storms in Colorado have thermodynamics so favorable (high cloud bases, shallow warm-cloud depths, and large CAPE) that aerosols (CCN) appear to have little effect in a bulk sense. For the three remaining regions, storms forming in environments with CCN concentrations between 700 and 1200 cm^-3 have notably stronger VPRR and larger flash rates. For aerosol concentrations below and above this range, storms have less vigor and reduced flash rates, consistent with the Rosenfeld et al. (2008) study. Finally

  7. Evaluation of Computer Based Foreign Language Learning Software by Teachers and Students

    ERIC Educational Resources Information Center

    Baz, Fatih Çagatay; Tekdal, Mehmet

    2014-01-01

    The aim of this study is to evaluate Computer Based Foreign Language Learning software called Dynamic Education (DYNED) by teachers and students. The study was conducted in ten randomly chosen primary schools, with 522 seventh-grade students and 7 English teachers as participants. A three-point Likert scale for teachers and a five-point Likert scale…

  8. Hubble Space Telescope Scale Model

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This is a photograph of a 1/15 scale model of the Hubble Space Telescope (HST). The HST is the product of a partnership between NASA, European Space Agency contractors, and the international community of astronomers. It is named after Edwin P. Hubble, an American astronomer who discovered the expanding nature of the universe and was the first to realize the true nature of galaxies. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data that are free of distortion by the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than that visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The major elements of the HST are the Optical Telescope Assembly (OTA), the Support System Module (SSM), and the Scientific Instruments (SI). The HST is 42.5 feet (13 meters) long and weighs about 25,000 pounds (11,600 kilograms). The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors. The Lockheed Missile and Space Company of Sunnyvale, California, produced the protective outer shroud and spacecraft systems, and assembled and tested the finished telescope.

  9. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made into extending this research to wireless communication networks in support of telemetry applications.

  10. A clinimetric overview of scar assessment scales.

    PubMed

    van der Wal, M B A; Verhaegen, P D H M; Middelkoop, E; van Zuijlen, P P M

    2012-01-01

    Standardized validated evaluation instruments are mandatory to increase the level of evidence in scar management. Scar assessment scales are potentially suitable for this purpose, but the most appropriate scale still needs to be determined. This review will elaborate on several clinically relevant scar features and critically discuss the currently available scar scales in terms of basic clinimetric requirements. Many current scales can produce reliable measurements but seem to require multiple observers to obtain these results reliably, which limits their feasibility in clinical practice. The validation process of scar scales is hindered by the lack of a "gold standard" in subjective scar assessment or other reliable objective instruments which are necessary for a good comparison. The authors conclude that there are scar scales available that can reliably measure scar quality. However, further research may lead to improvement of their clinimetric properties and enhance the level of evidence in scar research worldwide.

  11. Small scale structure on cosmic strings

    NASA Technical Reports Server (NTRS)

    Albrecht, Andreas

    1989-01-01

    The current understanding of cosmic string evolution is discussed, with the focus placed on the question of small scale structure on strings, where most of the disagreements lie. A physical picture designed to put the role of the small scale structure into more intuitive terms is presented. In this picture it can be seen how the small scale structure can feed back in a major way on the overall scaling solution. It is also argued that it is easy for small scale numerical errors to feed back in just such a way. The intuitive discussion presented here may form the basis for an analytic treatment of the small scale structure, which, it is argued, would in any case be extremely valuable in filling the gaps in the present understanding of cosmic string evolution.

  12. Mobility at the scale of meters.

    PubMed

    Surovell, Todd A; O'Brien, Matthew

    2016-05-01

    When archeologists discuss mobility, we are most often referring to a phenomenon that operates on the scale of kilometers, but much of human mobility, at least if measured in terms of frequency of movement, occurs at much smaller scales, ranging from centimeters to tens of meters. Here we refer to the movements we make within the confines of our homes or places of employment. With respect to nomadic peoples, movements at this scale would include movements within campsites. Understanding mobility at small scales is important to archeology because small-scale mobility decisions are a critical factor affecting spatial patterning observed in archeological sites. In this paper, we examine the factors affecting small-scale mobility decisions in a Mongolian reindeer herder summer camp and the implications of those decisions with regard to archeological spatial patterning. PMID:27312186

  13. Testing the barriers to healthy eating scale.

    PubMed

    Fowles, Eileen R; Feucht, Jeanette

    2004-06-01

    Clarifying barriers to dietary intake may identify factors that place pregnant women at risk for complications. This methodological study assessed the psychometric properties of the Barriers to Healthy Eating Scale. Item generation was based on constructs in Pender's health promotion model. The instrument was tested in two separate samples of pregnant women. Content validity was assessed, and construct validity testing resulted in an expected negative relationship between scores on the Barriers to Healthy Eating Scale and the Nutrition subscale of the Health Promoting Lifestyle Profile-II. Factor analysis resulted in a 5-factor scale that explained 73% of the variance. Alpha coefficients for the total scale ranged from .73 to .77, and subscales ranged from .48 to .99. Test-retest reliability for the total scale was .79. The Barriers to Healthy Eating Scale appears to be a reliable and valid instrument to assess barriers that may impede healthy eating in pregnant women.
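    As a generic illustration of the reliability statistic reported above, the sketch below computes Cronbach's alpha for simulated Likert-type items using the standard formula alpha = (k/(k-1)) * (1 - sum of item variances / variance of the total score). The data are simulated, not the study's dataset.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents x k_items) array of Likert responses."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)          # variance of each item
          total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale score
          return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

      # Simulated responses: 200 respondents, 5 items on a 5-point Likert scale,
      # built from a shared "trait" plus item noise so the items are positively correlated.
      rng = np.random.default_rng(1)
      trait = rng.normal(3.0, 0.8, size=(200, 1))
      responses = np.clip(np.rint(trait + rng.normal(0.0, 0.7, size=(200, 5))), 1, 5)

      print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")

    With positively correlated items like these, alpha typically lands well above .7; uncorrelated items drive it toward zero.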

  14. Scaling of Soil Moisture: A Hydrologic Perspective

    NASA Astrophysics Data System (ADS)

    Western, Andrew W.; Grayson, Rodger B.; Blöschl, Günter

    Soil moisture is spatially and temporally highly variable, and it influences a range of environmental processes in a nonlinear manner. This leads to scale effects that need to be understood for improved prediction of moisture dependent processes. We provide some introductory material on soil moisture, and then review results from the literature relevant to a variety of scaling techniques applicable to soil moisture. This review concentrates on spatial scaling with brief reference to results on temporal scaling. Scaling techniques are divided into behavioral techniques and process-based techniques. We discuss the statistical distribution of soil moisture, spatial correlation of soil moisture at scales from tens of meters to thousands of kilometers and related interpolation and regularization techniques, and the use of auxiliary variables such as terrain indices. Issues related to spatially distributed deterministic modeling of soil moisture are also briefly reviewed.

  15. Factors Affecting Scale Adhesion on Steel Forgings

    NASA Astrophysics Data System (ADS)

    Zitterman, J. A.; Bacco, R. P.; Boggs, W. E.

    1982-04-01

    Occasionally, undesirable "sticky" adherent scale forms on low-carbon steel during reheating for hot forging. The mechanical abrading or chemical pickling required to remove this scale adds appreciably to the fabrication cost. Characterization of the steel-scale system by metallographic examination, x-ray diffraction, and electron-probe microanalysis revealed that nickel, silicon, and/or sulfur might be involved in the mechanism of sticky-scale formation. Laboratory reheating tests were conducted on steels with varied concentrations of nickel and silicon in atmospheres simulating those resulting from burning natural gas or sulfur-bearing fuels. Subsequent characterization of the scale formed during the tests tends to confirm that the composition of the steel, especially increased nickel and silicon contents, and the presence of the sulfur in the furnace atmosphere cause the formation of this undesirable scale.

  16. Scaling crossover for the average avalanche shape

    NASA Astrophysics Data System (ADS)

    Papanikolaou, Stefanos; Bohn, Felipe; Sommer, Rubem L.; Durin, Gianfranco; Zapperi, Stefano; Sethna, James P.

    2010-03-01

    Universality and the renormalization group claim to predict all behavior on long length and time scales asymptotically close to critical points. In practice, large simulations and heroic experiments have been needed to unambiguously test and measure the critical exponents and scaling functions. We announce here the measurement and prediction of universal corrections to scaling, applied to the temporal average shape of Barkhausen noise avalanches. We bypass the confounding factors of time-retarded interactions (eddy currents) by measuring thin permalloy films, and bypass thresholding effects and amplifier distortions by applying Wiener deconvolution. We show experimental shapes that are approximately symmetric, and measure the leading corrections to scaling. We solve a mean-field theory for the magnetization dynamics and calculate the relevant demagnetizing-field correction to scaling, showing qualitative agreement with the experiment. In this way, we move toward a quantitative theory useful at smaller time and length scales and farther from the critical point.
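    The Wiener deconvolution step mentioned above can be sketched generically in the frequency domain (this is not the authors' pipeline; the pulse shape, instrument response, and noise level are all assumed): Xhat(f) = Y(f) H*(f) / (|H(f)|^2 + 1/SNR).

      import numpy as np

      def wiener_deconvolve(y, h, snr=100.0):
          """Frequency-domain Wiener deconvolution of measurement y by impulse response h."""
          n = len(y)
          H = np.fft.rfft(h, n)
          Y = np.fft.rfft(y, n)
          X_hat = Y * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
          return np.fft.irfft(X_hat, n)

      # Synthetic avalanche-like pulse blurred by an exponential amplifier response (assumed shapes)
      rng = np.random.default_rng(2)
      t = np.arange(1024)
      x_true = np.exp(-0.5 * ((t - 300) / 30.0) ** 2)   # "true" avalanche shape
      h = np.exp(-t / 20.0)                              # assumed instrument response
      h /= h.sum()
      y = np.convolve(x_true, h)[: len(t)] + rng.normal(0, 0.01, len(t))

      x_rec = wiener_deconvolve(y, h, snr=1000.0)
      print("peak position recovered near:", int(np.argmax(x_rec)))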

  17. Gap Test Calibrations and Their Scaling

    NASA Astrophysics Data System (ADS)

    Sandusky, Harold

    2011-06-01

    Common tests for measuring the threshold for shock initiation are the NOL large scale gap test (LSGT) with a 50.8-mm diameter donor/gap and the expanded large scale gap test (ELSGT) with a 95.3-mm diameter donor/gap. Despite the same specifications for the explosive donor and polymethyl methacrylate (PMMA) gap in both tests, calibration of shock pressure in the gap versus distance from the donor scales by a factor of 1.75, not the 1.875 difference in their sizes. Recently reported model calculations suggest that the scaling discrepancy results from the viscoelastic properties of PMMA in combination with different methods for obtaining shock pressure. This is supported by the consistent scaling of these donors when calibrated in water-filled aquariums. Calibrations with water gaps will be provided and compared with PMMA gaps. Scaling for other donor systems will also be provided. Shock initiation data with water gaps will be reviewed.
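    For reference, the size ratio quoted above is simple arithmetic (a quick check, not data from the paper): the ELSGT/LSGT donor diameter ratio is 95.3/50.8, roughly 1.876, whereas the reported pressure-distance calibrations scale by 1.75.

      # Donor/gap diameters from the abstract (mm); a quick check of the quoted ratios.
      d_elsgt, d_lsgt = 95.3, 50.8
      geometric_ratio = d_elsgt / d_lsgt          # ~1.876, the "1.875" size difference
      calibration_ratio = 1.75                    # empirically observed calibration scaling
      print(f"geometric ratio {geometric_ratio:.3f} vs calibration ratio {calibration_ratio:.2f}")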

  18. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large from small scale instabilities and (ii) to study modes of wave number q of arbitrarily large scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative eddy viscosity scaling σ ∝ q^2 in its absence. This holds both for the Re ≪ 1 regime, where previously derived asymptotic results are verified, and for Re = O(1), which is beyond their range of validity. Furthermore, for values of Re above a critical value Re_Sc beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The nonlinear behavior of these large-scale instabilities is also examined in the nonlinear regime, where the largest scales of the system are found to be the most dominant energetically. These results are interpreted by low-order models.
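    A minimal sketch of how the two candidate scalings could be distinguished from measured growth rates (the data below are synthetic and the procedure is a generic log-log fit, not the authors' Floquet computation): fit sigma ~ C * (q/K)^alpha and check whether alpha is closer to 1 (AKA effect) or 2 (negative eddy viscosity).

      import numpy as np

      # Synthetic (q/K, growth rate) pairs standing in for Floquet results (assumed values)
      q_over_K = np.array([0.01, 0.02, 0.04, 0.08, 0.16])
      noise = 1.0 + 0.02 * np.random.default_rng(3).normal(size=q_over_K.size)
      sigma = 0.3 * q_over_K**2 * noise

      # Fit sigma ~ C * (q/K)^alpha in log-log space; alpha ~ 1 suggests an AKA effect,
      # alpha ~ 2 suggests a negative-eddy-viscosity scaling.
      alpha, logC = np.polyfit(np.log(q_over_K), np.log(sigma), 1)
      print(f"fitted scaling exponent alpha = {alpha:.2f}")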

  19. Scaling Rules for Pre-Injector Design

    SciTech Connect

    Tom Schwarz; Dan Amidei

    2003-07-13

    Proposed designs of the prebunching system of the NLC and TESLA are based on the assumption that scaling the SLC design to NLC/TESLA requirements should provide the desired performance. A simple equation is developed to suggest a scaling rule in terms of bunch charge and duration. Detailed simulations of prebunching systems scaled from a single design have been run to investigate these issues.

  20. How to calibrate the jet energy scale?

    SciTech Connect

    Hatakeyama, K.; /Rockefeller U.

    2006-01-01

    Top quarks dominantly decay into b-quark jets and W bosons, and the W bosons often decay into jets, thus the precise determination of the jet energy scale is crucial in measurements of many top quark properties. I present the strategies used by the CDF and D0 collaborations to determine the jet energy scale. The various cross checks performed to verify the determined jet energy scale and evaluate its systematic uncertainty are also discussed.