NASA Astrophysics Data System (ADS)
Feng, Guixiang; Ming, Dongping; Wang, Min; Yang, Jianyu
2017-06-01
Scale problems are a major source of concern in the field of remote sensing. Because remote sensing is a complex technological system, the connotation of scale and of scale effects in remote sensing is not yet fully understood. This paper therefore first introduces the connotations of pixel-based scale and summarizes the general understanding of the pixel-based scale effect. Analysis of the pixel-based scale effect is essential for choosing appropriate remote sensing data and proper processing parameters. Fractal dimension is a useful measure for analyzing pixel-based scale. However, traditional fractal dimension calculations do not consider the impact of spatial resolution, so the change of the scale effect with spatial resolution cannot be clearly reflected. This paper therefore proposes using spatial resolution as the modified scale parameter of two fractal methods to further analyze the pixel-based scale effect. To verify the results of the two modified methods (MFBM, the Modified Windowed Fractal Brownian Motion method based on surface area, and MDBM, the Modified Windowed Double Blanket Method), the existing information entropy method for scale effect analysis was used for evaluation, and six sub-regions of building and farmland areas were cut out from QuickBird images as experimental data. The experimental results show that both the fractal dimension and the information entropy follow the same trend as spatial resolution decreases, with inflection points appearing at the same feature scales. Further analysis shows that these feature scales (corresponding to the inflection points) are related to the actual sizes of the geo-objects, which result in fewer mixed pixels in the image, so the inflection points are strongly indicative of the observed features. The experimental results therefore indicate that the modified fractal methods effectively reflect the pixel-based scale effect in remote sensing data and help to analyze the observation scale from different aspects. This research will ultimately benefit remote sensing data selection and application.
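The evaluation step described here, tracking information entropy as spatial resolution is coarsened, can be sketched with simple block averaging. This is a generic illustration on a synthetic array rather than the authors' MFBM/MDBM methods or their QuickBird data; the array size and block factors are assumptions.

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of an image."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
image = rng.random((512, 512))              # stand-in for a QuickBird sub-region
for block in (1, 2, 4, 8, 16, 32):          # coarsen the pixel size step by step
    size = (512 // block) * block
    coarse = (image[:size, :size]
              .reshape(size // block, block, size // block, block)
              .mean(axis=(1, 3)))           # block averaging simulates lower resolution
    print(block, round(shannon_entropy(coarse), 3))
# inflection points in the entropy-vs-pixel-size curve hint at feature scales
```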
Verloo, Henk; Desmedt, Mario; Morin, Diane
2017-09-01
To evaluate two psychometric properties of the French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales, namely their internal consistency and construct validity. The Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales developed by Melnyk et al. are recognised as valid, reliable instruments in English. However, no psychometric validation for their French versions existed. Secondary analysis of a cross-sectional survey. Source data came from a cross-sectional descriptive study sample of 382 nurses and other allied healthcare providers. Cronbach's alpha was used to evaluate internal consistency, and principal axis factor analysis with varimax rotation was computed to determine construct validity. The French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales showed excellent reliability, with Cronbach's alphas close to those reported for Melnyk et al.'s original versions. Principal axis factor analysis showed medium-to-high factor loading scores without evidence of collinearity. Principal axis factor analysis with varimax rotation of the 16-item Evidence-Based Practice Beliefs scale resulted in a four-factor loading structure. Principal axis factor analysis with varimax rotation of the 17-item Evidence-Based Practice Implementation scale revealed a two-factor loading structure. Further research should attempt to understand why the French Evidence-Based Practice Implementation scale showed a two-factor loading structure whereas Melnyk et al.'s original has only one. The French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales can both be considered valid and reliable instruments for measuring Evidence-Based Practice beliefs and implementation. The results suggest that the French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales are valid and reliable and can therefore be used to evaluate the effectiveness of organisational strategies aimed at increasing professionals' confidence in Evidence-Based Practice, supporting its use and implementation. © 2017 John Wiley & Sons Ltd.
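A minimal sketch of the internal-consistency statistic reported here (Cronbach's alpha), assuming item scores arranged as a respondents-by-items NumPy array; it is not tied to the authors' French-language dataset, and the factor extraction (principal axis with varimax rotation) would be done separately with a dedicated factor-analysis package.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(382, 16)).astype(float)   # e.g., 16 Likert items
print(round(cronbach_alpha(scores), 3))
```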
[Recognition of landscape characteristic scale based on two-dimension wavelet analysis].
Gao, Yan-Ni; Chen, Wei; He, Xing-Yuan; Li, Xiao-Yu
2010-06-01
Three wavelet bases, i.e., Haar, Daubechies, and Symlet, were chosen to assess the validity of two-dimensional wavelet analysis for recognizing the characteristic scales of the urban, peri-urban, and rural landscapes of Shenyang. Because the transform scale of a two-dimensional wavelet must be an integer power of 2, some characteristic scales cannot be recognized accurately. The pixel resolution of the images was therefore resampled to 3, 3.5, 4, and 4.5 m to densify the scales in the analysis. The results showed that two-dimensional wavelet analysis was effective in detecting characteristic scales. Haar, Daubechies, and Symlet were the optimal wavelet bases for the peri-urban, urban, and rural landscapes, respectively. Both the Haar and Symlet bases performed well in recognizing the fine characteristic scale of the rural landscape and in detecting the boundary of the peri-urban landscape. The Daubechies and Symlet bases could also be used to detect the boundaries of the urban and rural landscapes, respectively.
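A two-dimensional discrete wavelet decomposition of a landscape raster can be sketched with PyWavelets using the three bases named in the abstract; the synthetic image, the decomposition depth, and the energy-per-level summary are illustrative assumptions, not the authors' procedure.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.random((256, 256))                 # stand-in for a resampled landscape image
for name in ("haar", "db4", "sym4"):           # Haar, Daubechies, Symlet bases
    coeffs = pywt.wavedec2(image, wavelet=name, level=4)
    # detail-coefficient energy per level; peaks across levels suggest characteristic scales
    energy = [float(sum(np.sum(d ** 2) for d in level)) for level in coeffs[1:]]
    print(name, [round(e, 1) for e in energy])
```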
Estimation of Handgrip Force from SEMG Based on Wavelet Scale Selection.
Wang, Kai; Zhang, Xianmin; Ota, Jun; Huang, Yanjiang
2018-02-24
This paper proposes a nonlinear correlation-based wavelet scale selection technique to select the effective wavelet scales for the estimation of handgrip force from surface electromyograms (SEMG). The SEMG signal corresponding to gripping force was collected from extensor and flexor forearm muscles during a force-varying analysis task. We performed a computational sensitivity analysis on the initial nonlinear SEMG-handgrip force model. To explore the nonlinear correlation between ten wavelet scales and handgrip force, a large-scale iteration based on Monte Carlo simulation was conducted. To choose a suitable combination of scales, we proposed a rule to combine wavelet scales based on the sensitivity of each scale and selected the appropriate combination of wavelet scales based on sequence combination analysis (SCA). The results of SCA indicated that scale combination VI is suitable for estimating force from the extensors and combination V is suitable for the flexors. The proposed method was compared to two former methods through prolonged static and force-varying contraction tasks. The experimental results showed that the root mean square errors derived by the proposed method for both static and force-varying contraction tasks were less than 20%. The accuracy and robustness of the handgrip force estimates derived by the proposed method are better than those obtained by the former methods.
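The scale-wise correlation idea can be sketched with a continuous wavelet transform over ten scales and a rank correlation against force. Spearman's rho here stands in for the paper's nonlinear correlation measure, and the synthetic SEMG/force signals, Morlet wavelet, and smoothing window are assumptions; the Monte Carlo sensitivity analysis and SCA steps are not reproduced.

```python
import numpy as np
import pywt
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
semg = rng.normal(size=4000)                            # stand-in for one SEMG channel
force = np.abs(np.cumsum(rng.normal(size=4000)))        # stand-in for handgrip force

scales = np.arange(1, 11)                               # ten wavelet scales
coefs, _ = pywt.cwt(semg, scales, "morl")
window = np.ones(200) / 200
for s, row in zip(scales, coefs):
    envelope = np.convolve(np.abs(row), window, mode="same")   # smoothed amplitude per scale
    rho, _ = spearmanr(envelope, force)
    print(int(s), round(rho, 3))
```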
Terluin, Berend; Brouwers, Evelien P M; Marchand, Miquelle A G; de Vet, Henrica C W
2018-05-01
Many paper-and-pencil (P&P) questionnaires have been migrated to electronic platforms. Differential item and test functioning (DIF and DTF) analysis constitutes a superior research design to assess measurement equivalence across modes of administration. The purpose of this study was to demonstrate an item response theory (IRT)-based DIF and DTF analysis to assess the measurement equivalence of a Web-based version and the original P&P format of the Four-Dimensional Symptom Questionnaire (4DSQ), measuring distress, depression, anxiety, and somatization. The P&P group (n = 2031) and the Web group (n = 958) consisted of primary care psychology clients. Unidimensionality and local independence of the 4DSQ scales were examined using IRT and Yen's Q3. Bifactor modeling was used to assess the scales' essential unidimensionality. Measurement equivalence was assessed using IRT-based DIF analysis using a 3-stage approach: linking on the latent mean and variance, selection of anchor items, and DIF testing using the Wald test. DTF was evaluated by comparing expected scale scores as a function of the latent trait. The 4DSQ scales proved to be essentially unidimensional in both modalities. Five items, belonging to the distress and somatization scales, displayed small amounts of DIF. DTF analysis revealed that the impact of DIF on the scale level was negligible. IRT-based DIF and DTF analysis is demonstrated as a way to assess the equivalence of Web-based and P&P questionnaire modalities. Data obtained with the Web-based 4DSQ are equivalent to data obtained with the P&P version.
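Yen's Q3, used here to check local independence, is simply the inter-item correlation of person-by-item residuals after an IRT model has been fitted; a minimal sketch, assuming the observed scores and the model-expected scores are already available as arrays (fitting the IRT model itself is not shown).

```python
import numpy as np

def yen_q3(observed, expected):
    """observed, expected: (n_persons, n_items) arrays of item scores and
    model-implied expectations; returns the item-by-item Q3 matrix."""
    residuals = observed - expected
    return np.corrcoef(residuals, rowvar=False)
# large off-diagonal Q3 values flag item pairs that violate local independence
```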
A Confirmatory Factor Analysis of Reilly's Role Overload Scale
ERIC Educational Resources Information Center
Thiagarajan, Palaniappan; Chakrabarty, Subhra; Taylor, Ronald D.
2006-01-01
In 1982, Reilly developed a 13-item scale to measure role overload. This scale has been widely used, but most studies did not assess the unidimensionality of the scale. Given the significance of unidimensionality in scale development, the current study reports a confirmatory factor analysis of the 13-item scale in two samples. Based on the…
Modal-pushover-based ground-motion scaling procedure
Kalkan, Erol; Chopra, Anil K.
2011-01-01
Earthquake engineering is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in a nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-mode pushover analysis. Appropriate for first-mode-dominated structures, this approach is extended to structures with significant higher-mode contributions by considering the elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
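As a concrete reference point for the detrending operations discussed here, a first-order DFA can be written in a few lines. This is the textbook algorithm (integrated profile, segmentation, local polynomial detrending, RMS fluctuation), not the authors' analytical frequency-response machinery, and the scale grid is an assumption.

```python
import numpy as np

def dfa(x, scales, order=1):
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segments = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segments:
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
            rms.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))
    return np.asarray(fluct)

x = np.random.default_rng(0).standard_normal(20000)   # white noise: alpha ~ 0.5
scales = np.unique(np.logspace(1.2, 3, 15).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # scaling exponent
print(round(alpha, 2))
```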
An analysis of ratings: A guide to RMRATE
Thomas C. Brown; Terry C. Daniel; Herbert W. Schroeder; Glen E. Brink
1990-01-01
This report describes RMRATE, a computer program for analyzing rating judgments. RMRATE scales ratings using several scaling procedures, and compares the resulting scale values. The scaling procedures include the median and simple mean, standardized values, scale values based on Thurstone's Law of Categorical Judgment, and regression-based values. RMRATE also...
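Two of the simpler scaling procedures mentioned (simple mean/median and standardized values) can be sketched directly; the Thurstonian and regression-based procedures in RMRATE are not reproduced, and the rating matrix here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
ratings = rng.integers(1, 11, size=(30, 12)).astype(float)   # 30 raters x 12 stimuli

median_scale = np.median(ratings, axis=0)
mean_scale = ratings.mean(axis=0)
# standardized values: z-score within each rater before averaging across raters
z = (ratings - ratings.mean(axis=1, keepdims=True)) / ratings.std(axis=1, ddof=1, keepdims=True)
standardized_scale = z.mean(axis=0)
```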
Analysis of DNA Sequences by an Optical Time-Integrating Correlator: Proposal
1991-11-01
[Only front matter was extracted for this record: table-of-contents and figure/table-list fragments covering the problem statement and current technology, the time-integrating correlator, representations of the DNA bases (each base encoded as a 7-bit pseudorandom sequence), the DNA analysis strategy, a data-flow figure for the DNA analysis system, and plots on logarithmic and linear scales.]
Incorporating scale into digital terrain analysis
NASA Astrophysics Data System (ADS)
Dragut, L. D.; Eisank, C.; Strasser, T.
2009-04-01
Digital Elevation Models (DEMs) and their derived terrain attributes are commonly used in soil-landscape modeling. Digital terrain analysis seeks to produce process-based terrain attributes that are meaningful to the soil properties of interest. Typically, standard 3 × 3 window-based algorithms are used for this purpose, which ties the scale of the resulting layers to the spatial resolution of the available DEM. This is likely to induce mismatches between the scale domains of the terrain information and the soil properties of interest, which further propagate biases into soil-landscape modeling. We have started developing a procedure to incorporate scale into digital terrain analysis for terrain-based environmental modeling (Drăguţ et al., in press). The workflow was exemplified on crop yield data. Terrain information was generalized into successive scale levels with focal statistics over increasing neighborhood sizes. The degree of association between each terrain derivative and the crop yield values was established iteratively for all scale levels through correlation analysis, with the first peak of correlation indicating the scale level to retain. Whereas mean curvature was one of the most poorly correlated terrain attributes in a standard 3 × 3 window-based analysis, after generalization it became the best-correlated variable. To illustrate the importance of scale, we compared the regression results of unfiltered and filtered mean curvature vs. crop yield. The comparison shows an improvement in R squared from 0.01 when the curvature was not filtered to 0.16 when the curvature was filtered within a 55 × 55 m neighborhood. This indicates the optimum size (scale) of the curvature information that influences soil fertility. We further used these results in an object-based image analysis environment to create terrain objects containing aggregated values of both the terrain derivatives and crop yield; hence, we introduce terrain segmentation as an alternative method for generating scale levels in terrain-based environmental modeling. Based on segments, R squared improved up to a value of 0.47. Before integrating the procedure described above into a software application, a thorough comparison between the results of different generalization techniques, on different datasets and terrain conditions, is necessary. This is the subject of our ongoing research as part of the SCALA project (Scales and Hierarchies in Landform Classification). References: Drăguţ, L., Schauppenlehner, T., Muhar, A., Strobl, J. and Blaschke, T., in press. Optimization of scale and parametrization for terrain segmentation: an application to soil-landscape modeling, Computers & Geosciences.
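The workflow sketched in this abstract (generalize a terrain derivative with focal statistics over growing neighborhoods, then correlate each scale level with crop yield and keep the first correlation peak) can be illustrated with SciPy's uniform filter; the rasters here are synthetic stand-ins and the window sizes are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
curvature = rng.normal(size=(200, 200))     # stand-in for a mean-curvature raster
crop_yield = rng.normal(size=(200, 200))    # stand-in for a co-registered yield raster

for w in range(3, 61, 6):                   # growing focal neighborhood = scale levels
    generalized = uniform_filter(curvature, size=w)
    r = np.corrcoef(generalized.ravel(), crop_yield.ravel())[0, 1]
    print(w, round(r, 3))
# the first peak of |r| across scale levels marks the scale level to retain
```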
Frequency-Specific Fractal Analysis of Postural Control Accounts for Control Strategies
Gilfriche, Pierre; Deschodt-Arsac, Véronique; Blons, Estelle; Arsac, Laurent M.
2018-01-01
Diverse indicators of postural control in humans have been explored for decades, mostly based on the trajectory of the center of pressure. Classical approaches focus on variability, based on the notion that if a posture is too variable, the subject is not stable. Going deeper, an improved understanding of the underlying physiology has been gained from studying variability in different frequency ranges, pointing to specific short loops (proprioception) and long loops (visuo-vestibular) in neural control. More recently, fractal analyses have proliferated and become useful additional metrics of postural control. They allowed identifying two scaling phenomena, in short and long timescales respectively. Here, we show that one of the most widely used methods for fractal analysis, Detrended Fluctuation Analysis, can be enhanced to account for scalings in specific frequency ranges. By computing and filtering a bank of synthetic fractal signals, we established how scaling analysis can be focused on specific frequency components. We called the resulting method Frequency-specific Fractal Analysis (FsFA) and used it to associate the two scaling phenomena of postural control with a proprioceptive-based control loop and a visuo-vestibular-based control loop. Convincing evidence of the method's validity then came from an application to the study of unaltered vs. altered postural control in athletes. Overall, the analysis suggests that at least two timescales contribute to postural control: velocity-based control in short timescales relying on proprioceptive sensors, and position-based control in longer timescales with visuo-vestibular sensors, which is a brand-new vision of postural control. Frequency-specific scaling exponents are promising markers of control strategies in humans. PMID:29643816
Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario
2014-01-01
Background: Selecting the correct statistical test and data mining method depends strongly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are examined in detail, and statistical comparison, modeling, and data mining methods are discussed using several medical examples. We present two clustering examples with ordinal variables, which are more challenging to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed with two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively, and their specificities were comparable. Conclusion: Using a clustering algorithm appropriate to the measurement scale of the variables in a study yields high performance. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
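A generic version of the ordinal clustering exercise can be sketched as follows: convert the 10-level ordinal scores to interval-like values (rank-based normal scores here, one of several possible conversions), cluster into two groups, and compare against the gold-standard labels. The data are synthetic stand-ins, and k-means is not the specific ordinal-scale clustering method used in the paper.

```python
import numpy as np
from scipy.stats import rankdata, norm
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.integers(1, 11, size=(683, 9)).astype(float)   # nine 10-level ordinal variables
gold = rng.integers(0, 2, size=683)                    # stand-in benign/malignant labels

# ordinal-to-interval conversion via rank-based normal scores
Xi = norm.ppf((rankdata(X, axis=0) - 0.5) / X.shape[0])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xi)
accuracy = max(np.mean(clusters == gold), np.mean(clusters != gold))   # label-switch safe
print(round(accuracy, 3))
```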
The Development and Validation of the Online Shopping Addiction Scale.
Zhao, Haiyan; Tian, Wei; Xin, Tao
2017-01-01
We report the development and validation of a scale to measure online shopping addiction. Inspired by previous theories and research on behavioral addiction, Griffiths's widely accepted six-component model was used as a frame of reference, and an 18-item scale was constructed, with each component measured by three items. The results of exploratory factor analysis based on Sample 1 (999 college students) and confirmatory factor analysis based on Sample 2 (854 college students) showed that Griffiths's substantive six-factor structure underlay the online shopping addiction scale. Cronbach's alpha suggested that the resulting scale was highly reliable. Concurrent validity, based on Sample 3 (328 college students), was also satisfactory, as indicated by correlations between the scale and measures of similar constructs. Finally, self-perceived online shopping addiction could be predicted by the scale to a relatively high degree. The present 18-item scale is a solid theory-based instrument for empirically measuring online shopping addiction and can be used to understand the phenomenon among young adults.
Using Rasch Analysis to Inform Rating Scale Development
ERIC Educational Resources Information Center
Van Zile-Tamsen, Carol
2017-01-01
The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…
Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W
2017-01-01
The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.
[Scale effect of Nanjing urban green infrastructure network pattern and connectivity analysis].
Yu, Ya Ping; Yin, Hai Wei; Kong, Fan Hua; Wang, Jing Jing; Xu, Wen Bin
2016-07-01
Based on the ArcGIS, Erdas, GuidosToolbox, and Conefor software platforms, and using morphological spatial pattern analysis (MSPA) and landscape connectivity analysis, this paper quantitatively analysed the scale effect, edge effect, and distance effect of the Nanjing urban green infrastructure (UGI) network pattern in 2013 by setting different pixel sizes (P) and edge widths in the MSPA analysis and different dispersal distance thresholds in the landscape connectivity analysis. The results showed that the landscape types derived from MSPA had clear scale and edge effects; scale effects influenced the landscape types only slightly, whereas edge effects were more pronounced. Different dispersal distances had a great impact on landscape connectivity, and a dispersal distance of 2 or 2.5 km was a critical threshold for Nanjing. With a pixel size of 30 m for the input data and an edge width of 30 m in the morphological model, more detailed landscape information on the Nanjing UGI network could be obtained. Analysing the scale, edge, and distance effects on the landscape types of the UGI network on the basis of MSPA and landscape connectivity helps in selecting the appropriate pixel size, edge width, and dispersal distance when developing these networks, and in better understanding the spatial pattern of UGI networks and the effects of scale and distance on their ecology. This supports a more scientifically valid set of design parameters for UGI network spatiotemporal pattern analysis. The results of this study provide an important reference for Nanjing UGI networks and a basis for analysing the spatial and temporal patterns of medium-scale UGI landscape networks in other regions.
Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.
Chen, Rong; Nixon, Erika; Herskovits, Edward
2016-04-01
Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
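The core of a seed-based analysis, correlating one seed time series with every voxel, reduces to a standardized matrix product; a minimal sketch on synthetic data, with the seed definition and array shapes as assumptions (ACA itself adds large-scale seed management and brain-behavior analysis on top of this).

```python
import numpy as np

rng = np.random.default_rng(0)
bold = rng.normal(size=(200, 5000))          # time points x voxels (synthetic)
seed_ts = bold[:, :50].mean(axis=1)          # hypothetical seed: mean of first 50 voxels

seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
bold_z = (bold - bold.mean(axis=0)) / bold.std(axis=0)
conn_map = bold_z.T @ seed_z / len(seed_z)   # Pearson r of every voxel with the seed
print(conn_map.shape, round(float(conn_map.max()), 3))
```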
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Local variance for multi-scale analysis in geomorphometry.
Drăguţ, Lucian; Eisank, Clemens; Strasser, Thomas
2011-07-15
Increasing availability of high resolution Digital Elevation Models (DEMs) is leading to a paradigm shift regarding scale issues in geomorphometry, prompting new solutions to cope with multi-scale analysis and detection of characteristic scales. We tested the suitability of the local variance (LV) method, originally developed for image analysis, for multi-scale analysis in geomorphometry. The method consists of: 1) up-scaling land-surface parameters derived from a DEM; 2) calculating LV as the average standard deviation (SD) within a 3 × 3 moving window for each scale level; 3) calculating the rate of change of LV (ROC-LV) from one level to another, and 4) plotting values so obtained against scale levels. We interpreted peaks in the ROC-LV graphs as markers of scale levels where cells or segments match types of pattern elements characterized by (relatively) equal degrees of homogeneity. The proposed method has been applied to LiDAR DEMs in two test areas different in terms of roughness: low relief and mountainous, respectively. For each test area, scale levels for slope gradient, plan, and profile curvatures were produced at constant increments with either resampling (cell-based) or image segmentation (object-based). Visual assessment revealed homogeneous areas that convincingly associate into patterns of land-surface parameters well differentiated across scales. We found that the LV method performed better on scale levels generated through segmentation as compared to up-scaling through resampling. The results indicate that coupling multi-scale pattern analysis with delineation of morphometric primitives is possible. This approach could be further used for developing hierarchical classifications of landform elements.
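The LV workflow described here (up-scale a land-surface parameter, take the mean local standard deviation in a 3 × 3 window at each level, then examine its rate of change) can be sketched with NumPy/SciPy; block-mean subsampling stands in for the paper's resampling or segmentation step, and the raster is synthetic.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(arr, window=3):
    mean = uniform_filter(arr, window)
    mean_sq = uniform_filter(arr * arr, window)
    sd = np.sqrt(np.clip(mean_sq - mean ** 2, 0, None))   # local SD in the moving window
    return float(sd.mean())                                # LV = average local SD

rng = np.random.default_rng(0)
slope = rng.normal(size=(512, 512))            # stand-in for a slope-gradient raster
levels = range(1, 16)
lv = [local_variance(uniform_filter(slope, f)[::f, ::f]) for f in levels]   # up-scaling
roc_lv = np.diff(lv) / np.array(lv[:-1])       # rate of change of LV between levels
# peaks in roc_lv mark candidate characteristic scales
```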
Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez
2017-02-28
Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10⁻¹⁰ m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale.
Multi-color electron microscopy by element-guided identification of cells, organelles and molecules.
Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I; de Boer, Pascal; Hagen, Kees C W; Hoogenboom, Jacob P; Giepmans, Ben N G
2017-04-07
Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased biomedical relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale 'color-EM' as a promising tool to unravel molecular (de)regulation in biomedicine.
Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong
2017-04-01
This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
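The first-order Taylor interval idea used in the HIFEM can be illustrated generically: evaluate the response at the interval midpoints and bound its spread by the sum of absolute sensitivities times the parameter radii. The finite-difference sensitivities and the toy response function below are assumptions; the paper's method works on the homogenized finite-element response instead.

```python
import numpy as np

def interval_bounds(f, p_mid, p_radius, h=1e-6):
    """First-order Taylor bounds: f(p_mid) +/- sum_i |df/dp_i| * radius_i."""
    p_mid = np.asarray(p_mid, dtype=float)
    y0 = f(p_mid)
    spread = 0.0
    for i, r in enumerate(p_radius):
        dp = np.zeros_like(p_mid)
        dp[i] = h
        sens = (f(p_mid + dp) - f(p_mid - dp)) / (2 * h)   # central-difference sensitivity
        spread += abs(sens) * r
    return y0 - spread, y0 + spread

response = lambda p: p[0] ** 2 + 3 * np.sin(p[1])          # toy response function
print(interval_bounds(response, [1.0, 0.5], [0.05, 0.02]))
```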
Evaluation of modal pushover-based scaling of one component of ground motion: Tall buildings
Kalkan, Erol; Chopra, Anil K.
2012-01-01
Nonlinear response history analysis (RHA) is now increasingly used for performance-based seismic design of tall buildings. Required for nonlinear RHAs is a set of ground motions selected and scaled appropriately so that analysis results would be accurate (unbiased) and efficient (having relatively small dispersion). This paper evaluates accuracy and efficiency of recently developed modal pushover–based scaling (MPS) method to scale ground motions for tall buildings. The procedure presented explicitly considers structural strength and is based on the standard intensity measure (IM) of spectral acceleration in a form convenient for evaluating existing structures or proposed designs for new structures. Based on results presented for two actual buildings (19 and 52 stories, respectively), it is demonstrated that the MPS procedure provided a highly accurate estimate of the engineering demand parameters (EDPs), accompanied by significantly reduced record-to-record variability of the responses. In addition, the MPS procedure is shown to be superior to the scaling procedure specified in the ASCE/SEI 7-05 document.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Theoretical analysis for scaling law of thermal blooming based on optical phase difference
NASA Astrophysics Data System (ADS)
Sun, Yunqiang; Huang, Zhilong; Ren, Zebin; Chen, Zhiqiang; Guo, Longde; Xi, Fengjie
2016-10-01
In order to explore the influence of the thermal blooming effect in pipe flow on laser propagation and to analyze the influencing factors, a scaling-law theoretical analysis of thermal blooming in pipe flow is carried out based on the optical path difference caused by the effect. First, by solving the energy coupling equation for laser beam propagation, the temperature of the flow is obtained, and the optical path difference caused by thermal blooming is then derived. By analyzing the influence of pipe size, flow field, and laser parameters on the optical path difference, an energy scaling parameter Ne = nT α L P R^2/(ρ ε Cp π R0^2) and a geometric scaling parameter Nc = ν R^2/(ε L) are derived for thermal blooming in pipe flow. Second, for the direct solution method, the coupled energy equations have analytic solutions only for a straight tube with a Gaussian beam. Given this limitation of directly solving the coupled equations, a dimensionless analysis based on the change of the optical path difference is adopted; it yields the same scaling parameters for pipe-flow thermal blooming, which gives the energy scaling parameter Ne and the geometric scaling parameter Nc good universality. The results indicate that when the laser power and beam diameter are changed, the optical path difference caused by thermal blooming in axial pipe flow does not change as long as the energy scaling parameter is kept constant. When the diameter or length of the pipe changes, the distribution of the optical path difference caused by thermal blooming in the axial gas flow does not change as long as the geometric scaling parameter is kept constant. In other words, when the pipe size and laser parameters change, the thermal blooming effect described by the optical path difference remains unchanged provided the two scaling parameters are held constant. The energy and geometric scaling parameters therefore genuinely describe the gas thermal blooming effect in axial pipe flow. These conclusions provide a useful reference for the construction of thermal blooming test systems for laser systems. Compared with the Bradley-Hermann distortion number ND and the Fresnel number NF, which were derived from the change in far-field beam intensity distortion, the scaling parameters for pipe-flow thermal blooming deduced from the variation of the optical path difference are well suited to optical systems with short laser propagation distance, large Fresnel number, and markedly changed optical path difference.
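For readability, the two scaling parameters quoted inline above can be written as display equations; the symbol definitions are those used in the paper and are not spelled out in the abstract.

```latex
N_e = \frac{n_T \,\alpha\, L\, P\, R^{2}}{\rho\, \varepsilon\, C_p\, \pi R_0^{2}},
\qquad
N_c = \frac{\nu R^{2}}{\varepsilon L}
```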
National scale biomass estimators for United States tree species
Jennifer C. Jenkins; David C. Chojnacky; Linda S. Heath; Richard A. Birdsey
2003-01-01
Estimates of national-scale forest carbon (C) stocks and fluxes are typically based on allometric regression equations developed using dimensional analysis techniques. However, the literature is inconsistent and incomplete with respect to large-scale forest C estimation. We compiled all available diameter-based allometric regression equations for estimating total...
Cleanthous, Sophie; Kinter, Elizabeth; Marquis, Patrick; Petrillo, Jennifer; You, Xiaojun; Wakeford, Craig; Sabatella, Guido
2017-01-01
Background: Study objectives were to evaluate the Multiple Sclerosis Impact Scale (MSIS-29) and explore an optimized scoring structure based on empirical post-hoc analyses of data from the Phase III ADVANCE clinical trial. Methods: ADVANCE MSIS-29 data from six time-points were analyzed in a sample of patients with relapsing–remitting multiple sclerosis (RRMS). Rasch Measurement Theory (RMT) analysis was undertaken to examine three broad areas: sample-to-scale targeting, measurement scale properties, and sample measurement validity. Interpretation of results led to an alternative MSIS-29 scoring structure, further evaluated alongside responsiveness of the original and revised scales at Week 48. Results: RMT analysis provided mixed evidence for Physical and Psychological Impact scales that were sub-optimally targeted at the lower functioning end of the scales. Their conceptual basis could also stand to improve based on item fit results. The revised MSIS-29 rescored scales improved but did not resolve the measurement scale properties and targeting of the MSIS-29. In two out of three revised scales, responsiveness analysis indicated strengthened ability to detect change. Conclusion: The revised MSIS-29 provides an initial evidence-based improved patient-reported outcome (PRO) instrument for evaluating the impact of MS. Revised scoring improves conceptual clarity and interpretation of scores by refining scale structure to include Symptoms, Psychological Impact, and General Limitations. Clinical trial: ADVANCE (ClinicalTrials.gov identifier NCT00906399). PMID:29104758
NASA Astrophysics Data System (ADS)
Chen, Guoxiong; Cheng, Qiuming
2016-02-01
Multi-resolution and scale invariance have been increasingly recognized as two closely related intrinsic properties of geofields such as geochemical and geophysical anomalies, and they are commonly investigated using multiscale- and scaling-analysis methods. In this paper, a wavelet-based multiscale decomposition (WMD) method is proposed to investigate the multiscale nature of geochemical patterns from large to small scales. In the light of the wavelet transformation of fractal measures, we demonstrate that the wavelet approximation operator provides a generalization of the box-counting method for scaling analysis of geochemical patterns. Specifically, the approximation coefficient acts as the generalized density value in density-area fractal modeling of singular geochemical distributions. Accordingly, we present a novel local singularity analysis (LSA) using the WMD algorithm, which extends the conventional moving average to a kernel-based operator for implementing LSA. Finally, the novel LSA was validated in a case study dealing with geochemical data (Fe2O3) in stream sediments for mineral exploration in Inner Mongolia, China. Compared with LSA implemented using the moving-average method, the novel WMD-based LSA identified improved weak geochemical anomalies associated with mineralization in the covered area.
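A window-based local singularity analysis of the moving-average type mentioned here can be sketched per pixel: average the values in windows of growing size, regress log(mean value) on log(window size), and recover the singularity index alpha from the slope plus the Euclidean dimension. The grid, window sizes, and implementation details below are assumptions; the paper's WMD-based kernel operator is not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_singularity(grid, half_widths=(1, 2, 3, 4, 5), E=2):
    sizes = np.array([2 * h + 1 for h in half_widths], dtype=float)
    log_r = np.log(sizes)
    means = np.stack([uniform_filter(grid, size=int(s)) for s in sizes])
    log_mu = np.log(np.clip(means, 1e-12, None))
    # per-pixel least-squares slope of log(mean value) vs log(window size)
    lr = log_r[:, None, None]
    slope = ((lr - log_r.mean()) * (log_mu - log_mu.mean(axis=0))).sum(axis=0) \
            / ((log_r - log_r.mean()) ** 2).sum()
    return slope + E        # alpha < E indicates local enrichment (positive singularity)

rng = np.random.default_rng(0)
fe2o3 = rng.lognormal(mean=0.0, sigma=0.5, size=(128, 128))   # stand-in geochemical grid
alpha = local_singularity(fe2o3)
print(alpha.shape, round(float(alpha.mean()), 3))
```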
NASA Astrophysics Data System (ADS)
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as a concrete manifestation of financial time series carrying plenty of potential information, are often used in the study of financial time series. In this paper, we use stock data to recognize patterns through dissimilarity matrices based on modified cross-sample entropy, and three-dimensional perceptual maps of the results are then provided through multidimensional scaling. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.
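The final embedding step can be sketched with scikit-learn's MDS on a precomputed dissimilarity matrix; the matrix below is random, standing in for a cross-sample-entropy-based dissimilarity between 18 stock indices, so the coordinates are illustrative only.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
raw = rng.random((18, 18))
D = (raw + raw.T) / 2                      # symmetric dissimilarity stand-in
np.fill_diagonal(D, 0.0)

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)              # 3-D perceptual map of the 18 indices
print(coords.shape)
```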
Kalkan, Erol; Chopra, Anil K.
2010-01-01
Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
Multi-Spatiotemporal Patterns of Residential Burglary Crimes in Chicago: 2006-2016
NASA Astrophysics Data System (ADS)
Luo, J.
2017-10-01
This research explores the patterns of burglary crimes at multiple spatiotemporal scales in Chicago between 2006 and 2016. Two spatial scales are investigated: census block and police beat area. At each spatial scale, three temporal scales are integrated to create spatiotemporal slices: an hourly scale with a two-hour time step from 12:00 a.m. to the end of the day; a daily scale with a one-day step from Sunday to Saturday within a week; and a monthly scale with a one-month step from January to December. A total of six types of spatiotemporal slices are created as the base for the analysis. Burglary crimes are spatiotemporally aggregated to the slices based on where and when they occurred. For each type of spatiotemporal slice with burglary occurrences integrated, a spatiotemporal neighborhood is defined and managed in a spatiotemporal matrix. Hot-spot analysis identifies spatiotemporal clusters of each type of slice, and spatiotemporal trend analysis indicates how the clusters shift in space and time. The analysis results will provide helpful information for better-targeted policing and for crime prevention policy, such as scheduling police patrols with respect to the times and places covered.
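The spatiotemporal slicing described here boils down to grouping incidents by an area identifier and a binned time field; a minimal pandas sketch with hypothetical column names ('beat', 'timestamp') and a handful of made-up rows.

```python
import pandas as pd

df = pd.DataFrame({
    "beat": ["0111", "0111", "0213", "0213"],
    "timestamp": pd.to_datetime(["2016-03-05 01:20", "2016-03-05 14:45",
                                 "2016-07-19 23:10", "2016-07-24 02:05"]),
})
df["two_hour_bin"] = df["timestamp"].dt.hour // 2    # hourly scale: two-hour steps
df["weekday"] = df["timestamp"].dt.dayofweek         # daily scale within the week
df["month"] = df["timestamp"].dt.month               # monthly scale

beat_by_hour = df.groupby(["beat", "two_hour_bin"]).size()   # one of the six slice types
print(beat_by_hour)
```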
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
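The cost argument in this overview (derivatives with respect to all parameters for the price of one extra solve) can be made concrete with a toy discrete adjoint: for A(p) u = b and objective J = c^T u, a single adjoint solve A^T lam = c yields dJ/dp_i = -lam^T (dA/dp_i) u for every parameter. The small linear system below is an assumption for illustration, not NASA's CFD adjoint.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3                                          # state size, number of parameters
b, c = rng.normal(size=n), rng.normal(size=n)
A0 = np.eye(n)
A_parts = [rng.normal(size=(n, n)) * 0.1 for _ in range(m)]
p = rng.normal(size=m)

def A(p):
    return A0 + sum(pi * Ai for pi, Ai in zip(p, A_parts))

u = np.linalg.solve(A(p), b)                         # one forward solve
lam = np.linalg.solve(A(p).T, c)                     # one adjoint solve
grad = np.array([-lam @ (Ai @ u) for Ai in A_parts]) # dJ/dp_i for all parameters at once

# finite-difference check of the first component
h = 1e-6
dp = np.array([h, 0.0, 0.0])
fd = (c @ np.linalg.solve(A(p + dp), b) - c @ np.linalg.solve(A(p - dp), b)) / (2 * h)
print(round(float(grad[0]), 6), round(float(fd), 6))
```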
Clustering "N" Objects into "K" Groups under Optimal Scaling of Variables.
ERIC Educational Resources Information Center
van Buuren, Stef; Heiser, Willem J.
1989-01-01
A method based on homogeneity analysis (multiple correspondence analysis or multiple scaling) is proposed to reduce many categorical variables to one variable with "k" categories. The method is a generalization of the sum of squared distances cluster analysis problem to the case of mixed measurement level variables. (SLD)
ERIC Educational Resources Information Center
Pfeiffer, Steven I.; Jarosewich, Tania
2007-01-01
This study analyzes the standardization sample of a new teacher rating scale designed to assist in the identification of gifted students. The Gifted Rating Scales-School Form (GRS-S) is based on a multidimensional model of giftedness. Results indicate no age or race/ethnicity differences on any of the scales and small but significant differences…
Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data
NASA Technical Reports Server (NTRS)
Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon
1997-01-01
A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.
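A minimal sketch of a Mexican-hat continuous wavelet analysis of a single trace is shown below, using the PyWavelets library; the sampling rate and the synthetic "velocity" signal are assumptions for illustration, not the hot-wire data analyzed in the paper.

```python
import numpy as np
import pywt

# Mexican-hat CWT of a synthetic trace: locate the dominant time scale of the signal.
fs = 10_000.0                                   # assumed sampling rate, Hz
t = np.arange(0, 0.2, 1 / fs)
u = np.sin(2 * np.pi * 80 * t) + 0.5 * np.random.randn(t.size)   # stand-in signal

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(u, scales, 'mexh', sampling_period=1 / fs)

energy = coefs ** 2                             # local wavelet energy density
dominant_scale = scales[np.argmax(energy.mean(axis=1))]
print("scale with largest mean energy:", dominant_scale)
```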
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition (whether functional, physical, or discipline-based) that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
Global-scale patterns of forest fragmentation
Kurt H. Riitters; James D. Wickham; R. O' Neill; B. Jones; E. Smith
2000-01-01
We report an analysis of forest fragmentation based on 1-km resolution land-cover maps for the globe. Measurements in analysis windows from 81 km² (9 × 9 pixels, "small" scale) to 59,049 km² (243 × 243 pixels, "large" scale) were used to characterize the fragmentation around each forested pixel. We identified six categories of fragmentation (...
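A hedged sketch of the moving-window idea follows: the fraction of forested pixels in square windows of two sizes around every pixel of a binary forest map, computed on a toy raster rather than the global 1-km land-cover data, with a simple illustrative rule for one fragmentation category.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Fraction of forested pixels in square moving windows of two sizes (toy data).
rng = np.random.default_rng(1)
forest = (rng.random((243, 243)) > 0.4).astype(float)   # toy binary forest map

p_small = uniform_filter(forest, size=9)     # "small" scale, 9 x 9 pixels (~81 km^2)
p_large = uniform_filter(forest, size=243)   # "large" scale, 243 x 243 pixels

# A forested pixel could then be assigned a fragmentation category from these
# window proportions (e.g. "interior" where the small-scale proportion is near 1).
interior_small = (forest == 1) & (p_small > 0.9)
print("share of interior forest at the small scale:", round(interior_small.mean(), 3))
```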
a Region-Based Multi-Scale Approach for Object-Based Image Analysis
NASA Astrophysics Data System (ADS)
Kavzoglu, T.; Yildiz Erdemir, M.; Tonbul, H.
2016-06-01
Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important for increasing classification accuracy, and it depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
Grassi, Mario; Nucera, Andrea
2010-01-01
The objective of this study was twofold: 1) to confirm the hypothetical eight scales and two component summaries of the questionnaire Short Form 36 Health Survey (SF-36), and 2) to evaluate the performance of two alternative measures to the original physical component summary (PCS) and mental component summary (MCS). We performed principal component analysis (PCA) based on 35 items, after optimal scaling via multiple correspondence analysis (MCA), and subsequently on eight scales, after standard summative scoring. Item-based summary measures were planned. Data from the European Community Respiratory Health Survey II follow-up of 8854 subjects from 25 centers were analyzed to cross-validate the original and the novel PCS and MCS. Overall, the scale- and item-based comparison indicated that the SF-36 scales and summaries meet the supposed dimensionality. However, vitality, social functioning, and general health items did not fit the data optimally. The novel measures, derived a posteriori by unit-rule from an oblique (correlated) MCA/PCA solution, are simple item sums or weighted scale sums where the weights are the raw scale ranges. These item-based scores yielded consistent scale-summary results for outlier profiles, with the expected known-group differences validity. We were able to confirm the hypothesized dimensionality of eight scales and two summaries of the SF-36. The alternative scoring reaches at least the same required standards as the original scoring. In addition, it can reduce the item-scale inconsistencies without loss of predictive validity.
Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis
Wind, Stefanie A.; Engelhard, George
2015-01-01
Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement properties, such as invariance, in contexts where response processes are not well understood. Because rater-mediated assessments involve complex interactions among many variables, including assessment contexts, student artifacts, rubrics, individual rater characteristics, and others, rater-assigned scores are suitable candidates for Mokken scale analysis. The purposes of this study are to describe a suite of indices that can be used to explore the psychometric quality of data from rater-mediated assessments and to illustrate the substantive interpretation of Mokken-based statistics and displays in this context. Techniques that are commonly used in polytomous applications of Mokken scaling are adapted for use with rater-mediated assessments, with a focus on the substantive interpretation related to individual raters. Overall, the findings suggest that indices of rater monotonicity, rater scalability, and invariant rater ordering based on Mokken scaling provide diagnostic information at the level of individual raters related to the requirements for invariant measurement. These Mokken-based indices serve as an additional suite of diagnostic tools for exploring the quality of data from rater-mediated assessments that can supplement rating quality indices based on parametric models. PMID:29795883
A picture for the coupling of unemployment and inflation
NASA Astrophysics Data System (ADS)
Safdari, H.; Hosseiny, A.; Vasheghani Farahani, S.; Jafari, G. R.
2016-02-01
The aim of this article is to illustrate the scaling features of two quantities widely discussed in the media: unemployment and inflation. We carry out a scaling analysis of the coupling between unemployment and inflation, based on wavelet analysis as well as detrended fluctuation analysis (DFA). Through our analysis we find that while unemployment is time-scale invariant, inflation is bi-scale. We show that inflation possesses a five-year time scale and behaves differently before and after this scale period; this behaviour of inflation provides the basis for the coupling to inherit the stated time interval. Although inflation is bi-scale, it is unemployment that shows a strong multifractality feature. Using cross wavelet analysis we provide a picture that illustrates the dynamics of the coupling between unemployment and inflation in terms of intensity, direction, and scale. The coupling between inflation and unemployment is not symmetric: the influence in one direction differs from that in the opposite direction. Regarding scaling, the coupling exhibits different features at different scales: the correlation can be positive at one scale and, at the same time, negative at another.
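A compact sketch of plain (first-order) detrended fluctuation analysis in Python is given below; the white-noise test series and the scale range are illustrative assumptions, not the unemployment or inflation data used in the paper.

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis; returns the fluctuation F(s) per scale."""
    y = np.cumsum(x - np.mean(x))             # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)      # local linear trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

# Toy example with white noise; the fitted slope (scaling exponent) should be near 0.5.
x = np.random.randn(4096)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("DFA scaling exponent:", round(alpha, 2))
```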
Scaling earthquake ground motions for performance-based assessment of buildings
Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.
2011-01-01
The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
A scale-invariant change detection method for land use/cover change research
NASA Astrophysics Data System (ADS)
Xing, Jin; Sieber, Renee; Caelli, Terrence
2018-07-01
Land Use/Cover Change (LUCC) detection relies increasingly on comparing remote sensing images with different spatial and spectral scales. Based on scale-invariant image analysis algorithms in computer vision, we propose a scale-invariant LUCC detection method to identify changes from scale heterogeneous images. This method is composed of an entropy-based spatial decomposition, two scale-invariant feature extraction methods, Maximally Stable Extremal Region (MSER) and Scale-Invariant Feature Transformation (SIFT) algorithms, a spatial regression voting method to integrate MSER and SIFT results, a Markov Random Field-based smoothing method, and a support vector machine classification method to assign LUCC labels. We test the scale invariance of our new method with a LUCC case study in Montreal, Canada, 2005-2012. We found that the scale-invariant LUCC detection method provides similar accuracy compared with the resampling-based approach but this method avoids the LUCC distortion incurred by resampling.
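A hedged Python/OpenCV sketch of the kind of scale-invariant feature extraction the method builds on (SIFT keypoints plus MSER regions) follows; the two synthetic images are stand-ins for co-registered scenes at different resolutions, and this is not the authors' full pipeline, which also includes spatial regression voting, MRF smoothing, and SVM labelling.

```python
import cv2
import numpy as np

# Stand-in "scenes": the same textured patch at two resolutions.
rng = np.random.default_rng(0)
scene = (40 * rng.random((256, 256))).astype(np.uint8)
scene[64:192, 64:192] += 150                       # a bright "land-cover patch"
img_hi = cv2.GaussianBlur(scene, (5, 5), 0)        # higher-resolution image
img_lo = cv2.resize(img_hi, (128, 128))            # same scene at half the resolution

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img_hi, None)
kp2, des2 = sift.detectAndCompute(img_lo, None)

mser = cv2.MSER_create()
regions_hi, _ = mser.detectRegions(img_hi)
regions_lo, _ = mser.detectRegions(img_lo)

# Descriptor matches across resolutions mark stable (likely unchanged) locations;
# areas with few or poor matches become candidate change regions for a classifier.
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [pair[0] for pair in matches
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
print(len(kp1), "and", len(kp2), "SIFT keypoints;", len(good), "cross-resolution matches")
```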
When micro meets macro: microbial lipid analysis and ecosystem ecology
NASA Astrophysics Data System (ADS)
Balser, T.; Gutknecht, J.
2008-12-01
There is growing interest in linking soil microbial community composition and activity with large-scale field studies of nutrient cycling or plant community response to disturbances. And while analysis of microbial communities has moved rapidly in the past decade from culture-based to non-culture based techniques, still it must be asked what have we gained from the move? How well does the necessarily micro-scale of microbial analysis allow us to address questions of interest at the macro-scale? Several challenges exist in bridging the scales, and foremost is the question of methodological feasibility. Past microbiological methodologies have not been readily adaptable to the large sample sizes necessary for ecosystem-scale research. As a result, it has been difficult to generate compatible microbial and ecosystem data sets. We describe the use of a modified lipid extraction method to generate microbial community data sets that allow us to match landscape-scale or long-term ecological studies with microbial community data. We briefly discuss the challenges and advantages associated with lipid analysis as an approach to addressing ecosystem ecological studies, and provide examples from our research in ecosystem restoration and recovery following disturbance and climate change.
NASA Astrophysics Data System (ADS)
Pokhrel, A.; El Hannach, M.; Orfino, F. P.; Dutta, M.; Kjeang, E.
2016-10-01
X-ray computed tomography (XCT), a non-destructive technique, is proposed for three-dimensional, multi-length scale characterization of complex failure modes in fuel cell electrodes. Comparative tomography data sets are acquired for a conditioned beginning of life (BOL) and a degraded end of life (EOL) membrane electrode assembly subjected to cathode degradation by voltage cycling. Micro length scale analysis shows a five-fold increase in crack size and 57% thickness reduction in the EOL cathode catalyst layer, indicating widespread action of carbon corrosion. Complementary nano length scale analysis shows a significant reduction in porosity, increased pore size, and dramatically reduced effective diffusivity within the remaining porous structure of the catalyst layer at EOL. Collapsing of the structure is evident from the combination of thinning and reduced porosity, as uniquely determined by the multi-length scale approach. Additionally, a novel image processing based technique developed for nano scale segregation of pore, ionomer, and Pt/C dominated voxels shows an increase in ionomer volume fraction, Pt/C agglomerates, and severe carbon corrosion at the catalyst layer/membrane interface at EOL. In summary, XCT based multi-length scale analysis enables detailed information needed for comprehensive understanding of the complex failure modes observed in fuel cell electrodes.
Beutel, Manfred E; Brähler, Elmar; Wiltink, Jörg; Michal, Matthias; Klein, Eva M; Jünger, Claus; Wild, Philipp S; Münzel, Thomas; Blettner, Maria; Lackner, Karl; Nickels, Stefan; Tibubos, Ana N
2017-01-01
Aim of the study was the development and validation of the psychometric properties of a six-item bi-factorial instrument for the assessment of social support (emotional and tangible support) with a population-based sample. A cross-sectional data set of N = 15,010 participants enrolled in the Gutenberg Health Study (GHS) in 2007-2012 was divided into two sub-samples. The GHS is a population-based, prospective, observational single-center cohort study in the Rhein-Main region in western mid-Germany. The first sub-sample was used for scale development by performing an exploratory factor analysis. In order to test construct validity, confirmatory factor analyses were run to compare the extracted bi-factorial model with the one-factor solution. Reliability of the scales was indicated by calculating internal consistency. External validity was tested by investigating demographic characteristics, health behavior, and distress using analysis of variance, Spearman and Pearson correlation analysis, and logistic regression analysis. Based on an exploratory factor analysis, a set of six items was extracted representing two independent factors. The two-factor structure of the Brief Social Support Scale (BS6) was confirmed by the results of the confirmatory factor analyses. Fit indices of the bi-factorial model were good and better than those of the one-factor solution. External validity was demonstrated for the BS6. The BS6 is a reliable and valid short scale that can be applied in social surveys due to its brevity to assess emotional and practical dimensions of social support.
Examining the validity and reliability of the Taita symptom checklist using Rasch analysis.
Chen, Yun-Ling; Pan, Ay-Woan; Chung, LyInn; Chen, Tsyr-Jang
2015-03-01
The Taita symptom checklist (TSCL) is a standardized self-rating psychiatric symptom scale for outpatients with mental illness in Taiwan. This study aimed to examine the validity and reliability of the TSCL using Rasch analysis. The TSCL was given to 583 healthy people and 479 people with mental illness. Rasch analysis was used to examine the appropriateness of the rating scale, the unidimensionality of the scale, the differential item functioning across sex and diagnosis, and the Rasch cut-off score of the scale. Rasch analysis confirmed that the revised 37 items with a three-point rating scale of the TSCL demonstrated good internal consistency and met criteria for unidimensionality. The person and item reliability indices were high. The TSCL could reliably measure healthy participants and patients with mental illness. Differential item functioning due to sex or psychiatric diagnosis was evident for three items. A Rasch cut-off score for TSCL was produced for detecting participants' psychiatric symptoms based on an eight-level classification. The TSCL is a reliable and valid assessment to evaluate the participants' perceived disturbance of psychiatric symptoms based on Rasch analysis. Copyright © 2013. Published by Elsevier B.V.
Multifunctional picoliter droplet manipulation platform and its application in single cell analysis.
Gu, Shu-Qing; Zhang, Yun-Xia; Zhu, Ying; Du, Wen-Bin; Yao, Bo; Fang, Qun
2011-10-01
We developed an automated and multifunctional microfluidic platform based on DropLab to perform flexible generation and complex manipulations of picoliter-scale droplets. Multiple manipulations including precise droplet generation, sequential reagent merging, and multistep solid-phase extraction for picoliter-scale droplets could be achieved in the present platform. The system precision in generating picoliter-scale droplets was significantly improved by minimizing the thermo-induced fluctuation of flow rate. A novel droplet fusion technique based on the difference of droplet interfacial tensions was developed without the need of special microchannel networks or external devices. It enabled sequential addition of reagents to droplets on demand for multistep reactions. We also developed an effective picoliter-scale droplet splitting technique with magnetic actuation. The difficulty in phase separation of magnetic beads from picoliter-scale droplets due to the high interfacial tension was overcome using ferromagnetic particles to carry the magnetic beads to pass through the phase interface. With this technique, multistep solid-phase extraction was achieved among picoliter-scale droplets. The present platform had the ability to perform complex multistep manipulations to picoliter-scale droplets, which is particularly required for single cell analysis. Its utility and potentials in single cell analysis were preliminarily demonstrated in achieving high-efficiency single-cell encapsulation, enzyme activity assay at the single cell level, and especially, single cell DNA purification based on solid-phase extraction.
Stress analysis of 27% scale model of AH-64 main rotor hub
NASA Technical Reports Server (NTRS)
Hodges, R. V.
1985-01-01
Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.
Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis
ERIC Educational Resources Information Center
Chow, Kui Foon; Kennedy, Kerry John
2014-01-01
International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…
Allometric scaling: analysis of LD50 data.
Burzala-Kowalczyk, Lidia; Jongbloed, Geurt
2011-04-01
The need to identify toxicologically equivalent doses across different species is a major issue in toxicology and risk assessment. In this article, we investigate interspecies scaling based on the allometric equation applied to the single oral LD50 data previously analyzed by Rhomberg and Wolff. We focus on the statistical approach, namely, regression analysis of the mentioned data. In contrast to Rhomberg and Wolff's analysis of species pairs, we perform an overall analysis based on the whole data set. From our study it follows that if one assumes one single scaling rule for all species and substances in the data set, then β = 1 is the most natural choice among a set of candidates known in the literature. In fact, we obtain quite narrow confidence intervals for this parameter. However, the estimate of the variance in the model is relatively high, resulting in rather wide prediction intervals. © 2010 Society for Risk Analysis.
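The fit behind interspecies allometric scaling reduces to a log-log regression: for the allometric equation y = a·Mᵝ, regressing log(dose per animal) on log(body weight) yields the exponent β as the slope. The sketch below uses made-up numbers for illustration, not the Rhomberg and Wolff data.

```python
import numpy as np

# Hypothetical (dose, body weight) pairs spanning mouse to human, for illustration only.
body_weight_kg = np.array([0.02, 0.2, 0.3, 2.5, 4.0, 70.0])
ld50_mg = np.array([0.6, 5.5, 9.0, 70.0, 115.0, 2000.0])     # dose per animal

beta, log_a = np.polyfit(np.log(body_weight_kg), np.log(ld50_mg), 1)
print("estimated allometric exponent beta:", round(beta, 2))  # beta ~ 1 => dose scales with mass
```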
NASA Astrophysics Data System (ADS)
Liu, Changjiang; Cheng, Irene; Zhang, Yi; Basu, Anup
2017-06-01
This paper presents an improved multi-scale Retinex (MSR) based enhancement for aerial images under low visibility. In traditional multi-scale Retinex, three scales are commonly employed, which limits its application scenarios. We extend our research to a general-purpose enhancement method and design an MSR with more than three scales. Based on mathematical analysis and deductions, an explicit multi-scale representation is proposed that balances image contrast and color consistency. In addition, a histogram truncation technique is introduced as a post-processing strategy to remap the multi-scale Retinex output to the dynamic range of the display. Analysis of experimental results and comparisons with existing algorithms demonstrate the effectiveness and generality of the proposed method. Results on image quality assessment prove the accuracy of the proposed method with respect to both objective and subjective criteria.
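For reference, the textbook form of multi-scale Retinex averages single-scale outputs log(I) − log(I ∗ G_σ) over several Gaussian scales, followed here by a simple percentile (histogram-truncation style) remapping. This is a generic sketch, not the paper's modified multi-scale representation; the scale values are assumptions.

```python
import numpy as np
import cv2

def multi_scale_retinex(img, sigmas=(15, 80, 200)):
    """Generic MSR: average of log(I) - log(I * G_sigma) over the given Gaussian scales."""
    img = img.astype(np.float64) + 1.0
    msr = np.zeros_like(img)
    for sigma in sigmas:
        blurred = cv2.GaussianBlur(img, (0, 0), sigma)
        msr += np.log(img) - np.log(blurred)
    msr /= len(sigmas)
    # Percentile-based remapping of the output to the display range (0-255)
    lo, hi = np.percentile(msr, (1, 99))
    return np.clip((msr - lo) / (hi - lo) * 255, 0, 255).astype(np.uint8)

# Usage on a low-visibility aerial frame (hypothetical file name):
# enhanced = multi_scale_retinex(cv2.imread("aerial.png", cv2.IMREAD_GRAYSCALE))
```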
Multiple-length-scale deformation analysis in a thermoplastic polyurethane
Sui, Tan; Baimpas, Nikolaos; Dolbnya, Igor P.; Prisacariu, Cristina; Korsunsky, Alexander M.
2015-01-01
Thermoplastic polyurethane elastomers enjoy an exceptionally wide range of applications due to their remarkable versatility. These block co-polymers are used here as an example of a structurally inhomogeneous composite containing nano-scale gradients, whose internal strain differs depending on the length scale of consideration. Here we present a combined experimental and modelling approach to the hierarchical characterization of block co-polymer deformation. Synchrotron-based small- and wide-angle X-ray scattering and radiography are used for strain evaluation across the scales. Transmission electron microscopy image-based finite element modelling and fast Fourier transform analysis are used to develop a multi-phase numerical model that achieves agreement with the combined experimental data using a minimal number of adjustable structural parameters. The results highlight the importance of fuzzy interfaces, that is, regions of nanometre-scale structure and property gradients, in determining the mechanical properties of hierarchical composites across the scales. PMID:25758945
H2@Scale Resource and Market Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruth, Mark
The 'H2@Scale' concept is based on the potential for wide-scale utilization of hydrogen as an energy intermediate, where the hydrogen is produced from low-cost energy resources and used in both the transportation and industrial sectors. H2@Scale has the potential to address grid resiliency, energy security, and cross-sectoral emissions reductions. This presentation summarizes the status of an ongoing analysis effort to quantify the benefits of H2@Scale. It includes initial results regarding market potential, resource potential, and impacts when electrolytic hydrogen is produced with renewable electricity to meet the potential market demands. It also proposes additional analysis efforts to better quantify each of the factors.
2009-06-01
simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.
A multi-topographical-instrument analysis: the breast implant texture measurement
NASA Astrophysics Data System (ADS)
Garabédian, Charles; Delille, Rémi; Deltombe, Raphaël; Anselme, Karine; Atlan, Michael; Bigerelle, Maxence
2017-06-01
Capsular contracture is a major complication after implant-based breast augmentation. To address this tissue reaction, most manufacturers texture the outer breast implant surfaces with calibrated salt grains. However, the analysis of these surfaces at sub-micron scales has been under-studied. This scale range is of interest for understanding the fate of silicone particles potentially released from the implant surface and the aetiology of newly reported complications, such as Anaplastic Large Cell Lymphoma. The surface measurements were accomplished by tomography and by two optical devices based on interferometry and on focus variation. The robustness of the measurements was investigated from the tissue scale to the cellular scale. The macroscopic pore-based structure of the textured implant surfaces is consistently measured by the three instruments. However, the multi-scale analyses start to be discrepant in a scale range between 50 µm and 500 µm, characteristic of a finer secondary roughness regardless of the pore shape. The focus variation and the micro-tomography fail to capture this roughness regime because of a focus-related optical artefact and a step-shaped artefact, respectively.
Wang, Lu-Yong; Fasulo, D
2006-01-01
Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, the disease-susceptible SNPs may have small marginal effects in the population and are unlikely to be retained after the univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. The more recent random forest method provides a more robust approach for screening SNPs at the scale of thousands. However, for larger-scale data, such as Affymetrix Human Mapping 100K GeneChip data, a faster screening method is required to screen SNPs in whole-genome, large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.
Static Aeroelastic Scaling and Analysis of a Sub-Scale Flexible Wing Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Ting, Eric; Lebofsky, Sonia; Nguyen, Nhan; Trinh, Khanh
2014-01-01
This paper presents an approach to the development of a scaled wind tunnel model for static aeroelastic similarity with a full-scale wing model. The full-scale aircraft model is based on the NASA Generic Transport Model (GTM) with flexible wing structures referred to as the Elastically Shaped Aircraft Concept (ESAC). The baseline stiffness of the ESAC wing represents a conventionally stiff wing model. Static aeroelastic scaling is conducted on the stiff wing configuration to develop the wind tunnel model, but additional tailoring is also conducted such that the wind tunnel model achieves a 10% wing tip deflection at the wind tunnel test condition. An aeroelastic scaling procedure and analysis is conducted, and a sub-scale flexible wind tunnel model based on the full-scale's undeformed jig-shape is developed. Optimization of the flexible wind tunnel model's undeflected twist along the span, or pre-twist or wash-out, is then conducted for the design test condition. The resulting wind tunnel model is an aeroelastic model designed for the wind tunnel test condition.
Comparison of detrending methods for fluctuation analysis in hydrology
NASA Astrophysics Data System (ADS)
Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David
2011-03-01
Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of the scaling properties of the time series. In this study, three detrending methods, including the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with an obvious, relatively regular periodic trend. Results indicated that: (1) the Fourier-based detrending method and ADA were similar in detrending practice and, given proper parameters, these two methods can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose the fluctuation information at larger time scales, and the location of crossover points is heavily impacted by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods, i.e., the fluctuation information at larger time scales is kept well, an indication of relatively reliable performance in detrending. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of the scaling properties of hydrometeorological series with relatively regular periodic trends using MF-DFA.
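A minimal sketch of an "average removing" style detrending step is given below, assuming the technique amounts to subtracting the mean value of each phase of a regular cycle (e.g. the mean monthly cycle of streamflow); the toy monthly series is an illustration, not the observed data.

```python
import numpy as np

def remove_periodic_mean(x, period):
    """Subtract the mean value of each phase of an assumed regular cycle of given period."""
    x = np.asarray(x, dtype=float)
    detrended = x.copy()
    for phase in range(period):
        idx = np.arange(phase, len(x), period)
        detrended[idx] -= x[idx].mean()
    return detrended

# Toy monthly series with an annual cycle plus noise
t = np.arange(600)
flow = 10 + 5 * np.sin(2 * np.pi * t / 12) + np.random.randn(t.size)
residual = remove_periodic_mean(flow, period=12)
print("std before/after removing the mean annual cycle:",
      round(flow.std(), 2), round(residual.std(), 2))
```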
Evaluation of ground motion scaling methods for analysis of structural systems
O'Donnell, A. P.; Beltsar, O.A.; Kurama, Y.C.; Kalkan, E.; Taflanidis, A.A.
2011-01-01
Ground motion selection and scaling comprises undoubtedly the most important component of any seismic risk assessment study that involves time-history analysis. Ironically, this is also the single parameter with the least guidance provided in current building codes, resulting in the use of mostly subjective choices in design. The relevant research to date has been primarily on single-degree-of-freedom systems, with only a few studies using multi-degree-of-freedom systems. Furthermore, the previous research is based solely on numerical simulations, with no experimental data available for the validation of the results. By contrast, the research effort described in this paper focuses on an experimental evaluation of selected ground motion scaling methods based on small-scale shake-table experiments of re-configurable linear-elastic and nonlinear multi-story building frame structure models. Ultimately, the experimental results will lead to the development of guidelines and procedures to achieve reliable demand estimates from nonlinear response history analysis in seismic design. In this paper, an overview of this research effort is discussed and preliminary results based on linear-elastic dynamic response are presented. © ASCE 2011.
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
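An illustrative sketch of a percolation-style void definition follows: threshold a 2-D density field at an assumed percolation threshold density and measure the connected under-dense regions. The toy lognormal field and the 30th-percentile threshold are assumptions for illustration, not the simulation data or the paper's calibrated threshold.

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(2)
density = rng.lognormal(mean=0.0, sigma=0.5, size=(256, 256))   # toy density field

threshold = np.percentile(density, 30)        # assumed percolation threshold density
voids, n_voids = label(density < threshold)   # connected under-dense regions

sizes = np.bincount(voids.ravel())[1:]        # cells per void (label 0 is background)
print(n_voids, "voids; the largest covers", sizes.max(), "cells")
```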
NASA Astrophysics Data System (ADS)
Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao
2013-09-01
The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigations are still needed to develop techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measurement named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative description of scatter distribution patterns in various regions and at various temporal scales. We apply this method to heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discriminations between them are most significant in the first quadrant, which implies a significant impact of CHF on vagal regulation. We also investigate the day-night differences of young healthy people, and the results present a clear circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate with different trends as the scale factor varies. The same phenomenon also appears in the circadian rhythm investigation of young healthy subjects, which implies that the cardiac dynamic system is affected differently at various temporal scales by physiological or pathological factors.
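The sketch below conveys the quadrant-wise idea only: the (RRₙ, RRₙ₊₁) points of a Poincaré plot are split into four quadrants about their mean and a simple histogram (Shannon) entropy of the point distribution is computed per quadrant. The exact DE/QMDE definitions and the multi-scale coarse-graining in the paper may differ, and the RR series here is synthetic.

```python
import numpy as np

def quadrant_entropy(rr, bins=16):
    """Histogram entropy of Poincare-plot points in each quadrant about the mean point."""
    x, y = rr[:-1], rr[1:]
    cx, cy = x.mean(), y.mean()
    quadrants = {1: (x >= cx) & (y >= cy), 2: (x < cx) & (y >= cy),
                 3: (x < cx) & (y < cy), 4: (x >= cx) & (y < cy)}
    entropies = {}
    for q, mask in quadrants.items():
        h, _, _ = np.histogram2d(x[mask], y[mask], bins=bins)
        p = h.ravel() / max(h.sum(), 1)
        p = p[p > 0]
        entropies[q] = float(-(p * np.log(p)).sum())
    return entropies

rr = 0.8 + 0.05 * np.random.randn(2000)        # toy RR-interval series (seconds)
print(quadrant_entropy(rr))
```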
Development and validation of 26-item dysfunctional attitude scale.
Ebrahimi, Amrollah; Samouei, Rahele; Mousavii, Sayyed Ghafour; Bornamanesh, Ali Reza
2013-06-01
Dysfunctional Attitude Scale is one of the most common instruments used to assess cognitive vulnerability. This study aimed to develop and validate a short form of the Dysfunctional Attitude Scale appropriate for an Iranian clinical population. Participants were 160 psychiatric patients from medical centers affiliated with Isfahan Medical University, as well as 160 non-patients. Research instruments were clinical interviews based on the Diagnostic and Statistical Manual-IV-TR, the Dysfunctional Attitude Scale and the General Health Questionnaire (GHQ-28). Data were analyzed using multicorrelation calculations and factor analysis. Based on the results of factor analysis and item-total correlation, 14 items were judged candidates for omission. Analysis of the 26-item Dysfunctional Attitude Scale (DAS-26) revealed a Cronbach's alpha of 0.92. Evidence for concurrent criterion validity was obtained by calculating the correlation between the Dysfunctional Attitude Scale and psychiatric diagnosis (r = 0.55), the GHQ-28 (r = 0.56) and the somatization, anxiety, social dysfunction, and depression subscales (0.45, 0.53, 0.48, and 0.57, respectively). Factor analysis identified a four-factor structure as the best fit. The factors were labeled success-perfectionism, need for approval, need for satisfying others, and vulnerability-performance evaluation. The results showed that the Iranian version of the Dysfunctional Attitude Scale (DAS-26) has satisfactory psychometric properties, suggesting that this cognitive instrument is appropriate for use in an Iranian cultural context. Copyright © 2012 Wiley Publishing Asia Pty Ltd.
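The two item-screening statistics used throughout these psychometric studies, Cronbach's alpha and the corrected item-total correlation, are straightforward to compute. The sketch below uses simulated 7-point ratings purely for illustration, not the DAS data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Simulated 7-point ratings for 5 items and 100 respondents (illustration only)
rng = np.random.default_rng(3)
latent = rng.normal(size=(100, 1))
scores = np.clip(np.rint(4 + 1.2 * latent + rng.normal(scale=1.0, size=(100, 5))), 1, 7)
print("alpha:", round(cronbach_alpha(scores), 2))
print("corrected item-total r:", np.round(corrected_item_total(scores), 2))
```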
Chung, Eva Yin-Han; Lam, Gigi
2018-05-29
The World Health Organization has asserted the importance of enhancing participation of people with disabilities within the International Classification of Functioning, Disability and Health framework. Participation is regarded as a vital outcome in community-based rehabilitation. The actualization of the right to participate is limited by social stigma and discrimination. To date, there is no validated instrument for use in Chinese communities to measure participation restriction or self-perceived stigma. This study aimed to translate and validate the Participation Scale and the Explanatory Model Interview Catalogue (EMIC) Stigma Scale for use in Chinese communities with people with physical disabilities. The Chinese versions of the Participation Scale and the EMIC stigma scale were administered to 264 adults with physical disabilities. The two scales were examined separately. The reliability analysis was studied in conjunction with the construct validity. Reliability analysis was conducted to assess the internal consistency and item-total correlation. Exploratory factor analysis was conducted to investigate the latent patterns of relationships among variables. A Rasch model analysis was conducted to test the dimensionality, internal validity, item hierarchy, and scoring category structure of the two scales. Both the Participation Scale and the EMIC stigma scale were confirmed to have good internal consistency and high item-total correlation. Exploratory factor analysis revealed the factor structure of the two scales, which demonstrated the fitting of a pattern of variables within the studied construct. The Participation Scale was found to be multidimensional, whereas the EMIC stigma scale was confirmed to be unidimensional. The item hierarchies of the Participation Scale and the EMIC stigma scale were discussed and were regarded as compatible with the cultural characteristics of Chinese communities. The Chinese versions of the Participation Scale and the EMIC stigma scale were thoroughly tested in this study to demonstrate their robustness and feasibility in measuring the participation restriction and perceived stigma of people with physical disabilities in Chinese communities. This is crucial as it provides valid measurements to enable comprehensive understanding and assessment of the participation and stigma among people with physical disabilities in Chinese communities.
A Comparison of Three Strategies for Scale Construction to Predict a Specific Behavioral Outcome
ERIC Educational Resources Information Center
Garb, Howard N.; Wood, James M.; Fiedler, Edna R.
2011-01-01
Using 65 items from a mental health screening questionnaire, the History Opinion Inventory-Revised (HOI-R), the present study compared three strategies of scale construction--(1) internal (based on factor analysis), (2) external (based on empirical performance) and (3) intuitive (based on clinicians' opinion)--to predict whether 203,595 U.S. Air…
Steigen, Anne Mari; Bergh, Daniel
2018-02-05
This article analyses the psychometric properties of the 10-item version of the Social Provisions Scale. The Social Provisions Scale was analysed by means of the polytomous Rasch model, applied to data on 93 young adults (16-30 years) out of school or work who were participating in different nature-based services due to mental or drug-related problems. The psychometric analysis concludes that the original scale has difficulties related to targeting and construct validity. In order to improve the psychometric properties, the scale was modified to include eight items measuring functional support. The modification was based on theoretical and statistical considerations. After modification the scale showed not only satisfactory psychometric properties but also clarified uncertainties regarding the construct validity of the measure. However, further analyses on larger samples are required. Implications for Rehabilitation: Social support is important for a variety of rehabilitation outcomes and for different patient groups in the rehabilitation context, including people with mental health or drug-related problems. The Social Provisions Scale may be used as a screening tool to assess the social support of participants in rehabilitation, and the scale may also be an important instrument in rehabilitation research. There might be issues measuring structural support using a 10-item version of the Social Provisions Scale, but it seemed to work well as an 8-item scale measuring functional support.
NASA Astrophysics Data System (ADS)
Jin, Y.; Lee, D. K.; Jeong, S. G.
2015-12-01
The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as their other ecological values, play an important role in regional and national conservation planning. The preservation of habitats is linked to the protection of biodiversity. For mapping habitats, species distribution models (SDMs) are used to predict suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not contain contextual or topological information. In order to provide more accurate habitat predictions, a continuous-field view closer to the real world is required. Here we analyze and compare, at different scales, habitats of the yellow-throated marten (Martes flavigula), which is a top predator and an umbrella species in South Korea. The object scale, defined by groups of pixels that have similar spatial and spectral characteristics, and the pixel scale were used for the SDM. Our analysis using the SDM at different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as for species habitat monitoring.
Double symbolic joint entropy in nonlinear dynamic complexity analysis
NASA Astrophysics Data System (ADS)
Yao, Wenpo; Wang, Jun
2017-07-01
Symbolizations, the basis of symbolic dynamic analysis, are classified into global static and local dynamic approaches, which are combined by joint entropy in our work for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and base-scale entropy, and two local dynamic ones, namely the symbolizations of permutation and differential entropy, constitute four double symbolic joint entropies that provide accurate complexity detection in chaotic models, the logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results prove that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
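The sketch below conveys the combination idea only: one global static symbolization (amplitude quantiles over the whole series) is paired with one local dynamic symbolization (the sign of the first difference), and the Shannon entropy of the joint symbol distribution is computed. The specific symbolizations named in the paper (Wessel, base-scale, permutation, differential) may differ in detail from these stand-ins.

```python
import numpy as np

def joint_symbolic_entropy(x, n_levels=4):
    """Joint Shannon entropy of a global static and a local dynamic symbolization."""
    x = np.asarray(x, dtype=float)
    # Global static symbols: quantile bins computed over the whole series
    edges = np.quantile(x, np.linspace(0, 1, n_levels + 1)[1:-1])
    static_sym = np.digitize(x[1:], edges)          # values 0 .. n_levels-1
    # Local dynamic symbols: increasing vs non-increasing step
    dynamic_sym = (np.diff(x) > 0).astype(int)
    joint = static_sym * 2 + dynamic_sym            # unique code per symbol pair
    p = np.bincount(joint) / joint.size
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

print(joint_symbolic_entropy(np.random.randn(5000)))
```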
Development and Validation of a Smartphone Addiction Scale (SAS)
Kwon, Min; Lee, Joon-Yeop; Won, Wang-Youn; Park, Jae-Woo; Min, Jung-Ah; Hahn, Changtae; Gu, Xinyu; Choi, Ji-Hye; Kim, Dai-Jin
2013-01-01
Objective: The aim of this study was to develop a self-diagnostic scale that could distinguish smartphone addicts based on the Korean self-diagnostic program for Internet addiction (K-scale) and the smartphone's own features. In addition, the reliability and validity of the smartphone addiction scale (SAS) was demonstrated. Methods: A total of 197 participants were selected from Nov. 2011 to Jan. 2012 to accomplish a set of questionnaires, including SAS, K-scale, modified Kimberly Young Internet addiction test (Y-scale), visual analogue scale (VAS), and substance dependence and abuse diagnosis of DSM-IV. There were 64 males and 133 females, with ages ranging from 18 to 53 years (M = 26.06; SD = 5.96). Factor analysis, internal-consistency test, t-test, ANOVA, and correlation analysis were conducted to verify the reliability and validity of SAS. Results: Based on the factor analysis results, the subscale "disturbance of reality testing" was removed, and six factors were left. The internal consistency and concurrent validity of SAS were verified (Cronbach's alpha = 0.967). SAS and its subscales were significantly correlated with K-scale and Y-scale. The VAS of each factor also showed a significant correlation with each subscale. In addition, differences were found in the job (p<0.05), education (p<0.05), and self-reported smartphone addiction scores (p<0.001) in SAS. Conclusions: This study developed the first scale of the smartphone addiction aspect of the diagnostic manual. This scale was proven to be relatively reliable and valid. PMID:23468893
Development and validation of a smartphone addiction scale (SAS).
Kwon, Min; Lee, Joon-Yeop; Won, Wang-Youn; Park, Jae-Woo; Min, Jung-Ah; Hahn, Changtae; Gu, Xinyu; Choi, Ji-Hye; Kim, Dai-Jin
2013-01-01
The aim of this study was to develop a self-diagnostic scale that could distinguish smartphone addicts based on the Korean self-diagnostic program for Internet addiction (K-scale) and the smartphone's own features. In addition, the reliability and validity of the smartphone addiction scale (SAS) was demonstrated. A total of 197 participants were selected from Nov. 2011 to Jan. 2012 to accomplish a set of questionnaires, including SAS, K-scale, modified Kimberly Young Internet addiction test (Y-scale), visual analogue scale (VAS), and substance dependence and abuse diagnosis of DSM-IV. There were 64 males and 133 females, with ages ranging from 18 to 53 years (M = 26.06; SD = 5.96). Factor analysis, internal-consistency test, t-test, ANOVA, and correlation analysis were conducted to verify the reliability and validity of SAS. Based on the factor analysis results, the subscale "disturbance of reality testing" was removed, and six factors were left. The internal consistency and concurrent validity of SAS were verified (Cronbach's alpha = 0.967). SAS and its subscales were significantly correlated with K-scale and Y-scale. The VAS of each factor also showed a significant correlation with each subscale. In addition, differences were found in the job (p<0.05), education (p<0.05), and self-reported smartphone addiction scores (p<0.001) in SAS. This study developed the first scale of the smartphone addiction aspect of the diagnostic manual. This scale was proven to be relatively reliable and valid.
NASA Astrophysics Data System (ADS)
Deidda, Roberto; Mascaro, Giuseppe; Hellies, Matteo; Baldini, Luca; Roberto, Nicoletta
2013-04-01
COSMO-SkyMed (CSK) is an important programme of the Italian Space Agency aiming at supporting environmental monitoring and management of exogenous, endogenous and anthropogenic risks through X-band Synthetic Aperture Radar (X-SAR) on board four satellites forming a constellation. Most typical SAR applications are focused on land or ocean observation. However, X-band SAR can detect precipitation, which results in a specific signature caused by the combination of attenuation of surface returns induced by precipitation and enhancement of backscattering determined by the hydrometeors in the SAR resolution volume. Within the CSK programme, we conducted an intercomparison between the statistical properties of precipitation fields derived by CSK SARs and those derived by the CNR Polar 55C (C-band) ground-based weather radar located in Rome (Italy). This contribution presents the main results of this research, which was aimed at the robust characterisation of rainfall statistical properties across different scales by means of scale-invariance analysis and multifractal theory. The analysis was performed on a dataset of more than two years of precipitation observations collected by the CNR Polar 55C radar and on rainfall fields derived from available images collected by the CSK satellites during intense rainfall events. Scale-invariance laws and multifractal properties were detected in the most intense rainfall events derived from the CNR Polar 55C radar for spatial scales from 4 km to 64 km. The analysis of X-SAR retrieved rainfall fields, although based on few images, led to similar results and confirmed the existence of scale-invariance and multifractal properties for scales larger than 4 km. These outcomes encourage investigating SAR methodologies for the future development of meteo-hydrological forecasting models based on multifractal theory.
The pyramid system for multiscale raster analysis
De Cola, L.; Montagne, N.
1993-01-01
Geographical research requires the management and analysis of spatial data at multiple scales. As part of the U.S. Geological Survey's global change research program a software system has been developed that reads raster data (such as an image or digital elevation model) and produces a pyramid of aggregated lattices as well as various measurements of spatial complexity. For a given raster dataset the system uses the pyramid to report: (1) mean, (2) variance, (3) a spatial autocorrelation parameter based on multiscale analysis of variance, and (4) a monofractal scaling parameter based on the analysis of isoline lengths. The system is applied to 1-km digital elevation model (DEM) data for a 256-km² region of central California, as well as to 64 partitions of the region. PYRAMID, which offers robust descriptions of data complexity, also is used to describe the behavior of topographic aspect with scale. © 1993.
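A minimal sketch of the pyramid idea follows: repeatedly aggregate a raster by block means and report summary statistics at each level. Only the per-level mean and variance are shown here; PYRAMID's autocorrelation and monofractal parameters are not reproduced, and the toy DEM tile is an illustration.

```python
import numpy as np

def pyramid_stats(raster, factor=2, levels=4):
    """Aggregate a raster by block means and return (level, shape, mean, variance) per level."""
    out = []
    z = np.asarray(raster, dtype=float)
    for lv in range(levels):
        out.append((lv, z.shape, z.mean(), z.var()))
        h, w = (z.shape[0] // factor) * factor, (z.shape[1] // factor) * factor
        z = z[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return out

dem = np.random.rand(256, 256) * 1000.0        # toy elevation tile, metres
for level, shape, mean, var in pyramid_stats(dem):
    print(f"level {level}: {shape}, mean={mean:.1f}, var={var:.1f}")
```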
Scale dependent inference in landscape genetics
Samuel A. Cushman; Erin L. Landguth
2010-01-01
Ecological relationships between patterns and processes are highly scale dependent. This paper reports the first formal exploration of how changing scale of research away from the scale of the processes governing gene flow affects the results of landscape genetic analysis. We used an individual-based, spatially explicit simulation model to generate patterns of genetic...
ERIC Educational Resources Information Center
Bilgin, Aysegül; Balbag, Mustafa Zafer
2016-01-01
This study has developed "Personal Professional Development Efforts Scale for Science and Technology Teachers Regarding Their Fields". Exploratory factor analysis of the scale has been conducted based on the data collected from 200 science and technology teachers across Turkey. The scale has been observed through varimax rotation method,…
NASA Technical Reports Server (NTRS)
Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi
1994-01-01
An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
ERIC Educational Resources Information Center
Fives, Helenrose; Buehl, Michelle M.
2014-01-01
In this investigation, we assessed 443 teachers' beliefs with the "Teaching Ability Belief Scale" (TABS) and the "Importance of Teaching Knowledge Scale" (ITKS). Using cluster analysis, we identified four groups of teachers based on their responses to the TABS reflecting "Innate," "Learned,"…
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale (texture) is proposed using multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for the texture attribute hardness with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
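A hedged sketch of the selection idea (not the published procedure) is shown below: cluster food samples by instrumental texture measurements after dimensionality reduction, then pick the sample closest to each cluster centre as a candidate reference point on the scale. The features, sample counts, and number of clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 6))                  # toy TPA features (hardness, springiness, ...)

Xs = StandardScaler().fit_transform(X)
X2 = PCA(n_components=2).fit_transform(Xs)     # reduce to the main texture axes
km = KMeans(n_clusters=9, n_init=10, random_state=0).fit(X2)

# One candidate reference sample per cluster: the sample nearest the cluster centre
reference_idx = [int(np.argmin(np.linalg.norm(X2 - c, axis=1))) for c in km.cluster_centers_]
print("candidate reference samples (row indices):", sorted(reference_idx))
```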
Beyhun, Nazim Ercument; Can, Gamze; Tiryaki, Ahmet; Karakullukcu, Serdar; Bulut, Bekir; Yesilbas, Sehbal; Kavgaci, Halil; Topbas, Murat
2016-01-01
Background: The needs-based biopsychosocial distress instrument for cancer patients (CANDI) is a scale based on needs arising due to the effects of cancer. Objectives: The aim of this research was to determine the reliability and validity of the CANDI scale in the Turkish language. Patients and Methods: The study was performed with the participation of 172 cancer patients aged 18 and over. Factor analysis (principal components analysis) was used to assess construct validity. Criterion validities were tested by computing Spearman correlations between CANDI and the hospital anxiety depression scale (HADS) and brief symptom inventory (BSI) (convergent validity), and the quality of life scale (FACT-G) (divergent validity). Test-retest reliabilities and internal consistencies were measured with intraclass correlation (ICC) and Cronbach's α. Results: A three-factor solution (emotional, physical and social) was found with factor analysis. Internal reliability (α = 0.94) and test-retest reliability (ICC = 0.87) were high. Correlations between CANDI and HADS (rs = 0.67), BSI (rs = 0.69) and FACT-G (rs = -0.76) were moderate and significant in the expected direction. Conclusions: CANDI is a valid and reliable scale for cancer patients, with a three-factor structure (emotional, physical and social), in the Turkish language. PMID:27621931
Chen, Yanhua; Watson, Roger; Hilton, Andrea
2016-05-01
To understand nursing students' expectations of their mentors and assess mentors' performance, a scale of mentors' behavior was developed based on a literature review and focus groups in China. This study aims to explore the structure of mentors' behavior. A cross-sectional survey was used. Data were collected from nursing students in three hospitals in southwest China in 2014. A total of 669 pre-registered nursing students in their final year of clinical learning participated in this study. Exploratory factor analysis and Mokken scale analysis were employed to explore the structure and hierarchical property of mentors' behavior. Three dimensions (professional development, facilitating learning and psychosocial support) were identified by factor analysis and confirmed by Mokken scaling analysis. The three sub-scales showed internal consistency reliability from 87% to 91% and moderate to strong precision in ordering students' expectations about mentors' behavior, and a small Mokken scale showing hierarchy was identified. Some insight into the structure of mentoring in nursing education has been obtained, and a scale which could be used in the study of mentoring and in the preparation of mentors has been developed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of Attitudes Toward Homosexuality Scale for Indians (AHSI).
Ahuja, Kanika K
2017-01-01
Attitudes toward homosexuality vary across cultures, with the legal and societal position being rather complicated in India. This study describes the process of developing and validating a Likert-type scale to assess attitudes toward homosexuality among heterosexuals. Phase 1 describes the development of the scale. Items were written based on thematic analysis of narratives generated from 50 college students and reviewing existing scales. After administering the 70-item scale to 68 participants, item analysis yielded 20 statements with item-total correlations over .70. Cronbach's alpha was .97. In Phase 2, the 20-item Attitudes Toward Homosexuality Scale for Indians (AHSI) was administered to 142 participants. Analysis yielded a corrected split-half correlation of .91. Further, AHSI discriminated between women and men; between liberal arts and STEM/business students; and those who reported interpersonal contact with gay men and lesbian women and those who did not. The scale has satisfactory reliability and shows promising construct validity.
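A minimal sketch of the Phase 1 item-screening arithmetic follows: corrected item-total correlations with a 0.70 retention threshold, then a Spearman-Brown-corrected split-half coefficient. The simulated response matrix and its dimensions are placeholders chosen only so the example runs.

```python
# Illustrative item-analysis sketch: corrected item-total correlations to screen
# items, then a split-half correlation with Spearman-Brown correction.
# The 70-item response matrix is simulated; the 0.70 threshold follows the abstract.
import numpy as np

rng = np.random.default_rng(2)
latent = rng.normal(size=68)                               # hypothetical respondents
responses = np.clip(np.round(3 + latent[:, None] + rng.normal(0, 0.6, (68, 70))), 1, 5)

total = responses.sum(axis=1)
keep = []
for j in range(responses.shape[1]):
    rest = total - responses[:, j]                         # "corrected": drop the item itself
    r = np.corrcoef(responses[:, j], rest)[0, 1]
    if r > 0.70:
        keep.append(j)

odd, even = responses[:, keep][:, ::2].sum(1), responses[:, keep][:, 1::2].sum(1)
r_half = np.corrcoef(odd, even)[0, 1]
print("items retained:", len(keep))
print("Spearman-Brown split-half:", round(2 * r_half / (1 + r_half), 3))
```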
Rank Determination of Mental Functions by 1D Wavelets and Partial Correlation.
Karaca, Y; Aslan, Z; Cattani, C; Galletta, D; Zhang, Y
2017-01-01
The main aim of this paper is to classify mental functions measured by the Wechsler Adult Intelligence Scale-Revised tests with a mixed method based on wavelets and partial correlation. The Wechsler Adult Intelligence Scale-Revised is a widely used test designed and applied for the comprehensive classification of adults' cognitive skills. In this paper, many different intellectual profiles have been taken into consideration to measure the relationship between mental functioning and psychological disorder. We propose a method based on wavelets and correlation analysis for classifying mental functioning through the analysis of selected parameters measured by the Wechsler Adult Intelligence Scale-Revised tests. In particular, 1-D continuous wavelet analysis, the 1-D wavelet coefficient method and the partial correlation method have been applied to Wechsler Adult Intelligence Scale-Revised parameters such as School Education, Gender, Age, Performance Information Verbal and Full Scale Intelligence Quotient. We show that the gender variable has a negative but significant role on the age and Performance Information Verbal factors. The age parameter also plays a significant role in the change of Performance Information Verbal and Full Scale Intelligence Quotient.
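A hedged sketch of the two computational ingredients named here is shown below, assuming the PyWavelets package: a 1-D continuous wavelet transform of a score profile and a partial correlation that controls for a third variable. The WAIS-R-like variables, the 'morl' wavelet, and the scale range are illustrative assumptions.

```python
# Hedged sketch: 1-D continuous wavelet analysis of a score series and a partial
# correlation controlling for a third variable. All variables are simulated
# stand-ins for WAIS-R-style parameters; 'morl' and the scale range are assumed.
import numpy as np
import pywt

rng = np.random.default_rng(3)
n = 200
age = rng.uniform(18, 70, n)
piv = 110 - 0.3 * age + rng.normal(0, 8, n)        # Performance/Verbal composite (synthetic)
fsiq = 0.8 * piv + 20 + rng.normal(0, 5, n)        # Full Scale IQ (synthetic)

# 1-D continuous wavelet analysis of the age-ordered PIV profile.
coeffs, freqs = pywt.cwt(piv[np.argsort(age)], scales=np.arange(1, 33), wavelet='morl')
print("CWT coefficient matrix:", coeffs.shape)      # (n_scales, n_subjects)

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z from both."""
    zx = np.polyfit(z, x, 1); zy = np.polyfit(z, y, 1)
    rx = x - np.polyval(zx, z); ry = y - np.polyval(zy, z)
    return np.corrcoef(rx, ry)[0, 1]

print("corr(age, FSIQ | PIV) =", round(partial_corr(age, fsiq, piv), 3))
```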
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Bolvin, David T.; Gu, Guojun; Nelkin, Eric J.; Bowman, Kenneth P.; Stocker, Erich; Wolff, David B.
2006-01-01
The TRMM Multi-satellite Precipitation Analysis (TMPA) provides a calibration-based sequential scheme for combining multiple precipitation estimates from satellites, as well as gauge analyses where feasible, at fine scales (0.25 degrees x 0.25 degrees and 3-hourly). It is available both after and in real time, based on calibration by the TRMM Combined Instrument and TRMM Microwave Imager precipitation products, respectively. Only the after-real-time product incorporates gauge data at the present. The data set covers the latitude band 50 degrees N-S for the period 1998 to the delayed present. Early validation results are as follows: The TMPA provides reasonable performance at monthly scales, although it is shown to have precipitation rate dependent low bias due to lack of sensitivity to low precipitation rates in one of the input products (based on AMSU-B). At finer scales the TMPA is successful at approximately reproducing the surface-observation-based histogram of precipitation, as well as reasonably detecting large daily events. The TMPA, however, has lower skill in correctly specifying moderate and light event amounts on short time intervals, in common with other fine-scale estimators. Examples are provided of a flood event and diurnal cycle determination.
ERIC Educational Resources Information Center
Young, Forrest W.
A model permitting construction of algorithms for the polynomial conjoint analysis of similarities is presented. This model, which is based on concepts used in nonmetric scaling, permits one to obtain the best approximate solution. The concepts used to construct nonmetric scaling algorithms are reviewed. Finally, examples of algorithmic models for…
NASA Astrophysics Data System (ADS)
Wang, Hong; Lu, Kaiyu; Pu, Ruiliang
2016-10-01
The Robinia pseudoacacia forest in the Yellow River delta of China has been planted since the 1970s, and a large area of dieback of the forest has occurred since the 1990s. To assess the condition of the R. pseudoacacia forest in three forest areas (i.e., Gudao, Machang, and Abandoned Yellow River) in the delta, we combined an estimation-of-scale-parameters tool with geometry/topology assessment criteria to determine the optimal scale parameters, selected optimal predictive variables determined by stepwise discriminant analysis, and compared object-based image analysis (OBIA) and pixel-based approaches using IKONOS data. The experimental results showed that the optimal segmentation scale is 5 for both the Gudao and Machang forest areas, and 12 for the Abandoned Yellow River forest area. The results produced by the OBIA method were much better than those created by the pixel-based method. The overall accuracy of the OBIA method was 93.7% (versus 85.4% for the pixel-based method) for Gudao, 89.0% (versus 72.7%) for the Abandoned Yellow River, and 91.7% (versus 84.4%) for Machang. Our analysis demonstrated that the OBIA method is an effective tool for rapidly mapping and assessing the health levels of the forest.
Global sensitivity analysis of multiscale properties of porous materials
NASA Astrophysics Data System (ADS)
Um, Kimoon; Zhang, Xuan; Katsoulakis, Markos; Plechac, Petr; Tartakovsky, Daniel M.
2018-02-01
Ubiquitous uncertainty about pore geometry inevitably undermines the veracity of pore- and multi-scale simulations of transport phenomena in porous media. It raises two fundamental issues: sensitivity of effective material properties to pore-scale parameters and statistical parameterization of Darcy-scale models that accounts for pore-scale uncertainty. Homogenization-based maps of pore-scale parameters onto their Darcy-scale counterparts facilitate both sensitivity analysis (SA) and uncertainty quantification. We treat uncertain geometric characteristics of a hierarchical porous medium as random variables to conduct global SA and to derive probabilistic descriptors of effective diffusion coefficients and the effective sorption rate. Our analysis is formulated in terms of a solute diffusing through a fluid-filled pore space while sorbing to the solid matrix. Yet it is sufficiently general to be applied to other multiscale porous media phenomena that are amenable to homogenization.
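A hedged sketch of a global (variance-based Sobol) sensitivity analysis is given below, assuming the SALib package is available; the pore-scale parameters, their ranges, and the Maxwell-type closure used as the forward model are invented stand-ins, not the paper's homogenized map.

```python
# Global (Sobol) sensitivity analysis of an effective property, assuming SALib.
# The closure used here (a Maxwell-type porosity-diffusivity relation with a
# crude sorption retardation factor) is a toy stand-in for a homogenized map.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["porosity", "grain_radius_um", "sorption_rate"],
    "bounds": [[0.2, 0.5], [1.0, 10.0], [0.01, 0.1]],
}

def effective_diffusion(x):
    phi, r, k = x
    d_free = 1.0e-9                                   # m^2/s, bulk diffusivity
    d_eff = d_free * 2.0 * phi / (3.0 - phi)          # toy Maxwell-type closure
    return d_eff / (1.0 + k * r)                      # crude retardation by sorption

X = saltelli.sample(problem, 1024)
Y = np.array([effective_diffusion(x) for x in X])
Si = sobol.analyze(problem, Y)
print("first-order indices:", dict(zip(problem["names"], np.round(Si["S1"], 3))))
print("total-order indices:", dict(zip(problem["names"], np.round(Si["ST"], 3))))
```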
Nanoliter-Scale Oil-Air-Droplet Chip-Based Single Cell Proteomic Analysis.
Li, Zi-Yi; Huang, Min; Wang, Xiu-Kun; Zhu, Ying; Li, Jin-Song; Wong, Catherine C L; Fang, Qun
2018-04-17
Single cell proteomic analysis provides crucial information on cellular heterogeneity in biological systems. Herein, we describe a nanoliter-scale oil-air-droplet (OAD) chip for achieving multistep complex sample pretreatment and injection for single cell proteomic analysis in the shotgun mode. By using miniaturized stationary droplet microreaction and manipulation techniques, our system allows all sample pretreatment and injection procedures to be performed in a nanoliter-scale droplet with minimum sample loss and a high sample injection efficiency (>99%), thus substantially increasing the analytical sensitivity for single cell samples. We applied the present system in the proteomic analysis of 100 ± 10, 50 ± 5, 10, and 1 HeLa cell(s), and protein IDs of 1360, 612, 192, and 51 were identified, respectively. The OAD chip-based system was further applied in single mouse oocyte analysis, with 355 protein IDs identified at the single oocyte level, which demonstrated its special advantages of high enrichment of sequence coverage, hydrophobic proteins, and enzymatic digestion efficiency over the traditional in-tube system.
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang
2018-02-01
This study sought to develop a self-report instrument to be used in the assessment of the project competences of college students engaged in online project-based learning. The three scales of the KIPSSE instrument developed for this study, namely the knowledge integration, project skills, and self-efficacy scales, were based on related theories and the analysis of three project-advisor interviews. The items of the knowledge integration and project skills scales focused on the integration of different disciplines and on technological skills, respectively. Two samples of data were collected from information technology-related courses taught with an online project-based learning strategy over different semesters at a college in southern Taiwan. The validity and reliability of the KIPSSE instrument were confirmed through item analysis and confirmatory factor analysis using structural equation modeling of the two samples of students' online responses separately. The Cronbach's alpha reliability coefficient for the entire instrument was 0.931; for each scale, the alpha ranged from 0.832 to 0.907. There was also a significant correlation (r = 0.55, p < 0.01) between the KIPSSE instrument results and the students' product evaluation scores. The findings of this study confirmed the validity and reliability of the KIPSSE instrument. The confirmation process and related implications are also discussed.
NASA Astrophysics Data System (ADS)
Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir
2017-06-01
This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly-detection tool is based on the multi-scale entropy (MSE) algorithm SampEn, which allows correlations in signals to be identified over multiple time scales. The approach combines motor current signature analysis (MCSA) with principal component analysis (PCA) and compares observed values with those predicted from a model built using nominally healthy data. Simulation results show that the proposed method is able to detect gear tooth pitting in current signals.
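To make the MSE step concrete, here is a minimal sketch of coarse-graining plus sample entropy; the simulated 50 Hz current, sampling rate, and entropy parameters (m = 2, r = 0.2·SD) are illustrative assumptions, and the MCSA/PCA stages are omitted.

```python
# Minimal multi-scale entropy (MSE) sketch: coarse-grain the signal at several
# scales and compute sample entropy (SampEn, m = 2, r = 0.2*SD) at each scale.
# The 50 Hz "phase current" is synthetic; the MCSA and PCA stages are not shown.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def match_count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(templates)) / 2.0       # exclude self-matches
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(4)
t = np.arange(0, 1, 1 / 1000)                                 # 1 s at 1 kHz
current = np.sin(2 * np.pi * 50 * t) + 0.05 * rng.normal(size=t.size)

for scale in (1, 2, 5, 10):
    print(f"scale {scale:>2}: SampEn = {sample_entropy(coarse_grain(current, scale)):.3f}")
```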
Perceptual security of encrypted images based on wavelet scaling analysis
NASA Astrophysics Data System (ADS)
Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.
2016-08-01
The scaling behavior of the pixel fluctuations of encrypted images is evaluated by using the detrended fluctuation analysis based on wavelets, a modern technique that has been successfully used recently for a wide range of natural phenomena and technological processes. As encryption algorithms, we use the Advanced Encryption System (AES) in RBT mode and two versions of a cryptosystem based on cellular automata, with the encryption process applied both fully and partially by selecting different bitplanes. In all cases, the results show that the encrypted images in which no understandable information can be visually appreciated and whose pixels look totally random present a persistent scaling behavior with the scaling exponent α close to 0.5, implying no correlation between pixels when the DFA with wavelets is applied. This suggests that the scaling exponents of the encrypted images can be used as a perceptual security criterion in the sense that when their values are close to 0.5 (the white noise value) the encrypted images are more secure also from the perceptual point of view.
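As an illustration of the scaling measurement (standard polynomial-detrending DFA rather than the paper's wavelet-based variant), the sketch below estimates the exponent α for an uncorrelated pixel sequence; the data and scale range are placeholders.

```python
# Hedged sketch of detrended fluctuation analysis (DFA) on a pixel sequence.
# The paper's variant is wavelet-based; this is the standard polynomial-detrending
# DFA, shown only to illustrate how the scaling exponent alpha is obtained.
import numpy as np

def dfa(signal, scales):
    y = np.cumsum(signal - np.mean(signal))            # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        x = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(x, seg, 1)               # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, x)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    return alpha

rng = np.random.default_rng(5)
pixels = rng.integers(0, 256, size=65536).astype(float)   # stand-in for an encrypted-image scan
scales = np.array([16, 32, 64, 128, 256, 512])
print("DFA scaling exponent alpha =", round(dfa(pixels, scales), 3))   # ~0.5 for uncorrelated pixels
```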
Multi-scale analysis of a household level agent-based model of landcover change.
Evans, Tom P; Kelley, Hugh
2004-08-01
Scale issues have significant implications for the analysis of social and biophysical processes in complex systems. These same scale implications are likewise considerations for the design and application of models of landcover change. Scale issues have wide-ranging effects from the representativeness of data used to validate models to aggregation errors introduced in the model structure. This paper presents an analysis of how scale issues affect an agent-based model (ABM) of landcover change developed for a research area in the Midwest, USA. The research presented here explores how scale factors affect the design and application of agent-based landcover change models. The ABM is composed of a series of heterogeneous agents who make landuse decisions on a portfolio of cells in a raster-based programming environment. The model is calibrated using measures of fit derived from both spatial composition and spatial pattern metrics from multi-temporal landcover data interpreted from historical aerial photography. A model calibration process is used to find a best-fit set of parameter weights assigned to agents' preferences for different landuses (agriculture, pasture, timber production, and non-harvested forest). Previous research using this model has shown how a heterogeneous set of agents with differing preferences for a portfolio of landuses produces the best fit to landcover changes observed in the study area. The scale dependence of the model is explored by varying the resolution of the input data used to calibrate the model (observed landcover), ancillary datasets that affect land suitability (topography), and the resolution of the model landscape on which agents make decisions. To explore the impact of these scale relationships the model is run with input datasets constructed at the following spatial resolutions: 60, 90, 120, 150, 240, 300 and 480 m. The results show that the distribution of landuse-preference weights differs as a function of scale. In addition, with the gradient descent model fitting method used in this analysis the model was not able to converge to an acceptable fit at the 300 and 480 m spatial resolutions. This is a product of the ratio of the input cell resolution to the average parcel size in the landscape. This paper uses these findings to identify scale considerations in the design, development, validation and application of ABMs of landcover change.
NASA Technical Reports Server (NTRS)
Beard, Daniel A.; Liang, Shou-Dan; Qian, Hong; Biegel, Bryan (Technical Monitor)
2001-01-01
Predicting behavior of large-scale biochemical metabolic networks represents one of the greatest challenges of bioinformatics and computational biology. Approaches, such as flux balance analysis (FBA), that account for the known stoichiometry of the reaction network while avoiding implementation of detailed reaction kinetics are perhaps the most promising tools for the analysis of large complex networks. As a step towards building a complete theory of biochemical circuit analysis, we introduce energy balance analysis (EBA), which complements the FBA approach by introducing fundamental constraints based on the first and second laws of thermodynamics. Fluxes obtained with EBA are thermodynamically feasible and provide valuable insight into the activation and suppression of biochemical pathways.
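A minimal FBA sketch is shown below as a linear program on an invented three-reaction network; the stoichiometry, bounds, and objective are placeholders, and the thermodynamic constraints that EBA adds are not included.

```python
# Small flux balance analysis (FBA) sketch using linear programming: maximize a
# "biomass" flux subject to steady-state stoichiometry S v = 0 and flux bounds.
# The three-reaction toy network is invented for illustration; EBA would add
# thermodynamic (energy balance) constraints on top of this LP.
import numpy as np
from scipy.optimize import linprog

# Reactions: v1 uptake (-> A), v2 conversion (A -> B), v3 biomass drain (B ->)
S = np.array([[1.0, -1.0,  0.0],     # metabolite A balance
              [0.0,  1.0, -1.0]])    # metabolite B balance
bounds = [(0, 10), (0, 1000), (0, 1000)]        # uptake limited to 10 units
c = np.array([0.0, 0.0, -1.0])                  # maximize v3 => minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes v =", res.x)              # expected [10, 10, 10]
print("max biomass flux =", -res.fun)
```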
Mi, Misa; Moseley, James L; Green, Michael L
2012-02-01
Many residency programs offer training in evidence-based medicine (EBM). However, these curricula often fail to achieve optimal learning outcomes, perhaps because they neglect various contextual factors in the learning environment. We developed and validated an instrument to characterize the environment for EBM learning and practice in residency programs. An EBM Environment Scale was developed following scale development principles. A survey was administered to residents across six programs in primary care specialties at four medical centers. Internal consistency reliability was analyzed with Cronbach's coefficient alpha. Validity was assessed by comparing predetermined subscales with the survey's internal structure as assessed via factor analysis. Scores were also compared for subgroups based on residency program affiliation and residency characteristics. Out of 262 eligible residents, 124 completed the survey (response rate 47%). The overall mean score was 3.89 (standard deviation=0.56). The initial reliability analysis of the 48-item scale had a high reliability coefficient (Cronbach α=.94). Factor analysis and further item analysis resulted in a shorter 36-item scale with a satisfactory reliability coefficient (Cronbach α=.86). Scores were higher for residents with prior EBM training in medical school (4.14 versus 3.62) and in residency (4.25 versus 3.69). If further testing confirms its properties, the EBM Environment Scale may be used to understand the influence of the learning environment on the effectiveness of EBM training. Additionally, it may detect changes in the EBM learning environment in response to programmatic or institutional interventions.
NASA Technical Reports Server (NTRS)
Krishnamurthy, Thiagarajan
2010-01-01
Equivalent plate analysis is often used to replace the computationally expensive finite element analysis in initial design stages or in conceptual design of aircraft wing structures. The equivalent plate model can also be used to design a wind tunnel model to match the stiffness characteristics of the wing box of a full-scale aircraft wing model while satisfying strength-based requirements. An equivalent plate analysis technique is presented to predict the static and dynamic response of an aircraft wing with or without damage. First, a geometric scale factor and a dynamic pressure scale factor are defined to relate the stiffness, load and deformation of the equivalent plate to the aircraft wing. A procedure using an optimization technique is presented to create scaled equivalent plate models from the full-scale aircraft wing using the geometric and dynamic pressure scale factors. The scaled models are constructed by matching the stiffness of the scaled equivalent plate with the scaled aircraft wing stiffness. It is demonstrated that the scaled equivalent plate model can be used to predict the deformation of the aircraft wing accurately. Once the full equivalent plate geometry is obtained, any other scaled equivalent plate geometry can be obtained using the geometric scale factor. Next, an average frequency scale factor is defined as the average ratio of the frequencies of the aircraft wing to the frequencies of the full-scale equivalent plate. The average frequency scale factor combined with the geometric scale factor is used to predict the frequency response of the aircraft wing from the scaled equivalent plate analysis. A procedure is outlined to estimate the frequency response and the flutter speed of an aircraft wing from the equivalent plate analysis using the frequency scale factor and geometric scale factor. The equivalent plate analysis is demonstrated using an aircraft wing without damage and another with damage. Both of the problems show that the scaled equivalent plate analysis can be successfully used to predict the frequencies and flutter speed of a typical aircraft wing.
Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers
NASA Astrophysics Data System (ADS)
Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott
2017-11-01
Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. The following talk discusses a more computationally efficient formulation of LSS, "non-intrusive" LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
Development of Islamic Spiritual Health Scale (ISHS).
Khorashadizadeh, Fatemeh; Heydari, Abbas; Nabavi, Fatemeh Heshmati; Mazlom, Seyed Reza; Ebrahimi, Mahdi; Esmaili, Habibollah
2017-03-01
To develop and psychometrically assess a spiritual health scale based on the Islamic view in Iran. The cross-sectional study was conducted at Imam Ali and Quem hospitals in Mashhad and Imam Ali and Imam Reza hospitals in Bojnurd, Iran, from 2015 to 2016. In the first stage, an 81-item Likert-type scale was developed using a qualitative approach. The second stage comprised the quantitative component. The scale's impact factor, content validity ratio, content validity index, face validity and exploratory factor analysis were calculated. Test-retest and internal consistency were used to examine the reliability of the instrument. Data analysis was done using SPSS 11. Of the 81 items in the scale, those with an impact factor above 1.5, a content validity ratio above 0.62, and a content validity index above 0.79 were considered valid and the rest were discarded, resulting in a 61-item scale. Exploratory factor analysis reduced the list of items to 30, which were divided into seven groups with a minimum eigenvalue of 1 for each factor. According to the scatter plot, however, the attributes of the concept of spiritual health included love of the Creator, duty-based life, religious rationality, psychological balance, and attention to the afterlife. Internal reliability of the scale, calculated with Cronbach's alpha coefficient, was 0.91. There was solid evidence of the strength of the factor structure and the reliability of the Islamic Spiritual Health Scale, which provides a unique way to assess the spiritual health of Muslims.
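A brief sketch of the content-validity screening arithmetic is shown below; the expert ratings are invented, and only the CVR and CVI thresholds (0.62 and 0.79) are taken from the abstract.

```python
# Hedged sketch of the content-validity screening step: Lawshe's content validity
# ratio (CVR) and the item content validity index (I-CVI). Expert ratings are
# invented; the retention thresholds follow the abstract.
def content_validity_ratio(n_essential, n_experts):
    """Lawshe: CVR = (n_e - N/2) / (N/2)."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def item_cvi(relevance_ratings, cutoff=3):
    """Proportion of experts rating the item 'relevant' (>= cutoff on a 1-4 scale)."""
    return sum(r >= cutoff for r in relevance_ratings) / len(relevance_ratings)

ratings = {"item_01": ([4, 4, 3, 4, 3, 4, 4, 3, 4, 4], 9),   # (relevance ratings, n rating "essential")
           "item_02": ([2, 3, 2, 4, 2, 3, 2, 2, 3, 2], 5)}

for item, (relevance, n_ess) in ratings.items():
    cvr, cvi = content_validity_ratio(n_ess, 10), item_cvi(relevance)
    verdict = "retain" if cvr > 0.62 and cvi > 0.79 else "discard"
    print(f"{item}: CVR={cvr:.2f} CVI={cvi:.2f} -> {verdict}")
```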
Chang, Chih-Cheng; Su, Jian-An; Tsai, Ching-Shu; Yen, Cheng-Fang; Liu, Jiun-Horng; Lin, Chung-Ying
2015-06-01
To examine the psychometrics of the Affiliate Stigma Scale using rigorous psychometric analysis: classical test theory (CTT) (traditional) and Rasch analysis (modern). Differential item functioning (DIF) items were also tested using Rasch analysis. Caregivers of relatives with mental illness (n = 453; mean age: 53.29 ± 13.50 years) were recruited from southern Taiwan. Each participant filled out four questionnaires: Affiliate Stigma Scale, Rosenberg Self-Esteem Scale, Beck Anxiety Inventory, and one background information sheet. CTT analyses showed that the Affiliate Stigma Scale had satisfactory internal consistency (α = 0.85-0.94) and concurrent validity (Rosenberg Self-Esteem Scale: r = -0.52 to -0.46; Beck Anxiety Inventory: r = 0.27-0.34). Rasch analyses supported the unidimensionality of three domains in the Affiliate Stigma Scale and indicated four DIF items (affect domain: 1; cognitive domain: 3) across gender. Our findings, based on rigorous statistical analysis, verified the psychometrics of the Affiliate Stigma Scale and reported its DIF items. We conclude that the three domains of the Affiliate Stigma Scale can be separately used and are suitable for measuring the affiliate stigma of caregivers of relatives with mental illness. Copyright © 2015 Elsevier Inc. All rights reserved.
Haddad, Mark; Waqas, Ahmed; Sukhera, Ahmed Bashir; Tarar, Asad Zaman
2017-07-27
Depression is a common mental health problem and a leading contributor to the global burden of disease. The attitudes and beliefs of the public and of health professionals influence social acceptance and affect the esteem and help-seeking of people experiencing mental health problems. The attitudes of clinicians are particularly relevant to their role in accurately recognising depression and providing appropriate support and management. This study examines the characteristics of the revised depression attitude questionnaire (R-DAQ) with doctors working in healthcare settings in Lahore, Pakistan. A cross-sectional survey was conducted in 2015 using the revised depression attitude questionnaire (R-DAQ). A convenience sample of 700 medical practitioners based in six hospitals in Lahore was approached to participate in the survey. The R-DAQ structure was examined using parallel analysis from polychoric correlations. Unweighted least squares analysis (ULSA) was used for factor extraction. Model fit was estimated using goodness-of-fit indices and the root mean square of standardized residuals (RMSR), and internal consistency reliability for the overall scale and subscales was assessed using reliability estimates based on Mislevy and Bock (BILOG 3: Item analysis and test scoring with binary logistic models. Mooresville: Scientific Software) and McDonald's omega statistic. Findings from this approach were compared with a principal axis factor analysis based on a Pearson correlation matrix. Of the doctors approached, 601 (86%) consented to participate in the study. Exploratory factor analysis of R-DAQ scale responses demonstrated the same 3-factor structure as in the UK development study, though the analyses indicated removal of 7 of the 22 items because of weak loading or poor model fit. The 3-factor solution accounted for 49.8% of the common variance. Scale reliability and internal consistency were adequate: the total-scale standardised alpha was 0.694; subscale reliability was 0.732 for professional confidence, 0.638 for therapeutic optimism/pessimism, and 0.769 for the generalist perspective. The R-DAQ was developed with a predominantly UK-based sample of health professionals. This study indicates that the scale functions adequately and provides a valid measure of depression attitudes for medical practitioners in Pakistan, with the same factor structure as in the scale development sample. However, optimal scale function necessitated removal of several items, with a 15-item scale enabling the most parsimonious factor solution for this population.
NASA Astrophysics Data System (ADS)
Ali-Akbari, H. R.; Ceballes, S.; Abdelkefi, A.
2017-10-01
A nonlocal continuum-based model is derived to simulate the dynamic behavior of bridged carbon nanotube-based nano-scale mass detectors. The carbon nanotube (CNT) is modeled as an elastic Euler-Bernoulli beam considering von Kármán-type geometric nonlinearity. In order to achieve better accuracy in the characterization of the CNTs, the geometrical properties of an attached nano-scale particle are introduced into the model through its moment of inertia with respect to the central axis of the beam. The inter-atomic long-range interactions within the structure of the CNT are incorporated into the model using Eringen's nonlocal elastic field theory. In this model, the mass can be deposited along an arbitrary length of the CNT. After deriving the full nonlinear equations of motion, the natural frequencies and corresponding mode shapes are extracted based on a linear eigenvalue problem analysis. The results show that the geometry of the attached particle has a significant impact on the dynamic behavior of the CNT-based mechanical resonator, especially for those with small aspect ratios. The developed model and analysis are beneficial for nano-scale mass identification when a CNT-based mechanical resonator is utilized as a small-scale bio-mass sensor and the deposited particles are those, such as proteins, enzymes, cancer cells, DNA and other nano-scale biological objects, with different and complex shapes.
NASA Astrophysics Data System (ADS)
Lian, Enyang; Ren, Yingyu; Han, Yunfeng; Liu, Weixin; Jin, Ningde; Zhao, Junying
2016-11-01
Multi-scale analysis is an important method for investigating nonlinear systems. In this study, we carry out experiments and measure the fluctuation signals from a rotating electric field conductance sensor with eight electrodes. We first use a recurrence plot to recognise flow patterns in vertical upward gas-liquid two-phase pipe flow from the measured signals. Then we apply a multi-scale morphological analysis based on the first-order difference scatter plot to investigate the signals captured from the vertical upward gas-liquid two-phase flow loop test. We find that the invariant scaling exponent extracted from the multi-scale first-order difference scatter plot, with the bisector of the second-fourth quadrant as the reference line, is sensitive to the inhomogeneous distribution characteristics of the flow structure, and the variation trend of the exponent is helpful for understanding the process of breakup and coalescence of the gas phase. In addition, we explore the dynamic mechanism influencing the inhomogeneous distribution of the gas phase in terms of adaptive optimal kernel time-frequency representation. The research indicates that the system energy is a factor influencing the distribution of the gas phase and that the multi-scale morphological analysis based on the first-order difference scatter plot is an effective method for indicating the inhomogeneous distribution of the gas phase in gas-liquid two-phase flow.
NASA Astrophysics Data System (ADS)
Holá, Markéta; Kalvoda, Jiří; Nováková, Hana; Škoda, Radek; Kanický, Viktor
2011-01-01
LA-ICP-MS and solution-based ICP-MS, in combination with electron microprobe analysis, are presented as methods for determining the elemental spatial distribution in fish scales, which represent an example of a heterogeneous layered bone structure. Two different LA-ICP-MS techniques were tested on recent common carp (Cyprinus carpio) scales. The first was a line scan through the whole fish scale perpendicular to the growth rings: an ablation crater of 55 μm width and 50 μm depth allowed analysis of the elemental distribution in the external layer, while ablation conditions providing a deeper crater gave average values from the external HAP layer and the collagen basal plate. The second, depth profiling using spot analysis, was tested in fish scales for the first time; spot analysis provides information about the depth profile of the elements at a selected position on the sample. The combination of all of these laser ablation techniques provides complete information about the elemental distribution in the fish scale samples. The results were compared with the solution-based ICP-MS and EMP analyses. The fact that the depth-profiling results agree well both with the EMP and PIXE results and with the assumed ways in which the studied elements are incorporated into the HAP structure suggests very good potential for this method.
Bayesian analysis of factors associated with fibromyalgia syndrome subjects
NASA Astrophysics Data System (ADS)
Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie
2015-01-01
Factors contributing to movement-related fear were assessed by Russek et al. (2014) for subjects with fibromyalgia (FM), based on data collected through a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity drawn from the "Revised Fibromyalgia Impact Questionnaire" (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors were introduced for the variables selected in Russek's paper.
Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S
2016-09-01
The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths, and address the weaknesses, of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home-country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales covered migrant experiences in Australia: appreciation of the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provided the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and to support future research on dentist migration. Copyright© 2016 Dennis Barber Ltd
Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.
ERIC Educational Resources Information Center
Dewey, Barbara I.
Advancing library services in large universities requires creative approaches for "building to scale." This is the case for CIC, Committee on Institutional Cooperation (Big Ten), libraries whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…
ERIC Educational Resources Information Center
Cress, Cynthia; Lambert, Matthew C.; Epstein, Michael H.
2016-01-01
Strength-based assessment of behaviors in preschool children provides evidence of emotional and behavioral skills in children, rather than focusing primarily on weaknesses identified by deficit-based assessments. The Preschool Behavioral and Emotional Rating Scales (PreBERS) is a normative assessment of emotional and behavioral strengths in…
Right-Scaling Stewardship: A Multi-Scale Perspective on Cooperative Print Management
ERIC Educational Resources Information Center
Malpas, Constance; Lavoie, Brian
2014-01-01
The goal of this report is to provide an empirically-based assessment, based on WorldCat bibliographic and holdings data, of the size, scope, and salient features of these collections, with special attention to identifying and characterizing segments consisting of relatively scarce and relatively widely-held materials. The analysis also employs a…
Conceptual design and analysis of a dynamic scale model of the Space Station Freedom
NASA Technical Reports Server (NTRS)
Davis, D. A.; Gronet, M. J.; Tan, M. K.; Thorne, J.
1994-01-01
This report documents the conceptual design study performed to evaluate design options for a subscale dynamic test model which could be used to investigate the expected on-orbit structural dynamic characteristics of the Space Station Freedom early build configurations. The baseline option was a 'near-replica' model of the SSF SC-7 pre-integrated truss configuration. The approach used to develop conceptual design options involved three sets of studies: evaluating the full-scale design and analysis databases, conducting scale-factor trade studies, and performing design sensitivity studies. The scale-factor trade study was conducted to develop a fundamental understanding of the key scaling parameters that drive the design, performance and cost of an SSF dynamic scale model. Four scale-model options were evaluated: 1/4, 1/5, 1/7, and 1/10 scale. Prototype hardware was fabricated to assess producibility issues. Based on the results of the study, a 1/4-scale model is recommended because of the increased model fidelity associated with a larger scale factor. A design sensitivity study was performed to identify critical hardware component properties that drive dynamic performance. A total of 118 component properties were identified which require high-fidelity replication. Lower-fidelity dynamic similarity scaling can be used for non-critical components.
Analysis of surface scale on the Ni-based superalloy CMSX-10N and proposed mechanism of formation
NASA Astrophysics Data System (ADS)
Simmonds, S.; D'Souza, N.; Ryder, K. S.; Dong, H.
2012-01-01
There is a continuing demand to raise the operating temperature of jet engine turbine blades to meet the need for higher turbine entry temperatures (TET) in order to increase thermal efficiency and thrust. Modern high-pressure turbine blades are made from Ni-based superalloys in single-crystal form via the investment casting process. One important post-cast surface defect, known as 'surface scale', has been investigated on the alloy CMSX-10N. This is an area of distinct discolouration of the aerofoil seen after casting. Auger electron and X-ray photoelectron spectroscopy analyses were carried out on both scaled and un-scaled areas. In the scaled region, a thin layer (~800 nm) of Ni oxide is evident. In the un-scaled regions there is a thicker Al2O3 layer. It is shown that, as the blade cools during casting, differential thermal contraction of mould and alloy causes the solid blade to 'detach' from the mould in these scaled areas. The formation of Ni oxides is facilitated by this separation.
NASA Astrophysics Data System (ADS)
Wang, Min; Cui, Qi; Wang, Jie; Ming, Dongping; Lv, Guonian
2017-01-01
In this paper, we first propose several novel concepts for object-based image analysis, including line-based shape regularity, line density, and scale-based best feature value (SBV), based on the region-line primitive association framework (RLPAF). We then propose a raft cultivation area (RCA) extraction method for high spatial resolution (HSR) remote sensing imagery based on multi-scale feature fusion and spatial rule induction. The proposed method includes the following steps: (1) multi-scale region primitives (segments) are obtained by the image segmentation method HBC-SEG, and line primitives (straight lines) are obtained by a phase-based line detection method; (2) association relationships between regions and lines are built based on RLPAF, and then multi-scale RLPAF features are extracted and SBVs are selected; (3) several spatial rules are designed to extract RCAs within sea waters after land-water separation. Experiments show that the proposed method can successfully extract different-shaped RCAs from HSR images with good performance.
Discovery of a diamond-based photonic crystal structure in beetle scales.
Galusha, Jeremy W; Richey, Lauren R; Gardner, John S; Cha, Jennifer N; Bartl, Michael H
2008-05-01
We investigated the photonic crystal structure inside iridescent scales of the weevil Lamprocyphus augustus. By combining a high-resolution structure analysis technique based on sequential focused ion beam milling and scanning electron microscopy imaging with theoretical modeling and photonic band-structure calculations, we discovered a natural three-dimensional photonic structure with a diamond-based crystal lattice operating at visible wavelengths. Moreover, we found that within individual scales, the diamond-based structure is assembled in the form of differently oriented single-crystalline micrometer-sized pixels with only selected lattice planes facing the scales' top surface. A comparison of results obtained from optical microreflectance measurements with photonic band-structure calculations reveals that it is this sophisticated microassembly of the diamond-based crystal lattice that lends Lamprocyphus augustus its macroscopically near angle-independent green coloration.
Desland, Fiona A; Afzal, Aqeela; Warraich, Zuha; Mocco, J
2014-01-01
Animal models of stroke have been crucial in advancing our understanding of the pathophysiology of cerebral ischemia. Currently, the standards for determining neurological deficit in rodents are the Bederson and Garcia scales, manual assessments scoring animals based on parameters ranked on a narrow scale of severity. Automated open field analysis of a live-video tracking system that analyzes animal behavior may provide a more sensitive test. Results obtained from the manual Bederson and Garcia scales did not show significant differences between pre- and post-stroke animals in a small cohort. When using the same cohort, however, post-stroke data obtained from automated open field analysis showed significant differences in several parameters. Furthermore, large cohort analysis also demonstrated increased sensitivity with automated open field analysis versus the Bederson and Garcia scales. These early data indicate use of automated open field analysis software may provide a more sensitive assessment when compared to traditional Bederson and Garcia scales.
Development and validation of the simulation-based learning evaluation scale.
Hung, Chang-Chiao; Liu, Hsiu-Chen; Lin, Chun-Chih; Lee, Bih-O
2016-05-01
The instruments that evaluate a student's perception of receiving simulated training are English versions and have not been tested for reliability or validity. The aim of this study was to develop and validate a Chinese version Simulation-Based Learning Evaluation Scale (SBLES). Four stages were conducted to develop and validate the SBLES. First, specific desired competencies were identified according to the National League for Nursing and Taiwan Nursing Accreditation Council core competencies. Next, the initial item pool was comprised of 50 items related to simulation that were drawn from the literature of core competencies. Content validity was established by use of an expert panel. Finally, exploratory factor analysis and confirmatory factor analysis were conducted for construct validity, and Cronbach's coefficient alpha determined the scale's internal consistency reliability. Two hundred and fifty students who had experienced simulation-based learning were invited to participate in this study. Two hundred and twenty-five students completed and returned questionnaires (response rate=90%). Six items were deleted from the initial item pool and one was added after an expert panel review. Exploratory factor analysis with varimax rotation revealed 37 items remaining in five factors which accounted for 67% of the variance. The construct validity of SBLES was substantiated in a confirmatory factor analysis that revealed a good fit of the hypothesized factor structure. The findings tally with the criterion of convergent and discriminant validity. The range of internal consistency for five subscales was .90 to .93. Items were rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree). The results of this study indicate that the SBLES is valid and reliable. The authors recommend that the scale could be applied in the nursing school to evaluate the effectiveness of simulation-based learning curricula. Copyright © 2016 Elsevier Ltd. All rights reserved.
Prediction of Time Response of Electrowetting
NASA Astrophysics Data System (ADS)
Lee, Seung Jun; Hong, Jiwoo; Kang, Kwan Hyoung
2009-11-01
It is very important to predict the time response of electrowetting-based devices, such as liquid lenses, reflective displays, and optical switches. We investigated the time response of electrowetting, based on an analytical and a numerical method, to identify characteristic scales and a scaling law for the switching time. For this, the spreading process of a sessile droplet was analyzed based on the domain perturbation method. First, we considered the case of weakly viscous fluids. The analytical result for the spreading process was compared with experimental results, which showed very good agreement in the overall time response. It was shown that the overall dynamics is governed by the P2 shape mode. We derived characteristic scales combining the droplet volume, density, and surface tension, and these scaled the overall dynamic process quite well. A scaling law was derived from the analytical solution and was verified experimentally. We also suggest a scaling law for highly viscous liquids, based on the results of a numerical analysis of the electrowetting-actuated spreading process.
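One plausible way to group the quantities named (density, volume, surface tension) into a characteristic time is the inertial-capillary estimate t_c = sqrt(ρV/σ); the short sketch below evaluates it for a generic 1 μL water droplet and is an illustrative assumption, not necessarily the paper's derived scale.

```python
# Back-of-envelope characteristic time from the quantities named in the abstract
# (density, droplet volume, surface tension). The grouping t_c = sqrt(rho*V/sigma)
# is the usual inertial-capillary estimate and is an assumption here.
import math

rho = 1000.0        # kg/m^3, water
sigma = 0.072       # N/m, water-air surface tension
volume = 1.0e-9     # m^3 (1 microlitre droplet)

t_c = math.sqrt(rho * volume / sigma)
print(f"characteristic response time ~ {t_c * 1e3:.1f} ms")   # a few milliseconds
```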
NASA Astrophysics Data System (ADS)
Ji, Yi; Sun, Shanlin; Xie, Hong-Bo
2017-06-01
Discrete wavelet transform (WT) followed by principal component analysis (PCA) has been a powerful approach for the analysis of biomedical signals. Wavelet coefficients at various scales and channels are usually flattened into a one-dimensional array, causing issues such as the curse of dimensionality and the small-sample-size problem. In addition, the lack of time-shift invariance of WT coefficients can behave as noise and degrade the classifier performance. In this study, we present a stationary wavelet-based two-directional two-dimensional principal component analysis (SW2D2PCA) method for the efficient and effective extraction of essential feature information from signals. Time-invariant multi-scale matrices are constructed in the first step. Two-directional two-dimensional principal component analysis then operates on the multi-scale matrices to reduce the dimension, rather than on vectors as in conventional PCA. Results are presented from an experiment to classify eight hand motions using 4-channel electromyographic (EMG) signals recorded in healthy subjects and amputees, which illustrates the efficiency and effectiveness of the proposed method for biomedical signal analysis.
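The sketch below illustrates the flavor of this pipeline under stated assumptions: PyWavelets' stationary wavelet transform builds a time-invariant multi-scale matrix per analysis window, and a one-directional 2DPCA (a simplification of the two-directional variant) reduces those matrices without vectorizing them. The signal, the 'db4' wavelet, the window count, and all dimensions are synthetic placeholders.

```python
# Simplified sketch of SW2D2PCA-style feature construction: a stationary wavelet
# transform gives a multi-scale matrix per window, then a one-directional 2DPCA
# reduces the matrices directly (the full method is two-directional).
import numpy as np
import pywt

rng = np.random.default_rng(8)
emg = rng.normal(size=1024)                          # one synthetic EMG channel, length 2**10

# Stationary WT: rows of the matrix are approximation/detail bands at each level.
coeffs = pywt.swt(emg, wavelet="db4", level=4)       # list of (cA, cD) per level
scale_matrix = np.vstack([band for pair in coeffs for band in pair])   # shape (8, 1024)

# 2DPCA over a set of such matrices (one per analysis window), without flattening.
windows = [scale_matrix + 0.1 * rng.normal(size=scale_matrix.shape) for _ in range(30)]
mean_mat = np.mean(windows, axis=0)
G = sum((w - mean_mat).T @ (w - mean_mat) for w in windows) / len(windows)   # (1024, 1024)
eigvals, eigvecs = np.linalg.eigh(G)
proj = eigvecs[:, -8:]                               # keep the 8 leading projection vectors
features = [w @ proj for w in windows]               # each window -> (8, 8) feature matrix
print("feature matrix shape per window:", features[0].shape)
```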
O'Donnell, Andrew P.; Kurama, Yahya C.; Kalkan, Erol; Taflanidis, Alexandros A.
2017-01-01
This paper experimentally evaluates four methods to scale earthquake ground-motions within an ensemble of records to minimize the statistical dispersion and maximize the accuracy in the dynamic peak roof drift demand and peak inter-story drift demand estimates from response-history analyses of nonlinear building structures. The scaling methods that are investigated are based on: (1) ASCE/SEI 7–10 guidelines; (2) spectral acceleration at the fundamental (first mode) period of the structure, Sa(T1); (3) maximum incremental velocity, MIV; and (4) modal pushover analysis. A total of 720 shake-table tests of four small-scale nonlinear building frame specimens with different static and dynamic characteristics are conducted. The peak displacement demands from full suites of 36 near-fault ground-motion records as well as from smaller “unbiased” and “biased” design subsets (bins) of ground-motions are included. Out of the four scaling methods, ground-motions scaled to the median MIV of the ensemble resulted in the smallest dispersion in the peak roof and inter-story drift demands. Scaling based on MIV also provided the most accurate median demands as compared with the “benchmark” demands for structures with greater nonlinearity; however, this accuracy was reduced for structures exhibiting reduced nonlinearity. The modal pushover-based scaling (MPS) procedure was the only method to conservatively overestimate the median drift demands.
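To make one of these measures concrete, the sketch below computes a maximum incremental velocity (MIV), under the common definition, as the largest area under the acceleration trace between successive zero crossings; the accelerogram is synthetic, and the subsequent scaling step is indicated only in a comment.

```python
# Hedged sketch of one scaling measure named above: maximum incremental velocity
# (MIV), taken here as the largest area under the ground acceleration between
# successive zero crossings. The accelerogram is synthetic; a record would then
# be multiplied by (target MIV / record MIV) to scale it to the ensemble median.
import numpy as np

def max_incremental_velocity(acc, dt):
    signs = np.sign(acc)
    crossings = np.where(np.diff(signs) != 0)[0] + 1       # indices where the sign changes
    edges = np.concatenate(([0], crossings, [len(acc)]))
    incremental = [acc[a:b].sum() * dt for a, b in zip(edges[:-1], edges[1:])]
    return np.max(np.abs(incremental))

dt = 0.005
t = np.arange(0, 20, dt)
rng = np.random.default_rng(6)
acc = np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.5 * t) + 0.02 * rng.normal(size=t.size)

miv = max_incremental_velocity(acc, dt)
print(f"MIV = {miv:.3f} (acceleration units x time, e.g. m/s)")
```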
ERIC Educational Resources Information Center
Duku, Eric; Vaillancourt, Tracy; Szatmari, Peter; Georgiades, Stelios; Zwaigenbaum, Lonnie; Smith, Isabel M.; Bryson, Susan; Fombonne, Eric; Mirenda, Pat; Roberts, Wendy; Volden, Joanne; Waddell, Charlotte; Thompson, Ann; Bennett, Teresa
2013-01-01
The purpose of this study was to examine the measurement properties of the Social Responsiveness Scale in an accelerated longitudinal sample of 4-year-old preschool children with the complementary approaches of categorical confirmatory factor analysis and Rasch analysis. Measurement models based on the literature and other hypothesized measurement…
Nonlinear Image Denoising Methodologies
2002-05-01
In this thesis, our approach to denoising is first based on a controlled nonlinear stochastic random walk to achieve a scale-space analysis (as in a stochastic treatment or interpretation of the diffusion). In addition, unless a specific stopping time is known to be adequate, the resulting evolution…
Development of an Instrument for Measuring Self-Efficacy in Cell Biology
ERIC Educational Resources Information Center
Reeve, Suzanne; Kitchen, Elizabeth; Sudweeks, Richard R.; Bell, John D.; Bradshaw, William S.
2011-01-01
This article describes the development of a ten-item scale to assess biology majors' self-efficacy towards the critical thinking and data analysis skills taught in an upper-division cell biology course. The original seven-item scale was expanded to include three additional items based on the results of item analysis. Evidence of reliability and…
Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis
NASA Technical Reports Server (NTRS)
Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.
2015-01-01
This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
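As a toy illustration of the POD step only (the DEIM and TPWL handling of the nonlinear radiative terms is omitted), the sketch below builds a reduced basis from a synthetic snapshot matrix via the SVD and projects a new state onto it; node counts, snapshot counts, and the energy tolerance are arbitrary.

```python
# Minimal proper orthogonal decomposition (POD) sketch: build a reduced basis
# from snapshots of a full-scale state and project a new state onto it.
# The snapshot matrix is synthetic; a real workflow would collect snapshots from
# the full ODE/DAE thermal model and add DEIM/TPWL for the nonlinear terms.
import numpy as np

rng = np.random.default_rng(7)
n_nodes, n_snapshots = 5000, 60
modes_true = rng.normal(size=(n_nodes, 3))                       # hidden low-rank structure
snapshots = (modes_true @ rng.normal(size=(3, n_snapshots))
             + 0.01 * rng.normal(size=(n_nodes, n_snapshots)))

# POD basis from the (thin) SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.999) + 1)                      # keep 99.9% of snapshot energy
basis = U[:, :r]
print("reduced dimension r =", r)

# Project a new full-scale state into the reduced space and reconstruct it.
T_full = modes_true @ rng.normal(size=3)
T_reduced = basis.T @ T_full                                     # r coefficients instead of 5000 nodes
T_recon = basis @ T_reduced
print("relative reconstruction error =",
      np.linalg.norm(T_recon - T_full) / np.linalg.norm(T_full))
```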
A large-scale perspective on stress-induced alterations in resting-state networks
NASA Astrophysics Data System (ADS)
Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron
2016-02-01
Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how it relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. Aiming to characterize this change, we employed statistical enrichment analysis, identifying anatomic structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with change in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post-induction revealed a single significant rsFC change between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight on stress-induced neural modulations and their relation to subjective experience.
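A hedged sketch of the parcellation-based univariate comparison follows: per-subject connectivity vectors before and after a condition, a paired test per parcel pair, and a Benjamini-Hochberg correction. Subject and parcel counts, the simulated time series, and the FDR level are placeholders, and the enrichment step is not shown.

```python
# Sketch of a parcel-pair rsFC comparison: correlation matrices per subject and
# condition, a paired t-test per parcel pair, and Benjamini-Hochberg FDR.
# All signals are simulated; counts and thresholds are placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_sub, n_parcel, n_tr = 57, 20, 150

def fc_vector(ts):
    """Upper-triangle vector of the parcel-by-parcel correlation matrix."""
    c = np.corrcoef(ts.T)
    iu = np.triu_indices(n_parcel, k=1)
    return c[iu]

pre = np.array([fc_vector(rng.normal(size=(n_tr, n_parcel))) for _ in range(n_sub)])
post = np.array([fc_vector(rng.normal(size=(n_tr, n_parcel))) for _ in range(n_sub)])

t, p = stats.ttest_rel(post, pre, axis=0)            # one test per parcel pair

# Benjamini-Hochberg FDR over all parcel pairs.
order = np.argsort(p)
m = len(p)
crit = 0.05 * np.arange(1, m + 1) / m
passed = p[order] <= crit
n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
print("parcel pairs surviving FDR:", n_sig)
```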
Ugarte, William J; Högberg, Ulf; Valladares, Eliette C; Essén, Birgitta
2013-04-01
Psychometric properties of external HIV-related stigma and discrimination scales and their predictors were investigated. A cross-sectional community-based study was carried out among 520 participants using an ongoing health and demographic surveillance system in León, Nicaragua. Participants completed an 18-item HIV stigma scale and 19 HIV and AIDS discrimination-related statements. A factor analysis found that 15 of the 18 items in the stigma scale and 18 of the 19 items in the discrimination scale loaded clearly onto five- and four-factor structures, respectively. Overall Cronbach's alphas of .81 for the HIV stigma scale and .91 for the HIV discrimination scale provided evidence of internal consistency. Hierarchical multiple linear regression analysis identified that females, rural residents, people with insufficient HIV-related transmission knowledge, those not tested for HIV, those reporting an elevated self-perception of HIV risk, and those unwilling to disclose their HIV status were associated with higher stigmatizing attitudes and higher discriminatory actions towards HIV-positive people. This is the first community-based study in Nicaragua to demonstrate that the overall HIV stigma and discrimination scales were reliable and valid in a community-based sample of men and women of reproductive age. Stigma and discrimination were reported to be high in the general population, especially among sub-groups. The findings of the current study suggest that community-based strategies, including the monitoring of stigma and discrimination and the design and implementation of stigma-reduction interventions, are greatly needed to reduce inequities and increase acceptance of persons with HIV.
Development and validation of a measure of pediatric oral health-related quality of life: the POQL
Huntington, Noelle L; Spetter, Dante; Jones, Judith A.; Rich, Sharon E.; Garcia, Raul I.; Spiro, Avron
2011-01-01
Objective: To develop a brief measure of oral health-related quality of life in children and demonstrate its reliability and validity in a diverse population. Methods: We administered the initial 20-item POQL to children (Child Self-Report) and parents (Parent Report on Child) from diverse populations in both school-based and clinic-based settings. Clinical oral health status was measured on a subset of children. We used factor analysis to determine the underlying scales and then reduced the measure to 10 items based on several considerations. Multitrait analysis on the resulting 10-item POQL was used to reaffirm the discrimination of scales and assess the measure's internal consistency and interscale correlations. We established discriminant and convergent validity with clinical status, perceived oral health and responses on the PedsQL and determined sensitivity to change with children undergoing ECC surgical repair. Results: Factor analysis returned a four-scale solution for the initial items: Physical Functioning, Role Functioning, Social Functioning and Emotional Functioning. The reduced items represented the same four scales: two each on Physical and Role and three each on Social and Emotional. Good reliability and validity were shown for the POQL as a whole and for each of the scales. Conclusions: The POQL is a valid and reliable measure of oral health-related quality of life for use in pre-school and school-aged children, with high utility for both clinical assessments and large-scale population studies. PMID:21972458
Development and validation of a measure of pediatric oral health-related quality of life: the POQL.
Huntington, Noelle L; Spetter, Dante; Jones, Judith A; Rich, Sharron E; Garcia, Raul I; Spiro, Avron
2011-01-01
To develop a brief measure of oral health-related quality of life (OHQL) in children and demonstrate its reliability and validity in a diverse population. We administered the initial 20-item Pediatric Oral Health-Related Quality of Life (POQL) to children (Child Self-Report) and parents (Parent Report on Child) from diverse populations in both school-based and clinic-based settings. Clinical oral health status was measured on a subset of children. We used factor analysis to determine the underlying scales and then reduced the measure to 10 items based on several considerations. Multitrait analysis on the resulting 10-item POQL was used to reaffirm the discrimination of scales and assess the measure's internal consistency and interscale correlations. We established discriminant and convergent validity with clinical status, perceived oral health and responses on the PedsQL, and determined sensitivity to change with children undergoing ECC surgical repair. Factor analysis returned a four-scale solution for the initial items--Physical Functioning, Role Functioning, Social Functioning, and Emotional Functioning. The reduced items represented the same four scales--two each on Physical and Role and three each on Social and Emotional. Good reliability and validity were shown for the POQL as a whole and for each of the scales. The POQL is a valid and reliable measure of OHQL for use in preschool and school-aged children, with high utility for both clinical assessments and large-scale population studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Computational Thermomechanical Modelling of Early-Age Silicate Composites
NASA Astrophysics Data System (ADS)
Vala, J.; Št'astník, S.; Kozák, V.
2009-09-01
Strains and stresses in early-age silicate composites, widely used in civil engineering and especially in fresh concrete mixtures, result not only from exterior mechanical loads but also from complicated non-deterministic physical and chemical processes. Their numerical prediction at the macro-scale level requires a non-trivial physical analysis based on thermodynamic principles, making use of micro-structural information from both theoretical and experimental research. The paper introduces a computational model, based on a nonlinear system of macroscopic evolution equations supplied with certain effective material characteristics coming from the micro-scale analysis, and sketches the algorithm for its numerical analysis.
Shouryabi, Ali Asghar; Ghahrisarabi, Alireza; Anboohi, Sima Zohari; Nasiri, Malihe; Rassouli, Maryam
2017-11-01
Nursing competence is highly related to patient outcomes and patient safety issues, especially in intensive care units. Competence assessment tools are needed specifically for intensive care nursing. This study was performed to determine the psychometric properties of the Intensive and Critical Care Nursing Competence Scale version-1 among Iranian nurses. The present study was a methodological research in which 289 nurses of intensive care units from nine hospitals of Shahid Beheshti University of Medical Sciences in Tehran were selected between 2015 and 2016. The original version of the scale was translated into Persian and back-translated into English, and the comments of the developer were applied. The validity of the scale was determined qualitatively (content validity and face validity) and quantitatively (confirmatory factor analysis). Reliability of the scale was reported using Cronbach's alpha coefficient and the intraclass correlation coefficient. SPSS-PC (v.21) and LISREL (v.8.5) were used to analyze the data. The Intensive and Critical Care Nursing Competence Scale version-1 is a self-assessment test that consists of 144 items and four domains (the knowledge base, the skill base, the attitudes and values base, and the experience base), which are divided into clinical competence and professional competence. Content and face validity were confirmed by 10 experts and 10 practitioner nurses in the intensive care units. In confirmatory factor analysis, all fitness indexes except the goodness-of-fit index (0.64) confirmed the four-factor structure of the ICCN-CS-1. In the factor analysis, item factor loadings were estimated between 0.304 and 0.727; only 4 of the 144 items loaded below 0.3, and because of the high Cronbach's alpha coefficients (0.984-0.986) all items were preserved, no item was removed, and the 4 subscales of the original scale were confirmed. The results of this study indicate that the Persian version of "The Intensive and Critical Care Nursing Competence Scale version-1" is a valid and reliable scale for the assessment of competency among Iranian nurses, and it can be used as a reliable scale in nursing management, education and research.
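For context, the internal-consistency statistic reported above (Cronbach's alpha) can be computed from an items-by-respondents matrix in a few lines of NumPy; this is a generic sketch, not the authors' code, and the example ratings are invented.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 6 respondents rating 4 items on a 1-5 scale
scores = np.array([[4, 5, 4, 5],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 3],
                   [4, 4, 5, 4],
                   [3, 2, 3, 3]])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```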
Honeycomb: Visual Analysis of Large Scale Social Networks
NASA Astrophysics Data System (ADS)
van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.
The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not very well suited to handle the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger scale data (with millions of connections), which we illustrate by using a large scale corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns and discuss lessons learned during design and implementation.
Kawarazuka, Nozomi; Locke, Catherine; McDougall, Cynthia; Kantor, Paula; Morgan, Miranda
2017-03-01
The demand for gender analysis is now increasingly orthodox in natural resource programming, including that for small-scale fisheries. Whilst the analysis of social-ecological resilience has made valuable contributions to integrating social dimensions into research and policy-making on natural resource management, it has so far demonstrated limited success in effectively integrating considerations of gender equity. This paper reviews the challenges in, and opportunities for, bringing a gender analysis together with social-ecological resilience analysis in the context of small-scale fisheries research in developing countries. We conclude that rather than searching for a single unifying framework for gender and resilience analysis, it will be more effective to pursue a plural solution in which closer engagement is fostered between analysis of gender and social-ecological resilience whilst preserving the strengths of each approach. This approach can make an important contribution to developing a better evidence base for small-scale fisheries management and policy.
2016-07-01
bias and scale factor tests. By testing state-of-the-art gyroscopes, the effect of input rate stability and accuracy may be examined. Based on the...tumble test or bias analysis at a tilted position to remove the effect of Earth’s rotation in the scale factor test • A rate table with better rate...format guide and test procedure for coriolis vibratory gyros. Piscataway (NJ): IEEE; 2004 Dec. 3. Maio A, Smith G, Knight R, Nothwang W, Conroy J
NASA Technical Reports Server (NTRS)
Furlong, G Chester; Mchugh, James G
1957-01-01
An analysis of the longitudinal characteristics of swept wings, based on available large-scale low-speed data and supplemented with low-scale data when feasible, is presented. The emphasis has been placed on differentiating the characteristics according to the basic flow phenomena involved. Insofar as possible, all large-scale data available as of August 15, 1951 have been summarized in tabular form for ready reference.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dietrich, J.P.; et al.
Uncertainty in the mass-observable scaling relations is currently the limiting factor for galaxy cluster based cosmology. Weak gravitational lensing can provide a direct mass calibration and reduce the mass uncertainty. We present new ground-based weak lensing observations of 19 South Pole Telescope (SPT) selected clusters and combine them with previously reported space-based observations of 13 galaxy clusters to constrain the cluster mass scaling relations with the Sunyaev-Zel'dovich effect (SZE), the cluster gas mass $M_\mathrm{gas}$, and $Y_\mathrm{X}$, the product of $M_\mathrm{gas}$ and X-ray temperature. We extend a previously used framework for the analysis of scaling relations and cosmological constraints obtained from SPT-selected clusters to make use of weak lensing information. We introduce a new approach to estimate the effective average redshift distribution of background galaxies and quantify a number of systematic errors affecting the weak lensing modelling. These errors include a calibration of the bias incurred by fitting a Navarro-Frenk-White profile to the reduced shear using $N$-body simulations. We blind the analysis to avoid confirmation bias. We are able to limit the systematic uncertainties to 6.4% in cluster mass (68% confidence). Our constraints on the mass-X-ray observable scaling relation parameters are consistent with those obtained by earlier studies, and our constraints for the mass-SZE scaling relation are consistent with the simulation-based prior used in the most recent SPT-SZ cosmology analysis. We can now replace the external mass calibration priors used in previous SPT-SZ cosmology studies with a direct, internal calibration obtained on the same clusters.
Khandoker, Ahsan H; Karmakar, Chandan K; Begg, Rezaul K; Palaniswami, Marimuthu
2007-01-01
As humans age or are influenced by pathology of the neuromuscular system, gait patterns are known to adjust, accommodating for reduced function in the balance control system. The aim of this study was to investigate the effectiveness of a wavelet-based multiscale analysis of a gait variable [minimum toe clearance (MTC)] in deriving indexes for understanding age-related declines in gait performance and screening of balance impairments in the elderly. MTC during walking on a treadmill for 30 healthy young, 27 healthy elderly and 10 falls-risk elderly subjects with a history of tripping falls was analyzed. The MTC signal from each subject was decomposed into eight detailed signals at different wavelet scales by using the discrete wavelet transform. The variances of the detailed signals at scales 8 to 1 were calculated. The multiscale exponent (beta) was then estimated from the slope of the variance progression at successive scales. The variance at scale 5 was significantly (p<0.01) different between the young and healthy elderly groups. Results also suggest that the beta between scales 1 and 2 is effective for recognizing falls-risk gait patterns. These results have implications for quantifying gait dynamics in normal, ageing and pathological conditions. Early detection of gait pattern changes due to ageing and balance impairments using wavelet-based multiscale analysis might provide the opportunity to initiate preemptive measures to avoid injurious falls.
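A minimal sketch of the wavelet-based multiscale variance computation described above is given below, using PyWavelets; the wavelet family, number of levels, and the toy signal are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
mtc = rng.normal(loc=15.0, scale=2.0, size=4096)    # toy MTC series (mm), stand-in for real gait data

# Decompose into detail signals at 8 dyadic scales and compute the variance at each scale
coeffs = pywt.wavedec(mtc, wavelet='db4', level=8)  # [approx, detail_8 (coarsest), ..., detail_1 (finest)]
detail_vars = [np.var(d) for d in coeffs[1:]]       # variances from scale 8 down to scale 1

# Multiscale exponent: slope of log2(variance) versus scale index
scales = np.arange(8, 0, -1)
beta = np.polyfit(scales, np.log2(detail_vars), 1)[0]
print("variance per scale:", np.round(detail_vars, 3))
print(f"multiscale exponent beta = {beta:.3f}")
```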
Multi-Scale Effects in the Strength of Ceramics
Cook, Robert F.
2016-01-01
Multiple length-scale effects are demonstrated in indentation-strength measurements of a range of ceramic materials under inert and reactive conditions. Meso-scale effects associated with flaw disruption by lateral cracking at large indentation loads are shown to increase strengths above the ideal indentation response. Micro-scale effects associated with toughening by microstructural restraints at small indentation loads are shown to decrease strengths below the ideal response. A combined meso-micro-scale analysis is developed that describes ceramic inert strength behaviors over the complete indentation flaw size range. Nano-scale effects associated with chemical equilibria and crack velocity thresholds are shown to lead to invariant minimum strengths at slow applied stressing rates under reactive conditions. A combined meso-micro-nano-scale analysis is developed that describes the full range of reactive and inert strength behaviors as a function of indentation load and applied stressing rate. Applications of the multi-scale analysis are demonstrated for materials design, materials selection, toughness determination, crack velocity determination, bond-rupture parameter determination, and prediction of reactive strengths. The measurements and analysis provide strong support for the existence of sharp crack tips in ceramics such that the nano-scale mechanisms of discrete bond rupture are separate from the larger scale crack driving force mechanics characterized by continuum-based stress-intensity factors. PMID:27563150
Pilecki, Maciej Wojciech; Kowal, Małgorzata; Woronkowicz, Agnieszka; Sobiecki, Jan; Kryst, Łukasz; Kamińska-Reyman, Jadwiga
2014-01-01
The aims of the study were: 1) the assessment of the interaction between the factors specified for behavioural problems observed in pre-school children based on a factor analysis and 2) the assessment of the relationship the specified factors have with the age and gender of the study group. A factor analysis based on a Principal Component Analysis of the main results of a Disturbing Behaviour Questionnaire (DBQ) completed by pre-school teachers, which includes categories of behaviour observed among pre-school age children that provoke the greatest concern among parents, guardians and educators. Nine-hundred and sixty-one children aged from 2.7 to 7.9 years (mean: 5.4; SD 1.13) from randomly chosen pre-schools in all districts of Kraków. Based on a scree plot, as well as on a substantive analysis of the results, a decision was taken to employ a four-factor analysis (Lagging behind, Excessive behaviour, Eating-avoidance and Overeating) explaining 68% of the common factor variance. A very high Cronbach's alpha value was returned for the reliability of the individual scales. The conducted analysis of the relationship of the scales with age and gender indicated a greater intensity of disturbing behaviour in boys for the Lagging behind factor, the Excessive behaviour factor and the overall scale for the Disturbing Behaviour Questionnaire (DBQ). These were the scales, along with the Eating-avoidance scale, that were found to be related to age. A greater intensity of disturbing behaviour was found to occur in the younger children. The relationship between the Overeating and Excessive behaviour scales that was found among girls but not among boys indicated that--even at such a young age--the characteristics associated with eating in the context of gender were already present. The authors consider that the coherence of the results obtained and their consistency with other studies of pre-school age children provide a sound platform for further analyses using the questionnaire described above.
Multiscale recurrence quantification analysis of order recurrence plots
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Shang, Pengjian; Lin, Aijing
2017-03-01
In this paper, we propose a new method of multiscale recurrence quantification analysis (MSRQA) to analyze the structure of order recurrence plots. The MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), the MSRQA can show richer and more recognizable information on the local characteristics of diverse systems which successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit their properties of recurrence at large time scales that quite differ from those at a single time scale. Some systems present more accurate recurrence patterns under large time scales. It demonstrates that the new approach is effective for distinguishing three similar stock market systems and showing some inherent differences.
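For readers unfamiliar with recurrence quantification, the sketch below shows its most basic ingredients, a thresholded recurrence matrix and the recurrence rate, evaluated on coarse-grained copies of a series; the embedding, threshold, and coarse-graining choices are illustrative assumptions and not the specific MSRQA recipe of the paper.

```python
import numpy as np

def recurrence_rate(x: np.ndarray, dim: int = 3, delay: int = 1, eps: float = 0.2) -> float:
    """Fraction of recurrent point pairs in a time-delay embedding of x."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return float((dists < eps).mean())

def coarse_grain(x: np.ndarray, scale: int) -> np.ndarray:
    """Non-overlapping averages of length `scale` (the usual multiscale step)."""
    m = len(x) // scale
    return x[:m * scale].reshape(m, scale).mean(axis=1)

rng = np.random.default_rng(1)
series = np.sin(0.1 * np.arange(1200)) + 0.3 * rng.standard_normal(1200)
for s in (1, 2, 4, 8):
    print(f"scale {s}: recurrence rate = {recurrence_rate(coarse_grain(series, s)):.3f}")
```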
The Fusion Gain Analysis of the Inductively Driven Liner Compression Based Fusion
NASA Astrophysics Data System (ADS)
Shimazu, Akihisa; Slough, John
2016-10-01
An analytical analysis of the fusion gain expected in the inductively driven liner compression (IDLC) based fusion is conducted to identify the fusion gain scaling at various operating conditions. The fusion based on the IDLC is a magneto-inertial fusion concept, where a Field-Reversed Configuration (FRC) plasmoid is compressed via the inductively-driven metal liner to drive the FRC to fusion conditions. In the past, an approximate scaling law for the expected fusion gain for the IDLC based fusion was obtained under the key assumptions of (1) D-T fuel at 5-40 keV, (2) adiabatic scaling laws for the FRC dynamics, (3) FRC energy dominated by the pressure balance with the edge magnetic field at the peak compression, and (4) the liner dwell time being the liner final diameter divided by the peak liner velocity. In this study, the various assumptions made in the previous derivation are relaxed to study the change in the fusion gain scaling from the previous result of $G \propto m_l^{1/2} E_l^{11/8}$, where $m_l$ is the liner mass and $E_l$ is the peak liner kinetic energy. The implications of the modified fusion gain scaling for the performance of the IDLC fusion reactor system are also explored.
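As a quick worked illustration of the quoted earlier scaling $G \propto m_l^{1/2} E_l^{11/8}$ (the abstract's previous result, not the modified scaling derived in the paper), the snippet below evaluates the relative gain change when liner mass or energy is scaled; the baseline ratios are arbitrary.

```python
# Relative fusion gain under the approximate scaling G ∝ m_l^(1/2) * E_l^(11/8).
def gain_ratio(mass_ratio: float, energy_ratio: float) -> float:
    return mass_ratio ** 0.5 * energy_ratio ** (11.0 / 8.0)

print(f"2x liner energy -> gain x {gain_ratio(1.0, 2.0):.2f}")   # ~2.59
print(f"2x liner mass   -> gain x {gain_ratio(2.0, 1.0):.2f}")   # ~1.41
print(f"2x both         -> gain x {gain_ratio(2.0, 2.0):.2f}")   # ~3.67
```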
An integrated assessment of location-dependent scaling for microalgae biofuel production facilities
Coleman, André M.; Abodeely, Jared M.; Skaggs, Richard L.; ...
2014-06-19
Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting and design through processing and upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are partially addressed by applying the Integrated Assessment Framework (IAF) – an integrated multi-scale modeling, analysis, and data management suite – to address key issues in developing and operating an open-pond microalgae production facility. This is done by analyzing how variability and uncertainty over space and through time affect feedstock production rates, and determining the site-specific "optimum" facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. To provide a baseline analysis, the IAF was applied in this paper to a set of sites in the southeastern U.S. with the potential to cumulatively produce 5 billion gallons per year. Finally, the results indicate costs can be reduced by scaling downstream processing capabilities to fit site-specific growing conditions, available and economically viable resources, and specific microalgal strains.
Spatio-temporal hierarchy in the dynamics of a minimalist protein model
NASA Astrophysics Data System (ADS)
Matsunaga, Yasuhiro; Baba, Akinori; Li, Chun-Biu; Straub, John E.; Toda, Mikito; Komatsuzaki, Tamiki; Berry, R. Stephen
2013-12-01
A method for time series analysis of molecular dynamics simulation of a protein is presented. In this approach, wavelet analysis and principal component analysis are combined to decompose the spatio-temporal protein dynamics into contributions from a hierarchy of different time and space scales. Unlike the conventional Fourier-based approaches, the time-localized wavelet basis captures the vibrational energy transfers among the collective motions of proteins. As an illustrative vehicle, we have applied our method to a coarse-grained minimalist protein model. During the folding and unfolding transitions of the protein, vibrational energy transfers between the fast and slow time scales were observed among the large-amplitude collective coordinates while the other small-amplitude motions are regarded as thermal noise. Analysis employing a Gaussian-based measure revealed that the time scales of the energy redistribution in the subspace spanned by such large-amplitude collective coordinates are slow compared to the other small-amplitude coordinates. Future prospects of the method are discussed in detail.
Cross-scale analysis of cluster correspondence using different operational neighborhoods
NASA Astrophysics Data System (ADS)
Lu, Yongmei; Thill, Jean-Claude
2008-09-01
Cluster correspondence analysis examines the spatial autocorrelation of multi-location events at the local scale. This paper argues that patterns of cluster correspondence are highly sensitive to the definition of operational neighborhoods that form the spatial units of analysis. A subset of multi-location events is examined for cluster correspondence if they are associated with the same operational neighborhood. This paper discusses the construction of operational neighborhoods for cluster correspondence analysis based on the spatial properties of the underlying zoning system and the scales at which the zones are aggregated into neighborhoods. Impacts of this construction on the degree of cluster correspondence are also analyzed. Empirical analyses of cluster correspondence between paired vehicle theft and recovery locations are conducted on different zoning methods and across a series of geographic scales and the dynamics of cluster correspondence patterns are discussed.
ERIC Educational Resources Information Center
Camparo, James; Camparo, Lorinda B.
2013-01-01
Though ubiquitous, Likert scaling's traditional mode of analysis is often unable to uncover all of the valid information in a data set. Here, the authors discuss a solution to this problem based on methodology developed by quantum physicists: the state multipole method. The authors demonstrate the relative ease and value of this method by…
Goch, Caspar J; Stieltjes, Bram; Henze, Romy; Hering, Jan; Poustka, Luise; Meinzer, Hans-Peter; Maier-Hein, Klaus H
2014-05-01
Diagnosis of autism spectrum disorders (ASD) is difficult, as symptoms vary greatly and are difficult to quantify objectively. Recent work has focused on the assessment of non-invasive diffusion tensor imaging-based biomarkers that reflect the microstructural characteristics of neuronal pathways in the brain. While tractography-based approaches typically analyze specific structures of interest, a graph-based large-scale network analysis of the connectome can yield comprehensive measures of larger-scale architectural patterns in the brain. Commonly applied global network indices, however, do not provide any specificity with respect to functional areas or anatomical structures. The aim of this work was to assess the concept of network centrality as a tool to perform locally specific analysis without disregarding the global network architecture, and to compare it to other popular network indices. We create connectome networks from fiber tractographies and parcellations of the human brain and compute global network indices as well as local indices for Wernicke's Area, Broca's Area and the Motor Cortex. Our approach was evaluated on 18 children suffering from ASD and 18 typically developed controls using magnetic resonance imaging-based cortical parcellations in combination with diffusion tensor imaging tractography. We show that the network centrality of Wernicke's area is significantly (p<0.001) reduced in ASD, while the motor cortex, which was used as a control region, did not show significant alterations. This could reflect the reduced capacity for comprehension of language in ASD. The betweenness centrality could potentially be an important metric in the development of future diagnostic tools in the clinical context of ASD diagnosis. Our results further demonstrate the applicability of large-scale network analysis tools in the domain of region-specific analysis with a potential application in many different psychological disorders.
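The betweenness-centrality index referred to above is a standard graph measure; a minimal sketch with NetworkX on a toy weighted connectome is shown below. The node names and weights are invented, and a real analysis would use parcellation-derived nodes with tractography-derived edge weights.

```python
import networkx as nx

# Toy connectome: nodes are brain parcels, edge weights are fiber counts (illustrative only)
G = nx.Graph()
G.add_weighted_edges_from([
    ("Wernicke", "Broca", 120),
    ("Wernicke", "MotorCortex", 40),
    ("Broca", "MotorCortex", 80),
    ("Wernicke", "Precuneus", 25),
    ("Precuneus", "MotorCortex", 60),
])

# Use the inverse weight as a path length so that strong connections count as "short"
for u, v, d in G.edges(data=True):
    d["length"] = 1.0 / d["weight"]

bc = nx.betweenness_centrality(G, weight="length", normalized=True)
print({node: round(score, 3) for node, score in bc.items()})
```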
Thin-plate spline analysis of the cranial base in subjects with Class III malocclusion.
Singh, G D; McNamara, J A; Lozanoff, S
1997-08-01
The role of the cranial base in the emergence of Class III malocclusion is not fully understood. This study determines deformations that contribute to a Class III cranial base morphology, employing thin-plate spline analysis on lateral cephalographs. A total of 73 children of European-American descent aged between 5 and 11 years with Class III malocclusion were compared with an equivalent group of subjects with a normal, untreated, Class I molar occlusion. The cephalographs were traced, checked and subdivided into seven age- and sex-matched groups. Thirteen points on the cranial base were identified and digitized. The datasets were scaled to an equivalent size, and statistical analysis indicated significant differences between average Class I and Class III cranial base morphologies for each group. Thin-plate spline analysis indicated that both affine (uniform) and non-affine transformations contribute toward the total spline for each average cranial base morphology at each age group analysed. For non-affine transformations, Partial warps 10, 8 and 7 had high magnitudes, indicating large-scale deformations affecting Bolton point, basion, pterygo-maxillare, Ricketts' point and articulare. In contrast, high eigenvalues associated with Partial warps 1-3, indicating localized shape changes, were found at tuberculum sellae, sella, and the frontonasomaxillary suture. It is concluded that large spatial-scale deformations affect the occipital complex of the cranial base and sphenoidal region, in combination with localized distortions at the frontonasal suture. These deformations may contribute to reduced orthocephalization or deficient flattening of the cranial base antero-posteriorly that, in turn, leads to the formation of a Class III malocclusion.
Ross, Amy M; Ilic, Kelley; Kiyoshi-Teo, Hiroko; Lee, Christopher S
2017-12-26
The purpose of this study was to establish the psychometric properties of the new 16-item leadership environment scale. The leadership environment scale was based on complexity science concepts relevant to complex adaptive health care systems. A workforce survey of direct-care nurses was conducted (n = 1,443) in Oregon. Confirmatory factor analysis, exploratory factor analysis, concordant validity test and reliability tests were conducted to establish the structure and internal consistency of the leadership environment scale. Confirmatory factor analysis indices approached acceptable thresholds of fit with a single factor solution. Exploratory factor analysis showed improved fit with a two-factor model solution; the factors were labelled 'influencing relationships' and 'interdependent system supports'. Moderate to strong convergent validity was observed between the leadership environment scale/subscales and both the nursing workforce index and the safety organising scale. Reliability of the leadership environment scale and subscales was strong, with all alphas ≥.85. The leadership environment scale is structurally sound and reliable. Nursing management can employ adaptive complexity leadership attributes, measure their influence on the leadership environment, subsequently modify system supports and relationships and improve the quality of health care systems. The leadership environment scale is an innovative fit to complex adaptive systems and how nurses act as leaders within these systems. © 2017 John Wiley & Sons Ltd.
Smith, William Pastor
2013-09-01
The primary purpose of this two-phased study was to examine the structural validity and statistical utility of a racism scale specific to Black men who have sex with men (MSM) who resided in the Washington, DC, metropolitan area and Baltimore, Maryland. Phase I involved pretesting a 10-item racism measure with 20 Black MSM. Based on pretest findings, the scale was adapted into a 21-item racism scale for use in collecting data on 166 respondents in Phase II. Exploratory factor analysis of the 21-item racism scale resulted in a 19-item, two-factor solution. The two factors or subscales were the following: General Racism and Relationships and Racism. Confirmatory factor analysis was used in testing construct validity of the factored racism scale. Specifically, the two racism factors were combined with three homophobia factors into a confirmatory factor analysis model. Based on a summary of the fit indices, both the comparative and incremental fit indices were equal to .90, suggesting an adequate convergence of the racism and homophobia dimensions into a single social oppression construct. Statistical utility of the two racism subscales was demonstrated when regression analysis revealed that the gay-identified men, compared with the bisexual-identified men in the sample, were more likely to experience increased racism within the context of intimate relationships and less likely to be exposed to repeated experiences of general racism. Overall, the findings in this study highlight the importance of continuing to explore the psychometric properties of a racism scale that accounts for the unique psychosocial concerns experienced by Black MSM.
ERIC Educational Resources Information Center
Colligan, Robert C.; And Others
1994-01-01
Developed bipolar Minnesota Multiphasic Personality Inventory (MMPI) Optimism-Pessimism (PSM) scale based on results on Content Analysis of Verbatim Explanation applied to MMPI. Reliability and validity indices show that PSM scale is highly accurate and consistent with Seligman's theory that pessimistic explanatory style predicts increased…
Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data
NASA Astrophysics Data System (ADS)
Gulbe, Linda; Caune, Vairis; Korats, Gundars
2017-12-01
The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information could be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for analysis of city temperature patterns. During the study, the correlation of temperature patterns with vegetation content (based on NDVI) and with building coverage (based on OpenStreetMap data) was studied. Landsat-based temperature patterns were independent of the season, negatively correlated with vegetation content and positively correlated with building coverage. Small-scale analysis included spatial and raster descriptor analysis for polygons corresponding to the roofs of individual buildings, for evaluating the insulation of roofs. Remote sensing and spatial descriptors are poorly related to heat consumption data; however, the median and entropy of aerial thermal data can help to identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but quality is limited by reference data quality, and information on building types and roof materials would be crucial for further studies.
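A minimal sketch of the large-scale correlation analysis described above is given below, assuming the land-surface-temperature band and an NDVI raster have already been co-registered to the same grid; the array names, random stand-in data, and masking rule are illustrative.

```python
import numpy as np

# lst: land-surface temperature (K), ndvi: vegetation index, both 2-D arrays on the same grid.
# Random stand-ins here; in practice they would be read from co-registered rasters.
rng = np.random.default_rng(2)
ndvi = rng.uniform(-0.1, 0.8, size=(600, 600))
lst = 300.0 - 8.0 * ndvi + rng.normal(0.0, 1.0, size=ndvi.shape)

valid = np.isfinite(lst) & np.isfinite(ndvi) & (ndvi > -1) & (ndvi < 1)
r = np.corrcoef(lst[valid], ndvi[valid])[0, 1]
print(f"Pearson r between temperature and NDVI: {r:.2f}")   # expected to be negative
```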
Graffigna, Guendalina; Barello, Serena; Bonanomi, Andrea; Lozza, Edoardo
2015-01-01
Beyond the rhetorical call for increasing patients' engagement, policy makers recognize the urgency to have an evidence-based measure of patients' engagement and capture its effect when planning and implementing initiatives aimed at sustaining the engagement of consumers in their health. In this paper, authors describe the Patient Health Engagement Scale (PHE-scale), a measure of patient engagement that is grounded in rigorous conceptualization and appropriate psychometric methods. The scale was developed based on our previous conceptualization of patient engagement (the PHE-model). In particular, the items of the PHE-scale were developed based on the findings from the literature review and from interviews with chronic patients. Initial psychometric analysis was performed to pilot test a preliminary version of the items. The items were then refined and administered to a national sample of chronic patients (N = 382) to assess the measure's psychometric performance. A final phase of test-retest reliability was performed. The analysis showed that the PHE Scale has good psychometric properties with good correlation with concurrent measures and solid reliability. Having a valid and reliable measure to assess patient engagement is the first step in understanding patient engagement and its role in health care quality, outcomes, and cost containment. The PHE Scale shows a promising clinical relevance, indicating that it can be used to tailor intervention and assess changes after patient engagement interventions. PMID:25870566
An Analysis of the Connectedness to Nature Scale Based on Item Response Theory.
Pasca, Laura; Aragonés, Juan I; Coello, María T
2017-01-01
The Connectedness to Nature Scale (CNS) is used as a measure of the subjective cognitive connection between individuals and nature. However, to date, it has not been analyzed at the item level to confirm its quality. In the present study, we conduct such an analysis based on Item Response Theory. We employed data from previous studies using the Spanish-language version of the CNS, analyzing a sample of 1008 participants. The results show that seven items presented appropriate indices of discrimination and difficulty, in addition to a good fit. The remaining six have inadequate discrimination indices and do not present a good fit. A second study with 321 participants shows that the seven-item scale has adequate levels of reliability and validity. Therefore, it would be appropriate to use a reduced version of the scale after eliminating the items that display inappropriate behavior, since they may interfere with research results on connectedness to nature.
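For context, the discrimination and difficulty indices mentioned above come from an item response model; a minimal sketch of the two-parameter logistic (2PL) item characteristic curve is given below with invented parameter values, purely to illustrate how the two indices enter the model.

```python
import numpy as np

def p_endorse_2pl(theta: np.ndarray, a: float, b: float) -> np.ndarray:
    """2PL item characteristic curve: P(endorse | theta) for discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                      # latent connectedness-to-nature levels
good_item = p_endorse_2pl(theta, a=1.8, b=0.0)     # well-discriminating item (illustrative)
weak_item = p_endorse_2pl(theta, a=0.4, b=0.0)     # poorly discriminating item (illustrative)
print(np.round(good_item, 2))                      # steep curve: item separates low from high theta
print(np.round(weak_item, 2))                      # flat curve: item carries little information
```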
Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.
Li, Harbin; McNulty, Steven G
2007-10-01
Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
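A minimal sketch of the kind of Monte Carlo uncertainty propagation such an analysis relies on is shown below, using a deliberately simplified stand-in for the mass-balance equation; the actual SMBE terms and parameter distributions of the study are not reproduced here, and all numbers are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical, simplified stand-in: critical acid load = base cation weathering term
# (rate x soil depth x temperature factor) + acid neutralizing capacity term.
bc_w_rate = rng.normal(0.5, 0.10, n)     # base cation weathering rate (keq/ha/yr per m), assumed
soil_depth = rng.normal(0.6, 0.12, n)    # soil depth (m), assumed
temp_factor = rng.normal(1.0, 0.05, n)   # dimensionless temperature adjustment, assumed
anc = rng.normal(0.3, 0.14, n)           # acid neutralizing capacity term (keq/ha/yr), assumed

cal = bc_w_rate * soil_depth * temp_factor + anc
print(f"mean CAL = {cal.mean():.2f}, 95% interval = "
      f"({np.percentile(cal, 2.5):.2f}, {np.percentile(cal, 97.5):.2f})")

# Crude sensitivity ranking: squared correlation of each input with the output
for name, x in [("BC_w rate", bc_w_rate), ("soil depth", soil_depth),
                ("temperature", temp_factor), ("ANC", anc)]:
    r2 = np.corrcoef(x, cal)[0, 1] ** 2
    print(f"{name:12s} ~{100 * r2:.0f}% of output variance (linear approximation)")
```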
Storrs, Mark J; Alexander, Heather; Sun, Jing; Kroon, Jeroen; Evans, Jane L
2015-03-01
Previous research on interprofessional education (IPE) assessment has shown the need to evaluate the influence of team-based processes on the quality of clinical education. This study aimed to develop a valid and reliable instrument to evaluate the effectiveness of interprofessional team-based treatment planning (TBTP) on the quality of clinical education at the Griffith University School of Dentistry and Oral Health, Queensland, Australia. A scale was developed and evaluated to measure interprofessional student team processes and their effect on the quality of clinical education for dental, oral health therapy, and dental technology students (known more frequently as intraprofessional education). A face validity analysis by IPE experts confirmed that items on the scale reflected the meaning of relevant concepts. After piloting, 158 students (61% response rate) involved with TBTP participated in a survey. An exploratory factor analysis using the principal component method retained 23 items with a total variance of 64.6%, suggesting high content validity. Three subscales accounted for 45.7%, 11.4%, and 7.5% of the variance. Internal consistency of the scale (α=0.943) and subscales 1 (α=0.953), 2 (α=0.897), and 3 (α=0.813) was high. A reliability analysis yielded moderate (rs=0.43) to high correlations (0.81) with the remaining scale items. Confirmatory factor analyses verified convergent validity and confirmed that this structure had a good model fit. This study suggests that the instrument might be useful in evaluating interprofessional or intraprofessional team-based processes and their influence on the quality of clinical education in academic dental institutions.
Conners' Teacher Rating Scale for Preschool Children: A Revised, Brief, Age-Specific Measure
ERIC Educational Resources Information Center
Purpura, David J.; Lonigan, Christopher J.
2009-01-01
The Conners' Teacher Rating Scale-Revised (CTRS-R) is one of the most commonly used measures of child behavior problems. However, the scale length and the appropriateness of some of the items on the scale may reduce the usefulness of the CTRS-R for use with preschoolers. In this study, a Graded Response Model analysis based on Item Response Theory…
Andersen, Randi Dovland; Jylli, Leena; Ambuel, Bruce
2014-06-01
There is little empirical evidence regarding the translation and cultural adaptation of self-report and observational outcome measures. Studies that evaluate and further develop existing practices are needed. This study explores the use of cognitive interviews in the translation and cultural adaptation of observational measures, using the COMFORT behavioral scale as an example, and demonstrates a structured approach to the analysis of data from cognitive interviews. The COMFORT behavioral scale is developed for assessment of distress and pain in a pediatric intensive care setting. Qualitative, descriptive methodological study. One general public hospital trust in southern Norway. N=12. Eight nurses, three physicians and one nurse assistant, from different wards and with experience caring for children. We translated the COMFORT behavior scale into Norwegian before conducting individual cognitive interviews. Participants first read and then used the translated version of the COMFORT behavioral scale to assess pain based on a 3-min film vignette depicting an infant in pain/distress. Two cognitive interview techniques were applied: Thinking Aloud (TA) during the assessment and Verbal Probing (VP) afterwards. In TA the participant verbalized his/her thought process while completing the COMFORT behavioral scale. During VP the participant responded to specific questions related to understanding of the measure, information recall and the decision process. We audio recorded, transcribed and analyzed interviews using a structured qualitative method (cross-case analysis based on predefined categories and development of a results matrix). Our analysis revealed two categories of problems: (1) Scale problems, warranting a change in the wording of the scale, including (a) translation errors, (b) content not understood as intended, and (c) differences between the original COMFORT scale and the revised COMFORT behavioral scale; and (2) Rater-context problems caused by (a) unfamiliarity with the scale, (b) lack of knowledge and experience, and (c) assessments based on a film vignette. Cognitive interviews revealed problems with both the translated and the original versions of the scale and suggested solutions that enhanced the validity of both versions. Cognitive interviews might be seen as a complement to current published best practices for translation and cultural adaptation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Exploring stability of entropy analysis for signal with different trends
NASA Astrophysics Data System (ADS)
Zhang, Yin; Li, Jin; Wang, Jun
2017-03-01
Because of environmental disturbances and instrument effects, actual measured signals always carry different trends, which makes it difficult to accurately capture signal complexity. Choosing stable and effective analysis methods is therefore very important. In this paper, we applied two entropy measures, the base-scale entropy and the approximate entropy, to analyze signal complexity, and studied the effect of trends on an ideal signal and on heart rate variability (HRV) signals, namely linear, periodic, and power-law trends, which are likely to occur in actual signals. The results show that the approximate entropy is unstable when different trends are embedded into the signals, so it is not suitable for analyzing signals with trends. In contrast, the base-scale entropy has preferable stability and accuracy for signals with different trends. The base-scale entropy is therefore an effective method for analyzing actual signals.
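A compact reference implementation of approximate entropy (one of the two measures compared above) is sketched below; the base-scale entropy variant is not shown, and the tolerance and embedding choices are the conventional ones rather than the authors' exact settings.

```python
import numpy as np

def approximate_entropy(x: np.ndarray, m: int = 2, r_factor: float = 0.2) -> float:
    """ApEn(m, r) with tolerance r = r_factor * std(x), following the standard definition."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m_: int) -> float:
        n = len(x) - m_ + 1
        templates = np.array([x[i:i + m_] for i in range(n)])
        # Chebyshev distance between all template pairs
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        counts = (dist <= r).sum(axis=1) / n
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(4)
clean = np.sin(0.2 * np.arange(800))
noisy = clean + 0.5 * rng.standard_normal(800)
print(f"ApEn(clean) = {approximate_entropy(clean):.3f}")
print(f"ApEn(noisy) = {approximate_entropy(noisy):.3f}")   # larger value: less regular signal
```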
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
Hydrological landscape analysis based on digital elevation data
NASA Astrophysics Data System (ADS)
Seibert, J.; McGlynn, B.; Grabs, T.; Jensco, K.
2008-12-01
Topography is a major factor controlling both hydrological and soil processes at the landscape scale. While this is well-accepted qualitatively, quantifying relationships between topography and spatial variations of hydrologically relevant variables at the landscape scale still remains a challenging research topic. In this presentation, we describe hydrological landscape analysis (HLA) as a way to derive relevant topographic indices to describe the spatial variations of hydrological variables at the landscape scale. We demonstrate our HLA approach with four high-resolution digital elevation models (DEMs) from Sweden, Switzerland and Montana (USA). To investigate scale effects on HLA metrics, we compared DEMs of different resolutions. These LiDAR-derived DEMs of 3 m, 10 m, and 30 m resolution represent catchments of ~5 km2 ranging from low to high relief. A central feature of HLA is the flowpath-based analysis of topography and the separation of hillslopes, riparian areas, and the stream network. We included the following metrics: riparian area delineation, riparian buffer potential, separation of stream inflows into right- and left-bank components, travel time proxies based on flowpath distances and gradients to the channel, and, in analogy to the hypsometric curve, the distribution of elevations above the stream network (computed based on the location where a certain flow pathway enters the stream). Several of these indices depended clearly on DEM resolution, whereas this effect was minor for others. While the hypsometric curves were all S-shaped, the 'hillslope-hypsometric curves' had the shape of a power function with exponents less than 1. In a similar way we separated flow pathway lengths and gradients between hillslopes and streams and compared a topographic travel time proxy, which was based on the integration of gradients along the flow pathways. Besides the comparison of HLA metrics for different catchments and DEM resolutions, we present examples from experimental catchments to illustrate how these metrics can be used to describe catchment-scale hydrological processes and provide context for plot-scale observations.
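A minimal sketch of the distribution-of-elevations computation behind such hypsometric-style curves is given below on a synthetic DEM; reading real elevation grids and routing flow pathways to the stream network are outside the scope of this toy example, and the stream base level is a crude stand-in.

```python
import numpy as np

rng = np.random.default_rng(5)
dem = rng.gamma(shape=2.0, scale=150.0, size=(500, 500))   # synthetic elevations (m)
stream_elev = dem.min()                                    # toy stand-in for the local stream base level

# Hypsometric-style curve: fraction of catchment area lying above each elevation-above-stream value
height_above_stream = np.sort((dem - stream_elev).ravel())
frac_area_above = 1.0 - np.arange(height_above_stream.size) / height_above_stream.size

# Sample a few points of the curve
for q in (0.1, 0.5, 0.9):
    h = np.quantile(height_above_stream, q)
    print(f"{(1 - q) * 100:.0f}% of the area lies more than {h:.0f} m above the stream")
```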
NASA Astrophysics Data System (ADS)
Donner, Reik; Balasis, Georgios; Stolbova, Veronika; Wiedermann, Marc; Georgiou, Marina; Kurths, Jürgen
2016-04-01
Magnetic storms are the most prominent global manifestations of out-of-equilibrium magnetospheric dynamics. Investigating the dynamical complexity exhibited by geomagnetic observables can provide valuable insights into relevant physical processes as well as temporal scales associated with this phenomenon. In this work, we introduce several innovative data analysis techniques enabling a quantitative analysis of the Dst index non-stationary behavior. Using recurrence quantification analysis (RQA) and recurrence network analysis (RNA), we obtain a variety of complexity measures serving as markers of quiet- and storm-time magnetospheric dynamics. We additionally apply these techniques to the main driver of Dst index variations, the VBSouth coupling function, and interplanetary medium parameters Bz and Pdyn in order to discriminate internal processes from the magnetosphere's response directly induced by the external forcing by the solar wind. The derived recurrence-based measures allow us to improve the accuracy with which magnetospheric storms can be classified based on ground-based observations. The new methodology presented here could be of significant interest for the space weather research community working on time series analysis for magnetic storm forecasts.
Models of inertial range spectra of interplanetary magnetohydrodynamic turbulence
NASA Technical Reports Server (NTRS)
Zhou, YE; Matthaeus, William H.
1990-01-01
A framework based on turbulence theory is presented to develop approximations for the local turbulence effects that are required in transport models. An approach based on Kolmogoroff-style dimensional analysis is presented as well as one based on a wave-number diffusion picture. Particular attention is given to the case of MHD turbulence with arbitrary cross helicity and with arbitrary ratios of the Alfven time scale and the nonlinear time scale.
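For reference, the Kolmogoroff-style dimensional argument alluded to above gives the familiar inertial-range forms written below (a textbook result stated here for orientation, not a quotation from this paper):

```latex
% Inertial-range energy spectrum from Kolmogorov-style dimensional analysis:
% with energy transfer (dissipation) rate \varepsilon and wavenumber k,
E(k) \;=\; C_K\, \varepsilon^{2/3} k^{-5/3},
% and the corresponding nonlinear (eddy-turnover) time scale at wavenumber k
\tau_{\mathrm{nl}}(k) \;\sim\; \varepsilon^{-1/3} k^{-2/3}.
```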
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a "plug-and-play" framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE's graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
Scaling analysis for the investigation of slip mechanisms in nanofluids
NASA Astrophysics Data System (ADS)
Savithiri, S.; Pattamatta, Arvind; Das, Sarit K.
2011-07-01
The primary objective of this study is to investigate the effect of slip mechanisms in nanofluids through scaling analysis. The role of nanoparticle slip mechanisms in both water- and ethylene glycol-based nanofluids is analyzed by considering shape, size, concentration, and temperature of the nanoparticles. From the scaling analysis, it is found that all of the slip mechanisms are dominant in particles of cylindrical shape as compared to that of spherical and sheet particles. The magnitudes of slip mechanisms are found to be higher for particles of size between 10 and 80 nm. The Brownian force is found to dominate in smaller particles below 10 nm and also at smaller volume fraction. However, the drag force is found to dominate in smaller particles below 10 nm and at higher volume fraction. The effect of thermophoresis and Magnus forces is found to increase with the particle size and concentration. In terms of time scales, the Brownian and gravity forces act considerably over a longer duration than the other forces. For copper-water-based nanofluid, the effective contribution of slip mechanisms leads to a heat transfer augmentation which is approximately 36% over that of the base fluid. The drag and gravity forces tend to reduce the Nusselt number of the nanofluid while the other forces tend to enhance it.
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li - Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
Arkas: Rapid reproducible RNAseq analysis
Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan
2017-01-01
The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNAseq pipelines Arkas-Quantification and Arkas-Analysis, available within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. Given the inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment improving data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and calculates differential expression and gene-set enrichment analysis on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating raw sequencing importing, SRA FASTQ conversion, RNA quantification and analysis steps. PMID:28868134
Scale invariance in Newton–Cartan and Hořava–Lifshitz gravity
NASA Astrophysics Data System (ADS)
Olgu Devecioğlu, Deniz; Özdemir, Neşe; Ozkan, Mehmet; Zorba, Utku
2018-06-01
We present a detailed analysis of the construction of z = 2 and scale invariant Hořava–Lifshitz gravity. The construction procedure is based on the realization of Hořava–Lifshitz gravity as the dynamical Newton–Cartan geometry as well as a non-relativistic tensor calculus in the presence of the scale symmetry. An important consequence of this method is that it provides us with the necessary mechanism to distinguish the local scale invariance from the local Schrödinger invariance. Based on this result we discuss the z = 2 scale invariant Hořava–Lifshitz gravity and the symmetry enhancement to the full Schrödinger group.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinthavali, Supriya; Shankar, Mallikarjun
Critical Infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CIs interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches on vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) Constructing and managing a CI network-of-networks graph, (2) Performing analytic operations at scale, and (3) Interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline to realize a national-scale network based vulnerability analysis system.
The effect of Web-based Braden Scale training on the reliability of Braden subscale ratings.
Magnan, Morris A; Maklebust, JoAnn
2009-01-01
The primary purpose of this study was to evaluate the effect of Web-based Braden Scale training on the reliability of Braden Scale subscale ratings made by nurses working in acute care hospitals. A secondary purpose was to describe the distribution of reliable Braden subscale ratings before and after Web-based Braden Scale training. Secondary analysis of data from a recently completed quasi-experimental, pretest-posttest, interrater reliability study. A convenience sample of RNs working at 3 Michigan medical centers voluntarily participated in the study. RN participants included nurses who used the Braden Scale regularly at their place of employment ("regular users") as well as nurses who did not use the Braden Scale at their place of employment ("new users"). Using a pretest-posttest, quasi-experimental design, pretest interrater reliability data were collected to identify the percentage of nurses making reliable Braden subscale assessments. Nurses then completed a Web-based Braden Scale training module after which posttest interrater reliability data were collected. The reliability of nurses' Braden subscale ratings was determined by examining the level of agreement/disagreement between ratings made by an RN and an "expert" rating the same patient. In total, 381 RN-to-expert dyads were available for analysis. During both the pretest and posttest periods, the percentage of reliable subscale ratings was highest for the activity subscale, lowest for the moisture subscale, and second lowest for the nutrition subscale. With Web-based Braden Scale training, the percentage of reliable Braden subscale ratings made by new users increased for all 6 subscales with statistically significant improvements in the percentage of reliable assessments made on 3 subscales: sensory-perception, moisture, and mobility. Training had virtually no effect on the percentage of reliable subscale ratings made by regular users of the Braden Scale. With Web-based Braden Scale training the percentage of nurses making reliable ratings increased for all 6 subscales, but this was true for new users only. Additional research is needed to identify educational approaches that effectively improve and sustain the reliability of subscale ratings among regular users of the Braden Scale. Moreover, special attention needs to be given to ensuring that all nurses working with the Braden Scale have a clear understanding of the intended meanings and correct approaches to rating moisture and nutrition subscales.
Edelbring, Samuel
2012-08-15
The degree of learners' self-regulated learning and dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in settings other than those in which they are being used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). The regulation scales were evaluated regarding their reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach's alpha: 0.82, 0.72, and 0.65 for the self-regulation, external regulation and lack of regulation scales, respectively. The dimensionalities of the scales were adequate for self-regulation and its subscales, whereas external regulation and lack of regulation displayed less unidimensionality. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students' regulation strategies enables a broad empirical base for increasing knowledge on regulation strategies in relation to different disciplinary settings and contributes to theoretical development.
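Several of the reliability figures reported in these abstracts are Cronbach's alpha values. A minimal sketch of how alpha is computed from an item-score matrix is shown below, using simulated Likert-type responses as an assumption; it is illustrative only and not the analysis code of any of the cited studies.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 206 respondents answering 7 items driven by one latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=(206, 1))
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(206, 7))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```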
Multi-scale statistical analysis of coronal solar activity
Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.
2016-07-08
Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
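POD of a stack of temperature maps can be illustrated with the snapshot method, i.e., a singular value decomposition of the mean-subtracted snapshot matrix. The sketch below is a generic implementation under assumed array shapes, not the authors' actual processing pipeline.

```python
import numpy as np

def pod_modes(snapshots: np.ndarray, n_modes: int = 5):
    """Proper orthogonal decomposition of snapshots shaped (n_times, ny, nx)."""
    n_t = snapshots.shape[0]
    X = snapshots.reshape(n_t, -1).T            # columns are flattened snapshots
    X = X - X.mean(axis=1, keepdims=True)       # remove the temporal mean field
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    energy = s**2 / np.sum(s**2)                # fraction of variance per mode
    modes = U[:, :n_modes].T.reshape(n_modes, *snapshots.shape[1:])
    coeffs = (np.diag(s[:n_modes]) @ Vt[:n_modes]).T   # temporal coefficients
    return modes, coeffs, energy[:n_modes]

# Hypothetical stack of 40 temperature maps on a 64 x 64 grid
maps = np.random.default_rng(1).normal(size=(40, 64, 64))
modes, coeffs, energy = pod_modes(maps, n_modes=3)
print("energy captured by first 3 modes:", np.round(energy, 3))
```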
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. Following models will provide insights about behaviors (including diversity) that take place at the ecosystem scale.
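The multi-objective extensions described here build on standard flux balance analysis, which maximizes a single objective subject to steady-state mass balance and flux bounds. A minimal single-objective sketch for a toy three-reaction network is given below; the stoichiometric matrix, bounds, and objective are hypothetical, and the paper's MO-FBA/MO-FVA formulations add further objectives on top of such a base problem.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass)
# Steady state requires S @ v = 0 for the internal metabolites A and B.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by v1, consumed by v2
    [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
])
bounds = [(0, 10), (0, 10), (0, 10)]     # flux bounds (hypothetical units)
c = np.array([0, 0, -1.0])               # maximize biomass flux v3 (minimize -v3)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)          # expected [10, 10, 10]
```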
USB environment measurements based on full-scale static engine ground tests
NASA Technical Reports Server (NTRS)
Sussman, M. B.; Harkonen, D. L.; Reed, J. B.
1976-01-01
Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.
The calibration analysis of soil infiltration formula in farmland scale
NASA Astrophysics Data System (ADS)
Qian, Tao; Han, Na Na; Chang, Shuan Ling
2018-06-01
Soil infiltration characteristics are an important basis for parameter estimation at the farmland scale. Twelve groups of double-ring infiltration tests were conducted in the test field of the west campus of Tianjin Agricultural University. Based on calibration theory combined with statistical methods, a calibration analysis of the Philip infiltration formula was carried out and the spatial variation characteristics of the calibration factor were analyzed. The results show that, in the study area, the calibration factor αA calculated from the steady soil infiltration rate A gives the best calibration effect and is therefore the factor best suited for calibrating the infiltration formula in this area; the coefficient of variation of αA is 0.3234, indicating a certain degree of spatial variability.
Establishing a direct connection between detrended fluctuation analysis and Fourier analysis
NASA Astrophysics Data System (ADS)
Kiyono, Ken
2015-10-01
To understand methodological features of the detrended fluctuation analysis (DFA) using a higher-order polynomial fitting, we establish the direct connection between DFA and Fourier analysis. Based on an exact calculation of the single-frequency response of the DFA, the following facts are shown analytically: (1) in the analysis of stochastic processes exhibiting a power-law scaling of the power spectral density (PSD), S(f) ~ f^(−β), a higher-order detrending in the DFA has no adverse effect on the estimation of the DFA scaling exponent α, which satisfies the scaling relation α = (β + 1)/2; (2) the upper limit of the scaling exponents detectable by the DFA depends on the order of the polynomial fit used in the DFA, and is bounded by m + 1, where m is the order of the polynomial fit; (3) the relation between the time scale in the DFA and the corresponding frequency in the PSD is distorted depending on both the order of the DFA and the frequency dependence of the PSD. We can improve the scale distortion by introducing a corrected time scale in the DFA corresponding to the inverse of the frequency scale in the PSD. In addition, our analytical approach makes it possible to characterize variants of the DFA using different types of detrending. As an application, properties of the detrending moving average algorithm are discussed.
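A minimal sketch of first-order DFA (m = 1) on a synthetic signal is given below, illustrating how the fluctuation function F(s) ~ s^α is obtained and how α relates to the spectral exponent through α = (β + 1)/2. The window grid and test signal are illustrative assumptions, not the paper's analytical derivation.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis; returns F(s) for each window size s."""
    y = np.cumsum(x - np.mean(x))               # integrated (profile) series
    F = []
    for s in scales:
        n_win = len(y) // s
        rms = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # polynomial detrend
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(2)
x = rng.normal(size=2**14)                       # white noise: expect alpha ~ 0.5
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated DFA exponent alpha = {alpha:.2f}")
```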
ERIC Educational Resources Information Center
Bontempo, Robert
1993-01-01
Describes a method for assessing the quality of translations based on item response theory (IRT). Results from the IRT technique with French and Chinese versions of a scale measuring individualism-collectivism for samples of 250 U.S., 357 French, and 290 Chinese undergraduates show how several biased items are detected. (SLD)
Hydrological predictions at a watershed scale are commonly based on extrapolation and upscaling of hydrological behavior at plot and hillslope scales. Yet, dominant hydrological drivers at a hillslope may not be as dominant at the watershed scale because of the heterogeneity of w...
Development of multiscale complexity and multifractality of fetal heart rate variability.
Gierałtowski, Jan; Hoyer, Dirk; Tetschke, Florian; Nowack, Samuel; Schneider, Uwe; Zebrowski, Jan
2013-11-01
During fetal development a complex system grows and coordination over multiple time scales is formed towards an integrated behavior of the organism. Since essential cardiovascular and associated coordination is mediated by the autonomic nervous system (ANS) and the ANS activity is reflected in recordable heart rate patterns, multiscale heart rate analysis is a tool predestined for the diagnosis of prenatal maturation. Analyses over multiple time scales require sufficiently long data sets while the recordings of fetal heart rate as well as the behavioral states studied are themselves short. Care must be taken that the analysis methods used are appropriate for short data lengths. We investigated multiscale entropy and multifractal scaling exponents from 30 minute recordings of 27 normal fetuses, aged between 23 and 38 weeks of gestational age (WGA) during the quiet state. In multiscale entropy, we found complexity lower than that of non-correlated white noise over all 20 coarse graining time scales investigated. A significant maturation-age-related complexity increase was most strongly expressed at scale 2, both using sample entropy and generalized mutual information as complexity estimates. Multiscale multifractal analysis (MMA) in which the Hurst surface h(q,s) is calculated, where q is the multifractal parameter and s is the scale, was applied to the fetal heart rate data. MMA is a method derived from detrended fluctuation analysis (DFA). We modified the base algorithm of MMA to be applicable for short time series analysis using overlapping data windows and a reduction of the scale range. We looked for such q and s for which the Hurst exponent h(q,s) is most correlated with gestational age. We used this value of the Hurst exponent to predict the gestational age based only on fetal heart rate variability properties. Comparison with the true age of the fetus gave satisfactory results (error 2.17±3.29 weeks; p<0.001; R² = 0.52). In addition, we found that the normally used DFA scale range is non-optimal for fetal age evaluation. We conclude that 30 min recordings are appropriate and sufficient for assessing fetal age by multiscale entropy and multiscale multifractal analysis. The predominant prognostic role of scale 2 heart beats for MSE and scale 39 heart beats (at q=-0.7) for MMA can be explored neither by single-scale complexity measures nor by standard detrended fluctuation analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
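The coarse-graining and sample entropy steps underlying multiscale entropy can be sketched as follows, using common parameter choices (m = 2, r = 0.15·SD) and a simulated beat-to-beat series as assumptions; this is an illustrative implementation, not the authors' modified MMA algorithm.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) of a 1-D series; r defaults to 0.15 * std."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def matching_pairs(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between all pairs of length-mm templates
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        n = len(templ)
        return (np.sum(d <= r) - n) / 2      # self-matches excluded
    B, A = matching_pairs(m), matching_pairs(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
rr = rng.normal(size=1000)                   # stand-in for a heart-rate series
mse = [sample_entropy(coarse_grain(rr, s)) for s in range(1, 11)]
print(np.round(mse, 2))                      # entropy across coarse-graining scales
```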
Home Healthcare Nurses' Job Satisfaction Scale: refinement and psychometric testing.
Ellenbecker, Carol H; Byleckie, James J
2005-10-01
This paper describes a study to further develop and test the psychometric properties of the Home Healthcare Nurses' Job Satisfaction Scale, including reliability and construct and criterion validity. Numerous scales have been developed to measure nurses' job satisfaction. Only one, the Home Healthcare Nurses' Job Satisfaction Scale, has been designed specifically to measure job satisfaction of home healthcare nurses. The Home Healthcare Nurses' Job Satisfaction Scale is based on a theoretical model that integrates the findings of empirical research related to job satisfaction. A convenience sample of 340 home healthcare nurses completed the Home Healthcare Nurses' Job Satisfaction Scale and the Mueller and McCloskey Satisfaction Scale, which was used to test criterion validity. Factor analysis was used for testing and refinement of the theory-based assignment of items to constructs. Reliability was assessed by Cronbach's alpha internal consistency reliability coefficients. The data were collected in 2003. Nine factors contributing to home healthcare nurses' job satisfaction emerged from the factor analysis and were strongly supported by the underlying theory. Factor loadings were all above 0.4. Cronbach's alpha coefficients for each of the nine subscales ranged from 0.64 to 0.83; the alpha for the global scale was 0.89. The correlation between the Home Healthcare Nurses' Job Satisfaction Scale and the Mueller and McCloskey Satisfaction Scale was 0.79, indicating good criterion-related validity. The Home Healthcare Nurses' Job Satisfaction Scale has potential as a reliable and valid scale for measurement of job satisfaction of home healthcare nurses.
Solutions to pervasive environmental problems often are not amenable to a straightforward application of science-based actions. These problems encompass large-scale environmental policy questions where environmental concerns, economic constraints, and societal values conflict ca...
Reconfigurable and responsive droplet-based compound micro-lenses.
Nagelberg, Sara; Zarzar, Lauren D; Nicolas, Natalie; Subramanian, Kaushikaram; Kalow, Julia A; Sresht, Vishnu; Blankschtein, Daniel; Barbastathis, George; Kreysing, Moritz; Swager, Timothy M; Kolle, Mathias
2017-03-07
Micro-scale optical components play a crucial role in imaging and display technology, biosensing, beam shaping, optical switching, wavefront-analysis, and device miniaturization. Herein, we demonstrate liquid compound micro-lenses with dynamically tunable focal lengths. We employ bi-phase emulsion droplets fabricated from immiscible hydrocarbon and fluorocarbon liquids to form responsive micro-lenses that can be reconfigured to focus or scatter light, form real or virtual images, and display variable focal lengths. Experimental demonstrations of dynamic refractive control are complemented by theoretical analysis and wave-optical modelling. Additionally, we provide evidence of the micro-lenses' functionality for two potential applications-integral micro-scale imaging devices and light field display technology-thereby demonstrating both the fundamental characteristics and the promising opportunities for fluid-based dynamic refractive micro-scale compound lenses.
de Oliveira, Flávia Augusta; Luna, Stelio Pacca Loureiro; do Amaral, Jackson Barros; Rodrigues, Karoline Alves; Sant'Anna, Aline Cristina; Daolio, Milena; Brondani, Juliana Tabarelli
2014-09-06
The recognition and measurement of pain in cattle are important in determining the necessity for and efficacy of analgesic intervention. The aim of this study was to record behaviour and determine the validity and reliability of an instrument to assess acute pain in 40 cattle subjected to orchiectomy after sedation with xylazine and local anaesthesia. The animals were filmed before and after orchiectomy to record behaviour. The pain scale was based on previous studies, on a pilot study and on analysis of the camera footage. Three blinded observers and a local observer assessed the edited films obtained during the preoperative and postoperative periods, before and after rescue analgesia and 24 hours after surgery. Re-evaluation was performed one month after the first analysis. Criterion validity (agreement) and item-total correlation using Spearman's coefficient were employed to refine the scale. Based on factor analysis, a unidimensional scale was adopted. The internal consistency of the data was excellent after refinement (Cronbach's α coefficient = 0.866). There was a high correlation (p < 0.001) between the proposed scale and the visual analogue, simple descriptive and numerical rating scales. The construct validity and responsiveness were confirmed by the increase and decrease in pain scores after surgery and rescue analgesia, respectively (p < 0.001). Inter- and intra-observer reliability ranged from moderate to very good. The optimal cut-off point for rescue analgesia was > 4, and analysis of the area under the curve (AUC = 0.963) showed excellent discriminatory ability. The UNESP-Botucatu unidimensional pain scale for assessing acute postoperative pain in cattle is a valid, reliable and responsive instrument with excellent internal consistency and discriminatory ability. The cut-off point for rescue analgesia provides an additional tool for guiding analgesic therapy.
Shouryabi, Ali Asghar; Ghahrisarabi, Alireza; Anboohi, Sima Zohari; Nasiri, Malihe; Rassouli, Maryam
2017-01-01
Background Nursing competence is highly related to patient outcomes and patient safety issues, especially in intensive care units. Competence assessment tools are needed specifically for intensive care nursing. Objective This study was performed to determine the psychometric properties of the Intensive and Critical Care Nursing Competence Scale version-1 among Iranian nurses. Methods The present study was a methodological study in which 289 intensive care unit nurses from nine hospitals of Shahid Beheshti University of Medical Sciences in Tehran were selected between 2015 and 2016. The original version of the scale was translated into Persian and back-translated into English, and the comments of the developer were applied. The validity of the scale was determined qualitatively (content validity and face validity) and quantitatively (confirmatory factor analysis). Reliability of the scale was reported using Cronbach's alpha coefficient and the intraclass correlation coefficient. SPSS-PC (v.21) and LISREL (v.8.5) were used to analyze the data. Results The Intensive and Critical Care Nursing Competence Scale version-1 is a self-assessment test that consists of 144 items and four domains (the knowledge base, the skill base, the attitudes and values base, and the experience base), which are divided into clinical competence and professional competence. Content and face validity were confirmed by 10 experts and 10 practitioner nurses in the intensive care units. In confirmatory factor analysis, all fit indices, except the goodness-of-fit index (0.64), confirmed the four-factor structure of the ICCN-CS-1. In the factor analysis, item factor loadings between 0.304 and 0.727 were estimated; only 4 of the 144 items loaded below 0.3, and given the high Cronbach's alpha coefficient (0.984–0.986) all items were preserved, no item was removed, and the 4 subscales of the original scale were confirmed. Conclusion The results of this study indicated that the Persian version of "The Intensive and Critical Care Nursing Competence Scale version-1" is a valid and reliable scale for the assessment of competency among Iranian nurses, and it can be used as a reliable scale in nursing management, education and research. PMID:29403620
23 CFR 940.11 - Project implementation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...
23 CFR 940.11 - Project implementation.
Code of Federal Regulations, 2014 CFR
2014-04-01
... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...
23 CFR 940.11 - Project implementation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...
23 CFR 940.11 - Project implementation.
Code of Federal Regulations, 2013 CFR
2013-04-01
... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...
23 CFR 940.11 - Project implementation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... projects funded with highway trust funds shall be based on a systems engineering analysis. (b) The analysis should be on a scale commensurate with the project scope. (c) The systems engineering analysis shall... definitions; (4) Analysis of alternative system configurations and technology options to meet requirements; (5...
NASA Astrophysics Data System (ADS)
Quiroz, M.; Cienfuegos, R.
2017-12-01
At present, the scientific community has acquired good knowledge of how tsunami energy evolves at ocean and shelf scales; the investigations of Rabinovich (2013) and Yamazaki (2011), for instance, represent important advances in this subject. In the present paper we focus instead on tsunami energy evolution, and ultimately its decay, in coastal areas, because the characteristic time scales of this process have implications for early warning, evacuation initiation, and warning cancellation. We address the analysis of tsunami energy evolution at three spatial scales: a global scale at the ocean basin level, in particular the Pacific Ocean basin; a regional scale comprising processes that occur at the continental shelf level; and a local scale comprising coastal areas or bays. These scales were selected to understand how the response associated with a tsunami develops and how the energy evolves until it is completely dissipated. Using signal processing methods, such as discrete and wavelet analyses, we analyze time series of recent tsunamigenic events recorded in the main Chilean coastal cities. From this analysis, we propose a conceptual model of the influence of geomorphological variables on the evolution and decay of tsunami energy; this model acts as a filter from the seismic source to the observed response in coastal zones. Finally, we aim to conclude with practical tools that establish patterns of behavior and scaling of energy evolution, linking seismic source variables and the geomorphological component, in order to understand the response and predict behavior for a given site.
Using Web Maps to Analyze the Construction of Global Scale Cognitive Maps
ERIC Educational Resources Information Center
Pingel, Thomas J.
2018-01-01
Game-based Web sites and applications are changing the ways in which students learn the world map. In this study, a Web map-based digital learning tool was used as a study aid for a university-level geography course in order to examine the way in which global scale cognitive maps are constructed. A network analysis revealed that clicks were…
Tracey S. Frescino; Gretchen G. Moisen
2009-01-01
The Interior-West, Forest Inventory and Analysis (FIA), Nevada Photo-Based Inventory Pilot (NPIP), launched in 2004, involved acquisition, processing, and interpretation of large scale aerial photographs on a subset of FIA plots (both forest and nonforest) throughout the state of Nevada. Two objectives of the pilot were to use the interpreted photo data to enhance...
Optimising mobility outcome measures in Huntington's disease.
Busse, Monica; Quinn, Lori; Khalil, Hanan; McEwan, Kirsten
2014-01-01
Many of the performance-based mobility measures that are currently used in Huntington's disease (HD) were developed for assessment in other neurological conditions such as stroke. We aimed to assess the individual item-response of commonly used performance-based mobility measures, with a view to optimizing the scales for specific application in Huntington's Disease (HD). Data from a larger multicentre, observational study were used. Seventy-five people with HD (11 pre-manifest & 64 manifest) were assessed on the Six-Minute Walk Test, 10-Meter Walk Test, Timed "Up & Go" Test (TUG), Berg Balance Scale (BBS), Physical Performance Test (PPT), Four Square Step Test, and Tinetti Mobility Test (TMT). The Unified Huntington's Disease Rating Scale (UHDRS) Total Motor Score, Functional Assessment Scale and Total Functional Capacity scores were recorded, alongside cognitive measures. Standard regression analysis was used to assess predictive validity. Individual item responses were investigated using a sequence of approaches to allow for gradual removal of items and the subsequent creation of shortened versions. Psychometric properties (reliability and discriminant ability) of the shortened scales were assessed. TUG (β 0.46, CI 0.20-3.47), BBS (β -0.35, CI -2.10-0.14), and TMT (β -0.45, CI -3.14-0.64) were good disease-specific mobility measures. PPT was the best measure of functional performance (β 0.42, CI 0.00-0.43 for TFC & β 0.57 CI 0.15-0.81 for FAS). Shortened versions of BBS and TMT were developed based on item analysis. The resultant BBS and TMT shortened scales were reliable for use in manifest HD. ROC analysis showed that shortened scales were able to discriminate between manifest and pre-manifest disease states. Our data suggests that the PPT is appropriate as a general measure of function in individuals with HD, and we have identified shortened versions of the BBS and TMT that measure the unique gait and balance impairments in HD. These scales, alongside the TUG, may therefore be important measures to consider in future clinical trials.
Gattinger, Heidrun; Senn, Beate; Hantikainen, Virpi; Köpke, Sascha; Ott, Stefan; Leino-Kilpi, Helena
2017-01-01
Impaired mobility is a prevalent condition among care-dependent persons living in nursing homes. Therefore, competence development of nursing staff in mobility care is important. This study aimed to develop and initially test the Kinaesthetics Competence Self-Evaluation (KCSE) scale for assessing nursing staff's competence in mobility care. The KCSE scale was developed based on an analysis of the concept of nurses' competence in kinaesthetics. Kinaesthetics is a training concept that provides theory and practice about movement foundations that comprise activities of daily living. The scale contains 28 items and four subscales (attitude, dynamic state, knowledge and skills). Content validity was assessed by determining the content validity index within two expert panels. Internal consistency and construct validity were tested within a cross-sectional study in three nursing homes in the German-speaking region of Switzerland between September and November 2015. The content validity index for the entire scale was good (0.93). Based on a sample of nursing staff (n = 180), the internal consistency results were good for the whole scale (Cronbach's alpha = 0.91) and for the subscales knowledge and skills (α = 0.91, 0.86), acceptable for the subscale attitude (α = 0.63) and weak for the subscale dynamic state (α = 0.54). Most items showed acceptable inter-item and item-total correlations. Based on the exploratory factor analysis, four factors explaining 52% of the variance were extracted. The newly developed KCSE scale is a promising instrument for measuring nursing staff's attitude, dynamic state, knowledge, and skills in mobility care based on kinaesthetics. Despite the need for further psychometric evaluation, the KCSE scale can be used in clinical practice to evaluate competence in mobility care based on kinaesthetics and to identify educational needs for nursing staff.
Scaling analysis of gas-liquid two-phase flow pattern in microgravity
NASA Technical Reports Server (NTRS)
Lee, Jinho
1993-01-01
A scaling analysis of gas-liquid two-phase flow pattern in microgravity, based on the dominant physical mechanism, was carried out with the goal of predicting the gas-liquid two-phase flow regime in a pipe under conditions of microgravity. The results demonstrated the effect of inlet geometry on the flow regime transition. A comparison of the predictions with existing experimental data showed good agreement.
Duncan-Carnesciali, Joanne; Wallace, Barbara C; Odlum, Michelle
2018-06-01
Purpose The purpose of this study was to evaluate the perceptions that certified diabetes educators (CDEs), of diverse health professions, have of a culturally appropriate e-health intervention that used avatar-based technology. Methods Cross-sectional, survey-based design using quantitative and qualitative paradigms. A logic model framed the study, which centered on the broad and general concepts leading to study outcomes. In total, 198 CDEs participated in the evaluation. Participants were mostly female and represented an age range of 26 to 76 years. The most widely represented profession in the sample was registered nurse. Study setting and data collection occurred at https://www.surveymonkey.com/r/AvatarVideoSurvey-for-Certified_Diabetes_Educators. Study instruments used were the Basic Demographics Survey (BD-13), Educational Material Use and Rating of Quality Scale (EMU-ROQ-9), Marlowe-Crowne Social Desirability Survey (MS-SOC-DES-CDE-13), Quality of Avatar Video Rating Scale (QAVRS-7), Recommend Avatar to Patients Scale (RAVTPS-3), Recommend Avatar Video to Health Professionals Scale (RAVTHP-3), and Avatar Video Applications Scale (AVAPP-1). Statistical analyses included t tests, Pearson product moment correlations, backward stepwise regression, and content/thematic analysis. Results Age and ethnicity (Arab/Middle Eastern, Asian, and white/European descent) were significant predictors of a high-quality rating of the video. Thematic and content analysis of the data revealed an overall positive perception of the video. Conclusions An e-health intervention grounded in evidence-based health behavior theories has potential to increase access to diabetes self-management education as evidenced in the ratings and perceptions of the video by CDEs.
Cross-validation of the Student Perceptions of Team-Based Learning Scale in the United States.
Lein, Donald H; Lowman, John D; Eidson, Christopher A; Yuen, Hon K
2017-01-01
The purpose of this study was to cross-validate the factor structure of the previously developed Student Perceptions of Team-Based Learning (TBL) Scale among students in an entry-level doctor of physical therapy (DPT) program in the United States. Toward the end of the semester in 2 patient/client management courses taught using TBL, 115 DPT students completed the Student Perceptions of TBL Scale, with a response rate of 87%. Principal component analysis (PCA) and confirmatory factor analysis (CFA) were conducted to replicate and confirm the underlying factor structure of the scale. Based on the PCA for the validation sample, the original 2-factor structure (preference for TBL and preference for teamwork) of the Student Perceptions of TBL Scale was replicated. The overall goodness-of-fit indices from the CFA suggested that the original 2-factor structure for the 15 items of the scale demonstrated a good model fit (comparative fit index, 0.95; non-normed fit index/Tucker-Lewis index, 0.93; root mean square error of approximation, 0.06; and standardized root mean square residual, 0.07). The 2 factors demonstrated high internal consistency (alpha = 0.83 and 0.88, respectively). DPT students taught using TBL viewed the factor of preference for teamwork more favorably than preference for TBL. Our findings provide evidence supporting the replicability of the internal structure of the Student Perceptions of TBL Scale when assessing perceptions of TBL among DPT students in patient/client management courses.
Poulsen, Ingrid; Kreiner, Svend; Engberg, Aase W
2018-02-13
The Early Functional Abilities scale assesses the restoration of brain function after brain injury, based on 4 dimensions. The primary objective of this study was to evaluate the validity, objectivity, reliability and measurement precision of the Early Functional Abilities scale by Rasch model item analysis. A secondary objective was to examine the relationship between the Early Functional Abilities scale and the Functional Independence Measurement™, in order to establish the criterion validity of the Early Functional Abilities scale and to compare the sensitivity of measurements using the 2 instruments. The Rasch analysis was based on the assessment of 408 adult patients at admission to sub-acute rehabilitation in Copenhagen, Denmark after traumatic brain injury. The Early Functional Abilities scale provides valid and objective measurement of vegetative (autonomic), facio-oral, sensorimotor and communicative/cognitive functions. Removal of one item from the sensorimotor scale confirmed unidimensionality for each of the 4 subscales, but not for the entire scale. The Early Functional Abilities subscales are sensitive to differences between patients in ranges in which the Functional Independence Measurement™ has a floor effect. The Early Functional Abilities scale assesses the early recovery of important aspects of brain function after traumatic brain injury, but is not unidimensional. We recommend removal of the "standing" item and calculation of summary subscales for the separate dimensions.
NASA Astrophysics Data System (ADS)
Zhu, Hongchun; Zhao, Yipeng; Liu, Haiying
2018-04-01
Scale is the basic attribute for expressing and describing spatial entity and phenomena. It offers theoretical significance in the study of gully structure information, variable characteristics of watershed morphology, and development evolution at different scales. This research selected five different areas in China's Loess Plateau as the experimental region and used DEM data at different scales as the experimental data. First, the change rule of the characteristic parameters of the data at different scales was analyzed. The watershed structure information did not change along with a change in the data scale. This condition was proven by selecting indices of gully bifurcation ratio and fractal dimension as characteristic parameters of watershed structure information. Then, the change rule of the characteristic parameters of gully structure with different analysis scales was analyzed by setting the scale sequence of analysis at the extraction gully. The gully structure of the watershed changed with variations in the analysis scale, and the change rule was obvious when the gully level changed. Finally, the change rule of the characteristic parameters of the gully structure at different areas was analyzed. The gully fractal dimension showed a significant numerical difference in different areas, whereas the variation of the gully branch ratio was small. The change rule indicated that the development degree of the gully obviously varied in different regions, but the morphological structure was basically similar.
Segmentation-based wavelet transform for still-image compression
NASA Astrophysics Data System (ADS)
Mozelle, Gerard; Seghier, Abdellatif; Preteux, Francoise J.
1996-10-01
In order to simultaneously address the content-based scalability functionalities required by MPEG-4, we introduce a segmentation-based wavelet transform (SBWT). SBWT takes into account both the mathematical properties of multiresolution analysis and the flexibility of region-based approaches for image compression. The associated methodology has two stages: 1) image segmentation into convex and polygonal regions; 2) 2D wavelet transform of the signal corresponding to each region. In this paper, we have mathematically studied a method for constructing a multiresolution analysis (V_j(Ω))_{j∈N} adapted to a polygonal region, which provides adaptive region-based filtering. The explicit construction of scaling functions, pre-wavelets and orthonormal wavelet bases defined on a polygon is carried out using the theory of Toeplitz operators. The corresponding expression can be interpreted as a location property which allows interior and boundary scaling functions to be defined. Concerning orthonormal wavelets and pre-wavelets, a similar expansion is obtained by taking advantage of the properties of the orthogonal projector P_{(V_j(Ω))⊥} from the space V_{j+1}(Ω) onto the space (V_j(Ω))⊥. Finally, the mathematical results provide a simple and fast algorithm adapted to polygonal regions.
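As an illustrative stand-in for the second stage, the sketch below applies a standard separable 2D wavelet decomposition to a rectangular sub-region cut from a segmented image, using PyWavelets. The image, region, and wavelet choice are assumptions; the paper's polygon-adapted boundary scaling functions are not reproduced here.

```python
import numpy as np
import pywt

# Hypothetical image and a rectangular region proposed by a segmentation step
image = np.random.default_rng(4).normal(size=(256, 256))
region = image[32:160, 48:176]                      # one segmented region (128 x 128)

# Two-level separable 2D DWT of the region with an orthonormal Daubechies basis
coeffs = pywt.wavedec2(region, wavelet="db2", level=2)
approx, details = coeffs[0], coeffs[1:]             # details ordered coarse to fine
print("approximation shape:", approx.shape)
for cH, cV, cD in details:
    print("detail shapes:", cH.shape, cV.shape, cD.shape)

# Perfect reconstruction of the region from its wavelet coefficients
recon = pywt.waverec2(coeffs, wavelet="db2")[:region.shape[0], :region.shape[1]]
print("max reconstruction error:", np.max(np.abs(recon - region)))
```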
Parameter Uncertainty Analysis Using Monte Carlo Simulations for a Regional-Scale Groundwater Model
NASA Astrophysics Data System (ADS)
Zhang, Y.; Pohlmann, K.
2016-12-01
Regional-scale grid-based groundwater models for flow and transport often contain multiple types of parameters that can intensify the challenge of parameter uncertainty analysis. We propose a Monte Carlo approach to systematically quantify the influence of various types of model parameters on groundwater flux and contaminant travel times. The Monte Carlo simulations were conducted based on the steady-state conversion of the original transient model, which was then combined with the PEST sensitivity analysis tool SENSAN and particle tracking software MODPATH. Results identified hydrogeologic units whose hydraulic conductivity can significantly affect groundwater flux, and thirteen out of 173 model parameters that can cause large variation in travel times for contaminant particles originating from given source zones.
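A minimal conceptual sketch of Monte Carlo parameter sampling for a flow-derived quantity is shown below, assuming a log-normal prior on hydraulic conductivity and a deliberately simplified Darcy travel-time calculation. The priors, gradient, and path length are hypothetical; the actual study couples a groundwater model with SENSAN and MODPATH, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
n_real = 1000                                 # number of Monte Carlo realizations

# Hypothetical priors for a single hydrogeologic unit
K = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=n_real)   # conductivity [m/s]
n_e = rng.uniform(0.05, 0.25, size=n_real)                     # effective porosity
grad = 0.001                                                   # hydraulic gradient
L = 5000.0                                                     # path length [m]

# Darcy (seepage) velocity and advective travel time for each realization
v = K * grad / n_e
travel_time_years = L / v / (3600 * 24 * 365)

print("median travel time [yr]:", np.median(travel_time_years))
print("5th-95th percentile [yr]:", np.percentile(travel_time_years, [5, 95]))
```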
Inhomogeneous scaling behaviors in Malaysian foreign currency exchange rates
NASA Astrophysics Data System (ADS)
Muniandy, S. V.; Lim, S. C.; Murugan, R.
2001-12-01
In this paper, we investigate the fractal scaling behaviors of foreign currency exchange rates with respect to the Malaysian currency, the Ringgit Malaysia. These time series are examined piecewise before and after the currency control imposed on 1 September 1998 using the monofractal model based on fractional Brownian motion. The global Hurst exponents are determined using the R/S analysis, the detrended fluctuation analysis and the method of second moment using the correlation coefficients. The limitation of these monofractal analyses is discussed. The usual multifractal analysis reveals that there exists a wide range of Hurst exponents in each of the time series. A new method of modelling the multifractal time series based on multifractional Brownian motion with time-varying Hurst exponents is studied.
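A minimal sketch of rescaled-range (R/S) estimation of a global Hurst exponent on a return series is given below, with illustrative window sizes and simulated data; this is a textbook implementation, not the authors' exact procedure.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent from the slope of log(R/S) vs log(window)."""
    rs_means = []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())    # cumulative deviation from the mean
            R = dev.max() - dev.min()            # range of the cumulative deviation
            S = seg.std(ddof=1)                  # standard deviation of the segment
            if S > 0:
                rs_vals.append(R / S)
        rs_means.append(np.mean(rs_vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(6)
returns = rng.normal(size=4000)                  # i.i.d. returns: expect H ~ 0.5
windows = [16, 32, 64, 128, 256, 512]
print(f"estimated Hurst exponent H = {hurst_rs(returns, windows):.2f}")
```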
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.; Lovejoy, Andrew E.; Thornburgh, Robert P.; Rankin, Charles
2012-01-01
NASA's Shell Buckling Knockdown Factor (SBKF) project has the goal of developing new analysis-based shell buckling design factors (knockdown factors) and design and analysis technologies for launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass-growth in these vehicles. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests is required at both the subscale and full-scale levels. This paper describes the design and analysis of three different orthogrid-stiffened metallic cylindrical-shell test articles. Two of the test articles are 8-ft-diameter, 6-ft-long test articles, and one test article is a 27.5-ft-diameter, 20-ft-long Space Shuttle External Tank-derived test article.
A confirmative clinimetric analysis of the 36-item Family Assessment Device.
Timmerby, Nina; Cosci, Fiammetta; Watson, Maggie; Csillag, Claudio; Schmitt, Florence; Steck, Barbara; Bech, Per; Thastum, Mikael
2018-02-07
The Family Assessment Device (FAD) is a 60-item questionnaire widely used to evaluate self-reported family functioning. However, the factor structure as well as the number of items has been questioned. A shorter and more user-friendly version of the original FAD-scale, the 36-item FAD, has therefore previously been proposed, based on findings in a nonclinical population of adults. We aimed in this study to evaluate the brief 36-item version of the FAD in a clinical population. Data from a European multinational study, examining factors associated with levels of family functioning in adult cancer patients' families, were used. Both healthy and ill parents completed the 60-item version FAD. The psychometric analyses conducted were Principal Component Analysis and Mokken-analysis. A total of 564 participants were included. Based on the psychometric analysis we confirmed that the 36-item version of the FAD has robust psychometric properties and can be used in clinical populations. The present analysis confirmed that the 36-item version of the FAD (18 items assessing 'well-being' and 18 items assessing 'dysfunctional' family function) is a brief scale where the summed total score is a valid measure of the dimensions of family functioning. This shorter version of the FAD is, in accordance with the concept of 'measurement-based care', an easy to use scale that could be considered when the aim is to evaluate self-reported family functioning.
López-Jáuregui, Alicia; Oliden, Paula Elosua
2009-11-01
The aim of this study is to adapt the ESPA29 scale of parental socialization styles in adolescence to the Basque language. The study of its psychometric properties is based on the search for evidence of internal and external validity. The first focuses on the assessment of the dimensionality of the scale by means of exploratory factor analysis. The relationship between the dimensions of parental socialization styles and gender and age guarantee the external validity of the scale. The study of the equivalence of the adapted and original versions is based on the comparisons of the reliability coefficients and on factor congruence. The results allow us to conclude the equivalence of the two scales.
Pryse, Yvette; McDaniel, Anna; Schafer, John
2014-08-01
Those in nursing have been charged with practicing to the full extent of their education and training by the Institute of Medicine. Therefore, evidence-based practice (EBP) has never been more important to nursing than in the current healthcare environment. Frequently the burden of EBP is the responsibility of the bedside practitioner, but EBP has been found to be a process that requires leadership and organizational support. A key underlying component of a strong EBP environment includes effective communications and collaboration among staff and nursing leadership. Developing measurement tools that examine the milieu and nursing leadership in which the staff nurse practices is an important component of understanding the factors that support or hinder EBP. The aim of this study is to report on the development and analysis of two new scales designed to explore leadership and organizational support for EBP. The EBP Nursing Leadership Scale (10 items) examines staff nurses' perception of support provided by the nurse manager for EBP, and the EBP Work Environment Scale (8 items) examines organizational support for EBP. Staff nurses who worked at least .5 FTE in direct patient care, from two inner city hospitals (n = 422), completed the scales. The scales were evaluated for internal consistency reliability with the Cronbach alpha technique, content validity using a panel of experts, and construct validity by factor analysis. The content validity index computed from expert rankings was .78 to 1.0, with an average of .96. Cronbach's alpha was .96 (n = 422) for the EBP Nursing Leadership Scale and .86 (n = 422) for the EBP Work Environment Scale. Factor analysis confirmed that each scale measured a unidimensional construct (p < .000). The EBP Nursing Leadership Scale and the EBP Work Environment Scale are psychometrically sound instruments to examine organizational influences on EBP. © 2014 Sigma Theta Tau International.
Multi-Scale Modeling of an Integrated 3D Braided Composite with Applications to Helicopter Arm
NASA Astrophysics Data System (ADS)
Zhang, Diantang; Chen, Li; Sun, Ying; Zhang, Yifan; Qian, Kun
2017-10-01
A study is conducted with the aim of developing a multi-scale analytical method for designing a composite helicopter arm with a three-dimensional (3D) five-directional braided structure. Based on the analysis of the 3D braided microstructure, a multi-scale finite element model is developed. Finite element analysis of the load capacity of the 3D five-directional braided composite helicopter arm is carried out using the software ABAQUS/Standard. The influences of the braiding angle and loading condition on the stress and strain distribution of the helicopter arm are simulated. The results show that the proposed multi-scale method is capable of accurately predicting the mechanical properties of 3D braided composites, validated by comparison of the stress-strain curves of meso-scale RVCs. Furthermore, it is found that the braiding angle is an important factor affecting the mechanical properties of the 3D five-directional braided composite helicopter arm. Based on the optimized structure parameters, the nearly net-shaped composite helicopter arm is fabricated using a novel resin transfer moulding (RTM) process.
ERIC Educational Resources Information Center
Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen
2013-01-01
The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…
Two-dimensional DFA scaling analysis applied to encrypted images
NASA Astrophysics Data System (ADS)
Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.
2015-01-01
The technique of detrended fluctuation analysis (DFA) has been widely used to unveil scaling properties of many different signals. In this paper, we determine scaling properties in the encrypted images by means of a two-dimensional DFA approach. To carry out the image encryption, we use an enhanced cryptosystem based on a rule-90 cellular automaton and we compare the results obtained with its unmodified version and the encryption system AES. The numerical results show that the encrypted images present a persistent behavior which is close to that of the 1/f-noise. These results point to the possibility that the DFA scaling exponent can be used to measure the quality of the encrypted image content.
Multi-scaling allometric analysis for urban and regional development
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2017-01-01
The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses were devoted to the single proportional relation between two elements of a geographical system. Few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studies of the spatio-temporal evolution of complex systems. By means of linear algebra, general system theory, and by analogy with the analytical hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that the multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development, and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
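The single-pair allometric relation y = a·x^b is usually estimated by least squares on log-transformed variables. The sketch below fits such a relation to hypothetical city-size data; it illustrates the basic building block only, not the paper's multiscaling matrix formulation.

```python
import numpy as np

def allometric_fit(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

# Hypothetical city data: built-up area scaling with population
rng = np.random.default_rng(7)
population = rng.lognormal(mean=12, sigma=1, size=50)
area = 2e-3 * population**0.85 * rng.lognormal(sigma=0.1, size=50)

a, b = allometric_fit(population, area)
print(f"allometric scaling exponent b = {b:.2f} (data generated with 0.85)")
```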
Effect of Gender on the Knowledge of Medicinal Plants: Systematic Review and Meta-Analysis
Torres-Avilez, Wendy; de Medeiros, Patrícia Muniz
2016-01-01
Knowledge of medicinal plants is not only one of the main components in the structure of knowledge in local medical systems but also one of the most studied resources. This study uses a systematic review and meta-analysis of a compilation of ethnobiological studies with a medicinal plant component and the variable of gender to evaluate whether there is a gender-based pattern in medicinal plant knowledge on different scales (national, continental, and global). In this study, three types of meta-analysis are conducted on different scales. We detect no significant differences on the global level; women and men have equally rich knowledge. On the national and continental levels, significant differences are observed in both directions (significant for men and for women), and a lack of significant differences in the knowledge of the genders is also observed. This finding demonstrates that there is no gender-based pattern for knowledge on different scales. PMID:27795730
Using object-based image analysis to guide the selection of field sample locations
USDA-ARS?s Scientific Manuscript database
One of the most challenging tasks for resource management and research is designing field sampling schemes to achieve unbiased estimates of ecosystem parameters as efficiently as possible. This study focused on the potential of fine-scale image objects from object-based image analysis (OBIA) to be u...
Jeremy S. Fried; Larry D. Potts; Sara M. Loreno; Glenn A. Christensen; R. Jamie Barbour
2017-01-01
The Forest Inventory and Analysis (FIA)-based BioSum (Bioregional Inventory Originated Simulation Under Management) is a free policy analysis framework and workflow management software solution. It addresses complex management questions concerning forest health and vulnerability for large, multimillion acre, multiowner landscapes using FIA plot data as the initial...
A review and empirical study of the composite scales of the Das–Naglieri cognitive assessment system
McCrea, Simon M
2009-01-01
Alexander Luria’s model of the working brain consisting of three functional units was formulated through the examination of hundreds of focal brain-injury patients. Several psychometric instruments based on Luria’s syndrome analysis and accompanying qualitative tasks have been developed since the 1970s. In the mid-1970s, JP Das and colleagues defined a specific cognitive processes model based directly on Luria’s two coding units termed simultaneous and successive by studying diverse cross-cultural, ability, and socioeconomic strata. The cognitive assessment system is based on the PASS model of cognitive processes and consists of four composite scales of Planning–Attention–Simultaneous–Successive (PASS) devised by Naglieri and Das in 1997. Das and colleagues developed the two new scales of planning and attention to more closely model Luria’s theory of higher cortical functions. In this paper a theoretical review of Luria’s theory, Das and colleagues elaboration of Luria’s model, and the neural correlates of PASS composite scales based on extant studies is summarized. A brief empirical study of the neuropsychological specificity of the PASS composite scales in a sample of 33 focal cortical stroke patients using cluster analysis is then discussed. Planning and simultaneous were sensitive to right hemisphere lesions. These findings were integrated with recent functional neuroimaging studies of PASS scales. In sum it was found that simultaneous is strongly dependent on dual bilateral occipitoparietal interhemispheric coordination whereas successive demonstrated left frontotemporal specificity with some evidence of interhemispheric coordination across the prefrontal cortex. Hence, support for the validity of the PASS composite scales was found as well as for the axiom of the independence of code content from code type originally specified in 1994 by Das, Naglieri, and Kirby. PMID:22110322
NASA Astrophysics Data System (ADS)
Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young
2017-05-01
This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter. The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on analytical solutions, parametric analysis is performed to meet the design specifications of a wave-energy converter (WEC). Then, 2-D FEA is employed to validate the analytical method. Finally, the experimental result confirms the predictions of the analytical and finite element analysis (FEA) methods under regular and irregular wave conditions.
Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter
2017-06-28
High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Sussman, M. B.; Harkonen, D. L.; Reed, J. B.
1976-01-01
Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.
An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard
Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion-gallons/year in the southeastern U.S. and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. To date, multiple challenges in protein MS data analysis remain: management of large-scale and complex data sets; MS peak identification and indexing; and high-dimensional differential peak analysis with false discovery rate (FDR) control of the concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
Multiscale analysis of restoration priorities for marine shoreline planning.
Diefenderfer, Heida L; Sobocinski, Kathryn L; Thom, Ronald M; May, Christopher W; Borde, Amy B; Southard, Susan L; Vavrinec, John; Sather, Nichole K
2009-10-01
Planners are being called on to prioritize marine shorelines for conservation status and restoration action. This study documents an approach to determining the management strategy most likely to succeed based on current conditions at local and landscape scales. The conceptual framework based in restoration ecology pairs appropriate restoration strategies with sites based on the likelihood of producing long-term resilience given the condition of ecosystem structures and processes at three scales: the shorezone unit (site), the drift cell reach (nearshore marine landscape), and the watershed (terrestrial landscape). The analysis is structured by a conceptual ecosystem model that identifies anthropogenic impacts on targeted ecosystem functions. A scoring system, weighted by geomorphic class, is applied to available spatial data for indicators of stress and function using geographic information systems. This planning tool augments other approaches to prioritizing restoration, including historical conditions and change analysis and ecosystem valuation.
An Analysis of the Connectedness to Nature Scale Based on Item Response Theory
Pasca, Laura; Aragonés, Juan I.; Coello, María T.
2017-01-01
The Connectedness to Nature Scale (CNS) is used as a measure of the subjective cognitive connection between individuals and nature. However, to date, it has not been analyzed at the item level to confirm its quality. In the present study, we conduct such an analysis based on Item Response Theory. We employed data from previous studies using the Spanish-language version of the CNS, analyzing a sample of 1008 participants. The results show that seven items presented appropriate indices of discrimination and difficulty, in addition to a good fit. The remaining six have inadequate discrimination indices and do not present a good fit. A second study with 321 participants shows that the seven-item scale has adequate levels of reliability and validity. Therefore, it would be appropriate to use a reduced version of the scale after eliminating the items that display inappropriate behavior, since they may interfere with research results on connectedness to nature. PMID:28824509
BioPig: a Hadoop-based analytic toolkit for large-scale sequence data.
Nordberg, Henrik; Bhatia, Karan; Wang, Kai; Wang, Zhong
2013-12-01
The recent revolution in sequencing technologies has led to an exponential growth of sequence data. As a result, most of the current bioinformatics tools become obsolete as they fail to scale with data. To tackle this 'data deluge', here we introduce the BioPig sequence analysis toolkit as one of the solutions that scale to data and computation. We built BioPig on the Apache's Hadoop MapReduce system and the Pig data flow language. Compared with traditional serial and MPI-based algorithms, BioPig has three major advantages: first, BioPig's programmability greatly reduces development time for parallel bioinformatics applications; second, testing BioPig with up to 500 Gb sequences demonstrates that it scales automatically with size of data; and finally, BioPig can be ported without modification on many Hadoop infrastructures, as tested with Magellan system at National Energy Research Scientific Computing Center and the Amazon Elastic Compute Cloud. In summary, BioPig represents a novel program framework with the potential to greatly accelerate data-intensive bioinformatics analysis.
Honigh-de Vlaming, Rianne; Haveman-Nies, Annemien; Bos-Oude Groeniger, Inge; Hooft van Huysduynen, Eveline J C; de Groot, Lisette C P G M; Van't Veer, Pieter
2014-01-01
To develop and evaluate the Loneliness Literacy Scale for the assessment of short-term outcomes of a loneliness prevention programme among Dutch elderly persons. Scale development was based on evidence from literature and experiences from local stakeholders and representatives of the target group. The scale was pre-tested among 303 elderly persons aged 65 years and over. Principal component analysis and internal consistency analysis were used to affirm the scale structure, reduce the number of items and assess the reliability of the constructs. Linear regression analysis was conducted to evaluate the association between the literacy constructs and loneliness. The four constructs "motivation", "self-efficacy", "perceived social support" and "subjective norm" derived from principal component analysis captured 56 % of the original variance. Cronbach's coefficient α was above 0.7 for each construct. The constructs "self-efficacy" and "perceived social support" were positively and "subjective norm" was negatively associated with loneliness. To our knowledge this is the first study developing a short-term indicator for loneliness prevention. The indicator contributes to the need of evaluating public health interventions more close to the intervention activities.
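As a hedged illustration of the reliability statistic reported for each construct, the following sketch computes Cronbach's coefficient alpha on synthetic Likert-type data; the item structure, sample, and noise level are invented for demonstration and are not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 1))                        # a single underlying trait
scores = latent + rng.normal(scale=0.8, size=(300, 5))    # five noisy items measuring it
print(round(cronbach_alpha(scores), 3))                   # typically above the 0.7 benchmark here
```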
Entropy information of heart rate variability and its power spectrum during day and night
NASA Astrophysics Data System (ADS)
Jin, Li; Jun, Wang
2013-07-01
Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and power spectral distribution. Moreover, the base-scale entropy changes reflect the corresponding changes in the autonomic nerve outflow. With the suppression of the vagal tone and dominance of the sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the base-scale entropy is higher in CHF subjects. With the decrease of the sympathetic tone and respiratory sinus arrhythmia (RSA) becoming more pronounced with slower breathing during sleep, the base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups have the same diurnal/nocturnal trend as the CHF series. The trend of the fluctuation dynamics of the data in the three groups can be described as an “HF effect”.
Position Analysis Questionnaire (PAQ). This job analysis instrument consists of 187 job elements organized into six divisions. In the analysis of a job...with the PAQ, the relevance of the individual elements to the job is rated using any of several rating scales, such as importance or time.
Xia, Hao-Zhi; Gao, Lei; Wang, Yang; Song, Hui; Shi, Bao-Xin
2017-11-01
To develop a Chinese version of the Meaning in Life Scale for cancer patients and to test its validity and reliability. Meaning in life is a protective factor of psychological well-being and is negatively related to depression and demoralisation among cancer patients. The existing scales measuring meaning in life are mostly designed in English, and there is no scale designed for Chinese cancer patients based on the Chinese cultural background. A process of instrument development and psychometric evaluation was used. Items were generated from a literature review and a focus group interview. The Delphi technique was used to test the content validity. Item analysis and exploratory factor analysis were performed with data from 251 cancer patients. The internal consistency of the scale was tested by Cronbach's alpha. A 25-item Chinese version of the Meaning in Life Scale with five domains was developed. The five factors explained 62.686% of the variance. The Cronbach's alpha for the total scale was 0.897. The Chinese version of the Meaning in Life Scale has acceptable internal consistency reliability, good content validity and acceptable construct validity. The content of the scale reflects the attitudes of cancer patients towards meaning in life based on the Chinese cultural background. The Chinese version of the Meaning in Life Scale for Cancer Patients appears to be a new scale to assess meaning in life among Chinese cancer patients accurately, and the concept of meaning in life presented in this scale provides new ideas for meaning intervention in routine clinical practice. © 2016 John Wiley & Sons Ltd.
Validation of the Epworth Sleepiness Scale for Children and Adolescents using Rasch analysis.
Janssen, Kitty C; Phillipson, Sivanes; O'Connor, Justen; Johns, Murray W
2017-05-01
A validated measure of daytime sleepiness for adolescents is needed to better explore emerging relationships between sleepiness and the mental and physical health of adolescents. The Epworth Sleepiness Scale (ESS) is a widely used scale for daytime sleepiness in adults but contains references to alcohol and driving. The Epworth Sleepiness Scale for Children and Adolescents (ESS-CHAD) has been proposed as the official modified version of the ESS for children and adolescents. This study describes the psychometric analysis of the ESS-CHAD as a measure of daytime sleepiness for adolescents. The ESS-CHAD was completed by 297 adolescents, 12-18 years old, from two independent schools in Victoria, Australia. Exploratory factor analysis and Rasch analysis were conducted to determine the validity of the scale. Exploratory factor analysis and Rasch analysis indicated that the ESS-CHAD has internal validity and a unidimensional structure with good model fit. Rasch analysis of four subgroups based on gender and year-level were consistent with the overall results. The results were consistent with published ESS results, which strongly indicates that the changes to the scale do not affect the scale's capacity to measure daytime sleepiness. It is concluded that the ESS-CHAD is a reliable and internally valid measure of daytime sleepiness in adolescents 12-18 years old. Further studies are needed to establish the internal validity of the ESS-CHAD for children under 12 years, and to establish external validity and accurate cut-off points for children and adolescents. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lin, Aijing; Shang, Pengjian
2016-04-01
Considering the diverse applications of multifractal techniques in the natural sciences, this work underscores the versatility of the multiscale multifractal detrended fluctuation analysis (MMA) method for investigating artificial and real-world data sets. A modified MMA method based on the cumulative distribution function (CDF) is proposed with the objective of quantifying the scaling exponent and multifractality of nonstationary time series. It is demonstrated that our approach can provide a more stable and faithful description of multifractal properties over a comprehensive range of scales, rather than at a fixed window length and slide length. Our analyses based on the CDF-MMA method reveal significant differences in the multifractal characteristics of the temporal dynamics between US and Chinese stock markets, suggesting that these two stock markets might be regulated by very different mechanisms. The CDF-MMA method is important for evidencing the stable and fine structure of multiscale and multifractal scaling behaviors and can be useful to deepen and broaden our understanding of scaling exponents and multifractal characteristics.
Modified multidimensional scaling approach to analyze financial markets.
Yin, Yi; Shang, Pengjian
2014-06-01
Detrended cross-correlation coefficient (σDCCA) and dynamic time warping (DTW) are introduced as dissimilarity measures, while multidimensional scaling (MDS) is employed to translate the dissimilarities between the daily price returns of 24 stock markets. We first propose MDS based on σDCCA dissimilarity and MDS based on DTW dissimilarity, while MDS based on Euclidean dissimilarity is also employed to provide a reference for comparison. We apply these methods in order to further visualize the clustering between stock markets. Moreover, we confront MDS with an alternative visualization method, the "Unweighed Average" clustering method, applied to the same dissimilarity, for comparison. Through the results, we find that MDS gives a more intuitive mapping for observing stable or emerging clusters of stock markets with similar behavior, while the MDS analysis based on σDCCA dissimilarity provides clearer, more detailed, and more accurate information on the classification of the stock markets than the MDS analysis based on Euclidean dissimilarity. The MDS analysis based on DTW dissimilarity yields particularly interesting additional knowledge about the correlations between stock markets. It also gives richer results on the clustering of stock markets than the MDS analysis based on Euclidean dissimilarity. In addition, the graphs obtained from applying the MDS methods based on σDCCA dissimilarity and DTW dissimilarity may also guide the construction of multivariate econometric models.
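A minimal sketch of the general workflow, using scikit-learn's MDS with a precomputed dissimilarity matrix, is shown below; the correlation-based distance merely stands in for the σDCCA or DTW dissimilarities of the paper, and the synthetic returns are an assumption.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
returns = rng.normal(size=(24, 500))          # 24 markets, 500 daily returns (synthetic)
corr = np.corrcoef(returns)
dissim = np.sqrt(2.0 * (1.0 - corr))          # correlation distance: symmetric, zero diagonal

# embed the markets in 2-D directly from the precomputed dissimilarities
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)            # one 2-D point per market, for visual clustering
print(coords.shape)                           # (24, 2)
```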
Low-carbon building assessment and multi-scale input-output analysis
NASA Astrophysics Data System (ADS)
Chen, G. Q.; Chen, H.; Chen, Z. M.; Zhang, Bo; Shao, L.; Guo, S.; Zhou, S. Y.; Jiang, M. M.
2011-01-01
This paper presents a low-carbon building evaluation framework comprising detailed carbon emission accounting procedures for the building life cycle in nine stages: building construction, fitment, outdoor facility construction, transportation, operation, waste treatment, property management, demolition, and disposal. The framework is supported by integrated carbon intensity databases based on multi-scale input-output analysis, which are essential for low-carbon planning, procurement and supply chain design, and logistics management.
ERIC Educational Resources Information Center
Ling, Guangming; Rijmen, Frank
2011-01-01
The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…
NASA Astrophysics Data System (ADS)
Dronova, I.; Gong, P.; Wang, L.; Clinton, N.; Fu, W.; Qi, S.
2011-12-01
Remote sensing-based vegetation classifications representing plant function such as photosynthesis and productivity are challenging in wetlands with complex cover and difficult field access. Recent advances in object-based image analysis (OBIA) and machine-learning algorithms offer new classification tools; however, few comparisons of different algorithms and spatial scales have been discussed to date. We applied OBIA to delineate wetland plant functional types (PFTs) for Poyang Lake, the largest freshwater lake in China and a Ramsar wetland conservation site, from a 30-m Landsat TM scene at the peak of the spring growing season. We targeted major PFTs (C3 grasses, C3 forbs and different types of C4 grasses and aquatic vegetation) that are both key players in the system's biogeochemical cycles and critical providers of waterbird habitat. Classification results were compared among: a) several object segmentation scales (with average object sizes 900-9000 m²); b) several families of statistical classifiers (including Bayesian, Logistic, Neural Network, Decision Trees and Support Vector Machines) and c) two hierarchical levels of vegetation classification, a generalized 3-class set and a more detailed 6-class set. We found that classification benefited from the object-based approach, which allowed including object shape, texture and context descriptors in classification. While a number of classifiers achieved high accuracy at the finest pixel-equivalent segmentation scale, the highest accuracies and best agreement among algorithms occurred at coarser object scales. No single classifier was consistently superior across all scales, although selected algorithms of the Neural Network, Logistic and K-Nearest Neighbors families frequently provided the best discrimination of classes at different scales. The choice of vegetation categories also affected classification accuracy. The 6-class set allowed for higher individual class accuracies but lower overall accuracies than the 3-class set because individual classes differed in the scales at which they were best discriminated from others. Main classification challenges included a) the presence of C3 grasses in C4-grass areas, particularly following harvesting of C4 reeds and b) mixtures of emergent, floating and submerged aquatic plants at sub-object and sub-pixel scales. We conclude that OBIA with advanced statistical classifiers offers useful instruments for landscape vegetation analyses, and that spatial scale considerations are critical in mapping PFTs, while multi-scale comparisons can be used to guide class selection. Future work will further apply fuzzy classification and field-collected spectral data for PFT analysis and compare results with MODIS PFT products.
Bond, Gary R; Drake, Robert E; Rapp, Charles A; McHugo, Gregory J; Xie, Haiyi
2009-09-01
Fidelity scales have been widely used to assess program adherence to the principles of an evidence-based practice, but they do not measure important aspects of quality of care. Pragmatic scales measuring clinical quality of services are needed to complement fidelity scales measuring structural aspects of program implementation. As part of the instrumentation developed for the National Implementing Evidence-Based Practices Project, we piloted a new instrument with two 5-item quality scales, Individualization (a client-level quality scale) and Quality Improvement (an organizational-level quality scale). Pairs of independent fidelity assessors conducted fidelity reviews in 49 sites in 8 states at baseline and at four subsequent 6-month intervals over a 2-year follow-up period. The assessors followed a standardized protocol to administer these quality scales during daylong site visits; during these same visits they assessed programs on fidelity to the evidence-based practice that the site was seeking to implement. Assessors achieved acceptable interrater reliability for both Individualization and Quality Improvement. Principal components factor analysis confirmed the 2-scale structure. The two scales were modestly correlated with each other and with the evidence-based practice fidelity scales. Over the first year, Individualization and Quality Improvement improved, but showed little or no improvement during the last year of follow-up. The two newly developed scales showed adequate psychometric properties in this preliminary study, but further research is needed to assess their validity and utility in routine clinical practice.
Experimental design and quantitative analysis of microbial community multiomics.
Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis
2017-11-30
Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.
Jia, Yongliang; Zhang, Shikai; Huang, Fangyi; Leung, Siu-wai
2012-06-01
Ginseng-based medicines and nitrates are commonly used in treating ischemic heart disease (IHD) angina pectoris in China. Hundreds of randomized controlled trials (RCTs) reported in Chinese language claimed that ginseng-based medicines can relieve the symptoms of IHD. This study provides the first PRISMA-compliant systematic review with sensitivity and subgroup analyses to evaluate the RCTs comparing the efficacies of ginseng-based medicines and nitrates in treating ischemic heart disease, particularly angina pectoris. Past RCTs published up to 2010 on ginseng versus nitrates in treating IHD for 14 or more days were retrieved from major English and Chinese databases, including PubMed, Science Direct, Cochrane Library, WangFang Data, and Chinese National Knowledge Infrastructure. The qualities of included RCTs were assessed with the Jadad scale, a refined Jadad scale called the M scale, the CONSORT 2010 checklist, and the Cochrane risk of bias tool. Meta-analysis was performed on the primary outcomes, including the improvement of symptoms and electrocardiography (ECG). Subgroup analysis, sensitivity analysis, and meta-regression were performed to evaluate the effects of study characteristics of RCTs, including quality, follow-up periods, and efficacy definitions, on the overall effect size of ginseng. Eighteen RCTs with 1549 participants were included. Overall odds ratios for comparing ginseng-based medicines with nitrates were 3.00 (95% CI: 2.27-3.96) in symptom improvement (n=18) and 1.61 (95% CI: 1.20-2.15) in ECG improvement (n=10). Subgroup analysis, sensitivity analysis, and meta-regression found no significant difference in overall effects among all study characteristics, indicating that the overall effects were stable. The meta-analysis of 18 eligible RCTs demonstrates moderate evidence that ginseng is more effective than nitrates for treating angina pectoris. However, further RCTs of higher quality, with longer follow-up periods, larger sample sizes, and multi-center/multi-country designs are still required to verify the efficacy. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Trend Switching Processes in Financial Markets
NASA Astrophysics Data System (ADS)
Preis, Tobias; Stanley, H. Eugene
For an intriguing variety of switching processes in nature, the underlying complex system abruptly changes at a specific point from one state to another in a highly discontinuous fashion. Financial market fluctuations are characterized by many abrupt switchings creating increasing trends ("bubble formation") and decreasing trends ("bubble collapse"), on time scales ranging from macroscopic bubbles persisting for hundreds of days to microscopic bubbles persisting only for very short time scales. Our analysis is based on a German DAX Future database containing 13,991,275 transactions recorded with a time resolution of 10^-2 s. For a parallel analysis, we use a database of all S&P500 stocks providing 2,592,531 daily closing prices. We ask whether these ubiquitous switching processes have quantifiable features independent of the time horizon studied. We find striking scale-free behavior of the volatility after each switching occurs. We interpret our findings as being consistent with time-dependent collective behavior of financial market participants. We test the possible universality of our result by performing a parallel analysis of fluctuations in transaction volume and time intervals between trades. We show that these financial market switching processes have features similar to those present in phase transitions. We find that the well-known catastrophic bubbles that occur on large time scales - such as the most recent financial crisis - are no outliers but in fact single dramatic representatives caused by the formation of upward and downward trends on time scales varying over nine orders of magnitude from the very large down to the very small.
Jones, Alvin; Ingram, M Victoria
2011-10-01
Using a relatively new statistical paradigm, Optimal Data Analysis (ODA; Yarnold & Soltysik, 2005), this research demonstrated that newly developed scales for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and MMPI-2 Restructured Form (MMPI-2-RF) specifically designed to assess over-reporting of cognitive and/or somatic symptoms were more effective than the MMPI-2 F-family of scales in predicting effort status on tests of cognitive functioning in a sample of 288 military members. ODA demonstrated that when all scales were performing at their theoretical maximum possible level of classification accuracy, the Henry Heilbronner Index (HHI), Response Bias Scale (RBS), Fake Bad Scale (FBS), and the Symptom Validity Scale (FBS-r) outperformed the F-family of scales on a variety of ODA indexes of classification accuracy, including an omnibus measure (effect strength total, EST) of the descriptive and prognostic utility of ODA models developed for each scale. Based on the guidelines suggested by Yarnold and Soltysik for evaluating effect strengths for ODA models, the newly developed scales had effect strengths that were moderate (37.66 to 45.68), whereas the F-family scales had effect strengths that ranged from weak to moderate (15.42 to 32.80). In addition, traditional analysis demonstrated that HHI, RBS, FBS, and FBS-R had large effect sizes (0.98 to 1.16) based on Cohen's (1988) suggested categorization of effect size when comparing mean scores for adequate versus inadequate effort groups, whereas the F-family of scales had small to medium effect sizes (0.25 to 0.76). The MMPI-2-RF Infrequent Somatic Responses Scale (F(S)) tended to perform in a fashion similar to F, the best performing F-family scale.
Inverse finite-size scaling for high-dimensional significance analysis
NASA Astrophysics Data System (ADS)
Xu, Yingying; Puranen, Santeri; Corander, Jukka; Kabashima, Yoshiyuki
2018-06-01
We propose an efficient procedure for significance determination in high-dimensional dependence learning based on surrogate data testing, termed inverse finite-size scaling (IFSS). The IFSS method is based on our discovery of a universal scaling property of random matrices which enables inference about signal behavior from much smaller scale surrogate data than the dimensionality of the original data. As a motivating example, we demonstrate the procedure for ultra-high-dimensional Potts models with on the order of 10^10 parameters. IFSS reduces the computational effort of the data-testing procedure by several orders of magnitude, making it very efficient for practical purposes. This approach thus holds considerable potential for generalization to other types of complex models.
A case study for cloud based high throughput analysis of NGS data using the globus genomics system
Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha
2014-01-01
Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement to microarrays in transcriptome profiling and differential gene expression study. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost effective, and open-source based tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of box to process Illumina RNA-Seq datasets. PMID:25937948
Reliability and validity of the Outcome Expectations for Exercise Scale-2.
Resnick, Barbara
2005-10-01
Development of a reliable and valid measure of outcome expectations for exercise for older adults will help establish the relationship between outcome expectations and exercise and facilitate the development of interventions to increase physical activity in older adults. The purpose of this study was to test the reliability and validity of the Outcome Expectations for Exercise-2 Scale (OEE-2), a 13-item measure with two subscales: positive OEE (POEE) and negative OEE (NOEE). The OEE-2 scale was given to 161 residents in a continuing-care retirement community. There was some evidence of validity based on confirmatory factor analysis, Rasch-analysis INFIT and OUTFIT statistics, and convergent validity and test criterion relationships. There was some evidence for reliability of the OEE-2 based on alpha coefficients, person- and item-separation reliability indexes, and R(2) values. Based on analyses, suggested revisions are provided for future use of the OEE-2. Although ongoing reliability and validity testing are needed, the OEE-2 scale can be used to identify older adults with low outcome expectations for exercise, and interventions can then be implemented to strengthen these expectations and improve exercise behavior.
Agent Based Modeling: Fine-Scale Spatio-Temporal Analysis of Pertussis
NASA Astrophysics Data System (ADS)
Mills, D. A.
2017-10-01
In epidemiology, spatial and temporal variables are used to compute vaccination efficacy and effectiveness. The chosen resolution and scale of a spatial or spatio-temporal analysis will affect the results. When calculating vaccination efficacy, for example, a simple environment that offers various ideal outcomes is often modeled using coarse scale data aggregated on an annual basis. In contrast to the inadequacy of this aggregated method, this research uses agent based modeling of fine-scale neighborhood data centered around the interactions of infants in daycare and their families to demonstrate an accurate reflection of vaccination capabilities. Recent studies suggest that although the acellular pertussis vaccine prevents major symptoms, it does not prevent the colonization and transmission of Bordetella pertussis bacteria. After vaccination, a treated individual becomes a potential asymptomatic carrier of the pertussis bacteria, rather than an immune individual. Agent based modeling enables the measurable depiction of asymptomatic carriers that are otherwise unaccounted for when calculating vaccination efficacy and effectiveness. Using empirical data from a Florida pertussis outbreak case study, the results of this model demonstrate that asymptomatic carriers bias the calculated vaccination efficacy and reveal a need for reconsidering current methods that are widely used for calculating vaccination efficacy and effectiveness.
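To make the carrier mechanism concrete, the toy agent-based sketch below lets vaccinated contacts become asymptomatic carriers rather than immune individuals; all parameters, population sizes, and mixing assumptions are illustrative and are not taken from the Florida case study or the paper's model.

```python
import random

random.seed(42)

class Agent:
    def __init__(self, vaccinated):
        self.vaccinated = vaccinated
        self.state = "S"          # S: susceptible, I: symptomatic, C: asymptomatic carrier

def step(agents, beta=0.05, carrier_factor=0.5):
    infectious = [a for a in agents if a.state in ("I", "C")]
    for a in agents:
        if a.state != "S":
            continue
        for src in infectious:
            # carriers are assumed less transmissible than symptomatic cases
            p = beta * (carrier_factor if src.state == "C" else 1.0)
            if random.random() < p:
                # vaccinated contacts become carriers, unvaccinated become symptomatic
                a.state = "C" if a.vaccinated else "I"
                break

agents = [Agent(vaccinated=random.random() < 0.9) for _ in range(200)]
agents[0].state = "I"             # index case
for _ in range(30):
    step(agents)
counts = {s: sum(a.state == s for a in agents) for s in "SIC"}
print(counts)                     # carriers ('C') are invisible to symptom-based efficacy estimates
```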
ERIC Educational Resources Information Center
Ebesutani, Chad; Reise, Steven P.; Chorpita, Bruce F.; Ale, Chelsea; Regan, Jennifer; Young, John; Higa-McMillan, Charmaine; Weisz, John R.
2012-01-01
Using a school-based (N = 1,060) and clinic-referred (N = 303) youth sample, the authors developed a 25-item shortened version of the Revised Child Anxiety and Depression Scale (RCADS) using Schmid-Leiman exploratory bifactor analysis to reduce client burden and administration time and thus improve the transportability characteristics of this…
ERIC Educational Resources Information Center
Bacanli, Hasan; Surucu, Mustafa; Ilhan, Tahsin
2013-01-01
The aim of the current study was to develop a short form of the Coping Styles Scale based on the COPE Inventory. A total of 275 undergraduate students (114 female and 74 male) were administered the scale in the first study. In order to test the factor structure of the Coping Styles Scale Brief Form, principal components factor analysis with direct oblique rotation was…
Goff, J.A.; Holliger, K.
1999-01-01
The main borehole of the German Continental Deep Drilling Program (KTB) extends over 9000 m into a crystalline upper crust consisting primarily of interlayered gneiss and metabasite. We present a joint analysis of the velocity and lithology logs in an effort to extract the lithology component of the velocity log. Covariance analysis of the lithology log, approximated as a binary series, indicates that it may originate from the superposition of two Brownian stochastic processes (fractal dimension 1.5) with characteristic scales of ~2800 m and ~150 m, respectively. Covariance analysis of the velocity fluctuations provides evidence for the superposition of four stochastic processes with distinct characteristic scales. The largest two scales are identical to those derived from the lithology, confirming that these scales of velocity heterogeneity are caused by lithology variations. The third characteristic scale, ~20 m, also a Brownian process, is probably related to fracturing, based on correlation with the resistivity log. The superposition of these three Brownian processes closely mimics the commonly observed 1/k decay (fractal dimension 2.0) of the velocity power spectrum. The smallest-scale process (characteristic scale ~1.7 m) requires a low fractal dimension, ~1.0, and accounts for ~60% of the total rms velocity variation. A comparison of successive logs from 6900-7140 m depth indicates that such variations are not repeatable and thus probably do not represent true velocity variations in the crust. The results of this study resolve the disparity between the differing published estimates of seismic heterogeneity based on the KTB sonic logs, and bridge the gap between estimates of crustal heterogeneity from geologic maps and borehole logs. Copyright 1999 by the American Geophysical Union.
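A hedged sketch of one common way to estimate such fractal behavior, via the spectral slope of a 1-D log under a fractional-Brownian-motion assumption, is given below; the data are synthetic, and the D = (5 − β)/2 relation is an assumption of that model rather than the authors' covariance-based method.

```python
import numpy as np

def spectral_fractal_dimension(signal, dz=1.0):
    """Estimate the fBm-type fractal dimension of a 1-D log from its spectral slope."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    k = np.fft.rfftfreq(n, d=dz)
    mask = k > 0
    # fit P(k) ~ k^(-beta) on a log-log scale
    beta = -np.polyfit(np.log(k[mask]), np.log(spec[mask]), 1)[0]
    return (5.0 - beta) / 2.0          # fBm profile assumption: D = (5 - beta)/2

rng = np.random.default_rng(0)
brownian = np.cumsum(rng.normal(size=4096))        # Brownian profile: beta ~ 2, so D ~ 1.5
print(round(spectral_fractal_dimension(brownian), 2))
```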
EVALUATION AND ANALYSIS OF MICROSCALE FLOW AND TRANSPORT DURING REMEDIATION
The design of in-situ remediation is currently based on a description at the macroscopic scale. Phenomena at the pore and pore-network scales are typically lumped in terms of averaged quantities, using empirical or ad hoc expressions. These models cannot address fundamental rem...
Duque-Ramos, Astrid; Quesada-Martínez, Manuel; Iniesta-Moreno, Miguela; Fernández-Breis, Jesualdo Tomás; Stevens, Robert
2016-10-17
The biomedical community has now developed a significant number of ontologies. The curation of biomedical ontologies is a complex task and biomedical ontologies evolve rapidly, so new versions are regularly and frequently published in ontology repositories. This has the implication of there being a high number of ontology versions over a short time span. Given this level of activity, ontology designers need to be supported in the effective management of the evolution of biomedical ontologies, as the different changes may affect the engineering and quality of the ontology. This is why there is a need for methods that contribute to the analysis of the effects of changes and evolution of ontologies. In this paper we approach this issue from the ontology quality perspective. In previous work we have developed an ontology evaluation framework based on quantitative metrics, called OQuaRE. Here, OQuaRE is used as a core component in a method that enables the analysis of the different versions of biomedical ontologies using the quality dimensions included in OQuaRE. Moreover, we describe and use two scales for evaluating the changes between the versions of a given ontology. The first one is the static scale used in OQuaRE and the second one is a new, dynamic scale, based on the observed values of the quality metrics of a corpus defined by all the versions of a given ontology (life-cycle). In this work we explain how OQuaRE can be adapted for understanding the evolution of ontologies. Its use has been illustrated with the ontology of bioinformatics operations, types of data, formats, and topics (EDAM). The two scales included in OQuaRE provide complementary information about the evolution of the ontologies. The application of the static scale, which is the original OQuaRE scale, to the versions of the EDAM ontology reveals a design based on good ontological engineering principles. The application of the dynamic scale has enabled a more detailed analysis of the evolution of the ontology, measured through differences between versions. The statistics of change based on the OQuaRE quality scores make it possible to identify key versions where some changes in the engineering of the ontology triggered a change from the OQuaRE quality perspective. In the case of EDAM, this study allowed us to identify that the fifth version of the ontology has the largest impact on the quality metrics of the ontology when comparative analyses between pairs of consecutive versions are performed.
Multi-Scale Modeling of Liquid Phase Sintering Affected by Gravity: Preliminary Analysis
NASA Technical Reports Server (NTRS)
Olevsky, Eugene; German, Randall M.
2012-01-01
A multi-scale simulation concept taking into account the impact of gravity on liquid phase sintering is described. The gravity influence can be included at both the micro- and macro-scales. At the micro-scale, the diffusion mass-transport is directionally modified in the framework of kinetic Monte-Carlo simulations to include the impact of gravity. The micro-scale simulations can provide the values of the constitutive parameters for macroscopic sintering simulations. At the macro-scale, we are attempting to embed a continuum model of sintering into a finite-element framework that includes the gravity forces and substrate friction. If successful, the finite element analysis will enable predictions relevant to space-based processing, including size, shape, and property predictions. Model experiments are underway to support the models via extraction of viscosity moduli versus composition, particle size, heating rate, temperature and time.
Sybil--efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
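For orientation, flux-balance analysis reduces to a linear program; the sketch below solves a made-up three-reaction toy network with SciPy rather than sybil's actual R interface, so the network, bounds, and objective are purely illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 (uptake of A), R2 (A -> B), R3 (B -> biomass, the objective).
S = np.array([
    [ 1, -1,  0],   # metabolite A balance
    [ 0,  1, -1],   # metabolite B balance
])
bounds = [(0, 10), (0, 1000), (0, 1000)]     # flux bounds for R1..R3
c = np.array([0, 0, -1])                      # maximize v3  ->  minimize -v3

# steady-state constraint S v = 0 with the flux bounds above
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)      # limited here by the uptake bound (10)
```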
An Analysis of Model Scale Data Transformation to Full Scale Flight Using Chevron Nozzles
NASA Technical Reports Server (NTRS)
Brown, Clifford; Bridges, James
2003-01-01
Ground-based model scale aeroacoustic data is frequently used to predict the results of flight tests while saving time and money. The value of a model scale test is therefore dependent on how well the data can be transformed to the full scale conditions. In the spring of 2000, a model scale test was conducted to prove the value of chevron nozzles as a noise reduction device for turbojet applications. The chevron nozzle reduced noise by 2 EPNdB at an engine pressure ratio of 2.3 compared to that of the standard conic nozzle. This result led to a full scale flyover test in the spring of 2001 to verify these results. The flyover test confirmed the 2 EPNdB reduction predicted by the model scale test one year earlier. However, further analysis of the data revealed that the spectra and directivity, both on an OASPL and PNL basis, do not agree in either shape or absolute level. This paper explores these differences in an effort to improve the data transformation from model scale to full scale.
Manufacturing Analysis | Energy Analysis | NREL
Derivation of Zagarola-Smits scaling in zero-pressure-gradient turbulent boundary layers
NASA Astrophysics Data System (ADS)
Wei, Tie; Maciel, Yvan
2018-01-01
This Rapid Communication derives the Zagarola-Smits scaling directly from the governing equations for zero-pressure-gradient turbulent boundary layers (ZPG TBLs). It has long been observed that the scaling of the mean streamwise velocity in turbulent boundary layer flows differs in the near-surface region and in the outer layer. In the inner region of small-velocity-defect boundary layers, it is generally accepted that the proper velocity scale is the friction velocity, uτ, and the proper length scale is the viscous length scale, ν/uτ. In the outer region, the most generally used length scale is the boundary layer thickness, δ. However, there is no consensus on velocity scales in the outer layer. Zagarola and Smits [ASME Paper No. FEDSM98-4950 (1998)] proposed a velocity scale, UZS = (δ1/δ)U∞, where δ1 is the displacement thickness and U∞ is the freestream velocity. However, there are some concerns about Zagarola-Smits scaling due to the lack of a theoretical base. In this paper, the Zagarola-Smits scaling is derived directly from a combination of integral, similarity, and order-of-magnitude analysis of the mean continuity equation. The analysis also reveals that V∞, the mean wall-normal velocity at the edge of the boundary layer, is a proper scale for the mean wall-normal velocity V. Extending the analysis to the streamwise mean momentum equation, we find that the Reynolds shear stress in ZPG TBLs scales as U∞V∞ in the outer region. This paper also provides a detailed analysis of the mass and mean momentum balance in the outer region of ZPG TBLs.
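Restating the quantities named in the abstract in compact notation (a sketch of the notation only; the defect-law form of the outer scaling is a standard assumption added here for context, not quoted from the paper):

```latex
\[
  U_{ZS} = \frac{\delta_1}{\delta}\,U_\infty ,
  \qquad
  \delta_1 = \int_0^{\delta} \Bigl(1 - \frac{U}{U_\infty}\Bigr)\,dy ,
\]
\[
  \frac{U_\infty - U}{U_{ZS}} = f\!\Bigl(\frac{y}{\delta}\Bigr),
  \qquad
  \frac{V}{V_\infty} = g\!\Bigl(\frac{y}{\delta}\Bigr),
  \qquad
  -\overline{u'v'} \sim U_\infty V_\infty \quad\text{(outer region)} .
\]
```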
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamachi, Eiji; Yoshida, Takashi; Yamaguchi, Toshihiko
2014-10-06
We developed a two-scale FE analysis procedure based on the crystallographic homogenization method, considering the hierarchical structure of polycrystalline aluminium alloy. It can be characterized as the combination of two structural scales: the microscopic polycrystal structure and the macroscopic elastic-plastic continuum. The micro polycrystal structure is modeled as a three-dimensional representative volume element (RVE), built from 3×3×3 eight-node solid finite elements with 216 crystal orientations. This FE analysis code can predict the deformation, strain, and stress evolutions in the wire drawing process at the macro-scale, and further the crystal texture and hardening evolutions at the micro-scale. In this study, we analyzed the texture evolution in the wire drawing process with our two-scale FE analysis code under various die drawing angles. We evaluated the texture evolution in the surface and center regions of the wire cross section to clarify the effects of processing conditions on texture evolution.
NASA Astrophysics Data System (ADS)
Nakamachi, Eiji; Yoshida, Takashi; Kuramae, Hiroyuki; Morimoto, Hideo; Yamaguchi, Toshihiko; Morita, Yusuke
2014-10-01
We developed a two-scale FE analysis procedure based on the crystallographic homogenization method, considering the hierarchical structure of polycrystalline aluminium alloy. It can be characterized as the combination of two structural scales: the microscopic polycrystal structure and the macroscopic elastic-plastic continuum. The micro polycrystal structure is modeled as a three-dimensional representative volume element (RVE), built from 3×3×3 eight-node solid finite elements with 216 crystal orientations. This FE analysis code can predict the deformation, strain, and stress evolutions in the wire drawing process at the macro-scale, and further the crystal texture and hardening evolutions at the micro-scale. In this study, we analyzed the texture evolution in the wire drawing process with our two-scale FE analysis code under various die drawing angles. We evaluated the texture evolution in the surface and center regions of the wire cross section to clarify the effects of processing conditions on texture evolution.
Spectral saliency via automatic adaptive amplitude spectrum analysis
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan
2016-03-01
Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
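The core operation the abstract builds on, smoothing the amplitude spectrum at a chosen scale and reconstructing with the original phase, can be sketched as follows. This is a generic illustration of amplitude-spectrum-smoothing saliency, not the authors' adaptive scale selection or weighted map combination, and the kernel widths are free parameters rather than the automatically chosen optimal scale.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(image, sigma_spec=3.0, sigma_map=2.0):
    """Saliency from an amplitude spectrum smoothed at a single scale (illustrative)."""
    f = np.fft.fft2(image.astype(float))
    amplitude, phase = np.abs(f), np.angle(f)

    # Suppress non-salient (repetitive) patterns by smoothing the amplitude spectrum.
    smoothed = gaussian_filter(amplitude, sigma=sigma_spec)

    # Reconstruct with the smoothed amplitude and the original phase.
    saliency = np.abs(np.fft.ifft2(smoothed * np.exp(1j * phase))) ** 2

    # Post-smooth and normalise the saliency map to [0, 1].
    saliency = gaussian_filter(saliency, sigma=sigma_map)
    return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)
```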
Detection of crossover time scales in multifractal detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Ge, Erjia; Leung, Yee
2013-04-01
Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method for such analysis is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe multi-scaling behaviors of fractals. Through the regression analysis and statistical inference, we can (1) identify the crossover time scales that cannot be detected by eyeballing observation, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish the statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can have a deeper understanding of fractals in general and a statistical approach to identify multi-scaling behavior under MF-DFA in particular.
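A minimal sketch of the kind of procedure described is a two-segment log-log regression of the MF-DFA fluctuation function with the breakpoint chosen by least squares. The authors' scaling-identification regression model, with formal inference and confidence intervals, is more general than this single-crossover illustration, and the variable names are hypothetical.

```python
import numpy as np

def detect_crossover(scales, fluctuation):
    """Fit two log-log linear segments F(s) ~ s^h and pick the crossover scale
    that minimises the total squared error (illustrative, single crossover)."""
    x, y = np.log(scales), np.log(fluctuation)
    best = None
    for k in range(3, len(x) - 2):            # at least 3 points per segment
        p1 = np.polyfit(x[:k], y[:k], 1)
        p2 = np.polyfit(x[k:], y[k:], 1)
        sse = (np.sum((y[:k] - np.polyval(p1, x[:k])) ** 2)
               + np.sum((y[k:] - np.polyval(p2, x[k:])) ** 2))
        if best is None or sse < best[0]:
            best = (sse, scales[k], p1[0], p2[0])
    _, crossover, h_small, h_large = best
    return crossover, h_small, h_large

# Hypothetical usage with a fluctuation function from an MF-DFA routine:
# s = np.array([...]); F = np.array([...])
# crossover, h1, h2 = detect_crossover(s, F)
```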
NASA Astrophysics Data System (ADS)
El-Etriby, Ahmed E.; Abdel-Meguid, Mohamed E.; Hatem, Tarek M.; Bahei-El-Din, Yehia A.
2014-03-01
Ambient vibrations are major source of wasted energy, exploiting properly such vibration can be converted to valuable energy and harvested to power up devices, i.e. electronic devices. Accordingly, energy harvesting using smart structures with active piezoelectric ceramics has gained wide interest over the past few years as a method for converting such wasted energy. This paper provides numerical and experimental analysis of piezoelectric fiber based composites for energy harvesting applications proposing a multi-scale modeling approach coupled with experimental verification. The multi-scale approach suggested to predict the behavior of piezoelectric fiber-based composites use micromechanical model based on Transformation Field Analysis (TFA) to calculate the overall material properties of electrically active composite structure. Capitalizing on the calculated properties, single-phase analysis of a homogeneous structure is conducted using finite element method. The experimental work approach involves running dynamic tests on piezoelectric fiber-based composites to simulate mechanical vibrations experienced by a subway train floor tiles. Experimental results agree well with the numerical results both for static and dynamic tests.
ERIC Educational Resources Information Center
Borsuk, Ellen R.; Watkins, Marley W.; Canivez, Gary L.
2006-01-01
Although often applied in practice, clinically based cognitive subtest profile analysis has failed to achieve empirical support. Nonlinear multivariate subtest profile analysis may have benefits over clinically based techniques, but the psychometric properties of these methods must be studied prior to their implementation and interpretation. The…
The Use of Weighted Graphs for Large-Scale Genome Analysis
Zhou, Fang; Toivonen, Hannu; King, Ross D.
2014-01-01
There is an acute need for better tools to extract knowledge from the growing flood of sequence data. For example, thousands of complete genomes have been sequenced, and their metabolic networks inferred. Such data should enable a better understanding of evolution. However, most existing network analysis methods are based on pair-wise comparisons, and these do not scale to thousands of genomes. Here we propose the use of weighted graphs as a data structure to enable large-scale phylogenetic analysis of networks. We have developed three types of weighted graph for enzymes: taxonomic (these summarize phylogenetic importance), isoenzymatic (these summarize enzymatic variety/redundancy), and sequence-similarity (these summarize sequence conservation); and we applied these types of weighted graph to survey prokaryotic metabolism. To demonstrate the utility of this approach we have compared and contrasted the large-scale evolution of metabolism in Archaea and Eubacteria. Our results provide evidence for limits to the contingency of evolution. PMID:24619061
Self-esteem among nursing assistants: reliability and validity of the Rosenberg Self-Esteem Scale.
McMullen, Tara; Resnick, Barbara
2013-01-01
To establish the reliability and validity of the Rosenberg Self-Esteem Scale (RSES) when used with nursing assistants (NAs). Testing the RSES used baseline data from a randomized controlled trial testing the Res-Care Intervention. Female NAs were recruited from nursing homes (n = 508). Validity testing for the positive and negative subscales of the RSES was based on confirmatory factor analysis (CFA) using structural equation modeling and Rasch analysis. Estimates of reliability were based on Rasch analysis and the person separation index. Evidence supports the reliability and validity of the RSES in NAs although we recommend minor revisions to the measure for subsequent use. Establishing reliable and valid measures of self-esteem in NAs will facilitate testing of interventions to strengthen workplace self-esteem, job satisfaction, and retention.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km2 to 238 km2 in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, because a given flow amount is accumulated in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
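The inter-amount-time idea can be illustrated with a short sketch: accumulate the flow series and record the time needed for each successive fixed flow amount to pass. The sampling amount, the linear interpolation, and the synthetic series below are illustrative choices, not the authors' implementation.

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Times needed to accumulate successive fixed flow amounts.

    t      : sample times (e.g. hours), strictly increasing
    q      : positive flow rate at each time (e.g. m3/h)
    amount : fixed flow amount defining the sampling unit (e.g. m3)
    """
    # Cumulative volume by trapezoidal integration of the flow rate.
    volume = np.concatenate([[0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(t))])
    targets = np.arange(amount, volume[-1], amount)
    # Interpolate the time at which each target volume is reached.
    crossing_times = np.interp(targets, volume, t)
    return np.diff(np.concatenate([[t[0]], crossing_times]))

# Example: hourly flows; IATs shrink during high flows and stretch during low flows.
t = np.arange(0, 48, 1.0)
q = 5.0 + 4.0 * np.sin(2 * np.pi * t / 24)
print(inter_amount_times(t, q, amount=20.0))
```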
Loius R. Iverson; Anantha M. G. Prasad; Charles T. Scott
1996-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) and the Natural Resource Conservation Service's State Soil Geographic (STATSGO) data bases provide valuable natural resource data that can be analyzed at the national scale. When coupled with other data (e.g., climate), these data bases can provide insights into factors associated with current and...
Effect of extreme data loss on heart rate signals quantified by entropy analysis
NASA Astrophysics Data System (ADS)
Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao
2015-02-01
The phenomenon of data loss always occurs in the analysis of large databases, and maintaining the stability of analysis results in the event of data loss is very important. In this paper, we use a segmentation approach to generate synthetic signals in which data segments are randomly removed from the original signal according to Gaussian and exponential distributions. The logistic map is then used for verification. Finally, two entropy measures, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to the logistic map tests; (2) base-scale entropy analysis yields results with preferable stability and is not sensitive to data loss; (3) the loss percentage of HRV signals should be kept below p = 30 %, which can provide useful information for clinical applications.
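Of the two measures compared, approximate entropy has a standard definition that can be sketched directly; base-scale entropy is less common and is omitted here. The following is a plain, textbook-style implementation with the usual default parameters, not the authors' code; the logistic-map example mirrors the verification signal mentioned in the abstract.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D signal (illustrative implementation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def phi(m):
        # All embedded vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Fraction of vectors within tolerance r (self-matches included, as usual).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Example: ApEn of a chaotic logistic-map sequence.
x = np.empty(1000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
print(approximate_entropy(x))
```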
Allometric scaling theory applied to FIA biomass estimation
David C. Chojnacky
2002-01-01
Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
Wavelet-based multiscale window transform and energy and vorticity analysis
NASA Astrophysics Data System (ADS)
Liang, Xiang San
A new methodology, Multiscale Energy and Vorticity Analysis (MS-EVA), is developed to investigate sub-mesoscale, meso-scale, and large-scale dynamical interactions in geophysical fluid flows which are intermittent in space and time. The development begins with the construction of a wavelet-based functional analysis tool, the multiscale window transform (MWT), which is local, orthonormal, self-similar, and windowed on scale. The MWT is first built over the real line then modified onto a finite domain. Properties are explored, the most important one being the property of marginalization which brings together a quadratic quantity in physical space with its phase space representation. Based on MWT the MS-EVA is developed. Energy and enstrophy equations for the large-, meso-, and sub-meso-scale windows are derived and their terms interpreted. The processes thus represented are classified into four categories: transport; transfer, conversion, and dissipation/diffusion. The separation of transport from transfer is made possible with the introduction of the concept of perfect transfer. By the property of marginalization, the classical energetic analysis proves to be a particular case of the MS-EVA. The MS-EVA developed is validated with classical instability problems. The validation is carried out through two steps. First, it is established that the barotropic and baroclinic instabilities are indicated by the spatial averages of certain transfer term interaction analyses. Then calculations of these indicators are made with an Eady model and a Kuo model. The results agree precisely with what is expected from their analytical solutions, and the energetics reproduced reveal a consistent and important aspect of the unknown dynamic structures of instability processes. As an application, the MS-EVA is used to investigate the Iceland-Faeroe frontal (IFF) variability. A MS-EVA-ready dataset is first generated, through a forecasting study with the Harvard Ocean Prediction System using the data gathered during the 1993 NRV Alliance cruise. The application starts with a determination of the scale window bounds, which characterize a double-peak structure in either the time wavelet spectrum or the space wavelet spectrum. The resulting energetics, when locally averaged, reveal that there is a clear baroclinic instability happening around the cold tongue intrusion observed in the forecast. Moreover, an interaction analysis shows that the energy released by the instability indeed goes to the meso-scale window and fuel the growth of the intrusion. The sensitivity study shows that, in this case, the key to a successful application is a correct decomposition of the large-scale window from the meso-scale window.
NASA Astrophysics Data System (ADS)
Ying, Shen; Li, Lin; Gao, Yurong
2009-10-01
Spatial visibility analysis is an important direction in the study of pedestrian behavior, because visual perception of space is the most direct way to obtain environmental information and to navigate one's actions. Based on agent modeling and an up-to-bottom method, the paper develops a framework for analyzing pedestrian flow as it depends on visibility. We use viewsheds for visibility analysis and impose the resulting parameters on the agent simulation to direct agents' motion in urban space. We analyze pedestrian behavior at the micro-scale and macro-scale of urban open space. At the micro-scale, individual agents use visual affordances to determine their direction of motion along urban streets and within districts. At the macro-scale, we compare the distribution of pedestrian flow with the spatial configuration of the urban environment and mine the relationship between pedestrian flow and the distribution of urban facilities and functions. The paper first computes the visibility conditions at vantage points in urban open space, such as the street network, and quantifies the visibility parameters. The multiple agents use these visibility parameters to decide their direction of motion, and pedestrian flow finally reaches a stable state in the urban environment through the multi-agent simulation. The paper then compares the morphology of the visibility parameters and the pedestrian distribution with urban functions and facility layouts to confirm the consistency between them, which can be used to support decision making in urban design.
Power Grid Data Analysis with R and Hadoop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hafen, Ryan P.; Gibson, Tara D.; Kleese van Dam, Kerstin
This book chapter presents an approach to analysis of large-scale time-series sensor information based on our experience with power grid data. We use the R-Hadoop Integrated Programming Environment (RHIPE) to analyze a 2TB data set and present code and results for this analysis.
Impact of Spatial Scales on the Intercomparison of Climate Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Steptoe, Michael; Chang, Zheng
2017-01-01
Scenario analysis has been widely applied in climate science to understand the impact of climate change on the future human environment, but intercomparison and similarity analysis of different climate scenarios based on multiple simulation runs remain challenging. Although spatial heterogeneity plays a key role in modeling climate and human systems, little research has been performed to understand the impact of spatial variations and scales on similarity analysis of climate scenarios. To address this issue, the authors developed a geovisual analytics framework that lets users perform similarity analysis of climate scenarios from the Global Change Assessment Model (GCAM) using a hierarchical clustering approach.
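The clustering step described can be sketched with standard tools: represent each scenario run as a feature vector, compute pairwise distances, and cut a hierarchical-clustering dendrogram. The feature construction, linkage method, and cluster count below are hypothetical assumptions; actual GCAM outputs are not modelled here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical scenario matrix: one row per scenario run, one column per
# (region, time step) feature extracted from model output.
rng = np.random.default_rng(0)
scenarios = rng.normal(size=(12, 40))

# Ward linkage on Euclidean distances between scenario feature vectors.
distances = pdist(scenarios, metric="euclidean")
tree = linkage(distances, method="ward")

# Cut the dendrogram into a fixed number of scenario groups.
labels = fcluster(tree, t=3, criterion="maxclust")
print("scenario cluster labels:", labels)
```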
Loeb, Danielle F; Crane, Lori A; Leister, Erin; Bayliss, Elizabeth A; Ludman, Evette; Binswanger, Ingrid A; Kline, Danielle M; Smith, Meredith; deGruy, Frank V; Nease, Donald E; Dickinson, L Miriam
Develop and validate self-efficacy scales for primary care provider (PCP) mental illness management and team-based care participation. We developed three self-efficacy scales: team-based care (TBC), mental illness management (MIM), and chronic medical illness (CMI). We developed the scales using Bandura's Social Cognitive Theory as a guide. The survey instrument included items from previously validated scales on team-based care and mental illness management. We administered a mail survey to 900 randomly selected Colorado physicians. We conducted exploratory principal factor analysis with oblique rotation. We constructed self-efficacy scales and calculated standardized Cronbach's alpha coefficients to test internal consistency. We calculated correlation coefficients between the MIM and TBC scales and previously validated measures related to each scale to evaluate convergent validity. We tested correlations between the TBC scale and the measures expected to correlate with the MIM scale, and vice versa, to evaluate discriminant validity. PCPs (n=402, response rate=49%) from diverse practice settings completed surveys. Items grouped into factors as expected. Cronbach's alphas were 0.94, 0.88, and 0.83 for the TBC, MIM, and CMI scales, respectively. In convergent validity testing, the TBC scale was correlated as predicted with scales assessing communication strategies, attitudes toward teams, and other teamwork indicators (r=0.25 to 0.40, all statistically significant). Likewise, the MIM scale was significantly correlated with several items about knowledge and experience managing mental illness (r=0.24 to 0.41, all statistically significant). As expected in discriminant validity testing, the TBC scale had only very weak correlations with the mental illness knowledge and experience items (r=0.03 to 0.12). Likewise, the MIM scale was only weakly correlated with measures of team-based care (r=0.09 to 0.17). This validation study of the MIM and TBC self-efficacy scales showed high internal validity and good construct validity. Copyright © 2016 Elsevier Inc. All rights reserved.
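The internal-consistency statistics reported here and in several other records in this section (Cronbach's alpha, item-total correlations) are straightforward to compute; a minimal sketch is given below. The item matrix is simulated and hypothetical, this is not the authors' analysis code, and the factor-analytic steps (principal factor extraction with oblique rotation) are omitted.

```python
import numpy as np

def cronbach_alpha(items):
    """Raw Cronbach's alpha for an (n_respondents, n_items) matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical Likert-type responses: 200 respondents x 10 items driven by one factor.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
items = np.clip(np.rint(3 + latent + rng.normal(scale=0.8, size=(200, 10))), 1, 5)

print("alpha:", round(cronbach_alpha(items), 3))
print("item-total r:", np.round(corrected_item_total(items), 2))
```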
NASA Astrophysics Data System (ADS)
Chen, Y.; Luo, M.; Xu, L.; Zhou, X.; Ren, J.; Zhou, J.
2018-04-01
The RF method based on grid-search parameter optimization achieved a classification accuracy of 88.16 % in the classification of images with multiple feature variables. This classification accuracy was higher than that of SVM and ANN under the same feature variables. In terms of efficiency, the RF classification method also performs better than SVM and ANN and is more capable of handling multidimensional feature variables. The RF method combined with an object-based analysis approach could improve the classification accuracy further. The multiresolution segmentation approach, with ESP-based scale parameter optimization, was used to obtain six scales for image segmentation; when the segmentation scale was 49, the classification accuracy reached its highest value of 89.58 %. The classification accuracy of object-based RF classification was 1.42 % higher than that of pixel-based classification (88.16 %), so the classification accuracy was further improved. Therefore, the RF classification method combined with an object-based analysis approach can achieve relatively high accuracy in the classification and extraction of land use information for industrial and mining reclamation areas. Moreover, the interpretation of remotely sensed imagery using the proposed method could provide technical support and theoretical reference for remotely sensed monitoring of land reclamation.
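The grid-search optimization of the random forest can be sketched with scikit-learn. The feature table, grid values, and cross-validation settings below are hypothetical stand-ins, and the ESP-based multiresolution segmentation that produces the object features is outside this sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical feature table: rows are pixels or image objects,
# columns are spectral/textural/geometric feature variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 12))
y = rng.integers(0, 5, size=2000)  # five land-use classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300, 500],
                "max_features": ["sqrt", 0.5],
                "min_samples_leaf": [1, 3, 5]},
    cv=5, n_jobs=-1)
grid.fit(X_train, y_train)

print("best parameters:", grid.best_params_)
print("test accuracy:", accuracy_score(y_test, grid.predict(X_test)))
```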
Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks
NASA Astrophysics Data System (ADS)
Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.
2017-12-01
The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt this shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we used an image analysis tool developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials, based on a two-dimensional continuous wavelet analysis. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability of unraveling the physical processes at the base of catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.
The characterization of an air pollution episode using satellite total ozone measurements
NASA Technical Reports Server (NTRS)
Fishman, Jack; Shipham, Mark C.; Vukovich, Fred M.; Cahoon, Donald R.
1987-01-01
A case study is presented which demonstrates that measurements of total ozone from a space-based platform can be used to study a widespread air pollution episode over the southeastern U.S. In particular, the synoptic-scale distribution of surface-level ozone obtained from an independent analysis of ground-based monitoring stations appears to be captured by the synoptic-scale distribution of total ozone, even though about 90 percent of the total ozone is in the stratosphere. Additional analyses of upper air meteorological data, other satellite imagery, and in situ aircraft measurements of ozone likewise support the fact that synoptic-scale variability of tropospheric ozone is primarily responsible for the observed variability in total ozone under certain conditions. The use of the type of analysis discussed in this study may provide an important technique for understanding the global budget of tropospheric ozone.
Villafranca, Alexander; Hamlin, Colin; Rodebaugh, Thomas L; Robinson, Sandra; Jacobsohn, Eric
2017-09-10
Disruptive intraoperative behavior has detrimental effects to clinicians, institutions, and patients. How clinicians respond to this behavior can either exacerbate or attenuate its effects. Previous investigations of disruptive behavior have used survey scales with significant limitations. The study objective was to develop appropriate scales to measure exposure and responses to disruptive behavior. We obtained ethics approval. The scales were developed in a sequence of steps. They were pretested using expert reviews, computational linguistic analysis, and cognitive interviews. The scales were then piloted on Canadian operating room clinicians. Factor analysis was applied to half of the data set for question reduction and grouping. Item response analysis and theoretical reviews ensured that important questions were not eliminated. Internal consistency was evaluated using Cronbach α. Model fit was examined on the second half of the data set using confirmatory factor analysis. Content validity of the final scales was re-evaluated. Consistency between observed relationships and theoretical predictions was assessed. Temporal stability was evaluated on a subsample of 38 respondents. A total of 1433 and 746 clinicians completed the exposure and response scales, respectively. Content validity indices were excellent (exposure = 0.96, responses = 1.0). Internal consistency was good (exposure = 0.93, responses = 0.87). Correlations between the exposure scale and secondary measures were consistent with expectations based on theory. Temporal stability was acceptable (exposure = 0.77, responses = 0.73). We have developed scales measuring exposure and responses to disruptive behavior. They generate valid and reliable scores when surveying operating room clinicians, and they overcome the limitations of previous tools. These survey scales are freely available.
Scaling and allometry in the building geometries of Greater London
NASA Astrophysics Data System (ADS)
Batty, M.; Carvalho, R.; Hudson-Smith, A.; Milton, R.; Smith, D.; Steadman, P.
2008-06-01
Many aggregate distributions of urban activities such as city sizes reveal scaling but hardly any work exists on the properties of spatial distributions within individual cities, notwithstanding considerable knowledge about their fractal structure. We redress this here by examining scaling relationships in a world city using data on the geometric properties of individual buildings. We first summarise how power laws can be used to approximate the size distributions of buildings, in analogy to city-size distributions which have been widely studied as rank-size and lognormal distributions following Zipf [ Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, 1949)] and Gibrat [ Les Inégalités Économiques (Librarie du Recueil Sirey, Paris, 1931)]. We then extend this analysis to allometric relationships between buildings in terms of their different geometric size properties. We present some preliminary analysis of building heights from the Emporis database which suggests very strong scaling in world cities. The data base for Greater London is then introduced from which we extract 3.6 million buildings whose scaling properties we explore. We examine key allometric relationships between these different properties illustrating how building shape changes according to size, and we extend this analysis to the classification of buildings according to land use types. We conclude with an analysis of two-point correlation functions of building geometries which supports our non-spatial analysis of scaling.
Tielen, Deirdre; Wollmann, Lisa
2014-01-01
The social interaction anxiety scale (SIAS) and the social phobia scale (SPS) assess anxiety in social interactions and fear of scrutiny by others. This study examines the psychometric properties of the Dutch versions of the SIAS and SPS using data from a large group of patients with social phobia and a community-based sample. Confirmatory factor analysis revealed that the SIAS is unidimensional, whereas the SPS is comprised of three subscales. The internal consistency of the scales and subscales was good. The concurrent and discriminant validity was supported and the scales were well able to discriminate between patients and community-based respondents. Cut-off values with excellent sensitivity and specificity are presented. Of all self-report measures included, the SPS was the most sensitive for treatment effects. Normative data are provided which can be used to assess whether clinically significant change has occurred in individual patients. PMID:24701560
de Beurs, Edwin; Tielen, Deirdre; Wollmann, Lisa
2014-01-01
The social interaction anxiety scale (SIAS) and the social phobia scale (SPS) assess anxiety in social interactions and fear of scrutiny by others. This study examines the psychometric properties of the Dutch versions of the SIAS and SPS using data from a large group of patients with social phobia and a community-based sample. Confirmatory factor analysis revealed that the SIAS is unidimensional, whereas the SPS is comprised of three subscales. The internal consistency of the scales and subscales was good. The concurrent and discriminant validity was supported and the scales were well able to discriminate between patients and community-based respondents. Cut-off values with excellent sensitivity and specificity are presented. Of all self-report measures included, the SPS was the most sensitive for treatment effects. Normative data are provided which can be used to assess whether clinically significant change has occurred in individual patients.
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
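The superpixel-plus-random-forest segmentation pipeline described can be sketched with scikit-image and scikit-learn. The features (mean colour per superpixel), the labelling rule, and all parameter values are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(image, segments):
    """Mean RGB per superpixel (a deliberately simple feature set)."""
    labels = np.unique(segments)
    return np.array([image[segments == lab].mean(axis=0) for lab in labels]), labels

def train_plant_segmenter(image, mask, n_segments=400):
    """Train an RF to label superpixels as plant / background from a reference mask."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    feats, labels = superpixel_features(image, segments)
    # A superpixel is "plant" if more than half of its pixels are plant in the mask.
    targets = np.array([mask[segments == lab].mean() > 0.5 for lab in labels])
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(feats, targets)

def segment_plant(clf, image, n_segments=400):
    """Predict a plant mask for a new image and return its area in pixels."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    feats, labels = superpixel_features(image, segments)
    plant_labels = labels[clf.predict(feats)]
    plant_mask = np.isin(segments, plant_labels)
    return plant_mask, int(plant_mask.sum())
```

Tracking the returned pixel area across imaging dates gives the kind of growth-trend curve the abstract refers to.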
NASA Astrophysics Data System (ADS)
Naritomi, Yusuke; Fuchigami, Sotaro
2011-02-01
Protein dynamics on a long time scale was investigated using all-atom molecular dynamics (MD) simulation and time-structure based independent component analysis (tICA). We selected the lysine-, arginine-, ornithine-binding protein (LAO) as a target protein and focused on its domain motions in the open state. A MD simulation of the LAO in explicit water was performed for 600 ns, in which slow and large-amplitude domain motions of the LAO were observed. After extracting domain motions by rigid-body domain analysis, the tICA was applied to the obtained rigid-body trajectory, yielding slow modes of the LAO's domain motions in order of decreasing time scale. The slowest mode detected by the tICA represented not a closure motion described by a largest-amplitude mode determined by the principal component analysis but a twist motion with a time scale of tens of nanoseconds. The slow dynamics of the LAO were well described by only the slowest mode and were characterized by transitions between two basins. The results show that tICA is promising for describing and analyzing slow dynamics of proteins.
Naritomi, Yusuke; Fuchigami, Sotaro
2011-02-14
Protein dynamics on a long time scale was investigated using all-atom molecular dynamics (MD) simulation and time-structure based independent component analysis (tICA). We selected the lysine-, arginine-, ornithine-binding protein (LAO) as a target protein and focused on its domain motions in the open state. A MD simulation of the LAO in explicit water was performed for 600 ns, in which slow and large-amplitude domain motions of the LAO were observed. After extracting domain motions by rigid-body domain analysis, the tICA was applied to the obtained rigid-body trajectory, yielding slow modes of the LAO's domain motions in order of decreasing time scale. The slowest mode detected by the tICA represented not a closure motion described by a largest-amplitude mode determined by the principal component analysis but a twist motion with a time scale of tens of nanoseconds. The slow dynamics of the LAO were well described by only the slowest mode and were characterized by transitions between two basins. The results show that tICA is promising for describing and analyzing slow dynamics of proteins.
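Time-structure based independent component analysis reduces to a generalized eigenvalue problem between the time-lagged covariance matrix and the instantaneous covariance matrix. The sketch below is a generic tICA under simple assumptions (mean-free data, symmetrised lagged covariance), not the code used in these studies, and the lag value is a hypothetical example.

```python
import numpy as np
from scipy.linalg import eigh

def tica(X, lag):
    """Slowest tICA modes of a trajectory X with shape (n_frames, n_features)."""
    X = X - X.mean(axis=0)
    A, B = X[:-lag], X[lag:]
    n = len(A)
    # Instantaneous and symmetrised time-lagged covariance matrices.
    c0 = (A.T @ A + B.T @ B) / (2 * n)
    ctau = (A.T @ B + B.T @ A) / (2 * n)
    # Generalized eigenvalue problem: C_tau v = lambda C_0 v.
    eigvals, eigvecs = eigh(ctau, c0)
    order = np.argsort(eigvals)[::-1]          # largest autocorrelation = slowest mode
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    timescales = -lag / np.log(np.clip(eigvals, 1e-12, 1 - 1e-12))
    return eigvals, eigvecs, timescales

# Hypothetical usage on rigid-body coordinates extracted from an MD trajectory:
# vals, modes, ts = tica(rigid_body_trajectory, lag=500)  # lag in frames
```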
Gradient descent for robust kernel-based regression
NASA Astrophysics Data System (ADS)
Guo, Zheng-Chu; Hu, Ting; Shi, Lei
2018-06-01
In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, which can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on this loss: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in both the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
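A minimal sketch of the kind of algorithm analysed: kernel regression trained by gradient descent on a windowed robust loss (here a Welsch-type loss as one instance of the G/σ family), with early stopping obtained simply by limiting the number of iterations. The kernel choice, step size, and loss are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def robust_kernel_gd(X, y, sigma=1.0, step=0.5, n_iter=200, bandwidth=1.0):
    """Gradient descent in an RKHS with a Welsch-type windowed loss.

    The estimator is f(x) = sum_i alpha_i K(x_i, x); early stopping is controlled
    by n_iter, with no explicit regularisation term.
    """
    K = gaussian_kernel(X, X, bandwidth)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = K @ alpha - y
        # Derivative of the Welsch loss (sigma^2/2)(1 - exp(-r^2/sigma^2)) w.r.t. r.
        grad_loss = residual * np.exp(-residual ** 2 / sigma ** 2)
        # Functional gradient step expressed in the coefficient vector alpha.
        alpha -= step * grad_loss / n
    return alpha

# Hypothetical usage: fit y = sin(x) with a few gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
y[::15] += 5.0                      # outliers the scale parameter should down-weight
alpha = robust_kernel_gd(X, y, sigma=0.5)
```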
He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe
2013-01-01
It is of great significance to research the early warning system for large-scale network security incidents. It can improve the network system's emergency response capabilities, alleviate the cyber attacks' damage, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. The large-scale network system's plane visualization is realized based on the divide and conquer thought. First, the topology of the large-scale network is divided into some small-scale networks by the MLkP/CR algorithm. Second, the sub graph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a topology based on the automatic distribution algorithm of force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topology.
Karain, Wael I
2017-11-28
Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming for this method is the need to have large enough time windows to insure good statistical quality for the recurrence complexity measures needed to detect the transitions.
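The recurrence structure referred to can be sketched directly from a distance matrix: threshold it to obtain a recurrence plot and compute simple complexity measures such as recurrence rate and determinism. This is a generic, unoptimised RQA sketch under assumed parameters; the bootstrap significance test and the specific complexity measures used in the paper are not reproduced.

```python
import numpy as np

def recurrence_plot(X, threshold):
    """Binary recurrence matrix from frames X with shape (n_frames, n_features)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    return (d <= threshold).astype(int)

def recurrence_rate(R):
    return R.mean()

def determinism(R, lmin=2):
    """Fraction of recurrent points on diagonal lines of length >= lmin
    (main diagonal included for simplicity)."""
    n = R.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):             # every diagonal of the matrix
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in list(diag) + [0]:           # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / max(R.sum(), 1)

# Hypothetical usage on a windowed trajectory segment:
# R = recurrence_plot(window, threshold=1.2)
# print(recurrence_rate(R), determinism(R))
```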
Stiggelbout, A M; Molewijk, A C; Otten, W; Timmermans, D R M; van Bockel, J H; Kievit, J
2004-06-01
Evidence based patient choice seems based on a strong liberal individualist interpretation of patient autonomy; however, not all patients are in favour of such an interpretation. The authors wished to assess whether ideals of autonomy in clinical practice are more in accordance with alternative concepts of autonomy from the ethics literature. This paper describes the development of a questionnaire to assess such concepts of autonomy. A questionnaire, based on six moral concepts from the ethics literature, was sent to aneurysm patients and their surgeons. The structure of the questionnaire was assessed by factor analysis, and item reduction was based on reliability. Ninety six patients and 58 surgeons participated. The questionnaire consisted of four scales. Two of the scales reflected the paternalistic and consumerist poles of the liberal individualist model, one scale reflected concepts of Socratic autonomy and of procedural independence, and the fourth scale reflected ideals of risk disclosure. The Ideal Patient Autonomy Scale is a 14 item normative instrument. It is clearly distinct from the generally used psychological preference questionnaires that assess preferences for physician-patient roles.
Stiggelbout, A; Molewijk, A; Otten, W; Timmermans, D; van Bockel, J H; Kievit, J
2004-01-01
Objectives: Evidence based patient choice seems based on a strong liberal individualist interpretation of patient autonomy; however, not all patients are in favour of such an interpretation. The authors wished to assess whether ideals of autonomy in clinical practice are more in accordance with alternative concepts of autonomy from the ethics literature. This paper describes the development of a questionnaire to assess such concepts of autonomy. Methods: A questionnaire, based on six moral concepts from the ethics literature, was sent to aneurysm patients and their surgeons. The structure of the questionnaire was assessed by factor analysis, and item reduction was based on reliability. Results: Ninety six patients and 58 surgeons participated. The questionnaire consisted of four scales. Two of the scales reflected the paternalistic and consumerist poles of the liberal individualist model, one scale reflected concepts of Socratic autonomy and of procedural independence, and the fourth scale reflected ideals of risk disclosure. Discussion: The Ideal Patient Autonomy Scale is a 14 item normative instrument. It is clearly distinct from the generally used psychological preference questionnaires that assess preferences for physician-patient roles. PMID:15173361
WarpIV: In situ visualization and analysis of ion accelerator simulations
Rubel, Oliver; Loring, Burlen; Vay, Jean -Luc; ...
2016-05-09
The generation of short pulses of ion beams through the interaction of an intense laser with a plasma sheath offers the possibility of compact and cheaper ion sources for many applications--from fast ignition and radiography of dense targets to hadron therapy and injection into conventional accelerators. To enable the efficient analysis of large-scale, high-fidelity particle accelerator simulations using the Warp simulation suite, the authors introduce the Warp In situ Visualization Toolkit (WarpIV). WarpIV integrates state-of-the-art in situ visualization and analysis using VisIt with Warp, supports management and control of complex in situ visualization and analysis workflows, and implements integrated analytics to facilitate query- and feature-based data analytics and efficient large-scale data analysis. WarpIV enables for the first time distributed parallel, in situ visualization of the full simulation data using high-performance compute resources as the data is being generated by Warp. The authors describe the application of WarpIV to study and compare large 2D and 3D ion accelerator simulations, demonstrating significant differences in the acceleration process in 2D and 3D simulations. WarpIV is available to the public via https://bitbucket.org/berkeleylab/warpiv. The Warp In situ Visualization Toolkit (WarpIV) supports large-scale, parallel, in situ visualization and analysis and facilitates query- and feature-based analytics, enabling for the first time high-performance analysis of large-scale, high-fidelity particle accelerator simulations while the data is being generated by the Warp simulation suite. Furthermore, this supplemental material https://extras.computer.org/extra/mcg2016030022s1.pdf provides more details regarding the memory profiling and optimization and the Yee grid recentering optimization results discussed in the main article.
A study on phenomenology of Dhat syndrome in men in a general medical setting
Prakash, Sathya; Sharan, Pratap; Sood, Mamta
2016-01-01
Background: “Dhat syndrome” is believed to be a culture-bound syndrome of the Indian subcontinent. Although many studies have been performed, many have methodological limitations and there is a lack of agreement in many areas. Aims: The aim is to study the phenomenology of “Dhat syndrome” in men and to explore the possibility of subtypes within this entity. Settings and Design: It is a cross-sectional descriptive study conducted at a sex and marriage counseling clinic of a tertiary care teaching hospital in Northern India. Materials and Methods: An operational definition and assessment instrument for “Dhat syndrome” was developed after taking all concerned stakeholders into account and review of literature. It was applied on 100 patients along with socio-demographic profile, Hamilton Depression Rating Scale, Hamilton Anxiety Rating Scale, Mini International Neuropsychiatric Interview, and Postgraduate Institute Neuroticism Scale. Statistical Analysis: For statistical analysis, descriptive statistics, group comparisons, and Pearson's product moment correlations were carried out. Factor analysis and cluster analysis were done to determine the factor structure and subtypes of “Dhat syndrome.” Results: A diagnostic and assessment instrument for “Dhat syndrome” has been developed and the phenomenology in 100 patients has been described. Both the health beliefs scale and associated symptoms scale demonstrated a three-factor structure. The patients with “Dhat syndrome” could be categorized into three clusters based on severity. Conclusions: There appears to be a significant agreement among various stakeholders on the phenomenology of “Dhat syndrome” although some differences exist. “Dhat syndrome” could be subtyped into three clusters based on severity. PMID:27385844
NASA Astrophysics Data System (ADS)
Sun, P.; Jokipii, J. R.; Giacalone, J.
2016-12-01
Anisotropies in astrophysical turbulence have long been proposed and observed, and recent observations adopting multi-scale analysis techniques have provided a detailed description of the scale-dependent power spectrum of the magnetic field parallel and perpendicular to the scale-dependent magnetic field line at different scales in the solar wind. In previous work, we proposed a multi-scale method to synthesize non-isotropic turbulent magnetic fields with pre-determined power spectra of the fluctuating magnetic field as a function of scale. We present the transport of test particles in the resulting field using a two-scale algorithm. We find that scale-dependent turbulence anisotropy affects charged particle transport significantly differently than isotropic or globally anisotropic turbulence does. It is important to apply this field synthesis method to the solar wind magnetic field based on spacecraft data; however, this relies on how we extract the power spectra of the turbulent magnetic field across different scales. In this study, we propose a power spectrum synthesis method based on Fourier analysis to extract the large-scale and small-scale power spectra from a single spacecraft observation with a long enough period and a high sampling frequency. We apply the method to solar wind measurements by the magnetometer onboard the ACE spacecraft and regenerate the large-scale isotropic 2D spectrum and the small-scale anisotropic 2D spectrum. We run test particle simulations in the magnetic field generated in this way to estimate the transport coefficients and to compare with the isotropic turbulence model.
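The single-spacecraft spectral step, estimating the power spectrum of each magnetic-field component from a long, regularly sampled interval, can be sketched with a standard Welch estimate. The sampling rate, segment length, and synthetic stand-in series below are illustrative; the mapping of these 1D spectra onto the scale-dependent 2D parallel/perpendicular spectra described in the abstract is a separate step not shown here.

```python
import numpy as np
from scipy.signal import welch

def component_spectra(b, fs, nperseg=4096):
    """Welch power spectral density of each magnetic-field component.

    b  : array of shape (n_samples, 3), field components in nT
    fs : sampling frequency in Hz
    """
    freqs, pxx = welch(b, fs=fs, nperseg=nperseg, axis=0)
    trace = pxx.sum(axis=1)          # trace spectrum (sum over components)
    return freqs, pxx, trace

# Hypothetical usage with a synthetic 1 Hz series standing in for spacecraft data.
rng = np.random.default_rng(0)
b = rng.normal(size=(86400, 3)).cumsum(axis=0)   # random-walk proxy for B(t)
freqs, pxx, trace = component_spectra(b, fs=1.0)
```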
ESTimating plant phylogeny: lessons from partitioning
de la Torre, Jose EB; Egan, Mary G; Katari, Manpreet S; Brenner, Eric D; Stevenson, Dennis W; Coruzzi, Gloria M; DeSalle, Rob
2006-01-01
Background While Expressed Sequence Tags (ESTs) have proven a viable and efficient way to sample genomes, particularly those for which whole-genome sequencing is impractical, phylogenetic analysis using ESTs remains difficult. Sequencing errors and orthology determination are the major problems when using ESTs as a source of characters for systematics. Here we develop methods to incorporate EST sequence information in a simultaneous analysis framework to address controversial phylogenetic questions regarding the relationships among the major groups of seed plants. We use an automated, phylogenetically derived approach to orthology determination called OrthologID generate a phylogeny based on 43 process partitions, many of which are derived from ESTs, and examine several measures of support to assess the utility of EST data for phylogenies. Results A maximum parsimony (MP) analysis resulted in a single tree with relatively high support at all nodes in the tree despite rampant conflict among trees generated from the separate analysis of individual partitions. In a comparison of broader-scale groupings based on cellular compartment (ie: chloroplast, mitochondrial or nuclear) or function, only the nuclear partition tree (based largely on EST data) was found to be topologically identical to the tree based on the simultaneous analysis of all data. Despite topological conflict among the broader-scale groupings examined, only the tree based on morphological data showed statistically significant differences. Conclusion Based on the amount of character support contributed by EST data which make up a majority of the nuclear data set, and the lack of conflict of the nuclear data set with the simultaneous analysis tree, we conclude that the inclusion of EST data does provide a viable and efficient approach to address phylogenetic questions within a parsimony framework on a genomic scale, if problems of orthology determination and potential sequencing errors can be overcome. In addition, approaches that examine conflict and support in a simultaneous analysis framework allow for a more precise understanding of the evolutionary history of individual process partitions and may be a novel way to understand functional aspects of different kinds of cellular classes of gene products. PMID:16776834
Fan-out Estimation in Spin-based Quantum Computer Scale-up.
Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R
2017-10-17
Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and their nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control-line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb-confined (e.g. donor-based spin qubits) or electrostatically confined (e.g. quantum-dot-based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout, we estimate that 10^2 to 10^5 physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned out independently. Assuming relatively long control-free interconnects, the scalability can be extended. Ultimately, universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher-dimensional electronics fabrication and/or multiplexed distributed control and readout schemes may be the preferred strategy for large-scale implementation.
NASA Astrophysics Data System (ADS)
MIYAKITA, T.; MATSUI, T.; ITO, A.; TOKUYAMA, T.; HIRAMATSU, K.; OSADA, Y.; YAMAMOTO, T.
2002-02-01
A questionnaire survey of the health effects of aircraft noise was conducted among residents living around Kadena and Futenma airfields using the Todai Health Index. Aircraft noise exposure expressed by Ldn ranged from under 55 to over 70 in the surveyed area. The number of valid answers was 7095, including 848 in the control group. Twelve scale scores were converted to dichotomous variables based on the 90th-percentile or 10th-percentile scale score in the control group. Multiple logistic regression analysis was done taking the 12 converted scale scores as the dependent variables and Ldn, age (six levels), sex, occupation (four categories) and the interaction of age and sex as the independent variables. Significant dose-response relationships were found in the scale scores for vague complaints, respiratory, digestive, mental instability, depression and nervousness. The results suggest that residents living around Kadena and Futenma airfields may suffer both physical and mental effects as a result of exposure to military aircraft noise and that such responses increase with the level of noise exposure (Ldn).
Senden, R; Savelberg, H H C M; Grimm, B; Heyligers, I C; Meijer, K
2012-06-01
This study investigated whether the Tinetti scale, as a subjective measure of fall risk, is associated with objectively measured gait characteristics. It was studied whether gait parameters differ between groups stratified for fall risk using the Tinetti scale. Moreover, the discriminative power of gait parameters to classify elderly people according to the Tinetti scale was investigated. Gait of 50 elderly people with a Tinetti > 24 and 50 elderly people with a Tinetti ≤ 24 was analyzed using acceleration-based gait analysis. Validated algorithms were used to derive spatio-temporal gait parameters, harmonic ratio, inter-stride amplitude variability and root mean square (RMS) from the accelerometer data. Clear differences in gait were found between the groups. All gait parameters correlated with the Tinetti scale (r range: 0.20-0.73). Only walking speed, step length and RMS showed moderate to strong correlations and high discriminative power to classify elderly people according to the Tinetti scale. It is concluded that subtle gait changes that have previously been related to fall risk are not captured by the subjective assessment. It is therefore worthwhile to include objective gait assessment in fall risk screening. Copyright © 2012 Elsevier B.V. All rights reserved.
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
2010-01-01
Background Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. Description An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Conclusions Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics.
Taylor, Ronald C
2010-12-21
Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms.
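A toy, single-process Python illustration of the MapReduce pattern referred to above, applied to k-mer counting across sequencing reads; in Hadoop the map and reduce phases would run in parallel across a cluster, and the reads here are made up:

```python
# Map, shuffle and reduce run serially here to show the data flow only.
from collections import defaultdict

reads = ["ACGTACGT", "CGTACGTT", "TTACGTAC"]
K = 4

def map_phase(read):
    # emit (k-mer, 1) pairs, one per position in the read
    for i in range(len(read) - K + 1):
        yield read[i:i + K], 1

def reduce_phase(kmer, counts):
    return kmer, sum(counts)

# "shuffle": group intermediate values by key
grouped = defaultdict(list)
for read in reads:
    for kmer, one in map_phase(read):
        grouped[kmer].append(one)

totals = dict(reduce_phase(k, v) for k, v in grouped.items())
print(totals)   # e.g. {'ACGT': 4, 'CGTA': 3, ...}
```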
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, S. L.
1998-08-25
Fluid Catalytic Cracking (FCC) technology is the most important process used by the refinery industry to convert crude oil to valuable lighter products such as gasoline. Process development is generally very time consuming, especially when a small pilot unit is being scaled up to a large commercial unit, because of the lack of information to aid in the design of scaled-up units. Such information can now be obtained by analysis based on pilot-scale measurements and computer simulation that includes the controlling physics of the FCC system. A computational fluid dynamics (CFD) code, ICRKFLO, has been developed at Argonne National Laboratory (ANL) and has been successfully applied to the simulation of catalytic petroleum cracking risers. It employs hybrid hydrodynamic-chemical kinetic coupling techniques, enabling the analysis of an FCC unit with complex chemical reaction sets containing tens or hundreds of subspecies. The code has been continuously validated against pilot-scale experimental data. It is now being used to investigate scale-up effects in FCC units. Among FCC operating conditions, the feed injection conditions are found to have a strong impact on the product yields of scaled-up FCC units. The feed injection conditions appear to affect flow and heat transfer patterns, and the interaction of hydrodynamics and cracking kinetics causes the product yields to change accordingly.
Koyama, Utako; Murayama, Nobuko
2011-08-01
This qualitative and quantitative research was conducted to develop an empowerment scale for health promotion volunteers (hereinafter, the ESFHPV), key persons responsible for creating healthy communities. A focus group interview was conducted with four groups of health promotion volunteers from two cities in S Public Health Center of N Prefecture. A qualitative analysis was employed and a 32-item draft scale was created. The reliability and validity of this scale were then evaluated using quantitative methods. A questionnaire survey was conducted in 2009 for all 660 health promotion volunteers across the two cities. Of 401 respondents (response rate, 60.8%), 356 (53.9%) provided valid responses and were thus included in the analysis. 1) Internal consistency was confirmed by item-total correlation analysis (I-T analysis), assessment of Cronbach's coefficient alpha for all but one item, and good-poor analysis (G-P analysis). Four items were excluded from the 32-item draft scale because their correlation coefficients exceeded 0.7, leaving 28 items for analysis. 2) Based on the results of the factor analysis performed on the 28 provisional empowerment items, 28 items were chosen for inclusion in the ESFHPV. These items formed four sub-scales, namely 'activity for healthy community' (10 items), 'intention for solving health problems of the community' (10 items), 'democratic organization activity' (four items) and 'growth as individual health promotion volunteers' (four items). 3) Cronbach's coefficient alpha for the ESFHPV and its four sub-scales was 0.93, 0.88, 0.89, 0.84 and 0.79, respectively. The coefficients of the I-T analysis were between 0.33 and 0.69. 4) The health promotion volunteers who attended other community activities demonstrated significantly higher scores for the ESFHPV and the four sub-scales. Persons who were above 60 years of age, had a longer duration of activity as a health promotion volunteer, or were housewives showed significantly higher scores on the sub-scale 'growth as individual health promotion volunteers'. To measure the empowerment levels of health promotion volunteers, a 28-item scale was developed and its reliability and validity were confirmed. Health promotion volunteers, as well as the public health nurses who assist them, can use this scale to assess the empowerment levels of other health promotion volunteers.
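A minimal sketch of the internal-consistency checks mentioned above (Cronbach's alpha and corrected item-total correlations), computed on simulated Likert responses rather than the survey data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_persons, n_items = 300, 28
ability = rng.normal(size=(n_persons, 1))
items = np.clip(np.round(2.5 + ability + rng.normal(scale=0.8, size=(n_persons, n_items))), 1, 4)

def cronbach_alpha(x):
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def item_total_correlations(x):
    # corrected I-T correlation: each item against the sum of the remaining items
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1] for j in range(x.shape[1])])

it = item_total_correlations(items)
print(f"Cronbach's alpha: {cronbach_alpha(items):.3f}")
print(f"I-T correlations: {it.min():.2f} to {it.max():.2f}")
```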
Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission
NASA Astrophysics Data System (ADS)
Hampton, Jesse Clay
The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm to 30 x 30 x 25 cm in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Weighting cloud based techniques with individual microcrack characterization correlated well with post-test damage evaluations.
How many records should be used in ASCE/SEI-7 ground motion scaling procedure?
Reyes, Juan C.; Kalkan, Erol
2012-01-01
U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in the ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic–perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
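A schematic resampling experiment in the spirit of the study above, showing how the error of the mean scaled response shrinks as more records are used per set; the lognormal "record responses" are purely synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
pool = rng.lognormal(mean=0.0, sigma=0.4, size=10000)    # synthetic peak responses
benchmark = pool.mean()                                   # stand-in for the "true" response

for n_records in range(3, 11):
    sets = rng.choice(pool, size=(480, n_records))        # 480 record sets, as in the study
    est = sets.mean(axis=1)
    rel_err = np.abs(est - benchmark) / benchmark
    print(f"{n_records} records per set: median |error| = {np.median(rel_err):.1%}")
```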
ERIC Educational Resources Information Center
Lavonen, Jari; Juuti, Kalle; Meisalo, Veijo
2003-01-01
In this study we analyse how the experiences of chemistry teachers on the use of a Microcomputer-Based Laboratory (MBL), gathered by a Likert-scale instrument, can be utilized to develop the new package "Empirica 2000." We used exploratory factor analysis to identify the essential features in a large set of questionnaire data to see how…
Multi-scale functional mapping of tidal marsh vegetation for restoration monitoring
NASA Astrophysics Data System (ADS)
Tuxen Bettman, Karin
2007-12-01
Nearly half of the world's natural wetlands have been destroyed or degraded, and in recent years, there have been significant endeavors to restore wetland habitat throughout the world. Detailed mapping of restoring wetlands can offer valuable information about changes in vegetation and geomorphology, which can inform the restoration process and ultimately help to improve chances of restoration success. I studied six tidal marshes in the San Francisco Estuary, CA, US, between 2003 and 2004 in order to develop techniques for mapping tidal marshes at multiple scales by incorporating specific restoration objectives for improved longer term monitoring. I explored a "pixel-based" remote sensing image analysis method for mapping vegetation in restored and natural tidal marshes, describing the benefits and limitations of this type of approach (Chapter 2). I also performed a multi-scale analysis of vegetation pattern metrics for a recently restored tidal marsh in order to target the metrics that are consistent across scales and will be robust measures of marsh vegetation change (Chapter 3). Finally, I performed an "object-based" image analysis using the same remotely sensed imagery, which maps vegetation type and specific wetland functions at multiple scales (Chapter 4). The combined results of my work highlight important trends and management implications for monitoring wetland restoration using remote sensing, and will better enable restoration ecologists to use remote sensing for tidal marsh monitoring. Several findings important for tidal marsh restoration monitoring were made. Overall results showed that pixel-based methods are effective at quantifying landscape changes in composition and diversity in recently restored marshes, but are limited in their use for quantifying smaller, more fine-scale changes. While pattern metrics can highlight small but important changes in vegetation composition and configuration across years, scientists should exercise caution when using metrics in their studies or to validate restoration management decisions, and multi-scale analyses should be performed before metrics are used in restoration science for important management decisions. Lastly, restoration objectives, ecosystem function, and scale can each be integrated into monitoring techniques using remote sensing for improved restoration monitoring.
Two-dimensional analysis of coupled heat and moisture transport in masonry structures
NASA Astrophysics Data System (ADS)
Krejčí, Tomáš
2016-06-01
Reconstruction and maintenance of historical buildings and bridges require good knowledge of the temperature and moisture distribution. Sharp changes in temperature and moisture can lead to damage. This paper describes an analysis of coupled heat and moisture transfer in masonry based on a two-level approach. The macro-scale level describes the whole structure, while the meso-scale level takes into account the detailed composition of the masonry. The two-level approach is very computationally demanding, and it was implemented in parallel. It was used in an analysis of the temperature and moisture distribution in the Charles Bridge in Prague, Czech Republic.
Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...
2013-07-18
The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU- and GPU-based architectures. Our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA; it achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems and scales efficiently to tens of thousands of cores.
Development and psychometric properties of the Inner Strength Scale.
Lundman, Berit; Viglund, Kerstin; Aléx, Lena; Jonsén, Elisabeth; Norberg, Astrid; Fischer, Regina Santamäki; Strandberg, Gunilla; Nygren, Björn
2011-10-01
Four dimensions of inner strength were previously identified in a meta-theoretical analysis: firmness, creativity, connectedness, and flexibility. The aim of this study was to develop an Inner Strength Scale (ISS) based on those four dimensions and to evaluate its psychometric properties. An initial version of the ISS was distributed for validation purposes together with the Rosenberg Self-Esteem Scale, the Resilience Scale, and the Sense of Coherence Scale. A convenience sample of 391 adults, aged 19-90 years, participated. Principal component analysis (PCA) and confirmatory factor analysis (CFA) were used in the process of exploring, evaluating, and reducing the 63-item ISS to the 20-item ISS. Cronbach's alpha and test-retest were used to measure reliability. CFA showed satisfactory goodness-of-fit for the 20-item ISS. The analysis supported a four-factor solution explaining 51% of the variance. Cronbach's alpha for the 20-item ISS was 0.86, and the test-retest showed stability over time (r=0.79). The ISS was found to be a valid and reliable instrument for capturing a multifaceted understanding of inner strength. Further tests of the psychometric properties of the ISS will be performed in forthcoming studies. Copyright © 2011 Elsevier Ltd. All rights reserved.
Intelligent Performance Analysis with a Natural Language Interface
NASA Astrophysics Data System (ADS)
Juuso, Esko K.
2017-09-01
Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
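A hedged sketch of one common way such indices are built, computing generalized norms as power (Hölder) means of a signal over moving windows; the window length and norm orders are illustrative assumptions, not the author's parameters:

```python
import numpy as np

def generalized_norm(x, p):
    """Power mean (1/N * sum |x_i|^p)^(1/p); p=2 gives the RMS, large p approaches the peak."""
    return (np.mean(np.abs(x) ** p)) ** (1.0 / p)

rng = np.random.default_rng(4)
signal = rng.normal(size=5000) + 0.5 * np.sin(np.linspace(0, 40, 5000))

window = 500
for p in (1, 2, 4, 8):
    norms = [generalized_norm(signal[i:i + window], p)
             for i in range(0, signal.size - window, window)]
    print(f"order p={p}: windowed norms range {min(norms):.2f} to {max(norms):.2f}")
```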
Analysis of DNA Sequences by an Optical Time-Integrating Correlator: Proposal
1991-11-01
Front-matter fragments (contents and figure list): the report covers current technology, the time-integrating correlator, representations of the DNA bases, the DNA analysis strategy, and a strategy for coarse searching. Figures show the correlation peak formed by the A×B term, the pedestal formed by the A + B terms, and short representations of the DNA bases in which each base is encoded by a 7-bit-long pseudorandom sequence.
Simulating and mapping spatial complexity using multi-scale techniques
De Cola, L.
1994-01-01
A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
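An illustrative multi-scale variance computation in the spirit of the indices described above: block-averaging a 2D field at successively coarser resolutions and tracking how its variance changes; the fields are synthetic (white noise versus a smooth gradient), not the California imagery:

```python
import numpy as np

def block_average(field, factor):
    n = (field.shape[0] // factor) * factor
    f = field[:n, :n]
    return f.reshape(n // factor, factor, n // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(5)
noise = rng.normal(size=(256, 256))
xx, yy = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
gradient = xx + yy                                  # a very smooth, autocorrelated field

for name, field in [("white noise", noise), ("smooth gradient", gradient)]:
    variances = [block_average(field, 2 ** k).var() for k in range(6)]
    print(name, [round(v, 3) for v in variances])   # noise variance decays; gradient stays high
```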
Development and testing of a scale for assessing the quality of home nursing.
Chiou, Chii-Jun; Wang, Hsiu-Hung; Chang, Hsing-Yi
2016-03-01
To develop a home nursing quality scale and to evaluate its psychometric properties. This was a 3-year study. In the first year, 19 focus group interviews with caregivers of people using home nursing services were carried out in northern, central and southern Taiwan. Content analysis was carried out and a pool of questionnaire items compiled. In the second year (2007), a study was carried out on a stratified random sample selected from home nursing organizations covered by the national health insurance scheme in southern Taiwan. The study population comprised the co-resident primary caregivers of home care nursing service users. Item analysis and exploratory factor analysis were carried out on data from 365 self-administered questionnaires collected from 13 selected home care organizations. In the third year (2008), a random sample of participants was selected from 206 hospital-based home care nursing organizations throughout Taiwan, resulting in the completion of 294 questionnaires from 27 organizations. Confirmatory factor analysis was then carried out on the scale, and its validity and reliability assessed. The present study developed a reliable and valid home nursing quality scale from the perspective of users of home nursing services. The scale comprises three factors: dependability, communication skills and service usefulness. It is of practical value for promoting local policies on long-term community care and aging in place, and is ready to be used to assess the quality of services provided by home care nursing organizations. © 2015 Japan Geriatrics Society.
The assessment of post-vasectomy pain in mice using behaviour and the Mouse Grimace Scale.
Leach, Matthew C; Klaus, Kristel; Miller, Amy L; Scotto di Perrotolo, Maud; Sotocinal, Susana G; Flecknell, Paul A
2012-01-01
Current behaviour-based pain assessments for laboratory rodents have significant limitations. Assessment of facial expression changes, as a novel means of pain scoring, may overcome some of these limitations. The Mouse Grimace Scale appears to offer a means of assessing post-operative pain in mice that is as effective as manual behaviour-based scoring, without the limitations of such schemes. Effective assessment of post-operative pain is critical not only for animal welfare but also for the validity of science using animal models. This study compared changes in behaviour, assessed using both an automated system ("HomeCageScan") and manual analysis, with changes in facial expressions assessed using the Mouse Grimace Scale (MGS). Mice (n = 6/group) were assessed before and after surgery (scrotal approach vasectomy) and received either saline, meloxicam or bupivacaine. Both the MGS and manual scoring of pain behaviours identified clear differences between the pre- and post-surgery periods and between animals receiving analgesia (20 mg/kg meloxicam or 5 mg/kg bupivacaine) or saline post-operatively. The two assessments were highly correlated, with animals showing high MGS scores also exhibiting high frequencies of pain behaviours. Automated behavioural analysis, in contrast, was only able to detect differences between the pre- and post-surgery periods. In conclusion, both the Mouse Grimace Scale and manual scoring of pain behaviours assess the presence of post-surgical pain, whereas automated behavioural analysis may be detecting surgical stress and/or post-surgical pain. This study suggests that the Mouse Grimace Scale could prove to be a quick and easy means of assessing post-surgical pain, and the efficacy of analgesic treatment, in mice, overcoming some of the limitations of behaviour-based assessment schemes.
A Protection Motivation Theory-Based Scale for Tobacco Research among Chinese Youth
MacDonell, Karen; Chen, Xinguang; Yan, Yaqiong; Li, Fang; Gong, Jie; Sun, Huiling; Li, Xiaoming; Stanton, Bonita
2014-01-01
Rates of tobacco use among adolescents in China and other lower and middle-income countries remain high despite notable prevention and intervention programs. One reason for this may be the lack of theory-based research in tobacco use prevention in these countries. In the current study, a culturally appropriate 21-item measurement scale for cigarette smoking was developed based on the core constructs of Protection Motivation Theory (PMT). The scale was assessed among a sample of 553 Chinese vocational high school students. Results from correlational and measurement modeling analysis indicated adequate measurement reliability for the proposed PMT scale structure. The two PMT Pathways and the seven PMT constructs were significantly correlated with adolescent intention to smoke and actual smoking behavior. This study is the first to evaluate a PMT scale for cigarette smoking among Chinese adolescents. The scale provides a potential tool for assessing social cognitive processes underlying tobacco use. This is essential for understanding smoking behavior among Chinese youth and to support more effective tobacco use prevention efforts. Additional studies are needed to assess its utility for use with Chinese youth in other settings. PMID:24478933
Lee, Jin; Huang, Yueng-hsiang; Robertson, Michelle M; Murphy, Lauren A; Garabet, Angela; Chang, Wen-Ruey
2014-02-01
The goal of this study was to examine the external validity of a 12-item generic safety climate scale for lone workers in order to evaluate the appropriateness of generalized use of the scale in the measurement of safety climate across various lone work settings. External validity evidence was established by investigating the measurement equivalence (ME) across different industries and companies. Confirmatory factor analysis (CFA)-based and item response theory (IRT)-based perspectives were adopted to examine the ME of the generic safety climate scale for lone workers across 11 companies from the trucking, electrical utility, and cable television industries. Fairly strong evidence of ME was observed for both organization- and group-level generic safety climate sub-scales. Although significant invariance was observed in the item intercepts across the different lone work settings, absolute model fit indices remained satisfactory in the most robust step of CFA-based ME testing. IRT-based ME testing identified only one differentially functioning item from the organization-level generic safety climate sub-scale, but its impact was minimal and strong ME was supported. The generic safety climate scale for lone workers reported good external validity and supported the presence of a common feature of safety climate among lone workers. The scale can be used as an effective safety evaluation tool in various lone work situations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Scale-invariant entropy-based theory for dynamic ordering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahulikar, Shripad P., E-mail: spm@iitmandi.ac.in, E-mail: spm@aero.iitb.ac.in; Department of Aerospace Engineering, Indian Institute of Technology Bombay, Mumbai 400076; Kumari, Priti
2014-09-01
Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.
NASA Technical Reports Server (NTRS)
Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan
2014-01-01
ATA-002 Technical Team has successfully designed, developed, tested and assessed the SLS Pathfinder propulsion systems for the Main Base Heating Test Program. Major outcomes of the Pathfinder test program: reached 90% of full-scale chamber pressure; achieved all engine/motor design parameter requirements; reached steady plume flow behavior in less than 35 msec; held steady chamber pressure for 60 to 100 msec during engine/motor operation; obtained model engine/motor performance similar to the full-scale SLS system; mitigated nozzle throat and combustor thermal erosion; test data show good agreement with numerical prediction codes. Next phase of the ATA-002 test program: design and development of the SLS OML for the Main Base Heating Test; tweak the BSRM design to optimize performance; tweak the CS-REM design to increase robustness. MSFC Aerosciences and CUBRC have the capability to develop sub-scale propulsion systems to meet desired performance requirements for short-duration testing.
Defining Tsunami Magnitude as Measure of Potential Impact
NASA Astrophysics Data System (ADS)
Titov, V. V.; Tang, L.
2016-12-01
The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami on coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from available measurements at coastal sea-level stations carries significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of tsunami magnitude estimation, including collection of the vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations. Uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since real-time seismic data are available for real-time processing and ample seismic data are available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and the generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful for tsunami warning as a quick estimate of tsunami impact and for post-event analysis as a universal scale for tsunami inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present an application of the magnitude analysis to several historical events for inter-comparison with existing methods.
Lechuga, Julia; Galletly, Carol L; Broaddus, Michelle R; Dickson-Gomez, Julia B; Glasman, Laura R; McAuliffe, Timothy L; Vega, Miriam Y; LeGrand, Sarah; Mena, Carla A; Barlow, Morgan L; Valera, Erik; Montenegro, Judith I
2017-11-08
To develop, pilot test, and conduct psychometric analyses of an innovative scale measuring the influence of perceived immigration laws on Latino migrants' HIV-testing behavior. The Immigration Law Concerns Scale (ILCS) was developed in three phases: Phase 1 involved a review of law and literature, generation of scale items, consultation with project advisors, and subsequent revision of the scale. Phase 2 involved systematic translation and back-translation and consensus-based editorial processes conducted by members of a bilingual and multi-national study team. In Phase 3, 339 sexually active, HIV-negative, Spanish-speaking, non-citizen Latino migrant adults (both documented and undocumented) completed the scale via audio computer-assisted self-interview. The psychometric properties of the scale were tested with exploratory factor analysis, and estimates of reliability coefficients were generated. Bivariate correlations were conducted to test the discriminant and predictive validity of the identified factors. Exploratory factor analysis revealed a three-factor, 17-item scale. Subscale reliability ranged from 0.72 to 0.79. There were significant associations between the ILCS and the HIV-testing behaviors of participants. Results of the pilot test and psychometric analysis of the ILCS are promising. The scale is reliable and significantly associated with the HIV-testing behaviors of participants. Subscales related to unwanted government attention and concerns about meeting moral character requirements should be refined.
NASA Astrophysics Data System (ADS)
Ma, Lei; Cheng, Liang; Li, Manchun; Liu, Yongxue; Ma, Xiaoxue
2015-04-01
Unmanned Aerial Vehicles (UAVs) have been used increasingly for natural resource applications in recent years due to their greater availability and the miniaturization of sensors. In addition, Geographic Object-Based Image Analysis (GEOBIA) has received more attention as a novel paradigm for remote sensing earth observation data. However, GEOBIA generates some new problems compared with pixel-based methods. In this study, we developed a strategy for the semi-automatic optimization of object-based classification, which involves an area-based accuracy assessment that analyzes the relationship between scale and the training set size. We found that the Overall Accuracy (OA) increased as the training set ratio (proportion of the segmented objects used for training) increased when the Segmentation Scale Parameter (SSP) was fixed. The OA increased more slowly as the training set ratio became larger, and a similar pattern was observed for pixel-based image analysis. The OA decreased as the SSP increased when the training set ratio was fixed. Consequently, the SSP should not be too large during classification using a small training set ratio. By contrast, a large training set ratio is required if classification is performed using a high SSP. In addition, we suggest that the optimal SSP for each class has a high positive correlation with the mean area obtained by manual interpretation, which can be summarized by a linear correlation equation. We expect that these results will be applicable to UAV imagery classification to determine the optimal SSP for each class.
Development of a Chinese Version of the Suicide Intent Scale
ERIC Educational Resources Information Center
Gau, Susan S. F.; Chen, Chin-Hung; Lee, Charles T. C.; Chang, Jung-Chen; Cheng, Andrew T. A.
2009-01-01
This study established the psychometric properties of the Chinese version of the Suicide Intent Scale (SIS) in a clinic- and community-based sample of 36 patients and 592 respondents, respectively. Results showed that the Chinese SIS demonstrated good inter-rater and test-retest reliability. Factor analysis generated three factors (Precautions,…
Development and Initial Validation of the Intimate Violence Responsibility Scale (IVRS)
ERIC Educational Resources Information Center
Yun, Sung Hyun; Vonk, M. Elizabeth
2011-01-01
The present study demonstrates the development and initial examination of psychometric properties of the Intimate Violence Responsibility Scale (IVRS) in a community-based sample (N = 527). The underlying factor structure of the IVRS was tested by the exploratory factor analysis (Principal Axis Factoring), which identifies the four factors:…
ERIC Educational Resources Information Center
Truckenmiller, James L.
The former HEW (Health, Education, and Welfare) National Strategy for Youth Development Model proposed a community-based program to promote positive youth development and to prevent delinquency through a sequence of youth needs assessments, needs-targeted programs, and program impact evaluation. HEW Community Program Impact Scales data obtained…
USDA-ARS?s Scientific Manuscript database
Vegetative cover can be quantified quickly and consistently and often at lower cost with image analysis of color digital images than with visual assessments. Image-based mapping of vegetative cover for large-scale research and management decisions can now be considered with the accuracy of these met...
Developing a Scale for Quality of Using Learning Strategies
ERIC Educational Resources Information Center
Tasci, Guntay; Yurdugul, Halil
2016-01-01
This study aims to develop a measurement tool to measure the quality of using learning strategies. First, the quality of using learning strategies was described based on the literature. The 32 items in the 5-point Likert scale were then administered to 320 prospective teachers, and they were analysed with exploratory factor analysis using…
Chen, Hong; Li, Shanshan
2018-01-01
There is a lack of specific research methods for estimating the relationship between an organization and its employees, which has long challenged research in the field of organizational management. This article therefore introduces the concept of psychological distance into organizational behavior research, defining the psychological distance between employees and an organization as the level of perceived correspondence or interaction between subjects and objects. We developed an employee-organization psychological distance (EOPD) scale through both qualitative and quantitative analysis methods. As indicated by the results based on grounded theory (10 in-depth employee interview records and 277 open-ended questionnaires) and a formal investigation (544 questionnaires), the scale consists of 44 items across six dimensions: experiential distance, behavioral distance, emotional distance, cognitive distance, spatial-temporal distance, and objective social distance. Finally, we determined that the EOPD scale exhibited acceptable reliability and validity using confirmatory factor analysis. This research may establish a foundation for future research on the measurement of psychological relationships between employees and organizations. PMID:29375427
NASA Astrophysics Data System (ADS)
Bundrick, David Ray
The relationship between science and religion in American higher education changed significantly over the past two centuries as empiricism and naturalism became the philosophical underpinnings of the university. This philosophical shift contributed significantly to the secularization of the academy, the context in which philosophers of science during the last half-century have theorized a variety of theoretical patterns for relating science and religion. Evidence suggests that science professors operationalize various science-faith paradigms, but no instrument prior to this research had ever been created to measure the constructs. The purpose of this research was to develop a scale, with at least adequate psychometric properties (good validity and initial reliability), able to identify and discriminate among these various science-faith paradigms (in the Western Christian tradition) in practice among college and university science professors in the United States. The researcher conducted a Web-based electronic survey of a stratified random sample of science professors representing a variety of higher education institution types, science disciplines, and religious affiliation. Principal Components Analysis of the survey data produced five factors predicted by the researcher. These factors correspond to five science-faith paradigms: Conflict---Science over Religion; Conflict---Religion over Science; Compartmentalism; Complementarism; and Concordism. Analysis of items loading on each factor produced a 50-item Science-Faith Paradigm Scale (SFPS) that consists of five sub-scales, each having characteristics of good content validity, construct validity, and initial reliability (Cronbach's alpha ranging from .87 to .95). Preliminary exploratory analysis of differences in SFPS sub-scale scores based on demographic variables indicates that the SFPS is capable of discriminating among groups. This research validates the existence of five science-faith paradigms in practice in the Western Christian tradition, enriches the information base on science-faith paradigms in the academy, and makes possible further research in this subject area. The Science-Faith Paradigm Scale is subject to confirmatory analysis through further research and may be employed voluntarily by science faculty for self-understanding that could lead to more effective communication among science professors and greater appreciation for the diversity of scientific-religious perspectives in American higher education.
Deckel, A W; Hesselbrock, V; Bauer, L
1995-04-01
This experiment examined the relationship between anterior brain functioning and alcohol-related expectancies. Ninety-one young men at risk for developing alcoholism were assessed on the Alcohol Expectancy Questionnaire (AEQ) and administered neuropsychological and EEG tests. Three of the scales on the AEQ, including the "Enhanced Sexual Functioning" scale, the "Increased Social Assertiveness" scale, and items from the "Global/Positive Change" scale, were used, because each of these scales has been found to discriminate alcohol-based expectancies adequately by at least two separate sets of investigators. Regression analysis found that anterior neuropsychological tests (including the Wisconsin Card Sorting test, the Porteus Maze test, the Controlled Oral Word Fluency test, and the Luria-Nebraska motor functioning tests) were predictive of the AEQ scale scores. One of the AEQ scales, "Enhanced Sexual Functioning," was also predicted by the WAIS-R Verbal scales, whereas the "Global/Positive" AEQ scale was predicted by the WAIS-R Performance scales. Regression analysis using EEG power as predictors found that left versus right hemisphere "difference" scores obtained from frontal EEG leads were predictive of the three AEQ scales. Conversely, parietal EEG power did not significantly predict any of the expectancy scales. It is concluded that anterior brain functioning is associated with alcohol-related expectancies. These findings suggest that alcohol-related expectancy may be, in part, biologically determined by frontal/prefrontal systems, and that dysfunctioning in these systems may serve as a risk factor for the development of alcohol-related behaviors.
Non-linear scale interactions in a forced turbulent boundary layer
NASA Astrophysics Data System (ADS)
Duvvuri, Subrahmanyam; McKeon, Beverley
2015-11-01
A strong phase-organizing influence exerted by a single synthetic large-scale spatio-temporal mode on directly-coupled (through triadic interactions) small scales in a turbulent boundary layer forced by a spatially-impulsive dynamic wall-roughness patch was previously demonstrated by the authors (J. Fluid Mech. 2015, vol. 767, R4). The experimental set-up was later enhanced to allow for simultaneous forcing of multiple scales in the flow. Results and analysis are presented from a new set of novel experiments where two distinct large scales are forced in the flow by a dynamic wall-roughness patch. The internal non-linear forcing of two other scales with triadic consistency to the artificially forced large scales, corresponding to sum and difference in wavenumbers, is dominated by the latter. This allows for a forcing-response (input-output) type analysis of the two triadic scales, and naturally lends itself to a resolvent operator based model (e.g. McKeon & Sharma, J. Fluid Mech. 2010, vol. 658, pp. 336-382) of the governing Navier-Stokes equations. The support of AFOSR (grant #FA 9550-12-1-0469, program manager D. Smith) is gratefully acknowledged.
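A small numerical illustration of the triadic consistency invoked above: passing two forced modes through a quadratic nonlinearity (as in the Navier-Stokes advection term) generates response at their sum and difference frequencies; the signal is synthetic, not the boundary-layer data:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
f1, f2 = 7.0, 11.0                          # the two externally forced "large scales"
u = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

q = u ** 2                                  # quadratic interaction
spec = np.abs(np.fft.rfft(q)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for f in (f2 - f1, f1 + f2, 2 * f1, 2 * f2):
    k = np.argmin(np.abs(freqs - f))
    print(f"{f:5.1f} Hz: spectral amplitude {spec[k]:.3f}")   # peaks appear at 4, 18, 14 and 22 Hz
```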
Indonesian teacher engagement index: a rasch model analysis
NASA Astrophysics Data System (ADS)
Sasmoko; Abbas, B. S.; Indrianti, Y.; Widhoyoko, S. A.
2018-01-01
The research aimed to calibrate the Indonesian Teacher Engagement Index (ITEI) instrument using the Rasch model. The respondents were 672 teachers from elementary, junior high, senior high and vocational schools. The planned instrument comprised 165 items with an initial reliability of 0.98. The ITEI uses a 4-point Likert scale (1 to 4), converted from an ordinal scale to an equal-interval scale. Rasch analysis was done by selecting items with an Outfit Mean Square (MNSQ) between 0.5 and 1.5 as good items, and by measuring the Point Measure Correlation (Pt Mean Corr) against the criterion of 0.4-0.85. The Outfit Z-Standard (ZSTD) was ignored because the sample was >500. Conclusions: the ITEI is valid with 30 items and a reliability of 0.97, and it distinguishes less engaged teachers significantly at α < 0.05.
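A simplified sketch of the Rasch item-fit logic referenced above, for the dichotomous case (the ITEI itself uses a 4-point polytomous scale); abilities, difficulties and responses are simulated, not the study data:

```python
import numpy as np

rng = np.random.default_rng(6)
n_persons, n_items = 672, 30
theta = rng.normal(size=n_persons)           # person measures (logits)
b = rng.normal(size=n_items)                 # item difficulties (logits)

p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))   # Rasch model probabilities
x = rng.binomial(1, p)                                      # simulated responses

z2 = (x - p) ** 2 / (p * (1 - p))            # squared standardized residuals
outfit_mnsq = z2.mean(axis=0)                # one value per item; values near 1 indicate good fit

keep = (outfit_mnsq > 0.5) & (outfit_mnsq < 1.5)
print(f"items retained by the 0.5-1.5 MNSQ criterion: {keep.sum()} of {n_items}")
```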
Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-01-01
Background The eHealth Literacy Scale (eHEALS) is a tool to assess consumers’ comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. Objective The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Methods Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. Results CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. Conclusions The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers’ eHealth literacy. PMID:28400356
Analysis of Bird Habitat-Based Biodiversity Metrics at a National Scale
Ecosystem services have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help under...
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available sources (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzi, Silvio; Hereld, Mark; Insley, Joseph
In this work we perform in-situ visualization of molecular dynamics simulations, which can help scientists to visualize simulation output on-the-fly, without incurring storage overheads. We present a case study coupling LAMMPS, the large-scale molecular dynamics simulation code, with vl3, our parallel framework for large-scale visualization and analysis. Our motivation is to identify effective approaches for covisualization and exploration of large-scale atomistic simulations at interactive frame rates. We propose a system of coupled libraries and describe its architecture, with an implementation that runs on GPU-based clusters. We present the results of strong and weak scalability experiments, as well as future research avenues based on our results.
NASA Technical Reports Server (NTRS)
Macwilkinson, D. G.; Blackerby, W. T.; Paterson, J. H.
1974-01-01
The degree of cruise drag correlation on the C-141A aircraft is determined between predictions based on wind tunnel test data and flight test results. An analysis of wind tunnel tests on a 0.0275-scale model at Reynolds numbers up to 3.05 million (based on the mean aerodynamic chord, MAC) is reported. Model support interference corrections are evaluated through a series of tests, and fully corrected model data are analyzed to provide details on model component interference factors. It is shown that the predicted minimum profile drag for the complete configuration agrees within 0.75% of flight test data, using a wind tunnel extrapolation method based on flat plate skin friction and component shape factors. An alternative method of extrapolation, based on profile drag computed from a subsonic viscous theory, results in a prediction four percent lower than the flight test data.
Tabe-Bordbar, Shayan; Marashi, Sayed-Amir
2013-12-01
Elementary modes (EMs) are steady-state metabolic flux vectors with a minimal set of active reactions. Each EM corresponds to a metabolic pathway. Therefore, studying EMs is helpful for analyzing the production of biotechnologically important metabolites. However, memory requirements for computing EMs may hamper their applicability as, in most genome-scale metabolic models, no EM can be computed due to running out of memory. In this study, we present a method for computing randomly sampled EMs. In this approach, a network reduction algorithm based on flux balance methods is used for EM computation. We show that this approach can be used to recover the EMs in medium- and genome-scale metabolic network models, while the EMs are sampled in an unbiased way. The applicability of such results is shown by computing “estimated” control-effective flux values for the Escherichia coli metabolic network.
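A toy illustration of the steady-state condition underlying elementary modes, S·v = 0, for a small hypothetical network; this shows only the constraint and a null-space basis, not the authors' sampling algorithm:

```python
import numpy as np
from scipy.linalg import null_space

# Reactions: R1: -> A, R2: A -> B, R3: A -> C, R4: B ->, R5: C ->
S = np.array([
    [ 1, -1, -1,  0,  0],   # metabolite A
    [ 0,  1,  0, -1,  0],   # metabolite B
    [ 0,  0,  1,  0, -1],   # metabolite C
])

basis = null_space(S)                      # columns span all steady-state flux vectors
print("null-space dimension:", basis.shape[1])

# One elementary mode of this toy network: run R1, R2 and R4 at equal rate
v = np.array([1, 1, 0, 1, 0], dtype=float)
print("S v =", S @ v)                      # all zeros -> v is a steady-state flux vector
```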
Deviations from uniform power law scaling in nonstationary time series
NASA Technical Reports Server (NTRS)
Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.
1997-01-01
A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10^5 beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
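A minimal detrended fluctuation analysis (DFA) sketch for a beat-to-beat interval series; the Fano- and Allan-factor analyses also used in the study are not shown, and the series here is synthetic white noise rather than heart-rate data:

```python
import numpy as np

def dfa(x, window_sizes):
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in window_sizes:
        n_win = len(y) // n
        f2 = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(7)
rr = 0.8 + 0.05 * rng.standard_normal(10000)      # synthetic (uncorrelated) RR intervals [s]
sizes = np.unique(np.logspace(0.7, 3, 15).astype(int))
F = dfa(rr, sizes)
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
print(f"DFA scaling exponent alpha ~ {alpha:.2f}  (white noise gives ~0.5)")
```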
Scale of attitudes toward alcohol - Spanish version: evidences of validity and reliability 1
Ramírez, Erika Gisseth León; de Vargas, Divane
2017-01-01
ABSTRACT Objective: validate the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders in its Spanish version. Method: methodological study, involving 300 Colombian nurses. Adopting the classical theory, confirmatory factor analysis was applied without prior examination, based on the strong historical evidence of the factorial structure of the original scale, to determine the construct validity of this Spanish version. To assess reliability, Cronbach's alpha and McDonald's omega coefficients were used. Results: the confirmatory factor analysis indicated a good fit of the scale model in a four-factor distribution, with a cut-off point at 3.2, demonstrating 66.7% sensitivity. Conclusions: the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders in Spanish presented robust psychometric qualities, affirming that the instrument possesses a solid factorial structure and reliability and is capable of precisely measuring nurses' attitudes towards the phenomenon proposed. PMID:28793126
NASA Astrophysics Data System (ADS)
Most, Sebastian; Nowak, Wolfgang; Bijeljic, Branko
2015-04-01
Fickian transport in groundwater flow is the exception rather than the rule. Transport in porous media is frequently simulated via particle methods (i.e. particle tracking random walk (PTRW) or continuous time random walk (CTRW)). These methods formulate transport as a stochastic process of particle position increments. At the pore scale, geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Hence, it is important to get a better understanding of the processes at the pore scale. For our analysis we track the positions of 10,000 particles migrating through the pore space over time. The data we use come from micro CT scans of a homogeneous sandstone and encompass about 10 grain sizes. Based on those images we discretize the pore structure and simulate flow at the pore scale based on the Navier-Stokes equation. This flow field realistically describes flow inside the pore space and we do not need to add artificial dispersion during the transport simulation. Next, we use particle tracking random walk and simulate pore-scale transport. Finally, we use the obtained particle trajectories to do a multivariate statistical analysis of the particle motion at the pore scale. Our analysis is based on copulas. Every multivariate joint distribution is a combination of its univariate marginal distributions. The copula represents the dependence structure of those univariate marginals and is therefore useful to observe correlation and non-Gaussian interactions (i.e. non-Fickian transport). The first goal of this analysis is to better understand the validity regions of commonly made assumptions. We are investigating three different transport distances: 1) The distance where the statistical dependence between particle increments can be modelled as an order-one Markov process. This would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks starts. 2) The distance where bivariate statistical dependence simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW/CTRW). 3) The distance of complete statistical independence (validity of classical PTRW/CTRW). The second objective is to reveal characteristic dependencies influencing transport the most. Those dependencies can be very complex. Copulas are highly capable of representing linear dependence as well as non-linear dependence. With that tool we are able to detect persistent characteristics dominating transport even across different scales. The results derived from our experimental data set suggest that there are many more non-Fickian aspects of pore-scale transport than the univariate statistics of longitudinal displacements. Non-Fickianity can also be found in transverse displacements, and in the relations between increments at different time steps. Also, the found dependence is non-linear (i.e. beyond simple correlation) and persists over long distances. Thus, our results strongly support the further refinement of techniques like correlated PTRW or correlated CTRW towards non-linear statistical relations.
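A sketch of the kind of increment-dependence check described above: lagged pairs of displacement increments are mapped to pseudo-observations (the empirical copula scale) and their rank dependence is measured; correlated synthetic increments stand in for the pore-scale trajectory data:

```python
import numpy as np
from scipy.stats import spearmanr, rankdata

rng = np.random.default_rng(8)
n = 20000
eps = rng.lognormal(mean=0.0, sigma=0.5, size=n)        # heavy-tailed innovations
incr = np.empty(n)
incr[0] = eps[0]
for i in range(1, n):                                    # persistence between successive steps
    incr[i] = 0.6 * incr[i - 1] + 0.4 * eps[i]

for lag in (1, 5, 20, 100):
    u = rankdata(incr[:-lag]) / (n - lag + 1)            # pseudo-observations in (0, 1)
    v = rankdata(incr[lag:]) / (n - lag + 1)
    rho, _ = spearmanr(u, v)
    print(f"lag {lag:4d}: Spearman rho of the increment copula = {rho:.3f}")
```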
Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S
2015-04-01
To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
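A schematic version of the simulation logic described above: three groups with identical latent improvement but different baselines, compared by one-way ANOVA on the latent (interval-scale) change and on the bounded summative-score change; all parameters are illustrative, not those of the published study:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(9)
n, k, delta = 200, 20, 0.5                    # persons per group, items, true latent change
baselines = {"low": -2.0, "mid": 0.0, "high": 2.0}

def summative(theta):
    # expected raw score on a k-item bounded scale, plus measurement noise
    return k / (1 + np.exp(-theta)) + rng.normal(scale=1.0, size=theta.size)

latent_change, raw_change = [], []
for b in baselines.values():
    theta0 = rng.normal(loc=b, scale=0.5, size=n)
    theta1 = theta0 + delta + rng.normal(scale=0.3, size=n)   # identical mean improvement in every group
    latent_change.append(theta1 - theta0)
    raw_change.append(summative(theta1) - summative(theta0))

print("ANOVA on latent change:          p =", round(f_oneway(*latent_change).pvalue, 3))  # large p expected
print("ANOVA on summative-score change: p =", f_oneway(*raw_change).pvalue)               # typically very small
```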
Cavitation erosion prediction based on analysis of flow dynamics and impact load spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mihatsch, Michael S., E-mail: michael.mihatsch@aer.mw.tum.de; Schmidt, Steffen J.; Adams, Nikolaus A.
2015-10-15
Cavitation erosion is the consequence of repeated collapse-induced high pressure-loads on a material surface. The present paper assesses the prediction of impact load spectra of cavitating flows, i.e., the rate and intensity distribution of collapse events based on a detailed analysis of flow dynamics. Data are obtained from a numerical simulation which employs a density-based finite volume method, taking into account the compressibility of both phases, and resolves collapse-induced pressure waves. To determine the spectrum of collapse events in the fluid domain, we detect and quantify the collapse of isolated vapor structures. As reference configuration we consider the expansion of a liquid into a radially divergent gap which exhibits unsteady sheet and cloud cavitation. Analysis of simulation data shows that global cavitation dynamics and dominant flow events are well resolved, even though the spatial resolution is too coarse to resolve individual vapor bubbles. The inviscid flow model recovers increasingly fine-scale vapor structures and collapses with increasing resolution. We demonstrate that frequency and intensity of these collapse events scale with grid resolution. Scaling laws based on two reference lengths are introduced for this purpose. We show that upon applying these laws impact load spectra recorded on experimental and numerical pressure sensors agree with each other. Furthermore, correlation between experimental pitting rates and collapse-event rates is found. Locations of high maximum wall pressures and high densities of collapse events near walls obtained numerically agree well with areas of erosion damage in the experiment. The investigation shows that impact load spectra of cavitating flows can be inferred from flow data that captures the main vapor structures and wave dynamics without the need for resolving all flow scales.
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
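A small sketch of programmatic access to BiGG Models; the API root and response fields below are assumptions based on the abstract's mention of an application programming interface and should be checked against the live documentation:

```python
# Minimal sketch of querying the BiGG Models web API with the `requests` library.
# The endpoint paths and JSON keys are assumptions and may differ from the service.
import requests

BASE = "http://bigg.ucsd.edu/api/v2"          # assumed API root

models = requests.get(f"{BASE}/models", timeout=30).json()
print("number of models listed:", models.get("results_count"))

# fetch one model's details; "e_coli_core" is an assumed model identifier
detail = requests.get(f"{BASE}/models/e_coli_core", timeout=30).json()
print("organism:      ", detail.get("organism"))
print("reaction count:", detail.get("reaction_count"))
```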
Bierlein, F.P.; Northover, H.J.; Groves, D.I.; Goldfarb, R.J.; Marsh, E.E.
2008-01-01
The assessment of spatial relationships between the location, abundance and size of orogenic-gold deposits in the highly endowed Sierra Foothills gold province in California, via the combination of field studies and a GIS-based analysis, illustrates the power of such an approach to the characterisation of important parameters of mineral systems, and the prediction of districts likely to host economic mineralisation. Regional- to deposit-scale reconnaissance mapping suggests that deposition of gold-bearing quartz veins occurred in second- and third-order, east-over-west thrusts during regional east - west compression and right-lateral transpression. At the district-scale, significant zones of mineralisation correspond with such transpressional reactivation zones and dilational jogs that developed during the Late Jurassic - Early Cretaceous along the misaligned segments of first-order faults throughout the Sierra Nevada Foothills Metamorphic Belt. Field-based observations and interpretation of GIS data (including solid geology, structural elements, deposit locations, magnetics, gravity) also highlight the importance of structural permeability contrasts, rheological gradients, and variations in fault orientation for localising mineralisation. Although this approach confirms empirical findings and produces promising results at the province scale, enhanced geological, structural, geophysical and geochronological data density is required to generate regionally consistent, high-quality input layers that improve predictive targeting at the goldfield to deposit-scale.
Multifractal analysis of line-edge roughness
NASA Astrophysics Data System (ADS)
Constantoudis, Vassilios; Papavieros, George; Lorusso, Gian; Rutigliani, Vito; van Roey, Frieda; Gogolides, Evangelos
2018-03-01
In this paper, we propose to rethink the issue of LER characterization on the basis of the fundamental concept of symmetries. In LER one can apply two kinds of symmetries: a) the translation symmetry characterized by periodicity and b) the scaling symmetry quantified by the fractal dimension. Up to now, a lot of work has been done on the first symmetry, since the Power Spectral Density (PSD), which has been extensively studied recently, is a decomposition of the LER signal into periodic edges and a quantification of the 'power' of each periodicity in the real LER. The aim of this paper is to focus on the second symmetry of scaling invariance. Similarly to the PSD, we introduce the multifractal approach to LER analysis, which generalizes the scaling analysis of standard (mono)fractal theory and decomposes LER into fractal edges characterized by specific fractal dimensions. The main benefit of multifractal analysis is that it enables the characterization of the multi-scaling contributions of different mechanisms involved in LER formation. In the first part of our work, we present concisely the multifractal theory of line edges and utilize the Box Counting method for its implementation and the extraction of the multifractal spectrum. Special emphasis is given to the explanation of the physical meaning of the obtained multifractal spectrum, whose asymmetry quantifies the degree of multifractality. In addition, we propose the distinction between peak-based and valley-based multifractality according to whether the asymmetry of the multifractal spectrum comes from the sharp peaks of line material protruding into space regions or from the cavities of the line material (edge valleys). In the second part, we study systematically the evolution of the LER multifractal spectrum during the first successive steps of a multiple (quadruple) patterning lithography technique and find an interesting transition from a peak-based multifractal behavior in the resist LER after the first lithography step to a valley-based multifractality caused mainly by the effects of the etch pattern transfer steps.
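A hedged box-counting sketch of generalized dimensions D_q for a one-dimensional edge profile; the synthetic edge, the measure definition, the box sizes and the moment orders are assumptions, and the paper's own implementation may differ:

```python
# Illustrative box-counting estimate of generalized dimensions D_q for a 1-D
# line-edge profile (not the authors' implementation).
import numpy as np

rng = np.random.default_rng(1)
edge = np.cumsum(rng.normal(size=4096))           # synthetic rough edge (random walk)
measure = np.abs(np.diff(edge))                   # local "roughness mass" per point
measure /= measure.sum()

qs = np.array([-4.0, -2.0, 0.5, 2.0, 4.0])        # moment orders (q = 1 excluded)
box_sizes = [8, 16, 32, 64, 128, 256]

def partition_function(mu, eps, q):
    """Sum of box measures raised to the power q, for box size eps."""
    n_boxes = len(mu) // eps
    boxed = mu[:n_boxes * eps].reshape(n_boxes, eps).sum(axis=1)
    boxed = boxed[boxed > 0]
    return np.sum(boxed ** q)

for q in qs:
    logZ = [np.log(partition_function(measure, e, q)) for e in box_sizes]
    tau_q = np.polyfit(np.log(box_sizes), logZ, 1)[0]   # tau(q) from the slope
    print(f"q = {q:+.1f}  ->  D_q ~ {tau_q / (q - 1):.3f}")
```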
A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.
Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro
2016-01-01
Multiscale permutation entropy (MSPE) is becoming an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the abilities of the six measures were compared in terms of tracking the dynamical changes in EEG data and the performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring under multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and significantly distinguish the awake state, unconsciousness and the recovery of consciousness (RoC) state. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
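A compact sketch of multiscale permutation entropy with the two decomposition procedures named above (coarse-graining and moving average); the embedding dimension, lag, scales and synthetic signal are assumed values, not those of the study:

```python
# Hedged sketch of multiscale permutation entropy (MSPE) with CG and MA decompositions.
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, m=3, lag=1):
    """Normalized Shannon entropy of ordinal patterns of length m."""
    counts = dict.fromkeys(permutations(range(m)), 0)
    for i in range(len(x) - (m - 1) * lag):
        window = x[i:i + m * lag:lag]
        counts[tuple(int(v) for v in np.argsort(window))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(m))

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def moving_average(x, scale):
    return np.convolve(x, np.ones(scale) / scale, mode="valid")

rng = np.random.default_rng(2)
eeg_like = rng.normal(size=5000)                  # stand-in for an EEG epoch
for scale in range(1, 6):
    cg = permutation_entropy(coarse_grain(eeg_like, scale))
    ma = permutation_entropy(moving_average(eeg_like, scale))
    print(f"scale {scale}: CG-MSPE = {cg:.3f}, MA-MSPE = {ma:.3f}")
```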
NASA Astrophysics Data System (ADS)
Nezhadhaghighi, Mohsen Ghasemi
2017-08-01
Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
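A hedged sketch of diffusion entropy analysis as described above: the entropy of the diffusion PDF is fitted against the logarithm of the window length; the noise model, binning and window sizes are assumptions:

```python
# Hedged sketch of diffusion entropy analysis (DEA): build diffusion trajectories by
# summing the fluctuations over windows of length t, estimate the Shannon entropy S(t)
# of their histogram, and read the scaling exponent delta from S(t) ~ const + delta*ln(t).
import numpy as np

rng = np.random.default_rng(3)
xi = rng.standard_cauchy(2**16)                   # heavy-tailed (Levy-like) noise
xi = np.clip(xi, -50, 50)                         # tame extreme outliers for binning

def shannon_entropy(samples, bins=80):
    hist, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    p = hist[hist > 0] * dx
    return -np.sum(p * np.log(p))

window_sizes = np.unique(np.logspace(1, 3, 12).astype(int))
entropies = []
for t in window_sizes:
    n = len(xi) // t
    trajectories = xi[:n * t].reshape(n, t).sum(axis=1)   # diffusion variable at "time" t
    entropies.append(shannon_entropy(trajectories))

delta = np.polyfit(np.log(window_sizes), entropies, 1)[0]
print("estimated DEA scaling exponent delta ~", round(delta, 3))
```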
Psychometric properties of the Exercise Benefits/Barriers Scale in Mexican elderly women
Enríquez-Reyna, María Cristina; Cruz-Castruita, Rosa María; Ceballos-Gurrola, Oswaldo; García-Cadena, Cirilo Humberto; Hernández-Cortés, Perla Lizeth; Guevara-Valtier, Milton Carlos
2017-01-01
Objective: analyze and assess the psychometric properties of the subscales in the Spanish version of the Exercise Benefits/Barriers Scale in an elderly population in the Northeast of Mexico. Method: methodological study. The sample consisted of 329 elderly associated with one of the five public centers for senior citizens in the metropolitan area of Northeast Mexico. The psychometric properties included the assessment of the Cronbach's alpha coefficient, the Kaiser Meyer Olkin coefficient, the inter-item correlation, exploratory and confirmatory factor analysis. Results: in the principal components analysis, two components were identified based on the 43 items in the scale. The item-total correlation coefficient of the exercise benefits subscale was good. Nevertheless, the coefficient for the exercise barriers subscale revealed inconsistencies. The reliability and validity were acceptable. The confirmatory factor analysis revealed that the elimination of items improved the goodness of fit of the baseline scale, without affecting its validity or reliability. Conclusion: the Exercise Benefits/Barriers Scale presented satisfactory psychometric properties for the Mexican context. A 15-item short version is presented with factorial structure, validity and reliability similar to the complete scale. PMID:28591306
Influence of the time scale on the construction of financial networks.
Emmert-Streib, Frank; Dehmer, Matthias
2010-09-30
In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That means an edge is included in the network only if the corresponding correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks will be studied in this paper. Numerical analysis of four different measures as a function of the time scale used for network construction allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis.
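A hedged sketch of the construction procedure, assuming synthetic returns and a 1% significance level in place of the DJIA data:

```python
# Split return series into non-overlapping windows and keep an (unweighted, undirected)
# edge only where the correlation is significantly different from zero.
import numpy as np
import networkx as nx
from scipy import stats

rng = np.random.default_rng(4)
n_stocks, n_days, window = 30, 1000, 250           # assumed dimensions
returns = rng.normal(size=(n_days, n_stocks))      # stand-in for daily log-returns

def network_for_window(r, alpha=0.01):
    g = nx.Graph()
    g.add_nodes_from(range(r.shape[1]))
    n = r.shape[0]
    for i in range(r.shape[1]):
        for j in range(i + 1, r.shape[1]):
            rho, _ = stats.pearsonr(r[:, i], r[:, j])
            t = rho * np.sqrt((n - 2) / (1 - rho**2))   # t-test for H0: rho = 0
            p = 2 * stats.t.sf(abs(t), df=n - 2)
            if p < alpha:
                g.add_edge(i, j)
    return g

for k in range(n_days // window):
    g = network_for_window(returns[k * window:(k + 1) * window])
    print(f"window {k}: {g.number_of_edges()} edges, "
          f"mean degree {2 * g.number_of_edges() / g.number_of_nodes():.2f}")
```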
NASA Astrophysics Data System (ADS)
Li, Yifan; Liang, Xihui; Lin, Jianhui; Chen, Yuejian; Liu, Jianxin
2018-02-01
This paper presents a novel signal processing scheme, feature selection based multi-scale morphological filter (MMF), for train axle bearing fault detection. In this scheme, more than 30 feature indicators of vibration signals are calculated for axle bearings with different conditions and the features which can reflect fault characteristics more effectively and representatively are selected using the max-relevance and min-redundancy principle. Then, a filtering scale selection approach for MMF based on feature selection and grey relational analysis is proposed. The feature selection based MMF method is tested on diagnosis of artificially created damages of rolling bearings of railway trains. Experimental results show that the proposed method has a superior performance in extracting fault features of defective train axle bearings. In addition, comparisons are performed with the kurtosis criterion based MMF and the spectral kurtosis criterion based MMF. The proposed feature selection based MMF method outperforms these two methods in detection of train axle bearing faults.
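An illustrative multi-scale morphological filtering sketch for a one-dimensional vibration signal; the structuring-element lengths, the filter form (average of grey closing and opening) and the synthetic impulse train are assumptions, and the paper's feature-selection step for choosing the scale is not reproduced:

```python
# Hedged sketch: apply a simple morphological filter at several scales to a 1-D signal
# containing periodic fault-like impulses, and report how impulsive the output remains.
import numpy as np
from scipy.ndimage import grey_closing, grey_opening

rng = np.random.default_rng(5)
n = 4096
signal = 0.3 * rng.normal(size=n)
signal[::400] += 2.0                                 # periodic fault-like impulses

for scale in (3, 5, 9, 15):                          # structuring-element lengths (assumed)
    filtered = 0.5 * (grey_closing(signal, size=scale) +
                      grey_opening(signal, size=scale))
    print(f"scale {scale:2d}: peak/std of filtered output = "
          f"{filtered.max() / filtered.std():.2f}")
```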
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruth, Mark
2017-07-12
'H2@Scale' is a concept based on the opportunity for hydrogen to act as an intermediate between energy sources and uses. Hydrogen has the potential to be used like the primary intermediate in use today, electricity, because it too is fungible. This presentation summarizes the H2@Scale analysis efforts performed during the first third of 2017. Results of technical potential uses and supply options are summarized and show that the technical potential demand for hydrogen is 60 million metric tons per year and that the U.S. has sufficient domestic resources to meet that demand. A high-level infrastructure analysis is also presented that shows an 85% increase in energy on the grid if all hydrogen is produced from grid electricity. However, a preliminary spatial assessment shows that supply is sufficient in most counties across the U.S. The presentation also shows plans for analysis of the economic potential of the H2@Scale concept. Those plans involve developing supply and demand curves for potential hydrogen generation options and comparing them to other options for the use of that hydrogen.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Loring, Burlen; Vay, Jean-Luc
The generation of short pulses of ion beams through the interaction of an intense laser with a plasma sheath offers the possibility of compact and cheaper ion sources for many applications--from fast ignition and radiography of dense targets to hadron therapy and injection into conventional accelerators. To enable the efficient analysis of large-scale, high-fidelity particle accelerator simulations using the Warp simulation suite, the authors introduce the Warp In situ Visualization Toolkit (WarpIV). WarpIV integrates state-of-the-art in situ visualization and analysis using VisIt with Warp, supports management and control of complex in situ visualization and analysis workflows, and implements integrated analytics to facilitate query- and feature-based data analytics and efficient large-scale data analysis. WarpIV enables for the first time distributed parallel, in situ visualization of the full simulation data using high-performance compute resources as the data is being generated by Warp. The authors describe the application of WarpIV to study and compare large 2D and 3D ion accelerator simulations, demonstrating significant differences in the acceleration process in 2D and 3D simulations. WarpIV is available to the public via https://bitbucket.org/berkeleylab/warpiv. The Warp In situ Visualization Toolkit (WarpIV) supports large-scale, parallel, in situ visualization and analysis and facilitates query- and feature-based analytics, enabling for the first time high-performance analysis of large-scale, high-fidelity particle accelerator simulations while the data is being generated by the Warp simulation suite. Furthermore, this supplemental material https://extras.computer.org/extra/mcg2016030022s1.pdf provides more details regarding the memory profiling and optimization and the Yee grid recentering optimization results discussed in the main article.
Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines
Mikut, Ralf
2017-01-01
Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID:29095927
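A toy illustration of the general idea of encoding prior knowledge as fuzzy membership and propagating it as a quality score; the object property, trapezoid parameters and detections are invented for demonstration and are not from the paper's pipeline:

```python
# Hedged illustration: encode prior knowledge (expected diameter of a nucleus-like
# object) as a fuzzy membership function and use the membership value as an
# uncertainty-aware quality score for detected objects.
import numpy as np

def trapezoidal_membership(x, a, b, c, d):
    """Fuzzy membership: 0 below a, rises to 1 on [b, c], falls to 0 above d."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / (b - a), 0, 1)
    fall = np.clip((d - x) / (d - c), 0, 1)
    return np.minimum(rise, fall)

# assumed prior knowledge: plausible nucleus diameters roughly between 4 and 12 microns
detected_diameters = np.array([2.0, 5.5, 8.0, 11.0, 20.0])
scores = trapezoidal_membership(detected_diameters, a=3, b=4, c=12, d=15)

for diameter, score in zip(detected_diameters, scores):
    print(f"diameter {diameter:5.1f} um -> prior-knowledge score {score:.2f}")
```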
An operational global-scale ocean thermal analysis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.
1990-04-01
The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum interpolation data assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented and results of operational tests of global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients and that it also gives a more accurate analysis of the thermal structure, with improvements largest below the mixed layer. 37 refs.
Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis
2015-01-01
A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, thus revealing new features of liver tissue organization. The pipeline also proved effective to analyse lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893
A study on phenomenology of Dhat syndrome in men in a general medical setting.
Prakash, Sathya; Sharan, Pratap; Sood, Mamta
2016-01-01
"Dhat syndrome" is believed to be a culture-bound syndrome of the Indian subcontinent. Although many studies have been performed, many have methodological limitations and there is a lack of agreement in many areas. The aim is to study the phenomenology of "Dhat syndrome" in men and to explore the possibility of subtypes within this entity. It is a cross-sectional descriptive study conducted at a sex and marriage counseling clinic of a tertiary care teaching hospital in Northern India. An operational definition and assessment instrument for "Dhat syndrome" was developed after taking all concerned stakeholders into account and review of literature. It was applied on 100 patients along with socio-demographic profile, Hamilton Depression Rating Scale, Hamilton Anxiety Rating Scale, Mini International Neuropsychiatric Interview, and Postgraduate Institute Neuroticism Scale. For statistical analysis, descriptive statistics, group comparisons, and Pearson's product moment correlations were carried out. Factor analysis and cluster analysis were done to determine the factor structure and subtypes of "Dhat syndrome." A diagnostic and assessment instrument for "Dhat syndrome" has been developed and the phenomenology in 100 patients has been described. Both the health beliefs scale and associated symptoms scale demonstrated a three-factor structure. The patients with "Dhat syndrome" could be categorized into three clusters based on severity. There appears to be a significant agreement among various stakeholders on the phenomenology of "Dhat syndrome" although some differences exist. "Dhat syndrome" could be subtyped into three clusters based on severity.
Rasch model analysis of the Depression, Anxiety and Stress Scales (DASS)
Shea, Tracey L; Tennant, Alan; Pallant, Julie F
2009-01-01
Background There is a growing awareness of the need for easily administered, psychometrically sound screening tools to identify individuals with elevated levels of psychological distress. Although support has been found for the psychometric properties of the Depression, Anxiety and Stress Scales (DASS) using classical test theory approaches it has not been subjected to Rasch analysis. The aim of this study was to use Rasch analysis to assess the psychometric properties of the DASS-21 scales, using two different administration modes. Methods The DASS-21 was administered to 420 participants with half the sample responding to a web-based version and the other half completing a traditional pencil-and-paper version. Conformity of DASS-21 scales to a Rasch partial credit model was assessed using the RUMM2020 software. Results To achieve adequate model fit it was necessary to remove one item from each of the DASS-21 subscales. The reduced scales showed adequate internal consistency reliability, unidimensionality and freedom from differential item functioning for sex, age and mode of administration. Analysis of all DASS-21 items combined did not support its use as a measure of general psychological distress. A scale combining the anxiety and stress items showed satisfactory fit to the Rasch model after removal of three items. Conclusion The results provide support for the measurement properties, internal consistency reliability, and unidimensionality of three slightly modified DASS-21 scales, across two different administration methods. The further use of Rasch analysis on the DASS-21 in larger and broader samples is recommended to confirm the findings of the current study. PMID:19426512
Posttraumatic Stress Disorder: Diagnostic Data Analysis by Data Mining Methodology
Marinić, Igor; Supek, Fran; Kovačić, Zrnka; Rukavina, Lea; Jendričko, Tihana; Kozarić-Kovačić, Dragica
2007-01-01
Aim To use data mining methods in assessing diagnostic symptoms in posttraumatic stress disorder (PTSD) Methods The study included 102 inpatients: 51 with a diagnosis of PTSD and 51 with psychiatric diagnoses other than PTSD. Several models for predicting diagnosis were built using the random forest classifier, one of the intelligent data analysis methods. The first prediction model was based on a structured psychiatric interview, the second on psychiatric scales (Clinician-administered PTSD Scale – CAPS, Positive and Negative Syndrome Scale – PANSS, Hamilton Anxiety Scale – HAMA, and Hamilton Depression Scale – HAMD), and the third on combined data from both sources. Additional models placing more weight on one of the classes (PTSD or non-PTSD) were trained, and prototypes representing subgroups in the classes constructed. Results The first model was the most relevant for distinguishing PTSD diagnosis from comorbid diagnoses such as neurotic, stress-related, and somatoform disorders. The second model pointed out the scores obtained on the Clinician-administered PTSD Scale (CAPS) and additional Positive and Negative Syndrome Scale (PANSS) scales, together with comorbid diagnoses of neurotic, stress-related, and somatoform disorders as most relevant. In the third model, psychiatric scales and the same group of comorbid diagnoses were found to be most relevant. Specialized models placing more weight on either the PTSD or non-PTSD class were able to better predict their targeted diagnoses at some expense of overall accuracy. Class subgroup prototypes mainly differed in values achieved on psychiatric scales and frequency of comorbid diagnoses. Conclusion Our work demonstrated the applicability of data mining methods for the analysis of structured psychiatric data for PTSD. In all models, the group of comorbid diagnoses, including neurotic, stress-related, and somatoform disorders, surfaced as important. The important attributes of the data, based on the structured psychiatric interview, were the current symptoms and conditions such as presence and degree of disability, hospitalizations, and duration of military service during the war, while CAPS total scores, symptoms of increased arousal, and PANSS additional criteria scores were indicated as relevant from the psychiatric symptom scales. PMID:17436383
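A hedged sketch of the modeling strategy, using scikit-learn rather than the authors' tooling; the synthetic feature matrix stands in for the psychiatric-scale scores, and `class_weight` illustrates the class-weighted variants mentioned above:

```python
# Random forest on psychiatric-scale-like features, with class weights used to place
# more emphasis on either the PTSD or the non-PTSD class (illustrative data only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_patients, n_features = 102, 12                    # e.g., CAPS/PANSS/HAMA/HAMD subscores
X = rng.normal(size=(n_patients, n_features))
y = np.array([1] * 51 + [0] * 51)                   # 51 PTSD, 51 non-PTSD
X[y == 1] += 0.6                                    # injected group difference

for weights in (None, {1: 2, 0: 1}, {1: 1, 0: 2}):  # balanced / favor PTSD / favor non-PTSD
    clf = RandomForestClassifier(n_estimators=300, class_weight=weights, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"class_weight={weights}: mean CV accuracy = {acc:.2f}")
```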
Chen, Hong-Lin; Cao, Ying-Juan; Zhang, Wei; Wang, Jing; Huai, Bao-Sha
2017-02-01
The inter-rater reliability of the Braden scale is not satisfactory. We modified the Braden scale by defining the nutrition subscale based on serum albumin (the Braden(ALB) scale), then assessed its validity and reliability in hospitalized patients. We designed a retrospective study for validity analysis, and a prospective study for reliability analysis. Receiver operating curve (ROC) and area under the curve (AUC) were used to evaluate the predictive validity. Intra-class correlation coefficient (ICC) was used to investigate the inter-rater reliability. Two thousand five hundred twenty-five patients were included for validity analysis, 76 patients (3.0%) developed pressure ulcer. Positive correlation was found between serum albumin and nutrition score in the Braden scale (Spearman's coefficient 0.2203, P<0.0001). The AUCs for the Braden scale and Braden(ALB) scale predicting pressure ulcer risk were 0.813 (95% CI 0.797-0.828; P<0.0001), and 0.859 (95% CI 0.845-0.872; P<0.0001), respectively. The Braden(ALB) scale appeared more valid than the Braden scale, although the difference was not statistically significant (z=1.860, P=0.0628). In different age subgroups, the Braden(ALB) scale also seems more valid than the original Braden scale, but no statistically significant differences were found (P>0.05). The inter-rater reliability study showed that the ICC value increased by 45.9% for nutrition and by 4.3% for the total score. The Braden(ALB) scale has similar validity compared with the original Braden scale for hospitalized patients. However, the inter-rater reliability was significantly increased. Copyright © 2016 Elsevier Inc. All rights reserved.
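A minimal sketch of the AUC comparison, with synthetic scores in place of the clinical data; the z-test for comparing correlated AUCs used in the paper (e.g., a DeLong-type test) is not reproduced:

```python
# Compute ROC AUCs for two risk scales against the pressure-ulcer outcome.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n, prevalence = 2000, 0.03
outcome = rng.random(n) < prevalence                        # pressure ulcer yes/no
braden = rng.normal(18, 3, n) - 4 * outcome                 # lower score = higher risk
braden_alb = braden - 1.5 * outcome + rng.normal(0, 1, n)   # hypothetically sharper scale

# Braden-type scales decrease with risk, so negate them for AUC computation
print("AUC Braden     :", round(roc_auc_score(outcome, -braden), 3))
print("AUC Braden(ALB):", round(roc_auc_score(outcome, -braden_alb), 3))
```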
Moon, Chung-Man; Shin, Il-Seon; Jeong, Gwang-Woo
2017-02-01
Background Non-invasive imaging markers can be used to diagnose Alzheimer's disease (AD) in its early stages, but an optimized quantification analysis to measure brain integrity has been less well studied. Purpose To evaluate white matter volume change and its correlation with neuropsychological scales in patients with AD using a diffeomorphic anatomical registration through exponentiated Lie algebra (DARTEL)-based voxel-based morphometry (VBM). Material and Methods The 21 participants comprised 11 patients with AD and 10 age-matched healthy controls. High-resolution magnetic resonance imaging (MRI) data were processed by VBM analysis based on the DARTEL algorithm. Results The patients showed significant white matter volume reductions in the posterior limb of the internal capsule, cerebral peduncle of the midbrain, and parahippocampal gyrus compared to healthy controls. In correlation analysis, the parahippocampal volume was positively correlated with the Korean Mini-Mental State Examination score in AD. Conclusion This study provides evidence for localized white matter volume deficits in conjunction with cognitive dysfunction in AD. These findings are helpful for understanding the neuroanatomical mechanisms of AD and for improving the diagnostic accuracy for AD.
Development and validation of the Simulation Learning Effectiveness Scale for nursing students.
Pai, Hsiang-Chu
2016-11-01
To develop and validate the Simulation Learning Effectiveness Scale, which is based on Bandura's social cognitive theory. A simulation programme is a significant teaching strategy for nursing students. Nevertheless, there are few evidence-based instruments that validate the effectiveness of simulation learning in Taiwan. This is a quantitative descriptive design. In Study 1, a nonprobability convenience sample of 151 student nurses completed the Simulation Learning Effectiveness Scale. Exploratory factor analysis was used to examine the factor structure of the instrument. In Study 2, which involved 365 student nurses, confirmatory factor analysis and structural equation modelling were used to analyse the construct validity of the Simulation Learning Effectiveness Scale. In Study 1, exploratory factor analysis yielded three components: self-regulation, self-efficacy and self-motivation. The three factors explained 29·09, 27·74 and 19·32% of the variance, respectively. The final 12-item instrument with the three factors explained 76·15% of variance. Cronbach's alpha was 0·94. In Study 2, confirmatory factor analysis identified a second-order factor termed Simulation Learning Effectiveness Scale. Goodness-of-fit indices showed an acceptable fit overall with the full model (χ²/df (51) = 3·54, comparative fit index = 0·96, Tucker-Lewis index = 0·95 and standardised root-mean-square residual = 0·035). In addition, teacher's competence in encouraging learning, and self-reflection and insight, were significantly and positively associated with the Simulation Learning Effectiveness Scale; teacher's competence in encouraging learning was also significantly and positively associated with self-reflection and insight. Overall, these variables explained 21·9% of the variance in the students' learning effectiveness. The Simulation Learning Effectiveness Scale is a reliable and valid means to assess simulation learning effectiveness for nursing students. The Simulation Learning Effectiveness Scale can be used to examine nursing students' learning effectiveness and serve as a basis to improve students' learning efficiency through simulation programmes. Future implementation research that focuses on the relationship between learning effectiveness and nursing competence in nursing students is recommended. © 2016 John Wiley & Sons Ltd.
Testing of the SEE and OEE post-hip fracture.
Resnick, Barbara; Orwig, Denise; Zimmerman, Sheryl; Hawkes, William; Golden, Justine; Werner-Bronzert, Michelle; Magaziner, Jay
2006-08-01
The purpose of this study was to test the reliability and validity of the Self-Efficacy for Exercise (SEE) and the Outcome Expectations for Exercise (OEE) scales in a sample of 166 older women post-hip fracture. There was some evidence of validity of the SEE and OEE based on confirmatory factor analysis, Rasch model testing, and criterion-based and convergent validity, as well as evidence of internal consistency based on alpha coefficients and separation indices, and of reliability based on R² estimates. Rasch model testing demonstrated that some items had high variability. Based on these findings, suggestions are made for how items could be revised and the scales improved for future use.
Structural similitude and scaling laws for laminated beam-plates
NASA Technical Reports Server (NTRS)
Simitses, George J.; Rezaeepazhand, Jalil
1992-01-01
The establishment of similarity conditions between two structural systems is discussed. Similarity conditions provide the relationship between a scale model and its prototype and can be used to predict the behavior of the prototype by extrapolating the experimental data of the corresponding small-scale model. Since satisfying all the similarity conditions simultaneously is difficult or even impossible, distorted models with partial similarity (with at least one similarity condition relaxed) are more practical. Establishing similarity conditions based on both dimensional analysis and direct use of governing equations is discussed, and the possibility of designing distorted models is investigated. The method is demonstrated through analysis of the cylindrical bending of orthotropic laminated beam-plates subjected to transverse line loads.
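A minimal worked example of deriving a similarity condition directly from a governing equation, using a simple isotropic Euler-Bernoulli beam rather than the paper's orthotropic laminated beam-plate; the scale-factor notation is assumed:

```latex
% Illustrative similitude derivation (simple Euler-Bernoulli beam, not the paper's laminate).
% With prototype (p) and model (m) quantities related by scale factors
% $\lambda_w = w_p/w_m$, $\lambda_L = L_p/L_m$, $\lambda_D = D_p/D_m$, $\lambda_q = q_p/q_m$,
% the governing equation
\[
  D\,\frac{d^{4}w}{dx^{4}} = q(x)
\]
% must hold for both systems. Expressing the prototype variables through the model
% variables gives
\[
  \frac{\lambda_D\,\lambda_w}{\lambda_L^{4}}\, D_m\,\frac{d^{4}w_m}{dx_m^{4}}
  = \lambda_q\, q_m(x_m)
  \quad\Longrightarrow\quad
  \lambda_w = \frac{\lambda_q\,\lambda_L^{4}}{\lambda_D},
\]
% so complete similarity imposes this single condition; a distorted model deliberately
% relaxes one or more such conditions.
```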
NASA Technical Reports Server (NTRS)
Aiken, Alexander
2001-01-01
The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.
Scaling analysis for the direct reactor auxiliary cooling system for FHRs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lv, Q.; Kim, I. H.; Sun, X.
2015-04-01
The Direct Reactor Auxiliary Cooling System (DRACS) is a passive residual heat removal system proposed for the Fluoride-salt-cooled High-temperature Reactor (FHR) that combines the coated particle fuel and graphite moderator with a liquid fluoride salt as the coolant. The DRACS features three natural circulation/convection loops that rely on buoyancy as the driving force and are coupled via two heat exchangers, namely, the DRACS heat exchanger and the natural draft heat exchanger. A fluidic diode is employed to minimize the parasitic flow into the DRACS primary loop and correspondingly the heat loss to the DRACS during reactor normal operation, and to activate the DRACS in accidents when the reactor is shut down. While the DRACS concept has been proposed, there are no actual prototypic DRACS systems for FHRs built or tested in the literature. In this paper, a detailed scaling analysis for the DRACS is performed, which will provide guidance for the design of scaled-down DRACS test facilities. Based on the Boussinesq assumption and one-dimensional flow formulation, the governing equations are non-dimensionalized by introducing appropriate dimensionless parameters. The key dimensionless numbers that characterize the DRACS system are obtained from the non-dimensional governing equations. Based on the dimensionless numbers and non-dimensional governing equations, similarity laws are proposed. In addition, a scaling methodology has been developed, which consists of a core scaling and a loop scaling. The consistency between the core and loop scaling is examined via the reference volume ratio, which can be obtained from both the core and loop scaling processes. The scaling methodology and similarity laws have been applied to obtain a scientific design of a scaled-down high-temperature DRACS test facility.
Feng, Shen; Wenhan, Jiang
2002-06-10
Phase-structure and aperture-averaged slope-correlation functions with a finite outer scale are derived based on the Taylor hypothesis and a generalized spectrum, such as the von Kármán model. The effects of the finite outer scale on measuring and determining the character of atmospheric-turbulence statistics are shown, especially for an approximately 4-m class telescope and subaperture. The phase structure function and atmospheric coherence length based on the Kolmogorov model are approximations of the formalism we have derived. The analysis shows that it cannot be determined whether the deviation from the power-law parameter of Kolmogorov turbulence is caused by real variations of the spectrum or by the effect of the finite outer scale.
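A hedged numerical sketch of a von Kármán-type phase structure function computed from the spectrum, with textbook constants and assumed parameter values rather than the paper's exact formalism:

```python
# D_phi(r) = 4*pi * Integral [1 - J0(kappa*r)] * Phi(kappa) * kappa dkappa, with a
# von Karman-type spectrum Phi(kappa) ~ 0.023 r0^(-5/3) (kappa^2 + kappa0^2)^(-11/6).
# The Kolmogorov limit (infinite outer scale) is D ~ 6.88 (r/r0)^(5/3).
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import j0

r0, L0 = 0.15, 4.0                       # coherence length [m] and outer scale [m] (assumed)
kappa0 = 2 * np.pi / L0
kappa = np.linspace(1e-4, 2000.0, 400_000)

def phi_vk(k):
    """von Karman-type phase spectrum; 0.023 is the usual Kolmogorov constant."""
    return 0.023 * r0 ** (-5.0 / 3.0) * (k**2 + kappa0**2) ** (-11.0 / 6.0)

def structure_function(r):
    integrand = (1.0 - j0(kappa * r)) * phi_vk(kappa) * kappa
    return 4.0 * np.pi * trapezoid(integrand, kappa)

for r in (0.05, 0.2, 1.0, 4.0):
    kolmogorov = 6.88 * (r / r0) ** (5.0 / 3.0)     # infinite outer-scale limit
    print(f"r = {r:4.2f} m: von Karman D_phi = {structure_function(r):8.3f}, "
          f"Kolmogorov approx = {kolmogorov:8.3f}")
```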
Preface Sections in English and Arabic Linguistics Books: A Rhetorico-Cultural Analysis
ERIC Educational Resources Information Center
Al-Zubaidi, Nassier A. G.; Jasim, Tahani Awad
2016-01-01
The present paper is a genre analysis of linguistics book prefaces in English and Arabic. Following Swales' (1990) genre framework, this study is a small-scale generic analysis of 80 preface texts, equally divided into 40 texts from English and Arabic. The corpus analysis revealed that to perform its communicative function, the genre of the…
A Pilot Study: Testing of the Psychological Conditions Scale Among Hospital Nurses.
Fountain, Donna M; Thomas-Hawkins, Charlotte
2016-11-01
The aim of this study was to test the reliability and validity of the Psychological Conditions Scale (PCS), a measure of drivers of engagement in hospital-based nurses. Research suggests drivers of engagement are positive links to patient, employee, and hospital outcomes. Although this scale has been used in other occupations, it has not been tested in nursing. A cross-sectional, methodological study using a convenience sample of 200 nurses in a large Magnet® hospital in New Jersey. Cronbach's α's ranged from .64 to .95. Principal components exploratory factor analysis with oblique rotation revealed that 13 items loaded unambiguously in 3 domains and explained 76% of the variance. Mean PCS scores ranged from 3.62 to 4.68 on a 5-point Likert scale. The scale is an adequate measure of drivers of engagement in hospital-based nurses. Leadership efforts to promote the facilitators of engagement are recommended.
Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.
Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M
2007-01-01
Recurrence plot analysis is a useful non-linear analysis tool. There are still no well-formalised procedures for carrying out this analysis on measured physiological data, and systemising analysis is often difficult. In this paper, recurrence-based embedding is compared to radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust method of carrying out a recurrence analysis when attractor size is likely to be different between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by average respiratory period, allows the accurate discrimination of active sleep from quiet sleep states (AUC=0.975, Sn=0.98, Sp=0.94).
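A short sketch contrasting radius-based and recurrence-based thresholding on a time-delay embedding; the synthetic breathing-like signal and the embedding parameters are assumptions:

```python
# Either fix a radius and measure the resulting recurrence rate (radius-based), or fix a
# target recurrence rate and derive the radius from the distance distribution
# (recurrence-based), which is robust to differing attractor sizes.
import numpy as np
from scipy.spatial.distance import pdist

def embed(x, dim=3, lag=5):
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])

rng = np.random.default_rng(8)
t = np.arange(2000)
breathing = np.sin(2 * np.pi * t / 40) + 0.2 * rng.normal(size=t.size)

d = pdist(embed(breathing))                  # pairwise distances between embedded points

radius = 0.5                                 # (a) radius-based threshold
print("recurrence rate at fixed radius:", round(float(np.mean(d < radius)), 3))

target_rr = 0.05                             # (b) recurrence-based threshold (5% recurrence)
adaptive_radius = np.quantile(d, target_rr)
print("radius at fixed 5% recurrence  :", round(float(adaptive_radius), 3))
```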
NASA Astrophysics Data System (ADS)
Dowell, M.; Moore, T.; Follows, M.; Dutkiewicz, S.
2006-12-01
In recent years there has been significant progress both in the use of satellite ocean colour remote sensing and coupled hydrodynamic-biological models for producing maps of different dominant phytoplankton groups in the global ocean. In parallel to these initiatives, there is ongoing research, largely following on from Alan Longhurst's seminal work, on defining a template of distinct ecological and biogeochemical provinces for the oceans based on their physical and biochemical characteristics. For these products and models to be of maximum use in their subsequent inclusion in re-analysis and climate-scale models, there is a need to understand how the "observed" distributions of dominant phytoplankton (realized niche) coincide with the environmental constraints under which they occur (fundamental niche). In the current paper, we base our analysis on recently published results on the distribution of dominant phytoplankton species at the global scale, derived both from satellite and model analyses. Furthermore, we will present research in defining biogeochemical provinces using satellite and model data inputs and a fuzzy-logic-based approach. This will be compared with ongoing modelling efforts, which include competitive exclusion and are therefore compatible with the definition of the realized ecological niche, to define the emergent distribution of dominant phytoplankton species. Ultimately we investigate the coherence of these two distinct approaches in studying phytoplankton distributions and propose the significance of this in the context of modelling and analysis at various scales.
Speech transformations based on a sinusoidal representation
NASA Astrophysics Data System (ADS)
Quatieri, T. E.; McAulay, R. J.
1986-05-01
A new speech analysis/synthesis technique is presented which provides the basis for a general class of speech transformations, including time-scale modification, frequency scaling, and pitch modification. These modifications can be performed with a time-varying change, permitting continuous adjustment of a speaker's fundamental frequency and rate of articulation. The method is based on a sinusoidal representation of the speech production mechanism that has been shown to produce synthetic speech that preserves the waveform shape and is essentially perceptually indistinguishable from the original. Although the analysis/synthesis system was originally designed for single-speaker signals, it is equally capable of recovering and modifying nonspeech signals such as music, multiple speakers, marine biologic sounds, and speakers in the presence of interferences such as noise and musical backgrounds.
Impact of aggregation on scaling behavior of Internet backbone traffic
NASA Astrophysics Data System (ADS)
Zhang, Zhi-Li; Ribeiro, Vinay J.; Moon, Sue B.; Diot, Christophe
2002-07-01
We study the impact of aggregation on the scaling behavior of Internet backbone traffic, based on traces collected from OC3 and OC12 links in a tier-1 ISP. We make two striking observations regarding the sub-second small time scaling behaviors of Internet backbone traffic: 1) for a majority of these traces, the Hurst parameters at small time scales (1ms - 100ms) are fairly close to 0.5. Hence, the traffic at these time scales is nearly uncorrelated; 2) the scaling behaviors at small time scales are link-dependent, and stay fairly invariant over changing utilization and time. To understand the scaling behavior of network traffic, we develop analytical models and employ them to demonstrate how traffic composition -- aggregation of traffic with different characteristics -- affects the small-time scalings of network traffic. The degree of aggregation and burst correlation structure are two major factors in traffic composition. Our trace-based data analysis confirms this. Furthermore, we discover that traffic composition on a backbone link stays fairly consistent over time and changing utilization, which we believe is the cause for the invariant small-time scalings we observe in the traces.
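A hedged sketch of one common small-time-scale Hurst estimate (the aggregated-variance method), with synthetic per-interval counts in place of the OC3/OC12 traces:

```python
# Aggregated-variance Hurst estimate: average per-interval counts over blocks of size m
# and fit Var(X^(m)) ~ m^(2H - 2); uncorrelated counts give H close to 0.5.
import numpy as np

rng = np.random.default_rng(9)
counts = rng.poisson(lam=200, size=2**16).astype(float)    # stand-in for 1 ms byte counts

block_sizes = [1, 2, 4, 8, 16, 32, 64]
variances = []
for m in block_sizes:
    n = len(counts) // m
    aggregated = counts[:n * m].reshape(n, m).mean(axis=1)
    variances.append(aggregated.var())

slope = np.polyfit(np.log(block_sizes), np.log(variances), 1)[0]
hurst = 1 + slope / 2
print("estimated Hurst parameter:", round(hurst, 3))       # ~0.5 for uncorrelated counts
```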
ERIC Educational Resources Information Center
Walsh, Kerryann; Rassafiani, Mehdi; Mathews, Ben; Farrell, Ann; Butler, Des
2012-01-01
This paper presents an evaluation of an instrument to measure teachers' attitudes toward reporting child sexual abuse and discusses the instrument's merit for research into reporting practice. Based on responses from 444 Australian teachers, the Teachers' Reporting Attitude Scale for Child Sexual Abuse was evaluated using exploratory factor…
The Internal Efficiency in Higher Education: An Analysis Based on Economies of Scope
ERIC Educational Resources Information Center
Gang, Cheng; Keming, Wu
2008-01-01
Among the studies of the internal efficiency in higher education, most have focused on the scale of university (the economies of scale), but little on internal operating efficiency in higher education, especially on the combined efficiency of outputs (the economies of scope). There are few theoretical discussions or experimental research on…
Urban forest health monitoring: large-scale assessments in the United States
Anne Buckelew Cumming; Daniel B. Twardus; David J. Nowak
2008-01-01
The U.S. Department of Agriculture, Forest Service (USFS), together with state partners, developed methods to monitor urban forest structure, function, and health at a large statewide scale. Pilot studies have been established in five states using protocols based on USFS Forest Inventory and Analysis and Forest Health Monitoring program data collection standards....
ERIC Educational Resources Information Center
Sointu, Erkko Tapio; Savolainen, Hannu; Lambert, Matthew C.; Lappalainen, Kristiina; Epstein, Michael H.
2014-01-01
When rating scales are used in different countries, thorough investigation of the psychometric properties is needed. We examined the internal structure of the Finnish translated Behavioral and Emotional Rating Scale-2 (BERS-2) using Rasch and confirmatory factor analysis approaches with a sample of youth, parents, and teachers. The results…
Tarisa K. Zimet; Jonathan E. Martin
2003-01-01
Meteorological assessment of wildfire risk has traditionally involved identification of several synoptic types empirically determined to influence wildfire spread. Such weather types are characterized by identifiable synoptic-scale structures and processes. Schroeder et al. (1964) identified four recognizable synoptic-scale patterns that contribute most frequently to...
Hierarchical den selection of Canada lynx in western Montana
John R. Squires; Nicholas J. Decesare; Jay A. Kolbe; Leonard F. Ruggiero
2008-01-01
We studied den selection of Canada lynx (Lynx canadensis; hereafter lynx) at multiple ecological scales based on 57 dens from 19 females located in western Montana, USA, between 1999 and 2006. We considered 3 spatial scales in this analysis, including den site (11-m-radius circle surrounding dens), den area (100-m-radius circle), and den environ (1-...
An Analysis of Several Instruments Measuring "Nature of Science" Objectives
ERIC Educational Resources Information Center
Doran, Rodney L.; And Others
1974-01-01
Reported is an investigation of the relationship among three selected instruments based on the responses of a sample of high school students. The instruments were the Nature of Science Scale (NOSS), the Science Support Scale (SSS), and the Test on the Social Aspects of Science (TSAS). All purport to measure "nature of science"…
ERIC Educational Resources Information Center
Kieffer, Michael J.; Lesaux, Nonie K.; Rivera, Mabel; Francis, David J.
2009-01-01
Including English language learners (ELLs) in large-scale assessments raises questions about the validity of inferences based on their scores. Test accommodations for ELLs are intended to reduce the impact of limited English proficiency on the assessment of the target construct, most often mathematic or science proficiency. This meta-analysis…
Testing asteroseismic radii of dwarfs and subgiants with Kepler and Gaia
NASA Astrophysics Data System (ADS)
Sahlholdt, C. L.; Silva Aguirre, V.; Casagrande, L.; Mosumgaard, J. R.; Bojsen-Hansen, M.
2018-05-01
We test asteroseismic radii of Kepler main-sequence and subgiant stars by deriving their parallaxes which are compared with those of the first Gaia data release. We compute radii based on the asteroseismic scaling relations as well as by fitting observed oscillation frequencies to stellar models for a subset of the sample, and test the impact of using effective temperatures from either spectroscopy or the infrared flux method. An offset of 3 per cent, showing no dependency on any stellar parameters, is found between seismic parallaxes derived from frequency modelling and those from Gaia. For parallaxes based on radii from the scaling relations, a smaller offset is found on average; however, the offset becomes temperature dependent which we interpret as problems with the scaling relations at high stellar temperatures. Using the hotter infrared flux method temperature scale, there is no indication that radii from the scaling relations are inaccurate by more than about 5 per cent. Taking the radii and masses from the modelling of individual frequencies as reference values, we seek to correct the scaling relations for the observed temperature trend. This analysis indicates that the scaling relations systematically overestimate radii and masses at high temperatures, and that they are accurate to within 5 per cent in radius and 13 per cent in mass for main-sequence stars with temperatures below 6400 K. However, further analysis is required to test the validity of the corrections on a star-by-star basis and for more evolved stars.
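For reference, the commonly used form of the asteroseismic scaling relations mentioned above (solar reference values are approximate and vary slightly between studies):

```latex
\[
  \frac{R}{R_\odot} \simeq
  \left(\frac{\nu_{\max}}{\nu_{\max,\odot}}\right)
  \left(\frac{\Delta\nu}{\Delta\nu_\odot}\right)^{-2}
  \left(\frac{T_{\mathrm{eff}}}{T_{\mathrm{eff},\odot}}\right)^{1/2},
  \qquad
  \frac{M}{M_\odot} \simeq
  \left(\frac{\nu_{\max}}{\nu_{\max,\odot}}\right)^{3}
  \left(\frac{\Delta\nu}{\Delta\nu_\odot}\right)^{-4}
  \left(\frac{T_{\mathrm{eff}}}{T_{\mathrm{eff},\odot}}\right)^{3/2},
\]
% with approximate solar references $\nu_{\max,\odot} \approx 3090\,\mu\mathrm{Hz}$,
% $\Delta\nu_\odot \approx 135\,\mu\mathrm{Hz}$ and $T_{\mathrm{eff},\odot} \approx 5777\,\mathrm{K}$.
```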
Elders Health Empowerment Scale: Spanish adaptation and psychometric analysis.
Serrani Azcurra, Daniel Jorge Luis
2014-01-01
Empowerment refers to patient skills that allow them to become primary decision-makers in control of daily self-management of health problems. As important as the concept is, particularly for elders with chronic diseases, few available instruments have been validated for use with Spanish-speaking people. To translate and adapt the Health Empowerment Scale (HES) for a Spanish-speaking older adult sample and perform its psychometric validation. The HES was adapted based on the Diabetes Empowerment Scale-Short Form. Where "diabetes" was mentioned in the original tool, it was replaced with "health" terms to cover all kinds of conditions that could affect health empowerment. Statistical and psychometric analyses were conducted on 648 urban-dwelling seniors. The HES had an acceptable internal consistency with a Cronbach's α of 0.89. The convergent validity was supported by significant Pearson correlation coefficients between the HES total and item scores and the General Self Efficacy Scale (r= 0.77), Swedish Rheumatic Disease Empowerment Scale (r= 0.69) and Making Decisions Empowerment Scale (r= 0.70). Construct validity was evaluated using item analysis, split-half testing and corrected item-total correlation coefficients, with good internal consistency (α > 0.8). The content validity was supported by Scale and Item Content Validity Indices of 0.98 and 1.0, respectively. The HES had acceptable face validity and reliability coefficients, which, added to its ease of administration and users' unbiased comprehension, could make it a suitable tool for evaluating elders' outpatient empowerment-based medical education programs.
Bethge, Matthias; Borngräber, Yvonne
2015-03-18
Under conditions of gender-specific division of paid employment and unpaid childcare and housework, rising employment of women increases the likelihood that they will be faced with work-family conflicts. As recent research indicates, such conflicts might also contribute to musculoskeletal disorders. However, research in patient samples is needed to clarify how important these conflicts are for relevant health-related measures of functioning (e.g., work ability). We therefore examined, in a sample of women with chronic musculoskeletal disorders, the indirect and direct associations between the indicators of work-family conflicts and self-reported work ability as well as whether the direct effects remained significant after adjustment for covariates. A cross-sectional questionnaire-based study was conducted. Participants were recruited from five rehabilitation centers. Work-family conflicts were assessed by four scales referring to time- and strain-based work interference with family (WIF) and family interference with work (FIW). Self-reported work ability was measured by the Work Ability Index. A confirmatory factor analysis was performed to test the anticipated four-factor structure of the work-family conflict measure. Direct and indirect associations between work-family conflict indicators and self-reported work ability were examined by path model analysis. Multivariate regression models were performed to calculate adjusted estimates of the direct effects of strain-based WIF and FIW on work ability. The study included 351 employed women. The confirmatory factor analysis provided support for the anticipated four-factor structure of the work-family conflict measure. The path model analysis identified direct effects of both strain-based scales on self-reported work ability. The time-based scales were indirectly associated with work ability via the strain-based scales. Adjusted regression analyses showed that a five-point increase in strain-based WIF or FIW was associated with a four- and two-point decrease in self-reported work ability, respectively. The standardized regression coefficients were β = 0.35 and β = 0.12. Our findings indicate that work-family conflicts are associated with poor work ability in female patients with chronic musculoskeletal disorders. However, longitudinal research is needed to establish a causal relationship. Better compatibility of work and family life might be an environmental facilitator of better rehabilitation outcomes in female patients with musculoskeletal disorders.
Rasch analysis of the carers quality of life questionnaire for parkinsonism.
Pillas, Marios; Selai, Caroline; Schrag, Anette
2017-03-01
To assess the psychometric properties of the Carers Quality of Life Questionnaire for Parkinsonism using a Rasch modeling approach and determine the optimal cut-off score. We performed a Rasch analysis of the survey answers of 430 carers of patients with atypical parkinsonism. All of the scale items demonstrated acceptable goodness of fit to the Rasch model. The scale was unidimensional and no notable differential item functioning was detected in the items regarding age and disease type. Rating categories were functioning adequately in all scale items. The scale had high reliability (.95) and construct validity and a high degree of precision, distinguishing between 5 distinct groups of carers with different levels of quality of life. A cut-off score of 62 was found to have the optimal screening accuracy based on Hospital Anxiety and Depression Scale subscores. The results suggest that the Carers Quality of Life Questionnaire for Parkinsonism is a useful scale to assess carers' quality of life and allows analyses requiring interval scaling of variables. © 2016 International Parkinson and Movement Disorder Society.
Barbosa, Daniel C; Roupar, Dalila B; Ramos, Jaime C; Tavares, Adriano C; Lima, Carlos S
2012-01-11
Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to reach. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
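For orientation, the classification pipeline can be illustrated with a much simplified sketch: plain gray-level co-occurrence (GLCM) texture measures stand in for the paper's wavelet-domain, higher-order-moment features, followed by PCA and a small neural network. The inputs `images` and `labels` are hypothetical, and all parameters are illustrative.

    # Simplified texture-classification sketch (not the paper's exact feature set).
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    def texture_features(gray_u8):
        """Second-order texture measures from a co-occurrence matrix (8-bit image)."""
        glcm = graycomatrix(gray_u8, distances=[1, 2], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "correlation", "energy", "homogeneity")
        return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

    def train_classifier(images, labels):
        # images: list of 8-bit grayscale frames; labels: 0 = normal, 1 = lesion
        X = np.array([texture_features(im) for im in images])
        clf = make_pipeline(PCA(n_components=8),
                            MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000))
        return clf.fit(X, labels)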
Global point signature for shape analysis of carpal bones
NASA Astrophysics Data System (ADS)
Chaudhari, Abhijit J.; Leahy, Richard M.; Wise, Barton L.; Lane, Nancy E.; Badawi, Ramsey D.; Joshi, Anand A.
2014-02-01
We present a method based on spectral theory for the shape analysis of carpal bones of the human wrist. We represent the cortical surface of the carpal bone in a coordinate system based on the eigensystem of the two-dimensional Helmholtz equation. We employ a metric—global point signature (GPS)—that exploits the scale and isometric invariance of eigenfunctions to quantify overall bone shape. We use a fast finite-element-method to compute the GPS metric. We capitalize upon the properties of GPS representation—such as stability, a standard Euclidean (ℓ2) metric definition, and invariance to scaling, translation and rotation—to perform shape analysis of the carpal bones of ten women and ten men from a publicly-available database. We demonstrate the utility of the proposed GPS representation to provide a means for comparing shapes of the carpal bones across populations.
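The GPS construction can be sketched with a simple graph Laplacian in place of the authors' finite-element Laplace-Beltrami discretization; `verts` and `faces` are assumed mesh arrays, and `k` is the number of eigenfunctions retained.

    # Illustrative GPS embedding sketch using a combinatorial graph Laplacian.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    def gps_embedding(verts, faces, k=15):
        n = len(verts)
        # adjacency from mesh edges
        edges = np.vstack([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [2, 0]]])
        A = sp.coo_matrix((np.ones(len(edges)), (edges[:, 0], edges[:, 1])), shape=(n, n))
        A = ((A + A.T) > 0).astype(float)
        L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A   # graph Laplacian
        # smallest eigenpairs via shift-invert; drop the constant (zero) mode
        vals, vecs = eigsh(L.tocsc(), k=k + 1, sigma=-1e-8, which="LM")
        order = np.argsort(vals)
        vals, vecs = vals[order][1:], vecs[:, order][:, 1:]
        return vecs / np.sqrt(vals)   # GPS coordinates: phi_i(x) / sqrt(lambda_i)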
NASA Astrophysics Data System (ADS)
Apriani, Lestari; Satriana, Joshua; Aulian Chalik, Citra; Syahputra Mulyana, Reza; Hafidz, Muhammad; Suryantini
2017-12-01
Volcanostratigraphic study supports geothermal exploration during preliminary surveys. It is important for identifying volcanic eruption centers, which indicate potential areas of geothermal heat sources. The purpose of the volcanostratigraphic study in the research area is to distinguish the characteristics of the volcanic eruption products that construct the volcanic body. The Arjuno-Welirang volcanostratigraphic analysis is based on topographic maps of the Malang sheet at 1:100,000 and 1:50,000 scales and on a geological map. Based on the delineation of ridges and rivers, we determined five crowns, three hummocks, one brigade and one super brigade. The crowns consist of Ringgit, Welirang, Arjuno, Kawi, and Penanggungan; the hummocks comprise Kembar III, Kembar II, and Kembar I; the brigade is Arjuno-Welirang; and the super brigade is Tengger. Topographic map interpretation and geothermal prospect evaluation show that the Arjuno-Welirang prospect area has good geothermal resource potential.
Evaluating Mixture Modeling for Clustering: Recommendations and Cautions
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2011-01-01
This article provides a large-scale investigation into several of the properties of mixture-model clustering techniques (also referred to as latent class cluster analysis, latent profile analysis, model-based clustering, probabilistic clustering, Bayesian classification, unsupervised learning, and finite mixture models; see Vermunt & Magdison,…
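As a concrete illustration of model-based clustering (not taken from the article), the sketch below fits Gaussian mixtures with an increasing number of latent classes and keeps the BIC-optimal model; `X` is an assumed observations-by-variables array.

    # Finite Gaussian mixture clustering with BIC-based model selection (illustrative).
    from sklearn.mixture import GaussianMixture

    def fit_mixture(X, k_max=8, seed=0):
        best = None
        for k in range(1, k_max + 1):
            gm = GaussianMixture(n_components=k, covariance_type="full",
                                 n_init=5, random_state=seed).fit(X)
            if best is None or gm.bic(X) < best.bic(X):
                best = gm
        return best, best.predict(X)   # fitted model and cluster labels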
NASA Astrophysics Data System (ADS)
Krajina, Brad A.; Kocherlakota, Lakshmi S.; Overney, René M.
2014-10-01
The energetics involved in the bonding fluctuations between nanometer-sized silicon dioxide (SiO2) probes and highly oriented pyrolytic graphite (HOPG) and molybdenum disulfide (MoS2) could be quantified directly and locally on the submicron scale via a time-temperature superposition analysis of the lateral forces between scanning force microscopy silicon dioxide probes and inorganic sample surfaces. The so-called "intrinsic friction analysis" (IFA) provided direct access to the Hamaker constants for HOPG and MoS2, as well as the control sample, calcium fluoride (CaF2). The use of scanning probe enables nanoscopic analysis of bonding fluctuations, thereby overcoming challenges associated with larger scale inhomogeneity and surface roughness common to conventional techniques used to determine surface free energies and dielectric properties. A complementary numerical analysis based on optical and electron energy loss spectroscopy and the Lifshitz quantum electrodynamic theory of van der Waals interactions is provided and confirms quantitatively the IFA results.
2014-01-01
To describe flow or transport phenomena in porous media, relations between aquifer hydraulic conductivity and effective porosity can prove useful, avoiding the need to perform expensive and time-consuming measurements. Practical applications generally require the determination of this parameter at field scale, while most of the empirical and semiempirical formulas, based on grain-size analysis and allowing determination of the hydraulic conductivity from the porosity, are related to the laboratory scale and thus are not representative of the aquifer volumes to which one refers. Therefore, following the grain-size distribution methodology, a new experimental relation between hydraulic conductivity and effective porosity, representative of aquifer volumes at field scale, is given for a confined aquifer. The experimental values used to determine this law were obtained for both parameters using only field measurement methods. Although strictly valid only for the investigated aquifer, the experimental results can give useful suggestions for other alluvial aquifers with analogous grain-size distribution characteristics. Limited to the investigated range, a useful comparison with the best-known empirical formulas based on grain-size analysis was carried out. The experimental data also allowed investigation of the existence of a scaling behaviour for both parameters considered. PMID:25180202
Dual-scale Galerkin methods for Darcy flow
NASA Astrophysics Data System (ADS)
Wang, Guoyin; Scovazzi, Guglielmo; Nouveau, Léo; Kees, Christopher E.; Rossi, Simone; Colomés, Oriol; Main, Alex
2018-02-01
The discontinuous Galerkin (DG) method has found widespread application in elliptic problems with rough coefficients, of which the Darcy flow equations are a prototypical example. One of the long-standing issues of DG approximations is the overall computational cost, and many different strategies have been proposed, such as the variational multiscale DG method, the hybridizable DG method, the multiscale DG method, the embedded DG method, and the Enriched Galerkin method. In this work, we propose a mixed dual-scale Galerkin method, in which the degrees-of-freedom of a less computationally expensive coarse-scale approximation are linked to the degrees-of-freedom of a base DG approximation. We show that the proposed approach always has similar or improved accuracy with respect to the base DG method, with a considerable reduction in computational cost. For the specific definition of the coarse-scale space, we consider Raviart-Thomas finite elements for the mass flux and piecewise-linear continuous finite elements for the pressure. We provide a complete analysis of stability and convergence of the proposed method, in addition to a study on its conservation and consistency properties. We also present a battery of numerical tests to verify the results of the analysis, and evaluate a number of possible variations, such as using piecewise-linear continuous finite elements for the coarse-scale mass fluxes.
Quah, Evan S H; Grismer, L Lee; Wood, Perry L Jr; Thura, Myint Kyaw; Zin, Thaw; Kyaw, Htet; Lwin, Ngwe; Grismer, Marta S; Murdoch, Matthew L
2017-03-06
A newly discovered species of homalopsid snake from the genus Gyiophis Murphy & Voris is described from the lowlands of Mawlamyine District in Mon state, southeastern Myanmar. Gyiophis salweenensis sp. nov. is presumed to be closely related to G. maculosa Blanford and G. vorisi Murphy based on the similarities in pholidosis and patterning but can be separated from G. maculosa by the shape of its first three dorsal scale rows that are square, ventral scale pattern that lacks a central spot, and a faint stripe on dorsal scale rows 1-4. It can be further distinguished from G. vorisi by its lower number of ventral scales (129 vs. 142-152), lower number of subcaudals (30/29 vs. 41-58), narrow rostral scale, and having more rows of spots on the dorsum (four vs. three). A preliminary molecular analysis using 1050 base pairs of cytochrome b (cytb) recovered G. salweenensis sp. nov. as the sister species to the Chinese Mud Snake (Myrrophis chinensis). G. maculosa and G. vorisi were unavailable for the analysis. The discovery of G. salweenensis sp. nov. highlights the need for more surveys into the herpetological diversity of eastern Myanmar which remains very much underestimated.
Development and psychometric characteristics of the SCI-QOL Pressure Ulcers scale and short form.
Kisala, Pamela A; Tulsky, David S; Choi, Seung W; Kirshblum, Steven C
2015-05-01
To develop a self-reported measure of the subjective impact of pressure ulcers on health-related quality of life (HRQOL) in individuals with spinal cord injury (SCI) as part of the SCI quality of life (SCI-QOL) measurement system. Grounded-theory based qualitative item development methods, large-scale item calibration testing, confirmatory factor analysis (CFA), and item response theory-based psychometric analysis. Five SCI Model System centers and one Department of Veterans Affairs medical center in the United States. Adults with traumatic SCI. SCI-QOL Pressure Ulcers scale. 189 individuals with traumatic SCI who experienced a pressure ulcer within the past 7 days completed 30 items related to pressure ulcers. CFA confirmed a unidimensional pool of items. IRT analyses were conducted. A constrained Graded Response Model with a constant slope parameter was used to estimate item thresholds for the 12 retained items. The 12-item SCI-QOL Pressure Ulcers scale is unique in that it is specifically targeted to individuals with spinal cord injury and at every stage of development has included input from individuals with SCI. Furthermore, use of CFA and IRT methods provide flexibility and precision of measurement. The scale may be administered in its entirety or as a 7-item "short form" and is available for both research and clinical practice.
HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.
Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J
2016-06-03
Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets is overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps, such as assay normalization and grouping, and experimental replicate quality control and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and also the direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome mapping data set and demonstrate a 200-fold improvement in the execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. Download HiQuant, sample data sets, and supporting documentation at http://hiquant.primesdb.eu .
Large Scale Processes and Extreme Floods in Brazil
NASA Astrophysics Data System (ADS)
Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.
2016-12-01
Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate, for individual sites, the exceedance probability with which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
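A rough, illustrative analogue of the dimension-reduction and clustering steps is sketched below; scikit-learn has no supervised kernel PCA, so ordinary KernelPCA is used as a stand-in, and `flux_fields`, the kernel width, and the cluster count are assumptions.

    # Kernel PCA + k-means sketch for grouping moisture-flux patterns (illustrative).
    from sklearn.decomposition import KernelPCA
    from sklearn.cluster import KMeans

    def cluster_flux_patterns(flux_fields, n_components=3, n_clusters=4):
        # flux_fields: events x grid-cells array of vertically integrated moisture flux
        kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=1e-3)
        scores = kpca.fit_transform(flux_fields)      # low-dimensional event coordinates
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(scores)
        return scores, labels                         # coordinates and cluster assignments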
The PMA Scale: A Measure of Physicians' Motivation to Adopt Medical Devices.
Hatz, Maximilian H M; Sonnenschein, Tim; Blankart, Carl Rudolf
2017-04-01
Studies have often stated that individual-level determinants are important drivers for the adoption of medical devices. Empirical evidence supporting this claim is, however, scarce. At the individual level, physicians' adoption motivation was often considered important in the context of adoption decisions, but a clear notion of its dimensions and corresponding measurement scales is not available. To develop and subsequently validate a scale to measure the motivation to adopt medical devices of hospital-based physicians. The development and validation of the physician-motivation-adoption (PMA) scale were based on a literature search, internal expert meetings, a pilot study with physicians, and a three-stage online survey. The data collected in the online survey were analyzed using exploratory factor analysis (EFA), and the PMA scale was revised according to the results. Confirmatory factor analysis (CFA) was conducted to test the results from the EFA in the third stage. Reliability and validity tests and subgroup analyses were also conducted. Overall, 457 questionnaires were completed by medical personnel of the National Health Service England. The EFA favored a six-factor solution to appropriately describe physicians' motivation. The CFA confirmed the results from the EFA. Our tests indicated good reliability and validity of the PMA scale. This is the first reliable and valid scale to measure physicians' adoption motivation. Future adoption studies assessing the individual level should include the PMA scale to obtain more information about the role of physicians' motivation in the broader adoption context. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
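The exploratory step can be sketched as follows; the confirmatory stage requires a dedicated SEM/CFA tool and is not shown. Here `responses` is an assumed respondents-by-items matrix, standardized beforehand, and the six-factor target follows the abstract.

    # Exploratory factor analysis with varimax rotation (illustrative sketch).
    # Note: the rotation argument requires scikit-learn >= 0.24.
    from sklearn.decomposition import FactorAnalysis

    def pma_efa(responses, n_factors=6):
        fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
        fa.fit(responses)
        return fa.components_.T        # item-by-factor loading matrix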
Diviani, Nicola; Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-04-11
The eHealth Literacy Scale (eHEALS) is a tool to assess consumers' comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers' eHealth literacy. ©Nicola Diviani, Alexandra Lelia Dima, Peter Johannes Schulz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.04.2017.
Multi-scale variability and long-range memory in indoor Radon concentrations from Coimbra, Portugal
NASA Astrophysics Data System (ADS)
Donner, Reik V.; Potirakis, Stelios; Barbosa, Susana
2014-05-01
The presence or absence of long-range correlations in the variations of indoor Radon concentrations has recently attracted considerable interest. As a radioactive gas naturally emitted from the ground in certain geological settings, understanding environmental factors controlling Radon concentrations and their dynamics is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposition. In this work, we re-analyze two high-resolution records of indoor Radon concentrations from Coimbra, Portugal, each of which spans several months of continuous measurements. In order to evaluate the presence of long-range correlations and fractal scaling, we utilize a multiplicity of complementary methods, including power spectral analysis, ARFIMA modeling, classical and multi-fractal detrended fluctuation analysis, and two different estimators of the signals' fractal dimensions. Power spectra and fluctuation functions reveal some complex behavior with qualitatively different properties on different time-scales: white noise in the high-frequency part, indications of some long-range correlated process dominating time scales of several hours to days, and pronounced low-frequency variability associated with tidal and/or meteorological forcing. In order to further decompose these different scales of variability, we apply two different approaches. On the one hand, applying multi-resolution analysis based on the discrete wavelet transform allows separately studying contributions on different time scales and characterize their specific correlation and scaling properties. On the other hand, singular system analysis (SSA) provides a reconstruction of the essential modes of variability. Specifically, by considering only the first leading SSA modes, we achieve an efficient de-noising of our environmental signals, highlighting the low-frequency variations together with some distinct scaling on sub-daily time-scales resembling the properties of a long-range correlated process.
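A minimal detrended fluctuation analysis (DFA) sketch, one of the complementary methods mentioned above, is given below; window sizes are illustrative, and the multifractal extension is omitted. A fitted slope near 0.5 indicates white noise, while larger slopes indicate long-range correlations.

    # Simple monofractal DFA for a 1-D concentration series (illustrative).
    import numpy as np

    def dfa(x, scales=(16, 32, 64, 128, 256, 512)):
        y = np.cumsum(x - np.mean(x))                 # integrated profile
        F = []
        for s in scales:
            n_win = len(y) // s
            rms = []
            for i in range(n_win):
                seg = y[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                rms.append(np.mean((seg - trend) ** 2))
            F.append(np.sqrt(np.mean(rms)))
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]    # scaling exponent
        return alpha, np.array(F)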
Modeling of Urban Heat Island at Global Scale
NASA Astrophysics Data System (ADS)
KC, B.; Ruth, M.
2015-12-01
Urban Heat Island (UHI) refers to the temperature difference between an urban area and its rural background. At the local level, the choice of building materials and urban geometry are vital in determining the UHI magnitude of a city. At the city scale, economic growth, population, climate, and land use dynamics are the main drivers behind changes in UHIs. The main objective of this paper is to provide a comprehensive assessment of UHI based on these "macro variables" at regional and global scale. We based our analysis on published research for Europe, North America, and Asia, reporting data for 83 cities across the globe with unique climatic, economic, and environmental conditions. Exploratory data analysis including Pearson correlation was performed to explore the relationship between UHI and PM2.5 (particulate matter with aerodynamic diameter ≤2.5 microns), PM10 (particulate matter with aerodynamic diameter ≤10 microns), vegetation per capita, built area, Gross Domestic Product (GDP), population density and population. Additionally, dummy variables were used to capture potential influences of climate types (based on Koppen classifications) and the ways by which UHI was measured. We developed linear regression models, one for each of the three continents (Asia, Europe, and North America) and one model for all the cities across these continents. This study provides a unique perspective for predicting UHI magnitudes at large scales based on the economic activity and pollution levels of a city, which has important implications for urban planning.
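A hedged sketch of the regression setup is given below: UHI magnitude regressed on the macro variables with dummy indicators for Koppen class and measurement method. Column names (pm25, koppen_class, uhi, etc.) are assumptions for illustration, not the study's actual variable labels.

    # OLS regression of UHI magnitude on macro variables plus dummy indicators (illustrative).
    import pandas as pd
    import statsmodels.api as sm

    def fit_uhi_model(df):
        X = df[["pm25", "pm10", "veg_per_capita", "built_area",
                "gdp", "pop_density", "population"]]
        X = pd.concat([X, pd.get_dummies(df["koppen_class"], drop_first=True),
                       pd.get_dummies(df["uhi_method"], drop_first=True)], axis=1)
        model = sm.OLS(df["uhi"], sm.add_constant(X.astype(float))).fit()
        return model   # model.summary() reports coefficients and fit statistics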
Cai, Long-Fei; Zhu, Ying; Du, Guan-Sheng; Fang, Qun
2012-01-03
We described a microfluidic chip-based system capable of generating a droplet array with a large-scale concentration gradient by coupling the flow injection gradient technique with droplet-based microfluidics. Multiple modules, including sample injection, sample dispersion, gradient generation, droplet formation, mixing of sample and reagents, and online reaction within the droplets, were integrated into the microchip. In the system, a nanoliter-scale sample solution was automatically injected into the chip under valveless flow injection analysis mode. The sample zone was first dispersed in the microchannel to form a concentration gradient along the axial direction of the microchannel and then segmented into a linear array of droplets by an immiscible oil phase. With the segmentation and protection of the oil phase, the concentration gradient profile of the sample was preserved in the droplet array with high fidelity. With a single injection of 16 nL of sample solution, an array of droplets with a concentration gradient spanning 3-4 orders of magnitude could be generated. The present system was applied to the enzyme inhibition assay of β-galactosidase to preliminarily demonstrate its potential in high-throughput drug screening. With a single injection of 16 nL of inhibitor solution, more than 240 in-droplet enzyme inhibition reactions with different inhibitor concentrations could be performed with an analysis time of 2.5 min. Compared with multiwell plate-based screening systems, the inhibitor consumption was reduced 1000-fold. © 2011 American Chemical Society
NASA Astrophysics Data System (ADS)
Langford, Z. L.; Kumar, J.; Hoffman, F. M.
2015-12-01
Observations indicate that over the past several decades, landscape processes in the Arctic have been changing or intensifying. A dynamic Arctic landscape has the potential to alter ecosystems across a broad range of scales. Accurate characterization is useful for understanding the properties and organization of the landscape, for optimal sampling network design, for measurement and process upscaling, and for establishing a landscape-based framework for multi-scale modeling of ecosystem processes. This study seeks to delineate the landscape of the Seward Peninsula of Alaska into ecoregions using large volumes (terabytes) of high spatial resolution satellite remote-sensing data. Defining high-resolution ecoregion boundaries is difficult because many ecosystem processes in Arctic ecosystems occur at small local to regional scales, which are often not resolved by coarse-resolution satellites (e.g., MODIS). We seek to use data-fusion techniques and data analytics algorithms applied to Phased Array type L-band Synthetic Aperture Radar (PALSAR), Interferometric Synthetic Aperture Radar (IFSAR), Satellite for Observation of Earth (SPOT), WorldView-2, WorldView-3, and QuickBird-2 to develop high-resolution (~5 m) ecoregion maps for multiple time periods. Traditional analysis methods and algorithms are insufficient for analyzing and synthesizing such large geospatial data sets, and those algorithms rarely scale out onto large distributed-memory parallel computer systems. We seek to develop computationally efficient algorithms and techniques using high-performance computing for characterization of Arctic landscapes. We will apply a variety of data analytics algorithms, such as cluster analysis, complex object-based image analysis (COBIA), and neural networks. We also propose to use representativeness analysis within the Seward Peninsula domain to determine optimal sampling locations for fine-scale measurements. This methodology should provide an initial framework for analyzing dynamic landscape trends in Arctic ecosystems, such as shrubification and disturbances, and integration of ecoregions into multi-scale models.
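One of the simpler data analytics steps, pixel-wise cluster analysis, can be sketched as below. The production workflow is distributed across HPC nodes, whereas this single-node version only illustrates the core operation on an assumed (bands, rows, cols) array of fused, co-registered layers.

    # Pixel-wise k-means ecoregion sketch on stacked raster bands (illustrative, single node).
    import numpy as np
    from sklearn.cluster import MiniBatchKMeans

    def delineate_ecoregions(bands, n_regions=12):
        n_bands, rows, cols = bands.shape
        X = bands.reshape(n_bands, -1).T              # pixels x features
        valid = np.all(np.isfinite(X), axis=1)        # skip nodata pixels
        labels = np.full(rows * cols, -1, dtype=int)
        km = MiniBatchKMeans(n_clusters=n_regions, batch_size=10000, n_init=3)
        labels[valid] = km.fit_predict(X[valid])
        return labels.reshape(rows, cols)             # candidate ecoregion map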
Scales of variability of black carbon plumes and their dependence on resolution of ECHAM6-HAM
NASA Astrophysics Data System (ADS)
Weigum, Natalie; Stier, Philip; Schutgens, Nick; Kipling, Zak
2015-04-01
Prediction of the aerosol effect on climate depends on the ability of three-dimensional numerical models to accurately estimate aerosol properties. However, a limitation of traditional grid-based models is their inability to resolve variability on scales smaller than a grid box. Past research has shown that significant aerosol variability exists on scales smaller than these grid boxes, which can lead to discrepancies between observations and aerosol models. The aim of this study is to understand how a global climate model's (GCM) inability to resolve sub-grid scale variability affects simulations of important aerosol features. This problem is addressed by comparing observed black carbon (BC) plume scales from the HIPPO aircraft campaign to those simulated by the ECHAM-HAM GCM, and testing how model resolution affects these scales. This study additionally investigates how model resolution affects BC variability in remote and near-source regions. These issues are examined using three different approaches: comparison of observed and simulated along-flight-track plume scales, two-dimensional autocorrelation analysis, and three-dimensional plume analysis. We find that the degree to which GCMs resolve variability can have a significant impact on the scales of BC plumes, and it is important for models to capture the scales of aerosol plume structures, which account for a large degree of aerosol variability. In this presentation, we will provide further results from the three analysis techniques along with a summary of the implications of these results for future aerosol model development.
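A minimal sketch of one way to estimate an along-track plume length scale is shown below, via the e-folding distance of the autocorrelation function; the uniform sample spacing `dx` and the 1/e criterion are assumptions, not necessarily the study's exact definition.

    # Along-track plume length scale from the 1/e decay of the autocorrelation (illustrative).
    import numpy as np

    def plume_scale(bc, dx):
        # bc: BC mixing-ratio series along the flight track, sampled every dx kilometres
        x = bc - np.mean(bc)
        n = len(x)
        f = np.fft.rfft(x, n=2 * n)                   # zero-padded FFT
        acf = np.fft.irfft(f * np.conj(f))[:n]
        acf /= acf[0]                                 # normalized autocorrelation
        below = np.where(acf < 1.0 / np.e)[0]         # first lag dropping below 1/e
        return below[0] * dx if below.size else np.nan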
Health education and competency scale: Development and testing.
Hwang, Huei-Lih; Kuo, Mei-Ling; Tu, Chin-Tang
2018-02-01
To develop a tool for measuring competency in conducting health education and to evaluate its psychometric properties in a population of entry-level nurses. Until now, no generic instrument has been developed specifically for measuring competency in health education, which is an essential competency for nurses. Existing scales are either insufficient for psychometric evaluation or are designed specifically for senior nurses. To evaluate curricula and courses designed for entry-level nurses, educators require an instrument for measuring improvement in core competency from baseline to determine whether the minimum level of ability has been achieved. Item development for the survey instrument used for data collection in this study was based on the results of a literature review. The self-evaluated Health Education Competency Scale developed in this study was used to survey 457 nursing students at two nursing schools and 165 clinical nurses at a medical centre in south Taiwan in 2016. The participants were randomly divided into two equal groups. One group was analysed by exploratory factor analysis with varimax rotation, and one group was analysed by confirmatory factor analysis. Factor analysis yielded a four-factor (assessment, pedagogy, motivation and empowerment) solution (18 items) that accounted for 75.9% of the variance. The total scale and subscales had good reliabilities and construct validity coefficients. For measuring competency in entry-level nurses, the Health Education Competency Scale had a good data fit and sound psychometric properties. The proposed scale can be used to assess health education competency for college nursing students and practising nurses. Furthermore, it can provide educators with valuable insight into the minimum competencies required for entry-level nurses to deliver quality health care to clients and can guide them in the practice of client-based teaching. © 2017 John Wiley & Sons Ltd.
Parametric analysis of a down-scaled turbo jet engine suitable for drone and UAV propulsion
NASA Astrophysics Data System (ADS)
Wessley, G. Jims John; Chauhan, Swati
2018-04-01
This paper presents a detailed study on the need for downscaling gas turbine engines for UAV and drone propulsion. Also presented are the downscaling procedure and a parametric analysis of a downscaled engine using the Gas Turbine Simulation Program software GSP 11. The need for a micro gas turbine engine in the thrust range of 0.13 to 4.45 kN to power UAVs and drones weighing in the range of 4.5 to 25 kg is considered, and to meet this requirement a parametric analysis of the scaled-down Allison J33-A-35 turbojet engine is performed. It is evident from the analysis that the thrust developed by the scaled engine and the thrust specific fuel consumption (TSFC) depend on the pressure ratio, the mass flow rate of air, and the Mach number. A scaling factor of 0.195, corresponding to an air mass flow rate of 7.69 kg/s, produces a thrust in the range of 4.57 to 5.6 kN while operating at a Mach number of 0.3 within the altitude range of 5000 to 9000 m. The thermal and overall efficiencies of the scaled engine are found to be 67% and 75%, respectively, for a pressure ratio of 2. The outcomes of this analysis form a strong base for further analysis, design, and fabrication of micro gas turbine engines to propel future UAVs and drones.
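For orientation, the effect of applying a linear scaling factor to the engine's air mass flow can be sketched in a few lines; the baseline J33-class mass flow and thrust below are rough public figures used purely for illustration, and the TSFC, altitude, and Mach effects handled by GSP are omitted.

    # Back-of-the-envelope down-scaling sketch: thrust assumed proportional to air mass flow.
    def scaled_engine(scale, m_dot_ref=39.4, thrust_ref_kN=20.4):
        # m_dot_ref, thrust_ref_kN: assumed baseline J33-class values, illustration only
        return scale * m_dot_ref, scale * thrust_ref_kN

    print(scaled_engine(0.195))   # roughly 7.7 kg/s of air and ~4 kN of thrust at the reference point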
Bouvignies, Guillaume; Hansen, D Flemming; Vallurupalli, Pramodh; Kay, Lewis E
2011-02-16
A method for quantifying millisecond time scale exchange in proteins is presented based on scaling the rate of chemical exchange using a 2D (15)N, (1)H(N) experiment in which (15)N dwell times are separated by short spin-echo pulse trains. Unlike the popular Carr-Purcell-Meiboom-Gill (CPMG) experiment where the effects of a radio frequency field on measured transverse relaxation rates are quantified, the new approach measures peak positions in spectra that shift as the effective exchange time regime is varied. The utility of the method is established through an analysis of data recorded on an exchanging protein-ligand system for which the exchange parameters have been accurately determined using alternative approaches. Computations establish that a combined analysis of CPMG and peak shift profiles extends the time scale that can be studied to include exchanging systems with highly skewed populations and exchange rates as slow as 20 s(-1).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nurujjaman, Md.; Narayanan, Ramesh; Iyengar, A. N. Sekar
2009-10-15
Continuous wavelet transform (CWT) based time-scale and multifractal analyses have been carried out on the anode glow related nonlinear floating potential fluctuations in a hollow cathode glow discharge plasma. CWT has been used to obtain the contour and ridge plots. Scale shift (or inversely frequency shift), which is a typical nonlinear behavior, has been detected from the undulating contours. From the ridge plots, we have identified the presence of nonlinearity and degree of chaoticity. Using the wavelet transform modulus maxima technique we have obtained the multifractal spectrum for the fluctuations at different discharge voltages and the spectrum was observed to become a monofractal for periodic signals. These multifractal spectra were also used to estimate different quantities such as the correlation and fractal dimension, degree of multifractality, and complexity parameters. These estimations have been found to be consistent with the nonlinear time series analysis.
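The time-scale (contour) step can be sketched with PyWavelets' Morlet CWT; ridge extraction and the wavelet-transform modulus-maxima multifractal step are not reproduced. The `signal` array and sampling rate `fs` are assumed inputs.

    # CWT scalogram sketch for a floating-potential fluctuation series (illustrative).
    import numpy as np
    import pywt

    def scalogram(signal, fs, num_scales=128):
        scales = np.arange(1, num_scales + 1)
        coefs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
        power = np.abs(coefs) ** 2        # contouring this gives the time-scale map
        return power, freqs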
Students' Perceptions of Their ICT-Based College English Course in China: A Case Study
ERIC Educational Resources Information Center
Zinan, Wen; Sai, George Teoh Boon
2017-01-01
This study investigated foreign language students' perceptions about their Information and Communication Technology (ICT)-based College English Course (CEC) in China. The research used a five-point Likert-scale questionnaire based on Simsek (2008). A factor analysis confirmed the construct validity of the questionnaire and 6 factors were…
Lin, Wei-Chih; Lin, Yu-Pin; Wang, Yung-Chieh; Chang, Tsun-Kuo; Chiang, Li-Chi
2014-02-21
In this study, a deconvolution procedure was used to create a variogram of oral cancer (OC) rates. Based on the variogram, area-to-point (ATP) Poisson kriging and p-field simulation were used to downscale and simulate, respectively, the OC rate data for Taiwan from the district scale to a 1 km × 1 km grid scale. Local cluster analysis (LCA) of OC mortality rates was then performed to identify OC mortality rate hot spots based on the downscaled and the p-field-simulated OC mortality maps. The relationship between OC mortality and land use was studied by overlaying the maps of the downscaled OC mortality, the LCA results, and the land uses. One thousand simulations were performed to quantify local and spatial uncertainties in the LCA to identify OC mortality hot spots. Scatter plots and Spearman's rank correlation yielded the relationship between OC mortality and the concentrations of seven heavy metals in the 1 km grid cells. The correlation analysis results for the 1 km scale revealed a weak correlation between OC mortality rate and the concentrations of the seven studied heavy metals in soil. Accordingly, the heavy metal concentrations in soil are not major determinants of OC mortality rates at the 1 km scale at which soils were sampled. The LCA statistical results for the local indicator of spatial association (LISA) revealed that the sites with a high probability of high-high (high value surrounded by high values) OC mortality at the 1 km grid scale were clustered in southern, eastern, and mid-western Taiwan. The number of such sites was also significantly higher on agricultural land and in urban regions than on land with other uses. The proposed approach can be used to downscale mortality data, and to evaluate its uncertainty, from a coarse scale to a fine scale at which useful additional information can be obtained for assessing and managing land use and risk.
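As background, the first step, an experimental (semi)variogram of the areal rates, can be sketched as below; the deconvolution to point support and the ATP Poisson kriging itself require dedicated geostatistical code and are not reproduced. The `coords`, `rates`, and lag bins are assumed inputs.

    # Experimental semivariogram sketch for areal rate data (illustrative).
    import numpy as np

    def empirical_variogram(coords, rates, lags=np.arange(5, 105, 10)):
        # coords: n x 2 array of centroid coordinates (km); rates: n mortality rates
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (rates[:, None] - rates[None, :]) ** 2
        gamma = []
        for lo, hi in zip(lags[:-1], lags[1:]):
            mask = (d >= lo) & (d < hi)
            gamma.append(sq[mask].mean() if mask.any() else np.nan)
        return 0.5 * (lags[:-1] + lags[1:]), np.array(gamma)   # lag centers, gamma(h)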
Rasch Based Analysis of Oral Proficiency Test Data.
ERIC Educational Resources Information Center
Nakamura, Yuji
2001-01-01
This paper examines the rating scale data of oral proficiency tests analyzed by a Rasch Analysis focusing on an item map and factor analysis. In discussing the item map, the difficulty order of six items and students' answering patterns are analyzed using descriptive statistics and measures of central tendency of test scores. The data ranks the…
Statistical population based estimates of water ingestion play a vital role in many types of exposure and risk analysis. A significant large scale analysis of water ingestion by the population of the United States was recently completed and is documented in the report titled ...
Control of Thermo-Acoustics Instabilities: The Multi-Scale Extended Kalman Approach
NASA Technical Reports Server (NTRS)
Le, Dzu K.; DeLaat, John C.; Chang, Clarence T.
2003-01-01
"Multi-Scale Extended Kalman" (MSEK) is a novel model-based control approach recently found to be effective for suppressing combustion instabilities in gas turbines. A control law formulated in this approach for fuel modulation demonstrated steady suppression of a high-frequency combustion instability (less than 500Hz) in a liquid-fuel combustion test rig under engine-realistic conditions. To make-up for severe transport-delays on control effect, the MSEK controller combines a wavelet -like Multi-Scale analysis and an Extended Kalman Observer to predict the thermo-acoustic states of combustion pressure perturbations. The commanded fuel modulation is composed of a damper action based on the predicted states, and a tones suppression action based on the Multi-Scale estimation of thermal excitations and other transient disturbances. The controller performs automatic adjustments of the gain and phase of these actions to minimize the Time-Scale Averaged Variances of the pressures inside the combustion zone and upstream of the injector. The successful demonstration of Active Combustion Control with this MSEK controller completed an important NASA milestone for the current research in advanced combustion technologies.
Nonlinear analysis of 0-3 polarized PLZT microplate based on the new modified couple stress theory
NASA Astrophysics Data System (ADS)
Wang, Liming; Zheng, Shijie
2018-02-01
In this study, based on the new modified couple stress theory, a size-dependent model for nonlinear bending analysis of a pure 0-3 polarized PLZT plate is developed for the first time. The equilibrium equations are derived from a variational formulation based on the potential energy principle and the new modified couple stress theory. The Galerkin method is adopted to derive the nonlinear algebraic equations from the governing differential equations, and these nonlinear algebraic equations are then solved using the Newton-Raphson method. After simplification, the new model includes only one material length scale parameter. In addition, numerical examples are carried out to study the effect of the material length scale parameter on the nonlinear bending of a simply supported pure 0-3 polarized PLZT plate subjected to light illumination and a uniformly distributed load. The results indicate that the new model is able to capture the size effect and geometric nonlinearity.
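The solution step can be illustrated with a generic Newton-Raphson iteration on the discretized residual R(a) = 0, using a finite-difference Jacobian; the PLZT-specific residual is problem-dependent and enters here only as a placeholder argument.

    # Generic Newton-Raphson solver sketch for a nonlinear algebraic system (illustrative).
    import numpy as np

    def newton_raphson(residual, a0, tol=1e-10, max_iter=50, eps=1e-7):
        a = np.asarray(a0, dtype=float).copy()
        for _ in range(max_iter):
            R = residual(a)
            if np.linalg.norm(R) < tol:
                break
            J = np.empty((len(R), len(a)))
            for j in range(len(a)):                 # finite-difference Jacobian
                da = np.zeros_like(a)
                da[j] = eps
                J[:, j] = (residual(a + da) - R) / eps
            a -= np.linalg.solve(J, R)              # Newton correction
        return a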
Multi-scale Modeling and Analysis of Nano-RFID Systems on HPC Setup
NASA Astrophysics Data System (ADS)
Pathak, Rohit; Joshi, Satyadhar
In this paper we address some of the complex modeling aspects of Nano RFID (Radio Frequency Identification) systems, such as multi-scale modeling and MATLAB SUGAR based modeling, and show the complexities involved in their analysis. We present modeling and simulation results and demonstrate some novel ideas and library development for Nano RFID. Multi-scale modeling plays a very important role for nanotechnology-enabled devices, whose properties sometimes cannot be explained by abstraction-level theories. Reliability and packaging remain among the major hindrances to the practical implementation of Nano RFID based devices, and modeling and simulation will play a very important role in addressing them. CNTs are a promising low-power material that may replace CMOS, and their integration with CMOS and MEMS circuitry will play an important role in realizing the true power of Nano RFID systems. RFID based on innovations in nanotechnology is presented. MEMS modeling of the antenna and sensors and their integration into the circuitry are shown. Incorporating this, we can design a Nano RFID that can be used in areas such as human implantation and complex banking applications. We propose modeling of RFID using the concept of multi-scale modeling to accurately predict its properties. We also give the modeling of recently proposed MEMS devices that may see application in RFID, and we cover the applications and advantages of Nano RFID in various areas. RF MEMS has matured and its devices are being successfully commercialized, but taking it to the limits of the nano domain and integrating it with single-chip RFID requires the novel approach proposed here. We have modeled a MEMS based transponder and shown the distribution for multi-scale modeling of Nano RFID.
Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun
2018-09-01
The toxicity of heavy metals from industrialization poses a critical concern, and analysis of pollution sources in association with their potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information for protecting specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are an abandoned lead-acid battery plant (ALP), traffic emission, and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories and the associated human health risks at complex industrial sites in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features. New capabilities include: ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures; neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data; covariance data for fission product yields and decay constants; stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler; parallel calculations with KENO; problem-dependent temperature corrections for CE calculations; CE shielding and criticality accident alarm system analysis with MAVRIC; CE depletion with TRITON (T5-DEPL/T6-DEPL); CE sensitivity/uncertainty analysis with TSUNAMI-3D; simplified and efficient LWR lattice physics with Polaris; large-scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator; advanced fission source convergence acceleration capabilities with Sourcerer; nuclear data library generation with AMPX; and an integrated user interface with Fulcrum. Enhanced capabilities include: accurate and efficient CE Monte Carlo methods for eigenvalue and fixed source calculations; improved MG resonance self-shielding methodologies and data; resonance self-shielding with modernized and efficient XSProc integrated into most sequences; accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1); spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN; modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, API for high-performance depletion, new keyword input format); extension of the maximum mixture number to values well beyond the previous limit of 2147 to ~2 billion; nuclear data formats enabling the use of more than 999 energy groups; an updated standard composition library providing more accurate use of natural abundances; and numerous other enhancements for improved usability and stability.
Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun
2014-01-01
Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract statistical characteristics of the multiple surface reflectance datasets and to further quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets based on the above physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, which were obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding consistency analysis and evaluation.
Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri
2014-01-01
In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
Wavelet based free-form deformations for nonrigid registration
NASA Astrophysics Data System (ADS)
Sun, Wei; Niessen, Wiro J.; Klein, Stefan
2014-03-01
In nonrigid registration, deformations may take place on the coarse and fine scales. For the conventional B-splines based free-form deformation (FFD) registration, these coarse- and fine-scale deformations are all represented by basis functions of a single scale. Meanwhile, wavelets have been proposed as a signal representation suitable for multi-scale problems. Wavelet analysis leads to a unique decomposition of a signal into its coarse- and fine-scale components. Potentially, this could therefore be useful for image registration. In this work, we investigate whether a wavelet-based FFD model has advantages for nonrigid image registration. We use a B-splines based wavelet, as defined by Cai and Wang.1 This wavelet is expressed as a linear combination of B-spline basis functions. Derived from the original B-spline function, this wavelet is smooth, differentiable, and compactly supported. The basis functions of this wavelet are orthogonal across scales in Sobolev space. This wavelet was previously used for registration in computer vision, in 2D optical flow problems,2 but it was not compared with the conventional B-spline FFD in medical image registration problems. An advantage of choosing this B-splines based wavelet model is that the space of allowable deformation is exactly equivalent to that of the traditional B-spline. The wavelet transformation is essentially a (linear) reparameterization of the B-spline transformation model. Experiments on 10 CT lung and 18 T1-weighted MRI brain datasets show that wavelet based registration leads to smoother deformation fields than traditional B-splines based registration, while achieving better accuracy.
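The multi-scale reparameterization idea can be sketched with an off-the-shelf biorthogonal (spline-family) wavelet from PyWavelets; this is not the Cai-Wang B-spline wavelet used in the paper, only an illustration of splitting a 2-D coefficient grid into coarse- and fine-scale parts.

    # Coarse/fine split of a 2-D coefficient grid via a spline-family wavelet (illustrative).
    import numpy as np
    import pywt

    def split_scales(coeff_grid, wavelet="bior2.2", level=2):
        coeffs = pywt.wavedec2(coeff_grid, wavelet, level=level)
        # reconstruct with only approximation coefficients -> coarse-scale part
        coarse = pywt.waverec2([coeffs[0]] + [tuple(np.zeros_like(d) for d in c)
                                              for c in coeffs[1:]], wavelet)
        # reconstruct with only detail coefficients -> fine-scale part
        fine = pywt.waverec2([np.zeros_like(coeffs[0])] + list(coeffs[1:]), wavelet)
        r, c = coeff_grid.shape
        return coarse[:r, :c], fine[:r, :c]   # trim possible one-pixel padding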
Multiresolution analysis of characteristic length scales with high-resolution topographic data
NASA Astrophysics Data System (ADS)
Sangireddy, Harish; Stark, Colin P.; Passalacqua, Paola
2017-07-01
Characteristic length scales (CLS) define landscape structure and delimit geomorphic processes. Here we use multiresolution analysis (MRA) to estimate such scales from high-resolution topographic data. MRA employs progressive terrain defocusing, via convolution of the terrain data with Gaussian kernels of increasing standard deviation, and calculation at each smoothing resolution of (i) the probability distributions of curvature and topographic index (defined as the ratio of slope to area in log scale) and (ii) characteristic spatial patterns of divergent and convergent topography identified by analyzing the curvature of the terrain. The MRA is first explored using synthetic 1-D and 2-D signals whose CLS are known. It is then validated against a set of MARSSIM (a landscape evolution model) steady state landscapes whose CLS were tuned by varying hillslope diffusivity and simulated noise amplitude. The known CLS match the scales at which the distributions of topographic index and curvature show scaling breaks, indicating that the MRA can identify CLS in landscapes based on the scaling behavior of topographic attributes. Finally, the MRA is deployed to measure the CLS of five natural landscapes using meter resolution digital terrain model data. CLS are inferred from the scaling breaks of the topographic index and curvature distributions and equated with (i) small-scale roughness features and (ii) the hillslope length scale.
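For readers who want to see the basic mechanics of the progressive terrain defocusing step, the sketch below smooths a synthetic surface with Gaussian kernels of increasing standard deviation and tracks how the spread of a simple Laplacian curvature estimate changes with scale. It assumes scipy is available; the synthetic surface, the curvature approximation, and the interquartile-range summary are illustrative stand-ins, not the paper's MRA or its MARSSIM test landscapes.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def curvature(z, dx=1.0):
    """Laplacian-based total curvature of an elevation grid (simple approximation)."""
    zy, zx = np.gradient(z, dx)
    zxy, zxx = np.gradient(zx, dx)
    zyy, _ = np.gradient(zy, dx)
    return zxx + zyy

# Synthetic "terrain": smooth ridges plus fine-scale roughness (stand-in for a DTM).
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 10, 512), np.linspace(0, 10, 512))
dem = np.sin(x) * np.cos(y) + 0.05 * rng.standard_normal((512, 512))

# Progressive defocusing: convolve with Gaussian kernels of increasing sigma and
# track how the spread of the curvature distribution changes with smoothing scale.
for sigma in [1, 2, 4, 8, 16, 32]:
    smoothed = gaussian_filter(dem, sigma)
    c = curvature(smoothed)
    p75, p25 = np.percentile(c, [75, 25])
    print(f"sigma={sigma:3d}  curvature IQR={p75 - p25:.2e}")
```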
Wang, Hao; Tao, Tianyou; Guo, Tong; Li, Jian; Li, Aiqun
2014-01-01
The structural health monitoring system (SHMS) provides an effective tool for conducting full-scale measurements on existing bridges, which are essential for research on bridge wind engineering. In July 2008, Typhoon Fung-Wong lashed China and hit the Sutong cable-stayed bridge (SCB). During the typhoon, full-scale measurements were conducted to record the wind data, and the structural vibration responses were collected by the SHMS installed on the SCB. Based on statistical methods and spectral analysis techniques, the measured data are analyzed to obtain the typical parameters and characteristics. Furthermore, this paper analyzes the measured structural vibration responses and characterizes the vibrations of the stay cables and the deck, the relationship between structural vibrations and wind speed, the comparison of upstream and downstream cable vibrations, the effectiveness of cable dampers, and so forth. Considering the significance of the damping ratio in vibration mitigation, the modal damping ratios of the SCB are identified based on the Hilbert-Huang transform (HHT) combined with the random decrement technique (RDT). The analysis results can be used to validate current dynamic characteristic analysis methods, buffeting calculation methods, and wind tunnel test results for long-span cable-stayed bridges.
Irrational Delay Revisited: Examining Five Procrastination Scales in a Global Sample
Svartdal, Frode; Steel, Piers
2017-01-01
Scales attempting to measure procrastination focus on different facets of the phenomenon, yet they share a common understanding of procrastination as an unnecessary, unwanted, and disadvantageous delay. The present paper examines, in a global sample (N = 4,169), five different procrastination scales - the Decisional Procrastination Scale (DPS), Irrational Procrastination Scale (IPS), Pure Procrastination Scale (PPS), Adult Inventory of Procrastination Scale (AIP), and General Procrastination Scale (GPS) - focusing on factor structures and item functioning using Confirmatory Factor Analysis and Item Response Theory. The results indicated that the PPS (12 items selected from the DPS, AIP, and GPS) measures the different facets of procrastination even better than the three scales it is based on. An even shorter version of the PPS (5 items focusing on irrational delay) corresponds well to the nine-item IPS. Both scales demonstrate good psychometric properties and appear to be superior measures of core procrastination attributes compared with alternative procrastination scales. PMID:29163302
Scaling of Counter-Current Imbibition Process in Low-Permeability Porous Media, TR-121
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kvoscek, A.R.; Zhou, D.; Jia, L.
2001-01-17
This report presents recent work on imaging imbibition in low-permeability porous media (diatomite) with X-ray computed tomography. The viscosity ratio between nonwetting and wetting fluids is varied over several orders of magnitude, yielding different levels of imbibition performance. A mathematical analysis of counter-current imbibition processes is also performed, and a modified scaling group incorporating the mobility ratio is developed. This modified group is physically based and appears to improve the scaling accuracy of counter-current imbibition significantly.
[Development and application of morphological analysis method in Aspergillus niger fermentation].
Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang
2015-02-01
Filamentous fungi are widely used in industrial fermentation. Particular fungal morphologies serve as a critical index of a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, we can prepare hundreds of pellet samples simultaneously and quickly obtain quantitative morphological information at large scale. This method can largely increase the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. As a result, the morphological response patterns of A. niger to these conditions were quantitatively demonstrated, which lays a solid foundation for further scale-up.
Duncan, Mitch J; Rashid, Mahbub; Vandelanotte, Corneel; Cutumisu, Nicoleta; Plotnikoff, Ronald C
2013-02-04
Spatial configurations of office environments assessed by Space Syntax methodologies are related to employee movement patterns. These methods require analysis of floor plans, which are not readily available in large population-based studies or are otherwise unavailable. Therefore, a self-report instrument to assess spatial configurations of office environments using four scales was developed. The scales are: local connectivity (16 items), overall connectivity (11 items), visibility of co-workers (10 items), and proximity of co-workers (5 items). A panel cohort (N = 1154) completed an online survey; only data from individuals employed in office-based occupations (n = 307) were used to assess scale measurement properties. To assess test-retest reliability, a separate sample of 37 office-based workers completed the survey on two occasions 7.7 (±3.2) days apart. Redundant scale items were eliminated using factor analysis; Cronbach's α was used to evaluate internal consistency and test-retest reliability (retest-ICC). ANOVA was employed to examine differences between office types (Private, Shared, Open) as a measure of construct validity. Generalized Linear Models were used to examine relationships between spatial configuration scales and the duration of, and frequency of breaks in, occupational sitting. The number of items on all scales was reduced; Cronbach's α and ICCs indicated good scale internal consistency and test-retest reliability: local connectivity (5 items; α = 0.70; retest-ICC = 0.84), overall connectivity (6 items; α = 0.86; retest-ICC = 0.87), visibility of co-workers (4 items; α = 0.78; retest-ICC = 0.86), and proximity of co-workers (3 items; α = 0.85; retest-ICC = 0.70). Significant (p ≤ 0.001) differences, in theoretically expected directions, were observed for all scales between office types, except overall connectivity. Significant associations were observed between all scales and occupational sitting behaviour (p ≤ 0.05). All scales have good measurement properties, indicating the instrument may be a useful alternative to Space Syntax for examining environmental correlates of occupational sitting in population surveys.
Samoocha, David; Bruinvels, David J; Elbers, Nieke A; Anema, Johannes R; van der Beek, Allard J
2010-06-24
Patient empowerment is growing in popularity and application. Due to the increasing possibilities of the Internet and eHealth, many initiatives that are aimed at empowering patients are delivered online. Our objective was to evaluate whether Web-based interventions are effective in increasing patient empowerment compared with usual care or face-to-face interventions. We performed a systematic review by searching the MEDLINE, EMBASE, and PsycINFO databases from January 1985 to January 2009 for relevant citations. From the 7096 unique citations retrieved from the search strategy, we included 14 randomized controlled trials (RCTs) that met all inclusion criteria. Pairs of review authors assessed the methodological quality of the obtained studies using the Downs and Black checklist. A meta-analysis was performed on studies that measured comparable outcomes. The GRADE approach was used to determine the level of evidence for each outcome. In comparison with usual care or no care, Web-based interventions had a significant positive effect on empowerment measured with the Diabetes Empowerment Scale (2 studies, standardized mean difference [SMD] = 0.61, 95% confidence interval [CI] 0.29 - 0.94), on self-efficacy measured with disease-specific self-efficacy scales (9 studies, SMD = 0.23, 95% CI 0.12 - 0.33), and on mastery measured with the Pearlin Mastery Scale (1 study, mean difference [MD] = 2.95, 95% CI 1.66 - 4.24). No effects were found for self-efficacy measured with general self-efficacy scales (3 studies, SMD = 0.05, 95% CI -0.25 to 0.35) or for self-esteem measured with the Rosenberg Self-Esteem Scale (1 study, MD = -0.38, 95% CI -2.45 to 1.69). Furthermore, when comparing Web-based interventions with face-to-face deliveries of the same interventions, no significant (beneficial or harmful) effects were found for mastery (1 study, MD = 1.20, 95% CI -1.73 to 4.13) and self-esteem (1 study, MD = -0.10, 95% CI -0.45 to 0.25). Web-based interventions showed positive effects on empowerment measured with the Diabetes Empowerment Scale, disease-specific self-efficacy scales and the Pearlin Mastery Scale. Because of the low quality of evidence we found, the results should be interpreted with caution. The clinical relevance of the findings can be questioned because the significant effects we found were, in general, small.
Defaults, context, and knowledge: alternatives for OWL-indexed knowledge bases.
Rector, A
2004-01-01
The new Web Ontology Language (OWL) and its Description Logic compatible sublanguage (OWL-DL) explicitly exclude defaults and exceptions, as do all logic based formalisms for ontologies. However, many biomedical applications appear to require default reasoning, at least if they are to be engineered in a maintainable way. Default reasoning has always been one of the great strengths of Frame systems such as Protégé. Resolving this conflict requires analysis of the different uses for defaults and exceptions. In some cases, alternatives can be provided within the OWL framework; in others, it appears that hybrid reasoning about a knowledge base of contingent facts built around the core ontology is necessary. Trade-offs include both human factors and the scaling of computational performance. The analysis presented here is based on the OpenGALEN experience with large scale ontologies using a formalism, GRAIL, which explicitly incorporates constructs for hybrid reasoning, numerous experiments with OWL, and initial work on combining OWL and Protégé.
Xu, Weijia; Ozer, Stuart; Gutell, Robin R
2009-01-01
With an increasingly large number of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions, using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
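The abstract does not give the database schema or the coevolution-rate formula, so the sketch below uses plain mutual information between alignment columns, a standard covariation signal for candidate base pairs, as a stand-in for the authors' phylogeny-weighted method. The toy alignment is invented for illustration.

```python
from collections import Counter
from math import log2

def mutual_information(col_i, col_j):
    """Mutual information (bits) between two alignment columns; a simple
    covariation signal often used to flag candidate base-paired positions."""
    n = len(col_i)
    pi = Counter(col_i)
    pj = Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), count in pij.items():
        p_ab = count / n
        mi += p_ab * log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

# Toy alignment: columns 0 and 3 covary (A-U / G-C / U-A pattern); columns 1 and 2 do not.
alignment = ["AGCU", "GGCC", "UGCA", "AGCU", "GCGC", "UGCA"]
cols = list(zip(*alignment))
print("MI(0,3) =", round(mutual_information(cols[0], cols[3]), 3))
print("MI(1,2) =", round(mutual_information(cols[1], cols[2]), 3))
```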
NASA Technical Reports Server (NTRS)
Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.
2014-01-01
The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.
Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori
2014-01-01
In lipidomics, based on theoretically calculated comprehensive lipid libraries, as many as 1,000 multiple reaction monitoring (MRM) transitions can be monitored in each single run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual effort to identify and quantify lipid species. Isotopic peaks differing by up to a few atomic masses further complicate the analysis. To accelerate the identification and quantification process, we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, an example dataset, and tutorials are downloadable at the "Standalone software" section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website.
Gap Analysis and Conservation Network for Freshwater Wetlands in Central Yangtze Ecoregion
Xiaowen, Li; Haijin, Zhuge; Li, Mengdi
2013-01-01
The Central Yangtze Ecoregion contains a large area of internationally important freshwater wetlands and supports a huge number of endangered waterbirds. However, these unique wetlands and the biodiversity they support are under constant threat from human development pressures, and the prevailing conservation strategies, generated at the local scale, cannot adequately guide ecoregion-based conservation initiatives for Central Yangtze at the broad scale. This paper aims at establishing and optimizing an ecological network for freshwater wetland conservation in the Central Yangtze Ecoregion based on large-scale gap analysis. A group of focal species and a GIS-based extrapolation technique were employed to identify potential habitats and conservation gaps, and the optimized conservation network was then established by combining the existing protection system and the identified conservation gaps. Our results show that only 23.49% of the potential habitats of the focal species have been included in the existing nature reserves in the Central Yangtze Ecoregion. To effectively conserve over 80% of the potential habitats for the focal species by optimizing the existing conservation network for the freshwater wetlands in the Central Yangtze Ecoregion, it is necessary to establish new wetland nature reserves in 22 county units across Hubei, Anhui, and Jiangxi provinces. PMID:24062632
Payo, Dioli Ann; Leliaert, Frederik; Verbruggen, Heroen; D'hondt, Sofie; Calumpong, Hilconida P.; De Clerck, Olivier
2013-01-01
We investigated species diversity and distribution patterns of the marine red alga Portieria in the Philippine archipelago. Species boundaries were tested based on mitochondrial, plastid and nuclear encoded loci, using a general mixed Yule-coalescent (GMYC) model-based approach and a Bayesian multilocus species delimitation method. The outcome of the GMYC analysis of the mitochondrial encoded cox2-3 dataset was highly congruent with the multilocus analysis. In stark contrast with the current morphology-based assumption that the genus includes a single, widely distributed species in the Indo-West Pacific (Portieria hornemannii), DNA-based species delimitation resulted in the recognition of 21 species within the Philippines. Species distributions were found to be highly structured with most species restricted to island groups within the archipelago. These extremely narrow species ranges and high levels of intra-archipelagic endemism contrast with the wide-held belief that marine organisms generally have large geographical ranges and that endemism is at most restricted to the archipelagic level. Our results indicate that speciation in the marine environment may occur at spatial scales smaller than 100 km, comparable with some terrestrial systems. Our finding of fine-scale endemism has important consequences for marine conservation and management. PMID:23269854
Park, Junghyun A; Kim, Minki; Yoon, Seokjoon
2016-05-17
Sophisticated anti-fraud systems for the healthcare sector have been built based on several statistical methods. Although existing methods have been developed to detect fraud in the healthcare sector, these algorithms consume considerable time and cost, and lack a theoretical basis to handle large-scale data. Based on mathematical theory, this study proposes a new approach to using Benford's Law, in which we closely examine individual-level data to identify specific fees for in-depth analysis. We extended the mathematical theory to demonstrate the manner in which large-scale data conform to Benford's Law. Then, we empirically tested its applicability using actual large-scale healthcare data from Korea's Health Insurance Review and Assessment (HIRA) National Patient Sample (NPS). For Benford's Law, we used the mean absolute deviation (MAD) formula to test the large-scale data. We conducted our study on 32 diseases, comprising 25 representative diseases and 7 DRG-regulated diseases. We performed an empirical test on the 25 diseases, showing the applicability of Benford's Law to large-scale data in the healthcare industry. For the seven DRG-regulated diseases, we examined the individual-level data to identify specific fees for in-depth analysis. Among the eight categories of medical costs, we assessed the strength of certain irregularities based on the details of each DRG-regulated disease. Using the degree of abnormality, we propose priority actions to be taken by government health departments and private insurance institutions to bring unnecessary medical expenses under control. However, when detecting deviations from Benford's Law, relatively high contamination ratios are required at conventional significance levels.
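The MAD test mentioned in the abstract reduces to comparing the observed first-digit proportions with Benford's expected proportions log10(1 + 1/d). A minimal sketch follows, using synthetic log-uniform "fees" in place of the HIRA claims data; the conformity thresholds applied in practice are not reproduced here.

```python
import numpy as np

BENFORD = np.log10(1 + 1 / np.arange(1, 10))  # expected first-digit proportions

def first_digit(values):
    """Leading (non-zero) digit of each positive value."""
    v = np.asarray(values, dtype=float)
    v = v[v > 0]
    return (v / 10 ** np.floor(np.log10(v))).astype(int)

def benford_mad(values):
    """Mean absolute deviation between observed and Benford first-digit proportions."""
    digits = first_digit(values)
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    return np.abs(observed - BENFORD).mean()

# Toy example with log-uniform "fees", which should roughly follow Benford's Law.
rng = np.random.default_rng(1)
fees = 10 ** rng.uniform(1, 5, size=50_000)
print(f"MAD = {benford_mad(fees):.4f}")   # small values indicate conformity
```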
Large-Scale Femtoliter Droplet Array for Single Cell Efflux Assay of Bacteria.
Iino, Ryota; Sakakihara, Shouichi; Matsumoto, Yoshimi; Nishino, Kunihiko
2018-01-01
A large-scale femtoliter droplet array as a platform for single-cell efflux assays of bacteria is described. Device microfabrication, formation of the femtoliter droplet array with concomitant enclosure of single bacterial cells, fluorescence-based detection of efflux activity at the single-cell level, and collection of single cells from droplets for subsequent gene analysis are described in detail.
NASA Astrophysics Data System (ADS)
Scholz, Jan; Dejori, Mathäus; Stetter, Martin; Greiner, Martin
2005-05-01
The impact of observational noise on the analysis of scale-free networks is studied. Various noise sources are modeled as random link removal, random link exchange and random link addition. Emphasis is on the resulting modifications for the node-degree distribution and for a functional ranking based on betweenness centrality. The implications for estimated gene-expressed networks for childhood acute lymphoblastic leukemia are discussed.
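As a hedged illustration of one of the modeled noise sources (random link removal) and its effect on a betweenness-based ranking, the sketch below perturbs a Barabasi-Albert graph, a generic scale-free stand-in rather than the estimated leukemia gene networks, and reports the Spearman rank correlation of betweenness centrality before and after removal. Link exchange and addition are omitted; networkx and scipy are assumed to be available.

```python
import random
import networkx as nx
from scipy.stats import spearmanr

def remove_random_links(g, fraction, seed=0):
    """Model observational noise as random link removal."""
    rng = random.Random(seed)
    noisy = g.copy()
    edges = list(noisy.edges())
    noisy.remove_edges_from(rng.sample(edges, int(fraction * len(edges))))
    return noisy

# Scale-free-like network (Barabasi-Albert) as a stand-in for an estimated gene network.
g = nx.barabasi_albert_graph(500, 3, seed=42)
bc_true = nx.betweenness_centrality(g)

for f in (0.05, 0.10, 0.20):
    noisy = remove_random_links(g, f)
    bc_noisy = nx.betweenness_centrality(noisy)
    nodes = sorted(g.nodes())
    rho, _ = spearmanr([bc_true[n] for n in nodes], [bc_noisy[n] for n in nodes])
    print(f"{int(f * 100):2d}% links removed -> rank correlation of betweenness: {rho:.3f}")
```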
ERIC Educational Resources Information Center
Reise, Steven P.; Ventura, Joseph; Keefe, Richard S. E.; Baade, Lyle E.; Gold, James M.; Green, Michael F.; Kern, Robert S.; Mesholam-Gately, Raquelle; Nuechterlein, Keith H.; Seidman, Larry J.; Bilder, Robert
2011-01-01
A psychometric analysis of 2 interview-based measures of cognitive deficits was conducted: the 21-item Clinical Global Impression of Cognition in Schizophrenia (CGI-CogS; Ventura et al., 2008), and the 20-item Schizophrenia Cognition Rating Scale (SCoRS; Keefe et al., 2006), which were administered on 2 occasions to a sample of people with…
Project Evaluation: Validation of a Scale and Analysis of Its Predictive Capacity
ERIC Educational Resources Information Center
Fernandes Malaquias, Rodrigo; de Oliveira Malaquias, Fernanda Francielle
2014-01-01
The objective of this study was to validate a scale for assessment of academic projects. As a complement, we examined its predictive ability by comparing the scores of advised/corrected projects based on the model and the final scores awarded to the work by an examining panel (approximately 10 months after the project design). Results of…
Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng
2016-06-24
Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consists of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the approach used by the well-known MassSpecWavelet: chromatographic peaks are found at local maxima under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maxima across these window scales, and signals that increase or decrease monotonically around the peak position can be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was applied to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets: essential oil samples for quality control obtained from gas chromatography, and tobacco plant samples for metabolic profiling obtained from gas chromatography coupled with mass spectrometry. The results confirmed the soundness of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
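The sketch below illustrates the ridge-line idea in a much-simplified form: positions that persist as local maxima across several Gaussian smoothing scales are kept as peak candidates. It omits the paper's background-drift correction and signal-to-noise filtration, and the synthetic chromatogram and scale settings are assumptions, not the published algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import argrelmax

def multiscale_peaks(signal, scales=(2, 4, 8, 16), min_scales=3):
    """Keep positions that appear as local maxima across several smoothing scales."""
    hits = np.zeros(len(signal), dtype=int)
    for s in scales:
        smoothed = gaussian_filter1d(signal, s)
        for idx in argrelmax(smoothed)[0]:
            hits[max(0, idx - 2):idx + 3] += 1   # tolerate small shifts between scales
    return np.where(hits >= min_scales)[0]

# Synthetic chromatogram: two peaks on a drifting baseline with noise.
rng = np.random.default_rng(3)
t = np.arange(1000)
chrom = (50 * np.exp(-((t - 300) / 12) ** 2) + 80 * np.exp(-((t - 640) / 20) ** 2)
         + 0.01 * t + rng.normal(0, 1.5, t.size))
print("candidate peak positions:", multiscale_peaks(chrom))
```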
Torsional Oscillations in a Global Solar Dynamo
NASA Astrophysics Data System (ADS)
Beaudoin, P.; Charbonneau, P.; Racine, E.; Smolarkiewicz, P. K.
2013-02-01
We characterize and analyze rotational torsional oscillations developing in a large-eddy magnetohydrodynamical simulation of solar convection (Ghizaru, Charbonneau, and Smolarkiewicz, Astrophys. J. Lett. 715, L133, 2010; Racine et al., Astrophys. J. 735, 46, 2011) producing an axisymmetric, large-scale, magnetic field undergoing periodic polarity reversals. Motivated by the many solar-like features exhibited by these oscillations, we carry out an analysis of the large-scale zonal dynamics. We demonstrate that simulated torsional oscillations are not driven primarily by the periodically varying large-scale magnetic torque, as one might have expected, but rather via the magnetic modulation of angular-momentum transport by the large-scale meridional flow. This result is confirmed by a straightforward energy analysis. We also detect a fairly sharp transition in rotational dynamics taking place as one moves from the base of the convecting layers to the base of the thin tachocline-like shear layer formed in the stably stratified fluid layers immediately below. We conclude by discussing the implications of our analyses with regard to the mechanism of amplitude saturation in the global dynamo operating in the simulation, and speculate on the possible precursor value of torsional oscillations for the forecast of solar-cycle characteristics.
NASA Technical Reports Server (NTRS)
Ahn, Kyung H.
1994-01-01
The RNG-based algebraic turbulence model, with a new method of solving the cubic equation and applying new length scales, is introduced. An analysis is made of the RNG length scale which was previously reported and the resulting eddy viscosity is compared with those from other algebraic turbulence models. Subsequently, a new length scale is introduced which actually uses the two previous RNG length scales in a systematic way to improve the model performance. The performance of the present RNG model is demonstrated by simulating the boundary layer flow over a flat plate and the flow over an airfoil.
Internal Fluid Dynamics and Frequency Scaling of Sweeping Jet Fluidic Oscillators
NASA Astrophysics Data System (ADS)
Seo, Jung Hee; Salazar, Erik; Mittal, Rajat
2017-11-01
Sweeping jet fluidic oscillators (SJFOs) are devices that produce a spatially oscillating jet solely based on intrinsic flow instability mechanisms without any moving parts. Recently, SJFOs have emerged as effective actuators for flow control, but the internal fluid dynamics of the device that drives the oscillatory flow mechanism is not yet fully understood. In the current study, the internal fluid dynamics of the fluidic oscillator with feedback channels has been investigated by employing incompressible flow simulations. The study is focused on the oscillation mechanisms and scaling laws that underpin the jet oscillation. Based on the simulation results, simple phenomenological models that connect the jet deflection to the feedback flow are developed. Several geometric modifications are considered in order to explore the characteristic length scales and phase relationships associated with the jet oscillation and to assess the proposed phenomenological model. A scaling law for the jet oscillation frequency is proposed based on the detailed analysis. This research is supported by AFOSR Grant FA9550-14-1-0289 monitored by Dr. Douglas Smith.
Krishnakumar, V; Prabavathi, N
2009-09-15
This work deals with the vibrational spectroscopy of p-hydroxyanisole (PHA) and p-nitroanisole (PNA) by means of quantum chemical calculations. The mid and far FT-IR and FT-Raman spectra were recorded in the condensed state. The fundamental vibrational frequencies and the intensities of the vibrational bands were evaluated using density functional theory (DFT) with the standard B3LYP/6-31G* method and basis set combination, and were scaled using various scale factors, which yielded good agreement between observed and calculated frequencies. The vibrational spectra were interpreted with the aid of normal coordinate analysis based on a scaled density functional force field. The results of the calculations were applied to simulate the infrared and Raman spectra of the title compounds, which showed excellent agreement with the observed spectra.
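The scaling step in such studies often amounts to fitting one or more multiplicative factors that bring calculated harmonic frequencies onto the observed fundamentals. The sketch below fits a single uniform least-squares scale factor to a handful of invented frequency pairs; the paper itself used several scale factors, and none of the numbers below are taken from it.

```python
import numpy as np

# Hypothetical harmonic frequencies from a B3LYP/6-31G* calculation (cm^-1) and the
# corresponding observed fundamentals; values are illustrative, not from the paper.
calc = np.array([3240.0, 1665.0, 1530.0, 1190.0, 860.0])
obs  = np.array([3065.0, 1605.0, 1498.0, 1172.0, 843.0])

# Least-squares uniform scale factor: minimizes sum of (obs - c * calc)^2.
c = np.dot(calc, obs) / np.dot(calc, calc)
rms = np.sqrt(np.mean((obs - c * calc) ** 2))
print(f"fitted scale factor c = {c:.4f}, RMS deviation = {rms:.1f} cm^-1")
```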
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
NASA Astrophysics Data System (ADS)
Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya
2017-10-01
Viscidity is an important physical indicator for assessing the fluidity of resin; good fluidity helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution has rarely been studied for the AFP process. In this paper, viscidities at different scales are analyzed based on a multi-scale analysis method. First, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed by using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) with time. Finally, the correlation characteristics of processing parameters to viscosity are revealed by using the gray relational analysis method (GRAM). A group of processing parameters is identified that achieves stable viscosity and better fluidity of the resin.
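The step of obtaining viscosity by integrating the stress autocorrelation function over time corresponds to the Green-Kubo relation eta = V/(kB*T) * integral of <sigma_xy(0) sigma_xy(t)> dt. The sketch below applies it to a synthetic, exponentially correlated stress trace as a stand-in for MD output; the volume, temperature, and stress statistics are arbitrary assumptions, and the FEM and GRAM parts of the paper's pipeline are not reproduced.

```python
import numpy as np

def green_kubo_viscosity(stress_xy, dt, volume, temperature, kB=1.380649e-23):
    """Shear viscosity from the Green-Kubo relation:
    eta = V / (kB * T) * integral_0^inf <sigma_xy(0) sigma_xy(t)> dt,
    with the stress autocorrelation function estimated from a single trajectory."""
    s = stress_xy - stress_xy.mean()
    n = len(s)
    # Biased autocorrelation estimate up to half the trajectory length.
    acf = np.array([np.mean(s[: n - k] * s[k:]) for k in range(n // 2)])
    running_integral = np.cumsum(acf) * dt
    return volume / (kB * temperature) * running_integral[-1]

# Toy trajectory: exponentially correlated stress fluctuations (stand-in for MD output).
rng = np.random.default_rng(7)
dt, n, tau = 1e-15, 10_000, 5e-13          # timestep (s), steps, correlation time (s)
noise = rng.normal(0, 1e6, n)              # stress fluctuation amplitude (Pa), assumed
stress = np.empty(n)
stress[0] = noise[0]
alpha = np.exp(-dt / tau)
for i in range(1, n):
    stress[i] = alpha * stress[i - 1] + np.sqrt(1 - alpha**2) * noise[i]

eta = green_kubo_viscosity(stress, dt, volume=8e-27, temperature=450.0)
print(f"estimated viscosity ~ {eta:.3e} Pa*s")
```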
Multiscale analysis of the intensity fluctuation in a time series of dynamic speckle patterns.
Federico, Alejandro; Kaufmann, Guillermo H
2007-04-10
We propose the application of a method based on the discrete wavelet transform to detect, identify, and measure scaling behavior in dynamic speckle. The multiscale phenomena presented by a sample and displayed by its speckle activity are analyzed by processing the time series of dynamic speckle patterns. The scaling analysis is applied to the temporal fluctuation of the speckle intensity and also to the two derived data sets generated by its magnitude and sign. The application of the method is illustrated by analyzing paint-drying processes and bruising in apples. The results are discussed taking into account the different time organizations obtained for the scaling behavior of the magnitude and the sign of the intensity fluctuation.
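As a rough sketch of wavelet-based scaling analysis applied to a fluctuation series and to its derived magnitude and sign series, the code below uses the variance of discrete-wavelet-transform detail coefficients per level (via PyWavelets) and reports the slope of the log2 variances across scales. This is a generic estimator on surrogate data, not the specific method or speckle recordings of the paper.

```python
import numpy as np
import pywt

def wavelet_log_variances(x, wavelet="db2", max_level=8):
    """log2 variance of DWT detail coefficients per decomposition level; the slope
    of these values across levels is a simple wavelet-based scaling estimate."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=max_level)
    details = coeffs[1:][::-1]          # reorder from finest to coarsest scale
    return [np.log2(np.var(d)) for d in details]

# Surrogate intensity fluctuation: an AR(1) series stands in for the speckle signal;
# the magnitude and sign series are the derived data sets analysed in the paper.
rng = np.random.default_rng(11)
n = 2 ** 14
noise = rng.standard_normal(n)
fluct = np.empty(n)
fluct[0] = noise[0]
for i in range(1, n):
    fluct[i] = 0.7 * fluct[i - 1] + noise[i]
magnitude, sign = np.abs(fluct), np.sign(fluct)

for name, series in [("fluctuation", fluct), ("magnitude", magnitude), ("sign", sign)]:
    log_var = wavelet_log_variances(series)
    slope = np.polyfit(np.arange(1, len(log_var) + 1), log_var, 1)[0]
    print(f"{name:11s}: slope of log2 detail variance across scales = {slope:+.2f}")
```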
Metabolic Network Modeling of Microbial Communities
Biggs, Matthew B.; Medlock, Gregory L.; Kolling, Glynis L.
2015-01-01
Genome-scale metabolic network reconstructions and constraint-based analysis are powerful methods that have the potential to make functional predictions about microbial communities. Current use of genome-scale metabolic networks to characterize the metabolic functions of microbial communities includes species compartmentalization, separating species-level and community-level objectives, dynamic analysis, the “enzyme-soup” approach, multi-scale modeling, and others. There are many challenges inherent to the field, including a need for tools that accurately assign high-level omics signals to individual community members, new automated reconstruction methods that rival manual curation, and novel algorithms for integrating omics data and engineering communities. As technologies and modeling frameworks improve, we expect that there will be proportional advances in the fields of ecology, health science, and microbial community engineering. PMID:26109480
The instrument 'sense of security in care--patients' evaluation': its development and presentation.
Krevers, Barbro; Milberg, Anna
2014-08-01
The aim of this paper is to report the development, construction, and psychometric properties of the new instrument Sense of Security in Care - Patients' Evaluation (SEC-P) in palliative home care. The preliminary instrument was based on a review of the literature and an analysis of qualitative interviews with patients about their sense of security. To test the instrument, 161 patients (58% women) in palliative home care were recruited and participated in a structured interview based on a comprehensive questionnaire (response rate 73%). We used principal component analysis to identify subscales and tested the construct through correlations with other scales and questions representing concepts that we expected to be related to sense of security in care. The principal component analysis resulted in three subscales, Care Interaction, Identity, and Mastery, built on a total of 15 items. The component solution had an explained variance of 55%. Internal consistency of the subscales ranged from 0.69 to 0.84. Inter-scale correlations varied between 0.40 and 0.59. The scales were associated to varying degrees with the quality of the care process, perceived health, quality of life, stress, and general sense of security. The developed SEC-P provides a three-component assessment of palliative home care settings using valid and reliable scales. The scales were associated with other concepts in ways that were expected. The SEC-P is a manageable means of assessment that can be used to improve quality of care and in research focusing on patients' sense of security in care. Copyright © 2014 John Wiley & Sons, Ltd.
Validity and Reliability of the Upper Extremity Work Demands Scale.
Jacobs, Nora W; Berduszek, Redmar J; Dijkstra, Pieter U; van der Sluis, Corry K
2017-12-01
Purpose: To evaluate the validity and reliability of the upper extremity work demands (UEWD) scale. Methods: Participants from different levels of physical work demands, based on the Dictionary of Occupational Titles categories, were included. A historical database of 74 workers was added for factor analysis. Criterion validity was evaluated by comparing observed and self-reported UEWD scores. To assess structural validity, a factor analysis was executed. For reliability, the difference between two self-reported UEWD scores, the smallest detectable change (SDC), test-retest reliability, and internal consistency were determined. Results: Fifty-four participants were observed at work and 51 of them filled in the UEWD twice, with a mean interval of 16.6 days (SD 3.3, range = 10-25 days). Criterion validity of the UEWD scale was moderate (r = .44, p = .001). Factor analysis revealed that 'force and posture' and 'repetition' subscales could be distinguished, with Cronbach's alpha of .79 and .84, respectively. Reliability was good; there was no significant difference between repeated measurements. An SDC of 5.0 was found. Test-retest reliability was good (intraclass correlation coefficient for agreement = .84) and all item-total correlations were >.30. There were two pairs of highly related items. Conclusion: Reliability of the UEWD scale was good, but criterion validity was moderate. Based on current results, a modified UEWD scale (2 items removed, 1 item reworded, divided into 2 subscales) was proposed. Since observation appeared to be an inappropriate gold standard, we advise investigating other types of validity, such as construct validity, in further research.
The Systems Biology Markup Language (SBML) Level 3 Package: Flux Balance Constraints.
Olivier, Brett G; Bergmann, Frank T
2015-09-04
Constraint-based modeling is a well established modelling methodology used to analyze and study biological networks on both a medium and genome scale. Due to their large size, genome scale models are typically analysed using constraint-based optimization techniques. One widely used method is Flux Balance Analysis (FBA) which, for example, requires a modelling description to include: the definition of a stoichiometric matrix, an objective function and bounds on the values that fluxes can obtain at steady state. The Flux Balance Constraints (FBC) Package extends SBML Level 3 and provides a standardized format for the encoding, exchange and annotation of constraint-based models. It includes support for modelling concepts such as objective functions, flux bounds and model component annotation that facilitates reaction balancing. The FBC package establishes a base level for the unambiguous exchange of genome-scale, constraint-based models, that can be built upon by the community to meet future needs (e.g., by extending it to cover dynamic FBC models).
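For readers unfamiliar with what an FBC-style model encodes, the sketch below runs Flux Balance Analysis on a three-metabolite toy network as a linear program with scipy: maximize the biomass flux subject to S v = 0 and flux bounds. The network, bounds, and objective are invented for illustration and are not part of the FBC specification.

```python
import numpy as np
from scipy.optimize import linprog

# Toy constraint-based model (not from the FBC spec): uptake -> A, two routes A -> B,
# and a biomass reaction draining B. Rows are metabolites, columns are reactions.
#              R_up  R1   R2   R3  R_bio
S = np.array([[  1,  -1,  -1,   0,   0],   # A
              [  0,   1,   0,   1,  -1],   # B
              [  0,   0,   1,  -1,   0]])  # C
bounds = [(0, 10), (0, 5), (0, 10), (0, 10), (0, 1000)]   # flux bounds (lb, ub)

# Flux Balance Analysis: maximize the biomass flux subject to S v = 0 and the bounds.
c = np.zeros(S.shape[1]); c[-1] = -1.0       # linprog minimizes, so negate the objective
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
print("optimal biomass flux:", -res.fun)     # expected: 10 (5 via R1, 5 via R2 + R3)
print("flux distribution:", np.round(res.x, 3))
```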
Pavlov, A N; Pavlova, O N; Abdurashitov, A S; Sindeeva, O A; Semyachkina-Glushkovskaya, O V; Kurths, J
2018-01-01
The scaling properties of complex processes may be highly influenced by the presence of various artifacts in experimental recordings. Their removal produces changes in the singularity spectra and the Hölder exponents as compared with the original artifacts-free data, and these changes are significantly different for positively correlated and anti-correlated signals. While signals with power-law correlations are nearly insensitive to the loss of significant parts of data, the removal of fragments of anti-correlated signals is more crucial for further data analysis. In this work, we study the ability of characterizing scaling features of chaotic and stochastic processes with distinct correlation properties using a wavelet-based multifractal analysis, and discuss differences between the effect of missed data for synchronous and asynchronous oscillatory regimes. We show that even an extreme data loss allows characterizing physiological processes such as the cerebral blood flow dynamics.
Development and validation of a professionalism assessment scale for medical students
Klemenc-Ketis, Zalika; Vrecko, Helena
2014-01-01
Objectives: To develop and validate a scale for the assessment of professionalism in medical students based on students' perceptions of and attitudes towards professionalism in medicine. Methods: This was a mixed methods study with undergraduate medical students. Two focus groups were carried out with 12 students, followed by a transcript analysis (grounded theory method with open coding). Then, a 3-round Delphi with 20 family medicine experts was carried out. A psychometric assessment of the scale was performed with a group of 449 students. The items of the Professionalism Assessment Scale could be answered on a five-point Likert scale. Results: After the focus groups, the first version of the PAS consisted of 56 items and after the Delphi study, 30 items remained. The final sample for quantitative study consisted of 122 students (27.2% response rate). There were 95 (77.9%) female students in the sample. The mean age of the sample was 22.1 ± 2.1 years. After the principal component analysis, we removed 8 items and produced the final version of the PAS (22 items). The Cronbach's alpha of the scale was 0.88. Factor analysis revealed three factors: empathy and humanism, professional relationships and development and responsibility. Conclusions: The new Professionalism Assessment Scale proved to be valid and reliable. It can be used for the assessment of professionalism in undergraduate medical students. PMID:25382090
Stemp, W James; Chung, Steven
2011-01-01
This pilot study tests the reliability of laser scanning confocal microscopy (LSCM) to quantitatively measure wear on experimental obsidian tools. To our knowledge, this is the first use of confocal microscopy to study wear on stone flakes made from an amorphous silicate like obsidian. Three-dimensional surface roughness or texture area scans on three obsidian flakes used on different contact materials (hide, shell, wood) were documented using the LSCM to determine whether the worn surfaces could be discriminated using area-scale analysis, specifically relative area (RelA). When coupled with the F-test, this scale-sensitive fractal analysis could not only discriminate the used from unused surfaces on individual tools, but was also capable of discriminating the wear histories of tools used on different contact materials. Results indicate that such discriminations occur at different scales. Confidence levels for the discriminations at different scales were established using the F-test (mean square ratios or MSRs). In instances where discrimination of surface roughness or texture was not possible above the established confidence level based on MSRs, photomicrographs and RelA assisted in hypothesizing why this was so. Copyright © 2011 Wiley Periodicals, Inc.
Influence of the Time Scale on the Construction of Financial Networks
Emmert-Streib, Frank; Dehmer, Matthias
2010-01-01
Background: In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. Methodology/Principal Findings: For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That means an edge is included in the network only if the correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Conclusions/Significance: Numerical analysis of four different measures in dependence on the time scale for the construction of networks allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis. PMID:20949124
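A minimal sketch of the construction procedure follows: for each non-overlapping interval, compute pairwise return correlations and add an edge only when the correlation is significantly different from zero (here via a Fisher z-test, an assumption, since the abstract does not name the test used). Surrogate returns with a weak common factor replace the DJIA data.

```python
import numpy as np
from scipy.stats import norm

def correlation_network(returns, alpha=0.01):
    """Adjacency matrix of an unweighted, undirected network: two stocks are linked
    when their return correlation differs significantly from zero (Fisher z-test)."""
    n_obs, n_stocks = returns.shape
    corr = np.corrcoef(returns, rowvar=False)
    z_crit = norm.ppf(1 - alpha / 2) / np.sqrt(n_obs - 3)
    adj = (np.abs(np.arctanh(np.clip(corr, -0.999999, 0.999999))) > z_crit).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

# Surrogate daily returns for 30 "stocks" sharing a weak common factor; the number of
# significant edges changes with the interval length used to build each network.
rng = np.random.default_rng(5)
for n_days in (21, 63, 252):                     # roughly 1 month, 1 quarter, 1 year
    common = rng.standard_normal((n_days, 1))
    returns = 0.3 * common + rng.standard_normal((n_days, 30))
    edges = correlation_network(returns).sum() // 2
    print(f"interval of {n_days:3d} days -> {edges} significant edges")
```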
Single-Image Super-Resolution Based on Rational Fractal Interpolation.
Zhang, Yunfeng; Fan, Qinglan; Bao, Fangxun; Liu, Yifang; Zhang, Caiming
2018-08-01
This paper presents a novel single-image super-resolution (SR) procedure, which upscales a given low-resolution (LR) input image to a high-resolution image while preserving the textural and structural information. First, we construct a new type of bivariate rational fractal interpolation model and investigate its analytical properties. This model has different forms of expression with various values of the scaling factors and shape parameters; thus, it can be employed to better describe image features than current interpolation schemes. Furthermore, this model combines the advantages of rational interpolation and fractal interpolation, and its effectiveness is validated through theoretical analysis. Second, we develop a single-image SR algorithm based on the proposed model. The LR input image is divided into texture and non-texture regions, and then, the image is interpolated according to the characteristics of the local structure. Specifically, in the texture region, the scaling factor calculation is the critical step. We present a method to accurately calculate scaling factors based on local fractal analysis. Extensive experiments and comparisons with the other state-of-the-art methods show that our algorithm achieves competitive performance, with finer details and sharper edges.
Pankavich, S; Ortoleva, P
2010-06-01
The multiscale approach to N-body systems is generalized to address the broad continuum of long time and length scales associated with collective behaviors. A technique is developed based on the concept of an uncountable set of time variables and of order parameters (OPs) specifying major features of the system. We adopt this perspective as a natural extension of the commonly used discrete set of time scales and OPs which is practical when only a few, widely separated scales exist. The existence of a gap in the spectrum of time scales for such a system (under quasiequilibrium conditions) is used to introduce a continuous scaling and perform a multiscale analysis of the Liouville equation. A functional-differential Smoluchowski equation is derived for the stochastic dynamics of the continuum of Fourier component OPs. A continuum of spatially nonlocal Langevin equations for the OPs is also derived. The theory is demonstrated via the analysis of structural transitions in a composite material, as occurs for viral capsids and molecular circuits.
Use of JAR-Based Analysis for Improvement of Product Acceptance: A Case Study on Flavored Kefirs.
Gere, Attila; Szabó, Zsófia; Pásztor-Huszár, Klára; Orbán, Csaba; Kókai, Zoltán; Sipos, László
2017-05-01
A common question in dairy product development is the likely success of the new product. Several publications have reported successful results using just-about-right (JAR) scales, although there is some debate about their advantages and disadvantages. This study highlights the limitations and opportunities of JAR scales and penalty analysis using fruit-flavored kefirs. The first question is whether penalty analysis results help to improve the product and thus its overall liking (OAL). The second question is what happens to consumers who rated the products "ideal" (JAR) before product development when they evaluate the new products. Fruit-flavored live-flora stirred-type kefir samples were formulated and evaluated by 92 consumers before and after the JAR-based product development. The OAL of two products significantly increased after product development. A new visualization tool is introduced, which shows what happens to consumers who rated an attribute as JAR when that attribute has been modified. A general product development scheme is also introduced for JAR-based kefir product development. © 2017 Institute of Food Technologists®.
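A hedged sketch of a standard penalty analysis of the kind referred to above: consumers are grouped by their JAR rating of an attribute, and the mean drop in overall liking is computed for the non-JAR groups. The column names and the 5-point JAR coding (3 = just about right) are assumptions, not the authors' actual data layout.

```python
import pandas as pd

def penalty_analysis(df, jar_col, oal_col="oal"):
    """Penalty (mean drop in overall liking) for 'too little' and 'too much'
    groups of a single JAR attribute, plus each group's share of consumers."""
    jar = df[df[jar_col] == 3]
    too_little = df[df[jar_col] < 3]
    too_much = df[df[jar_col] > 3]
    rows = []
    for name, group in [("too little", too_little), ("too much", too_much)]:
        if len(group) == 0:
            continue
        rows.append({
            "level": name,
            "share": len(group) / len(df),                             # % of consumers
            "mean_drop": jar[oal_col].mean() - group[oal_col].mean(),  # penalty
        })
    return pd.DataFrame(rows)
```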
Dosage-based parameters for characterization of puff dispersion results.
Berbekar, Eva; Harms, Frank; Leitl, Bernd
2015-01-01
A set of parameters is introduced to characterize the dispersion of puff releases based on the measured dosage. These parameters are the dosage, peak concentration, arrival time, peak time, leaving time, ascent time, descent time and duration. Dimensionless numbers for the scaling of the parameters are derived from dimensional analysis. The dimensionless numbers are tested and confirmed based on a statistically representative wind tunnel dataset. The measurements were carried out in a 1:300 scale model of the Central Business District in Oklahoma City. Additionally, the effect of the release duration on the puff parameters is investigated. Copyright © 2014 Elsevier B.V. All rights reserved.
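A hedged sketch of how the dosage-based puff parameters listed above can be computed from a single concentration time series. The 5%/95% cumulative-dosage thresholds used here to define arrival and leaving times are a common convention and an assumption, not necessarily the authors' definition; non-negative concentrations are assumed.

```python
import numpy as np

def puff_parameters(t, c, lo=0.05, hi=0.95):
    """t: time array, c: concentration array sampled at t (same length)."""
    dosage = np.trapz(c, t)                       # time-integrated concentration
    # cumulative dosage via trapezoidal rule, same length as t
    cum = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (c[1:] + c[:-1]))))
    t_arrival = np.interp(lo * dosage, cum, t)
    t_leaving = np.interp(hi * dosage, cum, t)
    t_peak = t[np.argmax(c)]
    return {
        "dosage": dosage,
        "peak_concentration": c.max(),
        "arrival_time": t_arrival,
        "peak_time": t_peak,
        "leaving_time": t_leaving,
        "ascent_time": t_peak - t_arrival,
        "descent_time": t_leaving - t_peak,
        "duration": t_leaving - t_arrival,
    }
```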
A Multivariate Descriptive Model of Motivation for Orthodontic Treatment.
ERIC Educational Resources Information Center
Hackett, Paul M. W.; And Others
1993-01-01
Motivation for receiving orthodontic treatment was studied among 109 young adults, and a multivariate model of the process is proposed. The combination of smallest scale analysis and Partial Order Scalogram Analysis by base Coordinates (POSAC) illustrates an interesting methodology for health treatment studies and explores motivation for dental…
Interim Terrain Data (ITD) and Vector Product Interim Terrain Data (VITD) user's guide
DOT National Transportation Integrated Search
1996-09-01
This guide is intended to be a convenient reference for users of these types of terrain analysis data. ITD is a digitized version of the standard 1:50,000-scale tactical terrain analysis data base (TTADB) product produced by the Defense Mapping Agenc...
Meta-Analysis of Scale Reliability Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2013-01-01
A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…
Mining a Web Citation Database for Author Co-Citation Analysis.
ERIC Educational Resources Information Center
He, Yulan; Hui, Siu Cheung
2002-01-01
Proposes a mining process to automate author co-citation analysis based on the Web Citation Database, a data warehouse for storing citation indices of Web publications. Describes the use of agglomerative hierarchical clustering for author clustering and multidimensional scaling for displaying author cluster maps, and explains PubSearch, a…
Three-Dimensional Numerical Analyses of Earth Penetration Dynamics
1979-01-31
Lagrangian formulation based on the HEMP method and has been adapted and validated for treatment of normal-incidence (axisymmetric) impact and...code, is a detailed analysis of the structural response of the EPW. This analysis is generated using a nonlinear dynamic, elastic-plastic finite element...based on the HEMP scheme. Thus, the code has the same material modeling capabilities and abilities to track large-scale motion found in the WAVE-L code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersen, Mie; Medford, Andrew J.; Norskov, Jens K.
2017-04-14
Here, we present a generic analysis of the implications of energetic scaling relations on the possibilities for bifunctional gains at homogeneous bimetallic alloy catalysts. Such catalysts exhibit a large number of interface sites, where second-order reaction steps can involve intermediates adsorbed at different active sites. Using different types of model reaction schemes, we show that such site-coupling reaction steps can provide bifunctional gains that allow for a bimetallic catalyst composed of two individually poor catalyst materials to approach the activity of the optimal monomaterial catalyst. However, bifunctional gains cannot result in activities higher than the activity peak of the monomaterial volcano curve as long as both sites obey similar scaling relations, as is generally the case for bimetallic catalysts. These scaling-relation-imposed limitations could be overcome by combining different classes of materials such as metals and oxides.
Development and validation of the Alcohol Myopia Scale.
Lac, Andrew; Berger, Dale E
2013-09-01
Alcohol myopia theory conceptualizes the ability of alcohol to narrow attention and how this demand on mental resources produces the impairments of self-inflation, relief, and excess. The current research was designed to develop and validate a scale based on this framework. People who were alcohol users rated items representing myopic experiences arising from drinking episodes in the past month. In Study 1 (N = 260), the preliminary 3-factor structure was supported by exploratory factor analysis. In Study 2 (N = 289), the 3-factor structure was substantiated with confirmatory factor analysis, and it was superior in fit to an empirically indefensible 1-factor structure. The final 14-item scale was evaluated with internal consistency reliability, discriminant validity, convergent validity, criterion validity, and incremental validity. The alcohol myopia scale (AMS) illuminates conceptual underpinnings of this theory and yields insights for understanding the tunnel vision that arises from intoxication.
NASA Astrophysics Data System (ADS)
Rana, Arun; Moradkhani, Hamid
2016-07-01
Uncertainties in climate modelling are well documented in the literature. Global Climate Models (GCMs) are often used to downscale climatic parameters to a regional scale. In the present work, we analyzed the changes in precipitation and temperature for the future scenario period of 2070-2099 with respect to the historical period of 1970-2000, using statistically downscaled GCM projections in the Columbia River Basin (CRB). The analysis is performed using two different statistically downscaled climate projections (ten downscaled GCM products each, for RCP 4.5 and RCP 8.5, from the CMIP5 dataset): those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University, and those from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Both the BCSD and MACA datasets are downscaled from observed data for both scenario projections, i.e. RCP4.5 and RCP8.5. The analysis covers spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a resolution of 1/16th degree for the entire CRB. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate a considerable positive change in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.
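As a small illustration of the quantile-change part of such an analysis, the sketch below compares annual quantiles of a downscaled future series against the historical series for a single grid cell; the array names and the chosen quantiles are assumptions, not the authors' exact procedure.

```python
import numpy as np

def quantile_change(hist, fut, quantiles=(0.1, 0.25, 0.5, 0.75, 0.9)):
    """Percent change of each quantile of the future series relative to the
    historical series (both 1-D arrays of annual values for one grid cell)."""
    q = np.asarray(quantiles)
    hist_q = np.quantile(hist, q)
    fut_q = np.quantile(fut, q)
    return dict(zip(q, 100.0 * (fut_q - hist_q) / hist_q))
```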
Scale-dependent approaches to modeling spatial epidemiology of chronic wasting disease.
Conner, Mary M.; Gross, John E.; Cross, Paul C.; Ebinger, Michael R.; Gillies, Robert; Samuel, Michael D.; Miller, Michael W.
2007-01-01
For each scale, we presented a focal approach that is useful for understanding the spatial pattern and epidemiology of CWD, as well as being a useful tool for CWD management. The focal approaches include risk analysis and micromaps at the regional scale, cluster analysis at the landscape scale, and individual-based modeling at the fine, within-population scale. For each of these methods, we used simulated data and walked through the method step by step to fully illustrate the “how to”, with specifics about inputs and outputs, as well as the questions the method addresses. We also provided a summary table that describes, at a glance, the scale, the questions that can be addressed, and the general data required for each method described in this e-book. We hope that this review will be helpful to biologists and managers by increasing the utility of their surveillance data, and ultimately be useful for increasing our understanding of CWD, allowing wildlife biologists and managers to move beyond retroactive fire-fighting to proactive preventative action.
Prediction of Gas Injection Performance for Heterogeneous Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blunt, Martin J.; Orr, Franklin M.
This report describes research carried out in the Department of Petroleum Engineering at Stanford University from September 1997 - September 1998 under the second year of a three-year grant from the Department of Energy on the "Prediction of Gas Injection Performance for Heterogeneous Reservoirs." The research effort is an integrated study of the factors affecting gas injection, from the pore scale to the field scale, and involves theoretical analysis, laboratory experiments, and numerical simulation. The original proposal described research in four areas: (1) Pore-scale modeling of three-phase flow in porous media; (2) Laboratory experiments and analysis of factors influencing gas injection performance at the core scale with an emphasis on the fundamentals of three-phase flow; (3) Benchmark simulations of gas injection at the field scale; and (4) Development of a streamline-based reservoir simulator. Each stage of the research is planned to provide input and insight into the next stage, such that at the end we should have an integrated understanding of the key factors affecting field-scale displacements.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
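A schematic sketch of the retrieval stages named above (precomputed feature representation, indexing, searching), using an exact nearest-neighbour index as a simple stand-in for the large-scale indexing structures discussed in the review; the feature extractor, matrix shapes and parameter values are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def build_index(features):
    """Index a (n_images, n_features) matrix of precomputed image descriptors."""
    index = NearestNeighbors(n_neighbors=10, metric="euclidean")
    index.fit(features)
    return index

def search(index, query_features, k=10):
    """Return distances and row indices of the k most similar database images
    for one query descriptor (1-D numpy array)."""
    return index.kneighbors(query_features.reshape(1, -1), n_neighbors=k)
```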
Web-Based Virtual Laboratory for Food Analysis Course
NASA Astrophysics Data System (ADS)
Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.
2018-02-01
Implementation of learning in the food analysis course of the Agro-industrial Technology Education study program faced several problems. These problems include laboratory space and equipment that are insufficient for the number of students, as well as a lack of interactive learning tools. On the other hand, students' information technology literacy is quite high and the internet network is easily accessible on campus. This is both a challenge and an opportunity for developing learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. This research follows the R & D (research and development) approach of the Borg & Gall model. The results show that, according to expert assessment of the developed web-based virtual laboratory in terms of software engineering, visual communication, material relevance, usefulness and language, it is feasible as a learning medium. The results of the scaled test and the wide-scale test show that students strongly agree with the development of the web-based virtual laboratory. The response of students to this virtual laboratory was positive. Suggestions from students provide opportunities for further improvement of the web-based virtual laboratory and should be considered in future research.
NASA Astrophysics Data System (ADS)
Kim, Jonghoon; Cho, B. H.
2014-08-01
This paper introduces an innovative approach to analyzing the electrochemical characteristics and diagnosing the state of health (SOH) of a Li-ion cell based on the discrete wavelet transform (DWT). In this approach, the DWT is applied as a powerful tool for the analysis of the discharging/charging voltage signal (DCVS) of a Li-ion cell, which exhibits non-stationary and transient phenomena. Specifically, DWT-based multi-resolution analysis (MRA) is used to extract information on the electrochemical characteristics in the time and frequency domains simultaneously. Using MRA with wavelet decomposition, information on the electrochemical characteristics of a Li-ion cell can be extracted from the DCVS over a wide frequency range. Wavelet decomposition is implemented with the order-3 Daubechies wavelet (dB3) and scale 5 selected as the best wavelet function and the optimal decomposition scale. In particular, the present approach develops these investigations one step further by showing low- and high-frequency components (approximation component An and detail component Dn, respectively) extracted from various Li-ion cells with different electrochemical characteristics caused by aging effects. Experimental results show the effectiveness of the DWT-based approach for reliable diagnosis of the SOH of a Li-ion cell.
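A hedged sketch of the decomposition step described above, using the PyWavelets package: a 5-level DWT of a voltage signal with the Daubechies-3 ('db3') wavelet, returning the approximation (An) and detail (Dn) components. The synthetic voltage curve is an assumption used only to make the example runnable.

```python
import numpy as np
import pywt

def dwt_mra(dcvs, wavelet="db3", level=5):
    """Return the approximation (A_n) and detail (D_n) coefficient arrays."""
    coeffs = pywt.wavedec(dcvs, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]       # A5, then D5 ... D1
    return approx, details

# Example: decompose a synthetic voltage curve sampled during discharge.
dcvs = 4.2 - 0.8 * np.linspace(0, 1, 4096) + 0.01 * np.random.randn(4096)
A5, (D5, D4, D3, D2, D1) = dwt_mra(dcvs)
```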
NASA Astrophysics Data System (ADS)
Akbar, Ruzbeh; Short Gianotti, Daniel; McColl, Kaighin A.; Haghighi, Erfan; Salvucci, Guido D.; Entekhabi, Dara
2018-03-01
The soil water content profile is often well correlated with the soil moisture state near the surface. They share mutual information such that analysis of surface-only soil moisture is, at times and in conjunction with precipitation information, reflective of deeper soil fluxes and dynamics. This study examines the characteristic length scale, or effective depth Δz, of a simple active hydrological control volume. The volume is described only by precipitation inputs and soil water dynamics evident in surface-only soil moisture observations. To proceed, first an observation-based technique is presented to estimate the soil moisture loss function based on analysis of soil moisture dry-downs and its successive negative increments. Then, the length scale Δz is obtained via an optimization process wherein the root-mean-squared (RMS) differences between surface soil moisture observations and its predictions based on water balance are minimized. The process is entirely observation-driven. The surface soil moisture estimates are obtained from the NASA Soil Moisture Active Passive (SMAP) mission and precipitation from the gauge-corrected Climate Prediction Center daily global precipitation product. The length scale Δz exhibits a clear east-west gradient across the contiguous United States (CONUS), such that large Δz depths (>200 mm) are estimated in wetter regions with larger mean precipitation. The median Δz across CONUS is 135 mm. The spatial variance of Δz is predominantly explained and influenced by precipitation characteristics. Soil properties, especially texture in the form of sand fraction, as well as the mean soil moisture state have a lesser influence on the length scale.
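A much-simplified, hedged sketch of the optimization described above: given daily surface soil moisture observations, daily precipitation and a previously estimated loss function, find the effective depth Δz that minimizes the RMS difference between observed soil moisture and its one-step water-balance prediction. The function and variable names, units, bounds and the one-day time step are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rms_of_dz(dz, theta, p, loss_fn):
    """RMS error of a one-day water-balance forecast of surface soil moisture.
    theta: daily soil moisture (m3/m3), p: daily precipitation (mm),
    loss_fn: loss rate (mm/day) as a function of soil moisture, dz: depth (mm)."""
    pred = theta[:-1] + (p[:-1] - loss_fn(theta[:-1])) / dz
    return np.sqrt(np.nanmean((pred - theta[1:]) ** 2))

def estimate_dz(theta, p, loss_fn, bounds=(10.0, 1000.0)):
    res = minimize_scalar(rms_of_dz, bounds=bounds, method="bounded",
                          args=(theta, p, loss_fn))
    return res.x   # effective depth in mm
```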
Michael, P E; Jahncke, J; Hyrenbach, K D
2016-01-01
At-sea surveys facilitate the study of the distribution and abundance of marine birds along standardized transects, in relation to changes in the local environmental conditions and large-scale oceanographic forcing. We analyzed the form and the intensity of black-footed albatross (Phoebastria nigripes: BFAL) spatial dispersion off central California, using five years (2004-2008) of vessel-based surveys of seven replicated survey lines. We related BFAL patchiness to local, regional and basin-wide oceanographic variability using two complementary approaches: a hypothesis-based model and an exploratory analysis. The former tested the strength and sign of hypothesized BFAL responses to environmental variability, within a hierarchical atmosphere-ocean context. The latter explored BFAL cross-correlations with atmospheric / oceanographic variables. While albatross dispersion was not significantly explained by the hierarchical model, the exploratory analysis revealed that aggregations were influenced by static (latitude, depth) and dynamic (wind speed, upwelling) environmental variables. Moreover, the largest BFAL patches occurred along the survey lines with the highest densities, and in association with shallow banks. In turn, the highest BFAL densities occurred during periods of negative Pacific Decadal Oscillation index values and low atmospheric pressure. The exploratory analyses suggest that BFAL dispersion is influenced by basin-wide, regional-scale and local environmental variability. Furthermore, the hypothesis-based model highlights that BFAL do not respond to oceanographic variability in a hierarchical fashion. Instead, their distributions shift more strongly in response to large-scale ocean-atmosphere forcing. Thus, interpreting local changes in BFAL abundance and dispersion requires considering diverse environmental forcing operating at multiple scales.
ERIC Educational Resources Information Center
Igami, Masatsura; Okazaki, Teruo
2007-01-01
This analysis aims at capturing current inventive activities in nanotechnologies based on the analysis of patent applications to the European Patent Office (EPO). Reported findings include: (1) Nanotechnology is a multifaceted technology, currently consisting of a set of technologies on the nanometre scale rather than a single technological field;…
A Spatial Framework to Map Heat Health Risks at Multiple Scales.
Ho, Hung Chak; Knudby, Anders; Huang, Wei
2015-12-18
In the last few decades extreme heat events have led to substantial excess mortality, most dramatically in Central Europe in 2003, in Russia in 2010, and even in typically cool locations such as Vancouver, Canada, in 2009. Heat-related morbidity and mortality are expected to increase over the coming centuries as the result of climate-driven global increases in the severity and frequency of extreme heat events. Spatial information on heat exposure and population vulnerability may be combined to map the areas of highest risk and focus mitigation efforts there. However, a mismatch in spatial resolution between heat exposure and vulnerability data can cause spatial scale issues such as the Modifiable Areal Unit Problem (MAUP). We used a raster-based model to integrate heat exposure and vulnerability data in a multi-criteria decision analysis, and compared it to the traditional vector-based model. We then used the Getis-Ord G(i) index to generate spatially smoothed heat risk hotspot maps from fine to coarse spatial scales. The raster-based model allowed the production of maps at a finer spatial resolution, a more detailed description of local-scale heat risk variability, and the identification of heat-risk areas not identified with the vector-based approach. Spatial smoothing with the Getis-Ord G(i) index produced heat risk hotspots from local to regional spatial scales. The approach is a framework for reducing spatial scale issues in future heat risk mapping, and for identifying heat risk hotspots at spatial scales ranging from the block level to the municipality level.
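A hedged sketch of the two ingredients named above: a raster-based weighted overlay of co-registered exposure and vulnerability layers, followed by a moving-window Getis-Ord Gi*-style hotspot statistic on the resulting risk surface. The weights, window size, binary neighbourhood and edge handling (scipy's default reflection) are assumptions, not the authors' parameter choices.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def risk_surface(exposure, vulnerability, w_exp=0.5, w_vul=0.5):
    """Weighted linear combination of co-registered, normalized rasters."""
    return w_exp * exposure + w_vul * vulnerability

def getis_ord_gi_star(x, size=5):
    """Gi* z-scores with binary weights over a size x size moving window."""
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    w = size * size                                   # neighbours incl. centre cell
    local_sum = uniform_filter(x, size=size) * w      # sum of x within the window
    num = local_sum - xbar * w
    den = s * np.sqrt((n * w - w ** 2) / (n - 1))
    return num / den
```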
The development and exploratory analysis of the Back Pain Attitudes Questionnaire (Back-PAQ)
Darlow, Ben; Perry, Meredith; Mathieson, Fiona; Stanley, James; Melloh, Markus; Marsh, Reginald; Baxter, G David; Dowell, Anthony
2014-01-01
Objectives To develop an instrument to assess attitudes and underlying beliefs about back pain, and subsequently investigate its internal consistency and underlying structures. Design The instrument was developed by a multidisciplinary team of clinicians and researchers based on analysis of qualitative interviews with people experiencing acute and chronic back pain. Exploratory analysis was conducted using data from a population-based cross-sectional survey. Setting Qualitative interviews with community-based participants and subsequent postal survey. Participants Instrument development informed by interviews with 12 participants with acute back pain and 11 participants with chronic back pain. Data for exploratory analysis collected from New Zealand residents and citizens aged 18 years and above. 1000 participants were randomly selected from the New Zealand Electoral Roll. 602 valid responses were received. Measures The 34-item Back Pain Attitudes Questionnaire (Back-PAQ) was developed. Internal consistency was evaluated by the Cronbach α coefficient. Exploratory analysis investigated the structure of the data using Principal Component Analysis. Results The 34-item long form of the scale had acceptable internal consistency (α=0.70; 95% CI 0.66 to 0.73). Exploratory analysis identified five two-item principal components which accounted for 74% of the variance in the reduced data set: ‘vulnerability of the back’; ‘relationship between back pain and injury’; ‘activity participation while experiencing back pain’; ‘prognosis of back pain’ and ‘psychological influences on recovery’. Internal consistency was acceptable for the reduced 10-item scale (α=0.61; 95% CI 0.56 to 0.66) and the identified components (α between 0.50 and 0.78). Conclusions The 34-item long form of the scale may be appropriate for use in future cross-sectional studies. The 10-item short form may be appropriate for use as a screening tool, or an outcome assessment instrument. Further testing of the 10-item Back-PAQ's construct validity, reliability, responsiveness to change and predictive ability needs to be conducted. PMID:24860003
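As a small illustration of the internal-consistency statistic reported above, the sketch below computes Cronbach's alpha from a respondents-by-items matrix of Back-PAQ answers; the matrix layout is an assumption.

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```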
NASA Astrophysics Data System (ADS)
Hale, V. Cody; McDonnell, Jeffrey J.
2016-02-01
The effect of bedrock permeability and underlying catchment boundaries on stream base flow mean transit time (MTT) and MTT scaling relationships in headwater catchments is poorly understood. Here we examine the effect of bedrock permeability on MTT and MTT scaling relations by comparing 15 nested research catchments in western Oregon; half within the HJ Andrews Experimental Forest and half at the site of the Alsea Watershed Study. The two sites share remarkably similar vegetation, topography, and climate and differ only in bedrock permeability (one poorly permeable volcanic rock and the other more permeable sandstone). We found longer MTTs in the catchments with more permeable fractured and weathered sandstone bedrock than in the catchments with tight, volcanic bedrock (on average, 6.2 versus 1.8 years, respectively). At the permeable bedrock site, 67% of the variance in MTT across catchments scales was explained by drainage area, with no significant correlation to topographic characteristics. The poorly permeable site had opposite scaling relations, where MTT showed no correlation to drainage area but the ratio of median flow path length to median flow path gradient explained 91% of the variance in MTT across seven catchment scales. Despite these differences, hydrometric analyses, including flow duration and recession analysis, and storm response analysis, show that the two sites share relatively indistinguishable hydrodynamic behavior. These results show that similar catchment forms and hydrologic regimes hide different subsurface routing, storage, and scaling behavior—a major issue if only hydrometric data are used to define hydrological similarity for assessing land use or climate change response.
NASA Astrophysics Data System (ADS)
Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.
2017-11-01
The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
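A simplified sketch of the order-of-comparison idea described above: nearest-neighbour distances from sampled BIM-object surface points to the point cloud (highlighting occluded zones) and from the cloud to the model (highlighting non-modelled objects), with a maximum-deviation threshold. The surface sampling, threshold value and array names are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def deviation_analysis(model_points, cloud_points, max_dev=0.05):
    """model_points, cloud_points: (n, 3) arrays in the same coordinate frame."""
    d_model_to_cloud, _ = cKDTree(cloud_points).query(model_points)
    d_cloud_to_model, _ = cKDTree(model_points).query(cloud_points)
    return {
        "model_to_cloud": d_model_to_cloud,                    # occlusion indicator
        "cloud_to_model": d_cloud_to_model,                    # non-modelled objects
        "occluded_fraction": np.mean(d_model_to_cloud > max_dev),
    }
```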
Axisymmetric computational fluid dynamics analysis of Saturn V/S1-C/F1 nozzle and plume
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.
1993-01-01
An axisymmetric single-engine Computational Fluid Dynamics calculation of the Saturn V/S1-C vehicle base region and F1 engine plume is described. This work had two objectives. The first was to calculate an axisymmetric approximation of the nozzle, plume and base region flow fields of the S1-C/F1, relate/scale this to flight data, and apply this scaling factor to NLS/STME axisymmetric calculations from a parallel effort. The second was to assess the differences in F1 and STME plume shear layer development and concentration of combustible gases. This second piece of information was to serve as input and supporting data for assumptions made in the NLS2 base temperature scaling methodology from which the vehicle base thermal environments were being generated. The F1 calculations started at the main combustion chamber faceplate and incorporated the turbine exhaust dump/nozzle film coolant. The plume and base region calculations were made for 10,000 ft and 57,000 ft altitude at vehicle flight velocity and in a stagnant freestream. FDNS was implemented with a 14-species, 28-reaction finite-rate chemistry model plus a soot burning model for the RP-1/LOX chemistry. Nozzle and plume flow fields are shown, and the plume shear layer constituents are compared to an STME plume. Conclusions are made about the validity and status of the analysis and the NLS2 vehicle base thermal environment definition methodology.
Medical image classification based on multi-scale non-negative sparse coding.
Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar
2017-11-01
With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from the different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain a discriminative sparse representation of the medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to perform medical image classification. The experimental results demonstrate that the proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance. Copyright © 2017 Elsevier B.V. All rights reserved.
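A rough sketch of the general idea, not the authors' algorithm: decompose an image into scale layers, learn a non-negative sparse code per layer (scikit-learn's DictionaryLearning with positivity constraints stands in for the paper's model, which additionally incorporates Fisher discriminative analysis), max-pool the codes per layer and concatenate them into the multi-scale feature. The patch size, number of atoms and scales are assumptions.

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.decomposition import DictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

def multiscale_nnsc_feature(image, scales=(1.0, 0.5, 0.25), n_atoms=64):
    """image: 2-D grayscale array; returns the concatenated per-scale feature."""
    feature = []
    for s in scales:
        layer = zoom(image, s)                                   # one scale layer
        patches = extract_patches_2d(layer, (8, 8), max_patches=500)
        X = patches.reshape(len(patches), -1)
        coder = DictionaryLearning(n_components=n_atoms,
                                   positive_code=True, positive_dict=True,
                                   transform_algorithm="lasso_lars", max_iter=20)
        codes = coder.fit_transform(X)                           # non-negative codes
        feature.append(codes.max(axis=0))                        # max-pool over patches
    return np.concatenate(feature)
```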
Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang
2008-01-01
Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large-scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large-scale GWAS and achieved excellent scalability for large-scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small-scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large-scale GWAS, and the EPISNP serial computing programs are convenient tools for epistasis analysis in small-scale GWAS using commonly available computer hardware. PMID:18644146
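An illustrative sketch, not the EPISNP implementation or the extended Kempthorne model: a single two-locus interaction test of the kind run for every SNP pair, here an F-test of the genotype-by-genotype interaction term for a quantitative trait using statsmodels. Column names are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

def two_locus_interaction_test(df):
    """df columns: 'y' (quantitative trait), 'snp1', 'snp2' (genotypes coded 0/1/2).
    Returns the ANOVA table comparing models with and without the interaction."""
    full = ols("y ~ C(snp1) * C(snp2)", data=df).fit()
    reduced = ols("y ~ C(snp1) + C(snp2)", data=df).fit()
    return sm.stats.anova_lm(reduced, full)   # F-test of the interaction term
```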
Discriminant validity of the Wender Utah rating scale in Iranian adults.
Farokhzadi, Farideh; Mohammadi, Mohammad Reza; Salmanian, Maryam
2014-01-01
The aim of this study was the normalization of the Wender Utah rating scale, which is used to detect adults with Attention-Deficit/Hyperactivity Disorder (ADHD). Convenience (availability) sampling was used to choose 400 parents of children (200 parents of children with ADHD compared with 200 parents of normal children). The Wender Utah rating scale, which has been designed to diagnose ADHD in adults, was filled out by each of the parents to diagnose ADHD in the parents as accurately as possible. The Wender Utah rating scale was divided into six subscales (dysthymia, oppositional defiant disorder, school work problems, conduct disorder, anxiety, and ADHD), which were analyzed with exploratory factor analysis. The Kaiser-Meyer-Olkin (KMO) value was 86.5% for dysthymia, 86.9% for oppositional defiant disorder, 77.5% for school related problems, 90.9% for conduct disorder, 79.6% for anxiety and 93.5% for attention deficit/hyperactivity disorder; the chi-square values based on Bartlett's test were 2242.947 for dysthymia, 2239.112 for oppositional defiant disorder, 1221.917 for school work problems, 5031.511 for conduct disorder, 1421.1 for anxiety, and 7644.122 for ADHD. Since these values were larger than the chi-square critical values (P<0.05), the correlation matrix was found to be appropriate for factor analysis. Based on the findings, we can conclude that the Wender Utah rating scale can be appropriately used for predicting dysthymia, oppositional defiant disorder, school work problems, conduct disorder, and anxiety in adults with ADHD.
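A hedged sketch of the sampling-adequacy checks reported above (KMO and Bartlett's test of sphericity), using the third-party factor_analyzer package on a respondents-by-items DataFrame for one subscale; the data layout is an assumption and the package is named only as one possible implementation.

```python
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

def sampling_adequacy(items: pd.DataFrame):
    """items: respondents x items DataFrame for one subscale."""
    chi_square, p_value = calculate_bartlett_sphericity(items)
    kmo_per_item, kmo_total = calculate_kmo(items)
    return {"bartlett_chi2": chi_square, "bartlett_p": p_value, "kmo": kmo_total}
```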
NASA Astrophysics Data System (ADS)
Kiliyanpilakkil, Velayudhan Praju
Atmospheric motions take place on spatial scales from sub-millimeters to a few thousand kilometers, with temporal changes in atmospheric variables occurring from fractions of a second to several years. Consequently, the variations in atmospheric kinetic energy associated with these motions span a broad spectrum in space and time. The mesoscale region acts as an energy-transferring regime between the energy-generating synoptic scale and the energy-dissipating microscale. Therefore, scaling characterizations of mesoscale wind fields are significant for the accurate estimation of the atmospheric energy budget. Moreover, precise knowledge of the scaling characteristics of atmospheric mesoscale wind fields is important for the validation of numerical models that focus on wind forecasting, dispersion, diffusion, horizontal transport, and optical turbulence. For these reasons, extensive studies have been conducted in the past to characterize mesoscale wind fields. Nevertheless, the majority of these studies focused on near-surface and upper-atmosphere mesoscale regimes. The present study attempts to identify the existence of, and to quantify, the scaling of mesoscale wind fields in the lower atmospheric boundary layer (ABL; in the wind turbine layer) using wind observations from various research-grade instruments (e.g., sodars, anemometers). The scaling characteristics of mesoscale wind speeds over diverse homogeneous flat terrains, studied using structure-function-based analysis, revealed an altitudinal dependence of the scaling exponents. This altitudinal dependence of the wind speed scaling may be attributed to buoyancy forcing. Subsequently, we use the framework of extended self-similarity (ESS) to characterize the observed scaling behavior. In the ESS framework, the relative scaling exponents of the mesoscale atmospheric boundary layer wind speed exhibit quasi-universal behavior, even far beyond the inertial range of turbulence (Delta t within the 10 minutes to 6 hours range). The ESS-based study is extended further to examine its validity over complex terrain. This study, based on multiyear wind observations, demonstrates that the ESS holds for the lower-ABL wind speed over complex terrain as well. Another important inference from this study is that the ESS relative scaling exponents corresponding to the mesoscale wind speed closely match the scaling characteristics of inertial-range turbulence, albeit not exactly identically. The current study proposes a benchmark, based on the ESS quasi-universal wind speed scaling characteristics in the ABL, for the mesoscale modeling community. Using a state-of-the-art atmospheric mesoscale model in conjunction with different planetary boundary layer (PBL) parameterization schemes, multiple wind speed simulations have been conducted. This study reveals that the ESS scaling characteristics of the model-simulated wind speed time series in the lower ABL differ significantly from their observational counterparts. The study demonstrates that the model-simulated wind speed time series for time intervals Delta t < 2 hours do not capture the ESS-based scaling characteristics. The detailed analysis of model simulations using different PBL schemes leads to the conclusion that significant improvements are needed in the turbulence closure parameterizations adopted in new-generation atmospheric models.
This study is unique as the ESS framework has never been reported or examined for the validation of PBL parameterizations.
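A hedged sketch of a structure-function / extended self-similarity calculation of the kind described above: q-th order structure functions of a wind-speed series, and the ESS relative scaling exponent obtained by regressing log S_q against log S_3 rather than against log Delta t. The sampling interval, lag range and reference order are assumptions.

```python
import numpy as np

def structure_function(u, lags, q):
    """S_q(dt) = <|u(t + dt) - u(t)|^q> for each lag (in samples)."""
    return np.array([np.mean(np.abs(u[lag:] - u[:-lag]) ** q) for lag in lags])

def ess_relative_exponent(u, lags, q, ref_order=3):
    """Slope of log S_q versus log S_ref (extended self-similarity)."""
    sq = structure_function(u, lags, q)
    sref = structure_function(u, lags, ref_order)
    slope, _ = np.polyfit(np.log(sref), np.log(sq), 1)
    return slope
```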
Multi-Scale Analysis of Trends in Northeastern Temperate Forest Springtime Phenology
NASA Astrophysics Data System (ADS)
Moon, M.; Melaas, E. K.; Sulla-menashe, D. J.; Friedl, M. A.
2017-12-01
The timing of spring leaf emergence is highly variable in many ecosystems, exerts first-order control on growing season length, and significantly modulates seasonally-integrated photosynthesis. Numerous studies have reported trends toward earlier spring phenology in temperate forests, with some papers indicating that this trend is also leading to increased carbon uptake. At broad spatial scales, however, most of these studies have used data from coarse spatial resolution instruments such as MODIS, which does not resolve ecologically important landscape-scale patterns in phenology. In this work, we examine how long-term trends in spring phenology differ across three data sources acquired at different scales of measurement at the Harvard Forest in central Massachusetts. Specifically, we compared trends in the timing of phenology based on long-term in-situ measurements of phenology, estimates based on eddy-covariance measurements of net carbon uptake transition dates, and from two sources of satellite-based remote sensing (MODIS and Landsat) land surface phenology (LSP) data. Our analysis focused on the flux footprint surrounding the Harvard Forest Environmental Measurements (EMS) tower. Our results reveal clearly defined trends toward earlier springtime phenology in Landsat LSP and in the timing of tower-based net carbon uptake. However, we find no statistically significant trend in springtime phenology measured from MODIS LSP data products, possibly because the time series of MODIS observations is relatively short (13 years). The trend in tower-based transition data exhibited a larger negative value than the trend derived from Landsat LSP data (-0.42 and -0.28 days per year for 21 and 28 years, respectively). More importantly, these results have two key implications regarding how changes in spring phenology are impacting carbon uptake at the landscape scale. First, long-term trends in spring phenology can be quite different, depending on which data source is used to estimate the trend; and second, the response of carbon uptake to climate change may be more sensitive than the response of land surface phenology itself.
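As a small illustration of the trend estimates quoted above (e.g. -0.42 and -0.28 days per year), the sketch below fits a least-squares linear trend to a series of annual spring transition dates; the array names are assumptions.

```python
import numpy as np

def phenology_trend(years, transition_doy):
    """Return the linear trend in days per year (negative = earlier spring)."""
    slope, _ = np.polyfit(years, transition_doy, 1)
    return slope
```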
Diagnosis of sustainable collaboration in health promotion – a case study
Leurs, Mariken TW; Mur-Veeman, Ingrid M; van der Sar, Rosalie; Schaalma, Herman P; de Vries, Nanne K
2008-01-01
Background Collaborations are important to health promotion in addressing multi-party problems. Interest in collaborative processes in health promotion is rising, but monitoring instruments are still lacking. The authors developed the DIagnosis of Sustainable Collaboration (DISC) model to enable comprehensive monitoring of public health collaboratives. The model focuses on opportunities and impediments for collaborative change, based on evidence from interorganizational collaboration, organizational behavior and planned organizational change. To illustrate and assess the DISC model, the 2003/2004 application of the model to the Dutch whole-school health promotion collaboration is described. Methods The study combined quantitative research, using a cross-sectional survey, with qualitative research using the personal interview methodology and document analysis. A DISC-based survey was sent to 55 stakeholders in whole-school health promotion in one Dutch region. The survey consisted of 22 scales with 3 to 8 items each. Only scales with a reliability score of 0.60 were accepted. The analysis provided for comparisons between stakeholders from education, public service and public health. After the survey, 14 stakeholders were approached for a semi-structured DISC-based interview. Because the interviews took place after the survey, they were also used to clarify unexpected and unclear survey outcomes. Additionally, a DISC-based document analysis was conducted, including minutes of meetings, project descriptions and correspondence with schools and municipalities. Results The response rate was 77% for the survey and 86% for the interviews. Significant differences between respondents from different domains were found for the following scales: organizational characteristics, change strategies, network development, project management, willingness to commit, and innovative actions and adaptations. The interviews provided a more specific picture of the state of the art of the studied collaboration regarding the DISC constructs. Conclusion The DISC model is more than just the sum of the different parameters provided in the literature on interorganizational collaboration, organizational change, networking and setting approaches. Monitoring a collaboration based on the DISC model yields insight into windows of opportunity and current impediments for collaborative change. DISC-based monitoring is a promising strategy enabling project managers and social entrepreneurs to plan change management strategies systematically. PMID:18992132