Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo
2012-01-01
In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume stationarity of base composition across a tree, are widely used, even though individual sequences may bear distinctive base frequencies. In the worst case, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that evolved similar base frequencies in parallel. This potential difficulty can be countered by two approaches: 'RY-coding' and 'non-homogeneous' models. The former converts the four bases into purine (R) and pyrimidine (Y) states to normalize base frequencies across a tree, while the latter explicitly incorporates the heterogeneity in base frequency. Both approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined in simulation studies. Here, we assessed the performance of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on data simulated with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Notably, the performance of the RY-coding analysis, unlike that of the non-homogeneous analysis, was significantly affected by the substitution-process settings used for sequence simulation. The performance of the non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
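RY-coding as described above simply collapses the four bases into two state classes. A minimal sketch (illustrative only, not the authors' implementation):

```python
def ry_code(seq):
    """Recode a nucleotide sequence into purines (R) and pyrimidines (Y).

    A and G (purines) -> 'R'; C and T/U (pyrimidines) -> 'Y'.
    Other symbols (gaps, ambiguity codes) are left unchanged.
    """
    table = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}
    return "".join(table.get(base, base) for base in seq.upper())

# Two sequences with very different base frequencies become far more
# similar in composition after recoding:
print(ry_code("AATTATGCAT"))  # RRYYRYRYRY  (AT-rich)
print(ry_code("GGCCGCGAGC"))  # RRYYRYRRRY  (GC-rich)
```

After recoding, both toy sequences have nearly balanced R/Y frequencies, which is why the transformation reduces compositional attraction between unrelated lineages.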
Analysis of swimming performance: perceptions and practices of US-based swimming coaches.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid
2016-01-01
In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using them at least monthly; the analyses are mainly qualitative rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maslenikov, O.R.; Mraz, M.J.; Johnson, J.J.
1986-03-01
This report documents the seismic analyses performed by SMA for the MFTF-B Axicell vacuum vessel. In the course of this study we performed response spectrum analyses, CLASSI fixed-base analyses, and SSI analyses that included interaction effects between the vessel and vault. The response spectrum analysis served to benchmark certain modeling differences between the LLNL and SMA versions of the vessel model. The fixed-base analysis benchmarked the differences between analysis techniques. The SSI analyses provided our best estimate of vessel response to the postulated seismic excitation for the MFTF-B facility, and included consideration of uncertainties in soil properties by calculating response for a range of soil shear moduli. Our results are presented in this report as tables of comparisons of specific member forces from our analyses and the analyses performed by LLNL. Also presented are tables of maximum accelerations and relative displacements and plots of response spectra at various selected locations.
SVDS plume impingement modeling development. Sensitivity analysis supporting level B requirements
NASA Technical Reports Server (NTRS)
Chiu, P. B.; Pearson, D. J.; Muhm, P. M.; Schoonmaker, P. B.; Radar, R. J.
1977-01-01
A series of sensitivity analyses (trade studies) performed to select features and capabilities to be implemented in the plume impingement model is described. Sensitivity analyses were performed in study areas pertaining to geometry, flowfield, impingement, and dynamical effects. Recommendations based on these analyses are summarized.
Skylab program earth resources experiment package sensor performance evaluation, volume 1, (S190A)
NASA Technical Reports Server (NTRS)
Kenney, G. P.
1975-01-01
The results of S190A sensor performance evaluation are summarized based on data presented by all contributors to the sensor performance evaluation interim reports. Techniques used in sensor performance evaluation are discussed. Topics discussed include: performance degradation identified during the Skylab missions, S190A and EREP system anomalies that affected S190A performance, and the performance achieved, in terms of pertinent S190A parameters. Additional analyses include final performance analyses completed after submittal of the SL4 interim sensor performance evaluation reports, including completion of detailed analyses of basic performance parameters initiated during the interim report periods and consolidation analyses to reduce independent mission data (SL2, SL3, and SL4) to determine overall performance realized during all three Skylab missions.
DOT National Transportation Integrated Search
2010-09-01
This report presents data and technical analyses for Texas Department of Transportation Project 0-5235. This project focused on the evaluation of traffic sign sheeting performance in terms of meeting nighttime driver needs. The goal was to de...
ERIC Educational Resources Information Center
Jackson, Dan; Bowden, Jack; Baker, Rose
2015-01-01
Moment-based estimators of the between-study variance are very popular when performing random effects meta-analyses. This type of estimation has many advantages including computational and conceptual simplicity. Furthermore, by using these estimators in large samples, valid meta-analyses can be performed without the assumption that the treatment…
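The best-known moment-based estimator of the between-study variance is the DerSimonian-Laird estimator, which illustrates the computational simplicity the abstract refers to. A minimal sketch (assuming the standard DL formula in general; not tied to this paper's specific estimators):

```python
def dersimonian_laird_tau2(effects, variances):
    """Moment-based (DerSimonian-Laird) estimate of the between-study
    variance tau^2 in a random-effects meta-analysis."""
    k = len(effects)
    w = [1.0 / v for v in variances]          # fixed-effect (inverse-variance) weights
    sw = sum(w)
    mu_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q statistic for heterogeneity:
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    return max(0.0, (q - (k - 1)) / c)        # truncated at zero

# Three hypothetical study effect sizes and within-study variances:
tau2 = dersimonian_laird_tau2([0.2, 0.5, 0.8], [0.04, 0.05, 0.04])
print(round(tau2, 4))  # 0.0538
```

No iterative optimisation is needed, which is the computational advantage over likelihood-based estimators.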
Continued Evaluation of Gear Condition Indicator Performance on Rotorcraft Fleet
NASA Technical Reports Server (NTRS)
Delgado, Irebert R.; Dempsey, Paula J.; Antolick, Lance J.; Wade, Daniel R.
2013-01-01
This paper details analyses of condition indicator performance for the helicopter nose gearbox within the U.S. Army's Condition-Based Maintenance Program. Ten nose gearbox data sets underwent two specific analyses. A mean condition indicator level analysis was performed, in which condition indicator performance was based on a 'batting average' measured before and after part replacement. Two specific condition indicators, Diagnostic Algorithm 1 and Sideband Index, were found to perform well for the data sets studied. A condition indicator versus gear wear analysis was also performed, in which gear wear photographs and descriptions from Army tear-down analyses were categorized based on ANSI/AGMA 1010-E95 standards. Seven nose gearbox data sets were analyzed and correlated with the condition indicators Diagnostic Algorithm 1 and Sideband Index. Both were found to be most responsive to gear wear cases of micropitting and spalling. Input pinion nose gearbox condition indicators were found to be more responsive to part replacement during overhaul than the corresponding output gear nose gearbox condition indicators.
Development of the performance confirmation program at Yucca Mountain, Nevada
LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.
2006-01-01
The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of the assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio of 20 activities, which include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization; others will begin during construction or post-emplacement and continue until repository closure.
Thermal finite-element analysis of space shuttle main engine turbine blade
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Tong, Michael T.; Kaufman, Albert
1987-01-01
Finite-element, transient heat transfer analyses were performed for the first-stage blades of the space shuttle main engine (SSME) high-pressure fuel turbopump. The analyses were based on test engine data provided by Rocketdyne. Heat transfer coefficients were predicted by performing a boundary-layer analysis at steady-state conditions with the STAN5 boundary-layer code. Two different peak-temperature overshoots were evaluated for the startup transient. Cutoff transient conditions were also analyzed. A reduced gas temperature profile based on actual thermocouple data was also considered. Transient heat transfer analyses were conducted with the MARC finite-element computer code.
Nair, Pradeep S; John, Eugene B
2007-01-01
Aligning specific sequences against a very large number of other sequences is a central task in bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. The results suggest that performance could be improved with a larger L1 cache.
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction: Home-based telebehavioural health care improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared with in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members.
Methods: We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis in which all patients possessed video-conferencing technology approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions.
Results: In the base case analysis, the total direct cost of home-based telebehavioural health care was higher than that of in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly than in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results.
Discussion: Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on the implementation of home-based telebehavioural health care should ensure that these technologies can be successfully deployed on patients' existing technology.
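The incremental cost-effectiveness ratio (ICER) at the centre of a cost-utility analysis divides incremental cost by incremental effect. A minimal sketch, reusing the base-case costs reported above but with purely hypothetical QALY values for illustration:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (e.g. per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Base-case total direct costs from the study above; the QALY values
# are hypothetical placeholders, not trial results.
cost_tele, cost_inperson = 71974.0, 20322.0
qaly_tele, qaly_inperson = 0.21, 0.20
print(icer(cost_tele, cost_inperson, qaly_tele, qaly_inperson))
```

When one option is both cheaper and at least as effective, the ICER is not reported: the option is said to "dominate", as in the study's video-conferencing scenario.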
Meaning profiles of dwellings, pathways, and metaphors in design: implications for education
NASA Astrophysics Data System (ADS)
Casakin, Hernan; Kreitler, Shulamith
2017-11-01
The study deals with the roles and interrelations of the meaning-based assessments of dwellings, pathways and metaphors in design performance. It is grounded in the Meaning Theory [Kreitler, S., and H. Kreitler. 1990. The Cognitive Foundations of Personality Traits. New York: Plenum], which enables identifying the cognitive contents and processes underlying cognitive performance in different domains, thus rendering them more accessible to educational training. The objectives were to identify the components of the meaning profiles of dwellings, pathways, and metaphors as perceived by design students; to analyse their interrelations; and to examine which of the identified components of these constructs serve as best predictors of design performance aided by the use of metaphors. Participants were administered a design task and questionnaires about the Dimensional Profiles of Dwellings, Pathways, and Metaphors, based on the meaning system. Factors based on the factor analyses of the responses to the three questionnaires were used in regression analyses as predictors of the performance score in a design task. The following three factors of the dimensional meaning profiles of metaphors were significant predictors of design performance: sensory, functional, and structural evaluations. Implications for design education are discussed, primarily concerning the important role of metaphor in design problem-solving.
Gender differences in the content of cognitive distraction during sex.
Meana, Marta; Nunnink, Sarah E
2006-02-01
This study compared 220 college men and 237 college women on two types of self-reported cognitive distraction during sex, performance- and appearance-based. Affect, psychological distress, sexual knowledge, attitudes, fantasies, experiences, body image, satisfaction, and sexual function were assessed with the Derogatis Sexual Functioning Inventory and the Sexual History Form to determine associations with distraction. Between-gender analyses revealed that women reported higher levels of overall and appearance-based distraction than did men, but similar levels of performance-based distraction. Within-gender analyses revealed that women reported as much of one type of distraction as the other, while men reported more performance- than appearance-based distraction. In women, appearance-based distraction was predicted by negative body image, psychological distress, and not being in a relationship, while performance-based distraction was predicted by negative body image, psychological distress, and sexual dissatisfaction. In men, appearance-based distraction was predicted by negative body image, sexual dissatisfaction and not being in a relationship, while performance-based distraction was predicted by negative body image and sexual dissatisfaction. Investigating the content of cognitive distraction may be useful in understanding gender differences in sexual experience and in refining cognitive components of sex therapy.
Crash Outcome Data Evaluation System (CODES) Project Safety Belt and Helmet Analyses
DOT National Transportation Integrated Search
1996-02-15
Analyses of the benefits of safety belt and helmet use were undertaken for a report that was sent to Congress in February 1996. NHTSA awarded grants to link crash and injury state data and perform the analyses upon which the report was based. The...
Rudnick, Paul A.; Clauser, Karl R.; Kilpatrick, Lisa E.; Tchekhovskoi, Dmitrii V.; Neta, Pedatsur; Blonder, Nikša; Billheimer, Dean D.; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Ham, Amy-Joan L.; Jaffe, Jacob D.; Kinsinger, Christopher R.; Mesri, Mehdi; Neubert, Thomas A.; Schilling, Birgit; Tabb, David L.; Tegeler, Tony J.; Vega-Montoto, Lorenzo; Variyath, Asokan Mulayath; Wang, Mu; Wang, Pei; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Paulovich, Amanda G.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Tempst, Paul; Liebler, Daniel C.; Stein, Stephen E.
2010-01-01
A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 system performance metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically displayed variations less than 10% and thus can reveal even subtle differences in performance of system components. Analyses of data from interlaboratory studies conducted under a common standard operating procedure identified outlier data and provided clues to specific causes. Moreover, interlaboratory variation reflected by the metrics indicates which system components vary the most between laboratories. Application of these metrics enables rational, quantitative quality assessment for proteomics and other LC-MS/MS analytical applications. PMID:19837981
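The variability figures quoted above are coefficient-of-variation-style summaries across replicate runs. A minimal sketch of computing percent CV for a single performance metric (the metric and values are hypothetical):

```python
from statistics import mean, stdev

def percent_cv(values):
    """Coefficient of variation (%) of a performance metric across
    replicate runs: sample standard deviation over the mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical chromatographic metric (e.g. median peak width, in s)
# from four replicate LC-MS/MS runs:
print(round(percent_cv([14.2, 13.8, 14.5, 14.0]), 2))  # 2.11
```

A CV well under 10% across replicates, as reported above, indicates the metric is sensitive enough to flag even subtle drifts in system components.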
Integrated vehicle-based safety systems field operational test final program report.
DOT National Transportation Integrated Search
2011-06-01
This document presents results from the light-vehicle and heavy-truck field operational tests performed as part of the Integrated Vehicle-Based Safety Systems (IVBSS) program. The findings are the result of analyses performed by the University of Mi...
Rzucidlo, Justyna K; Roseman, Paige L; Laurienti, Paul J; Dagenbach, Dale
2013-01-01
Graph-theory based analyses of resting state functional Magnetic Resonance Imaging (fMRI) data have been used to map the network organization of the brain. While numerous analyses of resting state brain organization exist, many questions remain unexplored. The present study examines the stability of findings based on this approach over repeated resting state and working memory state sessions within the same individuals. This allows assessment of stability of network topology within the same state for both rest and working memory, and between rest and working memory as well. fMRI scans were performed on five participants while at rest and while performing the 2-back working memory task five times each, with task state alternating while they were in the scanner. Voxel-based whole brain network analyses were performed on the resulting data along with analyses of functional connectivity in regions associated with resting state and working memory. Network topology was fairly stable across repeated sessions of the same task, but varied significantly between rest and working memory. In the whole brain analysis, local efficiency, Eloc, differed significantly between rest and working memory. Analyses of network statistics for the precuneus and dorsolateral prefrontal cortex revealed significant differences in degree as a function of task state for both regions and in local efficiency for the precuneus. Conversely, no significant differences were observed across repeated sessions of the same state. These findings suggest that network topology is fairly stable within individuals across time for the same state, but also fluid between states. Whole brain voxel-based network analyses may prove to be a valuable tool for exploring how functional connectivity changes in response to task demands.
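Local efficiency, the whole-brain statistic reported above, averages the efficiency of each node's neighbourhood subgraph. A small pure-Python sketch on a toy unweighted graph (real analyses operate on thresholded voxel-wise correlation networks, not on a hand-built adjacency dict):

```python
from itertools import combinations

def shortest_path_len(adj, src, dst):
    """BFS shortest-path length in an unweighted graph; None if unreachable."""
    frontier, seen, dist = {src}, {src}, 0
    while frontier:
        if dst in frontier:
            return dist
        frontier = {n for v in frontier for n in adj[v]} - seen
        seen |= frontier
        dist += 1
    return None

def global_efficiency(adj):
    """Mean of 1/d(i, j) over all node pairs (0 for unreachable pairs)."""
    nodes = list(adj)
    if len(nodes) < 2:
        return 0.0
    total = 0.0
    for i, j in combinations(nodes, 2):
        d = shortest_path_len(adj, i, j)
        if d:
            total += 1.0 / d
    return 2.0 * total / (len(nodes) * (len(nodes) - 1))

def local_efficiency(adj):
    """Average, over nodes, of the efficiency of the subgraph induced
    by each node's neighbours (node itself excluded)."""
    total = 0.0
    for v in adj:
        nbrs = adj[v]
        sub = {n: adj[n] & nbrs for n in nbrs}
        total += global_efficiency(sub)
    return total / len(adj)

# Toy 4-node network: a triangle (0, 1, 2) plus a pendant node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(round(local_efficiency(adj), 3))  # 0.583
```

Comparing this statistic between rest and task states, as the study does, asks whether a node's neighbours remain well interconnected when task demands reorganise the network.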
SSME Post Test Diagnostic System: Systems Section
NASA Technical Reports Server (NTRS)
Bickmore, Timothy
1995-01-01
An assessment of engine and component health is routinely made after each test firing or flight firing of a Space Shuttle Main Engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project - the SSME Post Test Diagnostic System (PTDS) - is to develop a computer program which automates the analysis of test data from the SSME in order to detect and diagnose anomalies. This report primarily covers work on the Systems Section of the PTDS, which automates the analyses performed by the systems/performance group at the Propulsion Branch of NASA Marshall Space Flight Center (MSFC). This group is responsible for assessing the overall health and performance of the engine, and detecting and diagnosing anomalies which involve multiple components (other groups are responsible for analyzing the behavior of specific components). The PTDS utilizes several advanced software technologies to perform its analyses. Raw test data is analyzed using signal processing routines which detect features in the data, such as spikes, shifts, peaks, and drifts. Component analyses are performed by expert systems, which use 'rules-of-thumb' obtained from interviews with the MSFC data analysts to detect and diagnose anomalies. The systems analysis is performed using case-based reasoning. Results of all analyses are stored in a relational database and displayed via an X-window-based graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
Investigating gender violence in Jamaica.
Spiring, Fred
2014-01-01
As Jamaica moves through implementation of their National Policy on Gender Equality and develops harassment legislation, this article attempts to investigate current levels and trends of gender-based violence in Jamaica. All analyses make use of existing data and data formats in developing performance indicators that illustrate the current state of gender violence in Jamaica. The analyses provide a baseline for the future assessment and comparison with internationally accepted gender-based violence indicators. All source data has been included to facilitate comparisons and discussions regarding related levels and trends of violence as well as addressing performance indicator effectiveness.
Turbine blade forced response prediction using FREPS
NASA Technical Reports Server (NTRS)
Murthy, Durbha V.; Morel, Michael R.
1993-01-01
This paper describes a software system called FREPS (Forced REsponse Prediction System) that integrates structural dynamic, steady aerodynamic, and unsteady aerodynamic analyses to efficiently predict the forced-response dynamic stresses in axial-flow turbomachinery blades due to aerodynamic and mechanical excitations. A flutter analysis capability is also incorporated into the system. The FREPS system performs aeroelastic analysis by modeling the motion of the blade in terms of its normal modes. The structural dynamic analysis is performed by a finite element code such as MSC/NASTRAN. The steady aerodynamic analysis is based on nonlinear potential theory, and the unsteady aerodynamic analysis is based on a linearization about the nonuniform potential mean flow. The program description and capabilities are reported herein. The effectiveness of the FREPS package is demonstrated on the High Pressure Oxygen Turbopump turbine of the Space Shuttle Main Engine. Both flutter and forced response analyses are performed and typical results are illustrated.
Solar power satellite system definition study. Volume 3: Operations and systems synthesis, phase 2
NASA Technical Reports Server (NTRS)
1979-01-01
The results of the operations analyses are reported. Some of these analyses examined operations aspects of space vehicle in-space maintenance. Many of the analyses explored in depth the LEO Base cargo handling operations. Personnel transportation operations and cargo packaging were also analyzed. These operations analyses were performed to define the operational requirements for all of the SPS system elements, so that equipment and facilities could be synthesized, and to estimate manpower requirements. An overall, integrated, end-to-end description of the SPS operations is presented. The detailed operations analyses upon which this integrated description was based are included.
Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations
Shek, Daniel T. L.; Ma, Cecilia M. S.
2011-01-01
Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented. PMID:21218263
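As a hedged illustration of the kind of SPSS procedure the paper describes, a linear growth model with a random intercept and slope might be specified in MIXED syntax as follows. The variable names (`y`, `time`, `id`) are hypothetical placeholders, not the Project P.A.T.H.S. variables; consult the paper for the actual setup.

```
* Linear mixed (growth) model: fixed effect of time, random intercept
  and slope per participant, unstructured covariance, REML estimation.
MIXED y WITH time
  /FIXED=time | SSTYPE(3)
  /RANDOM=INTERCEPT time | SUBJECT(id) COVTYPE(UN)
  /METHOD=REML
  /PRINT=SOLUTION TESTCOV.
```

The `SUBJECT(id)` clause is what relaxes the GLM independence assumption criticised above: repeated observations are modelled as correlated within each participant.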
NASA Technical Reports Server (NTRS)
Pipher, M. D.; Green, P. A.; Wolfgram, D. F.
1975-01-01
A catalogue is presented of space shuttle electrical equipment as used within a standardized data base for EPS consumables analyses. The general function and expected usage of each type of electrical equipment are described, and the usage of specific equipment of each type in the performance of EPS consumables analyses is defined.
Krumm, Sabine; Kivisaari, Sasa L; Monsch, Andreas U; Reinhardt, Julia; Ulmer, Stephan; Stippich, Christoph; Kressig, Reto W; Taylor, Kirsten I
2017-05-01
The parietal lobe is important for successful recognition memory, but its role is not yet fully understood. We investigated the parietal lobes' contribution to immediate paired-associate memory and delayed item-recognition memory separately for hits (targets) and correct rejections (distractors). We compared the behavioral performance of 56 patients with known parietal and medial temporal lobe dysfunction (i.e. early Alzheimer's Disease) to 56 healthy control participants in an immediate paired and delayed single item object memory task. Additionally, we performed voxel-based morphometry analyses to investigate the functional-neuroanatomic relationships between performance and voxel-based estimates of atrophy in whole-brain analyses. Behaviorally, all participants performed better identifying targets than rejecting distractors. The voxel-based morphometry analyses associated atrophy in the right ventral parietal cortex with fewer correct responses to familiar items (i.e. hits) in the immediate and delayed conditions. Additionally, medial temporal lobe integrity correlated with better performance in rejecting distractors, but not in identifying targets, in the immediate paired-associate task. Our findings suggest that the parietal lobe critically supports successful immediate and delayed target recognition memory, and that the ventral aspect of the parietal cortex and the medial temporal lobe may have complementary preferences for identifying targets and rejecting distractors, respectively, during recognition memory.
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were including primary studies consisting of randomized control trials, performing a meta-analysis, and applying a recommended guideline for establishing a systematic review protocol and/or reporting. Based on AMSTAR scores, systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine, and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review.
A Measure of Team Resilience: Developing the Resilience at Work Team Scale.
McEwen, Kathryn; Boyd, Carolyn M
2018-03-01
This study develops, and initially evaluates, a new measure of team-based resilience for use in research and practice. We conducted preliminary analyses, based on a cross-sectional sample of 344 employees nested within 31 teams. Seven dimensions were identified through exploratory and confirmatory factor analyses. The measure had high reliability and significant discrimination to indicate the presence of a unique team-based aspect of resilience that contributed to higher work engagement and higher self-rated team performance, over and above the effects of individual resilience. Multilevel analyses showed that team, but not individual, resilience predicted self-rated team performance. Practice implications include a need to focus on collective as well as individual behaviors in resilience-building. The measure provides a diagnostic instrument for teams and a scale to evaluate organizational interventions and research the relationship of resilience to other constructs.
Synthesis of major economic studies of residential photovoltaics
NASA Technical Reports Server (NTRS)
1984-01-01
Six studies that analyze the cost effectiveness of grid connected residential photovoltaic systems are reviewed. These six studies were selected based on two criteria. First, the reports share common emphases on developing photovoltaic systems with certain engineering design goals in mind, and on performing economic analyses to assess the long term economic potential of the system design. The economic analyses presented are performed from the consumer's perspective.
Integrated analysis of large space systems
NASA Technical Reports Server (NTRS)
Young, J. P.
1980-01-01
Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.
Effects of maintenance operations on track buckling potential
DOT National Transportation Integrated Search
2003-05-04
This paper presents the results of buckling analyses based on data from recent tests determining the influence of track maintenance and consolidation on track lateral resistance. The buckling analyses were performed using the USDOT/Volpe "CWR-SAFE" m...
Probabilistic Assessment of National Wind Tunnel
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M.; Chamis, C. C.
1996-01-01
A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code, demonstrating the code's capability to address reliability issues of the NWT. Uncertainties in the geometry, material properties, loads, and stiffener location on the NWT are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue, and proof load analyses are performed; these analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life prediction results show that the life of the NWT is governed by fatigue of the welds. A reliability-based proof test assessment is also performed.
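The reliability figure quoted above comes from propagating input uncertainties through structural models. A minimal sketch of the underlying idea, direct Monte Carlo estimation of P(capacity > demand), is shown below; the limit state and the normal distributions are illustrative stand-ins, not the actual NESSUS models or inputs used in the NWT assessment.

```python
import random

def monte_carlo_reliability(n_samples=100_000, seed=1):
    """Estimate reliability as P(no failure) by direct Monte Carlo sampling.

    Hypothetical limit state: failure when demand (applied stress) meets or
    exceeds capacity (allowable stress). Distributions are illustrative only.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        capacity = rng.gauss(50.0, 2.0)   # uncertain strength, e.g. ksi
        demand = rng.gauss(30.0, 3.0)     # uncertain load effect, e.g. ksi
        if demand >= capacity:            # limit state g = capacity - demand <= 0
            failures += 1
    return 1.0 - failures / n_samples

print(monte_carlo_reliability())
```

For the very small failure probabilities behind a 0.999-type assurance, production codes such as NESSUS rely on fast probability integration or importance sampling rather than brute-force sampling, which would need impractically many samples to resolve rare failures.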
Do We Really Need Performance Objectives?
ERIC Educational Resources Information Center
Duenk, Lester G.
Trade and industrial education (T&I) was using performance-based instructional systems long before competency-based education (CBE) emerged as a creative innovation in other subject areas. Allen devised the forerunner of the present CBE system in 1917 by carefully plotting competencies and delineating job analyses from these, and Fryklund's…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne
2007-01-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.
A Next Generation Digital Counting System For Low-Level Tritium Studies (Project Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, P.
2016-10-03
Since the early seventies, SRNL has pioneered low-level tritium analysis using various nuclear counting technologies and techniques. Since 1999, SRNL has successfully performed routine low-level tritium analyses with counting systems based on digital signal processor (DSP) modules developed in the late 1990s. Each of these counting systems is complex, unique to SRNL, and fully dedicated to performing routine tritium analyses of low-level environmental samples. It is time to modernize these systems due to a variety of issues, including (1) age, (2) lack of direct replacement electronics modules, and (3) advances in digital signal processing and computer technology. There has been considerable development in many areas associated with the enterprise of performing low-level tritium analyses. The objective of this LDRD project was to design, build, and demonstrate a Next Generation Tritium Counting System (NGTCS), while not disrupting the routine low-level tritium analyses underway in the facility on the legacy counting systems. The work involved (1) developing a test bed for building and testing new counting system hardware that does not interfere with routine analyses, (2) testing a new counting system based on a modern, state-of-the-art DSP module, and (3) evolving the low-level tritium counter design to reflect the state of the science.
Rule-Based Category Learning in Children: The Role of Age and Executive Functioning
Rabi, Rahel; Minda, John Paul
2014-01-01
Rule-based category learning was examined in 4–11 year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight regarding the type of strategy being used to solve the categorization task, demonstrating that the use of the task appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning. PMID:24489658
Chang, Hsien-Yen; Weiner, Jonathan P
2010-01-18
Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. 
Given the widespread availability of claims data and the superior explanatory power of claims-based risk adjustment models over demographics-only models, Taiwan's government should consider using claims-based models for policy-relevant applications. The performance of the ACG case-mix system in Taiwan was comparable to that found in other countries. This suggested that the ACG system could be applied to Taiwan's NHI even though it was originally developed in the USA. Many of the findings in this paper are likely to be relevant to other diagnosis-based risk adjustment methodologies.
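The split-validation workflow described above, fitting a regression on one random half of the enrollees and reporting adjusted R² and mean absolute prediction error on the held-out half, can be sketched as follows. The data, the single "morbidity score" predictor, and all coefficients are invented for illustration; they are not the ACG variables or Taiwanese NHI figures.

```python
import random

def fit_simple_ols(xs, ys):
    """Closed-form slope/intercept for y ~ a + b*x (single predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def adjusted_r2(ys, preds, n_predictors):
    """R-squared penalized for the number of predictors."""
    n = len(ys)
    my = sum(ys) / n
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

rng = random.Random(7)
# Hypothetical data: annual expenditure driven by a morbidity score plus noise.
x = [rng.uniform(0, 10) for _ in range(2000)]
y = [100 + 40 * xi + rng.gauss(0, 150) for xi in x]

# Split-validation: fit on one half, score on the held-out half (avoids overfitting).
x_tr, y_tr, x_va, y_va = x[:1000], y[:1000], x[1000:], y[1000:]
a, b = fit_simple_ols(x_tr, y_tr)
preds = [a + b * xi for xi in x_va]
mae = sum(abs(p - yi) for p, yi in zip(preds, y_va)) / len(y_va)
print(round(adjusted_r2(y_va, preds, 1), 3), round(mae, 1))
```

In the study itself the model is multivariate (demographics plus ACG/ADG/EDC indicators), but the validation logic, and the reason richer models score higher adjusted R², is the same.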
ERIC Educational Resources Information Center
Zheng, Lanqin
2016-01-01
This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…
Proposed qualification requirements for selected railroad jobs
DOT National Transportation Integrated Search
1975-01-01
This report proposes minimum safety-related knowledge, performance and training requirements for the jobs of railroad engineer, conductor, brakeman and train dispatcher. Analyses performed were primarily based upon job and task analytic documentatio...
Thermal hydraulic feasibility assessment of the hot conditioning system and process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heard, F.J.
1996-10-10
The Spent Nuclear Fuel Project was established to develop engineered solutions for the expedited removal, stabilization, and storage of spent nuclear fuel from the K Basins at the U.S. Department of Energy's Hanford Site in Richland, Washington. A series of analyses have been completed investigating the thermal-hydraulic performance and feasibility of the proposed Hot Conditioning System and process for the Spent Nuclear Fuel Project. The analyses were performed using a series of thermal-hydraulic models that could respond to all process and safety-related issues that may arise pertaining to the Hot Conditioning System. The subject efforts focus on independently investigating, quantifying, and establishing the governing heat production and removal mechanisms, flow distributions within the multi-canister overpack, and performing process simulations for various purge gases under consideration for the Hot Conditioning System, as well as obtaining preliminary results for comparison with and verification of other analyses, and providing technology-based recommendations for consideration and incorporation into the Hot Conditioning System design bases.
Defect tolerance in resistor-logic demultiplexers for nanoelectronics.
Kuekes, Philip J; Robinett, Warren; Williams, R Stanley
2006-05-28
Since defect rates are expected to be high in nanocircuitry, we analyse the performance of resistor-based demultiplexers in the presence of defects. The defects observed to occur in fabricated nanoscale crossbars are stuck-open, stuck-closed, stuck-short, broken-wire, and adjacent-wire-short defects. We analyse the distribution of voltages on the nanowire output lines of a resistor-logic demultiplexer, based on an arbitrary constant-weight code, when defects occur. These analyses show that resistor-logic demultiplexers can tolerate small numbers of stuck-closed, stuck-open, and broken-wire defects on individual nanowires, at the cost of some degradation in the circuit's worst-case voltage margin. For stuck-short and adjacent-wire-short defects, and for nanowires with too many defects of the other types, the demultiplexer can still achieve error-free performance, but with a smaller set of output lines. This design thus has two layers of defect tolerance: the coding layer improves the yield of usable output lines, and an avoidance layer guarantees that error-free performance is achieved.
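The demultiplexer above addresses its output nanowires with a constant-weight code, and its defect tolerance is expressed through a worst-case voltage margin. A toy sketch of both ideas follows: the codewords are all length-n binary words of weight w, and, as a simplifying assumption (not the paper's resistor-network equations), a line's voltage is taken proportional to the overlap between its codeword and the addressed codeword, so the margin is set by the largest overlap between two distinct codewords.

```python
from itertools import combinations

def constant_weight_code(n, w):
    """All binary codewords of length n and Hamming weight w,
    represented as the set of positions holding a 1."""
    return [frozenset(c) for c in combinations(range(n), w)]

def voltage_margin(code, w):
    """Worst-case margin in a toy overlap model.

    Assumption (illustrative): the selected line sits at overlap w with the
    address; the worst unselected line sits at the maximum overlap between
    distinct codewords; the margin is the normalized gap between them.
    """
    worst = 0
    for a in code:
        for b in code:
            if a != b:
                worst = max(worst, len(a & b))
    return (w - worst) / w

code = constant_weight_code(6, 3)   # C(6,3) = 20 output lines
print(len(code), voltage_margin(code, 3))  # distinct weight-3 words overlap in at most 2 positions
```

Defects that flip bits of a line's effective codeword shrink its overlap gap, which is why the paper reports graceful margin degradation for small defect counts and line retirement beyond that.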
DOT National Transportation Integrated Search
2004-03-01
The report provides the first two major task reports for a study to develop performance specifications and perform supporting objective tests for a planned field operational test (FOT) of a vehicle-based countermeasure to intersection crashes associa...
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log-rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
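The Kaplan-Meier curves that CASAS plots come from the product-limit estimator, which multiplies, at each event time, the fraction of at-risk subjects who survive it. A minimal, dependency-free sketch of that estimator (in Python rather than CASAS's R, and with an invented six-subject cohort) is:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns (time, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk  # survive this event time
            out.append((t, surv))
        at_risk -= deaths + censored        # both leave the risk set
    return out

# Hypothetical cohort: times in months, 0 marks a censored observation.
print(kaplan_meier([5, 8, 8, 12, 16, 23], [1, 1, 0, 1, 0, 1]))
```

Censored subjects count toward the at-risk denominator at their censoring time but never trigger a drop in the curve, which is the estimator's key difference from a naive event-fraction calculation.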
Developmental Origins of Low Mathematics Performance and Normal Variation in Twins from 7 to 9 Years
Haworth, Claire M. A.; Kovas, Yulia; Petrill, Stephen A.; Plomin, Robert
2009-01-01
A previous publication reported the etiology of mathematics performance in 7-year-old twins (Oliver et al., 2004). As part of the same longitudinal study we investigated low mathematics performance and normal variation in a representative United Kingdom sample of 1713 same-sex 9-year-old twins based on teacher-assessed National Curriculum standards. Univariate individual differences and DeFries-Fulker extremes analyses were performed. Similar to our results at 7 years, all mathematics scores at 9 years showed high heritability (.62–.75) and low shared environmental estimates (.00–.11) for both the low performance group and the full sample. Longitudinal analyses were performed from 7 to 9 years. These longitudinal analyses indicated strong genetic continuity from 7 to 9 years for both low performance and mathematics in the normal range. We conclude that, despite the considerable differences in mathematics curricula from 7 to 9 years, the same genetic effects largely operate at the two ages. PMID:17539370
Lessons Learned During TBCC Design for the NASA-AFRL Joint System Study
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.; Espinosa, A. M.
2013-01-01
NASA and the Air Force Research Laboratory are involved in a Joint System Study (JSS) on Two-Stage-to-Orbit (TSTO) vehicles. The JSS will examine the performance, operability and analysis uncertainty of unmanned, fully reusable, TSTO launch vehicle concepts. NASA is providing a vehicle concept using turbine-based combined cycle (TBCC) propulsion on the booster stage and an all-rocket orbiter. The variation in vehicle and mission requirements for different potential customers, combined with analysis uncertainties, makes it problematic to define optimum vehicle types or concepts, but the study is being used by NASA for tool assessment and development, and to identify technology gaps. Preliminary analyses were performed on the entire TBCC booster concept; then higher-fidelity analyses were performed for particular areas to verify results or reduce analysis uncertainties. Preliminary TBCC system analyses indicated that there would be sufficient thrust margin over its mission portion. The higher-fidelity analyses, which included inlet and nozzle performance corrections for significant area mismatches between TBCC propulsion requirements and the vehicle design, resulted in significant performance penalties relative to the preliminary results. TBCC system design and vehicle operation assumptions were reviewed to identify items to mitigate these performance penalties. The most promising items were then applied and analyses rerun to update performance predictions. A study overview is given to orient the reader, quickly focusing upon the NASA TBCC booster and low speed propulsion system. Details for the TBCC concept and the analyses performed are described. Finally, a summary of "Lessons Learned" is discussed with suggestions to improve future study efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinones, Armando, Sr.; Bibeau, Tiffany A.; Ho, Clifford Kuofei
2008-08-01
Finite-element analyses were performed to simulate the response of a hypothetical vertical masonry wall subject to different lateral loads with and without continuous horizontal filament ties laid between rows of concrete blocks. A static loading analysis and cost comparison were also performed to evaluate optimal materials and designs for the spacers affixed to the filaments. Results showed that polypropylene, ABS, and polyethylene (high density) were suitable materials for the spacers based on performance and cost, and the short T-spacer design was optimal based on its performance and functionality. Simulations of vertical walls subject to static loads representing 100 mph winds (0.2 psi) and a seismic event (0.66 psi) showed that the simulated walls performed similarly and adequately when subject to these loads with and without the ties. Additional simulations and tests are required to assess the performance of actual walls with and without the ties under greater loads and more realistic conditions (e.g., cracks, non-linear response).
Geospace Environment Modeling 2008-2009 Challenge: Ground Magnetic Field Perturbations
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Kuznetsova, M.; Ridley, A.; Raeder, J.; Vapirev, A.; Weimer, D.; Weigel, R. S.; Wiltberger, M.; Millward, G.; Rastatter, L.;
2011-01-01
Acquiring quantitative, metrics-based knowledge about the performance of various space physics modeling approaches is central for the space weather community. Quantification of the performance helps the users of the modeling products to better understand the capabilities of the models and to choose the approach that best suits their specific needs. Further, metrics-based analyses are important for addressing the differences between various modeling approaches and for measuring and guiding the progress in the field. In this paper, the metrics-based results of the ground magnetic field perturbation part of the Geospace Environment Modeling 2008-2009 Challenge are reported. Predictions made by 14 different models, including an ensemble model, are compared to geomagnetic observatory recordings from 12 different northern hemispheric locations. Five different metrics are used to quantify the model performances for four storm events. It is shown that the ranking of the models is strongly dependent on the type of metric used to evaluate the model performance. None of the models ranks near or at the top systematically for all used metrics. Consequently, one cannot pick an absolute winner: the choice of the best model depends on the characteristics of the signal one is interested in. Model performances also vary from event to event. This is particularly clear for root-mean-square difference and utility metric-based analyses. Further, analyses indicate that for some of the models, increasing the global magnetohydrodynamic model spatial resolution and including ring current dynamics improve the models' capability to generate more realistic ground magnetic field fluctuations.
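Two of the metric families mentioned above can be sketched concretely: a root-mean-square difference between observed and predicted series, and a skill score that compares a model against simply predicting the observed mean (a common normalization; the Challenge's actual five metrics and data differ). The series below are invented.

```python
import math

def rmse(obs, pred):
    """Root-mean-square difference between two equal-length series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def prediction_efficiency(obs, pred):
    """Skill relative to predicting the observed mean (1 = perfect, <= 0 = no skill)."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_obs = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_obs

# Hypothetical ground-perturbation series: observed vs. two model predictions.
obs = [0.0, 1.0, 4.0, 9.0, 5.0, 2.0]
m_a = [0.0, 1.5, 3.5, 8.0, 5.5, 2.5]   # tracks the storm peak
m_b = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # flat prediction, misses the peak
for name, pred in (("A", m_a), ("B", m_b)):
    print(name, round(rmse(obs, pred), 2), round(prediction_efficiency(obs, pred), 2))
```

Because each metric weights errors differently (amplitude, timing, variability), model rankings can legitimately flip from one metric to another, which is exactly the paper's central finding.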
Harvey, Philip D.; Aslan, Mihaela; Du, Mengtian; Zhao, Hongyu; Siever, Larry J.; Pulver, Ann; Gaziano, J. Michael; Concato, John
2015-01-01
Objective: Impairments in cognition and everyday functioning are common in schizophrenia and bipolar disorder. Based on two studies of schizophrenia (SCZ) and bipolar I disorder (BPI) with similar methods, this paper presents factor analyses of cognitive and functional capacity (FC) measures. The overall goal of these analyses was to determine whether performance-based assessments should be examined individually or aggregated on the basis of the correlational structure of the tests, as well as to evaluate the similarity of factor structures in SCZ and BPI. Method: Veterans Affairs (VA) Cooperative Studies Program study #572 evaluated cognitive and FC measures among 5,414 BPI and 3,942 SCZ patients. A second study evaluated similar neuropsychological (NP) and FC measures among 368 BPI and 436 SCZ patients. Principal components analysis, as well as exploratory and confirmatory factor analyses, were used to examine the data. Results: Analyses in both datasets suggested that NP and FC measures were explained by a single underlying factor in BPI and SCZ patients, whether analyzed separately or in a combined sample. The factor structure in both studies was similar, with or without inclusion of FC measures; homogeneous loadings were observed for that single factor across cognitive and FC domains across the samples. Conclusions: The empirically derived factor model suggests that NP performance and FC are best explained as a single latent trait applicable to people with schizophrenia and bipolar illness. This single measure may enhance the robustness of analyses relating genomic data to performance-based phenotypes. PMID:26710094
ANALYSES OF FISH TISSUE BY VACUUM DISTILLATION/GAS CHROMATOGRAPHY/MASS SPECTROMETRY
The analyses of fish tissue using VD/GC/MS with surrogate-based matrix corrections is described. Techniques for equilibrating surrogate and analyte spikes with a tissue matrix are presented, and equilibrated spiked samples are used to document method performance. The removal of a...
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.; Holt, James B.; Canabal, Francisco
2001-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Electric Propulsion Performance from Geo-transfer to Geosynchronous Orbits
NASA Technical Reports Server (NTRS)
Dankanich, John W.; Carpenter, Christian B.
2007-01-01
For near-Earth application, solar electric propulsion advocates have focused on Low Earth Orbit (LEO) to Geosynchronous (GEO) low-thrust transfers because of the significant improvement in capability over chemical alternatives. While the performance gain attained from starting with a lower orbit is large, there are also increased transfer times and radiation exposure risk that has hindered the commercial advocacy for electric propulsion stages. An incremental step towards electric propulsion stages is the use of integrated solar electric propulsion systems (SEPS) for GTO to GEO transfer. Thorough analyses of electric propulsion systems options and performance are presented. Results are based on existing or near-term capabilities of Arcjets, Hall thrusters, and Gridded Ion engines. Parametric analyses based on "rubber" thruster and launch site metrics are also provided.
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.
2011-01-01
A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.
ERIC Educational Resources Information Center
Sangiumvibool, Payear; Chonglerttham, Supasith
2017-01-01
This study presents analyses of panel data from 2007 to 2011 from various authoritative sources of information on public universities in Thailand. The focus is on factors that influence the budgetary decision-making process in providing educational services to the general public under a recently implemented performance-based budgeting system.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-05
.... This draft document describes the quantitative analyses that are being conducted as part of the review... primary (health-based) CO NAAQS, the Agency is conducting qualitative and quantitative assessments... results, observations, and related uncertainties associated with the quantitative analyses performed. An...
Budde, Kristin S.; Barron, Daniel S.; Fox, Peter T.
2015-01-01
Developmental stuttering is a speech disorder most likely due to a heritable form of developmental dysmyelination impairing the function of the speech-motor system. Speech-induced brain-activation patterns in persons who stutter (PWS) are anomalous in various ways; the consistency of these aberrant patterns is a matter of ongoing debate. Here, we present a hierarchical series of coordinate-based meta-analyses addressing this issue. Two tiers of meta-analyses were performed on a 17-paper dataset (202 PWS; 167 fluent controls). Four large-scale (top-tier) meta-analyses were performed, two for each subject group (PWS and controls). These analyses robustly confirmed the regional effects previously postulated as “neural signatures of stuttering” (Brown 2005) and extended this designation to additional regions. Two smaller-scale (lower-tier) meta-analyses refined the interpretation of the large-scale analyses: 1) a between-group contrast targeting differences between PWS and controls (stuttering trait); and 2) a within-group contrast (PWS only) of stuttering with induced fluency (stuttering state). PMID:25463820
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
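The Monte Carlo envelope idea, sampling paired motors whose thrust shares a common (motor-to-motor) variation plus an independent within-pair variation, and tracking the worst imbalance over many pairs, can be sketched as below. All distributions and magnitudes are invented for illustration; the actual analyses drew their 33 + 38 variable statistics from MSFC booster data and ran full internal ballistics codes.

```python
import random

def thrust_imbalance_envelope(n_pairs=1000, seed=42):
    """Monte Carlo sketch of a paired-booster thrust imbalance envelope.

    Each pair shares a lot-level bias (motor-to-motor variation) and each
    motor adds its own scatter (within-pair variation); thrust is in
    percent of nominal. Returns the widest imbalance over all pairs.
    """
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(n_pairs):
        lot_bias = rng.gauss(0.0, 0.5)                  # shared, cancels partially
        t_a = 100.0 + lot_bias + rng.gauss(0.0, 0.3)    # booster A
        t_b = 100.0 + lot_bias + rng.gauss(0.0, 0.3)    # booster B
        worst = max(worst, abs(t_a - t_b))
    return worst

print(round(thrust_imbalance_envelope(), 2))
```

Note that the shared lot bias drops out of the difference t_a - t_b, which is why within-pair variation, not overall motor variation, dominates the imbalance envelope in this kind of analysis.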
Study 2.5 final report. DORCA computer program. Volume 5: Analysis report
NASA Technical Reports Server (NTRS)
Campbell, N.
1972-01-01
A modification of the Dynamic Operational Requirements and Cost Analysis Program to perform traffic analyses of the automated satellite program is described. Inherent in the analyses of the automated satellite program was the assumption that a number of vehicles were available to perform any or all of the missions within the satellite program. The objective of the modification was to select a vehicle or group of vehicles for performing all of the missions at the lowest possible cost. A vehicle selection routine and the capability to simulate ground based vehicle operational modes were incorporated into the program.
Lee, Inhan; Williams, Christopher R.; Athey, Brian D.; Baker, James R.
2010-01-01
Molecular dynamics simulations of nano-therapeutics as a final product and of all intermediates in the process of generating a multi-functional nano-therapeutic based on a poly(amidoamine) (PAMAM) dendrimer were performed along with chemical analyses of each of them. The actual structures of the dendrimers were predicted, based on potentiometric titration, gel permeation chromatography, and NMR. The chemical analyses determined the numbers of functional molecules, based on the actual structure of the dendrimer. Molecular dynamics simulations calculated the configurations of the intermediates and the radial distributions of functional molecules, based on their numbers. This interactive process between the simulation results and the chemical analyses provided a further strategy to design the next reaction steps and to gain insight into the products at each chemical reaction step. PMID:20700476
Instrumentation and Performance Analysis Plans for the HIFiRE Flight 2 Experiment
NASA Technical Reports Server (NTRS)
Gruber, Mark; Barhorst, Todd; Jackson, Kevin; Eklund, Dean; Hass, Neal; Storch, Andrea M.; Liu, Jiwen
2009-01-01
Supersonic combustion performance of a bi-component gaseous hydrocarbon fuel mixture is one of the primary aspects under investigation in the HIFiRE Flight 2 experiment. In-flight instrumentation and post-test analyses will be two key elements used to determine the combustion performance. Pre-flight computational fluid dynamics (CFD) analyses provide valuable information that can be used to optimize the placement of a constrained set of wall pressure instrumentation in the experiment. The simulations also allow pre-flight assessments of performance sensitivities leading to estimates of overall uncertainty in the determination of combustion efficiency. Based on the pre-flight CFD results, 128 wall pressure sensors have been located throughout the isolator/combustor flowpath to minimize the error in determining the wall pressure force at Mach 8 flight conditions. Also, sensitivity analyses show that mass capture and combustor exit stream thrust are the two primary contributors to uncertainty in combustion efficiency.
"Greenbelt or Gutter": Youth "Place-Based" Performance and the Myth of the Suburban/Urban Divide
ERIC Educational Resources Information Center
Wessels, Anne
2014-01-01
A youth-created "place-based" performance set in the grounds of their suburban school challenged the myth of the suburban/urban divide that pits the edenic suburb against the dirty and crime-ridden city. Depicting the power relations of a failed utopia, these youth provoked the researcher to embark on further inquiry, analysing other…
Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay
2007-01-01
To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers a 7.8% to 10.7% value gain for the treatment of classic subfoveal choroidal neovascularization. Intravitreal ranibizumab therapy confers greater than a 15% value gain for the treatment of subfoveal occult and minimally classic subfoveal choroidal neovascularization. The majority of cost-utility studies performed on interventions for neovascular macular degeneration are value-based medicine studies and thus are comparable. Value-based analyses of neovascular ARMD monotherapies demonstrate the power of value-based medicine to improve quality of care and concurrently maximize the efficacy of healthcare resource use in public policy. The comparability of value-based medicine cost-utility analyses has important implications for overall practice standards and public policy. The adoption of value-based medicine standards can greatly facilitate the goal of higher-quality care and maximize the best use of healthcare funds.
The space shuttle launch vehicle aerodynamic verification challenges
NASA Technical Reports Server (NTRS)
Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.
1985-01-01
The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs which faced these same challenges were unmanned instrumented flights of simple aerodynamically shaped vehicles. However, the manned SSV flight test program made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRB). The analyses of flight data did not verify the aerodynamics or performance preflight predictions of the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community also was challenged to understand the discrepancy between the wind tunnel and flight defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analyses challenges which led to the SSV system performance verification and which will lead to the verification of the operational ascent aerodynamics data base are presented.
Śliwińska, Anna; Burchart-Korol, Dorota; Smoliński, Adam
2017-01-01
This paper presents a life cycle assessment (LCA) of greenhouse gas emissions generated through methanol and electricity co-production system based on coal gasification technology. The analysis focuses on polygeneration technologies from which two products are produced, and thus, issues related to an allocation procedure for LCA are addressed in this paper. In the LCA, two methods were used: a 'system expansion' method based on two approaches, the 'avoided burdens approach' and 'direct system enlargement' methods and an 'allocation' method involving proportional partitioning based on physical relationships in a technological process. Cause-effect relationships in the analysed production process were identified, allowing for the identification of allocation factors. The 'system expansion' method involved expanding the analysis to include five additional variants of electricity production technologies in Poland (alternative technologies). This method revealed environmental consequences of implementation for the analysed technologies. It was found that the LCA of polygeneration technologies based on the 'system expansion' method generated a more complete source of information on environmental consequences than the 'allocation' method. The analysis shows that alternative technologies chosen for generating LCA results are crucial. Life cycle assessment was performed for the analysed, reference and variant alternative technologies. Comparative analysis was performed between the analysed technologies of methanol and electricity co-production from coal gasification as well as a reference technology of methanol production from the natural gas reforming process. Copyright © 2016 Elsevier B.V. All rights reserved.
ISAAC - InterSpecies Analysing Application using Containers.
Baier, Herbert; Schultz, Jörg
2014-01-15
Information about genes, transcripts and proteins is spread over a wide variety of databases. Different tools have been developed that use these databases to identify biological signals in gene lists from large-scale analyses, mostly by searching for enrichments of specific features. However, these tools do not allow an explorative walk through different views, nor modification of the gene lists as new hypotheses emerge. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point of the analysis. Detailed history and snapshot information allows tracing of each action; one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionalities through which sets as well as analysis results can be exchanged between group members. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, JavaEE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface, including tools for the integration of third-party data, so a local installation is easily feasible. In summary, ISAAC is tailor-made for highly explorative, interactive analyses of gene, transcript and protein sets in a collaborative environment.
Missing value imputation for microarray data: a comprehensive comparison study and a web tool.
Chiu, Chia-Chun; Chan, Shih-Yao; Wang, Chung-Ching; Wu, Wei-Sheng
2013-01-01
Microarray data are usually peppered with missing values for various reasons, yet most downstream analyses require complete datasets. Therefore, accurate algorithms for missing value estimation are needed to improve the performance of microarray data analyses. Although many algorithms have been developed, there is ongoing debate about the selection of the optimal algorithm. Studies comparing the performance of different algorithms remain far from comprehensive, especially in the number of benchmark datasets used, the number of algorithms compared, the rounds of simulation conducted, and the performance measures used. In this paper, we performed a comprehensive comparison using (I) thirteen datasets, (II) nine algorithms, (III) 110 independent runs of simulation, and (IV) three types of measures to evaluate the performance of each imputation algorithm fairly. First, the effects of different types of microarray datasets on the performance of each imputation algorithm were evaluated. Second, we discussed whether datasets from different species have a different impact on the performance of different algorithms. To assess each algorithm fairly, all evaluations were performed using the three types of measures. Our results indicate that the performance of an imputation algorithm depends mainly on the type of dataset, not on the species from which the samples come. In addition to the statistical measure, two other measures with biological meaning are useful for reflecting the impact of missing value imputation on the downstream data analyses. Our study suggests that local-least-squares-based methods are good choices for handling missing values in most microarray datasets. In this work, we carried out a comprehensive comparison of algorithms for microarray missing value imputation; based on such a comparison, researchers can easily choose the optimal algorithm for their datasets.
Moreover, new imputation algorithms could be compared with the existing algorithms using this comparison strategy as a standard protocol. In addition, to assist researchers in dealing with missing values easily, we built a web-based and easy-to-use imputation tool, MissVIA (http://cosbi.ee.ncku.edu.tw/MissVIA), which supports many imputation algorithms. Once users upload a real microarray dataset and choose the imputation algorithms, MissVIA will determine the optimal algorithm for the users' data through a series of simulations, and then the imputed results can be downloaded for the downstream data analyses.
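The masking simulation that underlies this kind of imputation benchmark can be sketched as follows. This is an assumed, minimal protocol: known entries are hidden, a baseline imputer (here a simple row mean, not one of the nine compared algorithms) fills them back in, and the result is scored with normalized RMSE; the hiding rate and scorer are illustrative choices.

```python
import math
import random

def nrmse_of_row_mean_imputation(matrix, missing_rate=0.1, seed=1):
    """Hide a fraction of known entries, impute them with a row-mean
    baseline, and score the guesses against the hidden truth with
    normalized root-mean-square error (lower is better)."""
    rng = random.Random(seed)
    rows, cols = len(matrix), len(matrix[0])
    cells = [(i, j) for i in range(rows) for j in range(cols)]
    hidden = set(rng.sample(cells, max(1, int(missing_rate * len(cells)))))
    truth, guess = [], []
    for i, j in hidden:
        observed = [matrix[i][k] for k in range(cols) if (i, k) not in hidden]
        truth.append(matrix[i][j])
        guess.append(sum(observed) / len(observed))  # row-mean imputation
    mse = sum((t - g) ** 2 for t, g in zip(truth, guess)) / len(truth)
    mean_t = sum(truth) / len(truth)
    var_t = sum((t - mean_t) ** 2 for t in truth) / len(truth)
    return math.sqrt(mse / var_t) if var_t > 0 else math.sqrt(mse)
```

A real comparison study would swap each candidate algorithm into the imputation step and repeat the masking over many independent runs, exactly as the abstract's 110-run design does.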
Hypoglycemia alarm enhancement using data fusion.
Skladnev, Victor N; Tarnavskii, Stanislav; McGregor, Thomas; Ghevondian, Nejhdeh; Gourlay, Steve; Jones, Timothy W
2010-01-01
The acceptance of closed-loop blood glucose (BG) control using continuous glucose monitoring systems (CGMS) is likely to improve with enhanced performance of their integral hypoglycemia alarms. This article presents an in silico analysis (based on clinical data) of a modeled CGMS alarm system with trained thresholds on type 1 diabetes mellitus (T1DM) patients that is augmented by sensor fusion from a prototype hypoglycemia alarm system (HypoMon). This prototype alarm system is based on largely independent autonomic nervous system (ANS) response features. Alarm performance was modeled using overnight BG profiles recorded previously on 98 T1DM volunteers. These data included the corresponding ANS response features detected by HypoMon (AiMedics Pty. Ltd.) systems. CGMS data and alarms were simulated by applying a probabilistic model to these overnight BG profiles. The probabilistic model developed used a mean response delay of 7.1 minutes, measurement error offsets on each sample of +/- standard deviation (SD) = 4.5 mg/dl (0.25 mmol/liter), and vertical shifts (calibration offsets) of +/- SD = 19.8 mg/dl (1.1 mmol/liter). Modeling produced 90 to 100 simulated measurements per patient. Alarm systems for all analyses were optimized on a training set of 46 patients and evaluated on the test set of 56 patients. The split between the sets was based on enrollment dates. Optimization was based on detection accuracy but not time to detection for these analyses. The contribution of this form of data fusion to hypoglycemia alarm performance was evaluated by comparing the performance of the trained CGMS and fused data algorithms on the test set under the same evaluation conditions. The simulated addition of HypoMon data produced an improvement in CGMS hypoglycemia alarm performance of 10% at equal specificity. Sensitivity improved from 87% (CGMS as stand-alone measurement) to 97% for the enhanced alarm system. Specificity was maintained constant at 85%. 
Positive predictive values on the test set improved from 61 to 66% with negative predictive values improving from 96 to 99%. These enhancements were stable within sensitivity analyses. Sensitivity analyses also suggested larger performance increases at lower CGMS alarm performance levels. Autonomic nervous system response features provide complementary information suitable for fusion with CGMS data to enhance nocturnal hypoglycemia alarms. 2010 Diabetes Technology Society.
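The fusion logic this abstract describes can be sketched as a simple decision rule. Everything here is an illustrative assumption rather than HypoMon's actual algorithm: the glucose threshold, the "grey zone" just above it where the independent ANS score is allowed to raise the alarm, and the ANS score scale are all invented for the sketch.

```python
def fused_hypo_alarm(cgms_glucose, ans_score,
                     glucose_threshold=70.0, grey_zone=15.0,
                     ans_threshold=0.5):
    """Sensor-fusion sketch: the CGMS reading alone triggers the alarm at
    or below its trained threshold, and a largely independent ANS-response
    score rescues borderline readings just above it. Units (mg/dl) and all
    thresholds are illustrative."""
    if cgms_glucose <= glucose_threshold:
        return True  # CGMS alone triggers
    if cgms_glucose <= glucose_threshold + grey_zone:
        return ans_score >= ans_threshold  # ANS feature decides borderline cases
    return False
```

Restricting the ANS contribution to a borderline band is one way such a fused rule can raise sensitivity while holding specificity roughly constant, the trade-off reported above.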
Estimation of rail wear limits based on rail strength investigations
DOT National Transportation Integrated Search
1998-12-01
This report describes analyses performed to estimate limits on rail wear based on strength investigations. Two different failure modes are considered in this report: (1) permanent plastic bending, and (2) rail fracture. Rail bending stresses are calc...
7 CFR 91.5 - Where services are offered.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Science and Technology Programs National Science Laboratory. A variety of proximate composition, chemical, physical, microbiological and biomolecular (DNA-based) tests and laboratory analyses performed on..., honey, meat and meat products, fiber products and processed foods are performed at the Science and...
7 CFR 91.5 - Where services are offered.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Science and Technology Programs National Science Laboratory. A variety of proximate composition, chemical, physical, microbiological and biomolecular (DNA-based) tests and laboratory analyses performed on..., honey, meat and meat products, fiber products and processed foods are performed at the Science and...
7 CFR 91.5 - Where services are offered.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Science and Technology Programs National Science Laboratory. A variety of proximate composition, chemical, physical, microbiological and biomolecular (DNA-based) tests and laboratory analyses performed on..., honey, meat and meat products, fiber products and processed foods are performed at the Science and...
7 CFR 91.5 - Where services are offered.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Science and Technology Programs National Science Laboratory. A variety of proximate composition, chemical, physical, microbiological and biomolecular (DNA-based) tests and laboratory analyses performed on..., honey, meat and meat products, fiber products and processed foods are performed at the Science and...
Consumer acceptance of fresh blueberries in bio-based packages.
Almenar, Eva; Samsudin, Hayati; Auras, Rafael; Harte, Janice
2010-05-01
Instrumental analyses have shown that non-vented bio-based containers made from poly(lactic acid) (PLA) have the capability to enhance blueberry shelf life as compared with commercial vented petroleum-based clamshell containers. However, consumer preference has not been explored so far. In this study, two sensory evaluations, triangle and paired preference tests, were performed after storing fruit in both containers at 3 and 10 °C for 7 and 14 days. In addition, physicochemical analyses were performed after each tasting in order to correlate instrumental findings with consumer preference. The results of the triangle test showed the capability of the consumer to differentiate (P ≤ 0.001) between blueberries from different packages at both storage temperatures. A consumer preference for flavour, texture, external appearance and overall quality (P ≤ 0.001) of blueberries packaged in PLA containers was observed in the paired comparison test. The instrumental analyses showed that blueberries in the PLA packages exhibited a weight loss below the limit for marketable life, a stable soluble solid content and titratable acidity and no fungal growth during storage. Consumers distinguished between blueberries from different packages and preferred those packaged in the PLA containers. The instrumental analyses showed that the usable life of the berries was extended in the PLA containers. A correlation between consumer preference and instrumental evaluations was found.
Shrinkage regression-based methods for microarray missing value imputation.
Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng
2013-01-01
Missing values commonly occur in microarray data, which usually contain more than 5% missing values, with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than other types of methods on many testing microarray datasets. To further improve the performance of the regression-based methods, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. In addition, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation on six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
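The core recipe described in the abstract (select similar genes by Pearson correlation, fit a least-squares regression, shrink the coefficients, predict the missing entry) can be sketched as below. The single-neighbor regression and the fixed shrink factor are deliberate simplifications of the paper's estimator, chosen to keep the sketch short.

```python
import math

def _pearson(x, y):
    """Pearson correlation coefficient (0.0 if either series is constant)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy) if vx > 0 and vy > 0 else 0.0

def impute_with_shrinkage(target, neighbors, missing_idx, shrink=0.8):
    """Pick the neighbor gene most correlated with the target over the
    observed positions, fit an ordinary least-squares line, shrink the
    slope toward zero, and predict the missing entry."""
    obs = [k for k in range(len(target)) if k != missing_idx]
    tx = [target[k] for k in obs]
    best = max(neighbors, key=lambda g: abs(_pearson([g[k] for k in obs], tx)))
    gx = [best[k] for k in obs]
    mg, mt = sum(gx) / len(gx), sum(tx) / len(tx)
    slope = (sum((a - mg) * (b - mt) for a, b in zip(gx, tx))
             / sum((a - mg) ** 2 for a in gx))
    return mt + shrink * slope * (best[missing_idx] - mg)  # shrunken prediction
```

Shrinking the slope trades a little bias for lower variance, which is what makes such estimators robust when the neighbor correlation is estimated from few, noisy observations.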
A decision model for cost effective design of biomass based green energy supply chains.
Yılmaz Balaman, Şebnem; Selim, Hasan
2015-09-01
The core aim of this study is to design anaerobic digestion based biomass-to-energy supply chains in a cost-effective manner. To this end, a decision model is developed. The model is based on fuzzy multi-objective decision making in order to simultaneously optimize multiple economic objectives and tackle the inherent uncertainties in the parameters and in decision makers' aspiration levels for the goals. The viability of the decision model is explored with computational experiments on a real-world biomass-to-energy supply chain, and further analyses are performed to observe the effects of different conditions. To this aim, scenario analyses are conducted to investigate the effects of energy crop utilization and operational costs on supply chain structure and performance measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Requirements for Next Generation Comprehensive Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Datta, Anubhav
2008-01-01
The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements are described and substantiated for what must be included and justification provided for what should be excluded. With this guide, a path to the next generation code can be found.
Contamination analyses of technology mirror assembly optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1991-01-01
Automated electron microprobe analyses were performed on tape lift samples from the Technology Mirror Assembly (TMA) optical surfaces. Details of the analyses are given, and the contamination of the mirror surfaces is discussed. Based on the automated analyses of the tape lifts from the TMA surfaces and the control blank, we can conclude that the particles identified on the actual samples were not a result of contamination due to the handling or sampling process itself and that the particles reflect the actual contamination on the surface of the mirror.
NASA Astrophysics Data System (ADS)
Dong, Leng; Chen, Yan; Dias, Sarah; Stone, William; Dias, Joseph; Rout, John; Gale, Alastair G.
2017-03-01
Visual search techniques and FROC analysis have been widely used in radiology to understand medical image perceptual behaviour and diagnostic performance. The potential of exploiting the advantages of both methodologies is of great interest to medical researchers. In this study, eye tracking data from eight dental practitioners were investigated; the visual search measures and their analyses are considered here. Each participant interpreted 20 dental radiographs chosen by an expert dental radiologist. Various eye movement measurements were obtained based on image area of interest (AOI) information. FROC analysis was then carried out using these eye movement measurements as a direct input source, and the performance of FROC methods using different input parameters was tested. The results showed significant differences in FROC measures, based on eye movement data, between groups with different experience levels: the area under the curve (AUC) scores for the fixation and dwell-time measurements were higher for the experienced group. Positive correlations were also found between AUC scores from the eye-movement-based FROC and the rating-based FROC. FROC analysis using eye movement measurements as input variables can thus act as a performance indicator for assessment in medical image interpretation and for evaluating training procedures. These analyses point to new ways of combining eye movement data and FROC methods to assess performance and visual search behaviour in medical imaging perceptual tasks.
Budde, Kristin S; Barron, Daniel S; Fox, Peter T
2014-12-01
Developmental stuttering is a speech disorder most likely due to a heritable form of developmental dysmyelination impairing the function of the speech-motor system. Speech-induced brain-activation patterns in persons who stutter (PWS) are anomalous in various ways; the consistency of these aberrant patterns is a matter of ongoing debate. Here, we present a hierarchical series of coordinate-based meta-analyses addressing this issue. Two tiers of meta-analyses were performed on a 17-paper dataset (202 PWS; 167 fluent controls). Four large-scale (top-tier) meta-analyses were performed, two for each subject group (PWS and controls). These analyses robustly confirmed the regional effects previously postulated as "neural signatures of stuttering" (Brown, Ingham, Ingham, Laird, & Fox, 2005) and extended this designation to additional regions. Two smaller-scale (lower-tier) meta-analyses refined the interpretation of the large-scale analyses: (1) a between-group contrast targeting differences between PWS and controls (stuttering trait); and (2) a within-group contrast (PWS only) of stuttering with induced fluency (stuttering state). Copyright © 2014 Elsevier Inc. All rights reserved.
An evaluation of GTAW-P versus GTA welding of alloy 718
NASA Technical Reports Server (NTRS)
Gamwell, W. R.; Kurgan, C.; Malone, T. W.
1991-01-01
Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current GTAW process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post-welded material conditions. Analyses of variance were performed on the data to determine whether there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. The statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post-welding condition.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Analysis For Monitoring the Earth Science Afternoon Constellation
NASA Technical Reports Server (NTRS)
Demarest, Peter; Richon, Karen V.; Wright, Frank
2005-01-01
The Earth Science Afternoon Constellation consists of Aqua, Aura, PARASOL, CALIPSO, Cloudsat, and the Orbiting Carbon Observatory (OCO). The coordination of flight dynamics activities between these missions is critical to the safety and success of the Afternoon Constellation. This coordination is based on two main concepts, the control box and the zone-of-exclusion. This paper describes how these two concepts are implemented in the Constellation Coordination System (CCS). The CCS is a collection of tools that enables the collection and distribution of flight dynamics products among the missions, allows cross-mission analyses to be performed through a web-based interface, performs automated analyses to monitor the overall constellation, and notifies the missions of changes in the status of the other missions.
An Evaluation of the Impact of E-Learning Media Formats on Student Perception and Performance
NASA Astrophysics Data System (ADS)
Kurbel, Karl; Stankov, Ivo; Datsenka, Rastsislau
Factors influencing student evaluation of web-based courses are analyzed, based on student feedback from an online distance-learning graduate program. The impact of different media formats on the students' perception of the courses, as well as on their performance in these courses, is examined. In particular, we studied conventional hypertext-based courses, video-based courses and audio-based courses, and tried to find out whether the media format has an effect on how students assess courses and how good or bad their grades are. Statistical analyses were performed to answer several research questions related to the topic and to properly evaluate the factors influencing student evaluation.
Harvey, Philip D; Aslan, Mihaela; Du, Mengtian; Zhao, Hongyu; Siever, Larry J; Pulver, Ann; Gaziano, J Michael; Concato, John
2016-01-01
Impairments in cognition and everyday functioning are common in schizophrenia and bipolar disorder (BPD). In this article, we present factor analyses of cognitive and functional capacity (FC) measures based on 2 studies of schizophrenia (SCZ) and bipolar I disorder (BPI) using similar methods. The overall goal of these analyses was to determine whether performance-based assessments should be examined individually, or aggregated on the basis of the correlational structure of the tests, as well as to evaluate the similarity of factor structures of SCZ and BPI. Veterans Affairs Cooperative Studies Program Study #572 (Harvey et al., 2014) evaluated cognitive and FC measures among 5,414 BPI and 3,942 SCZ patients. A 2nd study evaluated similar neuropsychological (NP) and FC measures among 368 BPI and 436 SCZ patients. Principal components analysis, as well as exploratory and confirmatory factor analyses (CFAs), were used to examine the data. Analyses in both datasets suggested that NP and FC measures were explained by a single underlying factor in BPI and SCZ patients, whether analyzed separately or as a combined sample. The factor structure in both studies was similar, with or without inclusion of FC measures; homogeneous loadings were observed for that single factor across cognitive and FC domains across the samples. The empirically derived factor model suggests that NP performance and FC are best explained as a single latent trait applicable to people with SCZ and BPD. This single measure may enhance the robustness of the analyses relating genomic data to performance-based phenotypes. (c) 2015 APA, all rights reserved.
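The single-factor finding above can be illustrated with a minimal sketch: simulate a battery of test scores driven by one latent trait, then check whether the first eigenvalue of the correlation matrix dominates. All numbers below are synthetic and purely illustrative; this is not the study's data or its exact analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic battery: one latent trait drives 6 "test" scores (illustrative only).
n, k = 500, 6
latent = rng.normal(size=n)
loadings = rng.uniform(0.6, 0.9, size=k)
scores = np.outer(latent, loadings) + rng.normal(scale=0.5, size=(n, k))

# Eigen-decompose the correlation matrix; a dominant first eigenvalue
# is consistent with a single underlying factor.
corr = np.corrcoef(scores, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
ratio = eigvals[0] / eigvals[1]
print(eigvals.round(2), round(ratio, 1))
```

In practice one would follow this screening step with a proper exploratory or confirmatory factor analysis, as the study did.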
Performance-based management and quality of work: an empirical assessment.
Falzon, Pierre; Nascimento, Adelaide; Gaudart, Corinne; Piney, Cécile; Dujarier, Marie-Anne; Germe, Jean-François
2012-01-01
In France, in the private sector as in the public sector, performance-based management tends to become a norm. Performance-based management is supposed to improve service quality, productivity and efficiency, transparency of allotted means and achieved results, and to better focus the activity of employees and of the whole organization. This text reports a study conducted for the French Ministry of Budget by a team of researchers in ergonomics, sociology and management science, in order to assess the impact of performance-based management on employees, on teams and on work organization. About 100 interviews were conducted with employees of all categories and 6 working groups were set up in order to discuss and validate or amend our first analyses. Results concern several aspects: workload and work intensification, indicators and performance management and the transformation of jobs induced by performance management.
Situation Venice: Towards a Performative "Ex-Planation" of a City
ERIC Educational Resources Information Center
Whybrow, Nicolas
2011-01-01
The article's main concern is to analyse theoretical and artistic factors influencing the attempt by a group of undergraduate students (at the University of Warwick, UK) to produce a "performative mapping" of the city of Venice. In other words, it asks what kind of performance-based strategies might usefully be applied in the process of…
Steingroever, Helen; Pachur, Thorsten; Šmíra, Martin; Lee, Michael D
2018-06-01
The Iowa Gambling Task (IGT) is one of the most popular experimental paradigms for comparing complex decision-making across groups. Most commonly, IGT behavior is analyzed using frequentist tests to compare performance across groups, and to compare inferred parameters of cognitive models developed for the IGT. Here, we present a Bayesian alternative based on Bayesian repeated-measures ANOVA for comparing performance, and a suite of three complementary model-based methods for assessing the cognitive processes underlying IGT performance. The three model-based methods involve Bayesian hierarchical parameter estimation, Bayes factor model comparison, and Bayesian latent-mixture modeling. We illustrate these Bayesian methods by applying them to test the extent to which differences in intuitive versus deliberate decision style are associated with differences in IGT performance. The results show that intuitive and deliberate decision-makers behave similarly on the IGT, and the modeling analyses consistently suggest that both groups of decision-makers rely on similar cognitive processes. Our results challenge the notion that individual differences in intuitive and deliberate decision styles have a broad impact on decision-making. They also highlight the advantages of Bayesian methods, especially their ability to quantify evidence in favor of the null hypothesis, and that they allow model-based analyses to incorporate hierarchical and latent-mixture structures.
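A key advantage highlighted above is that Bayes factors can quantify evidence *for* the null hypothesis of no group difference. As a rough sketch of that idea (not the Bayesian repeated-measures ANOVA the paper actually uses), the Bayes factor for a group effect can be approximated from the BIC difference between a null and an alternative linear model; the data here are simulated with no true effect:

```python
import numpy as np

def bic_linear(y, X):
    """BIC of an ordinary least-squares fit under a Gaussian likelihood."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + X.shape[1] * np.log(n)

rng = np.random.default_rng(1)
# Two simulated decision-style groups with identical score distributions.
y = rng.normal(0.0, 1.0, size=200)
group = np.repeat([0.0, 1.0], 100)

X0 = np.ones((200, 1))                       # null model: no group effect
X1 = np.column_stack([np.ones(200), group])  # alternative: group effect

# BIC-approximate Bayes factor in favour of the null (BF01 > 1 favours the null).
bf01 = np.exp((bic_linear(y, X1) - bic_linear(y, X0)) / 2)
print(round(bf01, 1))
```

A frequentist test on the same data could only fail to reject the null; the Bayes factor instead expresses how strongly the data support it.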
Integrated vehicle-based safety systems : heavy-truck field operational test key findings report.
DOT National Transportation Integrated Search
2010-08-01

This document presents key findings from the heavy-truck field operational test conducted as : part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of : analyses performed by the University of Michigan Transporta...
Integrated vehicle-based safety systems light-vehicle field operational test key findings report.
DOT National Transportation Integrated Search
2011-01-01
This document presents key findings from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michigan Transportati...
Zhang, Lin; Vranckx, Katleen; Janssens, Koen; Sandrin, Todd R.
2015-01-01
MALDI-TOF mass spectrometry has been shown to be a rapid and reliable tool for identification of bacteria at the genus and species, and in some cases, strain levels. Commercially available and open source software tools have been developed to facilitate identification; however, no universal/standardized data analysis pipeline has been described in the literature. Here, we provide a comprehensive and detailed demonstration of bacterial identification procedures using a MALDI-TOF mass spectrometer. Mass spectra were collected from 15 diverse bacteria isolated from Kartchner Caverns, AZ, USA, and identified by 16S rDNA sequencing. Databases were constructed in BioNumerics 7.1. Follow-up analyses of mass spectra were performed, including cluster analyses, peak matching, and statistical analyses. Identification was performed using blind-coded samples randomly selected from these 15 bacteria. Two identification methods are presented: similarity coefficient-based and biomarker-based methods. Results show that both identification methods can identify the bacteria to the species level. PMID:25590854
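The similarity coefficient-based identification mentioned above can be sketched generically: bin each peak list onto a fixed m/z grid, normalise, and score pairs of spectra with a cosine similarity. This is a hypothetical minimal illustration with toy peak lists, not the BioNumerics workflow or its scoring functions:

```python
import numpy as np

def bin_spectrum(mz, intensity, lo=2000, hi=20000, width=5):
    """Bin a peak list onto a fixed m/z grid and L2-normalise it."""
    edges = np.arange(lo, hi + width, width)
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    norm = np.linalg.norm(binned)
    return binned / norm if norm else binned

def cosine_score(spec_a, spec_b):
    """Similarity coefficient between two binned, normalised spectra."""
    return float(spec_a @ spec_b)

# Toy peak lists: species A, a replicate of A, and a different species B.
a1 = bin_spectrum(np.array([3005., 4410., 6250.]), np.array([1.0, 0.8, 0.5]))
a2 = bin_spectrum(np.array([3007., 4412., 6252.]), np.array([0.9, 0.7, 0.6]))
b = bin_spectrum(np.array([3500., 5100., 7800.]), np.array([1.0, 0.9, 0.4]))

# The replicate scores far higher against A than the unrelated species does.
print(cosine_score(a1, a2) > cosine_score(a1, b))
```

An unknown spectrum would be assigned to whichever database entry gives the highest score, subject to a minimum-confidence threshold.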
Effect of corrosion on the buckling capacity of tubular members
NASA Astrophysics Data System (ADS)
Øyasæter, F. H.; Aeran, A.; Siriwardane, S. C.; Mikkelsen, O.
2017-12-01
Offshore installations are subjected to a harsh marine environment and often have damage from corrosion. Several experimental and numerical studies were performed in the past to estimate the buckling capacity of corroded tubular members. However, these studies were either based on limited experimental tests or on numerical analyses of a few cases resulting in semi-empirical relations. Also, there are no guidelines and recommendations in the currently available design standards. To fill this research gap, a new formula is proposed to estimate the residual strength of tubular members considering corrosion and initial geometrical imperfections. The proposed formula is verified with results from finite element analyses performed on several members and for varying corrosion patch parameters. The members are selected to represent the most relevant Eurocode buckling curve for tubular members. It is concluded that corrosion reduces the buckling capacity significantly and that the proposed formula can be easily applied by practicing engineers without performing detailed numerical analyses.
Facilitation of the PED analysis of large molecules by using global coordinates.
Jamróz, Michał H; Ostrowski, Sławomir; Dobrowolski, Jan Cz
2015-10-05
Global coordinates have been found to be useful in the potential energy distribution (PED) analyses of the following large molecules: [13]-acene and [33]-helicene. A global coordinate is defined based on widely separated fragments of the analysed molecule, whereas the coordinates used in the analysis so far were based on stretchings, bendings, or torsions of adjacent atoms. It has been shown that PED analyses performed using the global coordinate and the classical ones can lead to exactly the same PED contributions. Global coordinates may significantly facilitate the analysis of the vibrational spectra of large molecules. Copyright © 2015 Elsevier B.V. All rights reserved.
Vegter, Stefan; Boersma, Cornelis; Rozenbaum, Mark; Wilffert, Bob; Navis, Gerjan; Postma, Maarten J
2008-01-01
The fields of pharmacogenetics and pharmacogenomics have become important practical tools to progress goals in medical and pharmaceutical research and development. As more screening tests are being developed, with some already used in clinical practice, consideration of cost-effectiveness implications is important. A systematic review was performed on the content of and adherence to pharmacoeconomic guidelines of recent pharmacoeconomic analyses performed in the field of pharmacogenetics and pharmacogenomics. Economic analyses of screening strategies for genetic variations, which were evidence-based and assumed to be associated with drug efficacy or safety, were included in the review. The 20 papers included cover a variety of healthcare issues, including screening tests on several cytochrome P450 (CYP) enzyme genes, thiopurine S-methyltransferase (TMPT) and angiotensin-converting enzyme (ACE) insertion deletion (ACE I/D) polymorphisms. Most economic analyses reported that genetic screening was cost effective and often even clearly dominated existing non-screening strategies. However, we found a lack of standardization regarding aspects such as the perspective of the analysis, factors included in the sensitivity analysis and the applied discount rates. In particular, an important limitation of several studies related to the failure to provide a sufficient evidence-based rationale for an association between genotype and phenotype. Future economic analyses should be conducted utilizing correct methods, with adherence to guidelines and including extensive sensitivity analyses. Most importantly, genetic screening strategies should be based on good evidence-based rationales. For these goals, we provide a list of recommendations for good pharmacoeconomic practice deemed useful in the fields of pharmacogenetics and pharmacogenomics, regardless of country and origin of the economic analysis.
Pots, Wendy T M; Trompetter, Hester R; Schreurs, Karlein M G; Bohlmeijer, Ernst T
2016-05-23
Acceptance and Commitment Therapy (ACT) has been demonstrated to be effective in reducing depressive symptoms. However, little is known about how and for whom therapeutic change occurs, specifically in web-based interventions. This study focuses on the mediators, moderators and predictors of change during a web-based ACT intervention. Data from 236 adults from the general population with mild to moderate depressive symptoms, randomized to either web-based ACT (n = 82) or one of two control conditions (web-based Expressive Writing (EW; n = 67) and a waiting list (n = 87)), were analysed. Single and multiple mediation analyses were performed using PROCESS, and exploratory linear regression analyses were performed, to examine mediators, moderators and predictors of pre- to post-treatment and follow-up change in depressive symptoms. The treatment effect of ACT versus the waiting list was mediated by psychological flexibility and two mindfulness facets. The treatment effect of ACT versus EW was not significantly mediated. The moderator analyses demonstrated that the effects of web-based ACT did not vary according to baseline patient characteristics when compared to both control groups. However, higher baseline depressive symptoms and positive mental health and lower baseline anxiety were identified as predictors of outcome across all conditions. Similar results were found at follow-up. The findings of this study corroborate the evidence that psychological flexibility and mindfulness are distinct process mechanisms that mediate the effects of a web-based ACT intervention. The results indicate that there are no restrictions to the allocation of web-based ACT intervention and that web-based ACT can work for different subpopulations. Netherlands Trial Register NTR2736. Registered 6 February 2011.
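The mediation logic above (treatment → psychological flexibility → depressive symptoms) can be sketched with the product-of-coefficients approach and the percentile bootstrap interval that PROCESS popularised. The effect sizes below are made up and the data simulated; this is an illustration of the method, not a reproduction of the study:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300

# Simulated trial (hypothetical effect sizes): treatment raises psychological
# flexibility (a-path), which in turn lowers depressive symptoms (b-path).
treatment = rng.integers(0, 2, size=n).astype(float)
flexibility = 0.6 * treatment + rng.normal(size=n)
depression = -0.5 * flexibility + rng.normal(size=n)

def ols_slope(y, x_main, covariate=None):
    """Slope of x_main from an OLS fit, optionally adjusting for a covariate."""
    cols = [np.ones(len(y)), x_main]
    if covariate is not None:
        cols.append(covariate)
    beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return beta[1]

a = ols_slope(flexibility, treatment)              # treatment -> mediator
b = ols_slope(depression, flexibility, treatment)  # mediator -> outcome
indirect = a * b                                   # product-of-coefficients

# Percentile bootstrap CI for the indirect effect.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boot.append(ols_slope(flexibility[idx], treatment[idx]) *
                ols_slope(depression[idx], flexibility[idx], treatment[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(round(indirect, 2), round(ci_low, 2), round(ci_high, 2))
```

A bootstrap interval excluding zero is the usual criterion for concluding that the mediated pathway is significant.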
Performance deterioration of commercial high-bypass ratio turbofan engines
NASA Technical Reports Server (NTRS)
Mehalic, C. M.; Ziemianski, J. A.
1980-01-01
The results of engine performance deterioration investigations based on historical data, special engine tests, and specific tests to define the influence of flight loads and component clearances on performance are presented. The results of analyses of several damage mechanisms that contribute to performance deterioration such as blade tip rubs, airfoil surface roughness and erosion, and thermal distortion are also included. The significance of these damage mechanisms on component and overall engine performance is discussed.
Aggression in Primary Schools: The Predictive Power of the School and Home Environment
ERIC Educational Resources Information Center
Kozina, Ana
2015-01-01
In this study, we analyse the predictive power of home and school environment-related factors for determining pupils' aggression. The multiple regression analyses are performed for fourth- and eighth-grade pupils based on the Trends in Mathematics and Science Study (TIMSS) 2007 (N = 8394) and TIMSS 2011 (N = 9415) databases for Slovenia. At the…
Lu, Sara A; Wickens, Christopher D; Prinet, Julie C; Hutchins, Shaun D; Sarter, Nadine; Sebok, Angelia
2013-08-01
The aim of this study was to integrate empirical data showing the effects of interrupting task modality on the performance of an ongoing visual-manual task and the interrupting task itself. The goal is to support interruption management and the design of multimodal interfaces. Multimodal interfaces have been proposed as a promising means to support interruption management. To ensure the effectiveness of this approach, their design needs to be based on an analysis of empirical data concerning the effectiveness of individual and redundant channels of information presentation. Three meta-analyses were conducted to contrast performance on an ongoing visual task and interrupting tasks as a function of interrupting task modality (auditory vs. tactile, auditory vs. visual, and single modality vs. redundant auditory-visual). In total, 68 studies were included and six moderator variables were considered. The main findings from the meta-analyses are that response times are faster for tactile interrupting tasks in case of low-urgency messages. Accuracy is higher with tactile interrupting tasks for low-complexity signals but higher with auditory interrupting tasks for high-complexity signals. Redundant auditory-visual combinations are preferable for communication tasks during high workload and with a small visual angle of separation. The three meta-analyses contribute to the knowledge base in multimodal information processing and design. They highlight the importance of moderator variables in predicting the effects of interruption task modality on ongoing and interrupting task performance. The findings from this research will help inform the design of multimodal interfaces in data-rich, event-driven domains.
related: an R package for analysing pairwise relatedness from codominant molecular markers.
Pew, Jack; Muir, Paul H; Wang, Jinliang; Frasier, Timothy R
2015-05-01
Analyses of pairwise relatedness are a key component of many topics in biology. However, such analyses have been limited for two reasons. First, most available programs estimate relatedness based on only a single estimator, making comparison across estimators difficult. Second, all programs to date have been platform specific, working only on a specific operating system. This has the undesirable outcome of making the choice of relatedness estimator limited by operating system preference, rather than being based on scientific rationale. Here, we present a new R package, called related, that can calculate relatedness based on seven estimators, can account for genotyping errors, missing data and inbreeding, and can estimate 95% confidence intervals. Moreover, simulation functions are provided that allow for easy comparison of the performance of different estimators and for analyses of how much resolution to expect from a given data set. Because this package works in R, it is platform independent. Combined, this functionality should allow for more appropriate analyses and interpretation of pairwise relatedness and will also allow for the integration of relatedness data into larger R workflows. © 2014 John Wiley & Sons Ltd.
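To make the idea of a pairwise relatedness estimate concrete, here is a deliberately simple allele-sharing similarity (a Blouin-style Mxy score) on hypothetical codominant genotypes. This toy metric is not one of the seven estimators implemented in the related package; it only illustrates what a pairwise score over multilocus genotypes looks like:

```python
from collections import Counter

import numpy as np

def pairwise_similarity(ind_a, ind_b):
    """Mean proportion of alleles shared per locus (0, 0.5, or 1 per locus).

    Each individual is a list of (allele, allele) tuples, one per locus.
    Toy illustration only; real estimators also use population allele
    frequencies and correct for sampling error.
    """
    per_locus = []
    for g1, g2 in zip(ind_a, ind_b):
        shared = sum((Counter(g1) & Counter(g2)).values())  # multiset overlap
        per_locus.append(shared / 2)
    return float(np.mean(per_locus))

# Hypothetical individuals genotyped at three codominant loci.
parent = [("A", "B"), ("C", "C"), ("E", "F")]
offspring = [("A", "C"), ("C", "D"), ("E", "E")]
stranger = [("G", "G"), ("H", "H"), ("I", "J")]

print(pairwise_similarity(parent, offspring),
      pairwise_similarity(parent, stranger))
```

A parent-offspring pair shares at least one allele at every locus, so its score sits well above that of an unrelated pair, which is the intuition all relatedness estimators build on.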
Characterizing Discourse Deficits Following Penetrating Head Injury: A Preliminary Model
ERIC Educational Resources Information Center
Coelho, Carl; Le, Karen; Mozeiko, Jennifer; Hamilton, Mark; Tyler, Elizabeth; Krueger, Frank; Grafman, Jordan
2013-01-01
Purpose: Discourse analyses have demonstrated utility for delineating subtle communication deficits following closed head injuries (CHIs). The present investigation examined the discourse performance of a large group of individuals with penetrating head injury (PHI). Performance was also compared across 6 subgroups of PHI based on lesion locale. A…
Chinese Responses to Shanghai's Performance in PISA
ERIC Educational Resources Information Center
Tan, Charlene
2017-01-01
This article analyses the public responses in China to Shanghai's performance in the 2012 Programme for International Student Assessment (PISA). Based on data obtained from media accounts and other materials published between 2013 and 2016, the research findings show that the responses in China are generally reflective, measured and self-critical.…
Profiling Learners' Achievement Goals when Completing Academic Essays
ERIC Educational Resources Information Center
Ng, Chi-Hung Clarence
2009-01-01
This study explored adult learners' goal profiles in relation to the completion of a compulsory academic essay. Based on learners' scores on items assessing mastery, performance-approach, and work-avoidance goals, cluster analyses produced three distinct categories of learners: performance-focused, work-avoidant, and multiple-goal learners. These…
Analytic and Heuristic Processing Influences on Adolescent Reasoning and Decision-Making.
ERIC Educational Resources Information Center
Klaczynski, Paul A.
2001-01-01
Examined the relationship between age and the normative/descriptive gap--the discrepancy between actual reasoning and traditional standards for reasoning. Found that middle adolescents performed closer to normative ideals than early adolescents. Factor analyses suggested that performance was based on two processing systems, analytic and heuristic…
DOT National Transportation Integrated Search
2010-12-01
"This document presents the methodology and results from the heavy-truck field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michiga...
DOT National Transportation Integrated Search
2010-12-01
"This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...
POLYNOMIAL-BASED DISAGGREGATION OF HOURLY RAINFALL FOR CONTINUOUS HYDROLOGIC SIMULATION
Hydrologic modeling of urban watersheds for designs and analyses of stormwater conveyance facilities can be performed in either an event-based or continuous fashion. Continuous simulation requires, among other things, the use of a time series of rainfall amounts. However, for urb...
Takeuchi, Hikaru; Taki, Yasuyuki; Sassa, Yuko; Hashizume, Hiroshi; Sekiguchi, Atsushi; Fukushima, Ai; Kawashima, Ryuta
2011-10-01
Working memory is the limited capacity storage system involved in the maintenance and manipulation of information over short periods of time. Previous imaging studies have suggested that the frontoparietal regions are activated during working memory tasks; a putative association between the structure of the frontoparietal regions and working memory performance has been suggested based on the analysis of individuals with varying pathologies. This study aimed to identify correlations between white matter and individual differences in verbal working memory performance in normal young subjects. We performed voxel-based morphometry (VBM) analyses using T1-weighted structural images as well as voxel-based analyses of fractional anisotropy (FA) using diffusion tensor imaging. Using the letter span task, we measured verbal working memory performance in normal young adult men and women (mean age, 21.7 years, SD=1.44; 42 men and 13 women). We observed positive correlations between working memory performance and regional white matter volume (rWMV) in the frontoparietal regions. In addition, FA was found to be positively correlated with verbal working memory performance in a white matter region adjacent to the right precuneus. These regions are consistently recruited by working memory. Our findings suggest that, among normal young subjects, verbal working memory performance is associated with various regions that are recruited during working memory tasks, and this association is not limited to specific parts of the working memory network. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel
2018-07-01
This work presents a new development based on the condensation scheme proposed by Chamorro and Pérez, in which new terms to correct the frozen molecular orbital approximation have been introduced (improved frontier molecular orbital approximation). The changes made to the original development allow orbital relaxation effects to be taken into account, providing results equivalent to those achieved by the finite difference approximation and leading to a methodology with significant advantages. Local reactivity indices based on this new development have been obtained for a sample set of molecules and compared with indices based on the frontier molecular orbital and finite difference approximations. A new definition of the dual descriptor index based on the improved frontier molecular orbital methodology is also shown. In addition, taking advantage of the characteristics of the definitions obtained with the new condensation scheme, the local philicity descriptor is analysed by separating the components corresponding to the frontier molecular orbital approximation and orbital relaxation effects, and the local multiphilic descriptor is analysed in the same way. Finally, the effect of the basis set is studied, and calculations using DFT, CI and Møller-Plesset methodologies are performed to analyse the consequences of different levels of electron correlation.
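For reference, the finite difference approximation that the improved scheme is benchmarked against condenses the Fukui functions from atomic charges of the N, N+1 and N-1 electron systems: f+ = q(N) − q(N+1), f− = q(N−1) − q(N), and the dual descriptor Δf = f+ − f−. The charges below are invented for a hypothetical 3-atom fragment purely to show the arithmetic; real values come from quantum-chemical calculations:

```python
import numpy as np

# Made-up atomic charges (in e) for a hypothetical 3-atom fragment.
q_neutral = np.array([-0.30, 0.10, 0.20])   # N electrons (sums to 0)
q_anion = np.array([-0.50, -0.05, -0.45])   # N+1 electrons (sums to -1)
q_cation = np.array([0.10, 0.55, 0.35])     # N-1 electrons (sums to +1)

# Condensed Fukui functions by finite differences of atomic charges.
f_plus = q_neutral - q_anion    # susceptibility to nucleophilic attack
f_minus = q_cation - q_neutral  # susceptibility to electrophilic attack
dual = f_plus - f_minus         # dual descriptor: >0 electrophilic site

print(f_plus.round(2), f_minus.round(2), dual.round(2))
```

Each condensed Fukui function sums to one over the atoms, so the dual descriptor sums to zero; its sign pattern is what identifies electrophilic versus nucleophilic sites.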
ERIC Educational Resources Information Center
Abrams, Zsuzsanna; Rott, Susanne
2017-01-01
Research on second language (L2) grammar in task-based language learning has yielded inconsistent results regarding the effects of task-complexity, prompting calls for more nuanced analyses of L2 development and task performance. The present cross-sectional study contributes to this discussion by comparing the performance of 245 learners of German…
Damage Tolerance Analysis of a Pressurized Liquid Oxygen Tank
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Harvin, Stephen F.; Gregory, Peyton B.; Mason, Brian H.; Thompson, Joe E.; Hoffman, Eric K.
2006-01-01
A damage tolerance assessment was conducted of an 8,000 gallon pressurized Liquid Oxygen (LOX) tank. The LOX tank is constructed of a stainless steel pressure vessel enclosed by a thermal-insulating vacuum jacket. The vessel is pressurized to 2,250 psi with gaseous nitrogen, resulting in both thermal and pressure stresses on the tank wall. Finite element analyses were performed on the tank to characterize the stresses from operation. Engineering material data were obtained from both the tank's construction records and the technical literature. An initial damage state was assumed based on records of a nondestructive inspection performed on the tank. The damage tolerance analyses were conducted using the NASGRO computer code. This paper contains the assumptions, and justifications, made for the input parameters to the damage tolerance analyses and the results of the damage tolerance analyses with a discussion on the operational safety of the LOX tank.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
Reliability and performance evaluation of systems containing embedded rule-based expert systems
NASA Technical Reports Server (NTRS)
Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.
1989-01-01
A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
Deciphering chicken gut microbial dynamics based on high-throughput 16S rRNA metagenomics analyses.
Mohd Shaufi, Mohd Asrore; Sieo, Chin Chin; Chong, Chun Wie; Gan, Han Ming; Ho, Yin Wan
2015-01-01
Chicken gut microbiota has paramount roles in host performance, health and immunity. Understanding the topological differences in gut microbial community composition is crucial to provide knowledge on the functions of each member of the microbiota in the physiological maintenance of the host. Chicken gut microbiota profiling was previously performed using culture-dependent and early culture-independent methods, which had limited coverage and accuracy. Advances in technology based on next-generation sequencing (NGS) offer unparalleled coverage and depth in determining microbial gut dynamics. Thus, the aim of this study was to investigate ileal and caecal microbiota development as chickens age, which is important for future effective gut modulation. Ileal and caecal contents were extracted from 7-, 14-, 21- and 42-day-old broiler chickens. Genomic DNA was then extracted and amplified based on the V3 hyper-variable region of 16S rRNA. Bioinformatics, ecological and statistical analyses such as Principal Coordinate Analysis (PCoA) were performed in the mothur software and plotted using PRIMER 6. Additional analyses of predicted metagenomes were performed with the PICRUSt and STAMP software packages based on Greengenes databases. A distinctive difference in bacterial communities was observed between ilea and caeca as the chickens aged (P < 0.001). The microbial communities in the caeca were more diverse than those in the ilea. Potentially pathogenic bacteria such as Clostridium increased as the chickens aged, while populations of beneficial microbes such as Lactobacillus remained low at all intervals. On the other hand, based on the predicted metagenomes analysed, a clear distinction in the functions and roles of the gut microbiota, such as gene pathways related to nutrient absorption (e.g. sugar and amino acid metabolism) and to bacterial proliferation and colonization (e.g. bacterial motility proteins, two-component system and bacterial secretion system), was observed between ilea and caeca (P < 0.05). The main functional differences between the two sites were related to nutrient absorption and bacterial colonization. Based on the composition of the microbial community, future gut modulation with beneficial bacteria such as probiotics may benefit the host.
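The diversity comparison reported above (caeca more diverse than ilea) is conventionally quantified with an index such as Shannon's H'. A minimal sketch with hypothetical OTU counts — this is illustrative only, not the mothur-based pipeline the authors used:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical OTU counts: an even community is more diverse than a skewed one.
caecum = [25, 25, 25, 25]   # even community -> H' = ln(4)
ileum = [85, 5, 5, 5]       # dominated by one taxon -> lower H'
print(shannon_diversity(caecum) > shannon_diversity(ileum))  # True
```

The same counts-to-proportions logic underlies the richer diversity measures (e.g. inverse Simpson) that 16S pipelines report.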
Faggion, Clovis Mariano; Listl, Stefan; Alarcón, Marco Antonio
2015-05-01
The objective of this study was to assess how authors of systematic reviews (SRs) with meta-analyses published in periodontology and implant dentistry evaluate risk of bias (ROB) in the primary studies included in these reviews. A literature search for SRs with meta-analyses was performed in the PubMed and Cochrane Library databases up to July 20, 2014. The reference lists of included articles were screened for further reviews. The standards of evaluating ROB in primary studies were assessed using a 14-item checklist based on the Cochrane approach for evaluating ROB. Standards of ROB evaluation in Cochrane and paper-based SRs were compared using Fisher's exact test. All searches, data extraction and evaluations were performed independently and in duplicate. Seventy SRs were included (45 paper-based and 25 Cochrane SRs). The median percentage of items addressed was 58% (interquartile range 4-100%). Cochrane SRs more frequently included ROB assessments than paper-based reviews in terms of examiner blinding (p = 0.0026), selective outcome reporting (p = 0.0207) and other bias (p = 0.0241). The ROB evaluation in primary studies currently included in SRs with meta-analyses in periodontology and implant dentistry is not sufficiently comprehensive. Cochrane SRs have more comprehensive ROB evaluations than paper-based reviews. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
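The Fisher's exact comparisons above operate on 2x2 tables of review counts. A self-contained sketch of the two-sided test (the counts below are hypothetical, not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed table."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p_table(x):  # P(top-left cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: blinding addressed in 20/25 Cochrane SRs vs 15/45 paper-based SRs.
print(fisher_exact_two_sided(20, 5, 15, 30) < 0.01)  # True
```

The "no more likely than observed" summation matches the convention used by common statistics packages for two-sided exact tests on 2x2 tables.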
Hughes, Gareth; Westmacott, Kelly; Honeychurch, Kevin C.; Crew, Adrian; Pemberton, Roy M.; Hart, John P.
2016-01-01
This review describes recent advances in the fabrication of electrochemical (bio)sensors based on screen-printing technology involving carbon materials and their application in biomedical, agri-food and environmental analyses. It will focus on the various strategies employed in the fabrication of screen-printed (bio)sensors, together with their performance characteristics; the application of these devices for the measurement of selected naturally occurring biomolecules, environmental pollutants and toxins will be discussed. PMID:27690118
FY 1987 current fiscal year work plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This Current Year Work Plan presents a detailed description of the activities to be performed by the Joint Integration Office during FY87. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance, task monitoring, information gathering and task reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of program status reports for DOE. Program Analysis is performed by the JIO to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. This work plan includes: system analyses, requirements analyses, interim and procedure development, legislative and regulatory analyses, dispatch and traffic analyses, and data bases.
[Clinical research XXIII. From clinical judgment to meta-analyses].
Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O
2014-01-01
Systematic reviews (SRs) are studies designed to answer clinical questions based on original articles. Meta-analysis (MTA) is the mathematical analysis of SRs. These analyses are divided into two groups: those which evaluate the measured results of quantitative variables (for example, the body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or healing or not). Quantitative variables generally use the mean difference analysis, while qualitative variables can be evaluated using several calculations: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study, as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared tests. To take appropriate decisions based on the MTA, it is important to understand the characteristics of statistical methods in order to avoid misinterpretations.
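The effect measures for qualitative outcomes listed above (OR, RR, ARR) come directly from a 2x2 table of events and non-events. A sketch with hypothetical trial counts:

```python
def effect_measures(a, b, c, d):
    """Effect measures from a 2x2 table:
    treated group: a events, b non-events; control group: c events, d non-events."""
    risk_t, risk_c = a / (a + b), c / (c + d)
    return {
        "OR": (a * d) / (b * c),   # odds ratio
        "RR": risk_t / risk_c,     # relative risk
        "ARR": risk_c - risk_t,    # absolute risk reduction
    }

# Hypothetical trial: 10/100 events with treatment vs 20/100 with control.
m = effect_measures(10, 90, 20, 80)
print(m["RR"])   # 0.5
print(m["ARR"])  # 0.1
```

Note that the OR (0.44 here) differs from the RR; the two converge only when events are rare.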
Missing value imputation for microarray data: a comprehensive comparison study and a web tool
2013-01-01
Background Microarray data are usually peppered with missing values due to various reasons. However, most of the downstream analyses for microarray data require complete datasets. Therefore, accurate algorithms for missing value estimation are needed for improving the performance of microarray data analyses. Although many algorithms have been developed, there are many debates on the selection of the optimal algorithm. Studies comparing the performance of different algorithms are still far from comprehensive, especially in the number of benchmark datasets used, the number of algorithms compared, the rounds of simulation conducted, and the performance measures used. Results In this paper, we performed a comprehensive comparison by using (I) thirteen datasets, (II) nine algorithms, (III) 110 independent runs of simulation, and (IV) three types of measures to evaluate the performance of each imputation algorithm fairly. First, the effects of different types of microarray datasets on the performance of each imputation algorithm were evaluated. Second, we discussed whether datasets from different species have different impacts on the performance of different algorithms. To assess the performance of each algorithm fairly, all evaluations were performed using three types of measures. Our results indicate that the performance of an imputation algorithm depends mainly on the type of dataset and not on the species from which the samples come. In addition to the statistical measure, two other measures with biological meanings are useful for reflecting the impact of missing value imputation on the downstream data analyses. Our study suggests that local-least-squares-based methods are good choices for handling missing values in most microarray datasets. Conclusions In this work, we carried out a comprehensive comparison of the algorithms for microarray missing value imputation.
Based on such a comprehensive comparison, researchers could choose the optimal algorithm for their datasets easily. Moreover, new imputation algorithms could be compared with the existing algorithms using this comparison strategy as a standard protocol. In addition, to assist researchers in dealing with missing values easily, we built a web-based and easy-to-use imputation tool, MissVIA (http://cosbi.ee.ncku.edu.tw/MissVIA), which supports many imputation algorithms. Once users upload a real microarray dataset and choose the imputation algorithms, MissVIA will determine the optimal algorithm for the users' data through a series of simulations, and then the imputed results can be downloaded for the downstream data analyses. PMID:24565220
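The simulation strategy described above — mask known entries, impute them, then score the result — can be sketched with a deliberately simple column-mean imputer and a normalized RMSE. This is a toy baseline, not the local-least-squares methods the study recommends:

```python
import math

def mean_impute(matrix):
    """Replace None entries with the mean of the observed values in that column."""
    cols = list(zip(*matrix))
    means = [sum(v for v in col if v is not None)
             / sum(v is not None for v in col) for col in cols]
    return [[means[j] if v is None else v for j, v in enumerate(row)]
            for row in matrix]

def nrmse(imputed, truth, mask):
    """Normalized RMSE over the artificially masked entries (lower is better)."""
    mse = sum((imputed[i][j] - truth[i][j]) ** 2 for i, j in mask) / len(mask)
    mean_true = sum(truth[i][j] for i, j in mask) / len(mask)
    var = sum((truth[i][j] - mean_true) ** 2 for i, j in mask) / len(mask)
    return math.sqrt(mse / var)

# Mask two known entries, impute, then score the imputation against the truth.
truth = [[1.0, 2.0], [2.0, 3.0], [3.0, 4.0], [4.0, 5.0]]
mask = [(0, 0), (3, 1)]
observed = [row[:] for row in truth]
for i, j in mask:
    observed[i][j] = None
print(nrmse(mean_impute(observed), truth, mask))  # 1.0
```

An NRMSE of 1.0 means the imputer does no better than guessing the mean of the masked values; a good algorithm drives this score well below 1.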
Multicomponent analysis of a digital Trail Making Test.
Fellows, Robert P; Dahmen, Jessamyn; Cook, Diane; Schmitter-Edgecombe, Maureen
2017-01-01
The purpose of the current study was to use a newly developed digital tablet-based variant of the TMT to isolate component cognitive processes underlying TMT performance. Similar to the paper-based Trail Making Test, this digital variant consists of two conditions, Part A and Part B. However, this digital version automatically collects additional data to create component subtest scores to isolate cognitive abilities. Specifically, in addition to the total time to completion and number of errors, the digital Trail Making Test (dTMT) records several unique components including the number of pauses, pause duration, lifts, lift duration, time inside each circle, and time between circles. Participants were community-dwelling older adults who completed a neuropsychological evaluation including measures of processing speed, inhibitory control, visual working memory/sequencing, and set-switching. The abilities underlying TMT performance were assessed through regression analyses of component scores from the dTMT with traditional neuropsychological measures. Results revealed significant correlations between paper and digital variants of Part A (rs = .541, p < .001) and paper and digital versions of Part B (rs = .799, p < .001). Regression analyses with traditional neuropsychological measures revealed that Part A components were best predicted by speeded processing, while inhibitory control and visual/spatial sequencing were predictors of specific components of Part B. Exploratory analyses revealed that specific dTMT-B components were associated with a performance-based medication management task. Taken together, these results elucidate specific cognitive abilities underlying TMT performance, as well as the utility of isolating digital components.
Giovenzana, Valentina; Civelli, Raffaele; Beghi, Roberto; Oberti, Roberto; Guidetti, Riccardo
2015-11-01
The aim of this work was to test a simplified optical prototype for rapid estimation of the ripening parameters of white grapes for Franciacorta wine directly in the field. Spectral acquisition based on reflectance at four wavelengths (630, 690, 750 and 850 nm) was proposed. The integration of a simple processing algorithm in the microcontroller software would allow real-time visualization of spectral reflectance values. Non-destructive analyses were carried out on 95 grape bunches for a total of 475 berries. Samplings were performed weekly during the last ripening stages. Optical measurements were carried out both with the simplified system and with a portable commercial vis/NIR spectrophotometer, used as the reference instrument for performance comparison. Chemometric analyses were performed in order to extract the maximum useful information from the optical data. Principal component analysis (PCA) was performed for a preliminary evaluation of the data. Correlations between the optical data matrix and ripening parameters (total soluble solids content, SSC; titratable acidity, TA) were computed using partial least squares (PLS) regression for the spectra and multiple linear regression (MLR) for data from the simplified device. Classification analyses were also performed with the aim of discriminating ripe and unripe samples. PCA, MLR and classification analyses show the effectiveness of the simplified system in separating samples among different sampling dates and in discriminating ripe from unripe samples. Finally, simple equations for SSC and TA prediction were calculated. Copyright © 2015 Elsevier B.V. All rights reserved.
Business Law: Task Analyses. Competency-Based Education.
ERIC Educational Resources Information Center
Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.
This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education for a course in business law. Section 1 contains a validated task inventory for business law. For each task, applicable information pertaining to performance and enabling objectives, criterion-referenced…
Earthquake Ground Motion Selection
DOT National Transportation Integrated Search
2012-05-01
Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...
NASA Technical Reports Server (NTRS)
Jeng, Frank F.
2007-01-01
Development of analysis guidelines for Exploration Life Support (ELS) technology tests was completed. The guidelines were developed based on analysis experience gained from supporting Environmental Control and Life Support System (ECLSS) technology development in air revitalization systems and water recovery systems. Analyses are vital during all three phases of an ELS technology test: pre-test, during-test and post-test. Pre-test analyses of a test system help define hardware components and predict system and component performance, required test duration, sampling frequencies of operation parameters, etc. Analyses conducted during tests can verify the consistency of all the measurements and the performance of the test system. Post-test analyses are an essential part of the test task; their results are an important factor in judging whether the technology development is successful. In addition, development of a rigorous model for a test system is an important objective of any new technology development, and test data analyses, especially post-test data analyses, serve to verify the model. Test analyses have supported development of many ECLSS technologies; some test analysis tasks in ECLSS technology development are listed in the Appendix. To have effective analysis support for ECLSS technology tests, analysis guidelines are a useful tool. These test guidelines were developed based on experience gained through previous analysis support of various ECLSS technology tests. A comment on analysis from an experienced NASA ECLSS manager (1) follows: "Bad analysis was one that bent the test to prove that the analysis was right to begin with. Good analysis was one that directed where the testing should go and also bridged the gap between the reality of the test facility and what was expected on orbit."
[Using on-farm records to evaluate the reproductive performance in dairy herds].
Iwersen, M; Klein, D; Drillich, M
2012-01-01
The scheduled abolition of the European milk quota system on April 1, 2015 is expected to have tremendous effects on the business environment of most dairy farms. Meanwhile, farmers should use weak-point analyses to identify "bottlenecks" within their production and herd management systems. As experts in herd health and herd performance, veterinarians should give advice to their clients based on sound analyses of production data. Therefore, accurate and reliable on-farm records are needed. This paper focuses on data management, especially data collection, and addresses concepts for the evaluation of reproduction records.
Li, Li; Nguyen, Kim-Huong; Comans, Tracy; Scuffham, Paul
2018-04-01
Several utility-based instruments have been applied in cost-utility analysis to assess health state values for people with dementia. Nevertheless, concerns and uncertainty regarding their performance for people with dementia have been raised. The aims of this study were to assess the performance of available utility-based instruments for people with dementia by comparing their psychometric properties, and to explore factors that cause variations in the reported health state values generated from those instruments by conducting meta-regression analyses. A literature search was conducted and psychometric properties were synthesized to demonstrate the overall performance of each instrument. When available, health state values and variables such as the type of instrument and cognitive impairment level were extracted from each article. A meta-regression analysis was undertaken with available covariates included in the models. A total of 64 studies providing preference-based values were identified and included. The EuroQol five-dimension questionnaire demonstrated the best combination of feasibility, reliability, and validity. Meta-regression analyses suggested that significant differences exist between instruments, type of respondents, and mode of administration, and that the variations in estimated utility values influence incremental quality-adjusted life-year calculations. This review finds that the EuroQol five-dimension questionnaire is the most valid utility-based instrument for people with dementia, but that it should be replaced by others under certain circumstances. Although no utility estimates were reported in the article, the meta-regression analyses show that the variations in utility estimates produced by different instruments affect cost-utility analysis, potentially altering the decision-making process in some circumstances. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Oneil, W. J.; Rudd, R. P.; Farless, D. L.; Hildebrand, C. E.; Mitchell, R. T.; Rourke, K. H.; Euler, E. A.
1979-01-01
A comprehensive description of the navigation of the Viking spacecraft throughout their flight from Earth launch to Mars landing is given. The flight path design, actual inflight control, and postflight reconstruction are discussed in detail. The preflight analyses upon which the operational strategies and performance predictions were based are discussed. The inflight results are then discussed and compared with the preflight predictions and, finally, the results of any postflight analyses are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL
2016-09-15
iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences, reservoir engineering, and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files using the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.
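The objective function described above — weighted differences between model output and observations — has a standard weighted least-squares form. A minimal sketch with hypothetical calibration data (function and variable names are illustrative, not iTOUGH2's API):

```python
def weighted_objective(simulated, observed, sigmas):
    """Weighted least-squares objective: sum of squared residuals, each
    normalized by the standard deviation of its observation."""
    return sum(((s - o) / sig) ** 2
               for s, o, sig in zip(simulated, observed, sigmas))

# Hypothetical calibration: pressures from a forward run vs. field measurements.
sim = [1.02, 1.98, 3.05]
obs = [1.00, 2.00, 3.00]
sig = [0.1, 0.1, 0.1]
print(weighted_objective(sim, obs, sig) > 0.0)  # True
```

An inverse solver then adjusts the model parameters to drive this objective toward its minimum; the weights keep noisy observations from dominating the fit.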
ERIC Educational Resources Information Center
Macmann, Gregg M.; Barnett, David W.
1994-01-01
Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random-effect-size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and the reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by random-effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of the censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
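The cluster-wise random-effects pooling described above builds on standard meta-analysis machinery. A DerSimonian-Laird sketch with hypothetical study-level effects — this illustrates the pooling step only, not ClusterZ's censoring correction or FCDR control:

```python
def dersimonian_laird(effects, variances):
    """Pool study-level effects with the DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)       # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

# Hypothetical standardized effects from three studies with equal variances.
pooled, tau2 = dersimonian_laird([0.3, 0.5, 0.7], [0.04, 0.04, 0.04])
print(round(pooled, 3))  # 0.5
```

When the observed spread across studies is no larger than expected from sampling error, tau² is truncated to zero and the pooled estimate reduces to the fixed-effect result.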
Some Psychometric and Design Implications of Game-Based Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; Clarke-Midura, Jody
2013-01-01
The rise of digital game and simulation-based learning applications has led to new approaches in educational measurement that take account of patterns in time, high resolution paths of action, and clusters of virtual performance artifacts. The new approaches, which depart from traditional statistical analyses, include data mining, machine…
Impaired Verb Fluency: A Sign of Mild Cognitive Impairment
ERIC Educational Resources Information Center
Ostberg, Per; Fernaeus, Sven-Erik; Hellstrom, Ake; Bogdanovic, Nenad; Wahlund, Lars Olof
2005-01-01
We assessed verb fluency vs. noun and letter-based fluency in 199 subjects referred for cognitive complaints including Subjective Cognitive Impairment, Mild Cognitive Impairment, and Alzheimer's disease. ANCOVAs and factor analyses identified verb, noun, and letter-based fluency as distinct tasks. Verb fluency performance in Mild Cognitive…
Buying and Pricing: Instructor Material. Curriculum: Distributive Education.
ERIC Educational Resources Information Center
Missouri Univ., Columbia. Instructional Materials Lab.
This secondary distributive education performance-based instructional unit on buying and pricing contains thirteen lesson plans, each based on a fifty-five minute period. Among the topics covered are the following: (1) the importance of analysing the customers' demands for merchandise before planning what and when to buy, (2) questions about…
Horticulture III, IV, and V. Task Analyses. Competency-Based Education.
ERIC Educational Resources Information Center
Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.
This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in the horticulture program. Section 1 contains a validated task inventory for horticulture III, IV, and V. For each task, applicable information pertaining to performance and enabling objectives,…
NASA Technical Reports Server (NTRS)
Stephens, Chad; Kennedy, Kellie; Napoli, Nicholas; Demas, Matthew; Barnes, Laura; Crook, Brenda; Williams, Ralph; Last, Mary Carolyn; Schutte, Paul
2017-01-01
Human-autonomous systems have the potential to mitigate pilot cognitive impairment and improve aviation safety. A research team at NASA Langley conducted an experiment to study the impact of mild normobaric hypoxia induction on aircraft pilot performance and psychophysiological state. A within-subjects design involved non-hypoxic and hypoxic exposures while performing three 10-minute tasks. The 15,000-foot simulated altitude did not induce a significant performance decrement but did increase perceived workload. Analyses of psychophysiological responses evince the potential of biomarkers for hypoxia onset: analyses involving coupling across physiological systems and wavelet transforms of cortical activity revealed patterns that can discern between the simulated altitude conditions. Specifically, multivariate entropy of ECG/respiration components was a significant predictor (p < 0.02) of hypoxia. Furthermore, in EEG there was a significant decrease in mid-level beta (15.19-18.37 Hz) during the hypoxic condition at thirteen of sixteen sites across the scalp. The ability to identify shifts in underlying cortical and physiological systems could serve as a means to detect the onset of a deteriorated cognitive state. Enabling such assessment in future flight decks could permit increasingly autonomous systems-supported operations, and augmenting the human operator through assessment of cognitive impairment has the potential to further improve operator performance and mitigate human error in safety-critical contexts. This study represents ongoing work at NASA intended to add to the current knowledge of psychophysiologically based input to automation to increase aviation safety.
SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO CA
2010-03-09
This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 {micro}m in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
... Donald Howard, (410) 786-6764, Hospital Value-Based Purchasing (VBP) Program Issues. SUPPLEMENTARY... analyses performed by Brandeis University and Mathematica Policy Research together despite their slightly...
da Câmara, Saionara M A; Zunzunegui, Maria Victoria; Pirkle, Catherine; Moreira, Mayle A; Maciel, Álvaro C C
2015-01-01
To examine associations between menopausal status and physical performance in middle-aged women from the Northeast region of Brazil. Cross-sectional study of women between 40 and 65 years of age living in Parnamirim. Women were recruited by advertisements in primary care neighborhood centers across the city. Physical performance was assessed by grip strength, gait speed and chair stands. Menopausal status was determined using the Stages of Reproductive Aging Workshop classification, and women were classified as premenopausal, perimenopausal or postmenopausal. Multiple linear regression analyses were performed to model the effect of menopausal status on each physical performance measure, adjusting for covariates (age, family income, education, body mass index, parity and age at first birth). The premenopausal women were significantly stronger and performed better in chair stands than perimenopausal and postmenopausal women. Gait speed did not vary significantly by menopausal status. In multivariate analyses, menopausal status remained statistically significant only for grip strength. In fully adjusted analyses, premenopausal women had a mean grip strength 2.226 kgf (95% CI: 0.361-4.091) higher than the postmenopausal group. This study provides further evidence for the associations between menopause and physical performance in middle-aged women: grip strength is weaker in peri- and postmenopausal women than in premenopausal women, even after adjusting for age and other covariates.
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Trajectory-based heating analysis for the European Space Agency/Rosetta Earth Return Vehicle
NASA Technical Reports Server (NTRS)
Henline, William D.; Tauber, Michael E.
1994-01-01
A coupled, trajectory-based flowfield and material thermal-response analysis is presented for the European Space Agency proposed Rosetta comet nucleus sample return vehicle. The probe returns to earth along a hyperbolic trajectory with an entry velocity of 16.5 km/s and requires an ablative heat shield on the forebody. Combined radiative and convective ablating flowfield analyses were performed for the significant heating portion of the shallow ballistic entry trajectory. Both quasisteady ablation and fully transient analyses were performed for a heat shield composed of carbon-phenolic ablative material. Quasisteady analysis was performed using the two-dimensional axisymmetric codes RASLE and BLIMPK. Transient computational results were obtained from the one-dimensional ablation/conduction code CMA. Results are presented for heating, temperature, and ablation rate distributions over the probe forebody for various trajectory points. Comparison of transient and quasisteady results indicates that, for the heating pulse encountered by this probe, the quasisteady approach is conservative from the standpoint of predicted surface recession.
Design considerations for a Space Shuttle Main Engine turbine blade made of single crystal material
NASA Technical Reports Server (NTRS)
Abdul-Aziz, A.; August, R.; Nagpal, V.
1993-01-01
Nonlinear finite-element structural analyses were performed on the first-stage high-pressure fuel turbopump blade of the Space Shuttle Main Engine. The analyses examined the structural response and the dynamic characteristics at typical operating conditions. Single crystal material PWA-1480 was considered for the analyses. Structural response and the blade natural frequencies with respect to the crystal orientation were investigated. The analyses were conducted based on a typical test-stand engine cycle. The influence of combined thermal, aerodynamic, and centrifugal loadings was considered. Results obtained showed that the effects of single crystal secondary orientation on the maximum principal stresses are not highly significant.
NASA Astrophysics Data System (ADS)
Soeryana, E.; Fadhlina, N.; Sukono; Rusyaman, E.; Supian, S.
2017-01-01
Investors in stocks also face risk, because daily stock prices fluctuate. To minimize this risk, investors usually form an investment portfolio; a portfolio consisting of several stocks is constructed to obtain the optimal composition of the investment. This paper discusses Mean-Variance optimization of a stock portfolio in which the mean and volatility are not constant, based on a logarithmic utility function. The non-constant mean is analysed using an Autoregressive Moving Average (ARMA) model, while the non-constant volatility is analysed using a Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model. The optimization is performed using the Lagrangian multiplier technique. As a numerical illustration, the method is applied to several Islamic stocks in Indonesia, yielding the optimal proportion of investment in each stock analysed.
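The final allocation step described above (mean-variance optimization via a Lagrange multiplier) admits a closed-form sketch. The ARMA and GARCH fits that would supply the expected-return vector and covariance matrix are omitted, a generic risk-aversion parameter stands in for the paper's logarithmic utility, and all numbers below are hypothetical.

```python
import numpy as np

def mean_variance_weights(mu, sigma, gamma=3.0):
    """Maximize mu'w - (gamma/2) w'Sigma w subject to sum(w) = 1,
    solved in closed form with a Lagrange multiplier."""
    ones = np.ones(len(mu))
    inv = np.linalg.inv(sigma)
    a = inv @ mu
    b = inv @ ones
    nu = (gamma - ones @ a) / (ones @ b)   # multiplier enforcing sum(w) = 1
    return (a + nu * b) / gamma

mu = np.array([0.08, 0.11, 0.10])          # hypothetical expected returns
sigma = np.array([[0.04, 0.01, 0.00],      # hypothetical covariance matrix
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
w = mean_variance_weights(mu, sigma)
print(w, w.sum())
```

In the paper's setting, the ARMA model would forecast `mu` and the GARCH model would supply the conditional covariance used for `sigma`.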
Engineering evaluation of SSME dynamic data from engine tests and SSV flights
NASA Technical Reports Server (NTRS)
1986-01-01
An engineering evaluation of dynamic data from SSME hot firing tests and SSV flights is summarized. The basic objective of the study is to provide analyses of vibration, strain and dynamic pressure measurements in support of MSFC performance and reliability improvement programs. A brief description of the SSME test program is given and a typical test evaluation cycle reviewed. Data banks generated to characterize SSME component dynamic characteristics are described and statistical analyses performed on these data base measurements are discussed. Analytical models applied to define the dynamic behavior of SSME components (such as turbopump bearing elements and the flight accelerometer safety cut-off system) are also summarized. Appendices are included to illustrate some typical tasks performed under this study.
NASA Technical Reports Server (NTRS)
VanHeukelem, Laurie; Thomas, Crystal S.; Glibert, Patricia M.
2001-01-01
Accurate determination of chlorophyll a (chl a) is of interest for numerous reasons. From ground-truth data for remote sensing to pigment detection in laboratory experimentation, it is essential to know the accuracy of the analyses and the factors potentially contributing to variability and error. Numerous methods and instrument techniques are currently employed in the analysis of chl a, ranging from spectrophotometric quantification to fluorometric analysis and determination by high-performance liquid chromatography (HPLC). Even within the application of HPLC techniques, methods vary. Here we provide the results of a comparison among methods and some guidance for improving the accuracy of these analyses. These results are based on a round-robin conducted among numerous investigators, including several in the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and HyCODE Programs. Our purpose here is not to present the full results of the laboratory intercalibration; those results will be presented elsewhere. Rather, we highlight some of the major factors that may contribute to the observed variability. Specifically, we aim to assess the comparability of chl a analyses performed by fluorometry and HPLC, and we identify several factors in the analyses that may contribute disproportionately to this variability.
The application of CAD, CAE & CAM in development of butterfly valve’s disc
NASA Astrophysics Data System (ADS)
Asiff Razif Shah Ranjit, Muhammad; Hanie Abdullah, Nazlin
2017-06-01
The improved design of a butterfly valve disc is based on the concept of sandwich theory. Butterfly valves are widely used in industries such as oil and gas plants. The primary failure modes for valves are indented discs, keyway and shaft failure, and cavitation damage. With emphasis on the application of CAD, a new model of the butterfly valve's disc structure was designed, and the structure was analysed using finite element analysis. Butterfly valve performance factors can be obtained by using Computational Fluid Dynamics (CFD) software to simulate the physics of fluid flow around a butterfly valve in a piping system, and a comparative finite element analysis was performed to verify the performance of the structure. The second application of CAE is the computational fluid flow analysis: the upstream and downstream pressures were analysed to calculate the cavitation index and determine the performance at each opening position of the valve. In the CAM process, a 3D printer was used to produce a prototype, fabricated at reduced scale from the model designed initially through the application of CAD, and the structure was analysed in prototype form. This study utilises the application of CAD, CAE and CAM for the improvement of the butterfly valve's disc components.
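The cavitation-index calculation from upstream and downstream pressures can be sketched as follows. This uses one common definition of the index (conventions differ between standards), and the pressures are hypothetical, not values from the study.

```python
def cavitation_index(p_up, p_down, p_vapor):
    """One common definition of the valve cavitation index:
    sigma = (P_up - P_vapor) / (P_up - P_down).
    Larger sigma indicates less risk of cavitation; other standards
    define the index differently."""
    return (p_up - p_vapor) / (p_up - p_down)

# Hypothetical absolute pressures in kPa (water vapor pressure ~2.3 kPa at 20 C)
sigma = cavitation_index(p_up=600.0, p_down=250.0, p_vapor=2.3)
print(round(sigma, 2))
```

Evaluating this index at each valve opening position, as the abstract describes, maps where in the stroke cavitation becomes likely.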
Stereotype Threat Effects on Older Adults' Episodic and Working Memory: A Meta-Analysis.
Armstrong, Bonnie; Gallant, Sara N; Li, Lingqian; Patel, Khushi; Wong, Brenda I
2017-08-01
Prior research has shown that exposure to negative age-based stereotype threat (ST) can undermine older adults' memory performance. The objective of the current meta-analysis was to examine the reliability and magnitude of ST effects on older adults' episodic and working memory performance, two forms of memory that typically show the greatest age-related declines. In addition, we examined potential moderators of age-based ST, including type of ST manipulation, type and timing of memory task, and participant age and education level. A total of 23 samples for episodic memory and 15 samples for working memory were derived from 19 published and 4 unpublished articles and analyzed in two separate meta-analyses. Analyses revealed a reliable effect of ST on both older adults' episodic (d = 0.373) and working memory performance (d = 0.253). Interestingly, the age-based ST effect was only significant when blatant ST manipulations were used with episodic memory tasks or when subtle ST manipulations were used with working memory tasks. Moreover, within episodic memory, the ST effect only reached significance for recall but not cued-recall or recognition performance, and for immediate but not delayed tests of memory. Neither age nor level of education moderated the association between ST and older adults' memory performance. These results highlight the vulnerability of both older adults' episodic and working memory performance to age-based ST. When measuring older adults' memory performance in a research context, we must therefore be wary of exposing participants to common stereotypes about aging and memory. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved.
Analyser-based phase contrast image reconstruction using geometrical optics.
Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A
2007-07-21
Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 µm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
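A minimal sketch of the symmetric Pearson type VII profile used for the rocking-curve fits; the parameter values below are illustrative, not fitted values from the ELETTRA data.

```python
def pearson_vii(x, x0, amplitude, w, m):
    """Symmetric Pearson type VII profile centred at x0.
    m = 1 gives a Lorentzian; as m -> infinity it approaches a Gaussian,
    which is why the family can outperform either fixed shape."""
    return amplitude / (1.0 + ((x - x0) / w) ** 2 / m) ** m

# Illustrative evaluation on a coarse grid (hypothetical parameters)
for x in (-2, -1, 0, 1, 2):
    print(x, round(pearson_vii(x, x0=0.0, amplitude=1.0, w=1.0, m=2.0), 4))
```

In practice the four parameters would be estimated by nonlinear least squares against the measured rocking curve.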
A study of multiplex data bus techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Kearney, R. J.; Kalange, M. A.
1972-01-01
A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.
Buckling Design and Analysis of a Payload Fairing One-Sixth Cylindrical Arc-Segment Panel
NASA Technical Reports Server (NTRS)
Kosareo, Daniel N.; Oliver, Stanley T.; Bednarcyk, Brett A.
2013-01-01
Design and analysis results are reported for a panel that is a one-sixth arc-segment of a full 33-ft diameter cylindrical barrel section of a payload fairing structure. Six such panels could be used to construct the fairing barrel, and, as such, compression buckling testing of a one-sixth arc-segment panel would serve as a validation test of the buckling analyses used to design the fairing panels. In this report, linear and nonlinear buckling analyses have been performed using finite element software for one-sixth arc-segment panels composed of aluminum honeycomb core with graphite/epoxy composite facesheets and an alternative fiber reinforced foam (FRF) composite sandwich design. The cross sections of both concepts were sized to represent realistic Space Launch System (SLS) payload fairing panels. Based on shell-based linear buckling analyses, smaller, more manageable buckling test panel dimensions were determined such that the panel would still be expected to buckle with a circumferential (as opposed to column-like) mode with significant separation between the first and second buckling modes. More detailed nonlinear buckling analyses were then conducted for honeycomb panels of various sizes using both Abaqus and ANSYS finite element codes, and for the smaller size panel, a solid-based finite element analysis was conducted. Finally, for the smaller size FRF panel, nonlinear buckling analysis was performed wherein geometric imperfections measured from an actual manufactured FRF panel were included. It was found that the measured imperfection did not significantly affect the panel's predicted buckling response.
Unconscious analyses of visual scenes based on feature conjunctions.
Tachibana, Ryosuke; Noguchi, Yasuki
2015-06-01
To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Clifford Kuofei
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
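The Monte Carlo workflow (sample uncertain inputs, propagate them to a flux distribution, rank parameter importance) can be sketched with a simplified single-route, steady-state flux expression. The distributions, the flux formula and the correlation-based ranking below are illustrative stand-ins for the report's three-phase transient model and stepwise regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertain inputs for a steady-state skin flux J = D*K*C/L
D = rng.lognormal(mean=np.log(1e-9), sigma=0.5, size=n)   # diffusivity, m^2/s
K = rng.lognormal(mean=np.log(0.5),  sigma=0.3, size=n)   # partition coefficient
L = rng.lognormal(mean=np.log(2e-5), sigma=0.2, size=n)   # diffusion path length, m
C = 1.0                                                   # surface concentration

J = D * K * C / L   # Monte Carlo distribution of mass flux

# Crude sensitivity ranking: correlation of log-inputs with log-flux
for name, x in [("D", D), ("K", K), ("L", L)]:
    r = np.corrcoef(np.log(x), np.log(J))[0, 1]
    print(name, round(r, 2))
```

Here the input with the widest uncertainty (D) dominates the flux variance, which is the kind of conclusion the stepwise-regression sensitivity analysis is designed to draw.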
NASA Astrophysics Data System (ADS)
Khatri, Kshitij; Pu, Yi; Klein, Joshua A.; Wei, Juan; Costello, Catherine E.; Lin, Cheng; Zaia, Joseph
2018-04-01
Analysis of singly glycosylated peptides has evolved to a point where large-scale LC-MS analyses can be performed at almost the same scale as proteomics experiments. While collisionally activated dissociation (CAD) remains the mainstay of bottom-up analyses, it performs poorly for the middle-down analysis of multiply glycosylated peptides. With improvements in instrumentation, electron-activated dissociation (ExD) modes are becoming increasingly prevalent for proteomics experiments and for the analysis of fragile modifications such as glycosylation. While these methods have been applied for glycopeptide analysis in isolated studies, an organized effort to compare their efficiencies, particularly for analysis of multiply glycosylated peptides (termed here middle-down glycoproteomics), has not been made. We therefore compared the performance of different ExD modes for middle-down glycopeptide analyses. We identified key features among the different dissociation modes and show that increased electron energy and supplemental activation provide the most useful data for middle-down glycopeptide analysis.
Güllich, Arne; Kovar, Peter; Zart, Sebastian; Reimann, Ansgar
2017-02-01
This study examined contributions of different types of sport activities to the development of elite youth soccer performance. Match-play performance of 44 German male players was assessed by expert coaches twice, 24 months apart (age 11.1-13.1 years), based on videotaped 5v5 matches. Player pairs were matched by identical age and initial performance at t1. Each player was assigned to a group of either "Strong" or "Weak" Responders based on a higher or lower subsequent performance improvement at t2 within each pair (mean Δperformance 29% vs. 7%). A questionnaire recorded current and earlier amounts of organised practice/training and non-organised sporting play, in soccer and in other sports. Group comparison revealed that "Strong Responders" accumulated more non-organised soccer play and organised practice/training in other sports, but not more organised soccer practice/training. Subsequent multiple linear regression (MLR) analyses highlighted that higher match-play performance at t2 was accounted for (R²adj = 0.65) by performance at t1, together with more non-organised soccer play and organised engagement in other sports, and a greater current, but smaller earlier, volume of organised soccer. The findings suggest that variable early sporting experience facilitates subsequent soccer performance development in German elite youth footballers.
VCSEL-based fiber optic link for avionics: implementation and performance analyses
NASA Astrophysics Data System (ADS)
Shi, Jieqin; Zhang, Chunxi; Duan, Jingyuan; Wen, Huaitao
2006-11-01
A Gb/s fiber optic link with built-in test (BIT) capability, based on vertical-cavity surface-emitting laser (VCSEL) sources for next-generation military avionics buses, is presented in this paper. To accurately predict link performance, statistical methods and bit error rate (BER) measurements were examined. The results show that the 1 Gb/s fiber optic link meets the BER requirement and that the link margin can reach up to 13 dB. The analysis shows that the suggested photonic network may provide a high-performance, low-cost interconnection alternative for future military avionics.
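A link-margin budget of the kind used to arrive at such a figure can be sketched as transmit power minus receiver sensitivity minus accumulated losses; the powers and losses below are hypothetical, not the paper's measured values.

```python
def link_margin_db(tx_power_dbm, rx_sensitivity_dbm, losses_db):
    """Optical link margin in dB: launched power minus receiver
    sensitivity minus the sum of connector/fiber/splice losses.
    All values here are illustrative."""
    return tx_power_dbm - rx_sensitivity_dbm - sum(losses_db)

# Hypothetical budget: -3 dBm VCSEL launch power, -20 dBm receiver
# sensitivity at the target BER, 4 dB of total passive losses
margin = link_margin_db(-3.0, -20.0, [1.5, 0.5, 2.0])
print(margin)
```

A statistical prediction, as in the paper, would treat each loss term as a distribution rather than a fixed number and evaluate the probability that the margin stays positive.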
Advanced space program studies: Overall executive summary
NASA Technical Reports Server (NTRS)
Sitney, L. R.
1974-01-01
Studies were conducted to provide NASA with advanced planning analyses which relate integrated space program goals and options to credible technical capabilities, applications potential, and funding resources. The studies concentrated on the following subjects: (1) upper stage options for the space transportation system based on payload considerations, (2) space servicing and standardization of payloads, (3) payload operations, and (4) space transportation system economic analyses related to user charges and new space applications. A systems cost/performance model was developed to synthesize automated, unmanned spacecraft configurations based on the system requirements and a list of equipments at the assembly level.
Stajkovic, Alexander D; Lee, Dongseop; Nyberg, Anthony J
2009-05-01
The authors examined relationships among collective efficacy, group potency, and group performance. Meta-analytic results (based on 6,128 groups, 31,019 individuals, 118 correlations adjusted for dependence, and 96 studies) reveal that collective efficacy was significantly related to group performance (.35). In the proposed nested 2-level model, collective efficacy assessment (aggregation and group discussion) was tested as the 1st-level moderator. It showed significantly different average correlations with group performance (.32 vs. .45), but the group discussion assessment was homogeneous, whereas the aggregation assessment was heterogeneous. Consequently, there was no 2nd-level moderation for the group discussion, and heterogeneity in the aggregation group was accounted for by the 2nd-level moderator, task interdependence (high, moderate, and low levels were significant; the higher the level, the stronger the relationship). The 2nd and 3rd meta-analyses indicated that group potency was related to group performance (.29) and to collective efficacy (.65). When tested in a structural equation modeling analysis based on meta-analytic findings, collective efficacy fully mediated the relationship between group potency and group performance. The authors suggest future research and convert their findings to a probability of success index to help facilitate practice. (c) 2009 APA, all rights reserved.
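A meta-analytic average correlation of the kind reported above can be sketched via Fisher's z transform with (n - 3) weights. This is a generic fixed-effect aggregation, not the authors' exact procedure (which adjusted correlations for dependence), and the study-level values below are hypothetical.

```python
import math

def pooled_correlation(rs, ns):
    """Sample-size-weighted mean correlation via Fisher's z transform,
    a common fixed-effect aggregation (Hunter-Schmidt instead weights
    the raw correlations directly)."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # r -> z
    weights = [n - 3 for n in ns]                          # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)  # z -> r

# Hypothetical study-level correlations and sample sizes
r_pooled = pooled_correlation([0.32, 0.45, 0.29], [120, 80, 200])
print(round(r_pooled, 3))
```

Moderator analysis then amounts to pooling within subgroups (e.g., aggregation vs. group-discussion assessment) and testing whether the subgroup estimates differ.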
Hu, Kang; Fiedler, Thorsten; Blanco, Laura; Geissen, Sven-Uwe; Zander, Simon; Prieto, David; Blanco, Angeles; Negro, Carlos; Swinnen, Nathalie
2017-11-10
A pilot-scale reverse osmosis (RO) unit downstream of a membrane bioreactor (MBR) was developed for desalination to reuse wastewater at a PVC production site. The solution-diffusion-film model (SDFM), based on the solution-diffusion model (SDM) and film theory, was proposed to describe rejections of electrolyte mixtures in the MBR effluent, which consists of dominant ions (Na+ and Cl-) and several trace ions (Ca2+, Mg2+, K+ and SO42-). A universal global optimisation method was used to estimate the ion permeability coefficients (B) and mass transfer coefficients (K) in the SDFM. The membrane performance was then evaluated based on the estimated parameters, which demonstrated that the theoretical simulations were in line with the experimental results for the dominant ions. Moreover, an energy analysis model accounting for the limitation imposed by the thermodynamic restriction was proposed to analyse the specific energy consumption of the pilot-scale RO system in various scenarios.
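The film-theory step that links observed rejection to the real (membrane-wall) rejection can be sketched from the standard velocity-variation relation; the flux and mass-transfer values below are hypothetical, not the pilot-plant estimates.

```python
import math

def real_rejection(r_obs, jv, k):
    """Convert observed rejection to real rejection using the film-theory
    relation  ln((1 - Ro)/Ro) = ln((1 - Rr)/Rr) + Jv/k,
    where Jv is permeate flux and k the mass-transfer coefficient."""
    lhs = math.log((1 - r_obs) / r_obs) - jv / k
    x = math.exp(lhs)            # x = (1 - Rr)/Rr
    return 1.0 / (1.0 + x)

# Hypothetical permeate flux Jv (m/s) and mass-transfer coefficient k (m/s)
rr = real_rejection(r_obs=0.95, jv=1.0e-5, k=2.0e-5)
print(round(rr, 4))
```

Because concentration polarization raises the wall concentration above the bulk value, the real rejection recovered this way always exceeds the observed rejection.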
NASA Astrophysics Data System (ADS)
Zhao, Ming-fu; Hu, Xin-Yu; Shao, Yun; Luo, Bin-bin; Wang, Xin
2008-10-01
This article analyses the football robots currently in common use in China. To improve the capability of the football robot hardware platform, a football robot based on a DSP core controller combined with a Fuzzy-PID control algorithm was designed. Experiments showed that, owing to the advantages of the DSP (fast operation, a variety of interfaces, low power dissipation, etc.), the robot achieved substantial improvements in movement performance, control precision and real-time behaviour.
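A toy illustration of a fuzzy-gain-scheduled PID loop of the kind combined with the DSP controller; the membership breakpoints, gains and first-order plant below are invented for illustration and are not from the article.

```python
def fuzzy_gains(error, base_kp=2.0):
    """Toy fuzzy gain schedule: boost Kp for large errors, soften it
    near the setpoint to limit overshoot (breakpoints are illustrative)."""
    e = abs(error)
    if e > 1.0:
        return base_kp * 1.5
    if e < 0.2:
        return base_kp * 0.6
    return base_kp

def simulate(setpoint=1.0, steps=400, dt=0.02, ki=1.0, kd=0.05):
    """Closed-loop step response of a first-order plant x' = -x + u
    under the fuzzy-scheduled PID controller."""
    x, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt
        u = fuzzy_gains(err) * err + ki * integral + kd * (err - prev_err) / dt
        prev_err = err
        x += (-x + u) * dt       # forward-Euler plant update
    return x

print(round(simulate(), 3))
```

On a DSP the same loop would run at a fixed interrupt rate, with the fuzzy rule table replacing the if/else schedule.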
ERIC Educational Resources Information Center
Busetti, Simone; Dente, Bruno
2014-01-01
This article investigates management systems in higher education organisations by analysing the 2009 Italian reform of performance management and its implementation within Italian universities. The research is based on a survey that covered about half of Italy's public universities. Survey results provide an account of the state of management…
ERIC Educational Resources Information Center
Menéndez-Varela, José-Luis; Gregori-Giralt, Eva
2016-01-01
Rubrics have attained considerable importance in the authentic and sustainable assessment paradigm; nevertheless, few studies have examined their contribution to validity, especially outside the domain of educational studies. This empirical study used a quantitative approach to analyse the validity of a rubrics-based performance assessment. Raters…
Gardner, Shea N; Wagner, Mark C
2005-01-01
Background Microbial forensics is important in tracking the source of a pathogen, whether the disease is a naturally occurring outbreak or part of a criminal investigation. Results A method and SPR Opt (SNP and PCR-RFLP Optimization) software to perform a comprehensive, whole-genome analysis to forensically discriminate multiple sequences is presented. Tools for the optimization of forensic typing using Single Nucleotide Polymorphism (SNP) and PCR-Restriction Fragment Length Polymorphism (PCR-RFLP) analyses across multiple isolate sequences of a species are described. The PCR-RFLP analysis includes prediction and selection of optimal primers and restriction enzymes to enable maximum isolate discrimination based on sequence information. SPR Opt calculates all SNP or PCR-RFLP variations present in the sequences, groups them into haplotypes according to their co-segregation across those sequences, and performs combinatoric analyses to determine which sets of haplotypes provide maximal discrimination among all the input sequences. Those set combinations requiring that membership in the fewest haplotypes be queried (i.e. the fewest assays be performed) are found. These analyses highlight variable regions based on existing sequence data. These markers may be heterogeneous among unsequenced isolates as well, and thus may be useful for characterizing the relationships among unsequenced as well as sequenced isolates. The predictions are multi-locus. Analyses of mumps and SARS viruses are summarized. Phylogenetic trees created based on SNPs, PCR-RFLPs, and full genomes are compared for SARS virus, illustrating that purported phylogenies based only on SNP or PCR-RFLP variations do not match those based on multiple sequence alignment of the full genomes. Conclusion This is the first software to optimize the selection of forensic markers to maximize information gained from the fewest assays, accepting whole or partial genome sequence data as input. 
As more sequence data becomes available for multiple strains and isolates of a species, automated, computational approaches such as those described here will be essential to make sense of large amounts of information, and to guide and optimize efforts in the laboratory. The software and source code for SPR Opt is publicly available and free for non-profit use at . PMID:15904493
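A toy stand-in for the combinatoric marker-selection idea: greedily choose marker positions until every pair of isolates is distinguished by at least one chosen position. The function name and SNP profiles are hypothetical; SPR Opt's actual search also handles haplotype grouping and PCR-RFLP enzyme selection.

```python
from itertools import combinations

def discriminating_markers(profiles):
    """Greedily pick marker positions until every pair of isolates differs
    at some chosen position (a toy version of a combinatoric marker search)."""
    n_markers = len(next(iter(profiles.values())))
    unresolved = list(combinations(profiles, 2))
    chosen = []
    while unresolved:
        gains = [sum(profiles[a][m] != profiles[b][m] for a, b in unresolved)
                 for m in range(n_markers)]
        best = max(range(n_markers), key=gains.__getitem__)
        if gains[best] == 0:     # remaining pairs identical at every marker
            break
        chosen.append(best)
        unresolved = [(a, b) for a, b in unresolved
                      if profiles[a][best] == profiles[b][best]]
    return chosen

isolates = {                     # hypothetical SNP states at five loci
    "A": "GATTC",
    "B": "GATAC",
    "C": "AATTC",
    "D": "AACTC",
}
print(discriminating_markers(isolates))
```

Minimizing the number of chosen positions corresponds to minimizing the number of assays that must be run, which is the optimization goal described above.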
ERIC Educational Resources Information Center
Hwang, Gwo-Jen; Kuo, Fan-Ray
2015-01-01
Web-based problem-solving, a compound ability of critical thinking, creative thinking, reasoning thinking and information-searching abilities, has been recognised as an important competence for elementary school students. Some researchers have reported the possible correlations between problem-solving competence and information searching ability;…
Effects of Experiential-Based Videos in Multi-Disciplinary Learning
ERIC Educational Resources Information Center
Jabbar, Khalid Bin Abdul; Ong, Alex; Choy, Jeanette; Lim, Lisa
2013-01-01
This study examined the use of authentic experiential-based videos in self-explanation activities on 32 polytechnic students' learning and motivation, using a mixed method quasi-experimental design. The control group analysed a set of six pre-recorded videos of a subject performing the standing broad jump (SBJ). The experimental group captured…
Measuring the Impact of Inquiry-Based Learning on Outcomes and Student Satisfaction
ERIC Educational Resources Information Center
Zafra-Gómez, José Luis; Román-Martínez, Isabel; Gómez-Miranda, María Elena
2015-01-01
The aim of this study is to determine the impact of inquiry-based learning (IBL) on students' academic performance and to assess their satisfaction with the process. Linear and logistic regression analyses show that examination grades are positively related to attendance at classes and tutorials; moreover, there is a positive significant…
Advanced vehicle systems assessment. Volume 3: Systems assessment
NASA Technical Reports Server (NTRS)
Hardy, K.
1985-01-01
The systems analyses integrate the advanced component and vehicle characteristics into conceptual vehicles with identical performance (for a given application) and evaluate the vehicles in typical use patterns. Initial and life-cycle costs are estimated and compared to conventional reference vehicles with comparable technological advances, assuming the vehicles will be in competition in the early 1990s. Electric vans, commuter vehicles, and full-size vehicles, in addition to electric/heat-engine hybrid and fuel-cell powered vehicles, are addressed in terms of performance and economics. System and subsystem recommendations for vans and two-passenger commuter vehicles are based on the economic analyses in this volume.
A software-based tool for video motion tracking in the surgical skills assessment landscape.
Ganni, Sandeep; Botden, Sanne M B I; Chmarra, Magdalena; Goossens, Richard H M; Jakimowicz, Jack J
2018-01-16
The use of motion tracking has been shown to provide an objective assessment in surgical skills training. Current systems, however, require the use of additional equipment or specialised laparoscopic instruments and cameras to extract the data. The aim of this study was to determine the possibility of using a software-based solution to extract the data. Six expert and 23 novice participants performed a basic laparoscopic cholecystectomy procedure in the operating room. The recorded videos were analysed using Kinovea 0.8.15 and the following parameters were calculated: path length, average instrument movement and number of sudden or extreme movements. The analysed data showed that experts had significantly shorter path length (median 127 cm vs. 187 cm, p = 0.01), smaller average movements (median 0.32 cm vs. 0.40 cm, p = 0.002) and fewer sudden movements (median 14.00 vs. 21.61, p = 0.001) than their novice counterparts. The use of software-based video motion tracking of laparoscopic cholecystectomy is a simple and viable method enabling objective assessment of surgical performance. It provides clear discrimination between expert and novice performance.
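The video-derived metrics reported above (path length, average per-frame movement, count of sudden movements) reduce to simple arithmetic on a tracked coordinate series. A minimal sketch, assuming tip positions exported in cm at a fixed frame rate; the threshold and trajectory values are illustrative, not those used in the study:

```python
import math

def motion_metrics(points, sudden_threshold):
    """Summarise an instrument-tip trajectory sampled at a fixed rate.

    points: list of (x, y) positions in cm.  Returns total path length,
    mean per-frame displacement, and the number of 'sudden' movements,
    i.e. per-frame displacements above sudden_threshold (cm).
    """
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    path_length = sum(steps)
    avg_move = path_length / len(steps) if steps else 0.0
    sudden = sum(1 for s in steps if s > sudden_threshold)
    return path_length, avg_move, sudden

# Invented 4-frame track: two 5 cm jumps separated by a stationary frame.
track = [(0.0, 0.0), (3.0, 4.0), (3.0, 4.0), (6.0, 8.0)]
print(motion_metrics(track, sudden_threshold=4.5))
```

On this toy track the path length is 10 cm and both non-zero steps exceed the threshold, so two movements count as sudden.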
Gimenez, Thais; Braga, Mariana Minatel; Raggio, Daniela Procida; Deery, Chris; Ricketts, David N; Mendes, Fausto Medeiros
2013-01-01
Fluorescence-based methods have been proposed to aid caries lesion detection. Summarizing and analysing findings of studies about fluorescence-based methods could clarify their real benefits. We aimed to perform a comprehensive systematic review and meta-analysis to evaluate the accuracy of fluorescence-based methods in detecting caries lesions. Two independent reviewers searched PubMed, Embase and Scopus through June 2012 to identify published papers/articles. Other sources were checked to identify non-published literature. STUDY ELIGIBILITY CRITERIA, PARTICIPANTS AND DIAGNOSTIC METHODS: The eligibility criteria were studies that: (1) assessed the accuracy of fluorescence-based methods of detecting caries lesions on occlusal, approximal or smooth surfaces, in either primary or permanent human teeth, in the laboratory or clinical setting; (2) used a reference standard; and (3) reported sufficient data relating to the sample size and the accuracy of methods. A diagnostic 2×2 table was extracted from included studies to calculate the pooled sensitivity, specificity and overall accuracy parameters (Diagnostic Odds Ratio and Summary Receiver-Operating Characteristic curve). The analyses were performed separately for each method and for different characteristics of the studies. The quality of the studies and heterogeneity were also evaluated. Seventy-five studies met the inclusion criteria from the 434 articles initially identified. The search of the grey or non-published literature did not identify any further studies. In general, the analysis demonstrated that fluorescence-based methods tend to have similar accuracy for all types of teeth, dental surfaces or settings. There was a trend of better performance of fluorescence methods in detecting more advanced caries lesions. We also observed moderate to high heterogeneity and evidence of publication bias.
Fluorescence-based devices have similar overall performance; however, better accuracy in detecting more advanced caries lesions has been observed.
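The pooled accuracy parameters mentioned above are derived from the diagnostic 2×2 tables extracted from each study. A minimal sketch for a single table; the counts are invented for illustration:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Accuracy parameters from one diagnostic 2x2 table.

    tp/fp/fn/tn: true-positive, false-positive, false-negative and
    true-negative counts against the reference standard.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # Diagnostic odds ratio: odds of a positive test in diseased
    # versus non-diseased subjects.
    dor = (tp * tn) / (fp * fn)
    return sensitivity, specificity, dor

sens, spec, dor = diagnostic_accuracy(tp=80, fp=10, fn=20, tn=90)
print(sens, spec, dor)  # 0.8 0.9 36.0
```

A meta-analysis then pools these per-study values (and fits a summary ROC curve) rather than reporting them table by table.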
Yang, Yanzheng; Zhu, Qiuan; Peng, Changhui; Wang, Han; Xue, Wei; Lin, Guanghui; Wen, Zhongming; Chang, Jie; Wang, Meng; Liu, Guobin; Li, Shiqing
2016-01-01
Increasing evidence indicates that current dynamic global vegetation models (DGVMs) have suffered from insufficient realism and are difficult to improve, particularly because they are built on plant functional type (PFT) schemes. Therefore, new approaches, such as plant trait-based methods, are urgently needed to replace PFT schemes when predicting the distribution of vegetation and investigating vegetation sensitivity. As an important direction towards constructing next-generation DGVMs based on plant functional traits, we propose a novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China. The results demonstrated that a Gaussian mixture model (GMM) trained with an LMA-Nmass-LAI data combination yielded an accuracy of 72.82% in simulating vegetation distribution, providing more detailed parameter information regarding community structures and ecosystem functions. The new approach also performed well in analyses of vegetation sensitivity to different climatic scenarios. Although the trait-climate relationship is not the only candidate useful for predicting vegetation distributions and analysing climatic sensitivity, it sheds new light on the development of next-generation trait-based DGVMs. PMID:27052108
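As a rough illustration of the trait-based classification idea, the sketch below fits one diagonal Gaussian per vegetation class from trait vectors (e.g. LMA, Nmass, LAI) and assigns a new sample to the most likely class. This is a simplified stand-in for the paper's Gaussian mixture model, with invented toy values:

```python
import math
from statistics import mean, pstdev

def fit_class_gaussians(samples):
    """samples: dict class -> list of trait vectors, e.g. [LMA, Nmass, LAI].
    Fits an independent (diagonal) Gaussian per class and per trait."""
    model = {}
    for cls, rows in samples.items():
        cols = list(zip(*rows))
        model[cls] = [(mean(c), max(pstdev(c), 1e-6)) for c in cols]
    return model

def classify(model, x):
    """Assign trait vector x to the class with the highest log-density
    (constant terms dropped, since they cancel in the comparison)."""
    def logpdf(v, mu, sd):
        return -0.5 * ((v - mu) / sd) ** 2 - math.log(sd)
    return max(
        model,
        key=lambda cls: sum(
            logpdf(v, mu, sd) for v, (mu, sd) in zip(x, model[cls])
        ),
    )

# Invented trait samples for two vegetation classes.
traits = {
    "forest":    [[120.0, 2.1, 5.5], [130.0, 2.3, 6.0], [125.0, 2.0, 5.8]],
    "grassland": [[60.0, 1.0, 1.8], [55.0, 1.2, 2.0], [65.0, 1.1, 1.6]],
}
model = fit_class_gaussians(traits)
print(classify(model, [128.0, 2.2, 5.9]))  # forest
```

A full GMM would additionally allow several mixture components per class and covariances between traits; the comparison logic stays the same.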
Applying U.S. EOP Analytical Justification Experience for VVER Plants in the Ukraine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linn, Paul A.; Julian, Harold V.; Chapman, James R.
2002-07-01
The foundation for new Emergency Operating Instructions (EOIs) being developed at several plants in the Ukraine is the Westinghouse Owners Group (WOG) Emergency Response Guidelines (ERGs) developed in the U.S. The ERGs were chosen as a base for the new EOIs for several reasons. First the overall structure and format was adaptable to VVER Pressurized Water Reactor (PWR) designs. Second, the ERGs have served as a base for many plant EOIs in both the U.S. and internationally. Third, key information supporting the ERGs was available. This paper describes the method used at one of the Ukrainian plants to provide an analytical justification for their EOIs. The method being employed by a second plant is very similar, differing only slightly in how it is implemented. The WOG ERG development program, which started shortly after the accident at Three Mile Island Unit 2, used many sources of technical information on plant and system transient response, which were available in support of the plant design and licensing efforts. In addition, operating experience from many operating PWR plants in the U.S. and around the world was used. For example, design basis accident (DBA) analyses, documented in a plant's Safety Analysis Report (SAR) and other design documents, had been performed by Nuclear Steam Supply System (NSSS) vendors, utilities, or the Architect/Engineer. All relevant sources were considered in the development of the ERGs. Limited Probabilistic Risk Assessment (PRA) analyses were available during that time period. When a technical basis for a recovery strategy and associated operator actions was not available, an analysis was defined and performed. In general, these analyses were performed on a generic basis, and addressed the different categories of design (e.g., number of reactor coolant loops and/or low/high pressure safety injection system design). U.S. Nuclear Power plants that were in the WOG program were responsible for implementing the generic ERGs.
This required the utilities to review the generic analyses to ensure that they were applicable and to justify any deviations from the ERG methodology. Modern PRA analyses are similar to the analyses supporting the ERGs since they address multiple failures and use best-estimate or expected assumptions for equipment availability and operator performance. The process being employed by Ukrainian plants is similar to that of the WOG. That is, available analyses and operating experience are being reviewed and pertinent information extracted to assist in the analytical justification of the EOIs. This includes the use of recently updated PRA and DBA analyses, other 'original' design information and operating experience. A systematic review of the EOIs is being conducted to identify items requiring analytical justification. For each analysis identified, the specific purpose of the analysis is being documented. The analysis needs are then compared to the available analyses and operating experience. From this review, new analyses needed to justify the EOIs are developed, and a basis for using existing analyses is established. The work is being conducted in two phases. The first phase performs all of the reviews and assessments necessary to determine the new analyses required to justify the EOIs. In the second phase, these new analyses will be conducted and documented, and the EOI Analytical Justification (AJ) report will be written. (authors)
Tiffin, Paul A; Paton, Lewis W; Kasim, Adetayo S; Böhnke, Jan R
2018-01-01
Objectives: University academic achievement may be inversely related to the performance of the secondary (high) school an entrant attended. Indeed, some medical schools already offer ‘grade discounts’ to applicants from less well-performing schools. However, evidence to guide such policies is lacking. In this study, we analyse a national dataset in order to understand the relationship between the two main predictors of medical school admission in the UK (prior educational attainment (PEA) and performance on the United Kingdom Clinical Aptitude Test (UKCAT)) and subsequent undergraduate knowledge and skills-related outcomes analysed separately. Methods: The study was based on national selection data and linked medical school outcomes for knowledge and skills-based tests during the first five years of medical school. UKCAT scores and PEA grades were available for 2107 students enrolled at 18 medical schools. Models were developed to investigate the potential mediating role played by a student’s previous secondary school’s performance. Multilevel models were created to explore the influence of students’ secondary schools on undergraduate achievement in medical school. Results: The ability of the UKCAT scores to predict undergraduate academic performance was significantly mediated by PEA in all five years of medical school. Undergraduate achievement was inversely related to secondary school-level performance. This effect waned over time and was less marked for skills, compared with undergraduate knowledge-based outcomes. Thus, the predictive value of secondary school grades was generally dependent on the secondary school in which they were obtained. Conclusions: The UKCAT scores added some value, above and beyond secondary school achievement, in predicting undergraduate performance, especially in the later years of study.
Importantly, the findings suggest that the academic entry criteria should be relaxed for candidates applying from the least well performing secondary schools. In the UK, this would translate into a decrease of approximately one to two A-level grades. PMID:29792300
ERIC Educational Resources Information Center
Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav
2016-01-01
Social network-based engineering education (SNEE) is designed and implemented as a model of Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…
Taneja, S. R.; Kumar, Jagdish; Thariyan, K. K.; Verma, Sanjeev
2005-01-01
Clinical chemistry analyser is a high-performance microcontroller-based photometric biochemical analyser used to measure various blood biochemical parameters such as blood glucose, urea, protein, bilirubin, and so forth, and also to measure and observe the enzyme activity that develops while performing other biochemical tests such as ALT (alanine aminotransferase), amylase, AST (aspartate aminotransferase), and so forth. These tests are of great significance in biochemistry and are used for diagnostic purposes and for classifying various disorders and diseases such as diabetes, liver malfunction, renal diseases, and so forth. An inexpensive clinical chemistry analyser developed by the authors is described in this paper. This is an open system in which any reagent kit available in the market can be used. The system is based on the principle of absorbance transmittance photometry. The system design is based around an 80C31 microcontroller with RAM, EPROM, and peripheral interface devices. The developed system incorporates a light source, an optical module, interference filters of various wavelengths, a Peltier device for maintaining the required temperature of the mixture in the flow cell, a peristaltic pump for sample aspiration, a graphic LCD for displaying blood parameters, patients' test results and kinetic test graphs, a 40-column mini thermal printer, and a 32-key keyboard for executing various functions. The lab tests conducted on the instrument covered the versatility of the analyser, the flexibility of the software, and the treatment of samples. The prototype was tested and evaluated on over 1000 blood samples successfully for seventeen blood parameters. Evaluation was carried out at the Department of Biochemistry, Government Medical College and Hospital. The test results were found to be comparable with other standard instruments. PMID:18924737
An Evaluation of Clinical Economics and Cases of Cost-effectiveness.
Takura, Tomoyuki
2018-05-01
In order to maintain and develop a universal health insurance system, it is crucial to utilize limited medical resources effectively. In this context, considerations are underway to introduce health technology assessments (HTAs), such as cost-effectiveness analyses (CEAs), into the medical treatment fee system. CEAs, the general term for these methods, are classified into four categories, such as cost-effectiveness analyses based on performance indicators; in comparisons of health technologies, the incremental cost-effectiveness ratio (ICER) is also applied. When I comprehensively consider several Japanese studies based on these concepts, I find that, in analyses of the economic performance of healthcare systems, Japan shows the most promising trend in the world. In addition, there is research indicating the superior cost-effectiveness of Rituximab against refractory nephrotic syndrome, and it is expected that health economics will be actively applied to the valuation of technical innovations such as drug discovery.
Darbin, Olivier; Gubler, Coral; Naritoku, Dean; Dees, Daniel; Martino, Anthony; Adams, Elizabeth
2016-01-01
This study describes a cost-effective screening protocol for parkinsonism based on combined objective and subjective monitoring of balance function. Objective evaluation of balance function was performed using a game-industry balance board and an automated analysis of the dynamics of the center of pressure in the time, frequency, and non-linear domains, collected during short series of stand-up tests with different modalities and severities of sensory deprivation. The subjective measurement of balance function was performed using the Dizziness Handicap Inventory questionnaire. Principal component analyses on both objective and subjective measurements of balance function yielded a specificity and selectivity for parkinsonian patients (vs. healthy subjects) of 0.67 and 0.71, respectively. The findings are discussed regarding the relevance of a cost-effective balance-based screening system as a strategy to meet the needs of broader and earlier screening for parkinsonism in communities with limited access to healthcare.
van Engen-Verheul, Mariëtte M.; Gude, Wouter T.; van der Veer, Sabine N.; Kemps, Hareld M.C.; Jaspers, Monique M.W.; de Keizer, Nicolette F.; Peek, Niels
2015-01-01
Despite their widespread use, audit and feedback (A&F) interventions show variable effectiveness in improving professional performance. Based on known facilitators of successful A&F interventions, we developed a web-based A&F intervention with indicator-based performance feedback, benchmark information, action planning and outreach visits. The goal of the intervention was to engage with multidisciplinary teams to overcome barriers to guideline concordance and to improve overall team performance in the field of cardiac rehabilitation (CR). To assess its effectiveness we conducted a cluster-randomized trial in 18 CR clinics (14,847 patients) already working with computerized decision support (CDS). Our preliminary results showed no increase in concordance with guideline recommendations regarding prescription of CR therapies. Future analyses will investigate whether our intervention did improve team performance on other quality indicators. PMID:26958310
Recent developments in the structural design and optimization of ITER neutral beam manifold
NASA Astrophysics Data System (ADS)
Chengzhi, CAO; Yudong, PAN; Zhiwei, XIA; Bo, LI; Tao, JIANG; Wei, LI
2018-02-01
This paper describes a new design of the neutral beam manifold based on a more optimized support system. An alternative scheme is proposed to replace the former complex manifold supports and internal pipe supports in the final design phase. Both the structural reliability and feasibility were confirmed with detailed analyses. Comparative analyses between two typical types of manifold support scheme were performed. All relevant results of mechanical analyses for typical operation scenarios and fault conditions are presented. Future optimization activities are described, which will give useful information for a refined setting of components in the next phase.
Nonlinear heat transfer and structural analyses of SSME turbine blades
NASA Technical Reports Server (NTRS)
Abdul-Aziz, A.; Kaufman, A.
1987-01-01
Three-dimensional nonlinear finite-element heat transfer and structural analyses were performed for the first stage high-pressure fuel turbopump blade of the space shuttle main engine (SSME). Directionally solidified (DS) MAR-M 246 material properties were considered for the analyses. Analytical conditions were based on a typical test stand engine cycle. Blade temperature and stress-strain histories were calculated using MARC finite-element computer code. The study was undertaken to assess the structural response of an SSME turbine blade and to gain greater understanding of blade damage mechanisms, convective cooling effects, and the thermal-mechanical effects.
Wang, Song; Zhou, Ming; Chen, Taolin; Yang, Xun; Chen, Guangxiang; Wang, Meiyun; Gong, Qiyong
2017-04-18
Achievement in school is crucial for students to be able to pursue successful careers and lead happy lives in the future. Although many psychological attributes have been found to be associated with academic performance, the neural substrates of academic performance remain largely unknown. Here, we investigated the relationship between brain structure and academic performance in a large sample of high school students via structural magnetic resonance imaging (S-MRI) using a voxel-based morphometry (VBM) approach. The whole-brain regression analyses showed that higher academic performance was related to greater regional gray matter density (rGMD) of the left dorsolateral prefrontal cortex (DLPFC), which is considered a neural center at the intersection of cognitive and non-cognitive functions. Furthermore, mediation analyses suggested that general intelligence partially mediated the impact of the left DLPFC density on academic performance. These results persisted even after adjusting for the effect of family socioeconomic status (SES). In short, our findings reveal a potential neuroanatomical marker for academic performance and highlight the role of general intelligence in explaining the relationship between brain structure and academic performance.
ERIC Educational Resources Information Center
Martínez-Hernández, Cesar; Ulloa-Azpeitia, Ricardo
2017-01-01
Based on the theoretical elements of the instrumental approach to tool use known as Task-Technique-Theory (Artigue, 2002), this paper analyses and discusses the performance of graduate students enrolled in a Teacher Training program. The latter performance relates to tracing tangent lines to the curve of a quadratic function in Dynamic Geometry…
Performing Academic Practice: Using the Master Class to Build Postgraduate Discursive Competences
ERIC Educational Resources Information Center
Baerenholdt, Jorgen Ole; Gregson, Nicky; Everts, Jonathan; Granas, Brynhild; Healey, Ruth L.
2010-01-01
How can we find ways of training PhD students in academic practices, while reflexively analysing how academic practices are performed? The paper's answer to this question is based on evaluations from a British-Nordic master class. The paper discusses how master classes can be used to train the discursive skills required for academic discussion,…
ERIC Educational Resources Information Center
Cliffordson, Christina; Gustafsson, Jan-Eric
2008-01-01
The effects of age and schooling on different aspects of intellectual performance, taking track of study into account, are investigated. The analyses were based on military enlistment test scores, obtained by 48,269 males, measuring Fluid ability (Gf), Crystallized intelligence (Gc), and General visualization (Gv) ability. A regression method,…
ERIC Educational Resources Information Center
Pennsylvania Blue Shield, Camp Hill.
A project developed a model curriculum to be delivered by computer-based instruction to teach the required literacy skills for entry workers in the health insurance industry. Literacy task analyses were performed for the targeted jobs and then validated with focus groups. The job tasks and related basic skills were divided into modules. The job…
A Review of Missing Data Handling Methods in Education Research
ERIC Educational Resources Information Center
Cheema, Jehanzeb R.
2014-01-01
Missing data are a common occurrence in survey-based research studies in education, and the way missing values are handled can significantly affect the results of analyses based on such data. Despite known problems with performance of some missing data handling methods, such as mean imputation, many researchers in education continue to use those…
ERIC Educational Resources Information Center
DuPaul, George J.; Eckert, Tanya L.; Vilardo, Brigid
2012-01-01
A meta-analysis evaluating the effects of school-based interventions for students with attention deficit hyperactivity disorder was conducted by examining 60 outcome studies between 1996 and 2010 that yielded 85 effect sizes. Separate analyses were performed for studies employing between-subjects, within- subjects, and single-subject experimental…
77 FR 61895 - Electricity Market Transparency Provisions of Section 220 of the Federal Power Act
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... rate authority and an ex post analysis of whether a seller with market-based rate authority has... Commission in performing ex post analyses to determine whether a seller with market-based rate authority has... Order Nos. 720 and 720-A, whereby the Commission required major intrastate natural gas pipelines to post...
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
Due to the ongoing deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
Evaluation and comparison of bioinformatic tools for the enrichment analysis of metabolomics data.
Marco-Ramell, Anna; Palau-Rodriguez, Magali; Alay, Ania; Tulipani, Sara; Urpi-Sarda, Mireia; Sanchez-Pla, Alex; Andres-Lacueva, Cristina
2018-01-02
Bioinformatic tools for the enrichment of 'omics' datasets facilitate interpretation and understanding of data. To date, few are suitable for metabolomics datasets. The main objective of this work is to give a critical overview, for the first time, of the performance of these tools. To that aim, datasets from metabolomic repositories were selected and enriched data were created. Both types of data were analysed with these tools and outputs were thoroughly examined. An exploratory multivariate analysis of the most widely used tools for the enrichment of metabolite sets, based on non-metric multidimensional scaling (NMDS) of Jaccard distances, was performed and mirrored their diversity. Codes (identifiers) of the metabolites of the datasets were searched in different metabolite databases (HMDB, KEGG, PubChem, ChEBI, BioCyc/HumanCyc, LipidMAPS, ChemSpider, METLIN and Recon2). The databases that presented more identifiers of the metabolites of the dataset were PubChem, followed by METLIN and ChEBI. However, these databases had duplicated entries and might present false positives. The performance of over-representation analysis (ORA) tools, including BioCyc/HumanCyc, ConsensusPathDB, IMPaLA, MBRole, MetaboAnalyst, Metabox, MetExplore, MPEA, PathVisio and Reactome, and the mapping tool KEGGREST, was examined. Results were mostly consistent among tools and between real and enriched data despite the variability of the tools. Nevertheless, a few controversial results such as differences in the total number of metabolites were also found. Disease-based enrichment analyses were also assessed, but they were not found to be accurate, probably because metabolite disease sets are not up to date and because predicting diseases from a list of metabolites is difficult. We have extensively reviewed the state-of-the-art of the available range of tools for metabolomic datasets, the completeness of metabolite databases, the performance of ORA methods and disease-based analyses.
Despite the variability of the tools, they provided consistent results independent of their analytic approach. However, more work on the completeness of metabolite and pathway databases is required, which strongly affects the accuracy of enrichment analyses. Improvements will be translated into more accurate and global insights of the metabolome.
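The tool-comparison ordination described above rests on pairwise Jaccard distances between the result sets reported by each tool. A minimal sketch with invented pathway sets; an NMDS routine would then embed the resulting distance matrix in a low-dimensional space:

```python
def jaccard_distance(a, b):
    """1 - |A∩B|/|A∪B| between two result sets, e.g. the pathways
    reported by two enrichment tools on the same dataset."""
    a, b = set(a), set(b)
    if not a | b:
        return 0.0  # two empty result sets are identical by convention
    return 1.0 - len(a & b) / len(a | b)

# Hypothetical enrichment hits per tool (toy data, not real outputs).
tool_hits = {
    "toolX": {"glycolysis", "TCA cycle", "urea cycle"},
    "toolY": {"glycolysis", "TCA cycle"},
    "toolZ": {"bile acid biosynthesis"},
}

# Pairwise distance matrix that an NMDS ordination would consume.
names = sorted(tool_hits)
matrix = [
    [jaccard_distance(tool_hits[p], tool_hits[q]) for q in names]
    for p in names
]
for row in matrix:
    print([round(d, 2) for d in row])
```

Tools with heavily overlapping hit lists end up close together in the ordination (toolX and toolY here), while a tool reporting disjoint pathways sits at the maximum distance of 1.0.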
Dausey, David J; Chandra, Anita; Schaefer, Agnes G; Bahney, Ben; Haviland, Amelia; Zakowski, Sarah; Lurie, Nicole
2008-09-01
We tested telephone-based disease surveillance systems in local health departments to identify system characteristics associated with consistent and timely responses to urgent case reports. We identified a stratified random sample of 74 health departments and conducted a series of unannounced tests of their telephone-based surveillance systems. We used regression analyses to identify system characteristics that predicted fast connection with an action officer (an appropriate public health professional). Optimal performance in consistently connecting callers with an action officer in 30 minutes or less was achieved by 31% of participating health departments. Reaching a live person upon dialing, regardless of who that person was, was the strongest predictor of optimal performance both in being connected with an action officer and in consistency of connection times. Health departments can achieve optimal performance in consistently connecting a caller with an action officer in 30 minutes or less and may improve performance by using a telephone-based disease surveillance system in which the phone is answered by a live person at all times.
Attention and P300-based BCI performance in people with amyotrophic lateral sclerosis
Riccio, Angela; Simione, Luca; Schettini, Francesca; Pizzimenti, Alessia; Inghilleri, Maurizio; Belardinelli, Marta Olivetti; Mattia, Donatella; Cincotti, Febo
2013-01-01
The purpose of this study was to investigate the role of attentional and memory processes in controlling a P300-based brain-computer interface (BCI) in people with amyotrophic lateral sclerosis (ALS). Eight people with ALS performed two behavioral tasks: (i) a rapid serial visual presentation (RSVP) task, screening the temporal filtering capacity and the speed of updating the attentive filter, and (ii) a change detection task, screening the memory capacity and the spatial filtering capacity. The participants were also asked to perform a P300-based BCI spelling task. Using correlation and regression analyses, we found that only the temporal filtering capacity in the RSVP task predicted both P300-based BCI accuracy and the amplitude of the P300 elicited while performing the BCI task. We concluded that the ability to keep the attentional filter active during the selection of a target influences performance in BCI control. PMID:24282396
Multigroup cross section library for GFR2400
NASA Astrophysics Data System (ADS)
Čerba, Štefan; Vrban, Branislav; Lüley, Jakub; Haščík, Ján; Nečas, Vladimír
2017-09-01
In this paper the development and optimization of the SBJ_E71 multigroup cross section library for GFR2400 applications are discussed. A cross section processing scheme merging Monte Carlo and deterministic codes was developed. Several fine and coarse group structures and two weighting flux options were analysed through 18 benchmark experiments selected from the ICSBEP Handbook on the basis of similarity assessments. The performance of the collapsed version of the SBJ_E71 library was compared with MCNP5 CE ENDF/B-VII.1 and the Korean KAFAX-E70 library. The comparison was based on integral parameters of calculations performed on full-core homogeneous models.
Using enterprise architecture to analyse how organisational structure impacts motivation and learning
NASA Astrophysics Data System (ADS)
Närman, Pia; Johnson, Pontus; Gingnell, Liv
2016-06-01
When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the consequences of structural changes for performance. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.
Coordination patterns related to high clinical performance in a simulated anesthetic crisis.
Manser, Tanja; Harrison, Thomas Kyle; Gaba, David M; Howard, Steven K
2009-05-01
Teamwork is an integral component in the delivery of safe patient care. Several studies highlight the importance of effective teamwork and the need for teams to respond dynamically to changing task requirements, for example, during crisis situations. In this study, we address one of the many facets of "effective teamwork" in medical teams by investigating coordination patterns related to high performance in the management of a simulated malignant hyperthermia (MH) scenario. We hypothesized that (a) anesthesia crews dynamically adapt their work and coordination patterns to the occurrence of a simulated MH crisis and that (b) crews with higher clinical performance scores (based on a time-based scoring system for critical MH treatment steps) exhibit different coordination patterns. This observational study investigated differences in work and coordination patterns of 24 two-person anesthesia crews in a simulated MH scenario. Clinical and coordination behavior were coded using a structured observation system consisting of 36 mutually exclusive observation categories for clinical activities, coordination activities, teaching, and other communication. Clinical performance scores for treating the simulated episode of MH were calculated using a time-based scoring system for critical treatment steps. Coordination patterns in response to the occurrence of a crisis situation were analyzed using multivariate analysis of variance, and the relationship between coordination patterns and clinical performance was investigated using hierarchical regression analyses. Qualitative analyses of the three highest and lowest performing crews were conducted to complement the quantitative analysis. First, a multivariate analysis of variance revealed statistically significant changes in the proportion of time spent on clinical and coordination activities once the MH crisis was declared (F[5,19] = 162.81, P < 0.001, η_p² = 0.98).
Second, hierarchical regression analyses controlling for the effects of cognitive aid use showed that higher performing anesthesia crews exhibited significantly less task distribution (β = -0.539, P < 0.01) and significantly more situation assessment (β = 0.569, P < 0.05). Additional qualitative video analysis revealed, for example, that lower scoring crews were more likely to split into subcrews (i.e., both anesthesiologists worked with other members of the perioperative team without maintaining a shared plan within the two-person anesthesia crew). Our results on the relationship between coordination patterns and clinical performance will inform future research on adaptive coordination in medical teams and support the development of specific training to improve team coordination and performance.
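The hierarchical regression step described above (entering a control variable first, then adding the coordination predictor and inspecting the incremental variance explained) can be sketched with ordinary least squares on synthetic data; the variable names and effect sizes below are invented for illustration, not taken from the study.

```python
import numpy as np

def r_squared(X, y):
    """In-sample R^2 of an OLS fit; X must include an intercept column."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(0)
n = 200
aid_use = rng.normal(size=n)      # control: cognitive aid use (hypothetical scale)
sit_assess = rng.normal(size=n)   # predictor: share of situation assessment
performance = 0.3 * aid_use + 0.6 * sit_assess + rng.normal(scale=0.5, size=n)

ones = np.ones(n)
# Step 1: control variable only; Step 2: control plus predictor.
r2_base = r_squared(np.column_stack([ones, aid_use]), performance)
r2_full = r_squared(np.column_stack([ones, aid_use, sit_assess]), performance)
delta_r2 = r2_full - r2_base      # incremental variance explained by the predictor
```

The incremental ΔR² quantifies what the coordination predictor adds beyond the control, which is the logic of a hierarchical (blockwise) regression.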
Comparison between two methodologies for urban drainage decision aid.
Moura, P M; Baptista, M B; Barraud, S
2006-01-01
The objective of the present work is to compare two multicriteria-analysis methodologies for the evaluation of stormwater systems. The first methodology, developed in Brazil, is based on performance-cost analysis; the second is ELECTRE III. Both methodologies were applied to a case study, and sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies yield equivalent results and present low sensitivity and high robustness. These results show that the Brazilian methodology is consistent and can be used safely to select a good solution, or a small set of good solutions, for subsequent comparison with more detailed methods.
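A minimal sketch of the concordance computation at the heart of ELECTRE III, assuming the standard partial-concordance formula for maximization criteria with indifference threshold q and preference threshold p; the weights and thresholds below are illustrative, not those of the stormwater case study.

```python
import numpy as np

def partial_concordance(ga, gb, q, p):
    """ELECTRE III partial concordance c_j(a, b) for one maximization
    criterion, with indifference threshold q and preference threshold p."""
    if ga + q >= gb:
        return 1.0
    if ga + p <= gb:
        return 0.0
    return (ga + p - gb) / (p - q)   # linear interpolation between thresholds

def concordance(a, b, weights, q, p):
    """Weighted global concordance of the assertion 'a outranks b'."""
    c = [partial_concordance(ga, gb, qj, pj)
         for ga, gb, qj, pj in zip(a, b, q, p)]
    w = np.asarray(weights, float)
    return float(w @ np.asarray(c) / w.sum())
```

A full ELECTRE III ranking would also compute discordance indices and a credibility matrix before the distillation step; this sketch covers only the concordance part.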
Ma, Chuang; Wang, Xiangfeng
2012-09-01
One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey's biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses.
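The Gini correlation coefficient evaluated above (and implemented in the rsgcc R package) can be sketched in a few lines; this is a simplified Python version without tie handling, not the package's implementation.

```python
import numpy as np

def gini_correlation(x, y):
    """Gini correlation: cov(x, rank(y)) / cov(x, rank(x)).
    Note the asymmetry in x and y; ties are not handled in this sketch."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rx = np.argsort(np.argsort(x))   # 0-based ranks of x
    ry = np.argsort(np.argsort(y))   # 0-based ranks of y
    return np.cov(x, ry)[0, 1] / np.cov(x, rx)[0, 1]
```

Because only the ranks of y enter the numerator, any strictly increasing transform of y leaves the coefficient unchanged for a monotone relationship, which is the kind of robustness to distributional differences (read counts versus fluorescence intensities) that motivates its use on RNA-Seq data.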
GIS-based Landing-Site Analysis and Passive Decision Support
NASA Astrophysics Data System (ADS)
van Gasselt, Stephan; Nass, Andrea
2016-04-01
The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies on surface processes and properties, in particular on objects such as Mars and the Moon, for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses are one major research aim to obtain ground truth, resource exploration or even the potential establishment of bases using autonomous platforms are others, and these require detailed investigation of settings in order to identify spots on the surface that are suitable for spacecraft to land and operate safely over a long period of time. What was done using hardcopy material in the past is today carried out using either in-house developments or off-the-shelf spatial information system technology, which allows users to manage, integrate and analyse data as well as visualize results and create user-defined reports for assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for integration of data and for the evaluation of results from individual tasks in order to support decisions for landing-site selection.
Ma, Chuang; Wang, Xiangfeng
2012-01-01
One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey’s biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses. PMID:22797655
Star 48 solid rocket motor nozzle analyses and instrumented firings
NASA Technical Reports Server (NTRS)
Porter, R. L.
1986-01-01
The analyses and testing performed by NASA in support of an expanded and improved nozzle design database for use by the U.S. solid rocket motor industry are presented. A production nozzle with a history of one ground failure and two flight failures was selected for analysis and testing. The stress analysis was performed with the Champion computer code developed by the U.S. Navy, to which several improvements were made. Strain predictions were made and compared to test data. Two short-duration motor firings were conducted with highly instrumented nozzles. The first nozzle had 58 thermocouples, 66 strain gages, and 8 bondline pressure measurements. The second nozzle had 59 thermocouples, 68 strain measurements, and 8 bondline pressure measurements. Most of this instrumentation was on the nonmetallic parts and provided significantly more thermal and strain data on the nonmetallic components of a nozzle than had been accumulated in any solid rocket motor test to date.
NASA Astrophysics Data System (ADS)
Farhat, I. A. H.; Gale, E.; Alpha, C.; Isakovic, A. F.
2017-07-01
Optimizing the energy performance of Magnetic Tunnel Junctions (MTJs) is key to embedding Spin Transfer Torque-Random Access Memory (STT-RAM) in low power circuits. Because of the complex interdependencies among the parameters and variables governing the device operating energy, it is important to identify the parameters that most effectively control MTJ power. The impact of the threshold current density, J_c0, on the energy, and the impact of H_K on J_c0, are studied analytically, following the expressions that stem from the Landau-Lifshitz-Gilbert-Slonczewski (LLGS-STT) model. In addition, the impact of other magnetic material parameters, such as M_s, and geometric parameters, such as t_free and λ, is discussed. A device modelling study was conducted to analyse the impact at the circuit level. Nano-magnetism simulation based on the NMAG package was conducted to analyse the impact of controlling H_K on the switching dynamics of the film.
NASA Astrophysics Data System (ADS)
Dong, Li; Wang, Jun-Xi; Li, Qing-Yang; Dong, Hai-Kuan; Xiu, Xiao-Ming; Gao, Ya-Jun
2016-07-01
Employing a polarization-entangled χ state, which is a four-photon genuine entangled state, we propose a protocol teleporting a general two-photon polarization state. First, the sender performs one Controlled-NOT gate, one Hadamard gate, and one Controlled-NOT gate, in succession, on the state to be teleported. Second, the sender performs local nondemolition parity analyses based on cross-Kerr nonlinearities and publicizes the outcomes. Finally, conditioned on the sender's analysis outcomes, the receiver executes single-photon unitary transformation operations on his own photons to recover the state originally held at the sender's location. Because nondemolition parity analyses are employed rather than a four-qubit joint measurement, the protocol can be realized more feasibly with currently available technologies. Moreover, the Bell-state resources can be recovered because the measurement is nondestructive, which facilitates other potential tasks of quantum information processing.
Dynamic behaviour of a rolling tyre: Experimental and numerical analyses
NASA Astrophysics Data System (ADS)
Gonzalez Diaz, Cristobal; Kindt, Peter; Middelberg, Jason; Vercammen, Stijn; Thiry, Christophe; Close, Roland; Leyssens, Jan
2016-03-01
Based on the results of experimental and numerical analyses, the effect of rotation on the tyre dynamic behaviour is investigated. Better understanding of these effects will further improve the ability to control and optimize the noise and vibrations that result from the interaction between the road surface and the rolling tyre. A deeper understanding of the complex tyre dynamic properties will therefore contribute to tyre design strategies that lower tyre/road noise while affecting other tyre performance characteristics less. The presented work was performed in the framework of the European industry-academia project TIRE-DYN, with partners Goodyear, Katholieke Universiteit Leuven and LMS International. The effect of rotation on the tyre dynamic behaviour is quantified for different operating conditions of the tyre, such as load, air pressure and rotation speed.
Cognitive context detection in UAS operators using eye-gaze patterns on computer screens
NASA Astrophysics Data System (ADS)
Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph
2016-05-01
In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial system (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment in which twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
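The screen-into-cells computation described above can be sketched as follows; the grid size, screen dimensions, and the dwell-fraction metric are illustrative assumptions, not the authors' exact metrics.

```python
import numpy as np

def cell_dwell_fractions(gaze_xy, screen_wh, grid=(4, 4)):
    """Bin gaze samples into a grid of screen cells and return the
    fraction of samples (dwell share) falling in each cell."""
    gx, gy = grid
    x = np.clip((gaze_xy[:, 0] / screen_wh[0] * gx).astype(int), 0, gx - 1)
    y = np.clip((gaze_xy[:, 1] / screen_wh[1] * gy).astype(int), 0, gy - 1)
    counts = np.zeros((gy, gx))
    np.add.at(counts, (y, x), 1)   # accumulate repeated hits on a cell
    return counts / len(gaze_xy)
```

Per-cell dwell fractions like these (or transition counts between cells) can then serve as feature vectors for the cognitive-context classifiers the paper compares.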
NASA Astrophysics Data System (ADS)
Marisarla, Soujanya; Ghia, Urmila; Ghia, Kirti "Karman"
2002-11-01
Towards a comprehensive aeroelastic analysis of a joined wing, fluid dynamics and structural analyses are initially performed separately. Steady flow calculations are currently performed using the 3-D compressible Navier-Stokes equations. Flow analysis of the ONERA M6 wing served to validate the software for the fluid dynamics analysis. The complex flow field of the joined wing is analyzed and the prevailing fluid dynamic forces are computed using the COBALT software. Currently, these forces are being transferred as fluid loads onto the structure. For the structural analysis, several test cases were run treating the wing as a cantilever beam; these served as validation cases. A nonlinear structural analysis of the wing is being performed using the ANSYS software to predict the deflections and stresses on the joined wing. Issues related to modeling and selecting an appropriate mesh for the structure were addressed by first performing a linear analysis. The frequencies and mode shapes of the deformed wing are obtained from modal analysis. Both static and dynamic analyses are carried out, and the results obtained are carefully analyzed. Loose coupling between the fluid and structural analyses is currently being examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Po-E; Lo, Chien-Chi; Anderson, Joseph J.
Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. As a result, this bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research.
Li, Po-E; Lo, Chien-Chi; Anderson, Joseph J.; Davenport, Karen W.; Bishop-Lilly, Kimberly A.; Xu, Yan; Ahmed, Sanaa; Feng, Shihai; Mokashi, Vishwesh P.; Chain, Patrick S.G.
2017-01-01
Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. This bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research. PMID:27899609
Li, Po-E; Lo, Chien-Chi; Anderson, Joseph J.; ...
2016-11-24
Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. As a result, this bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research.
40 CFR 63.7515 - When must I conduct subsequent performance tests, fuel analyses, or tune-ups?
Code of Federal Regulations, 2014 CFR
2014-07-01
... performance tests, fuel analyses, or tune-ups? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL..., fuel analyses, or tune-ups? (a) You must conduct all applicable performance tests according to § 63... performance tests and the associated fuel analyses within 60 days after the completion of the performance...
40 CFR 63.7515 - When must I conduct subsequent performance tests, fuel analyses, or tune-ups?
Code of Federal Regulations, 2013 CFR
2013-07-01
... performance tests, fuel analyses, or tune-ups? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL..., fuel analyses, or tune-ups? (a) You must conduct all applicable performance tests according to § 63... performance tests and the associated fuel analyses within 60 days after the completion of the performance...
Flash-flood early warning using weather radar data: from nowcasting to forecasting
NASA Astrophysics Data System (ADS)
Liechti, Katharina; Panziera, Luca; Germann, Urs; Zappa, Massimiliano
2013-04-01
In our study we explore the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 hours between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic forcing.
Flash-flood early warning using weather radar data: from nowcasting to forecasting
NASA Astrophysics Data System (ADS)
Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.
2013-01-01
This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.
Search for B_s^0 oscillations using inclusive lepton events
NASA Astrophysics Data System (ADS)
ALEPH Collaboration; Barate, R.; et al.
1999-03-01
A search for B_s^0 oscillations is performed using a sample of semileptonic b-hadron decays collected by the ALEPH experiment during 1991-95. Compared to previous inclusive lepton analyses, the proper time resolution and b-flavour mistag rate are significantly improved. Additional sensitivity to B_s^0 mixing is obtained by identifying subsamples of events having a B_s^0 purity which is higher than the average for the whole data sample. Unbinned maximum likelihood amplitude fits are performed to derive a lower limit of Δm_s > 9.5 ps⁻¹ at the 95% confidence level (95% CL). Combining with the ALEPH D_s-based analyses yields Δm_s > 9.6 ps⁻¹ at 95% CL.
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:24799088
Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao
2014-12-01
Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
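As a toy illustration of the unsupervised/supervised distinction drawn in this review (the review's own software examples target R packages), here is a minimal Python sketch of one algorithm of each kind; the data, function names, and initialization strategy are invented for illustration.

```python
import numpy as np

def kmeans(X, k=2, iters=20):
    """Unsupervised: Lloyd's k-means with farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(centers)) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d)])      # next center: farthest point
    centers = np.array(centers, float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def nearest_centroid_predict(X_train, y_train, X_test):
    """Supervised: label each test point by the closest class centroid."""
    classes = np.unique(y_train)
    cents = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    return classes[np.argmin(((X_test[:, None] - cents) ** 2).sum(-1), axis=1)]

# Two well-separated synthetic clusters standing in for expression profiles.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))
X[10:] += 8.0
labels, _ = kmeans(X, 2)                       # no labels used: unsupervised
y = np.array([0] * 10 + [1] * 10)
pred = nearest_centroid_predict(X, y, np.array([[0.0, 0.0], [8.0, 8.0]]))
```

The contrast is the one the review draws: k-means discovers structure without labels, while the centroid classifier requires labeled training data before it can predict.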
Shearer, Jane; Graham, Terry E
2014-10-01
This review documents two opposing effects of caffeine and caffeine-containing energy drinks, i.e., their positive effects on athletic performance and their negative impacts on glucose tolerance in the sedentary state. Analysis of studies examining caffeine administration prior to performance-based exercise showed caffeine improved completion time by 3.6%. Similar analyses following consumption of caffeine-containing energy drinks yielded positive, but more varied, benefits, which were likely due to the diverse nature of the studies performed, the highly variable composition of the beverages consumed, and the range of caffeine doses administered. Conversely, analyses of studies administering caffeine prior to either an oral glucose tolerance test or insulin clamp showed a decline in whole-body glucose disposal of ~30%. The consequences of this resistance are unknown, but there may be implications for the development of a number of chronic diseases. Both caffeine-induced performance enhancement and insulin resistance converge with the primary actions of caffeine on skeletal muscle. © 2014 International Life Sciences Institute.
Performance deterioration based on in-service engine data: JT9D jet engine diagnostics program
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1979-01-01
Results of analyses of engine performance deterioration trends and levels with respect to service usage are presented. Thirty-two JT9D-7A engines were selected for this purpose. The selection of this engine fleet provided the opportunity to obtain engine performance data from before the first flight through initial service, so that the trends and levels of both short- and long-term deterioration could be more carefully defined. The performance data collected and analyzed included in-flight, on-wing (ground), and test stand prerepair and postrepair performance calibrations with expanded instrumentation where feasible. The results of the analyses of these data were used to: (1) close gaps in previously obtained historical data as well as augment the historical data with more carefully obtained data; (2) refine preliminary models of performance deterioration with respect to usage; (3) establish an understanding of the relationships between ground and altitude performance deterioration trends; (4) refine preliminary recommendations concerning means to reduce and control deterioration; and (5) identify areas where additional effort is required to develop an understanding of complex deterioration issues.
Single-cell analysis of population context advances RNAi screening at multiple levels
Snijder, Berend; Sacher, Raphael; Rämö, Pauli; Liberali, Prisca; Mench, Karin; Wolfrum, Nina; Burleigh, Laura; Scott, Cameron C; Verheije, Monique H; Mercer, Jason; Moese, Stefan; Heger, Thomas; Theusner, Kristina; Jurgeit, Andreas; Lamparter, David; Balistreri, Giuseppe; Schelhaas, Mario; De Haan, Cornelis A M; Marjomäki, Varpu; Hyypiä, Timo; Rottier, Peter J M; Sodeik, Beate; Marsh, Mark; Gruenberg, Jean; Amara, Ali; Greber, Urs; Helenius, Ari; Pelkmans, Lucas
2012-01-01
Isogenic cells in culture show strong variability, which arises from dynamic adaptations to the microenvironment of individual cells. Here we study the influence of the cell population context, which determines a single cell's microenvironment, in image-based RNAi screens. We developed a comprehensive computational approach that employs Bayesian and multivariate methods at the single-cell level. We applied these methods to 45 RNA interference screens of various sizes, including 7 druggable genome and 2 genome-wide screens, analysing 17 different mammalian virus infections and four related cell physiological processes. Analysing cell-based screens at this depth reveals widespread RNAi-induced changes in the population context of individual cells leading to indirect RNAi effects, as well as perturbations of cell-to-cell variability regulators. We find that accounting for indirect effects improves the consistency between siRNAs targeted against the same gene, and between replicate RNAi screens performed in different cell lines, in different labs, and with different siRNA libraries. In an era where large-scale RNAi screens are increasingly performed to reach a systems-level understanding of cellular processes, we show that this is often improved by analyses that account for and incorporate the single-cell microenvironment. PMID:22531119
Constitutive modeling for isotropic materials (HOST)
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.; Cassenti, B. N.
1985-01-01
This report presents the results of the second year of work on a problem which is part of the NASA HOST Program. Its goals are: (1) to develop and validate unified constitutive models for isotropic materials, and (2) to demonstrate their usefulness for structural analyses of hot section components of gas turbine engines. The unified models selected for development and evaluation are those of Bodner-Partom and Walker. For model evaluation purposes, a large constitutive data base is generated for a B1900 + Hf alloy by performing uniaxial tensile, creep, cyclic, stress relaxation, and thermomechanical fatigue (TMF) tests as well as biaxial (tension/torsion) tests under proportional and nonproportional loading over a wide range of strain rates and temperatures. Systematic approaches for evaluating material constants from a small subset of the data base are developed. Correlations of the uniaxial and biaxial test data with the theories of Bodner-Partom and Walker are performed to establish the accuracy, range of applicability, and integrability of the models. Both models are implemented in the MARC finite element computer code and used for TMF analyses. Benchmark notch round experiments are conducted and the results compared with finite-element analyses using the MARC code and the Walker model.
Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E
2015-02-01
Development of reference intervals is difficult, time consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned, based on the method of Harris and Boyd, into three scenarios: combined gender; males and females; and age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.
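The nonparametric central-95% convention behind such reference intervals can be sketched as follows. This is an illustrative sketch only: the study's Harris and Boyd partitioning and its outlier handling are not reproduced, and the function name is ours.

```python
import numpy as np

def reference_interval(values):
    """Nonparametric reference interval: the central 95% of values,
    i.e. the 2.5th to 97.5th percentiles.

    Illustrative only: the Aussie Normals study additionally removed
    unsuitable results and partitioned by gender and age, which this
    sketch does not attempt to reproduce.
    """
    lo, hi = np.percentile(np.asarray(values, dtype=float), [2.5, 97.5])
    return lo, hi

# Example: interval for a synthetic analyte distribution
lo, hi = reference_interval(np.random.default_rng(0).normal(5.0, 1.0, 2000))
```

In practice each partition (e.g. males aged 18-44) would get its own interval computed this way from its own subset of volunteers.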
Simulation-based comprehensive benchmarking of RNA-seq aligners
Baruzzo, Giacomo; Hayer, Katharina E; Kim, Eun Ji; Di Camillo, Barbara; FitzGerald, Garret A; Grant, Gregory R
2018-01-01
Alignment is the first step in most RNA-seq analysis pipelines, and the accuracy of downstream analyses depends heavily on it. Unlike most steps in the pipeline, alignment is particularly amenable to benchmarking with simulated data. We performed a comprehensive benchmarking of 14 common splice-aware aligners for base, read, and exon junction-level accuracy and compared default with optimized parameters. We found that performance varied by genome complexity, and accuracy and popularity were poorly correlated. The most widely cited tool underperforms for most metrics, particularly when using default settings. PMID:27941783
Radiological performance assessment for the E-Area Vaults Disposal Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, J.R.
This report is the first revision to "Radiological Performance Assessment for the E-Area Vaults Disposal Facility, Revision 0", which was issued in April 1994 and received conditional DOE approval in September 1994. The title of this report has been changed to conform to the current name of the facility. The revision incorporates improved groundwater modeling methodology, which includes a large data base of site-specific geotechnical data, and special analyses on disposal of cement-based wasteforms and naval wastes, issued after publication of Revision 0.
BRepertoire: a user-friendly web server for analysing antibody repertoire data.
Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca
2018-04-14
Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed via a Shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.
Elastic-plastic finite-element analyses of thermally cycled double-edge wedge specimens
NASA Technical Reports Server (NTRS)
Kaufman, A.; Hunt, L. E.
1982-01-01
Elastic-plastic stress-strain analyses were performed for double-edge wedge specimens subjected to thermal cycling in fluidized beds at 316 and 1088 C. Four cases involving different nickel-base alloys (IN 100, Mar M-200, NASA TAZ-8A, and Rene 80) were analyzed by using the MARC nonlinear, finite element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions obtained by using the NASTRAN and ISO3DQ computer programs. Equivalent total strain ranges at the critical locations calculated by elastic analyses agreed within 3 percent with those calculated from elastic-plastic analyses. The elastic analyses always resulted in compressive mean stresses at the critical locations. However, elastic-plastic analyses showed tensile mean stresses for two of the four alloys and an increase in the compressive mean stress for the highest plastic strain case.
Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB
NASA Astrophysics Data System (ADS)
Schweitzer, Caroline; Stein, Karin
2014-10-01
The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including:
- optional waveband
- various orbit heights and viewing angles
- system design characteristics, e.g. pixel size and filter transmission
- atmospheric effects, e.g. different cloud types, climate zones and seasons
In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (launch of a long-range ballistic missile (BM)).
Miró-Abella, Eugènia; Herrero, Pol; Canela, Núria; Arola, Lluís; Borrull, Francesc; Ras, Rosa; Fontanals, Núria
2017-08-15
A method was developed for the simultaneous determination of 11 mycotoxins in plant-based beverage matrices, using a QuEChERS extraction followed by ultra-high performance liquid chromatography coupled to tandem mass spectrometry detection (UHPLC-(ESI)MS/MS). This multi-mycotoxin method was applied to analyse plant-based beverages such as soy, oat and rice. QuEChERS extraction was applied, obtaining suitable extraction recoveries between 80 and 91%, and good repeatability and reproducibility values. Method quantification limits were between 0.05 μg L-1 (for aflatoxin G1 and aflatoxin B1) and 15 μg L-1 (for deoxynivalenol and fumonisin B2). This is the first time that plant-based beverages have been analysed, and certain mycotoxins, such as deoxynivalenol, aflatoxin B1, aflatoxin B2, aflatoxin G1, aflatoxin G2, ochratoxin A, T-2 toxin and zearalenone, were found in the analysed samples, some of them quantified between 0.1 μg L-1 and 19 μg L-1. Copyright © 2017 Elsevier Ltd. All rights reserved.
Alagoz, Baris Baykant; Deniz, Furkan Nur; Keles, Cemal; Tan, Nusret
2015-03-01
This study investigates the disturbance rejection capacity of closed loop control systems by means of the reference to disturbance ratio (RDR). The RDR analysis calculates the ratio of reference signal energy to disturbance signal energy at the system output and provides a quantitative evaluation of the disturbance rejection performance of control systems on the basis of communication channel limitations. Essentially, RDR provides a straightforward analytical method for the comparison and improvement of the implicit disturbance rejection capacity of closed loop control systems. Theoretical analyses demonstrate that the RDR of negative feedback closed loop control systems is determined by the energy spectral density of the controller transfer function. On this basis, the authors derive design criteria for specifying the disturbance rejection performance of PID and fractional order PID (FOPID) controller structures. RDR spectra are calculated to investigate the frequency dependence of disturbance rejection capacity, and spectral RDR analyses are carried out for PID and FOPID controllers. Simulation examples are presented to validate the theoretical results. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
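Taking the abstract's statement that RDR is determined by the energy spectral density of the controller transfer function, a minimal numerical sketch for a PID controller C(s) = kp + ki/s + kd*s might look like this. The identification RDR(w) = |C(jw)|^2 is our simplifying assumption for illustration; see the paper for the actual derivation.

```python
import numpy as np

def pid_rdr(kp, ki, kd, w):
    """RDR spectrum of a PID controller C(s) = kp + ki/s + kd*s.

    Assumption for illustration: RDR is taken to be the controller's
    energy spectral density, RDR(w) = |C(jw)|^2 = kp^2 + (kd*w - ki/w)^2.
    """
    C = kp + ki / (1j * w) + kd * (1j * w)
    return np.abs(C) ** 2

# RDR over a log-spaced frequency grid for a sample PID tuning
w = np.logspace(-2, 2, 200)
spectrum = pid_rdr(1.0, 0.5, 0.1, w)
```

Under this reading, a larger |C(jw)|^2 at a given frequency means the loop suppresses disturbances at that frequency more strongly relative to the reference.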
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
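The permutation-resampling scheme that permGPU accelerates is, at its core, embarrassingly parallel: each permutation is an independent recomputation of a test statistic. A plain CPU sketch for a two-sample mean-difference statistic follows (illustrative only; permGPU's actual test statistics and GPU kernels are not reproduced here).

```python
import numpy as np

def perm_pvalue(x, y, n_perm=1000, rng=None):
    """Two-sample permutation test on the absolute difference in means.

    Each iteration relabels the pooled sample at random and recomputes
    the statistic; iterations are independent, which is what makes the
    scheme 'embarrassingly parallel' and amenable to GPU acceleration.
    """
    rng = rng or np.random.default_rng()
    obs = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    n = len(x)
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(pooled)
        hits += abs(p[:n].mean() - p[n:].mean()) >= obs
    # add-one smoothing keeps the p-value estimate strictly positive
    return (hits + 1) / (n_perm + 1)
```

In a microarray setting this loop would run once per probe, which is why scalable implementations matter.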
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard, M.A.; Sommer, S.C.
1995-04-01
AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.
Research on axial thrust of the waterjet pump based on CFD under cavitation conditions
NASA Astrophysics Data System (ADS)
Shen, Z. H.; Pan, Z. Y.
2015-01-01
Based on the RANS equations, the performance of a contra-rotating axial-flow waterjet pump under non-cavitating conditions was obtained using the shear stress transport turbulence model. Its cavitating hydrodynamic performance was then calculated and analysed with a mixture homogeneous-flow cavitation model based on the Rayleigh-Plesset equations. The results show that cavitation causes the axial thrust of the waterjet pump to drop, and that the axial-thrust and head cavitation characteristic curves are similar. However, the drop point of the axial thrust is delayed by 5.1% relative to that of the head, and its critical point by 2.6%.
Crystal Growth of ZnSe and Related Ternary Compound Semiconductors by Vapor Transport
NASA Technical Reports Server (NTRS)
Su, Ching-Hua; Brebrick, R. F.; Burger, A.; Dudley, M.; Matyi, R.; Ramachandran, N.; Sha, Yi-Gao; Volz, M.; Shih, Hung-Dah
1999-01-01
Complete and systematic ground-based experimental and theoretical analyses on the Physical Vapor Transport (PVT) of ZnSe and related ternary compound semiconductors have been performed. The analyses included thermodynamics, mass flux, heat treatment of starting material, crystal growth, partial pressure measurements, optical interferometry, chemical analyses, photoluminescence, microscopy, x-ray diffraction and topography as well as theoretical, analytical and numerical analyses. The experimental results showed the influence of gravity orientation on the characteristics of: (1) the morphology of the as-grown crystals as well as the as-grown surface morphology of ZnSe and Cr doped ZnSe crystals; (2) the distribution of impurities and defects in ZnSe grown crystals; and (3) the axial segregation in ZnSeTe grown crystals.
Systematic review with meta-analysis: the effects of rifaximin in hepatic encephalopathy.
Kimer, N; Krag, A; Møller, S; Bendtsen, F; Gluud, L L
2014-07-01
Rifaximin is recommended for prevention of hepatic encephalopathy (HE). The effects of rifaximin on overt and minimal HE are debated. To perform a systematic review and meta-analysis of randomised controlled trials (RCTs) on rifaximin for HE. We performed electronic and manual searches, gathered information from the U.S. Food and Drug Administration Home Page, and obtained unpublished information on trial design and outcome measures from authors and pharmaceutical companies. Meta-analyses were performed and results presented as risk ratios (RR) with 95% confidence intervals (CI) and the number needed to treat. Subgroup, sensitivity, regression and sequential analyses were performed to evaluate the risk of bias and sources of heterogeneity. We included 19 RCTs with 1370 patients. Outcomes were recalculated based on unpublished information of 11 trials. Overall, rifaximin had a beneficial effect on secondary prevention of HE (RR: 1.32; 95% CI 1.06-1.65), but not in a sensitivity analysis on rifaximin after TIPSS (RR: 1.27; 95% CI 1.00-1.53). Rifaximin increased the proportion of patients who recovered from HE (RR: 0.59; 95% CI: 0.46-0.76) and reduced mortality (RR: 0.68, 95% CI 0.48-0.97). The results were robust to adjustments for bias control. No small study effects were identified. The sequential analyses only confirmed the results of the analysis on HE recovery. Rifaximin has a beneficial effect on hepatic encephalopathy and may reduce mortality. The combined evidence suggests that rifaximin may be considered in the evidence-based management of hepatic encephalopathy. © 2014 John Wiley & Sons Ltd.
2014-01-01
This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question of how effectively certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926
NASA Astrophysics Data System (ADS)
Zafar, I.; Edirisinghe, E. A.; Acar, S.; Bez, H. E.
2007-02-01
Automatic vehicle Make and Model Recognition (MMR) systems provide useful performance enhancements to vehicle recognition systems that are solely based on Automatic License Plate Recognition (ALPR). Several car MMR systems have been proposed in the literature. However, these approaches are based on feature detection algorithms that can perform sub-optimally under adverse lighting and/or occlusion conditions. In this paper we propose a real-time, appearance-based car MMR approach using Two-Dimensional Linear Discriminant Analysis (2D-LDA) that is capable of addressing this limitation. We provide experimental results to analyse the proposed algorithm's robustness under varying illumination and occlusion conditions. We show that the best performance with the proposed 2D-LDA based car MMR approach is obtained when the eigenvectors of lower significance are ignored. For the given database of 200 car images of 25 different make-model classifications, a best accuracy of 91% was obtained with the 2D-LDA approach. We use a direct Principal Component Analysis (PCA) based approach as a benchmark to compare and contrast the performance of the proposed 2D-LDA approach to car MMR. We conclude that in general the 2D-LDA based algorithm surpasses the performance of the PCA based approach.
3D engineered fiberboard : finite element analysis of a new building product
John F. Hunt
2004-01-01
This paper presents finite element analyses that are being used to analyze and estimate the structural performance of a new product called 3D engineered fiberboard in bending and flat-wise compression applications. A 3x3x2 split-plot experimental design was used to vary geometry configurations to determine their effect on performance properties. The models are based on...
ERIC Educational Resources Information Center
Williams, Gareth
2017-01-01
The aim of this paper was to analyse how teachers and government may differ in their views regarding the qualities required to be an effective middle manager with responsibility for Physical Education (PE). Lines of inquiry were based upon the practices associated with a performative work culture and how this has affected teacher language. There…
Performance of Koyna dam based on static and dynamic analysis
NASA Astrophysics Data System (ADS)
Azizan, Nik Zainab Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar
2017-10-01
This paper discusses the performance of Koyna dam based on static pushover analysis (SPO) and incremental dynamic analysis (IDA). The SPO in this study considered two types of lateral load, namely inertial load and hydrodynamic load. The structure was analysed until damage appeared on the dam body. The IDA curves were developed based on 7 ground motions with the following characteristics: (i) the distance from the epicenter is less than 15 km, (ii) the magnitude is equal to or greater than 5.5 and (iii) the PGA is equal to or greater than 0.15 g. All the ground motions were converted to response spectra and scaled to the developed elastic response spectrum in order to match the characteristics of the ground motions to the soil type. The elastic response spectrum was developed for soil type B using Eurocode 8. The SPO and IDA methods make it possible to determine the limit states of the dam. The limit states proposed in this study are the yielding and ultimate states, identified based on the crack pattern formed on the structural model. The maximum crest displacements from both methods were compared to define the limit states of the dam. The yielding-state displacement of Koyna dam is 23.84 mm and the ultimate-state displacement is 44.91 mm. The results can be used as a guideline for monitoring Koyna dam under seismic loading, considering both static and dynamic analyses.
Responding to nonwords in the lexical decision task: Insights from the English Lexicon Project.
Yap, Melvin J; Sibley, Daragh E; Balota, David A; Ratcliff, Roger; Rueckl, Jay
2015-05-01
Researchers have extensively documented how various statistical properties of words (e.g., word frequency) influence lexical processing. However, the impact of lexical variables on nonword decision-making performance is less clear. This gap is surprising, because a better specification of the mechanisms driving nonword responses may provide valuable insights into early lexical processes. In the present study, item-level and participant-level analyses were conducted on the trial-level lexical decision data for almost 37,000 nonwords in the English Lexicon Project in order to identify the influence of different psycholinguistic variables on nonword lexical decision performance and to explore individual differences in how participants respond to nonwords. Item-level regression analyses reveal that nonword response time was positively correlated with number of letters, number of orthographic neighbors, number of affixes, and base-word number of syllables, and negatively correlated with Levenshtein orthographic distance and base-word frequency. Participant-level analyses also point to within- and between-session stability in nonword responses across distinct sets of items, and intriguingly reveal that higher vocabulary knowledge is associated with less sensitivity to some dimensions (e.g., number of letters) but more sensitivity to others (e.g., base-word frequency). The present findings provide well-specified and interesting new constraints for informing models of word recognition and lexical decision. (c) 2015 APA, all rights reserved.
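Levenshtein orthographic distance, one of the predictors in the item-level regressions, is the minimum number of single-character insertions, deletions and substitutions separating two strings. A minimal sketch (our own helper, not the study's code):

```python
def levenshtein(a, b):
    """Minimum number of single-character insertions, deletions and
    substitutions turning string a into string b (standard dynamic
    programming over a rolling row of the edit matrix)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]
```

A nonword one edit away from a real word, e.g. levenshtein("flirp", "flirt") == 1, scores low on this distance, the kind of word-likeness the regressions relate to slower rejection times.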
Ramachandran, Andimuthu; Radhapriya, Parthasarathy; Jayakumar, Shanmuganathan; Dhanya, Praveen; Geetha, Rajadurai
2016-01-01
India has one of the largest assemblages of tropical biodiversity, with its unique floristic composition of endemic species. However, current forest cover assessment is performed via satellite-based forest surveys, which have many limitations. The present study, which was performed in the Eastern Ghats, analysed the satellite-based inventory provided by forest surveys and inferred from the results that this process no longer provides adequate information for quantifying forest degradation in an empirical manner. The study analysed 21 soil properties and generated a forest soil quality index of the Eastern Ghats, using principal component analysis. Using matrix modules and geospatial technology, we compared the forest degradation status calculated from satellite-based forest surveys with the degradation status calculated from the forest soil quality index. The Forest Survey of India classified about 1.8% of the Eastern Ghats’ total area as degraded forests and the remainder (98.2%) as open, dense, and very dense forests, whereas the soil quality index results found that about 42.4% of the total area is degraded, with the remainder (57.6%) being non-degraded. Our ground truth verification analyses indicate that the forest soil quality index along with the forest cover density data from the Forest Survey of India are ideal tools for evaluating forest degradation. PMID:26812397
Empirical performance of interpolation techniques in risk-neutral density (RND) estimation
NASA Astrophysics Data System (ADS)
Bahaludin, H.; Abdullah, M. H.
2017-03-01
The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. First, the empirical performance is evaluated using statistical analysis based on the implied mean and the implied variance of the RND. Second, interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The LOOCV pricing-error results show that fourth-order polynomial interpolation provides the best fit to option prices, yielding the lowest error.
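The LOOCV pricing error used for interpolation selection can be sketched as: refit the interpolant with one observation held out, price that observation, and average the squared errors. A hedged illustration with simple polynomial fitting; the study's actual fitting targets and data handling are assumed away here.

```python
import numpy as np

def loocv_error(strikes, prices, degree):
    """Mean squared leave-one-out pricing error for polynomial fitting.

    Illustrative sketch: each observation is held out in turn, the
    polynomial is refit on the remaining points, and the held-out
    option is repriced. A lower score favours that interpolation choice.
    """
    errs = []
    for i in range(len(strikes)):
        mask = np.arange(len(strikes)) != i
        coef = np.polyfit(strikes[mask], prices[mask], degree)
        errs.append((np.polyval(coef, strikes[i]) - prices[i]) ** 2)
    return float(np.mean(errs))
```

Comparing this score across candidate interpolants (e.g. second- versus fourth-order polynomials) mirrors the selection criterion the abstract proposes.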
Inertial Sensor Based Analysis of Lie-to-Stand Transfers in Younger and Older Adults
Schwickert, Lars; Boos, Ronald; Klenk, Jochen; Bourke, Alan; Becker, Clemens; Zijlstra, Wiebren
2016-01-01
Many older adults lack the capacity to stand up again after a fall. Therefore, to analyse falls it is relevant to understand recovery patterns, including successful and failed attempts to get up from the floor in general. This study analysed different kinematic features of standing up from the floor. We used inertial sensors to describe the kinematics of lie-to-stand transfer patterns of younger and healthy older adults. Fourteen younger (20–50 years of age, 50% men) and 10 healthy older community dwellers (≥60 years; 50% men) conducted four lie-to-stand transfers from different initial lying postures. The analysed temporal, kinematic, and elliptic fitting complexity measures of transfer performance were significantly different between younger and older subjects (i.e., transfer duration, angular velocity (RMS), maximum vertical acceleration, maximum vertical velocity, smoothness, fluency, ellipse width, angle between ellipses). These results show the feasibility and potential of analysing kinematic features to describe the lie-to-stand transfer performance, to help design interventions and detection approaches to prevent long lies after falls. It is possible to describe age-related differences in lie-to-stand transfer performance using inertial sensors. The kinematic analysis remains to be tested on patterns after real-world falls. PMID:27529249
Geoengineering properties of potential repository units at Yucca Mountain, southern Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tillerson, J.R.; Nimick, F.B.
1984-12-01
The Nevada Nuclear Waste Storage Investigations (NNWSI) Project is currently evaluating volcanic tuffs at the Yucca Mountain site, located on and adjacent to the Nevada Test Site, for possible use as a host rock for a radioactive waste repository. The behavior of tuff as an engineering material must be understood to design, license, construct, and operate a repository. Geoengineering evaluations and measurements are being made to develop confidence in both the analysis techniques for thermal, mechanical, and hydrothermal effects and the supporting data base of rock properties. The analysis techniques and the data base are currently used for repository design, waste package design, and performance assessment analyses. This report documents the data base of geoengineering properties used in the analyses that aided the selection of the waste emplacement horizon and in analyses synopsized in the Environmental Assessment Report prepared for the Yucca Mountain site. The strategy used for the development of the data base relies primarily on data obtained in laboratory tests that are then confirmed in field tests. Average thermal and mechanical properties (and their anticipated variations) are presented. Based upon these data, analyses completed to date, and previous excavation experience in tuff, it is anticipated that existing mining technology can be used to develop stable underground openings and that repository operations can be carried out safely.
ERIC Educational Resources Information Center
Hart, Sara A.; Petrill, Stephen A.; Thompson, Lee A.
2010-01-01
The present study examined the phenotypic and genetic relationship between fluency and non-fluency-based measures of reading and mathematics performance. Participants were drawn from the Western Reserve Reading and Math Project, an ongoing longitudinal twin project of same-sex MZ and DZ twins from Ohio. The present analyses are based on…
The Reliability and Sources of Error of Using Rubrics-Based Assessment for Student Projects
ERIC Educational Resources Information Center
Menéndez-Varela, José-Luis; Gregori-Giralt, Eva
2018-01-01
Rubrics are widely used in higher education to assess performance in project-based learning environments. To date, the sources of error that may affect their reliability have not been studied in depth. Using generalisability theory as its starting-point, this article analyses the influence of the assessors and the criteria of the rubrics on the…
Thermal-structural analyses of Space Shuttle Main Engine (SSME) hot section components
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Thompson, Robert L.
1988-01-01
Three-dimensional nonlinear finite element heat transfer and structural analyses were performed for the first-stage high pressure fuel turbopump (HPFTP) blade of the Space Shuttle Main Engine (SSME). Directionally solidified (DS) MAR-M 246 and single crystal (SC) PWA-1480 material properties were used for the analyses. Analytical conditions were based on a typical test stand engine cycle. Blade temperature and stress-strain histories were calculated by using the MARC finite element computer code. The structural response of an SSME turbine blade was assessed and a greater understanding of blade damage mechanisms, convective cooling effects, and thermal-mechanical effects was gained.
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system.
The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small scale problems. These examples give encouraging results. Directions for further research are indicated.
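The variance-partitioning step, estimating how much performance-measure uncertainty a candidate set of runs could explain, can be sketched numerically. The linear response model and the binning estimator below are hypothetical illustrations under simplifying assumptions, not the report's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response model: performance measure Y depends on two
# uncertain inputs; suppose candidate runs can pin down only one of them.
x1 = rng.normal(size=100_000)
x2 = rng.normal(size=100_000)
y = 3.0 * x1 + 1.0 * x2            # illustrative linear response

def explained_fraction(y, x, bins=50):
    """Fraction of Var(Y) explainable by resolving x:
    Var(E[Y | x]) / Var(Y), estimated by quantile binning."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    var_cond_mean = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond_mean / y.var()

# Candidate runs resolving x1 would explain roughly 9/10 of the variance,
# so that candidate set would be selected for the next simulation batch.
f1 = explained_fraction(y, x1)
f2 = explained_fraction(y, x2)
```

The selection rule then simply prefers the candidate with the larger explainable component, mirroring the "largest potential reduction in uncertainty" criterion described above.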
Vander Lugt correlation of DNA sequence data
NASA Astrophysics Data System (ADS)
Christens-Barry, William A.; Hawk, James F.; Martin, James C.
1990-12-01
DNA, the molecule containing the genetic code of an organism, is a linear chain of subunits. It is the sequence of subunits, of which there are four kinds, that constitutes the unique blueprint of an individual. This sequence is the focus of a large number of analyses performed by an army of geneticists, biologists, and computer scientists. Most of these analyses entail searches for specific subsequences within the larger set of sequence data. Thus, most analyses are essentially pattern recognition or correlation tasks. Yet, there are special features to such analysis that influence the strategy and methods of an optical pattern recognition approach. While the serial processing employed in digital electronic computers remains the main engine of sequence analyses, there is no fundamental reason that more efficient parallel methods cannot be used. We describe an approach using optical pattern recognition (OPR) techniques based on matched spatial filtering. This allows parallel comparison of large blocks of sequence data. In this study we have simulated a Vander Lugt architecture implementing our approach. Searches for specific target sequence strings within a block of DNA sequence from the ColE1 plasmid are performed.
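The digital analogue of the matched-spatial-filter search can be sketched with FFT-based cross-correlation: one-hot-encode each base as a separate channel and sum the per-channel correlations, so a perfect match produces a peak equal to the target length. The sequences and encoding below are illustrative, not the plasmid data used in the study:

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA string as a 4-channel 0/1 array (one row per base)."""
    return np.array([[1.0 if c == b else 0.0 for c in seq] for b in BASES])

def correlate_search(block, target):
    """Locate `target` inside `block` by FFT cross-correlation, the
    digital analogue of a matched spatial filter: the correlation peak
    equals len(target) exactly at a perfect match position."""
    n = len(block) + len(target) - 1       # zero-pad to avoid wrap-around
    B, T = one_hot(block), one_hot(target)
    score = np.zeros(n)
    for ch in range(4):                    # sum per-base channel correlations
        score += np.real(np.fft.ifft(
            np.fft.fft(B[ch], n) * np.conj(np.fft.fft(T[ch], n))))
    return int(np.argmax(np.round(score))), float(np.max(score))

pos, peak = correlate_search("GGATACCGTTAGC", "ACCGT")   # match at index 4
```

In the optical system the same correlation is computed in parallel by the matched filter in the Fourier plane; the FFT here only simulates that behaviour serially.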
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.
Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-26
Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness.
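The subgroup comparisons above rest on pooling standardized mean differences across trials and contrasting the pooled values for aids with and without a feature. A minimal fixed-effect inverse-variance sketch follows; the SMD and variance values are hypothetical, not the review's data:

```python
import numpy as np

def pooled_smd(smds, variances):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences (SMDs) from independent trials."""
    w = 1.0 / np.asarray(variances, dtype=float)
    return float(np.sum(w * np.asarray(smds, dtype=float)) / np.sum(w))

# Hypothetical subgroup comparison: trials whose decision aid includes a
# given feature vs trials without it (SMD, variance pairs are illustrative).
with_feature = pooled_smd([0.70, 0.55, 0.52], [0.04, 0.05, 0.03])
without_feature = pooled_smd([0.25, 0.20, 0.30], [0.04, 0.06, 0.05])
```

A feature is then judged helpful when the pooled SMD of trials incorporating it exceeds that of the comparison subset, the logic behind contrasts such as "SMD 0.59 vs 0.23" in the text.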
Chen, Yi-Wen; Hunt, Michael A; Campbell, Kristin L; Peill, Kortni; Reid, W Darlene
2016-04-01
Many middle-aged and older persons have more than one chronic condition. Thus, it is important to synthesise the effectiveness of interventions across several comorbidities. The aim of this systematic review was to summarise current evidence regarding the effectiveness of Tai Chi in individuals with four common chronic conditions-cancer, osteoarthritis (OA), heart failure (HF) and chronic obstructive pulmonary disease (COPD). 4 databases (MEDLINE, EMBASE, CINAHL and SPORTDiscus) were searched for original articles. Two reviewers independently screened the titles and abstracts and then conducted full-text reviews, quality assessment and finally data abstraction. 33 studies met the inclusion criteria. Meta-analyses were performed on disease-specific symptoms, physiological outcomes and physical performance of each chronic condition. Subgroup analyses on disease-specific symptoms were conducted by categorising studies into subsets based on the type of comparison groups. Meta-analyses showed that Tai Chi improved or showed a tendency to improve physical performance outcomes, including 6-min walking distance (6MWD) and knee extensor strength, in most or all four chronic conditions. Tai Chi also improved disease-specific symptoms of pain and stiffness in OA. The results demonstrated a favourable effect or tendency of Tai Chi to improve physical performance and showed that this type of exercise could be performed by individuals with different chronic conditions, including COPD, HF and OA.
Welsher, Arthur; Rojas, David; Khan, Zain; VanderBeek, Laura; Kapralos, Bill; Grierson, Lawrence E M
2018-02-01
Research has revealed that individuals can improve technical skill performance by viewing demonstrations modeled by either expert or novice performers. These findings support the development of video-based observational practice communities that augment simulation-based skill education and connect geographically distributed learners. This study explores the experimental replicability of the observational learning effect when demonstrations are sampled from a community of distributed learners and serves as a context for understanding learner experiences within this type of training protocol. Participants from 3 distributed medical campuses engaged in a simulation-based learning study of the elliptical excision in which they completed a video-recorded performance before being assigned to 1 of 3 groups for a 2-week observational practice intervention. One group observed expert demonstrations, another observed novice demonstrations, and the third observed a combination of both. Participants returned for posttesting immediately and 1 month after the intervention. Participants also engaged in interviews regarding their perceptions of the usability and relevance of video-based observational practice to clinical education. Checklist (P < 0.0001) and global rating (P < 0.0001) measures indicate that participants, regardless of group assignment, improved after the intervention and after a 1-month retention period. Analyses revealed no significant differences between groups. Qualitative analyses indicate that participants perceived the observational practice platform to be usable, relevant, and potentially improved with enhanced feedback delivery. Video-based observational practice involving expert and/or novice demonstrations enhances simulation-based skill learning in a group of geographically distributed trainees. These findings support the use of Internet-mediated observational learning communities in distributed and simulation-based medical education contexts.
Wojtalik, Jessica A; Eack, Shaun M; Keshavan, Matcheri S
2013-01-10
The Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) is a key measure of social cognition in schizophrenia that has good psychometric properties and is recommended by the MATRICS committee. As a way to further investigate the validity of the MSCEIT, this study sought to examine the neurobiological correlates of MSCEIT performance in patients with early course schizophrenia. A total of 51 patients diagnosed with early course, stabilized schizophrenia or schizoaffective disorder completed structural magnetic resonance imaging (MRI) scans and the MSCEIT. Investigation of the associations between MSCEIT performance and gray matter morphology was examined by conducting voxel-based morphometry (VBM) analyses across hypothesized social-cognitive regions of interest using automated anatomical labeling in Statistical Parametric Mapping Software, version 5 (SPM5). All VBM analyses utilized general linear models examining gray matter density partitioned images, adjusting for demographic and illness-related confounds. VBM results were then followed up with confirmatory volumetric analyses. Patients with poorer overall and Facilitating, Understanding, and Managing Emotions subscale performances on the MSCEIT showed significantly reduced gray matter density in the left parahippocampal gyrus. Additionally, attenuated performance on the Facilitating and Managing Emotions subscales was significantly associated with reduced right posterior cingulate gray matter density. All associations observed between MSCEIT performance and gray matter density were supported with confirmatory gray matter volumetric analyses, with the exception of the association between the right posterior cingulate and the facilitation of emotions. These findings provide additional evidence for the MSCEIT as a valid social-cognitive measure by elucidating its correlates with neurobiological structures commonly implicated in emotion processing. 
These findings provide additional biological evidence supporting the use of the MSCEIT in cognitive enhancing clinical trials in schizophrenia.
Regional analyses of labor markets and demography: a model based Norwegian example.
Stambol, L S; Stolen, N M; Avitsland, T
1998-01-01
The authors discuss the regional REGARD model, developed by Statistics Norway to analyze the regional implications of macroeconomic development of employment, labor force, and unemployment. "In building the model, empirical analyses of regional producer behavior in manufacturing industries have been performed, and the relation between labor market development and regional migration has been investigated. Apart from providing a short description of the REGARD model, this article demonstrates the functioning of the model, and presents some results of an application." excerpt
Welcome to pandoraviruses at the ‘Fourth TRUC’ club
Sharma, Vikas; Colson, Philippe; Chabrol, Olivier; Scheid, Patrick; Pontarotti, Pierre; Raoult, Didier
2015-01-01
Nucleocytoplasmic large DNA viruses, or representatives of the proposed order Megavirales, belong to families of giant viruses that infect a broad range of eukaryotic hosts. Megaviruses have been previously described to comprise a fourth monophylogenetic TRUC (things resisting uncompleted classification) together with cellular domains in the universal tree of life. Recently described pandoraviruses have large (1.9–2.5 MB) and highly divergent genomes. In the present study, we updated the classification of pandoraviruses and other reported giant viruses. Phylogenetic trees were constructed based on six informational genes. Hierarchical clustering was performed based on a set of informational genes from Megavirales members and cellular organisms. Homologous sequences were selected from cellular organisms using TimeTree software, comprising comprehensive, and representative sets of members from Bacteria, Archaea, and Eukarya. Phylogenetic analyses based on three conserved core genes clustered pandoraviruses with phycodnaviruses, exhibiting their close relatedness. Additionally, hierarchical clustering analyses based on informational genes grouped pandoraviruses with Megavirales members as a super group distinct from cellular organisms. Thus, the analyses based on core conserved genes revealed that pandoraviruses are new genuine members of the ‘Fourth TRUC’ club, encompassing distinct life forms compared with cellular organisms. PMID:26042093
A Surprising Effect of Feedback on Learning
ERIC Educational Resources Information Center
Vollmeyer, Regina; Rheinberg, Falko
2005-01-01
As meta-analyses demonstrate feedback effects on performance, our study examined possible mediators. Based on our cognitive-motivational model [Vollmeyer, R., & Rheinberg, F. (1998). Motivationale Einflusse auf Erwerb und Anwendung von Wissen in einem computersimulierten System [Motivational influences on the acquisition and application of…
U33 : impact of distraction and health on commercial driving performance final report.
DOT National Transportation Integrated Search
2012-02-01
"This study examined the interaction of the cognitive and technological aspects of distracted driving as well as physical health among commercial drivers. Participants (n=55; 5 of which were excluded from analyses) were recruited from Alabama-based t...
Erysiphe trifolii - a newly recognized powdery mildew pathogen of pea.
USDA-ARS?s Scientific Manuscript database
Population diversity of powdery mildews infecting pea (Pisum sativum) in the US Pacific Northwest was investigated in order to assess inconsistent resistance performances of pea genotypes in different environments. Phylogenetic analyses based on ITS sequences, in combination with assessment of morph...
40 CFR 63.7515 - When must I conduct subsequent performance tests, fuel analyses, or tune-ups?
Code of Federal Regulations, 2011 CFR
2011-07-01
... performance tests, fuel analyses, or tune-ups? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL... Compliance Requirements § 63.7515 When must I conduct subsequent performance tests, fuel analyses, or tune... and the associated initial fuel analyses within 90 days after the completion of the performance tests...
40 CFR 63.7515 - When must I conduct subsequent performance tests, fuel analyses, or tune-ups?
Code of Federal Regulations, 2012 CFR
2012-07-01
... performance tests, fuel analyses, or tune-ups? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL... Compliance Requirements § 63.7515 When must I conduct subsequent performance tests, fuel analyses, or tune... and the associated initial fuel analyses within 90 days after the completion of the performance tests...
Ehrhart, Mark G; Aarons, Gregory A; Farahnak, Lauren R
2015-05-07
In line with recent research on the role of the inner context of organizations in implementation effectiveness, this study extends research on organizational citizenship behavior (OCB) to the domain of evidence-based practice (EBP) implementation. OCB encompasses those behaviors that go beyond what is required for a given job that contribute to greater organizational effectiveness. The goal of this study was to develop and test a measure of implementation citizenship behavior (ICB) or those behaviors that employees perform that go above and beyond what is required in order to support EBP implementation. The primary participants were 68 supervisors from ten mental health agencies throughout California. Items measuring ICB were developed based on past research on OCB and in consultation with experts on EBP implementation in mental health settings. Supervisors rated 357 of their subordinates on ICB and implementation success. In addition, 292 of the subordinates provided data on self-rated performance, attitudes towards EBPs, work experience, and full-time status. The supervisor sample was randomly split, with half used for exploratory factor analyses and the other half for confirmatory factor analyses. The entire sample of supervisors and subordinates was utilized for analyses assessing the reliability and construct validity of the measure. Exploratory factor analyses supported the proposed two-factor structure of the Implementation Citizenship Behavior Scale (ICBS): (1) Helping Others and (2) Keeping Informed. Confirmatory factor analyses with the other half of the sample supported the factor structure. Additional analyses supported the reliability and construct validity for the ICBS. 
The ICBS is a pragmatic brief measure (six items) that captures critical behaviors employees perform to go above and beyond the call of duty to support EBP implementation, including helping their fellow employees on implementation-related activities and keeping informed about issues related to EBP and implementation efforts. The ICBS can be used by researchers to better understand the outcomes of improved organizational support for implementation (i.e., implementation climate) and the proximal predictors of implementation effectiveness. The ICBS can also provide insight for organizations, practitioners, and managers by focusing on key employee behaviors that should increase the probability of implementation success.
Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J
2014-02-01
A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
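The reported model-experiment agreement is an overall Pearson correlation between predicted and measured values. A minimal sketch of that check (the effluent-quality series below are invented for illustration, not WATTPPA output):

```python
import numpy as np

def overall_correlation(predicted, observed):
    """Pearson correlation between model predictions and measurements,
    the statistic used to report model-experiment agreement."""
    p, o = np.asarray(predicted, float), np.asarray(observed, float)
    return float(np.corrcoef(p, o)[0, 1])

# Illustrative effluent-quality series: predictions tracking measurements
# closely give a coefficient near the 0.99 reported for the software.
pred = [12.1, 10.4, 8.9, 7.5, 6.2, 5.1]
obs  = [12.0, 10.6, 8.8, 7.7, 6.1, 5.2]
r = overall_correlation(pred, obs)
```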
Lederer, Alyssa M; Middlestadt, Susan E
2014-01-01
Stress impacts college students, faculty, and staff alike. Although meditation has been found to decrease stress, it is an underutilized strategy. This study used the Reasoned Action Approach (RAA) to identify beliefs underlying university constituents' decision to meditate. N=96 students, faculty, and staff at a large midwestern university during spring 2012. A survey measured the RAA global constructs and elicited the beliefs underlying intention to meditate. Thematic and frequency analyses and multiple regression were performed. Quantitative analyses showed that intention to meditate was significantly predicted (R2=.632) by attitude, perceived norm, and perceived behavioral control. Qualitative analyses revealed advantages (eg, reduced stress; feeling calmer), disadvantages (eg, takes time; will not work), and facilitating circumstances (eg, having more time; having quiet space) of meditating. Results of this theory-based research suggest how college health professionals can encourage meditation practice through individual, interpersonal, and environmental interventions.
Static and dynamic deflection studies of the SRM aft case-nozzle joint
NASA Technical Reports Server (NTRS)
Christian, David C.; Kos, Lawrence D.; Torres, Isaias
1989-01-01
The redesign of the joints on the solid rocket motor (SRM) has prompted the need for analyzing the behavior of the joints using several different types of analyses. The types of analyses performed include modal analysis, static analysis, transient response analysis, and base driving response analysis. The forces used in these analyses to drive the mathematical model include SRM internal chamber pressure, nozzle blowout and side forces, shuttle vehicle lift-off dynamics, SRM pressure transient rise curve, gimbal forces and moments, actuator gimbal loads, and vertical and radial bolt preloads. The math model represented the SRM from the aft base tangent point (1,823.95 in) all the way back to the nozzle, where a simplified, tuned nozzle model was attached. The new design used the radial bolts as an additional feature to reduce the gap opening at the aft dome/nozzle fixed housing interface.
Instructional practices and science performance of 10 top-performing regions in PISA 2015
NASA Astrophysics Data System (ADS)
Lau, Kwok-chi; Lam, Terence Yuk-ping
2017-10-01
This study analysed 10 top-performing regions in PISA 2015 on their science performance and instructional practices. The regions include Singapore, Japan, Estonia, Taipei, Finland, Macao, Canada, Hong Kong, China and Korea. The science performances of the 10 regions and their teaching practices are described and compared. The construct of enquiry-based instruction as developed in PISA 2015 is revised into two new constructs using factor analysis. Then, the relationships of the teaching practices with science performance are analysed using hierarchical linear modelling. Adaptive instruction, teacher-directed instruction and interactive application are found to be positively associated with performance in all regions, while investigation and perceived feedback are negatively associated in all regions. The regions except Japan and Korea tend to have a high frequency of teacher-directed instruction, facilitated by more or less authoritative class discussion. A fair amount of practical work is done, but little of it is investigative. Cultural influences on teaching practices are discussed, in particular how an amalgam of didactic and constructivist pedagogy is created when Western progressive educational philosophy meets Confucian culture. The reasons for the negative association between investigation and performance are also explored.
Performance comparison of Islamic and commercial banks in Malaysia
NASA Astrophysics Data System (ADS)
Azizud-din, Azimah; Hussin, Siti Aida Sheikh; Zahid, Zalina
2016-10-01
The steady growth in the size and increase in the number of Islamic banks show that the Islamic banking system is considered an alternative to the conventional banking system. Comparisons of performance measurement and evaluation of financial health for both types of banks are therefore essential. The main purpose of this study is to analyse the differences between Islamic and commercial bank performance. Five years of secondary data were collected from the annual report of each bank. The Return on Asset ratio is the dependent variable, while capital adequacy, asset quality, management quality, earnings, liquidity and sensitivity to market risk (CAMELS) are the independent variables. Descriptive analyses were done to understand the data. The independent t-test and Mann-Whitney test show the differences between Islamic and commercial banks on the financial variables. Stepwise and hierarchical multiple regressions were used to determine the factors that affect the profitability performance of banks. Results show that Islamic banks perform better in terms of profitability, earning power, liquidity and sensitivity to market risk. The factors that affect profitability performance are the capital adequacy, earning power and liquidity variables.
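The two-sample comparisons named above can be sketched with plain numpy: a Welch t statistic and a rank-based Mann-Whitney U statistic on Return on Asset samples. The ROA figures below are hypothetical illustrations, not the study's data, and the simple ranking assumes no tied values:

```python
import numpy as np

def welch_t(a, b):
    """Welch's independent two-sample t statistic (unequal variances)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return float((a.mean() - b.mean()) / se)

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a (rank-sum based, no ties)."""
    combined = np.concatenate([np.asarray(a, float), np.asarray(b, float)])
    ranks = combined.argsort().argsort() + 1.0   # 1-based ranks
    r_a = ranks[:len(a)].sum()
    return float(r_a - len(a) * (len(a) + 1) / 2.0)

# Illustrative ROA (%) samples for the two bank groups (hypothetical values)
islamic = [1.8, 2.1, 1.9, 2.3, 2.0]
commercial = [1.2, 1.5, 1.4, 1.1, 1.6]
t_stat = welch_t(islamic, commercial)
u_stat = mann_whitney_u(islamic, commercial)
```

In practice the parametric and rank tests are run side by side, as here, because financial ratios often violate normality assumptions.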
The association between school-to-work programs and school performance.
Welsh, Erin C; Appana, Savi; Anderson, Henry A; Zierold, Kristina M
2014-02-01
The School-to-Work (STW) Opportunities Act was passed to aid students in transitioning from education to employment by offering work-based learning opportunities. In the United States, 72% of high schools offer work-based learning opportunities for credit. This is the first study to describe school performance and school-based behaviors among students enrolled in STW programs and compare them with nonworking and other-working students. In 2003, a questionnaire was administered to five school districts and one large urban school in Wisconsin. Between 2008 and 2010, analyses were completed to characterize STW students and compare them with other students. Of the 6,519 students aged 14-18 years included in the analyses, 461 were involved in an STW program (7%), 3,108 were non-working (48%), and 2,950 were other-working students (45%). Compared with other students, STW students were less likely to have a grade point average >2.0, more likely to have three or more unexcused absences from school, and more likely to spend <1 hour in school-sponsored activities. Holding multiple jobs also negatively affected a student's academic performance. School-to-Work students reported poorer academic performance and more unhealthy school-related behaviors compared with nonworking students and other-working students. Whereas many factors have a role in why students perform poorly in school, more research on students enrolled in STW programs is needed to understand whether participating has a negative impact on students' academic achievement.
Alvarez-García, Gema; García-Culebras, Alicia; Gutiérrez-Expósito, Daniel; Navarro-Lozano, Vanesa; Pastor-Fernández, Iván; Ortega-Mora, Luis Miguel
2013-11-15
Bovine neosporosis control programs are currently based on herd management and serodiagnosis because effective treatments and vaccines are unavailable. Although a wide variety of serological tools have been developed, enzyme-linked immunosorbent assays (ELISAs) are the most commonly commercialized tests. Partial comparative studies have been performed in the past, and the panel of available ELISAs has notably changed in the last few years. Therefore, diagnostic laboratories are requesting updated information about the performance of these tests. Accordingly, the aim of this study was to compare all of the commercially available ELISAs (n=10) by evaluating their performance and to re-standardize them based on TG-ROC analyses when necessary. For this purpose, a well-characterized serum panel from experimentally and naturally infected bovines and non-infected bovines (n=458) was used. Two different definitions of gold standard were considered: (i) the result of the majority of tests and (ii) pre-test information based on epidemiological, clinical and serological data. Most of the tests displayed high sensitivity (Se) and specificity (Sp) values when both gold standard criteria were considered. Furthermore, all the tests showed near perfect agreement, with the exception of the pair-wise comparisons that included the VMRD and SVANOVIR. The best-adjusted ELISAs were the HIPRA-CIVTEST, IDVET, BIOVET and IDEXX Rum (Se and Sp>95%). After the TG-ROC analyses, higher Se and Sp values were obtained for the BIO-X, LSI Bov, LSI Rum and IDEXX Bov, though the increases were more significant for the SVANOVIR and VMRD. The Kappa values also increased with the new adjusted cut-offs. This is the first study that offers updated performance evaluations of commercially available ELISAs. Such analyses are essential for diagnostic laboratories and are valuable to the companies that develop and distribute these tests. Copyright © 2013 Elsevier B.V. All rights reserved.
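A minimal sketch of the TG-ROC re-standardisation idea: sweep candidate cut-offs and pick the one where the sensitivity and specificity curves intersect. All serum values and distribution parameters here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ELISA percent-positivity values for a reference serum panel.
infected = rng.normal(70, 15, 200)      # true positives cluster high
non_infected = rng.normal(25, 12, 250)  # true negatives cluster low

cutoffs = np.linspace(0, 100, 1001)
# Sensitivity: fraction of infected sera at or above each cut-off.
se = np.array([(infected >= c).mean() for c in cutoffs])
# Specificity: fraction of non-infected sera below each cut-off.
sp = np.array([(non_infected < c).mean() for c in cutoffs])

# TG-ROC intersection: cut-off where the Se and Sp curves cross.
i_best = np.argmin(np.abs(se - sp))
cutoff = cutoffs[i_best]
print(f"adjusted cut-off: {cutoff:.1f}, Se={se[i_best]:.3f}, Sp={sp[i_best]:.3f}")
```

Commercial kits ship with a fixed cut-off; re-standardising against a well-characterised panel, as above, is what allowed the study to raise Se and Sp for several of the tests.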
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-08-01
Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
Luck, Tobias; Pabst, Alexander; Rodriguez, Francisca S; Schroeter, Matthias L; Witte, Veronica; Hinz, Andreas; Mehnert, Anja; Engel, Christoph; Loeffler, Markus; Thiery, Joachim; Villringer, Arno; Riedel-Heller, Steffi G
2018-05-01
To provide new age-, sex-, and education-specific reference values for an extended version of the well-established Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological Assessment Battery (CERAD-NAB) that additionally includes the Trail Making Test and the Verbal Fluency Test-S-Words. Norms were calculated based on the cognitive performances of n = 1,888 dementia-free participants (60-79 years) from the population-based German LIFE-Adult-Study. Multiple regressions were used to examine the association of the CERAD-NAB scores with age, sex, and education. In order to calculate the norms, quantile and censored quantile regression analyses were performed estimating marginal means of the test scores at 2.28, 6.68, 10, 15.87, 25, 50, 75, and 90 percentiles for age-, sex-, and education-specific subgroups. Multiple regression analyses revealed that younger age was significantly associated with better cognitive performance in 15 CERAD-NAB measures and higher education with better cognitive performance in all 17 measures. Women performed significantly better than men in 12 measures and men than women in four measures. The determined norms indicate ceiling effects for the cognitive performances in the Boston Naming, Word List Recognition, Constructional Praxis Copying, and Constructional Praxis Recall tests. The new norms for the extended CERAD-NAB will be useful for evaluating dementia-free German-speaking adults in a broad variety of relevant cognitive domains. The extended CERAD-NAB follows more closely the criteria for the new DSM-5 Mild and Major Neurocognitive Disorder. Additionally, it could be further developed to include a test for social cognition. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
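The norm tables can be illustrated with a minimal sketch that computes plain empirical percentiles for one hypothetical stratum (the study itself used quantile and censored quantile regression, which this does not reproduce):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical test scores for one age/sex/education stratum.
scores = rng.normal(50, 10, 500)

# Percentile levels used for CERAD-NAB-style norm tables.
percentiles = [2.28, 6.68, 10, 15.87, 25, 50, 75, 90]
norms = {p: float(np.percentile(scores, p)) for p in percentiles}

for p, v in norms.items():
    print(f"P{p}: {v:.1f}")
```

A clinician then locates an individual's raw score in the table for the matching stratum; scores below, say, the 6.68th percentile flag possible impairment.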
Schellenberg, Florian; Oberhofer, Katja; Taylor, William R.; Lorenzetti, Silvio
2015-01-01
Background. Knowledge of the musculoskeletal loading conditions during strength training is essential for performance monitoring, injury prevention, rehabilitation, and training design. However, measuring muscle forces during exercise performance as a primary determinant of training efficacy and safety has remained challenging. Methods. In this paper we review existing computational techniques to determine muscle forces in the lower limbs during strength exercises in vivo and discuss their potential for uptake into sports training and rehabilitation. Results. Muscle forces during exercise performance have almost exclusively been analysed using so-called forward dynamics simulations, inverse dynamics techniques, or alternative methods. Musculoskeletal models based on forward dynamics analyses have led to considerable new insights into muscular coordination, strength, and power during dynamic ballistic movement activities, resulting in, for example, improved techniques for optimal performance of the squat jump, while quasi-static inverse dynamics optimisation and EMG-driven modelling have helped to provide an understanding of low-speed exercises. Conclusion. The present review introduces the different computational techniques and outlines their advantages and disadvantages for the informed usage by nonexperts. With sufficient validation and widespread application, muscle force calculations during strength exercises in vivo are expected to provide biomechanically based evidence for clinicians and therapists to evaluate and improve training guidelines. PMID:26417378
NASA Astrophysics Data System (ADS)
Čufar, Aljaž; Batistoni, Paola; Conroy, Sean; Ghani, Zamir; Lengar, Igor; Milocco, Alberto; Packer, Lee; Pillon, Mario; Popovichev, Sergey; Snoj, Luka; JET Contributors
2017-03-01
At the Joint European Torus (JET), the ex-vessel fission chambers and in-vessel activation detectors are used as monitors of the neutron production rate and neutron yield, respectively. To ensure that these detectors produce accurate measurements, they need to be experimentally calibrated. A new calibration of neutron detectors to 14 MeV neutrons, resulting from deuterium-tritium (DT) plasmas, is planned at JET using a compact accelerator-based neutron generator (NG) in which a D/T beam impinges on a solid target containing T/D, producing neutrons by DT fusion reactions. This paper presents the analysis that was performed to model the neutron source characteristics in terms of energy spectrum, angle-energy distribution and the effect of the neutron generator geometry. Different codes capable of simulating accelerator-based DT neutron sources are compared and sensitivities to uncertainties in the generator's internal structure analysed. The analysis was performed to support preparation for the experimental measurements performed to characterize the NG as a calibration source. Further extensive neutronics analyses, performed with this model of the NG, will be needed to support the neutron calibration experiments and to take into account various differences between the calibration experiment and experiments using the plasma as a source of neutrons.
Performance profile of NCAA Division I men's basketball games and training sessions.
Conte, D; Tessitore, A; Smiley, K; Thomas, C; Favero, T G
2016-06-01
This study aimed to analyse live and stoppage time phases, their ratio, and actions played on the half and full court in college basketball games. Differences were assessed for entire games and between halves. Moreover, differences in the live/stoppage time (LT/ST) ratio were analysed between games and game-based conditioning drills. Ten games as well as fifteen defensive, fourteen offensive and six scrimmage-type drills of the same Division I men's college team (13 players) were analysed using a time-motion analysis technique. Live and stoppage time were classified in five classes of duration: 1-20, 21-40, 41-60, 61-80, >80 seconds. Half court actions started and finished in the same half court. Full court actions were classified as transfer (TR) phases when at least 3 teammates crossed the mid-court line. TR phases were then classified in 5 classes of frequency: 1TR, 2TR, 3TR, 4TR, and >4TR. The results revealed no statistically significant differences between games for the considered parameters; between halves, the only significant difference was observed for the LT/ST ratio (p<0.001). Furthermore, a significant difference in the LT/ST ratio was found between games and game-based drills (p<0.01). Post-hoc analysis demonstrated significant differences of scrimmage-type drills in comparison to games, and defensive and offensive drills (p<0.05), whereas no differences emerged for the other pairwise comparisons. The absence of differences between games in the analysed parameters might be important to characterize the model of performance in Division I men's college games. Furthermore, these results encourage coaches to use game-based conditioning drills to replicate the LT/ST ratio documented during games.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.
2016-05-03
Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample.
Linkage and related analyses of Barrett's esophagus and its associated adenocarcinomas.
Sun, Xiangqing; Elston, Robert; Falk, Gary W; Grady, William M; Faulx, Ashley; Mittal, Sumeet K; Canto, Marcia I; Shaheen, Nicholas J; Wang, Jean S; Iyer, Prasad G; Abrams, Julian A; Willis, Joseph E; Guda, Kishore; Markowitz, Sanford; Barnholtz-Sloan, Jill S; Chandar, Apoorva; Brock, Wendy; Chak, Amitabh
2016-07-01
Familial aggregation and segregation analysis studies have provided evidence of a genetic basis for esophageal adenocarcinoma (EAC) and its premalignant precursor, Barrett's esophagus (BE). We aim to demonstrate the utility of linkage analysis to identify the genomic regions that might contain the genetic variants that predispose individuals to this complex trait (BE and EAC). We genotyped 144 individuals in 42 multiplex pedigrees chosen from 1000 singly ascertained BE/EAC pedigrees, and performed both model-based and model-free linkage analyses, using S.A.G.E. and other software. Segregation models were fitted, from the data on both the 42 pedigrees and the 1000 pedigrees, to determine parameters for performing model-based linkage analysis. Model-based and model-free linkage analyses were conducted in two sets of pedigrees: the 42 pedigrees and a subset of 18 pedigrees with female affected members that are expected to be more genetically homogeneous. Genome-wide associations were also tested in these families. Linkage analyses on the 42 pedigrees identified several regions consistently suggestive of linkage by different linkage analysis methods on chromosomes 2q31, 12q23, and 4p14. A linkage on 15q26 is the only consistent linkage region identified in the 18 female-affected pedigrees, in which the linkage signal is higher than in the 42 pedigrees. Other tentative linkage signals are also reported. Our linkage study of BE/EAC pedigrees identified linkage regions on chromosomes 2, 4, 12, and 15, with some reported associations located within our linkage peaks. Our linkage results can help prioritize association tests to delineate the genetic determinants underlying susceptibility to BE and EAC.
Definition of the thermographic regions of interest in cycling by using a factor analysis
NASA Astrophysics Data System (ADS)
Priego Quesada, Jose Ignacio; Lucas-Cuevas, Angel Gabriel; Salvador Palmer, Rosario; Pérez-Soriano, Pedro; Cibrián Ortiz de Anda, Rosa M.a.
2016-03-01
Research in exercise physiology using infrared thermography has increased in recent years. However, the definition of the Regions of Interest (ROIs) varies strongly between studies. Therefore, the aim of this study was to use a factor analysis approach to define highly correlated groups of thermographic ROIs during a cycling test. Factor analyses were performed based on the moment of measurement and on the variation of skin temperatures as a result of the cycling exercise. 19 male participants cycled for 45 min at 50% of their individual peak power output with a cadence of 90 rpm. Infrared thermography was used to measure skin temperatures in sixteen ROIs of the trunk and lower limbs at three moments: before, immediately after and 10 min after the cycling test. Factor analyses were used to identify groups of ROIs based on the skin absolute temperatures at each moment of measurement as well as on skin temperature variations between moments. All the factor analyses performed for each moment and skin temperature variation explained more than 80% of the variance. Different groups of ROIs were obtained when the analysis was based on the moment of measurement or on the effect of exercise on the skin temperature. Furthermore, some ROIs were grouped in the same way in both analyses (e.g. the ROIs of the trunk), whereas other regions (legs and their joints) were grouped differently in each analysis. Differences between groups of ROIs are related to their tissue composition, muscular activity and capacity of sweating. In conclusion, the resultant groups of ROIs were coherent and could help researchers to define the ROIs in future thermal studies.
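A rough sketch of the grouping idea, using an eigendecomposition of the ROI correlation matrix as a simplified stand-in for the study's factor analysis; the two-factor data are synthetic, and no rotation or factor-retention criteria are applied:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 19  # participants, matching the study's sample size

# Hypothetical skin temperatures (degrees C) for four ROIs driven by two
# latent factors (a "trunk" factor and a "legs" factor).
trunk = rng.normal(0, 1, n)
legs = rng.normal(0, 1, n)
data = np.column_stack([
    33 + trunk + rng.normal(0, 0.2, n),  # chest
    33 + trunk + rng.normal(0, 0.2, n),  # abdomen
    31 + legs + rng.normal(0, 0.2, n),   # thigh
    31 + legs + rng.normal(0, 0.2, n),   # calf
])

# Principal-axis style extraction from the correlation matrix.
corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])

# ROIs loading most strongly on the same factor form a candidate group.
groups = np.argmax(np.abs(loadings), axis=1)
print(groups)
```

In the study proper, sixteen ROIs were analysed per measurement moment and per temperature variation; the grouping principle is the same: ROIs that share variance collapse onto a common factor.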
Manzo, C; Mei, A; Zampetti, E; Bassani, C; Paciucci, L; Manetti, P
2017-04-15
This paper describes a methodology to perform chemical analyses in landfill areas by integrating multisource geomatic data. We used a top-down approach to identify Environmental Point of Interest (EPI) based on very high-resolution satellite data (Pleiades and WorldView 2) and on in situ thermal and photogrammetric surveys. Change detection techniques and geostatistical analysis supported the chemical survey, undertaken using an accumulation chamber and an RIIA, an unmanned ground vehicle developed by CNR IIA, equipped with a multiparameter sensor platform for environmental monitoring. Such an approach improves site characterization, identifying the key environmental points of interest where it is necessary to perform detailed chemical analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
Full-Scale Crash Tests and Analyses of Three High-Wing Single
NASA Technical Reports Server (NTRS)
Annett, Martin S.; Littell, Justin D.; Stimson, Chad M.; Jackson, Karen E.; Mason, Brian H.
2015-01-01
The NASA Emergency Locator Transmitter Survivability and Reliability (ELTSAR) project was initiated in 2014 to assess the crash performance standards for the next generation of ELT systems. Three Cessna 172 aircraft have been acquired to conduct crash testing at NASA Langley Research Center's Landing and Impact Research Facility. Testing is scheduled for the summer of 2015 and will simulate three crash conditions: a flare-to-stall during emergency landing, and two controlled-flight-into-terrain scenarios. Instrumentation and video coverage, both onboard and external, will also provide valuable data on airframe response. Full-scale finite element analyses will be performed using two separate commercial explicit solvers. Calibration and validation of the models will be based on the airframe response under these varying crash conditions.
NASA Astrophysics Data System (ADS)
Bamberger, Yael M.; Davis, Elizabeth A.
2013-01-01
This paper focuses on students' ability to transfer modelling performances across content areas, taking into consideration their improvement of content knowledge as a result of a model-based instruction. Sixty-five sixth grade students of one science teacher in an urban public school in the Midwestern USA engaged in scientific modelling practices that were incorporated into a curriculum focused on the nature of matter. Concept-process models were embedded in the curriculum, as well as emphasis on meta-modelling knowledge and modelling practices. Pre-post test items that required drawing scientific models of smell, evaporation, and friction were analysed. The level of content understanding was coded and scored, as were the following elements of modelling performance: explanation, comparativeness, abstraction, and labelling. Paired t-tests were conducted to analyse differences in students' pre-post tests scores on content knowledge and on each element of the modelling performances. These are described in terms of the amount of transfer. Students significantly improved in their content knowledge for the smell and the evaporation models, but not for the friction model, which was expected as that topic was not taught during the instruction. However, students significantly improved in some of their modelling performances for all the three models. This improvement serves as evidence that the model-based instruction can help students acquire modelling practices that they can apply in a new content area.
Muñoz-Martínez, Francisco Antonio; Rubio-Arias, Jacobo Á; Ramos-Campo, Domingo Jesús; Alcaraz, Pedro E
2017-12-01
It is well known that concurrent increases in both maximal strength and aerobic capacity are associated with improvements in sports performance as well as overall health. One of the most popular training methods used for achieving these objectives is resistance circuit-based training. The objective of the present systematic review with a meta-analysis was to evaluate published studies that have investigated the effects of resistance circuit-based training on maximum oxygen uptake and one-repetition maximum of the upper-body strength (bench press exercise) in healthy adults. The following electronic databases were searched from January to June 2016: PubMed, Web of Science and Cochrane. Studies were included if they met the following criteria: (1) examined healthy adults aged between 18 and 65 years; (2) met the characteristics of resistance circuit-based training; and (3) analysed the outcome variables of maximum oxygen uptake using a gas analyser and/or one-repetition maximum bench press. Of the 100 articles found from the database search and after all duplicates were removed, eight articles were analysed for maximum oxygen uptake. Of 118 healthy adults who performed resistance circuit-based training, maximum oxygen uptake was evaluated before and after the training programme. Additionally, from the 308 articles found for one-repetition maximum, eight articles were analysed. The bench press one-repetition maximum load, of 237 healthy adults who performed resistance circuit-based training, was evaluated before and after the training programme. Significant increases in maximum oxygen uptake and one-repetition maximum bench press were observed following resistance circuit-based training. Additionally, significant differences in maximum oxygen uptake and one-repetition maximum bench press were found between the resistance circuit-based training and control groups. 
The meta-analysis showed that resistance circuit-based training, independent of the protocol used in the studies, is effective in increasing maximum oxygen uptake and one-repetition maximum bench press in healthy adults. However, its effect appears to be larger depending on the population and training characteristics. For large effects in maximum oxygen uptake, the programme should include ~14-30 sessions for ~6-12 weeks, with each session lasting at least ~20-30 min, at intensities between ~60 and 90% one-repetition maximum. For large effects in one-repetition maximum bench press, the data indicate that intensity should be ~30-60% one-repetition maximum, with sessions lasting at least ~22.5-60 min. However, participants' lower baseline fitness levels may explain the lighter optimal loads used in the circuit training studies where greater strength gains were reported.
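The pooling step behind such a meta-analysis can be sketched with fixed-effect inverse-variance weighting; the per-study effects and standard errors below are invented, not taken from the review:

```python
import numpy as np

# Hypothetical per-study pre-post changes in VO2max (ml/kg/min):
# (mean difference, standard error) for each included study.
studies = [(2.1, 0.6), (3.4, 0.9), (1.8, 0.5), (2.9, 0.8)]

d = np.array([s[0] for s in studies])
se = np.array([s[1] for s in studies])

# Fixed-effect inverse-variance pooling: precise studies get more weight.
w = 1.0 / se**2
pooled = np.sum(w * d) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

print(f"pooled effect: {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```

A random-effects model would add a between-study variance term to each weight; the fixed-effect version above shows the core mechanics.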
Adamczyk, Przemysław; Juszczak, Kajetan; Drewa, Tomasz; Hora, Milan; Nyirády, Peter; Sosnowski, Marek
2016-01-01
In recent years, the laparoscopic approach in oncologic urology has become more attractive to surgeons. It is considered to offer the same oncologic quality as open surgery while being less invasive for patients. It is used widely across Europe, but with varying frequency. The aim of the study was to present the volumes of oncourological procedures in three neighbouring countries: Poland, the Czech Republic and Hungary. Prostatectomy, cystectomy, nephrectomy and tumorectomy (nephron-sparing surgery, NSS) were presented as lists of procedures prepared from the national registries. The total number of procedures was presented, as well as the LO (laparoscopic-to-open procedures) index, the P/P (procedures/population) index, the cystectomy/population ratio, and the cystectomy/TURBT ratio. Relative to population, the most complex procedures (laparoscopic/robotic prostatectomy, laparoscopic NSS, laparoscopic nephrectomy) are performed most often in the Czech Republic. In Hungary and the Czech Republic, more laparoscopic/robotic radical prostatectomies are performed than open ones. Relative to population, the largest number of cystectomies is performed in Poland, although the much higher ratio of 6.57 TURs per cystectomy there is difficult to explain. In the Czech Republic this procedure is performed in almost one quarter of the patients (23.36%). Interestingly, in Hungary cystectomy with pouch creation is performed in about 67.65% of cases. Reimbursement for surgical procedures is highest in the Czech Republic, approximately 20-40% more than in Poland or Hungary. The definitive leader in Central Europe (based on the national registries) is the Czech Republic, where the most complex procedures (laparoscopic/robotic prostatectomy, laparoscopic NSS, laparoscopic nephrectomy) are performed in the greatest numbers relative to population. These circumstances may be explained by the higher reimbursement rate for surgical procedures in that country.
A Voxel-Based Filtering Algorithm for Mobile LiDAR Data
NASA Astrophysics Data System (ADS)
Qin, H.; Guan, G.; Yu, Y.; Zhong, L.
2018-04-01
This paper presents a stepwise voxel-based filtering algorithm for mobile LiDAR data. In the first step, to improve computational efficiency, mobile LiDAR points in the xy-plane are partitioned into a set of two-dimensional (2-D) blocks with a given block size, in each of which all laser points are further organized into an octree partition structure with a set of three-dimensional (3-D) voxels. Then, a voxel-based upward growing process is performed to roughly separate terrain from non-terrain points with global and local terrain thresholds. In the second step, the extracted terrain points are refined by computing voxel curvatures. This voxel-based filtering algorithm is comprehensively discussed in analyses of parameter sensitivity and overall performance. An experimental study performed on multiple point cloud samples, collected by different commercial mobile LiDAR systems, showed that the proposed algorithm provides a promising solution to terrain point extraction from mobile point clouds.
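A simplified sketch of the upward-growing step, flattening the octree to 2-D xy-columns and omitting the curvature refinement; the voxel size and height gap are illustrative parameters, not the paper's values:

```python
import numpy as np

def voxel_terrain_filter(points, voxel=0.5, height_gap=0.3):
    """Rough terrain/non-terrain split: within each xy-column, start from
    the lowest point and grow upward while the vertical gap between
    consecutive points stays within height_gap."""
    keys = np.floor(points[:, :2] / voxel).astype(int)
    terrain = np.zeros(len(points), dtype=bool)
    columns = {}
    for i, k in enumerate(map(tuple, keys)):
        columns.setdefault(k, []).append(i)
    for idx in columns.values():
        idx.sort(key=lambda i: points[i, 2])  # sort column by height
        last_z = points[idx[0], 2]
        for i in idx:
            if points[i, 2] - last_z <= height_gap:
                terrain[i] = True
                last_z = points[i, 2]
            else:
                break  # everything above the gap is treated as non-terrain
    return terrain

# Synthetic scene: a flat ground grid plus a vertical pole at (5, 5).
xx, yy = np.meshgrid(np.arange(0, 10, 0.25), np.arange(0, 10, 0.25))
ground = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])
pole = np.array([[5.0, 5.0, z] for z in np.arange(1.0, 3.0, 0.1)])
points = np.vstack([ground, pole])
mask = voxel_terrain_filter(points)
```

The paper's algorithm additionally organizes each block into octree voxels and refines the rough split with voxel curvatures; this sketch only conveys the column-wise growing logic.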
Subtitle-Based Word Frequencies as the Best Estimate of Reading Behavior: The Case of Greek
Dimitropoulou, Maria; Duñabeitia, Jon Andoni; Avilés, Alberto; Corral, José; Carreiras, Manuel
2010-01-01
Previous evidence has shown that word frequencies calculated from corpora based on film and television subtitles can readily account for reading performance, since the language used in subtitles greatly approximates everyday language. The present study examines this issue in a society with increased exposure to subtitle reading. We compiled SUBTLEX-GR, a subtitle-based corpus consisting of more than 27 million Modern Greek words, and tested to what extent subtitle-based frequency estimates and those taken from a written corpus of Modern Greek account for the lexical decision performance of young Greek adults who are exposed to subtitle reading on a daily basis. Results showed that SUBTLEX-GR frequency estimates effectively accounted for participants’ reading performance in two different visual word recognition experiments. More importantly, different analyses showed that frequencies estimated from a subtitle corpus explained the obtained results significantly better than traditional frequencies derived from written corpora. PMID:21833273
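Building subtitle-based frequency estimates reduces, at its core, to token counting normalised per million words; a toy sketch (the three-line corpus below stands in for the roughly 27-million-word SUBTLEX-GR):

```python
import math
import re
from collections import Counter

# Toy stand-in for a subtitle corpus.
corpus = """you know what I mean
I know you saw it
what did you see"""

tokens = re.findall(r"[a-z']+", corpus.lower())
counts = Counter(tokens)
total = sum(counts.values())

# Frequency per million words, plus the log frequency typically used
# as a predictor of lexical decision times.
per_million = {w: c * 1_000_000 / total for w, c in counts.items()}
log_freq = {w: math.log10(f) for w, f in per_million.items()}

print(counts["you"], round(log_freq["you"], 2))
```

In the reported analyses, such log frequencies enter a regression predicting lexical decision latencies; the subtitle-based estimates explained more variance than written-corpus frequencies.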
Biases and power for groups comparison on subjective health measurements.
Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique
2012-01-01
Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from item response theory (IRT), relying on a response model that relates the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, the groups comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test on the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.
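A toy sketch of the simulation setup: responses are generated under a Rasch model for two groups that differ in mean latent trait, and the CTT-style strategy (a t-test on sum scores) is applied. Group sizes, item count and effect size are invented, and the IRT-side Wald-test pipeline is not reproduced:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_items, n_per_group = 10, 150
difficulties = rng.normal(0, 1, n_items)

def simulate(group_mean):
    # Rasch model: P(item endorsed) = logistic(theta - difficulty).
    theta = rng.normal(group_mean, 1, n_per_group)[:, None]
    p = 1 / (1 + np.exp(-(theta - difficulties)))
    return (rng.random((n_per_group, n_items)) < p).astype(int)

# Two groups differing by 0.5 latent-trait units.
scores_a = simulate(0.0).sum(axis=1)
scores_b = simulate(0.5).sum(axis=1)

# CTT-style comparison: t-test on the observed sum scores.
t_stat, p_val = stats.ttest_ind(scores_a, scores_b)
print(f"t={t_stat:.2f}, p={p_val:.4f}")
```

Repeating this simulation many times and counting rejections is what yields the type-I error and power estimates compared across strategies in the study.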
Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models
Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon
2010-01-01
Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed for different numbers of antenna elements in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the achievable BER. PMID:22163510
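A minimal Monte-Carlo sketch of the receive-diversity effect described above, using BPSK over flat Rayleigh fading with maximal-ratio combining (MRC). This is a generic textbook setup, not the paper's GBSBE elliptical channel model:

```python
import math
import random

def ber_bpsk_mrc(snr_db, n_antennas, n_bits=20000, seed=1):
    """Monte-Carlo BER of BPSK over independent flat Rayleigh fading
    branches with maximal-ratio combining at the receiver."""
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)
    noise_sd = math.sqrt(1.0 / (2.0 * snr))
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((-1.0, 1.0))
        decision = 0.0
        for _ in range(n_antennas):
            # |h|^2 of a unit-mean-power Rayleigh branch gain
            g = rng.gauss(0.0, math.sqrt(0.5)) ** 2 \
                + rng.gauss(0.0, math.sqrt(0.5)) ** 2
            # MRC branch output: gain-weighted symbol plus scaled noise
            decision += g * bit + math.sqrt(g) * rng.gauss(0.0, noise_sd)
        errors += decision * bit < 0
    return errors / n_bits
```

Running this at a fixed per-branch SNR shows the BER falling sharply as antennas are added, which is the qualitative effect the paper quantifies for its sensor-field geometry.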
Scaling study of the combustion performance of gas-gas rocket injectors
NASA Astrophysics Data System (ADS)
Wang, Xiao-Wei; Cai, Guo-Biao; Jin, Ping
2011-10-01
To obtain the key subelements that may influence the scaling of gas-gas injector combustor performance, the combustion performance subelements in a liquid propellant rocket engine combustor are initially analysed based on the results of a previous study on the scaling of a gas-gas combustion flowfield. Analysis indicates that inner wall friction loss and heat-flux loss are two key issues in deriving the scaling criterion of the combustion performance. The similarity conditions of the inner wall friction loss and heat-flux loss in a gas-gas combustion chamber are obtained by theoretical analyses. Then the theoretical scaling criterion was obtained for the combustion performance, but it proved to be impractical. The criterion conditions, the wall friction and the heat flux are further analysed in detail to obtain the specific engineering scaling criterion of the combustion performance. The results indicate that when the inner flowfields in the combustors are similar, the combustor wall shear stress will have similar distributions qualitatively and will be directly proportional to pc^0.8 · dt^-0.2 quantitatively. In addition, the combustion performance will remain unchanged. Furthermore, multi-element injector chambers with different geometric sizes and at different pressures are numerically simulated and the wall shear stress and combustion efficiencies are solved and compared with each other. A multi-element injector chamber is designed and hot-fire tested at several chamber pressures and the combustion performances are measured in a total of nine hot-fire tests. The numerical and experimental results verified the similarities among combustor wall shear stress and combustion performances at different chamber pressures and geometries, with the criterion applied.
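The reported proportionality can be turned into a simple scaling calculation. A sketch, assuming the tau_w ∝ pc^0.8 · dt^-0.2 relation stated above (pc: chamber pressure, dt: throat diameter; units cancel in the ratio):

```python
def wall_shear_ratio(pc1, dt1, pc2, dt2):
    """Ratio of wall shear stress between two geometrically similar
    combustors, using the tau_w ∝ pc^0.8 * dt^-0.2 scaling."""
    return (pc1 / pc2) ** 0.8 * (dt1 / dt2) ** -0.2
```

For example, doubling chamber pressure at fixed throat diameter raises the predicted wall shear stress by a factor of 2^0.8, about 1.74.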
Behrouzi, Farshad; Shaharoun, Awaluddin Mohamed; Ma'aram, Azanizawati
2014-05-01
In order to attain a useful balanced scorecard (BSC), appropriate performance perspectives and indicators are crucial to reflect all strategies of the organisation. The objectives of this survey were to give insight into the situation of the BSC in the health sector over the past decade, and to offer a generic approach to BSC development for health settings with specific focus on performance perspectives, performance indicators and BSC generation. After an extensive search based on publication date and research content, 29 articles published since 2002 were identified, categorised and analysed. Four critical attributes of each article were analysed, including BSC generation, performance perspectives, performance indicators and auxiliary tools. The results showed that 'internal business process' was the most notable BSC perspective as it was included in all reviewed articles. After investigating the literature, it was concluded that its comprehensiveness is the reason for the importance and high usage of this perspective. The findings showed that 12 cases out of 29 reviewed articles (41%) exceeded the maximum number of key performance indicators (KPI) suggested in a previous study. It was found that all 12 cases were large organisations with numerous departments (e.g. national health organisations). Such organisations require numerous KPI to cover all of their strategic objectives. It was recommended to utilise the cascaded BSC within such organisations to avoid complexity and difficulty in gathering, analysing and interpreting performance data. Meanwhile, greater involvement of medical staff in BSC development is needed, which will result in greater reliability of the BSC.
Statistics for Learning Genetics
ERIC Educational Resources Information Center
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…
Repository-based software engineering program
NASA Technical Reports Server (NTRS)
Wilson, James
1992-01-01
The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.
N.J. Grünwald; E.M. Goss; K. Ivors; M. Garbelotto; F.N. Martin; S. Prospero; E. Hansen; P.J.M. Bonants; R.C. Hamelin; G. Chastagner; S. Werres; D.M. Rizzo; G. Abad; P. Beales; G.J. Bilodeau; C.L. Blomquist; C. Brasier; S.C. Brière; A. Chandelier; J.M. Davidson; S. Denman; M. Elliott; S.J. Frankel; E.M. Goheen; H. de Gruyter; K. Heungens; D. James; A. Kanaskie; M.G. McWilliams; W. Man in ' t Veld; E. Moralejo; N.K. Osterbauer; M.E. Palm; J.L. Parke; A.M. Perez Sierra; S.F. Shamoun; N. Shishkoff; P.W. Tooley; A.M. Vettraino; J. Webber; T.L. Widmer
2009-01-01
Phytophthora ramorum, the causal agent of sudden oak death and ramorum blight, is known to exist as three distinct clonal lineages which can only be distinguished by performing molecular marker-based analyses. However, in the recent literature there exists no consensus on naming of these lineages. Here we propose a system for naming clonal lineages of P. ramorum based...
An Evaluation of Clinical Economics and Cases of Cost-effectiveness
Takura, Tomoyuki
2017-01-01
In order to maintain and develop a universal health insurance system, it is crucial to utilize limited medical resources effectively. In this context, considerations are underway to introduce health technology assessments (HTAs), such as cost-effectiveness analyses (CEAs), into the medical treatment fee system. CEAs, the general term for these methods, are classified into four categories, such as cost-effectiveness analyses based on performance indicators; in the comparison of health technologies, the incremental cost-effectiveness ratio (ICER) is also applied. When I comprehensively consider several Japanese studies based on these concepts, I find that analyses of the economic performance of healthcare systems show Japan to have the most promising trend in the world. In addition, there is research indicating the superior cost-effectiveness of Rituximab against refractory nephrotic syndrome, and it is expected that health economics will be actively applied to the valuation of technical innovations such as drug discovery. PMID:29279514
Performance of Sorghum Varieties under Variable Rainfall in Central Tanzania
Msongaleli, Barnabas M.; Tumbo, S. D.; Kihupi, N. I.; Rwehumbiza, Filbert B.
2017-01-01
Rainfall variability has a significant impact on crop production with manifestations in frequent crop failure in semiarid areas. This study used the parameterized APSIM crop model to investigate how rainfall variability may affect yields of improved sorghum varieties based on long-term historical rainfall and projected climate. Analyses of historical rainfall indicate a mix of nonsignificant and significant trends on the onset, cessation, and length of the growing season. The study confirmed that rainfall variability indeed affects yields of improved sorghum varieties. Further analyses of simulated sorghum yields based on seasonal rainfall distribution indicate the concurrence of lower grain yields with the 10-day dry spells during the cropping season. Simulation results for future sorghum response, however, show that impacts of rainfall variability on sorghum will be overridden by temperature increase. We conclude that, in the event where harms imposed by moisture stress in the study area are not abated, even improved sorghum varieties are likely to perform poorly. PMID:28536708
Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs
2018-01-01
Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima with possibly the associated effect sizes to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE) that only uses peak locations, fixed effects, and random effects meta-analysis that take into account both peak location and height] and the amount of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combine these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E
2017-08-15
Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. Moreover, on one hand, in-line data were correlated with the real API concentration in the sample as quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will be within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its suitability for routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Kaihua; Zhang, Kai; Cao, Yan; Pan, Wei-ping
2013-03-01
Despite much research on co-combustion of tobacco stem and high-sulfur coal, their blending has not been effectively optimized. This study investigated the combustion profiles of tobacco stem, high-sulfur bituminous coal and their blends by thermogravimetric analysis. Ignition and burnout performances, heat release performances, and gaseous pollutant emissions were also studied by thermogravimetric and mass spectrometry analyses. The results indicated that combustion of tobacco stem was more complicated than that of high-sulfur bituminous coal, mainly because its fixed carbon was divided into two portions, one burning early and the other burning late. Ignition and burnout performances, heat release performances, and gaseous pollutant emissions of the blends presented varying trends as the tobacco stem content increased. Taking into account the above three factors, a blending ratio of 0–20% tobacco stem content is conservatively proposed as the optimum range for blending. Copyright © 2012 Elsevier Ltd. All rights reserved.
Integrative analysis of environmental sequences using MEGAN4.
Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C
2011-09-01
A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.
What Can the Diffusion Model Tell Us About Prospective Memory?
Horn, Sebastian S.; Bayen, Ute J.; Smith, Rebekah E.
2011-01-01
Cognitive process models, such as Ratcliff’s (1978) diffusion model, are useful tools for examining cost- or interference effects in event-based prospective memory (PM). The diffusion model includes several parameters that provide insight into how and why ongoing-task performance may be affected by a PM task and is ideally suited to analyze performance because both reaction time and accuracy are taken into account. Separate analyses of these measures can easily yield misleading interpretations in cases of speed-accuracy tradeoffs. The diffusion model allows us to measure possible criterion shifts and is thus an important methodological improvement over standard analyses. Performance in an ongoing lexical decision task (Smith, 2003) was analyzed with the diffusion model. The results suggest that criterion shifts play an important role when a PM task is added, but do not fully explain the cost effect on RT. PMID:21443332
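The diffusion process underlying the model can be simulated with a simple Euler scheme: evidence accumulates noisily from a starting point toward one of two absorbing boundaries, jointly producing a response and a reaction time. A sketch with illustrative parameter values (not the paper's fitted estimates):

```python
import random

def simulate_diffusion(v, a, z, s=0.1, dt=0.001, seed=0):
    """Simulate one trial of the diffusion model: evidence starts at z,
    drifts at rate v with diffusion coefficient s, and is absorbed at
    0 (lower boundary) or a (upper boundary). Returns (boundary, RT)."""
    rng = random.Random(seed)
    x, t = z, 0.0
    sqrt_dt = dt ** 0.5
    while 0.0 < x < a:
        x += v * dt + s * sqrt_dt * rng.gauss(0.0, 1.0)
        t += dt
    return ("upper" if x >= a else "lower"), t
```

Because each trial yields both choice and latency from one process, fitting such a model separates boundary (criterion) shifts from drift-rate changes, which is exactly the distinction separate RT and accuracy analyses cannot make.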
Performance of Between-Study Heterogeneity Measures in the Cochrane Library.
Ma, Xiaoyue; Lin, Lifeng; Qu, Zhiyong; Zhu, Motao; Chu, Haitao
2018-05-29
The growth in comparative effectiveness research and evidence-based medicine has increased attention to systematic reviews and meta-analyses. Meta-analysis synthesizes and contrasts evidence from multiple independent studies to improve statistical efficiency and reduce bias. Assessing heterogeneity is critical for performing a meta-analysis and interpreting its results. As a widely used heterogeneity measure, the I² statistic quantifies the proportion of total variation across studies that is due to real differences in effect size. The presence of outlying studies can seriously exaggerate the I² statistic. Two alternative heterogeneity measures, the Ir and Im, have recently been proposed to reduce the impact of outlying studies. To evaluate these measures' performance empirically, we applied them to 20,599 meta-analyses in the Cochrane Library. We found that the Ir and Im have strong agreement with I², while they are more robust than I² when outlying studies appear.
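For reference, I² is computed from Cochran's Q statistic. A minimal sketch of the standard Higgins-Thompson definition (the robust alternatives evaluated in the study are not reproduced here):

```python
def i_squared(q, k):
    """Higgins-Thompson I²: share of total variation across k studies
    attributable to between-study heterogeneity, computed from
    Cochran's Q with k - 1 degrees of freedom. Truncated at 0."""
    if k < 2 or q <= 0:
        return 0.0
    return max(0.0, (q - (k - 1)) / q)
```

Because Q sums squared deviations of study effects from the pooled effect, a single outlying study can inflate Q, and hence I², substantially, which motivates the robust variants the abstract evaluates.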
Hahn, Sowon; Buttaccio, Daniel R; Hahn, Jungwon; Lee, Taehun
2015-01-01
The present study demonstrates that levels of extraversion and neuroticism can predict attentional performance during a change detection task. After completing a change detection task built on the flicker paradigm, participants were assessed for personality traits using the Revised Eysenck Personality Questionnaire (EPQ-R). Multiple regression analyses revealed that higher levels of extraversion predict increased change detection accuracies, while higher levels of neuroticism predict decreased change detection accuracies. In addition, neurotic individuals exhibited decreased sensitivity A' and increased fixation dwell times. Hierarchical regression analyses further revealed that eye movement measures mediate the relationship between neuroticism and change detection accuracies. Based on the current results, we propose that neuroticism is associated with decreased attentional control over the visual field, presumably due to decreased attentional disengagement. Extraversion can predict increased attentional performance, but the effect is smaller than the relationship between neuroticism and attention.
[New trends in the evaluation of mathematics learning disabilities. The role of metacognition].
Miranda-Casas, A; Acosta-Escareño, G; Tarraga-Minguez, R; Fernández, M I; Rosel-Remírez, J
2005-01-15
The current trends in the evaluation of mathematics learning disabilities (MLD), based on cognitive and empirical models, are oriented towards combining procedures involving the criteria and the evaluation of cognitive and metacognitive processes, associated to performance in mathematical tasks. The objective of this study is to analyse the metacognitive skills of prediction and evaluation in performing maths tasks and to compare metacognitive performance among pupils with MLD and younger pupils without MLD, who have the same level of mathematical performance. Likewise, we analyse these pupils' desire to learn. Subjects and methods. We compare a total of 44 pupils from the second cycle of primary education (8-10 years old) with and without mathematics learning disabilities. Significant differences are observed between pupils with and without mathematics learning disabilities in their capacity to predict and assess all of the tasks evaluated. As regards their 'desire to learn', no significant differences were found between pupils with and without MLD, which indicated that those with MLD assess their chances of successfully performing maths tasks in the same way as those without MLD. Finally, the findings reveal a similar metacognitive profile in pupils with MLD and the younger pupils with no mathematics learning disabilities. In future studies we consider it important to analyse the influence of the socio-affective belief system in the use of metacognitive skills.
A web-based endpoint adjudication system for interim analyses in clinical trials.
Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis
2009-02-01
A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues arose initially.
However, once these issues were resolved, the reviewers found the system user-friendly and easy to navigate. Web-based data adjudication depends upon expeditious data collection and verification. Further, ability to use web-based technologies, in addition to clinical expertise, must be considered in selecting EAC members. The automated nature of this system makes it a practical mechanism for ensuring timely endpoint adjudication. The authors believe a similar approach could be useful for handling endpoint adjudication for future clinical trials.
Müller, Dirk; Pulm, Jannis; Gandjour, Afschin
2012-01-01
To compare cost-effectiveness modeling analyses of strategies to prevent osteoporotic and osteopenic fractures either based on fixed thresholds using bone mineral density or based on variable thresholds including bone mineral density and clinical risk factors. A systematic review was performed by using the MEDLINE database and reference lists from previous reviews. On the basis of predefined inclusion/exclusion criteria, we identified relevant studies published since January 2006. Articles included for the review were assessed for their methodological quality and results. The literature search resulted in 24 analyses, 14 of them using a fixed-threshold approach and 10 using a variable-threshold approach. On average, 70% of the criteria for methodological quality were fulfilled, but almost half of the analyses did not include medication adherence in the base case. The results of variable-threshold strategies were more homogeneous and showed more favorable incremental cost-effectiveness ratios compared with those based on a fixed threshold with bone mineral density. For analyses with fixed thresholds, incremental cost-effectiveness ratios varied from €80,000 per quality-adjusted life-year in women aged 55 years to cost saving in women aged 80 years. For analyses with variable thresholds, ratios ranged from €47,000 per quality-adjusted life-year to cost saving. Risk assessment using variable thresholds appears to be more cost-effective than selecting high-risk individuals by fixed thresholds. Although the overall quality of the studies was fairly good, future economic analyses should further improve their methods, particularly in terms of including more fracture types, incorporating medication adherence, and including or discussing unrelated costs during added life-years. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Template based rotation: A method for functional connectivity analysis with a priori templates
Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.
2014-01-01
Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual-regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences.
When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based analyses. This flexibility owes to the reduced spatial and temporal orthogonality constraints of template based rotation as compared to dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630
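The core spatial-regression step shared by these template-driven approaches, projecting each data frame onto a set of a priori template maps to recover per-network time courses, can be sketched as follows. This is a generic least-squares projection, not the authors' exact template-based-rotation algorithm:

```python
def template_timecourses(data, templates):
    """Least-squares projection of each data frame onto a set of a
    priori spatial templates (the spatial-regression step that dual
    regression and related template-based approaches share).
    data: T frames x V voxels; templates: K maps x V voxels.
    Returns T x K network time courses. Assumes the K templates are
    linearly independent (no pivoting in the small K x K solve)."""
    K = len(templates)
    # Gram matrix G = S S^T (K x K)
    G = [[sum(a * b for a, b in zip(templates[i], templates[j]))
          for j in range(K)] for i in range(K)]
    # invert G by Gauss-Jordan elimination on [G | I]
    aug = [row[:] + [float(i == j) for j in range(K)]
           for i, row in enumerate(G)]
    for col in range(K):
        pivot = aug[col][col]
        aug[col] = [v / pivot for v in aug[col]]
        for r in range(K):
            if r != col:
                f = aug[r][col]
                aug[r] = [v - f * p for v, p in zip(aug[r], aug[col])]
    Ginv = [row[K:] for row in aug]
    # time course for each frame: Ginv @ (S @ frame)
    out = []
    for frame in data:
        proj = [sum(t * x for t, x in zip(templates[k], frame))
                for k in range(K)]
        out.append([sum(Ginv[i][k] * proj[k] for k in range(K))
                    for i in range(K)])
    return out
```

Relaxing the orthogonality constraints on these spatial and temporal fits is, per the abstract, what distinguishes template based rotation from dual regression.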
Minagawa, Hiroko; Yasui, Yoshihiro; Adachi, Hirokazu; Ito, Miyabi; Hirose, Emi; Nakamura, Noriko; Hata, Mami; Kobayashi, Shinichi; Yamashita, Teruo
2015-11-09
Japan was verified as having achieved measles elimination by the Measles Regional Verification Commission in the Western Pacific Region in March 2015. Verification of measles elimination implies the absence of continuous endemic transmission. After the last epidemic in 2007 with an estimated 18,000 cases, Japan introduced nationwide case-based measles surveillance in January 2008. Laboratory diagnosis for all suspected measles cases is essentially required by law, and virus detection tests are mostly performed by municipal public health institutes. Despite relatively high vaccination coverage and vigorous response to every case by the local health center staff, outbreaks of measles are repeatedly observed in Aichi Prefecture, Japan. Measles virus N and H gene detection by nested double RT-PCR was performed with all specimens collected from suspected cases and transferred to our institute. Genotyping and further molecular epidemiological analyses were performed with the direct nucleotide sequence data of appropriate PCR products. Between 2010 and 2014, specimens from 389 patients suspected for measles were tested in our institute. Genotypes D9, D8, H1 and B3 were detected. Further molecular epidemiological analyses were helpful to establish links between patients, and sometimes useful to discriminate one outbreak from another. All virus-positive cases, including 49 cases involved in three outbreaks without any obvious epidemiological link with importation, were considered as import-related based on the nucleotide sequence information. The chain of transmission in the latest outbreak in 2014 terminated after the third generation, much earlier than in the 2010-11 outbreak (six generations). Since 2010, almost all measles cases reported in Aichi Prefecture are either import or import-related, based primarily on genotypes and nucleotide sequences of measles virus detected.
In addition, genotyping and molecular epidemiological analyses are indispensable to prove the interruption of endemic transmission when the importations of measles are repeatedly observed. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. 
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
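The global-then-local coupling described above (Particle Swarm followed by Levenberg-Marquardt refinement) can be illustrated in miniature. The sketch below is hypothetical and not drawn from the MADS source code; to keep it self-contained, the local Levenberg-Marquardt stage is replaced by a simple shrinking-step coordinate search, so it shows only the global-then-local pattern, not the actual algorithmics.

```python
import random

def pso_then_polish(f, bounds, n_particles=20, iters=60, seed=0):
    """Global Particle Swarm search followed by a local refinement pass.

    Illustrative stand-in for the PSO + Levenberg-Marquardt coupling:
    the local stage here is shrinking-step coordinate descent, not a
    true Levenberg-Marquardt implementation.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Standard inertia + cognitive + social velocity update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    # Local polish: try +/- steps along each coordinate, halving the step
    # whenever no move improves the objective.
    step = 0.1
    while step > 1e-7:
        improved = False
        for d in range(dim):
            for delta in (step, -step):
                cand = gbest[:]
                cand[d] += delta
                fc = f(cand)
                if fc < gbest_f:
                    gbest, gbest_f, improved = cand, fc, True
        if not improved:
            step *= 0.5
    return gbest, gbest_f
```

The global stage escapes local minima while the cheap local stage sharpens the final estimate, which is the division of labour the MADS description attributes to the coupled optimizers.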
Effects of personality traits on collaborative performance in problem-based learning tutorials
Jang, Hye Won; Park, Seung Won
2016-01-01
Objectives: To examine the relationship between students’ collaborative performance in a problem-based learning (PBL) environment and their personality traits. Methods: This retrospective, cross-sectional study was conducted using student data of a PBL program between 2013 and 2014 at Sungkyunkwan University School of Medicine, Seoul, South Korea. Eighty students were included in the study. Student data from the Temperament and Character Inventory were used as a measure of their personality traits. Peer evaluation scores during PBL were used as a measure of students’ collaborative performance. Results: Simple regression analyses indicated that participation was negatively related to harm avoidance and positively related to persistence, whereas preparedness for the group work was negatively related to reward dependence. On multiple regression analyses, low reward dependence remained a significant predictor of preparedness. Grade-point average (GPA) was negatively associated with novelty seeking and cooperativeness and was positively associated with persistence. Conclusion: Medical students who are less dependent on social reward are more likely to complete assigned independent work to prepare for the PBL tutorials. The findings of this study can help educators better understand and support medical students who are at risk of struggling in collaborative learning environments. PMID:27874153
Li, Po-E; Lo, Chien-Chi; Anderson, Joseph J; Davenport, Karen W; Bishop-Lilly, Kimberly A; Xu, Yan; Ahmed, Sanaa; Feng, Shihai; Mokashi, Vishwesh P; Chain, Patrick S G
2017-01-09
Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. This bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Prognostic value of inflammation-based scores in patients with osteosarcoma
Liu, Bangjian; Huang, Yujing; Sun, Yuanjue; Zhang, Jianjun; Yao, Yang; Shen, Zan; Xiang, Dongxi; He, Aina
2016-01-01
Systemic inflammation responses have been associated with cancer development and progression. C-reactive protein (CRP), Glasgow prognostic score (GPS), neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR), lymphocyte-monocyte ratio (LMR), and neutrophil-platelet score (NPS) have been shown to be independent risk factors in various types of malignant tumors. This retrospective analysis of 162 osteosarcoma cases was performed to estimate their predictive value of survival in osteosarcoma. All statistical analyses were performed by SPSS statistical software. Receiver operating characteristic (ROC) analysis was generated to set optimal thresholds; area under the curve (AUC) was used to show the discriminatory abilities of inflammation-based scores; Kaplan-Meier analysis was performed to plot the survival curve; cox regression models were employed to determine the independent prognostic factors. The optimal cut-off points of NLR, PLR, and LMR were 2.57, 123.5 and 4.73, respectively. GPS and NLR had a markedly larger AUC than CRP, PLR and LMR. High levels of CRP, GPS, NLR, PLR, and low level of LMR were significantly associated with adverse prognosis (P < 0.05). Multivariate Cox regression analyses revealed that GPS, NLR, and occurrence of metastasis were top risk factors associated with death of osteosarcoma patients. PMID:28008988
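As an illustration of the ROC workflow described above (optimal thresholds plus AUC), the following minimal Python sketch computes a rank-based AUC and a Youden-index cut-off. It is a generic sketch, not the SPSS procedure the authors used.

```python
def roc_auc_and_cutoff(scores, labels):
    """Rank-based AUC plus the Youden-index optimal cut-off.

    A minimal illustration of the ROC procedure described in the
    abstract; it is not tied to any particular statistical package.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # AUC as the probability a positive outranks a negative (ties count 0.5)
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    # Youden index J = sensitivity + specificity - 1, maximised over cut-offs
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        sens = sum(p >= cut for p in pos) / len(pos)
        spec = sum(n < cut for n in neg) / len(neg)
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, cut
    return auc, best_cut
```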
Is globalization really good for public health?
Tausch, Arno
2016-10-01
In the light of recent very prominent studies, especially that of Mukherjee and Krieckhaus, one might initially be tempted to assume that globalization is nowadays a driver of good public health performance across the entire world system. Most of these studies use time series analyses based on the KOF Index of Globalization. We attempt to re-analyze the entire question, using a variety of methodological approaches and data. Our re-analysis shows that neoliberal globalization has resulted in very important implosions of public health development in various regions of the world and in increasing inequality within the countries of the world system, which in turn negatively affects health performance. We use standard IBM SPSS ordinary least squares (OLS) regressions, time series and cross-correlation analyses based on aggregate, freely available data. Different components of the KOF Index, most notably actual capital inflows, affect public health negatively. The "decomposition" of the available data suggests that for most of the last four decades, globalization inflows even implied an aggregate deterioration of public health, quite in line with globalization-critical studies. We introduce the effects of inequality on public health, widely debated in global public health research. Our annual time series for 99 countries show that globalization indeed leads to increased inequality, and this, in turn, leads to deteriorating public health performance. In only 19 of the surveyed 99 nations with complete data (i.e., 19.2%), globalization actually preceded an improvement in public health performance. Far from falsifying globalization-critical research, our analyses show the basic weaknesses of the new "pro-globalization" literature in the public health profession. Copyright © 2015 John Wiley & Sons, Ltd.
Davenport, M.S.
1993-01-01
Water and bottom-sediment samples were collected at 26 sites in the 65-square-mile High Point Lake watershed area of Guilford County, North Carolina, from December 1988 through December 1989. Sampling locations included 10 stream sites, 8 lake sites, and 8 ground-water sites. Generally, six steady-flow samples were collected at each stream site and three storm samples were collected at five sites. Four lake samples and eight ground-water samples also were collected. Chemical analyses of stream and lake sediments and particle-size analyses of lake sediments were performed once during the study. Most stream and lake samples were analyzed for field characteristics, nutrients, major ions, trace elements, total organic carbon, and chemical-oxygen demand. Analyses were performed to detect concentrations of 149 selected organic compounds, including acid and base/neutral extractable and volatile constituents and carbamate, chlorophenoxy acid, triazine, organochlorine, and organophosphorus pesticides and herbicides. Selected lake samples were analyzed for all constituents listed in the Safe Drinking Water Act of 1986, including Giardia, Legionella, radiochemicals, asbestos, and viruses. Various chromatograms from organic analyses were submitted to computerized library searches. The results of these and all other analyses presented in this report are in tabular form.
An approach to checking case-crossover analyses based on equivalence with time-series methods.
Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L
2008-03-01
The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
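The time-stratified referent selection underlying such case-crossover analyses pairs each event day with control days in the same stratum, commonly the same calendar month and weekday. The sketch below is a generic illustration of that scheme, not code from the study.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    """Referent days for a time-stratified case-crossover design:
    all days in the same calendar month that share the event day's
    weekday (the event day itself is excluded).
    """
    refs = []
    d = date(event_day.year, event_day.month, 1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs
```

Because the strata are fixed in advance, this scheme avoids the overlap bias of symmetric referent windows, which is part of what makes the time-stratified estimator equivalent to a time-series log-linear model with month-by-weekday indicators.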
In vitro-in vivo correlation for nevirapine extended release tablets.
Macha, Sreeraj; Yong, Chan-Loi; Darrington, Todd; Davis, Mark S; MacGregor, Thomas R; Castles, Mark; Krill, Steven L
2009-12-01
An in vitro-in vivo correlation (IVIVC) for four nevirapine extended release tablets with varying polymer contents was developed. The pharmacokinetics of extended release formulations were assessed in a parallel group study with healthy volunteers and compared with corresponding in vitro dissolution data obtained using a USP apparatus type 1. In vitro samples were analysed using HPLC with UV detection and in vivo samples were analysed using a HPLC-MS/MS assay; the IVIVC analyses comparing the two results were performed using WinNonlin. A Double Weibull model optimally fits the in vitro data. A unit impulse response (UIR) was assessed using the fastest ER formulation as a reference. The deconvolution of the in vivo concentration time data was performed using the UIR to estimate an in vivo drug release profile. A linear model with a time-scaling factor clarified the relationship between in vitro and in vivo data. The predictability of the final model was consistent based on internal validation. Average percent prediction errors for pharmacokinetic parameters were <10% and individual values for all formulations were <15%. Therefore, a Level A IVIVC was developed and validated for nevirapine extended release formulations providing robust predictions of in vivo profiles based on in vitro dissolution profiles. Copyright 2009 John Wiley & Sons, Ltd.
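The double Weibull dissolution model mentioned above has a simple closed form: a weighted sum of two Weibull release phases. The sketch below uses illustrative parameter names (fraction of the first phase, per-phase time scales and shape factors); it is not the WinNonlin parameterization used by the authors.

```python
import math

def double_weibull(t, f1, td1, b1, td2, b2):
    """Cumulative fraction released at time t under a double Weibull
    model: f1 weights the first (typically faster) phase, and each
    phase has its own time scale (td) and shape factor (b).
    """
    w1 = 1.0 - math.exp(-((t / td1) ** b1))
    w2 = 1.0 - math.exp(-((t / td2) ** b2))
    return f1 * w1 + (1.0 - f1) * w2
```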
Interim analyses in 2 x 2 crossover trials.
Cook, R J
1995-09-01
A method is presented for performing interim analyses in long term 2 x 2 crossover trials with serial patient entry. The analyses are based on a linear statistic that combines data from individuals observed for one treatment period with data from individuals observed for both periods. The coefficients in this linear combination can be chosen quite arbitrarily, but we focus on variance-based weights to maximize power for tests regarding direct treatment effects. The type I error rate of this procedure is controlled by utilizing the joint distribution of the linear statistics over analysis stages. Methods for performing power and sample size calculations are indicated. A two-stage sequential design involving simultaneous patient entry and a single between-period interim analysis is considered in detail. The power and average number of measurements required for this design are compared to those of the usual crossover trial. The results indicate that, while there is minimal loss in power relative to the usual crossover design in the absence of differential carry-over effects, the proposed design can have substantially greater power when differential carry-over effects are present. The two-stage crossover design can also lead to more economical studies in terms of the expected number of measurements required, due to the potential for early stopping. Attention is directed toward normally distributed responses.
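The variance-based weighting of one-period and two-period data described above amounts to an inverse-variance weighted combination of the two estimates, which maximises the precision of the pooled statistic. A generic sketch (not the paper's exact interim statistic):

```python
def combine_estimates(est1, var1, est2, var2):
    """Inverse-variance weighted combination of two independent
    estimates of the same treatment effect, e.g. one from subjects
    observed for one period and one from subjects observed for both.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    combined = (w1 * est1 + w2 * est2) / (w1 + w2)
    combined_var = 1.0 / (w1 + w2)  # always <= min(var1, var2)
    return combined, combined_var
```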
Abstract: The biological removal of ammonia in conventional wastewater treatment plants (WWTPs) is performed by promoting nitrification, which transforms ammonia into nitrate, which in turn is converted into nitrogen gas by denitrifying bacteria. The first step in nitrification, ...
Using temperature sweeps to investigate rheology of bioplastics
USDA-ARS?s Scientific Manuscript database
As part of research toward production of protein-based bioplastics, small amplitude oscillatory shear analyses were performed in the temperature sweep mode to examine protein blends in the presence of wheat flour and glycerol. The elastic modulus (G') of these samples was much higher than the visco...
Assessment and evaluations of I-80 truck loads and their load effects : final report.
DOT National Transportation Integrated Search
2016-12-01
The research objective is to examine the safety of Wyoming bridges on the I-80 corridor considering the actual truck traffic on the interstate based upon weigh in motion (WIM) data. This was accomplished by performing statistical analyses to determ...
Microbiological Impact on Carbon Capture and Sequestration: Biotic Processes in Natural CO2 Analogue
Multiple ground-water based microbial community analyses including membrane lipids assays for phospholipid fatty acid and DNA analysis were performed from hydraulically isolated zones. DGGE results from DNA extracts from vertical profiling of the entire depth of aquifer sampled a...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
Optimal planning and design of a renewable energy based supply system for microgrids
Hafez, Omar; Bhattacharya, Kankar
2012-03-03
This paper presents a technique for optimal planning and design of hybrid renewable energy systems for microgrid applications. The Distributed Energy Resources Customer Adoption Model (DER-CAM) is used to determine the optimal size and type of distributed energy resources (DERs) and their operating schedules for a sample utility distribution system. Using the DER-CAM results, an evaluation is performed of the electrical performance of the distribution circuit when the DERs selected by the DER-CAM optimization analyses are incorporated. Results of analyses regarding the economic benefits of utilizing the optimal locations identified for the selected DERs within the system are also presented. The actual Brookhaven National Laboratory (BNL) campus electrical network is used as an example to show the effectiveness of this approach. The results show that these technical and economic analyses of hybrid renewable energy systems are essential for the efficient utilization of renewable energy resources for microgrid applications.
Schilbach, Leonhard; Bzdok, Danilo; Timmermans, Bert; Fox, Peter T.; Laird, Angela R.; Vogeley, Kai; Eickhoff, Simon B.
2012-01-01
Previous research suggests overlap between brain regions that show task-induced deactivations and those activated during the performance of social-cognitive tasks. Here, we present results of quantitative meta-analyses of neuroimaging studies, which confirm a statistical convergence in the neural correlates of social and resting state cognition. Based on the idea that both social and unconstrained cognition might be characterized by introspective processes, which are also thought to be highly relevant for emotional experiences, a third meta-analysis was performed investigating studies on emotional processing. By using conjunction analyses across all three sets of studies, we can demonstrate significant overlap of task-related signal change in dorso-medial prefrontal and medial parietal cortex, brain regions that have, indeed, recently been linked to introspective abilities. Our findings, therefore, provide evidence for the existence of a core neural network, which shows task-related signal change during socio-emotional tasks and during resting states. PMID:22319593
An Overview of the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model Program
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Christhilf, David M.; Coulson, David A.
2012-01-01
A summary of computational and experimental aeroelastic (AE) and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple AE and ASE wind-tunnel tests of the S4T wind-tunnel model have been performed in support of the ASE element in the Supersonics Program, part of the NASA Fundamental Aeronautics Program. This paper is intended to be an overview of multiple papers that comprise a special S4T technical session. Along those lines, a brief description of the design and hardware of the S4T wind-tunnel model will be presented. Computational results presented include linear and nonlinear aeroelastic analyses, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). A brief survey of some of the experimental results from two open-loop and two closed-loop wind-tunnel tests performed at the NASA Langley Transonic Dynamics Tunnel (TDT) will be presented as well.
Bi, Dongbin; Ning, Hao; Liu, Shuai; Que, Xinxiang; Ding, Kejia
2015-06-01
To explore molecular mechanisms of bladder cancer (BC), network strategy was used to find biomarkers for early detection and diagnosis. The differentially expressed genes (DEGs) between bladder carcinoma patients and normal subjects were screened using the empirical Bayes method of the linear models for microarray data package. Co-expression networks were constructed by differentially co-expressed genes and links. The regulatory impact factors (RIF) metric was used to identify critical transcription factors (TFs). The protein-protein interaction (PPI) networks were constructed by the Search Tool for the Retrieval of Interacting Genes/Proteins (STRING) and clusters were obtained through the molecular complex detection (MCODE) algorithm. Centrality analyses for complex networks were performed based on degree, stress and betweenness. Enrichment analyses were performed based on Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. Co-expression networks and TFs (based on expression data of global DEGs and DEGs in different stages and grades) were identified. Hub genes of complex networks, such as UBE2C, ACTA2, FABP4, CKS2, FN1 and TOP2A, were also obtained according to analysis of degree. In gene enrichment analyses of global DEGs, cell adhesion, proteinaceous extracellular matrix and extracellular matrix structural constituent were the top three GO terms. ECM-receptor interaction, focal adhesion, and cell cycle were significant pathways. Our results provide some potential underlying biomarkers of BC. However, further validation is required and in-depth studies are needed to elucidate the pathogenesis of BC. Copyright © 2015 Elsevier Ltd. All rights reserved.
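Degree-based hub identification, as used above to flag genes such as UBE2C and FN1, reduces to counting the edges incident on each node of the network. A minimal generic sketch (illustrative gene names only):

```python
def degree_centrality(edges):
    """Degree of each node in an undirected PPI-style network given as
    a list of (node, node) edges; the highest-degree nodes are the
    hub candidates.
    """
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg
```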
Zero adjusted models with applications to analysing helminths count data.
Chipeta, Michael G; Ngwira, Bagrey M; Simoonga, Christopher; Kazembe, Lawrence N
2014-11-27
It is common in public health and epidemiology that the outcome of interest is counts of event occurrences. Analysing these data using classical linear models is mostly inappropriate, even after transformation of outcome variables, due to overdispersion. Zero-adjusted mixture count models such as zero-inflated and hurdle count models are applied to count data when overdispersion and excess zeros exist. The main objective of the current paper is to apply such models to analyse risk factors associated with human helminths (S. haematobium), particularly in a case where there is a high proportion of zero counts. The data were collected during a community-based randomised control trial assessing the impact of mass drug administration (MDA) with praziquantel in Malawi, and a school-based cross-sectional epidemiology survey in Zambia. Count data models including traditional (Poisson and negative binomial) models, zero modified models (zero inflated Poisson and zero inflated negative binomial) and hurdle models (Poisson logit hurdle and negative binomial logit hurdle) were fitted and compared. Using the Akaike information criterion (AIC), the negative binomial logit hurdle (NBLH) and zero inflated negative binomial (ZINB) showed the best performance in both datasets. With regard to zero count capturing, these models performed better than the other models. This paper showed that the zero modified NBLH and ZINB models are more appropriate methods for the analysis of data with excess zeros. The choice between the hurdle and zero-inflated models should be based on the aim and endpoints of the study.
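The zero-inflated Poisson mixture compared above has a simple probability mass function, and AIC-based model comparison is a one-line formula. The following is a generic sketch of both, not the authors' fitting code:

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson probability mass: with probability pi the
    count is a structural zero, otherwise it is drawn from Poisson(lam).
    """
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

def aic(loglik, n_params):
    """Akaike information criterion: lower values indicate a better
    trade-off between fit and model complexity."""
    return 2 * n_params - 2 * loglik
```

A hurdle model differs in that all zeros come from the binary (logit) part and the count part is truncated at zero, which is why the two families can fit excess-zero data differently.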
SEASAT A satellite scatterometer
NASA Technical Reports Server (NTRS)
Bianchi, R.; Heath, A.; Marsh, S.; Borusiewicz, J.
1978-01-01
The analyses performed in the early period of the program, which formed the basis of the sensor design, are reviewed, along with the sensor design itself. The test program is outlined, listing all tests performed and the (simulated) environmental exposure for each, as applicable. Ground support equipment designed and built for assembly, integration, and field testing is described. The software developed during the program and the algorithms/flow diagrams that formed the bases for the software are summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stillman, J. A.; Feldman, E. E.; Wilson, E. H.
This report contains the results of reactor accident analyses for the University of Missouri Research Reactor (MURR). The calculations were performed as part of the conversion from the use of highly-enriched uranium (HEU) fuel to the use of low-enriched uranium (LEU) fuel. The analyses were performed by staff members of the Global Threat Reduction Initiative (GTRI) Reactor Conversion Program at the Argonne National Laboratory (ANL), the MURR Facility, and the Nuclear Engineering Program – College of Engineering, University of Missouri-Columbia. The core conversion to LEU is being performed with financial support from the U.S. government. In the framework of non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context most research and test reactors, both domestic and international, have started a program of conversion to the use of LEU fuel. A new type of LEU fuel based on an alloy of uranium and molybdenum (U-Mo) is expected to allow the conversion of U.S. domestic high performance reactors like MURR.
This report presents the results of a study of core behavior under a set of accident conditions for MURR cores fueled with HEU U-Alx dispersion fuel or LEU monolithic U-Mo alloy fuel with 10 wt% Mo (U-10Mo).
NASA Astrophysics Data System (ADS)
Ficchì, Andrea; Perrin, Charles; Andréassian, Vazken
2016-07-01
Hydro-climatic data at short time steps are considered essential to model the rainfall-runoff relationship, especially for short-duration hydrological events, typically flash floods. Also, using fine time step information may be beneficial when using or analysing model outputs at larger aggregated time scales. However, the actual gain in prediction efficiency using short time-step data is not well understood or quantified. In this paper, we investigate the extent to which the performance of hydrological modelling is improved by short time-step data, using a large set of 240 French catchments, for which 2400 flood events were selected. Six-minute rain gauge data were available and the GR4 rainfall-runoff model was run with precipitation inputs at eight different time steps ranging from 6 min to 1 day. Then model outputs were aggregated at seven different reference time scales ranging from sub-hourly to daily for a comparative evaluation of simulations at different target time steps. Three classes of model performance behaviour were found for the 240 test catchments: (i) significant improvement of performance with shorter time steps; (ii) performance insensitivity to the modelling time step; (iii) performance degradation as the time step becomes shorter. The differences between these groups were analysed based on a number of catchment and event characteristics. A statistical test highlighted the most influential explanatory variables for model performance evolution at different time steps, including flow auto-correlation, flood and storm duration, flood hydrograph peakedness, rainfall-runoff lag time and precipitation temporal variability.
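The multi-scale evaluation described above requires aggregating fine-time-step model outputs to coarser target steps before comparison. A minimal sketch using block means (e.g. a factor of 240 takes 6-minute values to daily means); illustrative only, not the study's code:

```python
def aggregate(series, factor):
    """Aggregate a fine-time-step series to a coarser step by averaging
    consecutive non-overlapping blocks of `factor` values.
    Any incomplete trailing block is dropped.
    """
    return [sum(series[i:i + factor]) / factor
            for i in range(0, len(series) - factor + 1, factor)]
```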
Exploratory Decision-Making as a Function of Lifelong Experience, Not Cognitive Decline
2016-01-01
Older adults perform worse than younger adults in some complex decision-making scenarios, which is commonly attributed to age-related declines in striatal and frontostriatal processing. Recently, this popular account has been challenged by work that considered how older adults’ performance may differ as a function of greater knowledge and experience, and by work showing that, in some cases, older adults outperform younger adults in complex decision-making tasks. In light of this controversy, we examined the performance of older and younger adults in an exploratory choice task that is amenable to model-based analyses and ostensibly not reliant on prior knowledge. Exploration is a critical aspect of decision-making poorly understood across the life span. Across 2 experiments, we addressed (a) how older and younger adults differ in exploratory choice and (b) to what extent observed differences reflect processing capacity declines. Model-based analyses suggested that the strategies used by the 2 groups were qualitatively different, resulting in relatively worse performance for older adults in 1 decision-making environment but equal performance in another. Little evidence was found that differences in processing capacity drove performance differences. Rather the results suggested that older adults’ performance might result from applying a strategy that may have been shaped by their wealth of real-world decision-making experience. While this strategy is likely to be effective in the real world, it is ill suited to some decision environments. These results underscore the importance of taking into account effects of experience in aging studies, even for tasks that do not obviously tap past experiences. PMID:26726916
Genius begins at home: Shared social identity enhances the recognition of creative performance.
Steffens, Niklas K; Haslam, S Alexander; Ryan, Michelle K; Millard, Kathryn
2017-11-01
The present research examines the extent to which the recognition of creative performance is structured by social group membership. It does this by analysing the award of merit prizes for Best Actor and Actress in a Leading Role for the international award of US-based Oscars and British-based BAFTAs since BAFTA's inception of this category in 1968. For both awards, the exclusive assessment criterion is the quality of artists' performance in the international arena. Results show that US artists won a greater proportion of Oscars than BAFTAs (odds ratio: 2.10), whereas British artists won a greater proportion of BAFTAs than Oscars (OR: 2.26). Furthermore, results support the hypothesis that these patterns are more pronounced as the diagnostic value of a quality indicator increases - that is, in the conferring of actual awards rather than nominations. Specifically, US artists won a greater proportion of Oscar awards than nominations (OR: 1.77), while British artists won a greater proportion of BAFTA awards than nominations (OR: 1.62). Additional analyses show that the performances of in-group actors in movies portraying in-group culture (US culture in the case of Oscars, British culture in the case of BAFTAs) are more likely to be recognized than the performances of in-group actors in movies portraying the culture of other (out-)groups. These are the first data to provide clear evidence from the field that the recognition of exceptional creative performance is enhanced by shared social identity between perceivers and performers. © 2017 The British Psychological Society.
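The odds ratios reported above come from 2×2 win/loss tables. A small sketch of the calculation; the counts below are illustrative, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
               outcome+   outcome-
    group 1       a          b
    group 2       c          d
    """
    return (a / b) / (c / d)

# Illustrative counts only (NOT the study's data): hypothetical US wins and
# non-wins at the Oscars (group 1) versus the BAFTAs (group 2).
or_example = odds_ratio(60, 40, 42, 58)
print(round(or_example, 2))  # prints 2.07
```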
Graf, Joachim; Smolka, Robert; Simoes, Elisabeth; Zipfel, Stephan; Junne, Florian; Holderried, Friederike; Wosnik, Annette; Doherty, Anne M; Menzel, Karina; Herrmann-Werner, Anne
2017-05-02
Communication skills are essential in a patient-centred health service and therefore in medical teaching. Although significant differences in the communication behaviour of male and female students are known, gender differences in student performance remain under-reported. The aim of this study was to analyse gender differences in the communication skills of medical students in the context of an OSCE (Objective Structured Clinical Examination). In a longitudinal trend study based on seven semester-cohorts, we analysed whether there are gender differences in medical students' communication skills. The students (self-perception) and standardized patients (SPs) (external perception) were asked to rate the communication skills using uniform questionnaires. Statistical analysis was performed using frequency analyses and t-tests in SPSS 21. Across all ratings in both the self- and the external perception, there was a significant gender difference: female students performed better in the dimensions of empathy, structure, verbal expression and non-verbal expression. The results of male students deteriorated across all dimensions in the external perception between 2011 and 2014. It is important to consider whether gender-specific teaching should be developed in light of the reported differences between female and male students.
Analyses and forecasts with LAWS winds
NASA Technical Reports Server (NTRS)
Wang, Muyin; Paegle, Jan
1994-01-01
Horizontal fluxes of atmospheric water vapor are studied for summer months during 1989 and 1992 over North and South America based on analyses from the European Centre for Medium-Range Weather Forecasts, the US National Meteorological Center, and the United Kingdom Meteorological Office. The calculations are performed over 20 deg by 20 deg box-shaped midlatitude domains located to the east of the Rocky Mountains in North America, and to the east of the Andes Mountains in South America. The fluxes are determined from operational center gridded analyses of wind and moisture. Differences in the monthly mean moisture flux divergence determined from these analyses are as large as 7 cm/month precipitable water equivalent over South America, and 3 cm/month over North America. Gridded analyses at higher spatial and temporal resolution exhibit better agreement in the moisture budget study. However, significant discrepancies in the moisture flux divergence computed from different gridded analyses still exist. The conclusion is more pessimistic than Rasmusson's estimate based on station data. Further analysis reveals that the most significant sources of error result from model surface elevation fields, gaps in the data archive, and uncertainties in the wind and specific humidity analyses. Uncertainties in the wind analyses are the most important problem. The low-level jets, in particular, are substantially different in the different data archives. Part of the reason for this may be the way the different analysis models parameterized physical processes affecting low-level jets. The results support the inference that the noise/signal ratio of the moisture budget may be improved more rapidly by providing better wind observations and analyses than by providing better moisture data.
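The moisture budget described above rests on the divergence of the vertically integrated moisture flux. A hedged numpy sketch with synthetic flux fields; the grid spacing and flux magnitudes are illustrative assumptions, not values from these analyses:

```python
import numpy as np

# Hypothetical vertically integrated moisture fluxes Qx, Qy (kg m^-1 s^-1)
# on a uniform grid with spacing dx, dy (m); values are illustrative only.
ny, nx = 20, 20
dx = dy = 2.0e5  # grid spacing in metres (roughly 2 deg at midlatitudes)
x = np.arange(nx) * dx
y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y)
Qx = 100.0 * np.sin(2 * np.pi * X / x[-1])
Qy = 50.0 * np.cos(2 * np.pi * Y / y[-1])

# div(Q) = dQx/dx + dQy/dy, via central differences in the interior.
div_q = np.gradient(Qx, dx, axis=1) + np.gradient(Qy, dy, axis=0)

# Area-mean divergence over the box (kg m^-2 s^-1); 1 kg/m^2 of water is 1 mm
# depth, so converting gives cm/month of precipitable water equivalent
# (positive divergence = net drying of the box).
mean_div = div_q.mean()
cm_per_month = mean_div * 30 * 86400 / 1000.0 * 100.0
```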
Performance analyses of Schottky diodes with Au/Pd contacts on n-ZnO thin films as UV detectors
NASA Astrophysics Data System (ADS)
Varma, Tarun; Periasamy, C.; Boolchandani, Dharmendar
2017-12-01
In this paper, we report the fabrication and performance analyses of UV detectors based on ZnO thin film Schottky diodes with Au and Pd contacts. The RF magnetron sputtering technique was used to deposit the nano-crystalline ZnO thin film at room temperature. Characterization techniques such as XRD, AFM and SEM provided valuable information on the micro-structural and optical properties of the thin film. The results show that the prepared thin film has good crystalline orientation and minimal surface roughness, with an optical bandgap of 3.1 eV. I-V and C-V characteristics were evaluated, indicating non-linear behaviour of the diodes with rectification ratios (I_F/I_R) of 19 and 427 at ±4 V for the Au/ZnO and Pd/ZnO Schottky diodes, respectively. When exposed to UV light of 365 nm wavelength at an applied bias of -2 V, the fabricated Schottky diodes exhibited responsivities of 10.16 and 22.7 A/W for the Au and Pd Schottky contacts, respectively. The Pd-based Schottky photo-detectors exhibited better performance, with superior values of detectivity and photoconductive gain of 1.95 × 10^10 cm Hz^0.5/W and 77.18, over those obtained for the Au-based detectors, which were 1.23 × 10^10 cm Hz^0.5/W and 34.5, respectively.
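The figures of merit quoted above follow from standard photodetector relations: responsivity R = I_photo/P_opt, photoconductive gain G = R·hc/(qλ), and shot-noise-limited specific detectivity D* = R·√A/√(2qI_dark). A sketch under those textbook formulas; the photocurrent, optical power, area, and dark current below are illustrative assumptions, not the paper's measured device parameters:

```python
import math

q = 1.602e-19      # electron charge, C
h = 6.626e-34      # Planck constant, J s
c = 3.0e8          # speed of light, m/s

def responsivity(i_photo, p_opt):
    """Responsivity R = I_photo / P_optical, in A/W."""
    return i_photo / p_opt

def photoconductive_gain(resp, wavelength):
    """G = R * h * c / (q * lambda): electrons collected per incident photon."""
    return resp * h * c / (q * wavelength)

def detectivity(resp, area_cm2, i_dark):
    """Shot-noise-limited D* = R * sqrt(A) / sqrt(2 q I_dark), in cm Hz^0.5/W."""
    return resp * math.sqrt(area_cm2) / math.sqrt(2 * q * i_dark)

# Illustrative numbers only:
R = responsivity(2.27e-5, 1.0e-6)                      # 22.7 A/W
G = photoconductive_gain(R, 365e-9)
D_star = detectivity(R, area_cm2=1e-4, i_dark=1e-9)    # hypothetical A, I_dark
```

With R ≈ 22.7 A/W at 365 nm, the gain evaluates to about 77, in line with the value reported above for the Pd contact.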
NASA Astrophysics Data System (ADS)
Powell, P. E.
Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when taught using inquiry-based instruction. Subpopulation analyses indicated all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.
NASA Astrophysics Data System (ADS)
Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.
2008-10-01
Multi-period differences in the technical and financial performances of five North African railways are analysed over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary detailed information: Malmquist discriminates between technological and management progress, while PROMETHEE captures two often conflicting dimensions of performance, namely service to the community and enterprise performance.
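The Malmquist TFP index and its decomposition can be written down directly once the four DEA output-distance functions are known. A sketch assuming hypothetical distance values; in practice each value comes from solving a DEA linear program over the observed inputs and outputs:

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist TFP index between periods t and t+1, from four output
    distance functions D^period(observation period): M = EC * TC, where

    EC (efficiency change)    = D^{t+1}(t+1) / D^t(t)
    TC (technological change) = sqrt( [D^t(t+1)/D^{t+1}(t+1)] * [D^t(t)/D^{t+1}(t)] )
    """
    ec = d_t1_t1 / d_t_t
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return ec * tc, ec, tc

# Hypothetical distance values for one railway; M > 1 indicates
# productivity growth between the two periods.
m, ec, tc = malmquist(0.8, 1.0, 0.75, 0.9)
```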
The role of musical training in emergent and event-based timing.
Baer, L H; Thibodeau, J L N; Gralnick, T M; Li, K Z H; Penhune, V B
2013-01-01
Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.
Olexa, Edward M.; Lawrence, Rick L
2014-01-01
Federal land management agencies provide stewardship over much of the rangelands in the arid and semi-arid western United States, but they often lack data of the proper spatiotemporal resolution and extent needed to assess range conditions and monitor trends. Recent advances in the blending of complementary, remotely sensed data could provide public lands managers with the needed information. We applied the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) to five Landsat TM and concurrent Terra MODIS scenes, and used pixel-based regression and difference image analyses to evaluate the quality of synthetic reflectance and NDVI products associated with semi-arid rangeland. Predicted red reflectance data consistently demonstrated higher accuracy, less bias, and stronger correlation with observed data than did analogous near-infrared (NIR) data. The accuracy of both bands tended to decline as the lag between base and prediction dates increased; however, mean absolute errors (MAE) were typically ≤10%. The quality of area-wide NDVI estimates was less consistent than either spectral band, although the MAE of estimates predicted using early season base pairs was ≤10% throughout the growing season. Correlation between known and predicted NDVI values and agreement with the 1:1 regression line tended to decline as the prediction lag increased. Further analyses of NDVI predictions, based on a 22 June base pair and stratified by land cover/land use (LCLU), revealed accurate estimates through the growing season; however, inter-class performance varied. This work demonstrates the successful application of the STARFM algorithm to semi-arid rangeland; however, we encourage evaluation of STARFM’s performance on a per-product basis, stratified by LCLU, with attention given to the influence of base pair selection and the impact of the time lag.
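The evaluation metrics used here, MAE as a percentage of the observed mean, pixel-based regression against the 1:1 line, and correlation, can be sketched on synthetic reflectance data (the values below are simulated, not STARFM output):

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.uniform(0.05, 0.4, size=10000)            # synthetic reflectance
predicted = observed + rng.normal(0, 0.02, size=10000)   # synthetic prediction

# Mean absolute error, expressed as a percentage of the observed mean.
mae = np.abs(predicted - observed).mean()
mae_pct = 100.0 * mae / observed.mean()

# Pixel-based regression of predicted vs. observed: slope and intercept
# compared against the 1:1 line, plus Pearson correlation.
slope, intercept = np.polyfit(observed, predicted, 1)
r = np.corrcoef(observed, predicted)[0, 1]
```

A slope near 1 and intercept near 0 indicate agreement with the 1:1 line; bias would show up as a systematic offset.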
A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.
Gupta, Omesh P; Brown, Gary C; Brown, Melissa M
2008-05-01
To perform a reference-case cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in the better-seeing eye and ERM surgery in the worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and the ERM recurrence rate increased the figure to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146, with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY, and the ERM recurrence rate increased the figure to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
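The core arithmetic of a cost-utility analysis divides incremental cost by discounted QALYs gained. A sketch: the 3% discount rate matches the study, but the dollar cost below is a hypothetical input, not the paper's cost estimate:

```python
def discounted_qalys(annual_utility_gain, years, rate=0.03):
    """Present value of a constant annual utility gain over `years` years."""
    return sum(annual_utility_gain / (1 + rate) ** t for t in range(1, years + 1))

def cost_per_qaly(cost, qalys):
    """Dollars spent per discounted QALY gained."""
    return cost / qalys

# Illustrative only: 0.755 discounted QALYs (the better-seeing-eye estimate
# above) at a hypothetical net cost of $3,533.
print(round(cost_per_qaly(3533, 0.755)))  # prints 4679
```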
Ng, Tze Pin; Feng, Lei; Lim, Wee Shiong; Chong, Mei Sian; Lee, Tih Shih; Yap, Keng Bee; Tsoi, Tung; Liew, Tau Ming; Gao, Qi; Collinson, Simon; Kandiah, Nagaendran; Yap, Philip
2015-01-01
The Montreal Cognitive Assessment (MoCA) was developed as a screening instrument for mild cognitive impairment (MCI). We evaluated the MoCA's test performance by educational groups among older Singaporean Chinese adults. The MoCA and Mini-Mental State Examination (MMSE) were evaluated in two independent studies (clinic-based sample and community-based sample) of MCI and normal cognition (NC) controls, using receiver operating characteristic curve analyses: area under the curve (AUC), sensitivity (Sn), and specificity (Sp). The MoCA modestly discriminated MCI from NC in both study samples (AUC = 0.63 and 0.65): Sn = 0.64 and Sp = 0.36 at a cut-off of 28/29 in the clinic-based sample, and Sn = 0.65 and Sp = 0.55 at a cut-off of 22/23 in the community-based sample. The MoCA's test performance was least satisfactory in the highest (>6 years) education group: AUC = 0.50 (p = 0.98), Sn = 0.54, and Sp = 0.51 at a cut-off of 27/28. Overall, the MoCA's test performance was not better than that of the MMSE. In multivariate analyses controlling for age and gender, MCI diagnosis was associated with a <1-point decrement in MoCA score (η² = 0.010), but lower (1-6 years) and no education were associated with a 3- to 5-point decrement (η² = 0.115 and η² = 0.162, respectively). The MoCA's ability to discriminate MCI from NC was modest in this Chinese population, because it was far more sensitive to the effect of education than to MCI diagnosis. © 2015 S. Karger AG, Basel.
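Sensitivity, specificity, and AUC at a score cutoff can be sketched as follows, assuming lower scores indicate impairment; the scores and labels are a toy sample, not the study's data:

```python
import numpy as np

def sens_spec(scores, labels, cutoff):
    """Sensitivity/specificity for a screen where LOWER scores indicate
    impairment (labels: 1 = MCI, 0 = normal cognition).
    A subject screens positive when score <= cutoff."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    positive = scores <= cutoff
    sens = positive[labels == 1].mean()
    spec = (~positive)[labels == 0].mean()
    return sens, spec

def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen impaired subject scores lower than a random control (ties = 0.5)."""
    s = np.asarray(scores, float)
    pos = s[np.asarray(labels) == 1]
    neg = s[np.asarray(labels) == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff < 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

# Tiny illustrative sample of MoCA-like scores (hypothetical):
scores = [20, 22, 25, 27, 23, 28, 29, 26, 30, 24]
labels = [1,  1,  1,  0,  1,  0,  0,  0,  0,  1]
```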
Time-based analysis of total cost of patient episodes: a case study of hip replacement.
Peltokorpi, Antti; Kujala, Jaakko
2006-01-01
Healthcare in the public and private sectors is facing increasing pressure to become more cost-effective. Time-based competition and work-in-progress have been used successfully to measure and improve the efficiency of industrial manufacturing. Seeks to address this issue. Presents a framework for time-based management of the total cost of a patient episode and applies it to the six sigma DMAIC process development approach. The framework is used to analyse hip replacement patient episodes in Päijät-Häme Hospital District in Finland, which has a catchment area of 210,000 inhabitants and performs an average of 230 hip replacements per year. Finds that the work-in-progress concept is applicable to healthcare, notably in that the DMAIC process development approach can be used to analyse the total cost of patient episodes. Concludes that a framework combining the patient-in-process and DMAIC process development approaches can be used not only to analyse the total cost of a patient episode but also to improve patient process efficiency.
Li, Yueming; Law, Matthew; McDonald, Ann; Correll, Patty; Kaldor, John M; Grulich, Andrew E
2002-01-15
There is methodological debate as to whether cohorts defined by acquired immunodeficiency syndrome (AIDS) diagnosis can be used to estimate risks of cancer in persons with human immunodeficiency virus (HIV) before AIDS. The authors compared risks of non-AIDS-defining cancers before AIDS in persons with HIV using a cohort based on AIDS diagnosis and a second cohort based on HIV diagnosis. National population-based registries of AIDS and HIV diagnoses to August 1999 were matched separately with the National Cancer Registry in Australia. Four analyses were performed. In analysis 1, follow-up was from 5 years before AIDS registration in 8,118 persons with AIDS. Analysis 2 was similar but adjusted expected numbers of cancers for decreased survival. Analysis 3 was based on 7,061 persons registered with HIV, with follow-up from the reported date of diagnosis. Analysis 4 was based on 2,112 AIDS cases previously reported with HIV, with follow-up from 5 years before AIDS diagnosis. In all analyses, follow-up ended at cancer diagnosis, death, 6 months before AIDS, or the end of available cancer data, whichever occurred first. For 10 types of cancer there were at least three cases in any one of the analyses. For these cancers there was no systematic pattern such that one analysis produced consistently higher or lower estimates than the others. These analyses suggest that cancer risk in persons with HIV before AIDS diagnosis may be estimated reliably based on cancer experience 5 years before AIDS.
ERIC Educational Resources Information Center
Beauducel, Andre; Herzberg, Philipp Yorck
2006-01-01
This simulation study compared maximum likelihood (ML) estimation with weighted least squares means and variance adjusted (WLSMV) estimation. The study was based on confirmatory factor analyses with 1, 2, 4, and 8 factors, based on 250, 500, 750, and 1,000 cases, and on 5, 10, 20, and 40 variables with 2, 3, 4, 5, and 6 categories. There was no…
Search strategies in systematic reviews in periodontology and implant dentistry.
Faggion, Clovis M; Atieh, Momen A; Park, Stephanie
2013-09-01
To perform an overview of literature search strategies in systematic reviews (SRs) published in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of SRs) were searched, independently and in duplicate, for SRs with meta-analyses on interventions, with the last search performed on 11 November 2012. Manual searches of the reference lists of included SRs and 10 specialty dental journals were conducted. Methodological issues of the search strategies of included SRs were assessed with Cochrane collaboration guidelines and AMSTAR recommendations. The search strategies employed in Cochrane and paper-based SRs were compared. A total of 146 SRs with meta-analyses were included, including 19 Cochrane and 127 paper-based SRs. Some issues, such as "the use of keywords," were reported in most of the SRs (86%). Other issues, such as "search of grey literature" and "language restriction," were not fully reported (34% and 50% respectively). The quality of search strategy reporting in Cochrane SRs was better than that of paper-based SRs for seven of the eight criteria assessed. There is room for improving the quality of reporting of search strategies in SRs in periodontology and implant dentistry, particularly in SRs published in paper-based journals. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation
Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...
2016-12-02
Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits, introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific differences between the fragmentation functions obtained using the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.
NASA Technical Reports Server (NTRS)
Neil, W. J.; Jordan, J. F.; Zielenbach, J. W.; Wong, S. K.; Mitchell, R. T.; Webb, W. A.; Koskela, P. E.
1973-01-01
A final, comprehensive description of the navigation of Mariner 9, the first U.S. spacecraft to orbit another planet, is provided. The Mariner 9 navigation function included not only precision flight path control but also pointing of the spacecraft's scientific instruments mounted on a two-degree-of-freedom scan platform. To the extent appropriate, each section describes the preflight analyses on which the operational strategies and performance predictions were based. Inflight results are then discussed and compared with the preflight predictions. Postflight analyses, which were primarily concerned with developing a thorough understanding of unexpected in-flight results, are also presented.
[Point of Care 2.0: Coagulation Monitoring Using Rotem® Sigma and Teg® 6S].
Weber, Christian Friedrich; Zacharowski, Kai
2018-06-01
New-generation methods for point-of-care coagulation monitoring enable fully automated viscoelastic analyses for the assessment of particular parts of hemostasis. Contrary to the measuring techniques of former models, the viscoelastic ROTEM ® sigma and TEG ® 6s analyses are performed in single-use test cartridges without time- and personnel-intensive pre-analytical procedures. This review highlights methodical strengths and limitations of the devices and addresses concerns associated with their integration into routine clinical practice. Georg Thieme Verlag KG Stuttgart · New York.
Ellipsometric surface analysis of wear tracks produced by different lubricants
NASA Technical Reports Server (NTRS)
Lauer, J. L.; Marxer, N.; Jones, W. R., Jr.
1985-01-01
Ellipsometric analyses of wear tracks in bearings of M-50 steel were carried out after operation under severe conditions with different lubricant additives. The base lubricant was a synthetic ester. It was found that the surface and antiwear additives benzotriazole and tricresyl phosphate produced very patchy oxide layers. Dioctyldiphenylamine, a common antioxidant, on the other hand produced smoother films. The analyses were performed with a specially designed and constructed ellipsometer of very high (20 micron) spatial resolution. The results are consistent with data obtained by Auger electron spectroscopy.
Ability-versus skill-based assessment of emotional intelligence.
Bradberry, Travis R; Su, Lac D
2006-01-01
Emotional intelligence has received an intense amount of attention in leadership circles during the last decade and continuing debate exists concerning the best method for measuring this construct. This study analyzed leader emotional intelligence scores, measured via skill and ability methodologies, against leader job performance. Two hundred twelve employees from three organizations participated in this study. Scores on the Emotional Intelligence Appraisal, a skill-based assessment, were positively, though not significantly, correlated with scores on the MSCEIT, an ability-based assessment of emotional intelligence. Scores on the MSCEIT did not have a significant relationship with job performance in this study, whereas, scores on the Emotional Intelligence Appraisal had a strong link to leader job performance. The four subcomponents of the Emotional Intelligence Appraisal were examined against job performance. Relationship management was a stronger predictor of leader job performance than the other three subcomponents. Social awareness was the single emotional intelligence skill that did not have a significant link to leader job performance. Factor analyses yielded a two-component model of emotional intelligence encompassing personal and social competence, rather than confirmation of a four-part taxonomy.
NASA Technical Reports Server (NTRS)
1981-01-01
A study was performed to determine the types of manned missions likely to be performed in the late 1980s or early 1990s timeframe, to define MOTV configurations that satisfy these mission requirements, and to develop a program plan for its development. Twenty generic missions were originally defined for the MOTV but, to simplify the selection process, five of these missions were selected as typical and used as Design Reference Missions. Systems and subsystems requirements were re-examined and sensitivity analyses performed to determine optimum point designs. Turnaround modes were considered to determine the most effective combination of ground-based and space-based activities. A preferred concept for the crew capsule and for the mission mode was developed.
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses
Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-01
Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Conclusions Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness. PMID:26813512
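The pooled effect measure above, the standardized mean difference, is the between-group mean difference divided by the pooled standard deviation. A minimal sketch with hypothetical trial summaries (not data from the reviewed RCTs):

```python
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d with pooled SD) between a
    treatment group (decision aid) and a control group (usual care)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Illustrative: knowledge scores (0-100) in a hypothetical decision-aid trial.
d = smd(mean_t=72.0, sd_t=10.0, n_t=50, mean_c=66.0, sd_c=10.0, n_c=50)
print(round(d, 2))  # prints 0.6
```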
NASA Astrophysics Data System (ADS)
Zhang, Chuan; Zhang, Lili; Geng, Yi
In recent years, internal control has attracted increasing attention worldwide. However, whether internal control can improve business efficiency still lacks empirical support. Based on a sample of 146 Chinese real estate enterprises, this study analyses the degree to which CPAs recognize firms' implementation of internal control, and its performance consequences. The evidence suggests that CPAs are able to evaluate firms' internal control implementation accurately, and that the more fully internal control is implemented, the better the enterprise performs.
Software Engineering Laboratory (SEL) Ada performance study report
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1991-01-01
The goals and scope of the Ada Performance Study are detailed, and the background of Ada development in the Flight Dynamics Division (FDD) is presented. The organization and overall purpose of each test are discussed, as is the approach used on the performance tests. The purpose, methods, and results of each test, together with analyses of these results, are given. Based on the analysis of results from this study, guidelines for future Ada development efforts are provided.
Eickhoff, Simon B; Nichols, Thomas E; Laird, Angela R; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T; Bzdok, Danilo; Eickhoff, Claudia R
2016-08-15
Given the increasing number of neuroimaging publications, the automated knowledge extraction on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and to thus formulate recommendations for future ALE studies. As a first consequence, we showed that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments into an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
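The cluster-level FWE correction recommended in the abstract above rests on a simple idea: form clusters at an uncorrected cluster-forming threshold, then compare each observed cluster's size against the null distribution of the maximum cluster size obtained from simulated noise data. The following is a toy one-dimensional illustration under assumed Gaussian noise, not the ALE algorithm itself:

```python
import numpy as np

def max_cluster_size(stat_map, z_thresh):
    # Largest run of contiguous supra-threshold voxels in a 1D statistic map
    above = stat_map > z_thresh
    best = run = 0
    for a in above:
        run = run + 1 if a else 0
        best = max(best, run)
    return best

rng = np.random.default_rng(0)
n_vox, n_sim, z_thresh = 1000, 2000, 2.3  # cluster-forming threshold

# Null distribution of the maximum cluster size under pure noise
null_max = [max_cluster_size(rng.standard_normal(n_vox), z_thresh)
            for _ in range(n_sim)]
crit_size = np.percentile(null_max, 95)  # cluster-level FWE 0.05 critical size

# Any observed cluster larger than crit_size survives correction
print(crit_size)
```

In real neuroimaging data the null maps come from permutations or analytic random-field results rather than white noise, and clusters are defined in 3D, but the logic of thresholding against the maximum-statistic null is the same.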
The psychometric properties of the 'Hospital Survey on Patient Safety Culture' in Dutch hospitals.
Smits, Marleen; Christiaans-Dingelhoff, Ingrid; Wagner, Cordula; Wal, Gerrit van der; Groenewegen, Peter P
2008-11-07
In many different countries the Hospital Survey on Patient Safety Culture (HSOPS) is used to assess the safety culture in hospitals. Accordingly, the questionnaire has been translated into Dutch for application in the Netherlands. The aim of this study was to examine the underlying dimensions and psychometric properties of the questionnaire in Dutch hospital settings, and to compare these results with the original questionnaire used in USA hospital settings. The HSOPS was completed by 583 staff members of four general hospitals, three teaching hospitals, and one university hospital in the Netherlands. Confirmatory factor analyses were performed to examine the applicability of the factor structure of the American questionnaire to the Dutch data. Explorative factor analyses were performed to examine whether another composition of items and factors would fit the data better. Supplementary psychometric analyses were performed, including internal consistency and construct validity. The confirmatory factor analyses were based on the 12-factor model of the original questionnaire and resulted in a few low reliability scores. Eleven factors were derived with explorative factor analyses, with acceptable reliability scores and good construct validity. Two items were removed from the questionnaire. The composition of the factors was very similar to that of the original questionnaire. A few items moved to another factor, and two factors turned out to combine into a six-item dimension. All other dimensions consisted of two to five items. The Dutch translation of the HSOPS consists of 11 factors with acceptable reliability and good construct validity, and is similar to the original HSOPS factor structure.
GNSS Network Time Series Analysis
NASA Astrophysics Data System (ADS)
Balodis, J.; Janpaule, I.; Haritonova, D.; Normand, M.; Silabriedis, G.; Zarinjsh, A.; Zvirgzds, J.
2012-04-01
Time series of GNSS station results for both the EUPOS®-RIGA and LATPOS networks have been developed at the Institute of Geodesy and Geoinformation (University of Latvia) using Bernese v5.0 software. The base stations were selected from among the EPN and IGS stations in the surroundings of Latvia. The base station selection varied between daily solutions; most frequently, 5 - 8 base stations were selected from the set {BOR1, JOEN, JOZE, MDVJ, METS, POLV, PULK, RIGA, TORA, VAAS, VISO, VLNS}. The rejection of "bad" base stations was performed by the Bernese software depending on the quality of each station's data on a given day, which led to differing base station selections on different days. The resulting time series are analysed, and questions arose about the nature of some outlying situations. A seasonal effect in the behaviour of the network was identified when distance and elevation changes between stations were analysed, and dependencies on various influences were recognised.
Omics Metadata Management Software v. 1 (OMMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Our application, the Omics Metadata Management Software (OMMS), answers both needs, empowering experimentalists to generate intuitive, consistent metadata, and to perform bioinformatics analyses and information management tasks via a simple and intuitive web-based interface. Several use cases with short-read sequence datasets are provided to showcase the full functionality of the OMMS, from metadata curation tasks, to bioinformatics analyses and results management and downloading. The OMMS can be implemented as a stand-alone package for individual laboratories, or can be configured for web-based deployment supporting geographically dispersed research teams. Our software was developed with open-source bundles, is flexible, extensible, and easily installed and run by operators with general system administration and scripting language literacy.
Intercultural Education and Communication in Second Language Interactions
ERIC Educational Resources Information Center
Baraldi, Claudio
2012-01-01
This article analyses intercultural education outcomes produced in the setting of teaching Italian as a second language (ISL) in an Italian school. Intercultural education is produced in interactions which are based on specific cultural presuppositions, i.e. expectations regarding learning, role hierarchy and evaluation of student performances.…
Is Peer Interaction Necessary for Optimal Active Learning?
ERIC Educational Resources Information Center
Linton, Debra L.; Farmer, Jan Keith; Peterson, Ernie
2014-01-01
Meta-analyses of active-learning research consistently show that active-learning techniques result in greater student performance than traditional lecture-based courses. However, some individual studies show no effect of active-learning interventions. This may be due to inexperienced implementation of active learning. To minimize the effect of…
Housekeeping ESL. Workplace Literacy Curriculum for Hotels.
ERIC Educational Resources Information Center
Van Duzer, Carol; And Others
This curriculum for hotel employees is based on the analyses of worksite tasks and interactions. Hotel housekeepers were observed on the job, supervisors were consulted, and existing resources were reviewed to determine the language and basic skills needed to effectively and efficiently perform job duties. Twelve curriculum units were developed,…
Managing Instructor Training to Achieve Excellence.
ERIC Educational Resources Information Center
Norton, Robert E.
A group of concerned companies in the nuclear electric power industry formed the Electric Utility Instructor Training Consortium to train instructors in a cost-effective and time-efficient manner. The companies collaborated with the Ohio State University to (1) conduct job and task analyses; (2) develop performance-based instructor modules; (3)…
ERIC Educational Resources Information Center
Ardolino, Piermatteo; Noventa, Stefano; Formicuzzi, Maddalena; Cubico, Serena; Favretto, Giuseppe
2016-01-01
An observational study has been carried out to analyse differences in performance between students of different undergraduate curricula in the same written business administration examination, focusing particularly on possible effects of "integrated" or "multi-modular" examinations, a recently widespread format in Italian…
USDA-ARS?s Scientific Manuscript database
The peels of different pomegranate cultivars (Molla Nepes, Parfianka, Purple Heart, Wonderful and Vkunsyi) were compared in terms of phenolic composition and total phenolics. Analyses were performed on two silica hydride-based stationary phases: phenyl and undecenoic acid columns. Quantitation was ...
Shifting Discourses in Teacher Education: Performing the Advocate Bilingual Teacher
ERIC Educational Resources Information Center
Caldas, Blanca
2017-01-01
This article analyses the co-construction of the Bilingual teacher as advocate among preservice Bilingual teachers, through the use of narratives drawn from actual stories of Bilingual teachers, by means of drama-based pedagogy inspired by Theater of the Oppressed techniques. This study uses critical discourse analysis and Bakhtinian…
Hardy, Bruce L.; Kay, Marvin; Marks, Anthony E.; Monigal, Katherine
2001-01-01
Stone tools are often the most abundant type of cultural remains at Paleolithic sites, yet their function is often poorly understood. Investigations of stone tool function, including microscopic use-wear and residue analyses, were performed on a sample of artifacts from the Paleolithic sites of Starosele (40,000–80,000 years BP) and Buran Kaya III (32,000–37,000 years BP). The Middle Paleolithic levels at Starosele exhibit a typical variant of the local Micoquian Industry. The artifacts from Buran Kaya III most closely resemble an Early Streletskayan Industry associated with the early Upper Paleolithic. The results of the functional analyses suggest that hominids at both sites were exploiting woody and starchy plant material as well as birds and mammals. Both sites show evidence of hafting of a wide variety of tools and the possible use of projectile or thrusting spears. These analyses were performed by using two different techniques conducted by independent researchers. Combined residue and use-wear analyses suggest that both the Upper Paleolithic and Middle Paleolithic hominids at these sites were broad-based foragers capable of exploiting a wide range of resources. PMID:11535837
Techno-Economic Evaluation of Biodiesel Production from Waste Cooking Oil—A Case Study of Hong Kong
Karmee, Sanjib Kumar; Patria, Raffel Dharma; Lin, Carol Sze Ki
2015-01-01
Fossil fuel shortage is a major challenge worldwide. Therefore, research is currently underway to investigate potential renewable energy sources. Biodiesel is one of the major renewable energy sources that can be obtained from oils and fats by transesterification. However, biodiesel obtained from vegetable oils as feedstock is expensive. Thus, an alternative and inexpensive feedstock such as waste cooking oil (WCO) can be used as feedstock for biodiesel production. In this project, techno-economic analyses were performed on the biodiesel production in Hong Kong using WCO as a feedstock. Three different catalysts such as acid, base, and lipase were evaluated for the biodiesel production from WCO. These economic analyses were then compared to determine the most cost-effective method for the biodiesel production. The internal rate of return (IRR) sensitivity analyses on the WCO price and biodiesel price variation are performed. Acid was found to be the most cost-effective catalyst for the biodiesel production; whereas, lipase was the most expensive catalyst for biodiesel production. In the IRR sensitivity analyses, the acid catalyst can also acquire acceptable IRR despite the variation of the WCO and biodiesel prices. PMID:25809602
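The internal rate of return (IRR) used in the sensitivity analyses above is the discount rate at which the project's net present value (NPV) is zero. A minimal sketch with hypothetical cash flows (not figures from the study), solving for the root by bisection:

```python
def npv(rate, cash_flows):
    # Net present value of yearly cash flows, starting at year 0
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    # Bisection on NPV; assumes exactly one sign change inside the bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical plant: initial investment, then equal annual net revenues
flows = [-1000.0] + [300.0] * 5
print(round(irr(flows), 4))
```

A sensitivity analysis like the one described in the abstract then amounts to recomputing `irr` over a grid of feedstock and product prices, each of which shifts the annual cash flows.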
Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J
2012-06-01
A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs): "Act in case of Depression" (AiD). Before the effect analyses, AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed) were evaluated; such data can be used to understand internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. Setting: nursing home. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may have different feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial and enabled further statistical fine-tuning.
The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
Luther, Lauren; Firmin, Ruth L; Lysaker, Paul H; Minor, Kyle S; Salyers, Michelle P
2018-04-07
An array of self-reported, clinician-rated, and performance-based measures has been used to assess motivation in schizophrenia; however, the convergent validity evidence for these motivation assessment methods is mixed. The current study is a series of meta-analyses that summarize the relationships between methods of motivation measurement in 45 studies of people with schizophrenia. The overall mean effect size between self-reported and clinician-rated motivation measures (r = 0.27, k = 33) was significant, positive, and approaching medium in magnitude, and the overall effect size between performance-based and clinician-rated motivation measures (r = 0.21, k = 11) was positive, significant, and small in magnitude. The overall mean effect size between self-reported and performance-based motivation measures was negligible and non-significant (r = -0.001, k = 2), but this meta-analysis was underpowered. Findings suggest modest convergent validity between clinician-rated and both self-reported and performance-based motivation measures, but additional work is needed to clarify the convergent validity between self-reported and performance-based measures. Further, there is likely more variability than similarity in the underlying construct that is being assessed across the three methods, particularly between the performance-based and other motivation measurement types. These motivation assessment methods should not be used interchangeably, and measures should be more precisely described as the specific motivational construct or domain they are capturing. Copyright © 2018 Elsevier Ltd. All rights reserved.
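The overall mean effect sizes reported in the meta-analyses above are conventionally obtained by converting each study's correlation to Fisher's z, averaging with inverse-variance weights proportional to n - 3, and back-transforming. A minimal fixed-effect sketch with hypothetical study correlations, not the study's data:

```python
import math

def meta_mean_r(studies):
    # studies: list of (r, n) pairs; inverse-variance weighting in Fisher z space
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)  # Fisher r-to-z transform
        w = n - 3          # weight = 1 / var(z), since var(z) = 1 / (n - 3)
        num += w * z
        den += w
    return math.tanh(num / den)  # back-transform the pooled z to r

# Hypothetical correlations between self-reported and clinician-rated motivation
studies = [(0.30, 80), (0.25, 120), (0.20, 60)]
print(round(meta_mean_r(studies), 3))
```

Random-effects models, as typically used when studies are heterogeneous, add a between-study variance component to each weight; the fixed-effect version above shows only the core transform-average-backtransform logic.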
Analysing the operative experience of basic surgical trainees in Ireland using a web-based logbook
2011-01-01
Background There is concern about the adequacy of operative exposure in surgical training programmes, in the context of changing work practices. We aimed to quantify the operative exposure of all trainees on the National Basic Surgical Training (BST) programme in Ireland and compare the results with arbitrary training targets. Methods Retrospective analysis of data obtained from a web-based logbook (http://www.elogbook.org) for all general surgery and orthopaedic training posts between July 2007 and June 2009. Results 104 trainees recorded 23,918 operations across two 6-month general surgery posts. The most common general surgery operation performed was simple skin excision with trainees performing an average of 19.7 (± 9.9) over the 2-year training programme. Trainees most frequently assisted with cholecystectomy with an average of 16.0 (± 11.0) per trainee. Comparison of trainee operative experience to arbitrary training targets found that 2-38% of trainees achieved the targets for 9 emergency index operations and 24-90% of trainees achieved the targets for 8 index elective operations. 72 trainees also completed a 6-month post in orthopaedics and recorded 7,551 operations. The most common orthopaedic operation that trainees performed was removal of metal, with an average of 2.90 (± 3.27) per trainee. The most common orthopaedic operation that trainees assisted with was total hip replacement, with an average of 10.46 (± 6.21) per trainee. Conclusions A centralised web-based logbook provides valuable data to analyse training programme performance. Analysis of logbooks raises concerns about operative experience at junior trainee level. The provision of adequate operative exposure for trainees should be a key performance indicator for training programmes. PMID:21943313
Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses
Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David
2016-01-01
Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to quickly and with high mass accuracy determine the presence of unknown chemical residues in samples. For years, the field had been limited by mass spectrometric methods that required knowing in advance which compounds were of interest. Second, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts can dilute the sample sufficiently to minimize the ionization changes from varied matrices. PMID:26784175
Garcia Castro, Leyla Jael; Berlanga, Rafael; Garcia, Alexander
2015-10-01
Although full-text articles are provided by the publishers in electronic formats, it remains a challenge to find related work beyond the title and abstract context. Identifying related articles based on their abstract is indeed a good starting point; this process is straightforward and does not consume as many resources as full-text based similarity would require. However, further analyses may require in-depth understanding of the full content. Two articles with highly related abstracts can be substantially different regarding the full content. How similarity differs when considering title-and-abstract versus full-text and which semantic similarity metric provides better results when dealing with full-text articles are the main issues addressed in this manuscript. We have benchmarked three similarity metrics - BM25, PMRA, and Cosine, in order to determine which one performs best when using concept-based annotations on full-text documents. We also evaluated variations in similarity values based on title-and-abstract against those relying on full-text. Our test dataset comprises the Genomics track article collection from the 2005 Text Retrieval Conference. Initially, we used an entity recognition software to semantically annotate titles and abstracts as well as full-text with concepts defined in the Unified Medical Language System (UMLS®). For each article, we created a document profile, i.e., a set of identified concepts, term frequency, and inverse document frequency; we then applied various similarity metrics to those document profiles. We considered correlation, precision, recall, and F1 in order to determine which similarity metric performs best with concept-based annotations. For those full-text articles available in PubMed Central Open Access (PMC-OA), we also performed dispersion analyses in order to understand how similarity varies when considering full-text articles. 
We have found that the PubMed Related Articles similarity metric is the most suitable for full-text articles annotated with UMLS concepts. For similarity values above 0.8, all metrics exhibited an F1 around 0.2 and a recall around 0.1; BM25 showed the highest precision close to 1; in all cases the concept-based metrics performed better than the word-stem-based one. Our experiments show that similarity values vary when considering only title-and-abstract versus full-text similarity. Therefore, analyses based on full-text become useful when a given research requires going beyond title and abstract, particularly regarding connectivity across articles. Visualization available at ljgarcia.github.io/semsim.benchmark/, data available at http://dx.doi.org/10.5281/zenodo.13323. Copyright © 2015 Elsevier Inc. All rights reserved.
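Of the three metrics benchmarked above, the cosine measure over TF-IDF-weighted concept profiles is the simplest to reproduce. A sketch assuming each document has already been reduced to a concept-to-frequency map; the concept IDs below are hypothetical stand-ins for UMLS annotations, not data from the study:

```python
import math
from collections import Counter

def tfidf_profile(concept_counts, doc_freq, n_docs):
    # Weight each concept by term frequency x inverse document frequency
    return {c: tf * math.log(n_docs / doc_freq[c])
            for c, tf in concept_counts.items()}

def cosine(p, q):
    # Cosine similarity between two sparse concept profiles
    dot = sum(w * q.get(c, 0.0) for c, w in p.items())
    norm = math.sqrt(sum(w * w for w in p.values())) * \
           math.sqrt(sum(w * w for w in q.values()))
    return dot / norm if norm else 0.0

# Hypothetical annotated documents (concept ID -> occurrence count)
doc_a = Counter({"C0017262": 4, "C0033684": 2})
doc_b = Counter({"C0017262": 3, "C1167622": 1})
df = {"C0017262": 2, "C0033684": 1, "C1167622": 1}  # document frequencies
pa = tfidf_profile(doc_a, df, n_docs=10)
pb = tfidf_profile(doc_b, df, n_docs=10)
print(0.0 <= cosine(pa, pb) <= 1.0)  # prints True
```

The study's finding that full-text profiles diverge from title-and-abstract profiles follows naturally from this representation: full text adds many concepts to the profile, shifting both the weights and the overlap.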
A profile of sports science research (1983-2003).
Williams, Stephen John; Kendall, Lawrence R
2007-08-01
A majority of sports science research is undertaken in universities and dedicated research centres, such as institutes of sport. Reviews of literature analysing and categorising research have been carried out, but categories identified have been limited to research design and data gathering techniques. Hence there is a need to include categories such as discipline, subjects and targeted sport. A study was conducted using document analysis method to gather data that described and categorised performance-based sports science research projects in Australian universities and institutes of sport. An instrument was designed that could be used by researchers to analyse and profile research in the area of sports science. The instrument contained six categories: targeted sport, primary study area, participant type, research setting, methodology and data gathering techniques. Research documents analysed consisted of 725 original unpublished research reports/theses. Results showed that over two-thirds of research projects were targeted to specific sports and, of this group, nearly half involved four sports: cycling, rowing, athletics and swimming. Overall, physiology was the most researched scientific discipline. The most frequently used research method was experimental design, and the most frequently used data gathering technique was physiological (performance) measures. Two-thirds of research was conducted in laboratory settings, and nearly half of the research was conducted with elite or sub-elite athletes as participants/subjects. The findings of this study provide an overall synopsis of performance-based sports science research conducted in Australia over the last 20 years, and should be of considerable importance in the ongoing development of sports science research policy in Australia.
Multi-GNSS signal-in-space range error assessment - Methodology and results
NASA Astrophysics Data System (ADS)
Montenbruck, Oliver; Steigenberger, Peter; Hauschild, André
2018-06-01
The positioning accuracy of global and regional navigation satellite systems (GNSS/RNSS) depends on a variety of influence factors. For constellation-specific performance analyses it has become common practice to separate a geometry-related quality factor (the dilution of precision, DOP) from the measurement and modeling errors of the individual ranging measurements (known as user equivalent range error, UERE). The latter is further divided into user equipment errors and contributions related to the space and control segment. The present study reviews the fundamental concepts and underlying assumptions of signal-in-space range error (SISRE) analyses and presents a harmonized framework for multi-GNSS performance monitoring based on the comparison of broadcast and precise ephemerides. The implications of inconsistent geometric reference points, non-common time systems, and signal-specific range biases are analyzed, and strategies for coping with these issues in the definition and computation of SIS range errors are developed. The presented concepts are, furthermore, applied to current navigation satellite systems, and representative results are presented along with a discussion of constellation-specific problems in their determination. Based on data for the January to December 2017 time frame, representative global average root-mean-square (RMS) SISRE values of 0.2 m, 0.6 m, 1 m, and 2 m are obtained for Galileo, GPS, BeiDou-2, and GLONASS, respectively. Roughly two times larger values apply for the corresponding 95th-percentile values. Overall, the study contributes to a better understanding and harmonization of multi-GNSS SISRE analyses and their use as key performance indicators for the various constellations.
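The global-average SIS range error described above combines radial, along-track, and cross-track orbit errors with the clock error, using constellation-specific weights that reflect how each orbit component projects onto the user line of sight. A sketch using commonly quoted weights for GPS-like MEO orbits (w_R ≈ 0.98, w_A,C ≈ 1/7); treat the weights and the sample errors as illustrative assumptions, not values from the study:

```python
import math

def sisre(radial, along, cross, clock, w_r=0.98, w_ac=1.0 / 7.0):
    """Global-average signal-in-space range error (all inputs in meters).

    radial/along/cross: broadcast-minus-precise orbit errors;
    clock: broadcast-minus-precise clock error expressed as a range.
    """
    return math.sqrt((w_r * radial - clock) ** 2
                     + w_ac ** 2 * (along ** 2 + cross ** 2))

def rms(values):
    # Root-mean-square over an epoch series of per-satellite SISRE values
    return math.sqrt(sum(v * v for v in values) / len(values))

# Hypothetical per-epoch errors for one satellite (meters):
# (radial, along-track, cross-track, clock)
epochs = [(0.10, 0.8, 0.5, 0.05), (0.15, 1.0, 0.4, -0.10)]
print(round(rms([sisre(*e) for e in epochs]), 3))
```

Note how radial and clock errors enter as a difference: a radial orbit error can be partly absorbed by the broadcast clock, which is why the two are coupled in the first term rather than summed independently.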
Characterization and effectiveness of pay-for-performance in ophthalmology: a systematic review.
Herbst, Tim; Emmert, Martin
2017-06-05
To identify, characterize and compare existing pay-for-performance approaches and their impact on the quality of care and efficiency in ophthalmology. A systematic evidence-based review was conducted. Literature written in English, French, and German and published between 2000 and 2015 was searched in the following databases: Medline (via PubMed), NCBI web site, Scopus, Web of Knowledge, Econlit and the Cochrane Library. Empirical as well as descriptive articles were included. Controlled clinical trials, meta-analyses, randomized controlled studies as well as observational studies were included as empirical articles. Systematic characterization of identified pay-for-performance approaches (P4P approaches) was conducted according to the "Model for Implementing and Monitoring Incentives for Quality" (MIMIQ). Methodological quality of empirical articles was assessed according to the Critical Appraisal Skills Programme (CASP) checklists. Overall, 13 relevant articles were included. Eleven articles were descriptive and two articles included empirical analyses. Based on these articles, four different pay-for-performance approaches implemented in the United States were identified. With regard to quality and incentive elements, systematic comparison showed numerous differences between P4P approaches. Empirical studies showed isolated cost or quality effects, while a simultaneous examination of these effects was missing. Research results show that experiences with pay-for-performance approaches in ophthalmology are limited. Identified approaches differ with regard to quality and incentive elements, restricting comparability. Two empirical studies are insufficient to draw strong conclusions about the effectiveness and efficiency of these approaches.
Biases and Power for Groups Comparison on Subjective Health Measurements
Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique
2012-01-01
Subjective health measurements are increasingly used in clinical research, particularly for patient groups comparisons. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores and models coming from Item Response Theory (IRT) relying on a response model relating the items responses to a latent parameter, often called latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, groups comparison was performed using t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and the group effect was included as a covariate or not. Individual latent traits values were estimated using either a deterministic method or by stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-steps method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group covariate Wald’s test, performed on the random effects Rasch model. This model displayed the highest observed power, which was similar to the power using the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative. PMID:23115620
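The score-based CTT comparison simulated in the study above can be sketched directly: generate dichotomous item responses from a Rasch model with a shifted latent-trait mean in one group, then compare observed sum scores with a t-test. The parameters below are illustrative, not those of the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def rasch_responses(thetas, difficulties):
    # Rasch model: P(correct) = logistic(theta - b) per person x item pair
    p = 1.0 / (1.0 + np.exp(-(thetas[:, None] - difficulties[None, :])))
    return (rng.random(p.shape) < p).astype(int)

difficulties = np.linspace(-1.5, 1.5, 10)  # 10 items of spread difficulty
group_a = rasch_responses(rng.normal(0.0, 1.0, 200), difficulties)
group_b = rasch_responses(rng.normal(0.5, 1.0, 200), difficulties)  # shifted trait

# CTT analysis: compare the two groups on observed sum scores
t, p_value = stats.ttest_ind(group_a.sum(axis=1), group_b.sum(axis=1))
print(p_value)
```

The IRT alternatives compared in the study instead fit the Rasch model itself (with the group as a covariate, for the unbiased Wald-test variant) rather than collapsing the item responses to sum scores first.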
Co-variation of tests commonly used in stroke rehabilitation.
Langhammer, Birgitta; Stanghelle, Johan Kvalvik
2006-12-01
The aim of the present study was to analyse the co-variation of different tests commonly used in stroke rehabilitation, specifically those used in a recent randomized controlled study of two different physiotherapy models in stroke rehabilitation. Correlations of the performed tests and recordings from previous work were studied. The test results from the three-month, one-year and four-year follow-ups were analysed in SPSS Version 11 using Pearson and Spearman correlations. There was an expected high correlation between the motor function tests, based on both partial and total scores. The correlations between Nottingham Health Profile Part 1 and the Motor Assessment Scale (MAS), the Sødring Motor Evaluation Scale (SMES), the Berg Balance Scale (BBS) and the Barthel Activities of Daily Living (ADL) index were low for all items except physical condition. The correlations between registered living conditions, assistive devices, recurrent stroke, motor function (MAS, SMES), ADL (Barthel ADL index) and balance (BBS) were high. The same variables showed weak or poor correlation with the Nottingham Health Profile (NHP). The co-variation of motor function tests and functional tests was high, but the co-variation of motor, functional and self-reported life-quality tests was poor. The patients rated themselves at a higher functional level in the self-reported tests than was observed objectively in the performance-based tests. A possible reason is that the patients may, without being aware of it, have modified their performance to compensate for physical decline, and consequently overestimated their physical condition. This result underlines the importance of both performance-based and self-reported tests as complementary tools in the rehabilitation process.
Strategies in probabilistic feedback learning in Parkinson patients OFF medication.
Bellebaum, C; Kobza, S; Ferrea, S; Schnitzler, A; Pollok, B; Südmeyer, M
2016-04-21
Studies on classification learning have suggested that altered dopamine function in Parkinson's disease (PD) specifically affects learning from feedback. In patients OFF medication, enhanced learning from negative feedback has been described. This learning bias was not seen in observational learning from feedback, indicating different neural mechanisms for this type of learning. The present study aimed to compare the acquisition of stimulus-response-outcome associations in PD patients OFF medication and healthy control subjects in active and observational learning. Sixteen PD patients OFF medication and 16 controls were each examined with three parallel learning tasks, two feedback-based (active and observational) and one non-feedback-based paired-associates task. No acquisition deficit was seen in the patients for any of the tasks. More detailed analyses of the learning strategies did, however, reveal that the patients showed more lose-shift responses during active feedback learning than controls, and that lose-shift and win-stay responses determined performance accuracy more strongly in patients than in controls. For observational feedback learning, the performance of both groups correlated similarly with performance in non-feedback-based paired-associates learning and with the accuracy of the observed performance. Patients and controls also showed comparable evidence of feedback processing in observational learning. In active feedback learning, PD patients thus use different learning strategies from healthy controls. Analyses of observational learning did not yield differences between patients and controls, adding to recent evidence of a differential role of the human striatum in active and observational learning from feedback. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre
2011-11-01
The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.
Music performance and the perception of key.
Thompson, W F; Cuddy, L L
1997-02-01
The effect of music performance on perceived key movement was examined. Listeners judged key movement in sequences presented without performance expression (mechanical) in Experiment 1 and with performance expression in Experiment 2. Modulation distance varied. Judgments corresponded to predictions based on the cycle of fifths and toroidal models of key relatedness, with the highest correspondence for performed versions with the toroidal model. In Experiment 3, listeners compared mechanical sequences with either performed sequences or modifications of performed sequences. Modifications preserved expressive differences between chords, but not between voices. Predictions from Experiments 1 and 2 held only for performed sequences, suggesting that differences between voices are informative of key movement. Experiment 4 confirmed that modifications did not disrupt musicality. Analyses of performances further suggested a link between performance expression and key.
Comparative techno-economic assessment and LCA of selected integrated sugarcane-based biorefineries.
Gnansounou, Edgard; Vaskan, Pavel; Pachón, Elia Ruiz
2015-11-01
This work addresses the economic and environmental performance of integrated biorefineries based on sugarcane juice and residues. Four multiproduct scenarios were considered: two based on sugar mills and two on ethanol distilleries. They are integrated biorefineries producing first-generation (1G) and second-generation (2G) ethanol, sugar, molasses (for animal feed) and electricity in the context of Brazil. The scenarios were analysed and compared using a techno-economic value-based approach and LCA methodology. The results show that the best economic configuration is provided by the scenario with the largest ethanol production, while the best environmental performance is achieved by the scenario with fully integrated sugar and 1G2G ethanol production. Copyright © 2015 Elsevier Ltd. All rights reserved.
Mackay, Michael M
2016-09-01
This article offers a correlation matrix of meta-analytic estimates between various employee job attitudes (i.e., employee engagement, job satisfaction, job involvement, and organizational commitment) and indicators of employee effectiveness (i.e., focal performance, contextual performance, turnover intention, and absenteeism). The meta-analytic correlations in the matrix are based on over 1100 individual studies representing over 340,000 employees. Data were collected worldwide via employee self-report surveys. Structural path analyses based on the matrix, and the interpretation of the data, can be found in "Investigating the incremental validity of employee engagement in the prediction of employee effectiveness: a meta-analytic path analysis" (Mackay et al., 2016) [1].
A genome-wide association study of corneal astigmatism: The CREAM Consortium.
Shah, Rupal L; Li, Qing; Zhao, Wanting; Tedja, Milly S; Tideman, J Willem L; Khawaja, Anthony P; Fan, Qiao; Yazar, Seyhan; Williams, Katie M; Verhoeven, Virginie J M; Xie, Jing; Wang, Ya Xing; Hess, Moritz; Nickels, Stefan; Lackner, Karl J; Pärssinen, Olavi; Wedenoja, Juho; Biino, Ginevra; Concas, Maria Pina; Uitterlinden, André; Rivadeneira, Fernando; Jaddoe, Vincent W V; Hysi, Pirro G; Sim, Xueling; Tan, Nicholas; Tham, Yih-Chung; Sensaki, Sonoko; Hofman, Albert; Vingerling, Johannes R; Jonas, Jost B; Mitchell, Paul; Hammond, Christopher J; Höhn, René; Baird, Paul N; Wong, Tien-Yin; Cheng, Ching-Yu; Teo, Yik Ying; Mackey, David A; Williams, Cathy; Saw, Seang-Mei; Klaver, Caroline C W; Guggenheim, Jeremy A; Bailey-Wilson, Joan E
2018-01-01
To identify genes and genetic markers associated with corneal astigmatism. A meta-analysis of genome-wide association studies (GWASs) of corneal astigmatism undertaken for 14 European ancestry (n=22,250) and 8 Asian ancestry (n=9,120) cohorts was performed by the Consortium for Refractive Error and Myopia. Cases were defined as having >0.75 diopters of corneal astigmatism. Subsequent gene-based and gene-set analyses of the meta-analyzed results of European ancestry cohorts were performed using VEGAS2 and MAGMA software. Additionally, estimates of single nucleotide polymorphism (SNP)-based heritability for corneal and refractive astigmatism and the spherical equivalent were calculated for Europeans using LD score regression. The meta-analysis of all cohorts identified a genome-wide significant locus near the platelet-derived growth factor receptor alpha (PDGFRA) gene: top SNP rs7673984, odds ratio=1.12 (95% CI: 1.08-1.16), p=5.55×10⁻⁹. No other genome-wide significant loci were identified in the combined analysis or European/Asian ancestry-specific analyses. Gene-based analysis identified three novel candidate genes for corneal astigmatism in Europeans: claudin-7 (CLDN7), acid phosphatase 2, lysosomal (ACP2), and TNF alpha-induced protein 8 like 3 (TNFAIP8L3). In addition to replicating a previously identified genome-wide significant locus for corneal astigmatism near the PDGFRA gene, gene-based analysis identified three novel candidate genes, CLDN7, ACP2, and TNFAIP8L3, that warrant further investigation to understand their role in the pathogenesis of corneal astigmatism. The much lower number of genetic variants and genes demonstrating an association with corneal astigmatism compared to published spherical equivalent GWAS analyses suggests a greater influence of rare genetic variants, non-additive genetic effects, or environmental factors in the development of astigmatism.
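The pooling step of a GWAS meta-analysis like the one described above is commonly performed by inverse-variance fixed-effect weighting of per-cohort effect estimates. A minimal sketch with made-up log odds ratios and standard errors (not the CREAM estimates):

```python
import math

# Hypothetical per-cohort results for one SNP: (log odds ratio, standard error).
cohorts = [(0.10, 0.04), (0.14, 0.06), (0.08, 0.05), (0.12, 0.03)]

# Inverse-variance fixed-effect pooling: weight each cohort by 1/SE^2.
weights = [1.0 / se**2 for _, se in cohorts]
beta = sum(w * b for (b, _), w in zip(cohorts, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
z = beta / se
# Two-sided p-value from the standard normal distribution.
p = math.erfc(abs(z) / math.sqrt(2))

print(f"pooled OR={math.exp(beta):.3f}, z={z:.2f}, p={p:.2e}")
```

The pooled estimate would then be compared against the genome-wide significance threshold (p < 5×10⁻⁸).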
Zhou, Xiaofan; Shen, Xing-Xing; Hittinger, Chris Todd
2018-01-01
The sizes of the data matrices assembled to resolve branches of the tree of life have increased dramatically, motivating the development of programs for fast, yet accurate, inference. For example, several different fast programs have been developed in the very popular maximum likelihood framework, including RAxML/ExaML, PhyML, IQ-TREE, and FastTree. Although these programs are widely used, a systematic evaluation and comparison of their performance using empirical genome-scale data matrices has so far been lacking. To address this question, we evaluated these four programs on 19 empirical phylogenomic data sets with hundreds to thousands of genes and up to 200 taxa with respect to likelihood maximization, tree topology, and computational speed. For single-gene tree inference, we found that the more exhaustive and slower strategies (ten searches per alignment) outperformed faster strategies (one tree search per alignment) using RAxML, PhyML, or IQ-TREE. Interestingly, single-gene trees inferred by the three programs yielded comparable coalescent-based species tree estimations. For concatenation-based species tree inference, IQ-TREE consistently achieved the best-observed likelihoods for all data sets, and RAxML/ExaML was a close second. In contrast, PhyML often failed to complete concatenation-based analyses, whereas FastTree was the fastest but generated lower likelihood values and more dissimilar tree topologies in both types of analyses. Finally, data matrix properties, such as the number of taxa and the strength of phylogenetic signal, sometimes substantially influenced the programs’ relative performance. Our results provide real-world gene and species tree phylogenetic inference benchmarks to inform the design and execution of large-scale phylogenomic data analyses. PMID:29177474
Sivasubramanian, Srinivasan; Chandrasekar, Gayathri; Svensson Akusjärvi, Sara; Thangam, Ramar; Sathuvan, Malairaj; Kumar, R B S; Hussein, Hawraa; Vincent, Savariar; Madhan, Balaraman; Gunasekaran, Palani; Kitambi, Satish S
2017-01-01
The potential of multifunctional wound heal biomaterial relies on the optimal content of therapeutic constituents as well as the desirable physical, chemical, and biological properties to accelerate the healing process. Formulating biomaterials such as amnion or collagen based scaffolds with natural products offer an affordable strategy to develop dressing material with high efficiency in healing wounds. Using image based phenotyping and quantification, we screened natural product derived bioactive compounds for modulators of types I and III collagen production from human foreskin derived fibroblast cells. The identified hit was then formulated with amnion to develop a biomaterial, and its biophysical properties, in vitro and in vivo effects were characterized. In addition, we performed functional profiling analyses by PCR array to understand the effect of individual components of these materials on various genes such as inflammatory mediators including chemokines and cytokines, growth factors, fibroblast stimulating markers for collagen secretion, matrix metalloproteinases, etc., associated with wound healing. FACS based cell cycle analyses were carried out to evaluate the potential of biomaterials for induction of proliferation of fibroblasts. Western blot analyses was done to examine the effect of biomaterial on collagen synthesis by cells and compared to cells grown in the presence of growth factors. This work demonstrated an uncomplicated way of identifying components that synergistically promote healing. Besides, we demonstrated that modulating local wound environment using biomaterials with bioactive compounds could enhance healing. 
This study finds that the developed biomaterials offer immense scope for healing wounds by means of their skin regenerative features such as anti-inflammatory, fibroblast stimulation for collagen secretion as well as inhibition of enzymes and markers impeding the healing, hydrodynamic properties complemented with other features including non-toxicity, biocompatibility, and safety.
A multi-GPU real-time dose simulation software framework for lung radiotherapy.
Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A
2012-09-01
Medical simulation frameworks facilitate both preoperative and postoperative analysis of the patient's pathophysiological condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU computational framework coupled with inter-process communication methods is introduced for simulating radiation dose delivery on a deformable 3D volumetric lung model and visualizing it in real time. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and a back-end patient database repository is also discussed. Real-time simulation of the delivered dose is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in hardware memory. Variations in the delivered dose and computational speed-up for variations in the data dimensions are investigated using D70, D90 and gEUD as metrics for a set of 14 patients. Computational speed-up increased with the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
Çelebi, Mehmet; Huang, Moh; Shakal, Antony; Hooper, John; Klemencic, Ron
2012-01-01
A 64-story building with reinforced concrete core shear walls, designed using performance-based methods and equipped with unique dynamic response modification features (tuned liquid sloshing dampers and buckling-restrained braces), has been instrumented with a monitoring array of 72 channels of accelerometers. Recorded ambient vibration data are analyzed to identify modes and associated frequencies and damping. The low-amplitude dynamic characteristics are considerably different from those computed from design analyses, but serve as a baseline against which future strong-shaking responses can be compared. Such studies help improve our understanding of the effectiveness of the features added to the building and will help improve designs in the future.
NASA Technical Reports Server (NTRS)
Gupta, U. K.; Ali, M.
1989-01-01
The LEADER expert system has been developed for automatic learning tasks encompassing real-time detection, identification, verification, and correction of anomalous propulsion system operations, using a set of sensors to monitor engine component performance and ascertain anomalies in engine dynamics and behavior. Two diagnostic approaches are embodied in LEADER's architecture: (1) learning and identifying engine behavior patterns to generate novel hypotheses about possible abnormalities, and (2) directing engine sensor data processing to perform reasoning based on engine design and functional knowledge, as well as the principles of the relevant mechanics and physics.
Efficient genotype compression and analysis of large genetic variation datasets
Layer, Ryan M.; Kindlon, Neil; Karczewski, Konrad J.; Quinlan, Aaron R.
2015-01-01
Genotype Query Tools (GQT) is a new indexing strategy that expedites analyses of genome variation datasets in VCF format based on sample genotypes, phenotypes and relationships. GQT’s compressed genotype index minimizes decompression for analysis, and its performance relative to existing methods improves with cohort size. We show substantial (up to 443-fold) performance gains over existing methods and demonstrate GQT’s utility for exploring massive datasets involving thousands to millions of genomes. PMID:26550772
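The core idea behind a sample-oriented genotype index of this kind can be sketched with plain Python integers used as bitsets; this illustrates the query style only and is not GQT's actual index or file format:

```python
# Sketch of a genotype bit-index: for each variant, keep one bitmap per
# genotype class so sample-oriented queries become bitwise operations,
# avoiding per-sample decompression of genotype records.
GT_CLASSES = ("hom_ref", "het", "hom_alt")

def index_variant(genotypes):
    """genotypes: list of 0/1/2 alt-allele counts, one per sample."""
    bitmaps = {cls: 0 for cls in GT_CLASSES}
    for sample_idx, gt in enumerate(genotypes):
        bitmaps[GT_CLASSES[gt]] |= 1 << sample_idx
    return bitmaps

# Two variants over six samples.
v1 = index_variant([0, 1, 1, 2, 0, 1])
v2 = index_variant([1, 1, 0, 2, 0, 0])

# Query: which samples are heterozygous at both variants?
# A single AND plus a popcount answers it.
both_het = v1["het"] & v2["het"]
print(bin(both_het), bin(both_het).count("1"))
```

Because the per-class bitmaps are highly compressible and the query cost scales with the bitmap width, this style of index tends to pay off more as cohorts grow, which matches the scaling behaviour reported for GQT.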
Performance of Stratified and Subgrouped Disproportionality Analyses in Spontaneous Databases.
Seabroke, Suzie; Candore, Gianmario; Juhlin, Kristina; Quarcoo, Naashika; Wisniewski, Antoni; Arani, Ramin; Painter, Jeffery; Tregunno, Philip; Norén, G Niklas; Slattery, Jim
2016-04-01
Disproportionality analyses are used in many organisations to identify adverse drug reactions (ADRs) from spontaneous report data. Reporting patterns vary over time, with patient demographics, and between different geographical regions, and therefore subgroup analyses or adjustment by stratification may be beneficial. The objective of this study was to evaluate the performance of subgroup and stratified disproportionality analyses for a number of key covariates within spontaneous report databases of differing sizes and characteristics. Using a reference set of established ADRs, signal detection performance (sensitivity and precision) was compared for stratified, subgroup and crude (unadjusted) analyses within five spontaneous report databases (two company, one national and two international databases). Analyses were repeated for a range of covariates: age, sex, country/region of origin, calendar time period, event seriousness, vaccine/non-vaccine, reporter qualification and report source. Subgroup analyses consistently performed better than stratified analyses in all databases. Subgroup analyses also showed benefits in both sensitivity and precision over crude analyses for the larger international databases, whilst for the smaller databases a gain in precision tended to result in some loss of sensitivity. Additionally, stratified analyses did not increase sensitivity or precision beyond that associated with analytical artefacts of the analysis. The most promising subgroup covariates were age and region/country of origin, although this varied between databases. Subgroup analyses perform better than stratified analyses and should be considered over the latter in routine first-pass signal detection. Subgroup analyses are also clearly beneficial over crude analyses for larger databases, but further validation is required for smaller databases.
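A common disproportionality statistic is the proportional reporting ratio (PRR), computed from a 2x2 table of reports. The made-up counts below illustrate why subgroup analyses can outperform crude ones: a signal confined to one stratum can be diluted when all reports are pooled.

```python
# Illustration (with made-up counts) of subgroup vs crude disproportionality
# analysis using the proportional reporting ratio (PRR).
def prr(drug_event, drug_other, rest_event, rest_other):
    """PRR = [a/(a+b)] / [c/(c+d)] for the standard 2x2 report table."""
    return (drug_event / (drug_event + drug_other)) / (
        rest_event / (rest_event + rest_other))

# Crude analysis pools all reports; subgroup analysis computes the PRR
# within each stratum (here, two age bands) and signals if any subgroup
# exceeds a threshold such as PRR > 2.
crude = prr(30, 970, 300, 19700)
young = prr(5, 495, 200, 9800)
elderly = prr(25, 475, 100, 9900)
print(f"crude={crude:.2f}, young={young:.2f}, elderly={elderly:.2f}")
```

Here the crude PRR sits exactly at a typical threshold of 2, while the subgroup analysis exposes a strong elderly-only signal (PRR = 5) that pooling obscures.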
Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M
2015-01-01
Objective: Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares the analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. Setting: All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Primary and secondary outcome measures: The primary outcome measure was the analytical performance and agreement of the POCT analysers with the laboratory reference standard regarding nitrite, leucocytes and erythrocytes, analysed by calculating sensitivity, specificity, positive and negative predictive values, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness and other characteristics of the analysers. Results: The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with the laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. For the nitrite test, the most important test for clinical practice, all but one analyser showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes, specificity was high, but sensitivity was considerably lower. Agreement varied between good and very good for the leucocyte test, and between fair and good for the erythrocyte test. First-time users indicated that the analysers were easy to use and expected higher productivity and accuracy when using them in daily practice.
Conclusions: The overall performance and user-friendliness of all six commercially available POCT urine analysers were sufficient to justify routine use in suspected urinary tract infections in general practice. PMID:25986635
NASA Technical Reports Server (NTRS)
Gudimenko, Y.; Ng, R.; Iskanderova, Z.; Kleiman, J.; Grigorevsky, A.; Kiseleva, L.; Finckenor, M.; Edwards, D.
2005-01-01
Research has continued to further improve the space durability of conductive and non-conductive polymer-based paints and of conductive thermal control paints for space applications. Efforts have been made to enhance the space durability and the stability of functional characteristics under ground-based conditions imitating the space environment, using a specially developed surface modification treatment. The results of surface modification of new conductive paints, including ground-based testing in aggressive oxidative environments such as atomic oxygen/UV and oxygen plasma, and performance evaluation are presented. Functional properties and performance characteristics, such as thermal optical properties (differential solar absorptance and thermal emittance, representing the thermal optical performance of thermal control paints) and surface resistivity of pristine, surface-modified, and tested materials, were verified. Extensive surface analysis studies have been performed using complementary techniques including SEM/EDS and XPS. Test results revealed that the successfully treated materials exhibit reduced mass loss and no change in surface morphology, indicating good protection from the severe oxidative environment. It was demonstrated that the developed surface modification treatment can be applied successfully to charge-dissipative and conductive paints.
40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?
Code of Federal Regulations, 2010 CFR
2010-07-01
... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...
40 CFR 63.11220 - When must I conduct subsequent performance tests or fuel analyses?
Code of Federal Regulations, 2014 CFR
2014-07-01
... performance tests or fuel analyses? 63.11220 Section 63.11220 Protection of Environment ENVIRONMENTAL... When must I conduct subsequent performance tests or fuel analyses? (a) If your boiler has a heat input... performance (stack) tests according to § 63.11212 on a triennial basis, except as specified in paragraphs (b...
40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?
Code of Federal Regulations, 2011 CFR
2011-07-01
... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...
40 CFR 63.11220 - When must I conduct subsequent performance tests or fuel analyses?
Code of Federal Regulations, 2013 CFR
2013-07-01
... performance tests or fuel analyses? 63.11220 Section 63.11220 Protection of Environment ENVIRONMENTAL... When must I conduct subsequent performance tests or fuel analyses? (a) If your boiler has a heat input... performance (stack) tests according to § 63.11212 on a triennial basis, except as specified in paragraphs (b...
40 CFR 63.7515 - When must I conduct subsequent performance tests or fuel analyses?
Code of Federal Regulations, 2012 CFR
2012-07-01
... performance tests or fuel analyses? 63.7515 Section 63.7515 Protection of Environment ENVIRONMENTAL PROTECTION... Requirements § 63.7515 When must I conduct subsequent performance tests or fuel analyses? (a) You must conduct... previous performance test. (f) You must conduct a fuel analysis according to § 63.7521 for each type of...
Sumiyoshi, Chika; Harvey, Philip D; Takaki, Manabu; Okahisa, Yuko; Sato, Taku; Sora, Ichiro; Nuechterlein, Keith H; Subotnik, Kenneth L; Sumiyoshi, Tomiki
2015-09-01
Functional outcomes in individuals with schizophrenia reflect recovery of cognitive, everyday, and social functioning. Specifically, improvement in work status is considered most important for their independent living and self-efficacy. The main purposes of the present study were 1) to identify which outcome factors predict occupational functioning, quantified as work hours, and 2) to provide cut-offs on the scales for those factors for attaining better work status. Forty-five Japanese patients with schizophrenia and 111 healthy controls entered the study. Cognition, capacity for everyday activities, and social functioning were assessed by the Japanese versions of the MATRICS Cognitive Consensus Battery (MCCB), the UCSD Performance-based Skills Assessment-Brief (UPSA-B), and the Social Functioning Scale Individuals' version modified for the MATRICS-PASS (Modified SFS for PASS), respectively. Potential factors for work outcome were estimated by multiple linear regression analyses (predicting work hours directly) and a multiple logistic regression analysis (predicting dichotomized work status based on work hours). ROC curve analyses were performed to determine cut-off points differentiating between better and poorer work status. The results showed that a cognitive component, comprising visual/verbal learning and emotional management, and a social functioning component, comprising independent living and vocational functioning, were potential factors for predicting work hours/status. Cut-off points obtained in the ROC analyses indicated that 60-70% achievement on the measures of those factors was required to maintain better work status. Our findings suggest that improvement in specific aspects of cognitive and social functioning is important for work outcome in patients with schizophrenia.
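A cut-off of the kind derived above is typically chosen from the ROC curve, for example by maximizing Youden's J (sensitivity + specificity − 1). A minimal sketch with hypothetical scale scores for the two work-status groups (not the study's data):

```python
import numpy as np

# Hypothetical scale scores (0-100) for two outcome groups.
rng = np.random.default_rng(0)
poor = rng.normal(45, 12, 200)    # poorer work status
better = rng.normal(70, 12, 200)  # better work status

scores = np.concatenate([poor, better])
labels = np.concatenate([np.zeros(200), np.ones(200)])  # 1 = better status

# Scan every observed score as a candidate cut-off and keep the one
# maximizing Youden's J = sensitivity + specificity - 1.
best_j, best_cut = -1.0, None
for t in np.unique(scores):
    pred = scores >= t
    sens = pred[labels == 1].mean()      # true positive rate
    spec = (~pred)[labels == 0].mean()   # true negative rate
    j = sens + spec - 1
    if j > best_j:
        best_j, best_cut = j, t

print(f"cut-off={best_cut:.1f}, Youden J={best_j:.2f}")
```

With well-separated groups the selected cut-off lands between the two group means, analogous to the 60-70% achievement thresholds reported.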
Behzadian, Kourosh; Kapelan, Zoran
2015-09-15
Despite providing water-related services as the primary purpose of an urban water system (UWS), all relevant activities require capital investments and operational expenditures, consume resources (e.g. materials and chemicals), and may increase negative environmental impacts (e.g. contaminant discharge, emissions to water and air). Performance assessment of such a metabolic system may require a holistic approach which encompasses various system elements and criteria. This paper analyses the impact of integration of UWS components on metabolism-based performance assessment for future planning using a number of intervention strategies. It also explores the importance of sustainability-based criteria in the assessment of long-term planning. The two assessment approaches analysed here are: (1) planning for only the water supply system (WSS) as a part of the UWS and (2) planning for an integrated UWS including potable water, stormwater, wastewater and water recycling. The WaterMet(2) model is used to simulate metabolic type processes in the UWS and calculate quantitative performance indicators. The analysis is demonstrated on the problem of strategic-level planning of a real-world UWS to which optional intervention strategies are applied. The resulting performance is assessed using multiple criteria of both conventional and sustainability type; optional intervention strategies are then ranked using the Compromise Programming method. The results obtained show that the highly ranked intervention strategies in the integrated UWS are those supporting both water supply and stormwater/wastewater subsystems (e.g. rainwater harvesting and greywater recycling schemes), whilst these strategies are ranked low in the WSS approach, where those targeting improvement of water supply components only (e.g. rehabilitation of clean water pipes and addition of new water resources) are preferred instead.
Results also demonstrate that both conventional and sustainability type performance indicators are necessary for strategic planning in the UWS. Copyright © 2015 Elsevier B.V. All rights reserved.
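The Compromise Programming ranking mentioned above boils down to measuring each strategy's weighted L_p distance from the ideal point across normalised criteria and sorting by that distance. The sketch below assumes three benefit-type criteria and invented indicator values; the strategy names are only illustrative echoes of the text, not the study's results.

```python
# Sketch: ranking intervention strategies with Compromise Programming,
# i.e. the weighted L_p distance of each strategy's normalised criteria
# from the ideal point. Strategy names and indicator values are invented.

def compromise_distance(values, ideal, anti_ideal, weights, p=2):
    """Weighted L_p distance from the ideal solution (all criteria benefit-type)."""
    total = 0.0
    for v, best, worst, w in zip(values, ideal, anti_ideal, weights):
        total += (w * (best - v) / (best - worst)) ** p
    return total ** (1 / p)

# Criteria (benefit-type, higher is better): reliability, GHG score, cost score.
strategies = {
    "rainwater harvesting": [0.90, 0.80, 0.60],
    "greywater recycling":  [0.85, 0.90, 0.50],
    "pipe rehabilitation":  [0.95, 0.40, 0.70],
}
ideal      = [max(v[i] for v in strategies.values()) for i in range(3)]
anti_ideal = [min(v[i] for v in strategies.values()) for i in range(3)]
weights    = [1.0, 1.0, 1.0]

ranking = sorted(strategies,
                 key=lambda s: compromise_distance(strategies[s], ideal,
                                                   anti_ideal, weights))
```

With p = 2 this is the Euclidean distance to the ideal; p = 1 and p = ∞ are the other common Compromise Programming choices.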
Welmer, Anna-Karin; Kåreholt, Ingemar; Angleman, Sara; Rydwik, Elisabeth; Fratiglioni, Laura
2012-10-01
Physical performance is known to decline with age in general; however, much remains to be understood about age-related differences among older adults across a variety of physical components (such as speed, strength and balance), particularly in terms of the role played by multimorbidity of chronic diseases. We aimed to detect the age-related differences across four components of physical performance and to explore to what extent chronic diseases and multimorbidity may explain such differences. We analyzed cross-sectional data from a population-based sample of 3323 people, aged 60 years and older, from the SNAC-K study, Stockholm, Sweden. Physical performance was assessed by trained nurses using several tests (grip strength, walking speed, balance and chair stands). Clinical diagnoses were made by the examining physician based on clinical history and examination. Censored normal regression analyses showed that the 72-90+ year-old persons had 17-40% worse grip strength, 44-86% worse balance, 30-86% worse chair stand score, and 21-59% worse walking speed, compared with the 60-66 year-old persons. Chronic diseases were strongly associated with physical impairment, and this association was particularly strong among the younger men. However, chronic diseases explained only some of the age-related differences in physical performance. When controlling for chronic diseases in the analyses, the age-related differences in physical performance changed by 1-11%. In spite of the strong association between multimorbidity and physical impairment, chronic morbidities explained only a small part of the age-related differences in physical performance.
Challenges and solutions to pre- and post-randomization subgroup analyses.
Desai, Manisha; Pieper, Karen S; Mahaffey, Ken
2014-01-01
Subgroup analyses are commonly performed in the clinical trial setting with the purpose of illustrating that the treatment effect was consistent across different patient characteristics or identifying characteristics that should be targeted for treatment. There are statistical issues involved in performing subgroup analyses, however. These have been given considerable attention in the literature for analyses where subgroups are defined by a pre-randomization feature. Although subgroup analyses are often performed with subgroups defined by a post-randomization feature--including analyses that estimate the treatment effect among compliers--discussion of these analyses has been neglected in the clinical literature. Such analyses pose a high risk of presenting biased descriptions of treatment effects. We summarize the challenges of doing all types of subgroup analyses described in the literature. In particular, we emphasize issues with post-randomization subgroup analyses. Finally, we provide guidelines on how to proceed across the spectrum of subgroup analyses.
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Foote, John; Litchford, Ron
2006-01-01
The objective of this effort is to perform design analyses for a non-nuclear hot-hydrogen materials tester, as a first step towards developing efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for hypothetical solid-core, nuclear thermal engine thrust chamber design and analysis. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, convective, and thermal radiative heat transfers. The goals of the design analyses are to maintain maximum hot-hydrogen jet impingement energy and to minimize chamber wall heating. The results of analyses on three test fixture configurations and the rationale for final selection are presented. The interrogation of physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.
Mahmood, Zanjbeel; Burton, Cynthia Z; Vella, Lea; Twamley, Elizabeth W
2018-04-13
Neuropsychological abilities may underlie successful performance of everyday functioning and social skills. We aimed to determine the strongest neuropsychological predictors of performance-based functional capacity and social skills performance across the spectrum of severe mental illness (SMI). Unemployed outpatients with SMI (schizophrenia, bipolar disorder, or major depression; n = 151) were administered neuropsychological (expanded MATRICS Consensus Cognitive Battery), functional capacity (UCSD Performance-Based Skills Assessment-Brief; UPSA-B), and social skills (Social Skills Performance Assessment; SSPA) assessments. Bivariate correlations between neuropsychological performance and UPSA-B and SSPA total scores showed that most neuropsychological tests were significantly associated with each performance-based measure. Forward entry stepwise regression analyses were conducted entering education, diagnosis, symptom severity, and neuropsychological performance as predictors of functional capacity and social skills. Diagnosis, working memory, sustained attention, and category and letter fluency emerged as significant predictors of functional capacity, in a model that explained 43% of the variance. Negative symptoms, sustained attention, and letter fluency were significant predictors of social skill performance, in a model explaining 35% of the variance. Functional capacity is positively associated with neuropsychological functioning, but diagnosis remains strongly influential, with mood disorder participants outperforming those with psychosis. Social skill performance appears to be positively associated with sustained attention and verbal fluency regardless of diagnosis; however, negative symptom severity strongly predicts social skills performance. Improving neuropsychological functioning may improve psychosocial functioning in people with SMI. Published by Elsevier Ltd.
7 CFR 91.4 - Kinds of services.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., chemical, and certain other analyses, requested by the applicant and performed on tobacco, seed, dairy, egg, fruit and vegetable, meat and poultry products, and related processed products. Analyses are performed... and cooperative agreements. Laboratory analyses are also performed on egg products as part of the...
Individual differences in metacontrast masking regarding sensitivity and response bias.
Albrecht, Thorsten; Mattler, Uwe
2012-09-01
In metacontrast masking target visibility is modulated by the time until a masking stimulus appears. The effect of this temporal delay differs across participants in such a way that individual human observers' performance shows distinguishable types of masking functions which remain largely unchanged for months. Here we examined whether individual differences in masking functions depend on different response criteria in addition to differences in discrimination sensitivity. To this end we reanalyzed previously published data and conducted a new experiment for further data analyses. Our analyses demonstrate that a distinction of masking functions based on the type of masking stimulus is superior to a distinction based on the target-mask congruency. Individually different masking functions are based on individual differences in discrimination sensitivities and in response criteria. Results suggest that individual differences in metacontrast masking result from individually different criterion contents. Copyright © 2012 Elsevier Inc. All rights reserved.
Homosexuality as a Discrete Class.
Norris, Alyssa L; Marcus, David K; Green, Bradley A
2015-12-01
Previous research on the latent structure of sexual orientation has returned conflicting results, with some studies finding a dimensional structure (i.e., ranging quantitatively along a spectrum) and others a taxonic structure (i.e., categories of individuals with distinct orientations). The current study used a sample (N = 33,525) from the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC). A series of taxometric analyses were conducted using three indicators of sexual orientation: identity, behavior, and attraction. These analyses, performed separately for women and men, revealed low-base-rate same-sex-oriented taxa for men (base rate = 3.0%) and women (base rate = 2.7%). Generally, taxon membership conferred an increased risk for psychiatric and substance-use disorders. Although taxa were present for men and women, women demonstrated greater sexual fluidity, such that any level of same-sex sexuality conferred taxon membership for men but not for women. © The Author(s) 2015.
Liu, Bao; Fan, Xiaoming; Huo, Shengnan; Zhou, Lili; Wang, Jun; Zhang, Hui; Hu, Mei; Zhu, Jianhua
2011-12-01
A method was established to analyse overlapped chromatographic peaks based on chromatographic-spectral data detected by a diode-array ultraviolet detector. In the method, the three-dimensional data were first de-noised and normalized; second, the differences and clustering analysis of the spectra at different time points were calculated; the purity of the whole chromatographic peak was then analysed and the region in which the spectra at different time points were stable was identified. The feature spectra were extracted from the spectrum-stable region as the basic foundation. The non-negative least-squares method was chosen to separate the overlapped peaks and obtain the flow curve based on each feature spectrum. The three-dimensional divided chromatographic-spectral peak could then be obtained by matrix operations of the feature spectra with the flow curve. The results showed that this method could separate the overlapped peaks.
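The non-negative least-squares step can be illustrated as follows: given known feature spectra S, solve for non-negative concentration profiles C so that the measured data D ≈ C·S, one NNLS problem per time point. The sketch uses a small projected-gradient NNLS solver and synthetic Gaussian elution profiles, not the paper's data or algorithmic details.

```python
import numpy as np

# Sketch: separating two overlapped chromatographic peaks by non-negative
# least squares, given known feature spectra. The concentration profile C is
# solved per time point so that D ~= C @ S with C >= 0. Data are synthetic.

def nnls_pg(A, b, iters=2000):
    """Minimise ||A x - b||^2 subject to x >= 0 via projected gradient descent."""
    lr = 1.0 / np.linalg.norm(A.T @ A, 2)   # step size from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = np.maximum(0.0, x - lr * (A.T @ (A @ x - b)))
    return x

S = np.array([[1.0, 0.2, 0.0],      # feature spectrum of component 1
              [0.0, 0.3, 1.0]])     # feature spectrum of component 2
t = np.linspace(0, 1, 30)
C_true = np.column_stack([np.exp(-((t - 0.4) / 0.1) ** 2),   # elution profiles
                          np.exp(-((t - 0.6) / 0.1) ** 2)])
D = C_true @ S                      # overlapped chromatographic-spectral data

C_est = np.array([nnls_pg(S.T, d) for d in D])  # one NNLS per time point
err = np.abs(C_est - C_true).max()
```

Because the true profiles are non-negative and the feature spectra are linearly independent, the projected-gradient solver recovers them essentially exactly.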
Quantification and analysis of respiratory motion from 4D MRI
NASA Astrophysics Data System (ADS)
Aizzuddin Abd Rahni, Ashrani; Lewis, Emma; Wells, Kevin
2014-11-01
It is well known that respiratory motion affects image acquisition as well as external beam radiotherapy (EBRT) treatment planning and delivery. However, existing approaches to respiratory motion management are often based on a generic view of respiratory motion, such as the general movement of organs, tissue or fiducials. This paper thus aims to present a more in-depth analysis of respiratory motion based on 4D MRI, for further integration into motion correction in image acquisition or image-based EBRT. Internal and external motion were first analysed separately, on a per-organ basis for internal motion. Principal component analysis (PCA) was then performed on the internal and external motion vectors separately and the relationship between the two PCA spaces was analysed. In general, the motion extracted from 4D MRI was found to be consistent with what has been reported in the literature.
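The PCA step described above can be sketched with an SVD-based decomposition of centred motion data. The sinusoidal "breathing" trajectories below are synthetic stand-ins for the 4D MRI motion vectors, not the study's data.

```python
import numpy as np

# Sketch: PCA of respiratory motion vectors via SVD, as might be applied
# separately to internal and external motion before relating the two spaces.
# Three markers moving coherently with breathing, plus small noise.

def pca(X, n_components):
    """Return (components, scores, explained_variance_ratio) for row-sample data X."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2 / (len(X) - 1)
    return Vt[:n_components], Xc @ Vt[:n_components].T, var / var.sum()

t = np.linspace(0, 4 * np.pi, 200)
X = np.column_stack([np.sin(t), 0.8 * np.sin(t), 0.5 * np.sin(t)])
X += 0.01 * np.random.default_rng(1).standard_normal(X.shape)

components, scores, ratio = pca(X, 2)
```

Since the three synthetic markers move in phase, the first principal component captures nearly all of the variance, which is the typical finding for coherent respiratory motion.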
Project-based learning in Geotechnics: cooperative versus collaborative teamwork
NASA Astrophysics Data System (ADS)
Pinho-Lopes, Margarida; Macedo, Joaquim
2016-01-01
Since 2007/2008, project-based learning models have been used to deliver two fundamental courses on Geotechnics at the University of Aveiro, Portugal. These models have evolved and have encompassed either cooperative or collaborative teamwork. Using data collected in five editions of each course (Soil Mechanics I and Soil Mechanics II), the different characteristics of the models using cooperative or collaborative teamwork are pointed out and analysed, namely in terms of the students' perceptions. The data collected include informal feedback from students, monitoring of their marks and academic performance, and answers to two sets of questionnaires: one developed for these courses, and one institutional. The data indicate that students have a good opinion of the project-based learning model, with collaborative teamwork rated best. The overall efficacy of the models (the sum of their effectiveness, efficiency and attractiveness) was analysed, and the collaborative model was found to be more adequate.
Intrinsic motivation and extrinsic incentives jointly predict performance: a 40-year meta-analysis.
Cerasoli, Christopher P; Nicklin, Jessica M; Ford, Michael T
2014-07-01
More than 4 decades of research and 9 meta-analyses have focused on the undermining effect: namely, the debate over whether the provision of extrinsic incentives erodes intrinsic motivation. This review and meta-analysis builds on such previous reviews by focusing on the interrelationship among intrinsic motivation, extrinsic incentives, and performance, with reference to 2 moderators: performance type (quality vs. quantity) and incentive contingency (directly performance-salient vs. indirectly performance-salient), which have not been systematically reviewed to date. Based on random-effects meta-analytic methods, findings from school, work, and physical domains (k = 183, N = 212,468) indicate that intrinsic motivation is a medium to strong predictor of performance (ρ = .21-.45). The importance of intrinsic motivation to performance remained in place whether or not incentives were present. In addition, incentive salience influenced the predictive validity of intrinsic motivation for performance: In a "crowding out" fashion, intrinsic motivation was less important to performance when incentives were directly tied to performance and was more important when incentives were indirectly tied to performance. Considered simultaneously through meta-analytic regression, intrinsic motivation predicted more unique variance in quality of performance, whereas incentives were a better predictor of quantity of performance. With respect to performance, incentives and intrinsic motivation are not necessarily antagonistic and are best considered simultaneously. Future research should consider using nonperformance criteria (e.g., well-being, job satisfaction) as well as applying the percent-of-maximum-possible (POMP) method in meta-analyses. PsycINFO Database Record (c) 2014 APA, all rights reserved.
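The random-effects machinery underlying such pooled correlations can be sketched with the DerSimonian-Laird estimator: estimate the between-study variance τ² from the Q statistic, then pool with inverse-variance weights 1/(vᵢ + τ²). The effect sizes and variances below are invented, not the meta-analysis data.

```python
# Sketch: a DerSimonian-Laird random-effects pooled estimate, the kind of
# machinery behind random-effects meta-analytic correlations.
# Effect sizes and sampling variances below are invented.

def random_effects(effects, variances):
    """Return (pooled_effect, tau2) under the DerSimonian-Laird estimator."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Q statistic
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                 # between-study var
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

effects   = [0.21, 0.30, 0.45, 0.28, 0.35]    # hypothetical study correlations
variances = [0.010, 0.008, 0.020, 0.005, 0.012]
pooled, tau2 = random_effects(effects, variances)
```

When observed heterogeneity Q falls below its degrees of freedom, τ² is truncated at zero and the estimate reduces to the fixed-effect pool.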
Ciciolla, Lucia; Curlee, Alexandria S.; Karageorge, Jason; Luthar, Suniya S.
2016-01-01
High achievement expectations and academic pressure from parents have been implicated in rising levels of stress and reduced well-being among adolescents. In this study of affluent, middle-school youth, we examined how perceptions of parents' emphasis on achievement (relative to prosocial behavior) influenced youth's psychological adjustment and school performance, and examined perceived parental criticism as a possible moderator of this association. The data were collected from 506 (50% female) middle school students from a predominately white, upper-middle-class community. Students reported their perceptions of parents' values by rank ordering a list of achievement- and prosocial-oriented goals based on what they believed was most valued by their mothers and fathers for them (the child) to achieve. The data also included students' reports of perceived parental criticism, internalizing symptoms, externalizing symptoms, and self-esteem, as well as school-based data on grade point average and teacher-reported classroom behavior. Person-based analyses revealed six distinct latent classes based on perceptions of both mother and father emphases on achievement. Class comparisons showed a consistent pattern of healthier child functioning, including higher school performance, higher self-esteem, and lower psychological symptoms, in association with low to neutral parental achievement emphasis, whereas poorer child functioning was associated with high parental achievement emphasis. In variable-based analyses, interaction effects showed elevated maladjustment when high maternal achievement emphasis coexisted with high (but not low) perceived parental criticism. 
Results of the study suggest that, to foster early adolescents' well-being in affluent school settings, parents should focus on prioritizing intrinsic, prosocial values that promote affiliation and community at least as much as, or more than, they prioritize academic performance and external achievement, and should strive to limit the amount of criticism and pressure they place on their children. PMID:27830404
NASA Astrophysics Data System (ADS)
Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.
2013-10-01
This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel radar-based ensemble forecasting chains for flash-flood early warning are investigated in three catchments in the southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first radar-based ensemble forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second ensemble forecasting system evaluated is REAL-C2, where the numerical weather prediction model COSMO-2 is initialised with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. A clear preference was found for the ensemble approach. Discharge forecasts perform better when forced by NORA and REAL-C2 rather than by deterministic weather radar data. Moreover, it was observed that using an ensemble of initial conditions at the forecast initialisation, as in REAL-C2, significantly improved the forecast skill. These forecasts also perform better than forecasts forced by ensemble rainfall forecasts (NORA) initialised from a single initial condition of the hydrological model. Thus the best results were obtained with the REAL-C2 forecasting chain. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it
2014-10-06
This study presents some results of a comprehensive numerical analysis of three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is largely lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of such kinds of historical structures.
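The non-dimensionalisation mentioned above is simple enough to show directly: dividing the base shear at failure by the structure's weight gives a collapse multiplier α = V/W, read as an estimate of the collapse peak ground acceleration in units of g, which can then be compared with a code demand. The numbers below are illustrative, not the churches' results.

```python
# Sketch: non-dimensionalising the base shear at failure by the structure's
# weight to estimate the collapse PGA (in g) and compare it with a code
# demand. All values here are hypothetical.

def collapse_multiplier(base_shear_kN, weight_kN):
    """alpha = V / W, an estimate of the collapse PGA in units of g."""
    return base_shear_kN / weight_kN

alpha = collapse_multiplier(base_shear_kN=1200.0, weight_kN=9000.0)
code_demand_g = 0.25                  # hypothetical code-required PGA (in g)
is_vulnerable = alpha < code_demand_g
```

A multiplier well below the code demand, as in this toy case, is exactly the situation the study reports for the three churches.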
Using computer-based video analysis in the study of fidgety movements.
Adde, Lars; Helbostad, Jorunn L; Jensenius, Alexander Refsum; Taraldsen, Gunnar; Støen, Ragnhild
2009-09-01
Absence of fidgety movements (FM) in high-risk infants is a strong marker for later cerebral palsy (CP). FMs can be classified by the General Movement Assessment (GMA), based on Gestalt perception of the infant's movement pattern. More objective movement analysis may be provided by computer-based technology. The aim of this study was to explore the feasibility of a computer-based video analysis of infants' spontaneous movements in classifying non-fidgety versus fidgety movements. GMA was performed from video material of the fidgety period in 82 term and preterm infants at low and high risk of developing CP. The same videos were analysed using software developed for the purpose, called the General Movement Toolbox (GMT), with visualisation of the infant's movements for qualitative analyses. Variables derived from the calculation of displacement of pixels from one video frame to the next were used for quantitative analyses. Visual representations from GMT showed easily recognisable patterns of FMs. Of the eight quantitative variables derived, the variability in displacement of a spatial centre of active pixels in the image had the highest sensitivity (81.5%) and specificity (70.0%) in classifying FMs. By setting triage thresholds at 90% sensitivity and specificity for FM, the need for further referral was reduced by 70%. Video recordings can be used for qualitative and quantitative analyses of FMs provided by GMT. GMT is easy to implement in clinical practice, and may provide assistance in detecting infants without FMs.
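The quantitative variables can be sketched as frame-to-frame pixel differencing followed by tracking the centroid of "active" pixels, whose spatial variability was the best-performing variable above. The toy video below (a jittering bright blob) and the function names are illustrative assumptions, not GMT's actual implementation.

```python
import numpy as np

# Sketch: per-frame pixel differencing and the variability of the centroid
# of "active" pixels, the kind of quantity described for GMT. The toy video
# is a 5x5 bright blob jittering on a dark 32x32 background.

def active_centroids(frames, threshold=0.1):
    """Centroid of pixels whose change between consecutive frames exceeds threshold."""
    centroids = []
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr - prev) > threshold
        ys, xs = np.nonzero(diff)
        if len(xs):
            centroids.append((xs.mean(), ys.mean()))
    return np.array(centroids)

rng = np.random.default_rng(2)
frames = []
for _ in range(20):
    img = np.zeros((32, 32))
    cx, cy = 16 + rng.integers(-3, 4), 16 + rng.integers(-3, 4)
    img[cy - 2:cy + 3, cx - 2:cx + 3] = 1.0   # blob at a jittered position
    frames.append(img)

centroids = active_centroids(frames)
variability = centroids.std(axis=0).mean()    # spatial-centre variability
```

A threshold on a variability statistic like this is the sort of triage rule the study tunes for sensitivity and specificity.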
Cost of Equity Estimation in Fuel and Energy Sector Companies Based on CAPM
NASA Astrophysics Data System (ADS)
Kozieł, Diana; Pawłowski, Stanisław; Kustra, Arkadiusz
2018-03-01
The article presents cost of equity estimation for capital groups from the fuel and energy sector, listed on the Warsaw Stock Exchange, based on the Capital Asset Pricing Model (CAPM). The objective of the article was to perform a valuation of equity with the application of the CAPM, based on actual financial data and stock exchange data, and to carry out a sensitivity analysis of this cost depending on the financing structure of the entity. The objective of the article formulated in this manner determined its structure. It focuses on presentation of substantive analyses related to the nature of equity and methods of estimating its cost, with special attention given to the CAPM. In the practical section, cost estimation was performed according to the CAPM methodology, based on the example of leading fuel and energy companies, such as Tauron GE and PGE. Simultaneously, a sensitivity analysis of this cost was performed depending on the structure of financing the company's operations.
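The CAPM estimate itself is r_e = r_f + β(r_m − r_f); sensitivity to the financing structure can be explored by relevering beta, for example with the Hamada equation. All rates and the unlevered beta below are illustrative assumptions, not Tauron GE or PGE figures.

```python
# Sketch: CAPM cost of equity, r_e = r_f + beta * (r_m - r_f), with a
# Hamada-style relevering of beta to examine sensitivity to the capital
# structure. All inputs are hypothetical, not actual company data.

def cost_of_equity(risk_free, beta, market_premium):
    """CAPM: required return on equity."""
    return risk_free + beta * market_premium

def relever_beta(beta_unlevered, debt_to_equity, tax_rate):
    """Hamada equation: levered beta rises with the debt/equity ratio."""
    return beta_unlevered * (1 + (1 - tax_rate) * debt_to_equity)

risk_free, premium, tax = 0.03, 0.055, 0.19   # assumed rates and tax shield
beta_u = 0.7                                  # assumed unlevered beta

for d_e in (0.0, 0.5, 1.0):                   # sensitivity to capital structure
    beta_l = relever_beta(beta_u, d_e, tax)
    r_e = cost_of_equity(risk_free, beta_l, premium)

base_cost = cost_of_equity(risk_free, beta_u, premium)
```

As the debt/equity ratio rises, the levered beta and hence the CAPM cost of equity rise with it, which is the sensitivity the article examines.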
SSME fault monitoring and diagnosis expert system
NASA Technical Reports Server (NTRS)
Ali, Moonis; Norman, Arnold M.; Gupta, U. K.
1989-01-01
An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach, fault diagnosis is performed through learning and identifying engine behavior patterns. Utilizing this approach, LEADER generates a few hypotheses about the possible abnormalities. These hypotheses are then validated based on the SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on the SSME design, functional knowledge, and deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture, which integrates a design-based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through the analyses of SSME ground test data are presented and discussed.
eLearning techniques supporting problem based learning in clinical simulation.
Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn
2005-08-01
This paper details the results of the first phase of a project using eLearning to support students' learning within a simulated environment. The locus was a purpose-built clinical simulation laboratory (CSL), where the School's philosophy of problem-based learning (PBL) was challenged by lecturers using traditional teaching methods. The solution was a student-centred, problem-based approach to the acquisition of clinical skills that used high-quality learning objects embedded within web pages, substituting for lecturers providing instruction and demonstration. This encouraged student nurses to explore, analyse and make decisions within the safety of a clinical simulation. Learning was facilitated through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that eLearning techniques can help students acquire clinical skills in the safety of a simulated environment within the context of a problem-based learning curriculum.
NASA Technical Reports Server (NTRS)
Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.
2014-01-01
The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.
Preliminary performance analysis of an interplanetary navigation system using asteroid based beacons
NASA Technical Reports Server (NTRS)
Jee, J. Rodney; Khatib, Ahmad R.; Muellerschoen, Ronald J.; Williams, Bobby G.; Vincent, Mark A.
1988-01-01
A futuristic interplanetary navigation system using transmitters placed on selected asteroids is introduced. This network of space beacons is seen as a needed alternative to the overly burdened Deep Space Network. Covariance analyses on the potential performance of these space beacons located on a candidate constellation of eight real asteroids are initiated. Simplified analytic calculations are performed to determine limiting accuracies attainable with the network for geometric positioning. More sophisticated computer simulations are also performed to determine potential accuracies using long arcs of range and Doppler data from the beacons. The results from these computations show promise for this navigation system.
The influence of test mode and visuospatial ability on mathematics assessment performance
NASA Astrophysics Data System (ADS)
Logan, Tracy
2015-12-01
Mathematics assessment and testing are increasingly situated within digital environments, with international tests moving to computer-based testing in the near future. This paper reports on a secondary data analysis which explored the influence that the mode of assessment—computer-based (CBT) or pencil-and-paper based (PPT)—and visuospatial ability had on students' mathematics test performance. Data from 804 grade 6 Singaporean students were analysed using the knowledge discovery in data design. The results revealed statistically significant differences between performance on CBT and PPT test modes across the content areas of whole number, algebraic patterns, and data and chance. However, there were no performance differences for content areas related to spatial arrangements, geometric measurement, or other number. There were also statistically significant differences in performance between those students who possess higher levels of visuospatial ability and those with lower levels, across all six content areas. Implications include careful consideration of the comparability of CBT and PPT testing and the need for increased attention to the role of visuospatial reasoning in students' mathematical reasoning.
Musicians, postural quality and musculoskeletal health: A literature review.
Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez, Aurora
2017-01-01
An analysis of the salient characteristics of research papers published between 1989 and 2015 that evaluate the relationship between postural quality during musical performance and various performance-quality and health factors, with emphasis on musculoskeletal health variables. Searches of Medline, Scopus and Google Scholar were conducted for papers addressing the study objective. The following MeSH descriptors were used: posture; postural balance; muscle, skeletal; task performance and analysis; back; spine; and music. A descriptive statistical analysis of the papers' methodology (sample types, temporal design, and postural, health and other variables analysed) and findings was made. The inclusion criterion was that the body postural quality of the musicians during performance was included among the target study variables. Forty-one relevant empirical studies, written in English, were found. Comparison and analysis of their results was hampered by great disparities in measuring instruments and operationalization of variables. Despite the growing interest in the relationships among these variables, the empirical knowledge base still has many limitations, making rigorous comparative analysis difficult. Copyright © 2016 Elsevier Ltd. All rights reserved.
Heavy hydrocarbon main injector technology
NASA Technical Reports Server (NTRS)
Fisher, S. C.; Arbit, H. A.
1988-01-01
One of the key components of the Advanced Launch System (ALS) is a large liquid-rocket booster engine. To keep overall vehicle size and cost down, this engine will probably use liquid oxygen (LOX) and a heavy hydrocarbon, such as RP-1, as propellants and operate at relatively high chamber pressures to increase overall performance. A technology program, Heavy Hydrocarbon Main Injector Technology, is being conducted to support this engine. Its main objective is to develop a logic plan and a supporting experimental database to reduce the risk of developing a large-scale (approximately 750,000-lb thrust), high-performance main injector system. The overall approach and program plan, from initial analyses to large-scale, two-dimensional combustor design and test, and the current status of the program are discussed. Progress includes performance and stability analyses, cold-flow tests of an injector model, design and fabrication of subscale injectors and calorimeter combustors for performance, heat transfer, and dynamic stability tests, and preparation of hot-fire test plans. Related current high-pressure LOX/RP-1 injector technology efforts are also briefly discussed.
Lopez-Regalado, María Luisa; Martínez-Granados, Luis; González-Utor, Antonio; Ortiz, Nereyda; Iglesias, Miriam; Ardoy, Manuel; Castilla, Jose A
2018-05-24
The Vienna consensus, based on the recommendations of an expert panel, has identified 19 performance indicators for assisted reproductive technology (ART) laboratories. Two levels of reference values are established for these performance indicators: competence and benchmark. For over 10 years, the Spanish Embryology Association (ASEBIR) has participated in the definition and design of ART performance indicators, seeking to establish specific guidelines for ART laboratories to enhance quality, safety and patient welfare. Four years ago, ASEBIR took part in an initiative by AENOR, the Spanish Association for Standardization and Certification, to develop a national standard in this field (UNE 179007:2013 System of quality management for assisted reproduction laboratories), extending the former requirements, based on ISO 9001, to include performance indicators. Considering the experience acquired, we discuss various aspects of the Vienna consensus and consider certain discrepancies in performance indicators between the consensus and UNE 179007:2013, and analyse the definitions, methodology and reference values used. Copyright © 2018. Published by Elsevier Ltd.
The analyses covered in this report examine the consequences of these costs and benefits for overall economic performance and welfare. They are based on the application of a multi-sector, inter-temporal general equilibrium model of the U.S. economy.
Analysis of Shadowing Effects on Spacecraft Power Systems
NASA Technical Reports Server (NTRS)
1995-01-01
As part of an ongoing effort within the NASA Lewis Research Center's Power Systems Project Office to assist in the design and characterization of future space-based power systems, analyses have been performed to assess the effects of shadowing on the capabilities of various power systems on the International Space Station and the Russian MIR.
ERIC Educational Resources Information Center
Elkins, Rebecca L.; King, Keith; Nabors, Laura; Vidourek, Rebecca
2017-01-01
School violence, school violent victimization, and suicidal ideation among adolescents are serious public health concerns. This pilot study investigated the influence of steroid use on problem behaviors. Secondary data analyses of the 2014 PRIDE Questionnaire were performed based on information collected from 38,414 high school adolescents.…
The Ethics of Multiple Authorship: Power, Performativity and the Gift Economy
ERIC Educational Resources Information Center
Macfarlane, Bruce
2017-01-01
The allocation of authorship credit in academic publication raises complex ethical issues but is comparatively under-researched, particularly in the social sciences. The paper analyses the results of research into attitudes to multiple authorship based on a survey questionnaire of academics working in education faculties in universities in Hong…
Latent Profile Analyses of Test Anxiety: A Pilot Study
ERIC Educational Resources Information Center
von der Embse, Nathaniel P.; Mata, Andrea D.; Segool, Natasha; Scott, Emma-Catherine
2014-01-01
In an era of test-based accountability, there has been a renewed interest in understanding the relationship between test anxiety and test performance. The development and validation of test anxiety scales have grown with the rise of test anxiety research. Research is needed to critically examine the psychometric properties of these scales prior to…
Exploring Dynamical Assessments of Affect, Behavior, and Cognition and Math State Test Achievement
ERIC Educational Resources Information Center
San Pedro, Maria Ofelia Z.; Snow, Erica L.; Baker, Ryan S.; McNamara, Danielle S.; Heffernan, Neil T.
2015-01-01
There is increasing evidence that fine-grained aspects of student performance and interaction within educational software are predictive of long-term learning. Machine learning models have been used to provide assessments of affect, behavior, and cognition based on analyses of system log data, estimating the probability of a student's particular…
ERIC Educational Resources Information Center
Castejón, Alba; Zancajo, Adrián
2015-01-01
This article focuses on analysing the effect of educational differentiation policies of OECD educational systems on socioeconomically disadvantaged students, based on data from PISA 2009. The analysis is conducted on the basis of a definition of two subgroups of disadvantaged students: those that achieve high scores, and those obtaining scores…
77 FR 62473 - Safety Zone, Seafair Blue Angels Air Show Performance, Seattle, WA
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
....1319 has been determined to be inadequate to accommodate the anticipated flight pattern of the Blue.... [Datum: NAD 1983]'' However, the participating aircraft have a flight pattern that will extend past the... summarize our analyses based on a number of these statutes or executive orders. 1. Regulatory Planning and...
ERIC Educational Resources Information Center
Löfgren, Håkan
2015-01-01
This article analyses how the teaching profession takes shape when policy demands on increased documentation in preschool is interpreted and enacted by teachers. The profession and professional identities take shape in the tension between two forms of professionalism: occupational professionalism, based on collegial authority, and organizational…
ERIC Educational Resources Information Center
Chyung, Seung Youn; Winiecki, Donald J.; Hunt, Gary; Sevier, Carol M.
2017-01-01
Team projects are increasingly used in engineering courses. Students may develop attitudes toward team projects from prior experience, and their attitudinal responses could influence their performance during team project-based learning in the future. Thus, instructors need to measure students' attitudes toward team projects during their learner…
Improving Students' Attitudes toward Science Using Instructional Congruence
ERIC Educational Resources Information Center
Zain, Ahmad Nurulazam Md; Samsudin, Mohd Ali; Rohandi, Robertus; Jusoh, Azman
2010-01-01
The objective of this study was to improve students' attitudes toward science using instructional congruence. The study was conducted in Malaysia, in three low-performing secondary schools in the state of Penang. Data collected with an Attitudes in Science instrument were analysed using Rasch modeling. Qualitative data based on the reflections of…
NASA Astrophysics Data System (ADS)
Morikawa, Satoshi; Satake, Yuji; Takashiri, Masayuki
2018-06-01
The effects of crystal orientation and grain size on the thermoelectric properties of Bi2Te3 thin films were investigated by conducting experimental and theoretical analyses. To vary the crystal orientation and grain size, we performed oblique deposition, followed by thermal annealing treatment. The crystal orientation decreased as the oblique angle was increased, while the grain size was not changed significantly. The thermoelectric properties were measured at room temperature. A theoretical analysis was performed using a first principles method based on density functional theory. Then the semi-classical Boltzmann transport equation was used in the relaxation time approximation, with the effect of grain size included. Furthermore, the effect of crystal orientation was included in the calculation based on a simple semi-experimental model. A maximum power factor of 11.6 µW/(cm·K2) was obtained at an oblique angle of 40°. The calculated thermoelectric properties were in very good agreement with the experimentally measured values.
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient and economical analyses for man-machine integration, flight operations development and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding data base of Space Shuttle elements, various payloads, experiments, crew equipment and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration and crew training. As OSDS is being applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab) and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of the current and future space programs for efficient, economical analyses.
GLS-Finder: A Platform for Fast Profiling of Glucosinolates in Brassica Vegetables.
Sun, Jianghao; Zhang, Mengliang; Chen, Pei
2016-06-01
Mass spectrometry combined with related tandem techniques has become the most popular method for plant secondary metabolite characterization. We introduce a new strategy based on in-database searching, mass fragmentation behavior study, and formula prediction for fast profiling of glucosinolates, a class of important compounds in brassica vegetables. A MATLAB script-based expert system computer program, "GLS-Finder", was developed. It is capable of qualitative and semi-quantitative analyses of glucosinolates in samples using data generated by ultrahigh-performance liquid chromatography-high-resolution accurate mass with multi-stage mass fragmentation (UHPLC-HRAM/MS(n)). A suite of bioinformatic tools was integrated into the "GLS-Finder" to perform raw data deconvolution, peak alignment, glucosinolate putative assignments, semi-quantitation, and unsupervised principal component analysis (PCA). GLS-Finder was successfully applied to identify intact glucosinolates in 49 commonly consumed Brassica vegetable samples in the United States. It is believed that this work introduces a new approach to fast data processing and interpretation for qualitative and quantitative analyses of glucosinolates, offering greatly improved efficiency compared with manual identification.
Personality and job performance: the Big Five revisited.
Hurtz, G M; Donovan, J J
2000-12-01
Prior meta-analyses investigating the relation between the Big 5 personality dimensions and job performance have all contained a threat to construct validity, in that much of the data included within these analyses was not derived from actual Big 5 measures. In addition, these reviews did not address the relations between the Big 5 and contextual performance. Therefore, the present study sought to provide a meta-analytic estimate of the criterion-related validity of explicit Big 5 measures for predicting job performance and contextual performance. The results for job performance closely paralleled 2 of the previous meta-analyses, whereas analyses with contextual performance showed more complex relations among the Big 5 and performance. A more critical interpretation of the Big 5-performance relationship is presented, and suggestions for future research aimed at enhancing the validity of personality predictors are provided.
The association between restorative pre-clinical activities and musculoskeletal disorders.
Corrocher, P A; Presoto, C D; Campos, J A D B; Garcia, P P N S
2014-08-01
To evaluate the risk of development of musculoskeletal disorders in the upper limbs of undergraduate dentistry students during the execution of pre-clinical laboratory activities based on gender, type of dental procedure and area of the mouth under treatment. Male and female undergraduate students in the second year of the Araraquara Dental School, UNESP, were enrolled in this study. Digital photographs were obtained whilst the subjects performed laboratory activities. The working postures adopted by each student were evaluated using the Rapid Upper Limb Assessment (RULA). The photos were analysed by a calibrated researcher (k = 0.89), and a final risk score was attributed to each analysed procedure (n = 354). Descriptive statistical analyses were performed, and the associations of interest were analysed by the chi-square test (P = 0.05). During most of the laboratory procedures performed, the risk of developing musculoskeletal disorders was high (64.7%; 95% CI: 59.7-69.7%), with no significant association between the RULA final score and gender (χ(2) = 1.100; P = 0.577), type of dental procedure (χ(2) = 5.447, P = 0.244) and mouth area treated (χ(2) = 4.150; P = 0.126). The risk of developing musculoskeletal disorders was high in undergraduate dentistry students; this risk was not related to gender, type of dental procedure and region of the mouth being treated. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Katsuyama, Jinya; Uno, Shumpei; Watanabe, Tadashi; Li, Yinsheng
2018-03-01
The thermal hydraulic (TH) behavior of coolant water is a key factor in structural integrity assessments of reactor pressure vessels (RPVs) of pressurized water reactors (PWRs) under pressurized thermal shock (PTS) events, because the TH behavior may affect the loading conditions in the assessment. From the viewpoint of TH behavior, the configuration of plant equipment and their dimensions, and the operator action time, considerably influence various parameters, such as the temperature and flow rate of coolant water and inner pressure. In this study, to investigate the influence of the operator action time on TH behavior during a PTS event, we developed an analysis model for a typical Japanese PWR plant, including the RPV and the main components of both primary and secondary systems, and performed TH analyses by using a system analysis code called RELAP5. We applied two different operator action times based on the Japanese and the United States (US) rules: operators may act 10 min (Japanese rules) or 30 min (US rules) after the occurrence of a PTS event. Based on the results of TH analysis with the different operator action times, we also performed structural analyses for evaluating thermal-stress distributions in the RPV during PTS events as loading conditions in the structural integrity assessment. From the analysis results, it was clarified that differences in operator action times significantly affect TH behavior and loading conditions: the Japanese rule may lead to lower stresses than the US rule, because the earlier operator action results in lower pressure in the RPV.
NASA Astrophysics Data System (ADS)
Huang, X.; Oram, C.; Sick, M.
2014-03-01
More efforts are put on hydro-power to balance voltage and frequency within seconds for primary control in modern smart grids. This requires hydraulic turbines to run at off-design conditions, especially at low load or speed-no load. Besides, the tendency of increasing power output and decreasing weight of the turbine runners has also led to the high level vibration problem of the runners, especially high head Francis runners. Therefore, it is important to carry out the static and dynamic stress analyses of prototype high head Francis runners. This paper investigates the static and dynamic stresses on the prototype high head Francis runner based on site measurements and numerical simulations. The site measurements are performed with pressure transducers and strain gauges. Based on the measured results, computational fluid dynamics (CFD) simulations for the flow channel from stay vane to draft tube cone are performed. Static pressure distributions and dynamic pressure pulsations caused by rotor-stator interaction (RSI) are obtained under various operating conditions. With the CFD results, static and dynamic stresses on the runner at different operating points are calculated by means of the finite element method (FEM). The agreement between simulation and measurement is analysed with the linear regression method, which indicates that the numerical result agrees well with that of measurement. Furthermore, the maximum static and dynamic stresses on the runner blade are obtained at various operating points. The relations of the maximum stresses and the power output are discussed in detail. The influences of the boundary conditions on the structural behaviour of the runner are also discussed.
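The simulation-vs-measurement agreement in the abstract above was checked with linear regression; a minimal ordinary-least-squares fit can be sketched as follows. The data values and variable names are purely illustrative, not the study's measurements.

```python
# Minimal ordinary-least-squares line fit, as used to compare simulated
# against measured stresses. All numbers below are made up for illustration.

def linfit(x, y):
    """Return slope a and intercept b of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

measured  = [100.0, 150.0, 200.0, 250.0]   # e.g. measured stress, MPa
simulated = [102.0, 149.0, 201.0, 248.0]   # corresponding FEM prediction, MPa
a, b = linfit(measured, simulated)
# A slope near 1 and an intercept near 0 indicate good agreement.
print(round(a, 3), round(b, 3))
```

With the toy data above the fit gives a slope of 0.98 and a small intercept, i.e. close agreement between the two series.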
The Cost of Penicillin Allergy Evaluation.
Blumenthal, Kimberly G; Li, Yu; Banerji, Aleena; Yun, Brian J; Long, Aidan A; Walensky, Rochelle P
2017-09-22
Unverified penicillin allergy leads to adverse downstream clinical and economic sequelae. Penicillin allergy evaluation can be used to identify true, IgE-mediated allergy. To estimate the cost of penicillin allergy evaluation using time-driven activity-based costing (TDABC). We implemented TDABC throughout the care pathway for 30 outpatients presenting for penicillin allergy evaluation. The base-case evaluation included penicillin skin testing and a 1-step amoxicillin drug challenge, performed by an allergist. We varied assumptions about the provider type, clinical setting, procedure type, and personnel timing. The base-case penicillin allergy evaluation costs $220 in 2016 US dollars: $98 for personnel, $119 for consumables, and $3 for space. In sensitivity analyses, lower cost estimates were achieved when only a drug challenge was performed (ie, no skin test, $84) and when a nurse practitioner provider was used ($170). Adjusting for the probability of anaphylaxis did not result in a changed estimate ($220); although other analyses led to modest changes in the TDABC estimate ($214-$246), higher estimates were identified with changing to a low-demand practice setting ($268), a 50% increase in personnel times ($269), and including clinician documentation time ($288). In least/most-costly scenario analyses, the lowest TDABC estimate was $40 and the highest was $537. Using TDABC, penicillin allergy evaluation costs $220; even with varied assumptions adjusting for operational challenges, clinical setting, and expanded testing, penicillin allergy evaluation still costs only about $540. This modest investment may be offset for patients treated with costly alternative antibiotics that also may result in adverse consequences. Copyright © 2017 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
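The base-case arithmetic above (personnel + consumables + space = $220) can be sketched as a minimal TDABC calculation. Only the three component totals come from the abstract; the per-provider minutes and capacity cost rates below are hypothetical.

```python
# Minimal time-driven activity-based costing (TDABC) sketch.
# Only the component totals ($98 personnel, $119 consumables, $3 space)
# come from the study; the personnel breakdown is illustrative.

def tdabc_cost(personnel, consumables, space):
    """Total cost of one care-pathway episode, in dollars."""
    # Personnel cost: minutes spent by each provider times that
    # provider's capacity cost rate ($/min), summed over the pathway.
    personnel_total = sum(minutes * rate for minutes, rate in personnel)
    return personnel_total + consumables + space

# Hypothetical breakdown that reproduces the reported $98 personnel figure.
personnel = [(30, 2.00), (38, 1.00)]   # (minutes, $/min) per provider
total = tdabc_cost(personnel, consumables=119, space=3)
print(total)  # 220.0
```

The sensitivity analyses in the abstract amount to re-running this sum with different times, rates, and consumable costs.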
Xu, X-H; Xiong, D-H; Liu, X-G; Guo, Y; Chen, Y; Zhao, J; Recker, R R; Deng, H-W
2010-01-01
This study was conducted to test whether there exists an association between vitamin D-binding protein (DBP) gene and compression strength index (CSI) phenotype. Candidate gene association analyses were conducted in total sample, male subgroup, and female subgroup, respectively. Two single-nucleotide polymorphisms (SNPs) with significant association results were found in males, suggesting the importance of DBP gene polymorphisms on the variation in CSI especially in Caucasian males. CSI of the femoral neck (FN) is a newly developed phenotype integrating information about bone size, body size, and bone mineral density. It is considered to have the potential to improve the performance of risk assessment for hip fractures because it is based on a combination of phenotypic traits influencing hip fractures rather than a single trait. CSI is under moderate genetic determination (with a heritability of approximately 44% found in this study), but the relevant genetic study is still rather scarce. Based on the known physiological role of DBP in bone biology and the relatively high heritability of CSI, we tested 12 SNPs of the DBP gene for association with CSI variation in 405 Caucasian nuclear families comprising 1,873 subjects from the Midwestern US. Association analyses were performed in the total sample, male and female subgroups, respectively. Significant associations with CSI were found with two SNPs (rs222029, P = 0.0019; rs222020, P = 0.0042) for the male subgroup. Haplotype-based association tests corroborated the single-SNP results. Our findings suggest that the DBP gene might be one of the genetic factors influencing CSI phenotype in Caucasians, especially in males.
Psychometric properties of stress and anxiety measures among nulliparous women.
Bann, Carla M; Parker, Corette B; Grobman, William A; Willinger, Marian; Simhan, Hyagriv N; Wing, Deborah A; Haas, David M; Silver, Robert M; Parry, Samuel; Saade, George R; Wapner, Ronald J; Elovitz, Michal A; Miller, Emily S; Reddy, Uma M
2017-03-01
To examine the psychometric properties of three measures, the Perceived Stress Scale (PSS), Pregnancy Experience Scale (PES), and State-Trait Anxiety Inventory (STAI), for assessing stress and anxiety during pregnancy among a large sample of nulliparous women. The sample included 10,002 pregnant women participating in the Nulliparous Pregnancy Outcomes Study: Monitoring Mothers-to-Be (nMoM2b). Internal consistency reliability was assessed with Cronbach's alpha and factorial validity with confirmatory factor analyses. Intraclass correlations (ICCs) were calculated to determine stability of PSS scales over time. Psychometric properties were examined for the overall sample, as well as subgroups based on maternal age, race/ethnicity and language. All three scales demonstrated good internal consistency reliability. Confirmatory factor analyses supported the factor structures of the PSS and the PES. However, a one-factor solution of the trait-anxiety subscale from the STAI did not fit well; a two-factor solution, splitting the items into factors based on direction of item wording (positive versus negative), provided a better fit. Scores on the PSS were generally stable over time (ICC = 0.60). Subgroup analyses revealed a few items that did not perform well on Spanish versions of the scales. Overall, the scales performed well, suggesting they could be useful tools for identifying women experiencing high levels of stress and anxiety during pregnancy and allowing for the implementation of interventions to help reduce maternal stress and anxiety.
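Cronbach's alpha, the internal-consistency statistic used above, can be sketched from its standard formula. This is a generic textbook implementation, not the study's own code; the toy data are illustrative.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

def cronbach_alpha(items):
    """items: list of equal-length score lists, one list per scale item."""
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]      # per-respondent sum scores
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Three perfectly consistent items yield the maximum alpha of 1.0.
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]), 10))  # 1.0
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, which is the sense in which the scales above "demonstrated good internal consistency reliability".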
NASA Technical Reports Server (NTRS)
Ruf, Joseph; Holt, James B.; Canabal, Francisco
1999-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Two-dimensional model of a Space Station Freedom thermal energy storage canister
NASA Astrophysics Data System (ADS)
Kerslake, Thomas W.; Ibrahim, Mounir B.
1990-08-01
The Solar Dynamic Power Module being developed for Space Station Freedom uses a eutectic mixture of LiF-CaF2 phase change salt contained in toroidal canisters for thermal energy storage. Results are presented from heat transfer analyses of the phase change salt containment canister. A 2-D, axisymmetric finite difference computer program which models the canister walls, salt, void, and heat engine working fluid coolant was developed. Analyses included effects of conduction in canister walls and solid salt, conduction and free convection in liquid salt, conduction and radiation across salt vapor filled void regions and forced convection in the heat engine working fluid. Void shape, location, growth or shrinkage (due to density difference between the solid and liquid salt phases) were prescribed based on engineering judgement. The salt phase change process was modeled using the enthalpy method. Discussion of results focuses on the role of free-convection in the liquid salt on canister heat transfer performance. This role is shown to be important for interpreting the relationship between ground based canister performance (in 1-g) and expected on-orbit performance (in micro-g). Attention is also focused on the influence of void heat transfer on canister wall temperature distributions. The large thermal resistance of void regions is shown to accentuate canister hot spots and temperature gradients.
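The enthalpy method mentioned above tracks specific enthalpy rather than temperature, so the melt front needs no explicit tracking: the temperature is simply recovered from the enthalpy field each step. A minimal sketch of that recovery is below; the property values are illustrative round numbers, not the LiF-CaF2 data used in the study.

```python
# Enthalpy-to-temperature recovery for a phase change material.
# Properties below are illustrative, not the LiF-CaF2 eutectic's values.

def temperature(h, cp_s=1.0, cp_l=1.25, T_melt=1040.0, L=400.0):
    """Recover temperature (K) from enthalpy h (kJ/kg), with h = 0 at 0 K solid.

    cp_s, cp_l: solid/liquid specific heats (kJ/kg-K); L: latent heat (kJ/kg).
    """
    h_sol = cp_s * T_melt          # enthalpy when melting begins
    h_liq = h_sol + L              # enthalpy when melting completes
    if h < h_sol:                  # fully solid
        return h / cp_s
    if h <= h_liq:                 # mushy zone: temperature pinned at T_melt
        return T_melt
    return T_melt + (h - h_liq) / cp_l  # fully liquid

print(temperature(1040.0))   # 1040.0 (just reaching the melt point)
print(temperature(1240.0))   # 1040.0 (mid-melt: latent heat being absorbed)
print(temperature(1536.0))   # 1116.8 (fully liquid, superheated)
```

In a finite-difference scheme like the one described, each cell's enthalpy is advanced by the energy balance and this mapping supplies the temperatures that drive conduction, convection, and radiation terms.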
NASA Astrophysics Data System (ADS)
Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar
2018-07-01
This study aims to discuss the solution methodology for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as distribution of the new products. This supply chain is presented on behalf of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC that involves three objective functions of maximizing the profit, minimizing the total risk and shortages of products. Since three objective functions are considered, a multi-objective solution methodology can be advantageous. Therefore, several approaches have been studied and an NSGA-II algorithm is first utilized, and then the results are validated using an MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated by four criteria involving mean ideal distance, spread of non-dominance solution, the number of Pareto solutions, and CPU time. In order to enhance the performance of the algorithms, Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.
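Two of the comparison criteria named above, the Pareto (non-dominated) set and mean ideal distance (MID), can be sketched as follows. The sketch assumes all three objectives are expressed as minimizations and uses an ideal point at the origin; function names and data are illustrative, not the paper's implementation.

```python
# Pareto-front extraction and mean ideal distance (MID) for a
# three-objective minimization problem. Toy data for illustration.
import math

def dominates(a, b):
    """True if solution a dominates b (<= in every objective, < in at least one)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

def mean_ideal_distance(front, ideal=(0.0, 0.0, 0.0)):
    """Average Euclidean distance of front members from the ideal point."""
    return sum(math.dist(s, ideal) for s in front) / len(front)

sols = [(1, 2, 3), (2, 2, 3), (3, 1, 1), (2, 3, 4)]
front = pareto_front(sols)
print(front)   # [(1, 2, 3), (3, 1, 1)]
```

The other two criteria in the abstract are simpler still: the number of Pareto solutions is `len(front)`, and CPU time is measured directly; spread-of-non-dominance metrics likewise reduce to distances between neighbouring front members.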
Current Approaches to Tactical Performance Analyses in Soccer Using Position Data.
Memmert, Daniel; Lemmink, Koen A P M; Sampaio, Jaime
2017-01-01
Tactical match performance depends on the quality of actions of individual players or teams in space and time during match-play in order to be successful. Technological innovations have led to new possibilities to capture accurate spatio-temporal information of all players and unravel the dynamics and complexity of soccer matches. The main aim of this article is to give an overview of the current state of development of the analysis of position data in soccer. Based on the same single set of position data of a high-level 11 versus 11 match (Bayern Munich against FC Barcelona) three different promising approaches from the perspective of dynamic systems and neural networks will be presented: Tactical performance analysis revealed inter-player coordination, inter-team and inter-line coordination before critical events, as well as team-team interaction and compactness coefficients. This could lead to a multi-disciplinary discussion on match analyses in sport science and new avenues for theoretical and practical implications in soccer.
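Compactness-style coefficients of the kind mentioned above are computed directly from position data: a team centroid and the players' mean distance to it. The sketch below is a generic illustration of that idea, not the article's exact formulation.

```python
# Team centroid and a simple compactness measure from one frame of
# position data. Coordinates (x, y in metres) are made up for illustration.
import math

def centroid(positions):
    xs, ys = zip(*positions)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def compactness(positions):
    """Mean Euclidean distance of players from the team centroid (metres).

    Smaller values mean a more compact team shape."""
    c = centroid(positions)
    return sum(math.dist(p, c) for p in positions) / len(positions)

# Toy frame: four players arranged in a rectangle around (12, 23).
team = [(10.0, 20.0), (14.0, 20.0), (10.0, 26.0), (14.0, 26.0)]
print(centroid(team))                # (12.0, 23.0)
print(round(compactness(team), 3))   # 3.606 — every player is sqrt(13) m out
```

Inter-team coordination measures are built the same way, e.g. by tracking the distance and relative phase between the two teams' centroids over the match.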
Analytic and heuristic processing influences on adolescent reasoning and decision-making.
Klaczynski, P A
2001-01-01
The normative/descriptive gap is the discrepancy between actual reasoning and traditional standards for reasoning. The relationship between age and the normative/descriptive gap was examined by presenting adolescents with a battery of reasoning and decision-making tasks. Middle adolescents (N = 76) performed closer to normative ideals than early adolescents (N = 66), although the normative/descriptive gap was large for both groups. Correlational analyses revealed that (1) normative responses correlated positively with each other, (2) nonnormative responses were positively interrelated, and (3) normative and nonnormative responses were largely independent. Factor analyses suggested that performance was based on two processing systems. The "analytic" system operates on "decontextualized" task representations and underlies conscious, computational reasoning. The "heuristic" system operates on "contextualized," content-laden representations and produces "cognitively cheap" responses that sometimes conflict with traditional norms. Analytic processing was more clearly linked to age and to intelligence than heuristic processing. Implications for cognitive development, the competence/performance issue, and rationality are discussed.
Analysis and Design of Rotors at Ultra-Low Reynolds Numbers
NASA Technical Reports Server (NTRS)
Kunz, Peter J.; Strawn, Roger C.
2003-01-01
Design tools have been developed for ultra-low Reynolds number rotors, combining enhanced actuator-ring / blade-element theory with airfoil section data based on two-dimensional Navier-Stokes calculations. This performance prediction method is coupled with an optimizer for both design and analysis applications. Performance predictions from these tools have been compared with three-dimensional Navier-Stokes analyses and experimental data for a 2.5 cm diameter rotor with chord Reynolds numbers below 10,000. Comparisons among the analyses and experimental data show reasonable agreement both in the global thrust and power required, but the spanwise distributions of these quantities exhibit significant deviations. The study also reveals that three-dimensional and rotational effects significantly change local airfoil section performance. The magnitude of this issue, unique to this operating regime, may limit the applicability of blade-element type methods for detailed rotor design at ultra-low Reynolds numbers, but these methods are still useful for evaluating concept feasibility and rapidly generating initial designs for further analysis and optimization using more advanced tools.
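The combined blade-element / momentum machinery behind such tools can be sketched compactly. The following hover-thrust estimate is an illustrative sketch only: the function name, the reduced lift-curve slope standing in for degraded low-Reynolds 2-D section data, and all default values are assumptions, not the paper's method.

```python
import math

def bemt_hover_thrust(radius_m, chord_m, n_blades, theta_deg, omega_rad_s,
                      cl_alpha=4.5, rho=1.225, n_elems=20):
    """Hover thrust from simple combined blade-element/momentum theory.

    cl_alpha (per rad) is an assumed, reduced lift-curve slope standing in
    for the degraded 2-D section performance at ultra-low Reynolds numbers.
    """
    sigma = n_blades * chord_m / (math.pi * radius_m)  # rotor solidity
    theta = math.radians(theta_deg)                    # collective pitch
    dr = radius_m / n_elems
    thrust = 0.0
    for i in range(n_elems):
        r = (i + 0.5) * dr                   # element mid-span radius
        r_bar = r / radius_m
        # closed-form induced inflow ratio for hover (small angles)
        lam = (sigma * cl_alpha / 16.0) * (
            math.sqrt(1.0 + 32.0 * theta * r_bar / (sigma * cl_alpha)) - 1.0)
        alpha = theta - lam / r_bar          # effective angle of attack
        cl = cl_alpha * alpha                # linear section lift
        u = omega_rad_s * r                  # local tangential speed
        thrust += n_blades * 0.5 * rho * u * u * chord_m * cl * dr
    return thrust
```

In this small-angle formulation the closed-form inflow keeps the effective angle of attack non-negative for positive pitch, so the predicted thrust is positive; what it cannot capture are the three-dimensional and rotational effects on section performance that the abstract highlights.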
Nonlinear analysis of switched semi-active controlled systems
NASA Astrophysics Data System (ADS)
Eslaminasab, Nima; Vahid A., Orang; Golnaraghi, Farid
2011-02-01
Semi-active systems improve the suspension performance of vehicles more effectively than conventional passive systems by simultaneously improving ride comfort and road handling. Because of their size, weight, price and performance advantages, they have also gained more interest than both active and passive systems. Probably the most neglected aspect of semi-active on-off control systems and strategies is the effect of the nonlinearities these systems introduce, which are presented and analysed in this paper. To do so, numerical techniques, the analytical method of averaging and experimental analysis are deployed. A new method to analyse, calculate and compare the performance of semi-active controlled systems is proposed; further, a new controller based on observations of actual test data is proposed to eliminate the adverse effects of the added nonlinearities. The significance of the proposed system lies in the simplicity of the algorithm and its ease of implementation; this new semi-active control strategy could easily be adopted and used with most existing semi-active control systems.
Statistical analysis of the determinations of the Sun's Galactocentric distance
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2013-02-01
Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In both cases, however, detailed statistical analyses of the data used were not performed. Yet the computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
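The metrological flavour of such an analysis is easy to illustrate: a weighted mean with its formal error, plus a weighted least-squares slope to test for a trend. The numbers below are invented for illustration; they are not the 53 published R0 values analysed in the paper.

```python
import numpy as np

# Hypothetical R0 estimates (kpc) with quoted 1-sigma errors and epochs.
years = np.array([1995.0, 2000.0, 2005.0, 2010.0, 2013.0])
r0    = np.array([8.1, 7.9, 8.4, 8.3, 8.2])
sigma = np.array([0.6, 0.5, 0.4, 0.3, 0.2])

w = 1.0 / sigma**2
r0_wm = np.sum(w * r0) / np.sum(w)        # weighted mean
err_wm = np.sqrt(1.0 / np.sum(w))         # its formal error

# Weighted least-squares slope tests for a trend (a "bandwagon effect"
# would show up as a statistically significant slope).
X = np.vstack([np.ones_like(years), years - years.mean()]).T
beta, *_ = np.linalg.lstsq(X * np.sqrt(w)[:, None],
                           r0 * np.sqrt(w), rcond=None)
print(f"weighted mean R0 = {r0_wm:.2f} +/- {err_wm:.2f} kpc, "
      f"slope = {beta[1]:+.4f} kpc/yr")
```

Inverse-variance weighting makes the most precise recent measurements dominate the mean, which is exactly why the improving formal errors noted in the abstract matter for the combined estimate.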
Members' needs, intragroup conflict, and group performance.
Chun, Jinseok S; Choi, Jin Nam
2014-05-01
Focusing on "what people want in their group" as a critical antecedent of intragroup conflict, the present study theorizes and empirically investigates the relationships among the psychological needs of group members, intragroup conflict, and group performance. It attends to the within-group average and dispersion of members' psychological needs and examines the effects stemming from group composition of needs on multiple types of conflict. The analyses based on multisource data from 145 organizational teams revealed significant relationships between the groups' composition with respect to the members' need for achievement and task conflict, need for affiliation and relationship conflict, and need for power and status conflict. Some of these relationships were moderated by open communication among members. The analyses also demonstrated that when the 3 types of conflict were considered together, task conflict was a positive predictor of group performance, whereas relationship conflict was a negative predictor. The findings highlight the motivational aspects of intragroup conflict, revealing the multilevel dynamics of the psychological needs in social settings. (c) 2014 APA, all rights reserved.
Delineating Species with DNA Barcodes: A Case of Taxon Dependent Method Performance in Moths
Kekkonen, Mari; Mutanen, Marko; Kaila, Lauri; Nieminen, Marko; Hebert, Paul D. N.
2015-01-01
The accelerating loss of biodiversity has created a need for more effective ways to discover species. Novel algorithmic approaches for analyzing sequence data combined with rapidly expanding DNA barcode libraries provide a potential solution. While several analytical methods are available for the delineation of operational taxonomic units (OTUs), few studies have compared their performance. This study compares the performance of one morphology-based and four DNA-based (BIN, parsimony networks, ABGD, GMYC) methods on two groups of gelechioid moths. It examines 92 species of Finnish Gelechiinae and 103 species of Australian Elachistinae which were delineated by traditional taxonomy. The results reveal a striking difference in performance between the two taxa with all four DNA-based methods. OTU counts in the Elachistinae showed a wider range and a relatively low (ca. 65%) OTU match with reference species while OTU counts were more congruent and performance was higher (ca. 90%) in the Gelechiinae. Performance rose when only monophyletic species were compared, but the taxon-dependence remained. None of the DNA-based methods produced a correct match with non-monophyletic species, but singletons were handled well. A simulated test of morphospecies-grouping performed very poorly in revealing taxon diversity in these small, dull-colored moths. Despite the strong performance of analyses based on DNA barcodes, species delineated using single-locus mtDNA data are best viewed as OTUs that require validation by subsequent integrative taxonomic work. PMID:25849083
Active learning for noisy oracle via density power divergence.
Sogawa, Yasuhiro; Ueno, Tsuyoshi; Kawahara, Yoshinobu; Washio, Takashi
2013-10-01
The accuracy of active learning is critically influenced by the existence of noisy labels given by a noisy oracle. In this paper, we propose a novel pool-based active learning framework through robust measures based on density power divergence. By minimizing density power divergence, such as β-divergence and γ-divergence, one can estimate the model accurately even under the existence of noisy labels within data. Accordingly, we develop query selecting measures for pool-based active learning using these divergences. In addition, we propose an evaluation scheme for these measures based on asymptotic statistical analyses, which enables us to perform active learning by evaluating an estimation error directly. Experiments with benchmark datasets and real-world image datasets show that our active learning scheme performs better than several baseline methods. Copyright © 2013 Elsevier Ltd. All rights reserved.
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-08-01
SplitLab is a powerful and widely used tool for analysing seismological shear-wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside or ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user to process individual multi-event splitting measurements for a single recording station with maximum flexibility. Beyond the functions provided by the plugin, no external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin are demonstrated with data from a long-running seismological recording station in Finland.
Oman India Pipeline: An operational repair strategy based on a rational assessment of risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
German, P.
1996-12-31
This paper describes the development of a repair strategy for the operational phase of the Oman India Pipeline based upon the probability and consequences of a pipeline failure. The risk analyses and cost-benefit analyses performed provide guidance on the level of deepwater repair development effort appropriate for the Oman India Pipeline project and identify critical areas toward which more intense development effort should be directed. The risk analysis results indicate that the likelihood of a failure of the Oman India Pipeline during its 40-year life is low. Furthermore, the probability of operational failure of the pipeline in deepwater regions is extremely low, with the major proportion of operational failure risk being associated with the shallow-water regions.
Li, Changwei; Bazzano, Lydia A.L.; Rao, Dabeeru C.; Hixson, James E.; He, Jiang; Gu, Dongfeng; Gu, Charles C.; Shimmin, Lawrence C.; Jaquish, Cashell E.; Schwander, Karen; Liu, De-Pei; Huang, Jianfeng; Lu, Fanghong; Cao, Jie; Chong, Shen; Lu, Xiangfeng; Kelly, Tanika N.
2016-01-01
We conducted a genome-wide linkage scan and positional association study to identify genes and variants influencing blood lipid levels among participants of the Genetic Epidemiology Network of Salt-Sensitivity (GenSalt) study. The GenSalt study was conducted among 1906 participants from 633 Han Chinese families. Lipids were measured from overnight fasting blood samples using standard methods. Multipoint quantitative trait genome-wide linkage scans were performed on the high-density lipoprotein, low-density lipoprotein, and log-transformed triglyceride phenotypes. Using dense panels of single nucleotide polymorphisms (SNPs), single-marker and gene-based association analyses were conducted to follow-up on promising linkage signals. Additive associations between each SNP and lipid phenotypes were tested using mixed linear regression models. Gene-based analyses were performed by combining P-values from single-marker analyses within each gene using the truncated product method (TPM). Significant associations were assessed for replication among 777 Asian participants of the Multi-ethnic Study of Atherosclerosis (MESA). Bonferroni correction was used to adjust for multiple testing. In the GenSalt study, suggestive linkage signals were identified at 2p11.2–2q12.1 [maximum multipoint LOD score (MML) = 2.18 at 2q11.2] and 11q24.3–11q25 (MML = 2.29 at 11q25) for the log-transformed triglyceride phenotype. Follow-up analyses of these two regions revealed gene-based associations of charged multivesicular body protein 3 (CHMP3), ring finger protein 103 (RNF103), AF4/FMR2 family, member 3 (AFF3), and neurotrimin (NTM ) with triglycerides (P = 4 × 10−4, 1.00 × 10−5, 2.00 × 10−5, and 1.00 × 10−7, respectively). Both the AFF3 and NTM triglyceride associations were replicated among MESA study participants (P = 1.00 × 10−7 and 8.00 × 10−5, respectively). Furthermore, NTM explained the linkage signal on chromosome 11. 
In conclusion, we identified novel genes associated with lipid phenotypes in linkage regions on chromosomes 2 and 11. PMID:25819087
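The truncated product method (TPM) used for the gene-based tests combines the single-marker P-values that fall below a threshold. A minimal sketch, assuming independent markers and a Monte Carlo null (the GenSalt analysis would additionally have to account for linkage disequilibrium between SNPs), could look like:

```python
import numpy as np

def tpm_statistic(pvals, tau=0.05):
    """Truncated product: the product of all p-values at or below tau."""
    p = np.asarray(pvals, dtype=float)
    trunc = p[p <= tau]
    return float(np.prod(trunc)) if trunc.size else 1.0

def tpm_pvalue(pvals, tau=0.05, n_sim=20000, rng=None):
    """Monte Carlo p-value for the TPM under independent uniform nulls.

    Smaller products are more extreme, so the p-value is the fraction of
    simulated statistics at least as small as the observed one.
    """
    rng = np.random.default_rng(rng)
    obs = tpm_statistic(pvals, tau)
    null = rng.uniform(size=(n_sim, len(pvals)))
    null_stats = np.array([tpm_statistic(row, tau) for row in null])
    return float(np.mean(null_stats <= obs))
```

A gene with a few strong single-marker signals (e.g. p-values of 0.001 and 0.002 among otherwise null markers) yields a tiny truncated product and hence a small combined p-value, which is how the method amplifies clustered association evidence within a gene.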
Conceptual Study on Hypersonic Turbojet Experimental Vehicle (HYTEX)
NASA Astrophysics Data System (ADS)
Taguchi, Hideyuki; Murakami, Akira; Sato, Tetsuya; Tsuchiya, Takeshi
Pre-cooled turbojet engines have been investigated with the aim of realizing reusable space transportation systems and hypersonic airplanes. Methods for evaluating the performance of these engines have been established based on ground tests. There are several plans for the demonstration of hypersonic propulsion systems. JAXA has focused on hypersonic propulsion systems as a key technology for a hypersonic transport airplane; demonstration of Mach 5 class hypersonic technologies is stated as a development target for 2025 in the long-term vision. In this study, systems analyses of a hypersonic turbojet experimental vehicle (HYTEX) with Mach 5 flight capability are performed. Aerodynamic coefficients are obtained by CFD analyses and wind tunnel tests. A small pre-cooled turbojet is fabricated and tested using liquid hydrogen as fuel. As a result, the characteristics of the baseline vehicle shape are clarified, and the effects of pre-cooling are confirmed in the firing test.
The Influence of Leadership in Implementing Management Systems
NASA Astrophysics Data System (ADS)
Nae, Ilie; Solomon, Gheorghe; Severin, Irina
2014-12-01
This paper presents a new perspective on the implementation of Management Systems within organizations in order to increase the success rate. The objective is to analyse how leadership can influence the implementation positively or negatively, depending on the leadership approach chosen. It offers a method to analyse the maturity of leadership in any organization, based on existing leadership models, completing these models with the specificities of a Management System. The Maturity Grid is extended to key elements of Organizational Leadership: Strategic Planning, Process and Performance. The expected result is to move from the current understanding of leadership during a Management System implementation (leadership seen as a principle) to an active leadership, implemented at the organizational level. It proposes an alternative to the classic management approach: a Performance Management approach that integrates leadership naturally into all processes and methods.
Menachemi, Nir; Burkhardt, Jeffrey; Shewchuk, Richard; Burke, Darrell; Brooks, Robert G
2007-01-01
Outsourcing of information technology (IT) functions is a popular strategy with both potential benefits and risks for hospitals. Anecdotal evidence, based on case studies, suggests that outsourcing may be associated with significant cost savings. However, no generalizable evidence exists to support such assertions. This study examines whether outsourcing IT functions is related to improved financial performance in hospitals. Primary survey data on IT outsourcing behavior were combined with secondary data on hospital financial performance. Regression analyses examined the relationship between outsourcing and various measures of financial performance while controlling for bed size, average patient acuity, geographic location, and overall IT adoption. Complete data from a total of 83 Florida hospitals were available for analyses. Findings suggest that the decision to outsource IT functions is not related to any of the hospital financial performance measures that were examined. Specifically, outsourcing of IT functions did not correlate with net inpatient revenue, net patient revenue, hospital expenses, total expenses, cash flow ratio, operating margin, or total margin. In most cases, IT outsourcing is not a cost-lowering strategy but rather a cost-neutral means of accomplishing an organizational strategy.
Solar and Magnetic Attitude Determination for Small Spacecraft
NASA Technical Reports Server (NTRS)
Woodham, Kurt; Blackman, Kathie; Sanneman, Paul
1997-01-01
During the Phase B development of the NASA New Millennium Program (NMP) Earth Orbiter-1 (EO-1) spacecraft, detailed analyses were performed for on-board attitude determination using the Sun and the Earth's magnetic field. This work utilized the TRMM 'Contingency Mode' as a starting point but concentrated on implementation for a small spacecraft without a high performance mechanical gyro package. The analyses and simulations performed demonstrate a geographic dependence due to diurnal variations in the Earth magnetic field with respect to the Sun synchronous, nearly polar orbit. Sensitivity to uncompensated residual magnetic fields of the spacecraft and field modeling errors is shown to be the most significant obstacle for maximizing performance. Performance has been evaluated with a number of inertial reference units and various mounting orientations for the two-axis Fine Sun Sensors. Attitude determination accuracy using the six state Kalman Filter executing at 2 Hz is approximately 0.2 deg, 3-sigma, per axis. Although EO-1 was subsequently driven to a stellar-based attitude determination system as a result of tighter pointing requirements, solar/magnetic attitude determination is demonstrated to be applicable to a range of small spacecraft with medium precision pointing requirements.
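The six-state filter itself is specific to EO-1, but the predict/correct cycle at its core can be illustrated with a scalar Kalman filter. All gains and noise variances below are illustrative assumptions, not the mission's tuning.

```python
def kalman_step(x, P, z, A=1.0, Q=1e-4, H=1.0, R=0.01):
    """One predict/correct cycle of a scalar Kalman filter.

    EO-1's design used a six-state filter running at 2 Hz; this scalar
    version just illustrates how model propagation (predict) is blended
    with Sun-sensor/magnetometer measurements z (correct).
    """
    # predict: propagate the state and inflate its covariance by Q
    x_pred = A * x
    P_pred = A * P * A + Q
    # correct: weight the measurement residual by the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```

Iterating this update over a stream of measurements drives the covariance P down to a steady state set by Q and R; uncompensated residual magnetic fields, as the abstract notes, act like a bias the filter cannot remove, which is why they dominate the error budget.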
Importance-performance analysis of dental satisfaction among three ethnic groups in Malaysia.
Dewi, Fellani Danasra; Gundavarapu, Kalyan C; Cugati, Navaneetha
2013-01-01
The aim was to identify differences in patient satisfaction with dental services among three ethnic groups - Chinese, Indian and Malay - at the AIMST University Dental Centre and to analyse them with an importance-performance grid, identifying weak and strong points in order to provide better service. This questionnaire-based study consisted of convenience samples of 174 patients of Chinese, Indian and Malay ethnicity. Importance and performance ratings for 20 attributes, collected on Likert scales, were compared. The data obtained were statistically analysed using the Kruskal-Wallis test. Chinese and Indian participants both reported low performance on the interpersonal-relationship attribute concerning the receptionist's courtesy, whereas the Malay participants were concerned with convenience attributes. All the ethnic groups favoured maintaining the existing major attributes of technical competency, interpersonal relationships and facilities. This study demonstrated priority differences between the ethnic groups' perceptions of the quality of dental services, with ethnic Chinese showing the highest gap (a measure of dissatisfaction) between importance and performance compared with ethnic Malays, followed by ethnic Indians. The patients from the three major ethnic groups of Malaysia were generally well satisfied. Perhaps more priority should be placed on improving the interpersonal-relationship attribute, especially with the receptionists.
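The Kruskal-Wallis test used here compares rank sums across the three groups. A self-contained sketch of the H statistic (mid-ranks for ties, no tie correction; the ratings passed in would be hypothetical, not the study's data):

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples
    (mid-ranks for ties, no tie correction)."""
    data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(data)
    rank_of = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and data[j + 1][0] == data[i][0]:
            j += 1                            # extend the run of tied values
        mid_rank = (i + j) / 2 + 1            # average 1-based rank of the run
        for k in range(i, j + 1):
            rank_of[k] = mid_rank
        i = j + 1
    rank_sums = [0.0] * len(groups)
    counts = [0] * len(groups)
    for (_, gi), r in zip(data, rank_of):
        rank_sums[gi] += r
        counts[gi] += 1
    h = 12.0 / (n * (n + 1)) * sum(s * s / c for s, c in zip(rank_sums, counts))
    return h - 3.0 * (n + 1)
```

Under the null hypothesis of identical distributions, H approximately follows a chi-squared distribution with k − 1 degrees of freedom, so for three ethnic groups the critical value at the 5% level has 2 degrees of freedom.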
Shigematsu, Toru; Ueno, Shigeaki; Tsuchida, Yasuharu; Hayashi, Mayumi; Okonogi, Hiroko; Masaki, Haruhiko; Fujii, Tomoyuki
2007-12-01
Bacterial counting under liquid cultivation in 96-well microplates was performed. The counts under liquid and solid cultivation were equivalent for foods, although the counts under liquid cultivation exceeded those under solid cultivation for seawater, suggesting that some bacteria in seawater were viable but did not form detectable colonies. A phylogenetic analysis of the bacteria obtained under liquid cultivation was also performed.
NASA Astrophysics Data System (ADS)
Hilico, L.; Felder, R.; Touahri, D.; Acef, O.; Clairon, A.; Biraben, F.
1998-11-01
We have built three optical frequency standards based on the two-photon transition of rubidium at 778 nm and analysed their performance over a period of more than three years. We discuss some systematic effects that could lead to the reproducibility we observe, and point out possible improvements to the devices. We also examine the short- and long-term stabilities of the systems, and show that we have reached their ultimate performance.
Solving constant-coefficient differential equations with dielectric metamaterials
NASA Astrophysics Data System (ADS)
Zhang, Weixuan; Qu, Che; Zhang, Xiangdong
2016-07-01
Recently, the concept of metamaterial analog computing has been proposed (Silva et al 2014 Science 343 160-3), in which mathematical operations such as spatial differentiation, integration, and convolution are performed by designed metamaterial blocks. Motivated by this work, we propose a practical approach based on dielectric metamaterials to solve differential equations. An ordinary differential equation can be solved accurately by a correctly designed metamaterial system. Numerical simulations using well-established routines have been performed and successfully verify all theoretical analyses.
Class-specific Error Bounds for Ensemble Classifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prenger, R; Lemmond, T; Varshney, K
2009-10-06
The generalization error, or probability of misclassification, of ensemble classifiers has been shown to be bounded above by a function of the mean correlation between the constituent (i.e., base) classifiers and their average strength. This bound suggests that increasing the strength and/or decreasing the correlation of an ensemble's base classifiers may yield improved performance under the assumption of equal error costs. However, this and other existing bounds do not directly address application spaces in which error costs are inherently unequal. For applications involving binary classification, Receiver Operating Characteristic (ROC) curves, performance curves that explicitly trade off false alarms and missed detections, are often utilized to support decision making. To address performance optimization in this context, we have developed a lower bound for the entire ROC curve that can be expressed in terms of the class-specific strength and correlation of the base classifiers. We present empirical analyses demonstrating the efficacy of these bounds in predicting relative classifier performance. In addition, we specify performance regions of the ROC curve that are naturally delineated by the class-specific strengths of the base classifiers and show that each of these regions can be associated with a unique set of guidelines for performance optimization of binary classifiers within unequal error cost regimes.
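The empirical ROC curve that such class-specific bounds constrain can be computed directly from an ensemble's vote fractions. A hypothetical sketch (the function name and the vote-fraction scoring are assumptions, not the authors' code):

```python
def roc_points(scores, labels):
    """Empirical ROC curve from ensemble vote fractions.

    scores: fraction of base classifiers voting 'positive' per sample;
    labels: true classes (1 = positive, 0 = negative).
    Returns (FPR, TPR) pairs obtained by sweeping the vote threshold
    from high to low -- the curve a class-specific strength/correlation
    bound would constrain from below.
    """
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y == 1:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts
```

Under unequal error costs, the operating point is chosen along this curve rather than at a single accuracy-optimal threshold, which is why a bound on the whole curve is more useful here than a bound on overall misclassification probability.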
Gianoncelli, Alessandra; Bonini, Sara A; Bertuzzi, Michela; Guarienti, Michela; Vezzoli, Sara; Kumar, Rajesh; Delbarba, Andrea; Mastinu, Andrea; Sigala, Sandra; Spano, Pierfranco; Pani, Luca; Pecorelli, Sergio; Memo, Maurizio
2015-08-01
Authorization to market a biosimilar product by the appropriate institutions is expected based on biosimilarity with its originator product. The analogy between the originator and its biosimilar(s) is assessed through safety, purity, and potency analyses. In this study, we proposed a useful quality control system for rapid and economic primary screening of potential biosimilar drugs. For this purpose, chemical and functional characterization of the originator rhEPO alfa and two of its biosimilars was discussed. Qualitative and quantitative analyses of the originator rhEPO alfa and its biosimilars were performed using reversed-phase high-performance liquid chromatography (RP-HPLC). The identification of proteins and the separation of isoforms were studied using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) and two-dimensional gel electrophoresis (2D-PAGE), respectively. Furthermore, the biological activity of these drugs was measured both in vitro, evaluating the TF-1 cell proliferation rate, and in vivo, using the innovative experimental animal model of the zebrafish embryos. Chemical analyses showed that the quantitative concentrations of rhEPO alfa were in agreement with the labeled claims by the corresponding manufacturers. The qualitative analyses performed demonstrated that the three drugs were pure and that they had the same amino acid sequence. Chemical differences were found only at the level of isoforms containing N-glycosylation; however, functional in vitro and in vivo studies did not show any significant differences from a biosimilar point of view. These rapid and economic structural and functional analyses were effective in the evaluation of the biosimilarity between the originator rhEPO alfa and the biosimilars analyzed.
A Systematic Review of the Cost-Effectiveness of Biologics for Ulcerative Colitis.
Stawowczyk, Ewa; Kawalec, Paweł
2018-04-01
Ulcerative colitis (UC) is a chronic autoimmune inflammation of the colon. The condition significantly decreases quality of life and generates a substantial economic burden for healthcare payers, patients and the society in which they live. Some patients require chronic pharmacotherapy, and access to novel biologic drugs might be crucial for long-term remission. The analyses of cost-effectiveness for biologic drugs are necessary to assess their efficiency and provide the best available drugs to patients. Our aim was to collect and assess the quality of economic analyses carried out for biologic agents used in the treatment of UC, as well as to summarize evidence on the drivers of cost-effectiveness and evaluate the transferability and generalizability of conclusions. A systematic database review was conducted using MEDLINE (via PubMed), EMBASE, Cost-Effectiveness Analysis Registry and CRD0. Both authors independently reviewed the identified articles to determine their eligibility for final review. Hand searching of references in collected papers was also performed to find any relevant articles. The reporting quality of economic analyses included was evaluated by two reviewers using the International Society of Pharmacoeconomics and Outcomes Research (ISPOR) Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement checklist. We reviewed the sensitivity analyses in cost-effectiveness analyses to identify the variables that may have changed the conclusions of the study. Key drivers of cost-effectiveness were selected by identifying uncertain parameters that caused the highest change of the results of the analyses compared with base-case results. Of the 576 identified records, 87 were excluded as duplicates and 16 studies were included in the final review; evaluations for Canada, the UK and Poland were mostly performed. 
Most of the evaluations identified were performed for infliximab (approximately 75% of the total); however, assessments were also performed for adalimumab (50%) and golimumab (31%). Only three analyses were conducted for vedolizumab, whereas no relevant studies were found for etrolizumab or tofacitinib. The reporting quality of the included economic analyses was assessed as high, with an average score of 21 of a maximum possible 24 points (range 14-23 points on the ISPOR CHEERS statement checklist). In most analyses, quality-adjusted life-years were used as the clinical outcome; endpoints such as remission, response and mucosal healing were less common. The included analyses indicated higher clinical effectiveness (based on response rates) of biological over non-biological treatments. The incremental cost-utility ratios for biologics compared with standard care varied significantly between the studies, ranging from US$36,309 to US$456,979. The lowest value was obtained for infliximab and the highest for a treatment scheme including infliximab 5 mg/kg and infliximab 10 mg/kg + adalimumab. Changes in utility weights and clinical parameters had the most significant influence on the results of the analyses; variables related to surgery were the least sensitive. Overall, limited data on the cost-effectiveness of UC therapy were identified. In the majority of studies, biologics were found not to be cost-effective, which was associated with their high costs. Clinical outcomes are transferable to other countries and can be generalized; however, cost inputs are country-specific and therefore limit the transferability and generalizability of conclusions. The key drivers, the variables that showed the greatest effect on the analysis results, were utility weights and clinical parameters.
GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.
Nazarian, Alireza; Gezan, Salvador Alejandro
2016-07-01
Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven themselves efficient for partitioning the phenotypic variance of complex traits into its components, estimating individuals' genetic merits, and predicting unobserved (or yet-to-be observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the use of genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications that help them with a set of tasks, from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices can then be used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64-bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. © The American Genetic Association. 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
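The genomic relationship matrix at the heart of G-BLUP is commonly built with VanRaden's first method. A sketch of that kind of construction (not GenoMatrix's actual code) from a 0/1/2 allele-count matrix:

```python
import numpy as np

def vanraden_grm(M):
    """Genomic relationship matrix (VanRaden method 1) from an
    (individuals x markers) matrix of 0/1/2 allele counts.

    Allele frequencies are estimated from the sample itself here; a
    production tool would typically let the user supply base-population
    frequencies instead.
    """
    M = np.asarray(M, dtype=float)
    p = M.mean(axis=0) / 2.0                 # per-marker allele frequencies
    Z = M - 2.0 * p                          # centre each marker by 2p
    denom = 2.0 * np.sum(p * (1.0 - p))      # VanRaden scaling factor
    return Z @ Z.T / denom
```

The resulting matrix is symmetric, with diagonal elements near 1 + F (F the genomic inbreeding coefficient) and off-diagonals measuring realized, rather than expected, relatedness; its inverse is what enters the G-BLUP mixed-model equations.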
Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon
2014-01-01
In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced, and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the excellent structural strengthening performance of the HCGS, structural tests were conducted on reinforced concrete beams, and analyses on the flexural behaviors of test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which the precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens, strengthened by the section enlargement method, were quantitatively evaluated by PIT-based analysis. PMID:28788708
Structural assessment of a Space Station solar dynamic heat receiver thermal energy storage canister
NASA Technical Reports Server (NTRS)
Tong, M. T.; Kerslake, T. W.; Thompson, R. L.
1988-01-01
This paper assesses the structural performance of a Space Station thermal energy storage (TES) canister subject to orbital solar flux variation and engine cold start-up operating conditions. The impacts of working fluid temperature and salt-void distribution on the canister structure are assessed. Both analytical and experimental studies were conducted to determine the temperature distribution of the canister. Subsequent finite-element structural analyses of the canister were performed using both analytically and experimentally obtained temperatures. The Arrhenius creep law was incorporated into the procedure, using secondary creep data for the canister material, Haynes-188 alloy. The predicted cyclic creep strain accumulations at the hot spot were used to assess the structural performance of the canister. In addition, the structural performance of the canister based on the analytically determined temperature was compared with that based on the experimentally measured temperature data.
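The Arrhenius creep law referenced in the abstract can be sketched numerically. A minimal sketch assuming a Norton-Arrhenius form for secondary (steady-state) creep; the constants A, n and Q below are illustrative placeholders, not Haynes-188 material data:

```python
import math

def secondary_creep_rate(stress_mpa, temp_k, A=1.0e-10, n=5.0, Q=300e3, R=8.314):
    """Steady-state (secondary) creep rate per a Norton-Arrhenius law:
    eps_dot = A * sigma^n * exp(-Q / (R*T)).
    A, n, Q are illustrative placeholders, not fitted Haynes-188 data."""
    return A * stress_mpa**n * math.exp(-Q / (R * temp_k))

def accumulated_creep_strain(stress_mpa, temp_k, hours):
    # At constant stress and temperature, secondary creep strain
    # accumulates linearly in time.
    return secondary_creep_rate(stress_mpa, temp_k) * hours * 3600.0
```

The linear-in-time accumulation at constant conditions is what makes cyclic creep strain estimates, like those used here to assess the canister hot spot, tractable from a per-cycle temperature and stress history.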
Structural assessment of a space station solar dynamic heat receiver thermal energy storage canister
NASA Technical Reports Server (NTRS)
Thompson, R. L.; Kerslake, T. W.; Tong, M. T.
1988-01-01
The structural performance of a space station thermal energy storage (TES) canister subject to orbital solar flux variation and engine cold start-up operating conditions was assessed. The impacts of working fluid temperature and salt-void distribution on the canister structure are assessed. Both analytical and experimental studies were conducted to determine the temperature distribution of the canister. Subsequent finite element structural analyses of the canister were performed using both analytically and experimentally obtained temperatures. The Arrhenius creep law was incorporated into the procedure, using secondary creep data for the canister material, Haynes 188 alloy. The predicted cyclic creep strain accumulations at the hot spot were used to assess the structural performance of the canister. In addition, the structural performance of the canister based on the analytically determined temperature was compared with that based on the experimentally measured temperature data.
Study on bamboo gluing performance numerical simulation
NASA Astrophysics Data System (ADS)
Zhao, Z. R.; Sun, W. H.; Sui, X. M.; Zhang, X. F.
2018-01-01
Bamboo glued timber is a green building material that can be widely used for beams and columns in modern buildings. Existing bamboo glued timber is usually produced from bamboo columns or from bamboo bundles rolled from bamboo columns. The performance of new bamboo glued timber is determined by the bamboo adhesion characteristics. On this basis, a cohesive damage model of bamboo gluing was created, and experimental results were used to validate the model. The model proposed in this work agrees with the experimental results. The relationship between bamboo bonding length and bamboo gluing performance is analysed. The model is helpful for bamboo integrated timber applications.
Assessment of various supervised learning algorithms using different performance metrics
NASA Astrophysics Data System (ADS)
Susheel Kumar, S. M.; Laxkar, Deepak; Adhikari, Sourav; Vijayarajan, V.
2017-11-01
Our work presents a comparison of the performance of supervised machine learning algorithms on a binary classification task. The supervised machine learning algorithms taken into consideration are Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbour (KNN), Naïve Bayes (NB) and Random Forest (RF). This paper focuses on comparing the performance of the above algorithms on one binary classification task by analysing metrics such as accuracy, F-measure, G-measure, precision, misclassification rate, false positive rate, true positive rate, specificity, and prevalence.
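All of the metrics listed above can be derived from the four confusion-matrix counts. A minimal sketch (not from the paper); note that the G-measure has more than one definition in the literature, and the geometric mean of precision and recall is assumed here:

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Common binary-classification metrics from confusion-matrix counts."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                # true positive rate (sensitivity)
    specificity = tn / (tn + fp)
    fpr = fp / (fp + tn)                   # false positive rate = 1 - specificity
    f_measure = 2 * precision * recall / (precision + recall)
    g_measure = math.sqrt(precision * recall)  # one common definition
    return {
        "accuracy": accuracy,
        "precision": precision,
        "true_positive_rate": recall,
        "specificity": specificity,
        "false_positive_rate": fpr,
        "f_measure": f_measure,
        "g_measure": g_measure,
        "misclassification_rate": 1 - accuracy,
        "prevalence": (tp + fn) / total,
    }
```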
NASA Technical Reports Server (NTRS)
Latham, T. S.; Rodgers, R. J.
1972-01-01
Analytical studies were continued to identify the design and performance characteristics of a small-scale model of a nuclear light bulb unit cell suitable for testing in a nuclear furnace reactor. Emphasis was placed on calculating performance characteristics based on detailed radiant heat transfer analyses, on designing the test assembly for ease of insertion, connection, and withdrawal at the reactor test cell, and on determining instrumentation and test effluent handling requirements. In addition, a review of candidate test reactors for future nuclear light bulb in-reactor tests was conducted.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is described. The equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses can be performed. At high speeds, centrifugal lift effects are taken into account. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
NASA Astrophysics Data System (ADS)
Kalyuzhnyi, O.; Ilnytskyi, J. M.; Holovatch, Yu; von Ferber, C.
2018-05-01
In this paper we study the shape characteristics of star-like polymers under various solvent qualities using a mesoscopic level of modeling. Dissipative particle dynamics simulations are performed for a homogeneous and four different heterogeneous star polymers with the same molecular weight. We analyse the gyration radius and asphericity in the poor, good and θ-solvent regimes. A detailed explanation based on the interplay between enthalpic and entropic contributions to the free energy, together with analyses of the asphericity of the individual branches, is provided to explain the increase of the asphericity in the θ-solvent regime.
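Both shape descriptors analysed above are obtained from the eigenvalues of the gyration tensor. A minimal sketch for a single configuration of monomer coordinates (the dissipative particle dynamics simulation itself is not reproduced here):

```python
import numpy as np

def gyration_tensor(coords):
    """Gyration tensor S of a set of monomer coordinates (N x 3 array)."""
    r = coords - coords.mean(axis=0)
    return r.T @ r / len(coords)

def shape_descriptors(coords):
    """Squared gyration radius and asphericity from eigenvalues l1 >= l2 >= l3:
    Rg^2 = l1 + l2 + l3;  b = l1 - (l2 + l3) / 2  (b = 0 for a spherically
    symmetric distribution, b > 0 for elongated ones)."""
    l = np.sort(np.linalg.eigvalsh(gyration_tensor(coords)))[::-1]
    rg2 = l.sum()
    asphericity = l[0] - 0.5 * (l[1] + l[2])
    return rg2, asphericity
```

In practice these quantities are averaged over many simulation snapshots; the sketch shows only the per-configuration computation.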
Internal quantum efficiency and tunable colour temperature in monolithic white InGaN/GaN LED
NASA Astrophysics Data System (ADS)
Titkov, Ilya E.; Yadav, Amit; Zerova, Vera L.; Zulonas, Modestas; Tsatsulnikov, Andrey F.; Lundin, Wsevolod V.; Sakharov, Alexey V.; Rafailov, Edik U.
2014-03-01
The internal quantum efficiency (IQE) of a two-colour monolithic white light-emitting diode (LED) was measured by temperature-dependent electroluminescence (TDEL) and analysed with a modified rate equation based on the ABC model. The external, internal and injection efficiencies of the blue and green quantum wells were analysed separately. The monolithic white LED contained one green InGaN QW and two blue QWs separated by a GaN barrier. This paper also reports the tunable behaviour of the correlated colour temperature (CCT) in pulsed operation mode and the effect of self-heating on device performance.
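In the ABC model referred to above, the IQE is the radiative fraction of total recombination as a function of carrier density n. A minimal sketch with illustrative coefficient values (A: Shockley-Read-Hall, B: radiative, C: Auger), not the values fitted in this work:

```python
def iqe_abc(n, A=1e7, B=2e-11, C=1e-30):
    """Internal quantum efficiency from the ABC recombination model:
    IQE = B n^2 / (A n + B n^2 + C n^3).
    Coefficient values are illustrative placeholders for a nitride LED."""
    return B * n**2 / (A * n + B * n**2 + C * n**3)
```

The model reproduces efficiency droop: IQE peaks at n = sqrt(A/C) and falls off at both lower densities (SRH-dominated) and higher densities (Auger-dominated).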
Modeling and Simulations for the High Flux Isotope Reactor Cycle 400
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilas, Germina; Chandler, David; Ade, Brian J
2015-03-01
A concerted effort over the past few years has been focused on enhancing the core model for the High Flux Isotope Reactor (HFIR), as part of a comprehensive study for HFIR conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel. At this time, the core model used to perform analyses in support of HFIR operation is an MCNP model for the beginning of Cycle 400, which was documented in detail in a 2005 technical report. A HFIR core depletion model based on current state-of-the-art methods and nuclear data was needed to serve as a reference for the design of an LEU fuel for HFIR. The recent enhancements in modeling and simulations for HFIR discussed in the present report include: (1) revision of the 2005 MCNP model for the beginning of Cycle 400 to improve the modeling data and assumptions as necessary, based on appropriate primary reference sources (HFIR drawings and reports); (2) improvement of the fuel region model, including an explicit representation of the involute fuel plate geometry that is characteristic of HFIR fuel; and (3) revision of the Monte Carlo-based depletion model for HFIR, in use since 2009 but never documented in detail, with the development of a new depletion model for the HFIR explicit fuel plate representation. The new HFIR models for Cycle 400 are used to determine various metrics of relevance to reactor performance and safety assessments. The calculated metrics are compared, where possible, with measurement data from preconstruction critical experiments at HFIR, data included in the current HFIR safety analysis report, and/or data from previous calculations performed with different methods or codes. The results of the analyses show that the models presented in this report provide a robust and reliable basis for HFIR analyses.
Yoshida, Catherine E; Kruczkiewicz, Peter; Laing, Chad R; Lingohr, Erika J; Gannon, Victor P J; Nash, John H E; Taboada, Eduardo N
2016-01-01
For nearly 100 years serotyping has been the gold standard for the identification of Salmonella serovars. Despite the increasing adoption of DNA-based subtyping approaches, serotype information remains a cornerstone in food safety and public health activities aimed at reducing the burden of salmonellosis. At the same time, recent advances in whole-genome sequencing (WGS) promise to revolutionize our ability to perform advanced pathogen characterization in support of improved source attribution and outbreak analysis. We present the Salmonella In Silico Typing Resource (SISTR), a bioinformatics platform for rapidly performing simultaneous in silico analyses for several leading subtyping methods on draft Salmonella genome assemblies. In addition to performing serovar prediction by genoserotyping, this resource integrates sequence-based typing analyses for Multi-Locus Sequence Typing (MLST), ribosomal MLST (rMLST), and core genome MLST (cgMLST). We show how phylogenetic context from cgMLST analysis can supplement the genoserotyping analysis and increase the accuracy of in silico serovar prediction to over 94.6% on a dataset comprised of 4,188 finished genomes and WGS draft assemblies. In addition to allowing analysis of user-uploaded whole-genome assemblies, the SISTR platform incorporates a database comprising over 4,000 publicly available genomes, allowing users to place their isolates in a broader phylogenetic and epidemiological context. The resource incorporates several metadata-driven visualizations to examine the phylogenetic, geospatial and temporal distribution of genome-sequenced isolates. As sequencing of Salmonella isolates at public health laboratories around the world becomes increasingly common, rapid in silico analysis of minimally processed draft genome assemblies provides a powerful approach for molecular epidemiology in support of public health investigations. Moreover, this type of integrated analysis using multiple sequence-based subtyping methods allows for continuity with historical serotyping data as we transition towards the increasing adoption of genomic analyses in epidemiology. The SISTR platform is freely available on the web at https://lfz.corefacility.ca/sistr-app/.
An AUC-based permutation variable importance measure for random forests
2013-01-01
Background: The random forest (RF) method is a commonly used tool for classification with high-dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in the case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions have been made to obtain better classification performance based either on sampling procedures or on cost-sensitivity analyses. However, to our knowledge, the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. Results: We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. Conclusions: The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html. PMID:23560875
An AUC-based permutation variable importance measure for random forests.
Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure
2013-04-05
The random forest (RF) method is a commonly used tool for classification with high-dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in the case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions have been made to obtain better classification performance based either on sampling procedures or on cost-sensitivity analyses. However, to our knowledge, the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
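The core idea of an AUC-based permutation VIM can be illustrated outside the random forest machinery: permute one predictor's column and record the resulting drop in AUC. A simplified sketch, not the party package implementation; `predict` stands for any fitted scoring function:

```python
import random

def auc(y_true, scores):
    """AUC via its rank-sum (probability) formulation: the probability that
    a randomly chosen positive is scored above a randomly chosen negative."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def auc_permutation_vim(predict, X, y, var_idx, n_repeats=10, seed=0):
    """AUC-based permutation importance of one predictor: the mean drop
    in AUC after randomly permuting that predictor's column."""
    rng = random.Random(seed)
    base = auc(y, [predict(row) for row in X])
    drops = []
    for _ in range(n_repeats):
        col = [row[var_idx] for row in X]
        rng.shuffle(col)
        Xp = [row[:var_idx] + [v] + row[var_idx + 1:] for row, v in zip(X, col)]
        drops.append(base - auc(y, [predict(row) for row in Xp]))
    return sum(drops) / n_repeats
```

Replacing the AUC with accuracy in this scheme recovers the standard permutation VIM; the AUC variant is less sensitive to class imbalance because the AUC does not depend on the class prevalence.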
Seismic performance for vertical geometric irregularity frame structures
NASA Astrophysics Data System (ADS)
Ismail, R.; Mahmud, N. A.; Ishak, I. S.
2018-04-01
This research highlights the results for vertically geometrically irregular frame structures. The finite element analysis software LUSAS was used to analyse seismic performance, focusing in particular on an irregular frame type with differences in floor heights continued in the middle of the building. Malaysia’s building structures have been affected when earthquakes occurred in neighbouring countries such as Indonesia (Sumatra). In Malaysia, concrete is widely used in building construction despite its limited tension resistance. Structural behaviour under horizontal and vertical static loads is commonly analysed using plane frame analysis. The case study of this research determines the stress and displacement in the seismic response of this type of irregular frame structure. The study is based on the seven-storey Clinical Training Centre building located in Sungai Buloh, Selayang, Selangor. Data recorded from the large earthquake that occurred in Aceh, Indonesia on December 26, 2004 were used in conducting this research. The stress and displacement results from the IMPlus seismic analysis in LUSAS Modeller, for the seismic response of the formwork frame system, indicate that the building can safely withstand the ground motion and remains in good condition under the considered seismic performance variations.
NASA Astrophysics Data System (ADS)
Santini, M.; Guilizzoni, M.; Fest-Santini, S.; Lorenzi, M.
2017-11-01
Highly hydrophobic surfaces have been intensively investigated in recent years because their properties may lead to very promising technological spillovers encompassing both everyday use and high-tech fields. Focusing on textiles, hydrophobic fabrics are of major interest for applications ranging from clothing to architecture to environmental protection and energy conversion. Gas diffusion media for fuel cells - made of a gas diffusion layer (GDL) and a microporous layer (MPL) - are a good benchmark for developing techniques aimed at characterizing the wetting performance of engineered textiles. An experimental investigation was carried out on carbon-based, PTFE-treated GDLs with and without MPLs. Two samples (woven and woven-non-woven) were analysed before and after coating with an MPL. Their three-dimensional structure was reconstructed and analysed by computer-aided X-ray microtomography (µCT). Static and dynamic wettability analyses were then carried out using a modified axisymmetric drop shape analysis technique. All the surfaces exhibited very high hydrophobicity, three of them near to super-hydrophobic behavior. Water drop impacts were performed, evidencing different bouncing, sticking and fragmentation outcomes, for which critical values of the Weber number were identified. Finally, a µCT scan of a drop on a GDL was performed, confirming the Cassie-Baxter wetting state on such a surface.
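The Weber number used above to classify drop-impact outcomes compares inertial to capillary forces. A minimal sketch; the example values (a 2 mm water drop at 1 m/s) are illustrative, not taken from the paper:

```python
def weber_number(density, velocity, diameter, surface_tension):
    """Weber number We = rho * v^2 * D / sigma for an impacting drop:
    the ratio of inertia to surface tension. Higher We favours splashing
    and fragmentation over deposition or rebound."""
    return density * velocity**2 * diameter / surface_tension

# Illustrative example: a 2 mm water drop (rho = 998 kg/m^3,
# sigma = 0.0728 N/m) impacting at 1 m/s.
we = weber_number(998.0, 1.0, 2e-3, 0.0728)
```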
Hospital level analysis to improve patient flow.
Khanna, Sankalp; Boyle, Justin; Good, Norm; Bugden, Simon; Scott, Mark
2013-01-01
The complexity of hospital operations ensures that one-size-fits-all solutions seldom work. As hospitals turn to evidence based strategies to redesign flow, it is critical that they tailor the strategies to suit their individual service. This paper analyses the effect of hospital occupancy on inpatient and emergency department patient flow parameters at the Caboolture hospital in Queensland, Australia, and identifies critical levels, or choke points, that result in performance decline. The effect of weekdays and weekends on patient flow is also investigated. We compare these findings to a previous study that has analysed patient flow across Queensland hospitals grouped by size, and discover several differences in the interaction between rising occupancy and patient flow parameters including rates of patient flow, length of stay, and access block. We also identify significantly higher choke points for Caboolture hospital as compared to other similarly sized Queensland hospitals, which suggest that patient flow here can be redesigned to operate at higher levels of occupancy without degrading flow performance. The findings support arguments for hospitals to analyse patient flow at a service level to deliver optimum service improvement.
Butinar, Bojan; Bucar-Miklavcic, Milena; Valencic, Vasilij; Raspor, Peter
2010-05-12
In Slovenia two superb vegetable oils with high added nutritional value are produced: "Ekstra devisko oljcno olje Slovenske Istre (extra virgin olive oil from Slovene Istra)" and "Stajersko prekmursko bucno olje (pumpkin seed oil from Slovenia)". Their quality and genuineness must be monitored, as adulteration can easily be undertaken. Experience with determining olive oil genuineness shows how analyses following an experience-based, data-driven decision tree that gathers several chemical determinations (fatty acids, (E)-isomers of fatty acids, sterol and tocopherol determinations) can help in assessing the genuineness of pumpkin seed oil from Slovenia. In the present work, a set of HPLC triacylglycerol determinations based on the nine main triacylglycerols (LLLn, LLL, PLL, LOO, PLO, OOO, POO, SPL, and SLS) was performed on a limited number of different pumpkin seed oils from northeastern Slovenia. The determinations showed that stereospecific analyses of triacylglycerols, together with other chemical determinations, can be useful in building a protocol for evaluating the genuineness of pumpkin seed oil from Slovenia.