ERIC Educational Resources Information Center
Li, Ying; Jiao, Hong; Lissitz, Robert W.
2012-01-01
This study investigated the application of multidimensional item response theory (IRT) models to validate test structure and dimensionality. Multiple content areas or domains within a single subject often exist in large-scale achievement tests. Such areas or domains may cause multidimensionality or local item dependence, which both violate the…
Modeling Booklet Effects for Nonequivalent Group Designs in Large-Scale Assessment
ERIC Educational Resources Information Center
Hecht, Martin; Weirich, Sebastian; Siegle, Thilo; Frey, Andreas
2015-01-01
Multiple matrix designs are commonly used in large-scale assessments to distribute test items to students. These designs comprise several booklets, each containing a subset of the complete item pool. Besides reducing the test burden of individual students, using various booklets allows aligning the difficulty of the presented items to the assumed…
ERIC Educational Resources Information Center
Frey, Andreas; Hartig, Johannes; Rupp, Andre A.
2009-01-01
In most large-scale assessments of student achievement, several broad content domains are tested. Because more items are needed to cover the content domains than can be presented in the limited testing time to each individual student, multiple test forms or booklets are utilized to distribute the items to the students. The construction of an…
Equating in Small-Scale Language Testing Programs
ERIC Educational Resources Information Center
LaFlair, Geoffrey T.; Isbell, Daniel; May, L. D. Nicolas; Gutierrez Arvizu, Maria Nelly; Jamieson, Joan
2017-01-01
Language programs need multiple test forms for secure administrations and effective placement decisions, but can they have confidence that scores on alternate test forms have the same meaning? In large-scale testing programs, various equating methods are available to ensure the comparability of forms. The choice of equating method is informed by…
Shaw, Emily E; Schultz, Aaron P; Sperling, Reisa A; Hedden, Trey
2015-10-01
Intrinsic functional connectivity MRI has become a widely used tool for measuring integrity in large-scale cortical networks. This study examined multiple cortical networks using Template-Based Rotation (TBR), a method that applies a priori network and nuisance component templates defined from an independent dataset to test datasets of interest. A priori templates were applied to a test dataset of 276 older adults (ages 65-90) from the Harvard Aging Brain Study to examine the relationship between multiple large-scale cortical networks and cognition. Factor scores derived from neuropsychological tests represented processing speed, executive function, and episodic memory. Resting-state BOLD data were acquired in two 6-min acquisitions on a 3-Tesla scanner and processed with TBR to extract individual-level metrics of network connectivity in multiple cortical networks. All results controlled for data quality metrics, including motion. Connectivity in multiple large-scale cortical networks was positively related to all cognitive domains, with a composite measure of general connectivity positively associated with general cognitive performance. Controlling for the correlations between networks, the frontoparietal control network (FPCN) and executive function demonstrated the only significant association, suggesting specificity in this relationship. Further analyses found that the FPCN mediated the relationships of the other networks with cognition, suggesting that this network may play a central role in understanding individual variation in cognition during aging.
Impact of Accumulated Error on Item Response Theory Pre-Equating with Mixed Format Tests
ERIC Educational Resources Information Center
Keller, Lisa A.; Keller, Robert; Cook, Robert J.; Colvin, Kimberly F.
2016-01-01
The equating of tests is an essential process in high-stakes, large-scale testing conducted over multiple forms or administrations. By adjusting for differences in difficulty and placing scores from different administrations of a test on a common scale, equating allows scores from these different forms and administrations to be directly compared…
ERIC Educational Resources Information Center
Andrich, David; Marais, Ida; Humphry, Stephen Mark
2016-01-01
Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
Integral criteria for large-scale multiple fingerprint solutions
NASA Astrophysics Data System (ADS)
Ushmaev, Oleg S.; Novikov, Sergey O.
2004-08-01
We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. Then we carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. The explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of real multiple fingerprint tests show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
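The FAR-constrained threshold selection described in this abstract can be sketched numerically. The sketch below is illustrative only: Gaussian score distributions and a simple sum rule for combining two fingerprint scores are assumptions, not the paper's model.

```python
# Hypothetical sketch: choose the decision threshold on an integral (summed)
# similarity score so that the empirical FAR stays at a target level, then
# report the resulting FRR. Score distributions here are invented.
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-finger scores for two combined fingerprint tests:
# genuine matches score higher on average than impostor matches.
genuine = rng.normal(2.0, 1.0, size=(5000, 2)).sum(axis=1)
impostor = rng.normal(0.0, 1.0, size=(5000, 2)).sum(axis=1)

def threshold_for_far(impostor_scores, far_target):
    """Threshold whose empirical FAR does not exceed far_target."""
    return np.quantile(impostor_scores, 1.0 - far_target)

far_target = 0.01
thr = threshold_for_far(impostor, far_target)
far = np.mean(impostor >= thr)   # achieved false acceptance rate
frr = np.mean(genuine < thr)     # resulting false rejection rate
```

The "optimal integral score" of the paper plays the role of `thr` here: for a predefined FAR, it is the rule minimizing FRR.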
Does the Position of Response Options in Multiple-Choice Tests Matter?
ERIC Educational Resources Information Center
Hohensinn, Christine; Baghaei, Purya
2017-01-01
In large-scale multiple-choice (MC) tests, alternate forms of a test may be developed to prevent cheating by changing the order of items or by changing the position of the response options. The assumption is that since the content of the test forms is the same, the order of items or the positions of the response options do not have any effect on…
Modal Testing of the NPSAT1 Engineering Development Unit
2012-07-01
I declare that this Master's thesis was prepared by me independently, using only the cited sources and aids...logarithmic scale. As Figure 2 shows, natural frequencies are indicated by large values of the first CMIF (peaks), and multiple modes can be detected by...structure's behavior. Ewins even states, "that no large-scale modal test should be permitted to proceed until some preliminary SDOF analyses have
Measures of Agreement Between Many Raters for Ordinal Classifications
Nelson, Kerrie P.; Edwards, Don
2015-01-01
Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
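As a toy illustration of the model-based idea (not the paper's estimator), one can simulate ordinal ratings from a latent-trait model with rater effects and summarize agreement as the chance that two randomly chosen raters assign the same category. The distributions and cutpoints below are invented for illustration.

```python
# Illustrative sketch: ordinal ratings generated from a latent variable
# (subject severity + rater shift + noise) discretized at fixed cutpoints,
# summarized by the mean pairwise exact-agreement probability.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_raters = 200, 10
cutpoints = np.array([-1.0, 0.0, 1.0])      # 4 ordered categories

subject_severity = rng.normal(0, 1, n_subjects)
rater_shift = rng.normal(0, 0.3, n_raters)  # between-rater variability

# latent score = subject effect + rater effect + noise, then discretize
latent = (subject_severity[:, None] + rater_shift[None, :]
          + rng.normal(0, 0.5, (n_subjects, n_raters)))
ratings = np.searchsorted(cutpoints, latent)  # categories 0..3

def pairwise_agreement(r):
    """Mean proportion of rater pairs agreeing exactly, averaged over subjects."""
    n = r.shape[1]
    agree = sum((r[:, i] == r[:, j]).mean()
                for i in range(n) for j in range(i + 1, n))
    return agree / (n * (n - 1) / 2)

agreement = pairwise_agreement(ratings)
```

Unlike Cohen's kappa, a measure built directly on the generating model does not depend on marginal category prevalences, which is the flaw the paper targets.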
NASA Technical Reports Server (NTRS)
Swanson, Gregory T.; Cassell, Alan M.
2011-01-01
Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with the sub-orbital flight of a 3.0 m diameter vehicle. To further this technology, large-scale HIADs (6.0 to 8.5 m) must be developed and tested. To characterize the performance of large-scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts that are under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.
Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.
Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D
2015-05-08
A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
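A full MEMD implementation is beyond a short sketch, so the toy fusion below substitutes a crude smooth/detail split for the aligned multivariate IMFs. Only the fusion rule (strongest detail per pixel, averaged coarse layers) mirrors the multi-scale idea; everything else is a stand-in.

```python
# Minimal multi-scale fusion sketch (NOT MEMD): a box-filter smooth/detail
# decomposition stands in for the aligned IMFs of the two input images.
import numpy as np

def smooth(img, k=5):
    """Crude box filter via 2D cumulative sums (stand-in for a proper kernel)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)

def fuse(img_a, img_b):
    base_a, base_b = smooth(img_a), smooth(img_b)
    det_a, det_b = img_a - base_a, img_b - base_b
    # keep the stronger detail at each pixel, average the coarse layers
    detail = np.where(np.abs(det_a) >= np.abs(det_b), det_a, det_b)
    return (base_a + base_b) / 2.0 + detail

rng = np.random.default_rng(2)
a = rng.random((32, 32))
b = rng.random((32, 32))
fused = fuse(a, b)
```

A sanity property of this rule is that fusing an image with itself returns the image unchanged.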
Multi-Objective Parallel Test-Sheet Composition Using Enhanced Particle Swarm Optimization
ERIC Educational Resources Information Center
Ho, Tsu-Feng; Yin, Peng-Yeng; Hwang, Gwo-Jen; Shyu, Shyong Jian; Yean, Ya-Nan
2009-01-01
For large-scale tests, such as certification tests or entrance examinations, the composed test sheets must meet multiple assessment criteria. Furthermore, to fairly compare the knowledge levels of the persons who receive tests at different times owing to the insufficiency of available examination halls or the occurrence of certain unexpected…
False Discovery Control in Large-Scale Spatial Multiple Testing
Sun, Wenguang; Reich, Brian J.; Cai, T. Tony; Guindani, Michele; Schwartzman, Armin
2014-01-01
This article develops a unified theoretical and computational framework for false discovery control in multiple testing of spatial signals. We consider both point-wise and cluster-wise spatial analyses, and derive oracle procedures which optimally control the false discovery rate, false discovery exceedance and false cluster rate, respectively. A data-driven finite approximation strategy is developed to mimic the oracle procedures on a continuous spatial domain. Our multiple testing procedures are asymptotically valid and can be effectively implemented using Bayesian computational algorithms for analysis of large spatial data sets. Numerical results show that the proposed procedures lead to more accurate error control and better power performance than conventional methods. We demonstrate our methods by analyzing the time trends in tropospheric ozone in the eastern US. PMID:25642138
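For orientation, the classical Benjamini-Hochberg step-up rule, which the point-wise procedures here generalize to spatial settings, can be stated in a few lines:

```python
# Benjamini-Hochberg step-up procedure: reject the k smallest p-values,
# where k is the largest index with p_(k) <= alpha * k / m.
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean rejection mask controlling FDR at level alpha."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.9]
rejected = benjamini_hochberg(pvals, alpha=0.05)
```

On this example only the two smallest p-values are rejected; the paper's oracle procedures replace the fixed step-up thresholds with posterior quantities over the spatial domain.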
Automatic Scoring of Paper-and-Pencil Figural Responses. Research Report.
ERIC Educational Resources Information Center
Martinez, Michael E.; And Others
Large-scale testing is dominated by the multiple-choice question format. Widespread use of the format is due, in part, to the ease with which multiple-choice items can be scored automatically. This paper examines automatic scoring procedures for an alternative item type: figural response. Figural response items call for the completion or…
NASA Technical Reports Server (NTRS)
Renselaer, D. J.; Nishida, R. S.; Wilkin, C. A.
1975-01-01
The results and analyses of aerodynamic and acoustic studies conducted in small-scale noise and wind tunnel tests of upper surface blowing nozzle flap concepts are presented. Two types of nozzle flap concepts were tested: an upper surface blowing concept with a seven-slot arrangement (seven-slotted nozzle), and an upper surface blowing type with a large nozzle exit at approximately the mid-chord location in conjunction with a powered multi-slot trailing edge flap (split flow, or partially slotted, nozzle). In addition, aerodynamic tests were continued on a similar multi-slotted nozzle flap, but with 14 slots. All three types of nozzle flap concepts tested appear to be about equal in overall aerodynamic performance, with the split flow nozzle somewhat better than the other two in the landing approach mode. All nozzle flaps can be deflected to a large angle to increase drag without significant loss in lift. The nozzle flap concepts appear to be viable aerodynamic drag modulation devices for landing.
Deep convolutional neural network based antenna selection in multiple-input multiple-output system
NASA Astrophysics Data System (ADS)
Cai, Jiaxin; Li, Yan; Hu, Ying
2018-03-01
Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of the attenuation coefficients channel matrix is generated by minimizing the key performance indicator of training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of attenuation coefficients is learned on the training antenna systems. Finally, we use the adopted deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation experimental results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven based wireless antenna selection.
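The label-generation step described above (optimizing a key performance indicator over antenna subsets) can be sketched as follows. The Shannon-capacity criterion, system sizes, and SNR below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of label generation only: for each random channel matrix,
# the "label" is the transmit-antenna subset maximizing MIMO capacity,
# which a CNN would then be trained to predict from the matrix itself.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n_tx, n_rx, n_select, snr = 4, 4, 2, 10.0

def capacity(h, snr):
    """MIMO capacity log2 det(I + snr/n * H H^H) for the selected columns."""
    n = h.shape[1]
    g = np.eye(h.shape[0]) + (snr / n) * h @ h.conj().T
    return np.log2(np.linalg.det(g).real)

def best_subset(h, k, snr):
    """Exhaustive search over k-antenna subsets (feasible for small n_tx)."""
    subsets = list(itertools.combinations(range(h.shape[1]), k))
    caps = [capacity(h[:, list(s)], snr) for s in subsets]
    return subsets[int(np.argmax(caps))]

# Rayleigh-fading channel draw standing in for the attenuation coefficients
H = rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))
label = best_subset(H, n_select, snr)
```

The classifier then replaces the exhaustive search at test time, which is where the complexity saving comes from.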
Nelson, Kerrie P; Mitani, Aya A; Edwards, Don
2017-09-10
Widespread inconsistencies are commonly observed between physicians' ordinal classifications in screening test results such as mammography. These discrepancies have motivated large-scale agreement studies where many raters contribute ratings. The primary goal of these studies is to identify factors related to physicians and patients' test results, which may lead to stronger consistency between raters' classifications. While ordered categorical scales are frequently used to classify screening test results, very few statistical approaches exist to model agreement between multiple raters. Here we develop a flexible and comprehensive approach to assess the influence of rater and subject characteristics on agreement between multiple raters' ordinal classifications in large-scale agreement studies. Our approach is based upon the class of generalized linear mixed models. Novel summary model-based measures are proposed to assess agreement between all, or a subgroup of raters, such as experienced physicians. Hypothesis tests are described to formally identify factors such as physicians' level of experience that play an important role in improving consistency of ratings between raters. We demonstrate how unique characteristics of individual raters can be assessed via conditional modes generated during the modeling process. Simulation studies are presented to demonstrate the performance of the proposed methods and summary measure of agreement. The methods are applied to a large-scale mammography agreement study to investigate the effects of rater and patient characteristics on the strength of agreement between radiologists. Copyright © 2017 John Wiley & Sons, Ltd.
ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores
ERIC Educational Resources Information Center
Allalouf, Avi
2014-01-01
The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…
Relative Costs of Various Types of Assessments.
ERIC Educational Resources Information Center
Wheeler, Patricia H.
Issues of the relative costs of multiple choice tests and alternative types of assessment are explored. Before alternative assessments in large-scale or small-scale programs are used, attention must be given to cost considerations and the resources required to develop and implement the assessment. Major categories of cost to be considered are…
Robust Detection of Examinees with Aberrant Answer Changes
ERIC Educational Resources Information Center
Belov, Dmitry I.
2015-01-01
The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at testing organizations. However, AC data has an uncertainty caused by technological or human factors. Therefore, existing statistics (e.g., number of wrong-to-right ACs) used to detect examinees…
2014-01-01
Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypothesis, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries on the hypothesis set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17 β-estradiol sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. 
The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
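A heavily simplified sketch of the two-step hierarchical idea (screen hypothesis sets, then test within the selected sets) follows. The Bonferroni set-level combination and the within-set significance level used here are illustrative choices, not the paper's exact procedure.

```python
# Two-step hierarchical multiple testing sketch: (1) combine each set's
# p-values and screen sets with Benjamini-Hochberg; (2) test individual
# hypotheses only inside the selected sets, at a scaled-down level.
import numpy as np

def bh_mask(p, alpha):
    p = np.asarray(p, dtype=float)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= alpha * np.arange(1, m + 1) / m
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

def two_step(sets, alpha=0.05):
    """sets: list of per-set p-value arrays. Returns per-set rejection masks."""
    set_p = [min(len(p) * np.min(p), 1.0) for p in sets]  # Bonferroni combine
    selected = bh_mask(set_p, alpha)
    r = len(sets)
    out = []
    for sel, p in zip(selected, sets):
        if sel:
            # within-set level scaled by the proportion of selected sets
            out.append(np.asarray(p) <= alpha * selected.sum() / r / len(p))
        else:
            out.append(np.zeros(len(p), dtype=bool))
    return out

sets = [np.array([0.0004, 0.2, 0.7]),
        np.array([0.3, 0.6, 0.9]),
        np.array([0.001, 0.004, 0.8])]
decisions = two_step(sets)
```

In the microarray setting, each "set" would be the hypotheses for one gene across ordered categories, and directional decisions would be layered on top to control the mdFDR.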
ERIC Educational Resources Information Center
Tay, Louis; Huang, Qiming; Vermunt, Jeroen K.
2016-01-01
In large-scale testing, the use of multigroup approaches is limited for assessing differential item functioning (DIF) across multiple variables as DIF is examined for each variable separately. In contrast, the item response theory with covariate (IRT-C) procedure can be used to examine DIF across multiple variables (covariates) simultaneously. To…
ERIC Educational Resources Information Center
Domyancich, John M.
2014-01-01
Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…
Mach Number effects on turbulent superstructures in wall bounded flows
NASA Astrophysics Data System (ADS)
Kaehler, Christian J.; Bross, Matthew; Scharnowski, Sven
2017-11-01
Planar and three-dimensional flow field measurements along a flat plate boundary layer in the Trisonic Wind Tunnel Munich (TWM) are examined with the aim to characterize the scaling, spatial organization, and topology of large-scale turbulent superstructures in compressible flow. This facility is well suited to this investigation, as the ratio of boundary layer thickness to test section spanwise extent is around 1/25, ensuring minimal sidewall and corner effects on turbulent structures in the center of the test section. A major difficulty in the experimental investigation of large-scale features is the sheer size of the superstructures, which can extend over many boundary layer thicknesses. Using multiple PIV systems, it was possible to capture the full spatial extent of large-scale structures over a range of Mach numbers from Ma = 0.3 to 3. To calculate the average large-scale structure length and spacing, the acquired vector fields were analyzed by statistical multi-point methods that show large-scale structures with a correlation length of around 10 boundary layer thicknesses over the range of Mach numbers investigated. Furthermore, the average spacing between high and low momentum structures is on the order of a boundary layer thickness. This work is supported by the Priority Programme SPP 1881 Turbulent Superstructures of the Deutsche Forschungsgemeinschaft.
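The multi-point statistics mentioned above boil down to two-point correlations of velocity fluctuations. A hedged sketch on a synthetic signal with a built-in long-wavelength mode (standing in for PIV data) illustrates how a correlation length is extracted:

```python
# Streamwise autocorrelation of a fluctuating velocity signal; the lag of
# the first zero crossing serves as a crude correlation-length estimate.
# Synthetic data: a "superstructure" mode of wavelength 10 boundary layer
# thicknesses plus small-scale noise (all values invented).
import numpy as np

rng = np.random.default_rng(4)
delta = 1.0                        # boundary layer thickness (arbitrary units)
x = np.linspace(0, 100 * delta, 4000)
u = np.sin(2 * np.pi * x / (10 * delta)) + 0.3 * rng.normal(size=x.size)

def autocorr(sig):
    """Normalized autocorrelation for lags 0..n-1."""
    s = sig - sig.mean()
    full = np.correlate(s, s, mode="full")[s.size - 1:]
    return full / full[0]

rho = autocorr(u)
dx = x[1] - x[0]
first_zero = np.argmax(rho < 0)    # index of first negative correlation
corr_length = first_zero * dx      # ~ quarter wavelength of the large mode
```

For a dominant 10-delta mode the first zero crossing falls near 2.5 delta; integral length scales or threshold crossings are common alternatives to the zero-crossing definition.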
Andrich, David; Marais, Ida; Humphry, Stephen Mark
2015-01-01
Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The consequence is that the proficiencies of the more proficient students are increased relative to those of the less proficient. Not controlling the guessing bias underestimates the progress of students across 7 years of schooling with important educational implications. PMID:29795871
Dynamical tuning for MPC using population games: A water supply network application.
Barreiro-Gomez, Julian; Ocampo-Martinez, Carlos; Quijano, Nicanor
2017-07-01
Model predictive control (MPC) is a suitable strategy for the control of large-scale systems that have multiple design requirements, e.g., multiple physical and operational constraints. Moreover, an MPC controller is able to deal with multiple control objectives by considering them within the cost function, which requires determining a proper prioritization for each objective. Furthermore, when the system has time-varying parameters and/or disturbances, the appropriate prioritization might vary over time as well. This situation leads to the need for a dynamical tuning methodology. This paper addresses the dynamical tuning issue by using evolutionary game theory. The advantages of the proposed method are highlighted and tested over a large-scale water supply network with periodic time-varying disturbances. Finally, results are analyzed with respect to a multi-objective MPC controller that uses static tuning. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
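The population-games mechanism for dynamical tuning can be illustrated with replicator dynamics on the simplex of objective weights. The fitness values below are invented stand-ins for whatever cost or constraint pressure each MPC objective experiences; the paper's actual game formulation differs in detail.

```python
# Replicator dynamics on objective weights: weights of objectives whose
# "fitness" exceeds the population average grow, so prioritization adapts
# to time-varying conditions instead of staying statically tuned.
import numpy as np

def replicator_step(weights, fitness, dt=0.1):
    """One Euler step of replicator dynamics on the weight simplex."""
    w = np.asarray(weights, dtype=float)
    f = np.asarray(fitness, dtype=float)
    avg = w @ f                     # population-average fitness
    w = w + dt * w * (f - avg)
    return w / w.sum()              # renormalize against numerical drift

w = np.array([1 / 3, 1 / 3, 1 / 3])      # equal priority for three objectives
for _ in range(200):
    fitness = np.array([0.9, 0.5, 0.1])  # objective 0 under most pressure
    w = replicator_step(w, fitness)
```

In a closed loop, `fitness` would be recomputed each sampling time from the plant state, so the weight vector tracks disturbances instead of converging once and for all.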
ERIC Educational Resources Information Center
He, Yong
2013-01-01
Common test items play an important role in equating multiple test forms under the common-item nonequivalent groups design. Inconsistent item parameter estimates among common items can lead to large bias in equated scores for IRT true score equating. Current methods extensively focus on detection and elimination of outlying common items, which…
Sources of Score Scale Inconsistency. Research Report. ETS RR-11-10
ERIC Educational Resources Information Center
Haberman, Shelby J.; Dorans, Neil J.
2011-01-01
For testing programs that administer multiple forms within a year and across years, score equating is used to ensure that scores can be used interchangeably. In an ideal world, samples sizes are large and representative of populations that hardly change over time, and very reliable alternate test forms are built with nearly identical psychometric…
ERIC Educational Resources Information Center
Wei, Youhua; Low, Albert
2017-01-01
In most large-scale programs of tests that aid in making high-stakes decisions, such as the "TOEIC"® family of products and services, it is not unusual for a significant portion of test takers to retake the test multiple times. The study reported here used multilevel growth modeling to explore the score change patterns of nearly 20,000…
NASA Astrophysics Data System (ADS)
Alexander, L.; Hupp, C. R.; Forman, R. T.
2002-12-01
Many geodisturbances occur across large spatial scales, spanning entire landscapes and creating ecological phenomena in their wake. Ecological study at large scales poses special problems: (1) large-scale studies require large-scale resources, and (2) sampling is not always feasible at the appropriate scale, and researchers rely on data collected at smaller scales to interpret patterns across broad regions. A criticism of landscape ecology is that findings at small spatial scales are "scaled up" and applied indiscriminately across larger spatial scales. In this research, landscape scaling is addressed through process-pattern relationships between hydrogeomorphic processes and patterns of plant diversity in forested wetlands. The research addresses: (1) whether patterns and relationships between hydrogeomorphic, vegetation, and spatial variables can transcend scale; and (2) whether data collected at small spatial scales can be used to describe patterns and relationships across larger spatial scales. Field measurements of hydrologic, geomorphic, spatial, and vegetation data were collected or calculated for 15 1-ha sites on forested floodplains of six (6) Chesapeake Bay Coastal Plain streams over a total area of about 20,000 km2. Hydroperiod (day/yr), floodplain surface elevation range (m), discharge (m3/s), stream power (kg-m/s2), sediment deposition (mm/yr), relative position downstream and other variables were used in multivariate analyses to explain differences in species richness, tree diversity (Shannon-Wiener Diversity Index H'), and plant community composition at four spatial scales. Data collected at the plot (400-m2) and site- (c. 1-ha) scales are applied to and tested at the river watershed and regional spatial scales. Results indicate that plant species richness and tree diversity (Shannon-Wiener diversity index H') can be described by hydrogeomorphic conditions at all scales, but are best described at the site scale. 
Data collected at plot and site scales are tested for spatial heterogeneity across the Chesapeake Bay Coastal Plain using a geostatistical variogram, and multiple regression analysis is used to relate plant diversity, spatial, and hydrogeomorphic variables across Coastal Plain regions and hydrologic regimes. Results indicate that relationships between hydrogeomorphic processes and patterns of plant diversity at finer scales can proxy relationships at coarser scales in some, not all, cases. Findings also suggest that data collected at small scales can be used to describe trends across broader scales under limited conditions.
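The geostatistical variogram referred to above can be sketched as an empirical semivariogram on synthetic site data; the site locations, trend, and noise below are invented for illustration.

```python
# Empirical semivariogram: half the mean squared difference of a variable
# between site pairs, binned by separation distance. Rising semivariance
# with distance indicates spatial heterogeneity / trend.
import numpy as np

rng = np.random.default_rng(5)
coords = rng.uniform(0, 100, size=(60, 2))       # site locations (km)
values = 0.05 * coords[:, 0] + rng.normal(0, 1, 60)  # spatial trend + noise

def empirical_variogram(coords, values, bins):
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(values), k=1)       # each site pair once
    dist = d[iu]
    sqdiff = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    idx = np.digitize(dist, bins)
    gamma = np.array([sqdiff[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(1, len(bins))])
    return gamma

bins = np.array([0, 25, 50, 75, 100, 150])
gamma = empirical_variogram(coords, values, bins)
```

A flat variogram would support pooling plot-scale relationships across the region; a rising one signals that small-scale relationships cannot simply be scaled up.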
ERIC Educational Resources Information Center
Nese, Joseph F. T.; Tindal, Gerald; Stevens, Joseph J.; Elliott, Stephen N.
2015-01-01
The stakes of large-scale testing programs have grown considerably in the past decade with the enactment of the No Child Left Behind (NCLB) and Race To The Top (RTTT) legislations. A significant component of NCLB has been required reporting of annual yearly progress (AYP) of student subgroups disaggregated by sex, special education status, English…
Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David
2013-01-01
The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers, policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.
Dislocation Multiplication by Single Cross Slip for FCC at Submicron Scales
NASA Astrophysics Data System (ADS)
Cui, Yi-Nan; Liu, Zhan-Li; Zhuang, Zhuo
2013-04-01
The operation mechanism of single cross slip multiplication (SCSM) is investigated by studying the response of one dislocation loop expanding in a face-centered-cubic (FCC) single crystal using three-dimensional discrete dislocation dynamics (3D-DDD) simulation. The results show that SCSM can trigger highly correlated dislocation generation in a short time, which may help explain the large strain bursts observed experimentally. Furthermore, we find that there is a critical stress and material size for the operation of SCSM, which agrees with the stress and size required to trigger large strain bursts in compression tests of FCC micropillars.
3D plasmonic nanoantennas integrated with MEA biosensors
NASA Astrophysics Data System (ADS)
Dipalo, Michele; Messina, Gabriele C.; Amin, Hayder; La Rocca, Rosanna; Shalabaeva, Victoria; Simi, Alessandro; Maccione, Alessandro; Zilio, Pierfrancesco; Berdondini, Luca; de Angelis, Francesco
2015-02-01
Neuronal signaling in brain circuits occurs at multiple scales ranging from molecules and cells to large neuronal assemblies. However, current sensing neurotechnologies are not designed for parallel access of signals at multiple scales. With the aim of combining nanoscale molecular sensing with electrical neural activity recordings within large neuronal assemblies, in this work three-dimensional (3D) plasmonic nanoantennas are integrated with multielectrode arrays (MEA). Nanoantennas are fabricated by fast ion beam milling on optical resist; gold is deposited on the nanoantennas in order to connect them electrically to the MEA microelectrodes and to obtain plasmonic behavior. The optical properties of these 3D nanostructures are studied through finite element method (FEM) simulations, which show a high electromagnetic field enhancement. This plasmonic enhancement is confirmed by surface-enhanced Raman spectroscopy of a dye performed in liquid, which presents an enhancement of almost 100 times the incident field amplitude at resonant excitation. Finally, the reported MEA devices are tested on cultured rat hippocampal neurons. Neurons develop by extending branches on the nanostructured electrodes, and extracellular action potentials are recorded over multiple days in vitro. Raman spectra of living neurons cultured on the nanoantennas are also acquired. These results highlight that these nanostructures could be potential candidates for combining electrophysiological measures of large networks with simultaneous spectroscopic investigations at the molecular level. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr05578k
Computational Issues in Damping Identification for Large Scale Problems
NASA Technical Reports Server (NTRS)
Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.
1997-01-01
Two damping identification methods are tested for efficiency in large-scale applications: one an iterative routine, the other a least squares method. Numerical simulations were performed on multiple degree-of-freedom models to test the effectiveness of the algorithms and the usefulness of parallel computation for these problems. High Performance Fortran is used to parallelize the algorithms. Tests were performed using the IBM-SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high-performance computing. Its memory requirement also grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace, and it is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.
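The least-squares idea above can be illustrated in miniature. For a single degree of freedom with known mass m and stiffness k, the damping coefficient c in m·a + c·v + k·x = f has a closed-form least-squares estimate. This is a hypothetical toy sketch, not the NASA algorithm, which operates on large multi-degree-of-freedom systems in parallel.

```python
def estimate_damping(m, k, samples):
    """Least-squares estimate of c for m*a + c*v + k*x = f.

    samples: list of (x, v, a, f) measurements.
    Minimizes sum((f - m*a - k*x - c*v)^2) over c, which has the
    closed-form solution c = sum(v*r) / sum(v*v), r = f - m*a - k*x.
    """
    num = sum(v * (f - m * a - k * x) for x, v, a, f in samples)
    den = sum(v * v for _, v, _, _ in samples)
    return num / den

# Synthetic data generated with a known damping coefficient c_true
m, k, c_true = 2.0, 50.0, 0.8
samples = []
for i in range(20):
    x, v, a = 0.1 * i, 0.5 * i - 3.0, 0.02 * i * i - 1.0  # arbitrary states
    f = m * a + c_true * v + k * x                        # consistent force
    samples.append((x, v, a, f))

print(round(estimate_damping(m, k, samples), 6))  # recovers 0.8
```

With noise-free synthetic data the estimate is exact; on measured data the same formula returns the least-squares fit, and its memory footprint is constant in the number of samples, unlike the matrix-based variant discussed in the abstract.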
Psychosocial correlates of fatigue in multiple sclerosis.
Schwartz, C E; Coulthard-Morris, L; Zeng, Q
1996-02-01
To explore: (1) the interrelation among the neuropsychological, psychological, and psychosocial factors and fatigue as measured by the Multidimensional Assessment of Fatigue scale, and (2) the impact of fatigue on role performance. Clinical interview with neuropsychological testing and cross-sectional study by mail. Multiple sclerosis (MS) clinic registry of a large Boston teaching hospital. 139 MS patients representing a broad range of disability. The Multidimensional Assessment of Fatigue (MAF) scale, the Extended Disability Status Scale, the Sickness Impact Profile, Rao cognitive battery, the Trailmaking Test, depression, anxiety, and social activity limitations subscales from the Arthritis Impact Measurement Scales, and the Ryff Happiness Scale. Stepwise multiple regression analyses revealed that having a low sense of environmental mastery was the best psychosocial predictor of both global fatigue and fatigue-related distress, after adjusting for sociodemographic and medical factors. Further, people who reported being more depressed tended to report more severe fatigue. Neuropsychological performance was not associated with fatigue. Fatigue was found to limit social, work, and overall role performance, but not physical role performance. People who feel that they can choose or create environments suitable to their psychic or physical conditions report less global fatigue and less fatigue-related distress, and fatigue can have an important impact on role performance. The implications of these findings for designing fatigue management interventions are discussed.
Selecting habitat to survive: the impact of road density on survival in a large carnivore.
Basille, Mathieu; Van Moorter, Bram; Herfindal, Ivar; Martin, Jodie; Linnell, John D C; Odden, John; Andersen, Reidar; Gaillard, Jean-Michel
2013-01-01
Habitat selection studies generally assume that animals select habitat and food resources at multiple scales to maximise their fitness. However, animals sometimes prefer habitats of apparently low quality, especially when considering the costs associated with spatially heterogeneous human disturbance. We used spatial variation in human disturbance, and its consequences for lynx survival, a direct fitness component, to test the Hierarchical Habitat Selection hypothesis in a population of Eurasian lynx Lynx lynx in southern Norway. Data from 46 lynx monitored with telemetry indicated that a high proportion of forest strongly reduced the risk of mortality from legal hunting at the home range scale, while increasing road density strongly increased such risk at the finer scale within the home range. We found hierarchical effects of the impact of human disturbance, with a higher road density at a large scale reinforcing its negative impact at a fine scale. Conversely, we demonstrated that lynx shifted their habitat selection to avoid areas with the highest road densities within their home ranges, thus supporting a compensatory mechanism at fine scale enabling lynx to mitigate the impact of large-scale disturbance. Human impact, positively associated with high road accessibility, was thus a stronger driver of lynx space use at a finer scale, with home range characteristics nevertheless constraining habitat selection. Our study demonstrates the truly hierarchical nature of habitat selection, which aims at maximising fitness by selecting against limiting factors at multiple spatial scales, and indicates that scale-specific heterogeneity of the environment drives individual spatial behaviour by means of trade-offs across spatial scales.
NASA Technical Reports Server (NTRS)
Tomsik, Thomas M.; Meyer, Michael L.
2010-01-01
This paper describes in detail a test program initiated at the Glenn Research Center (GRC) involving the cryogenic densification of liquid oxygen (LO2). A large-scale LO2 propellant densification system, rated for 200 gpm and sized for the X-33 LO2 propellant tank, was designed, fabricated, and tested at the GRC. The multiple objectives of the test program included validation of LO2 production unit hardware and characterization of densifier performance at design and transient conditions. First, performance data are presented for an initial series of LO2 densifier screening and check-out tests using densified liquid nitrogen. The second series of tests shows performance data collected during LO2 densifier test operations with liquid oxygen as the densified product fluid. An overview of LO2 X-33 tanking operations and load tests with the 20,000 gallon Structural Test Article (STA) is given. Tank loading and the thermal stratification that occurs inside a flight-weight launch vehicle propellant tank were investigated. These operations involved a closed-loop recirculation process of LO2 flow through the densifier and back into the STA. Finally, in excess of 200,000 gallons of densified LO2 at 120 °R was produced with the propellant densification unit during the demonstration program, an achievement that had never before been accomplished in the realm of large-scale cryogenic testing.
Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan; Briggs, Martin A.; Day-Lewis, Frederick D.
2015-01-01
Scientifically defensible predictions of field-scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces, as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results from batch, column, and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.
Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies
Liu, Zhonghua; Lin, Xihong
2017-01-01
We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
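One ingredient of such summary-statistic methods can be sketched concretely: a common-mean ("burden"-style) chi-square test built from per-phenotype Z-scores and the estimated between-phenotype correlation matrix. The statistic, the data, and the matrix below are illustrative assumptions, not the authors' exact procedure, which additionally tests a variance component in a linear mixed model.

```python
import math

def burden_test(z, R):
    """Common-mean test on per-phenotype Z-scores z with correlation R.

    Under the null, sum(z) ~ N(0, sum of all entries of R), so
    T = (sum z)^2 / sum(R) follows a chi-square with 1 df.
    Returns the chi^2_1 p-value, computed via the complementary
    error function: P(chi^2_1 > T) = erfc(sqrt(T) / sqrt(2)).
    """
    t = sum(z) ** 2 / sum(sum(row) for row in R)
    return math.erfc(math.sqrt(t) / math.sqrt(2.0))

# Hypothetical Z-scores for one variant across 3 correlated traits
z = [2.1, 1.8, 2.4]
R = [[1.0, 0.3, 0.2],
     [0.3, 1.0, 0.4],
     [0.2, 0.4, 1.0]]
p = burden_test(z, R)
print(p)  # a single joint p-value, no individual-level data needed
```

Because everything here is built from summary statistics, the test can be applied to published GWAS results; accounting for R (rather than assuming independence) is what keeps the type I error rate correct when phenotypes are correlated.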
Multiple phenotype association tests using summary statistics in genome-wide association studies.
Liu, Zhonghua; Lin, Xihong
2018-03-01
We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys
Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.
2014-01-01
Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634
Validation of the Fatigue Impact Scale in Hungarian patients with multiple sclerosis.
Losonczi, Erika; Bencsik, Krisztina; Rajda, Cecília; Lencsés, Gyula; Török, Margit; Vécsei, László
2011-03-01
Fatigue is one of the most frequent complaints of patients with multiple sclerosis (MS). The Fatigue Impact Scale (FIS), one of the 30 available fatigue questionnaires, is commonly applied because it evaluates multidimensional aspects of fatigue. The main purposes of this study were to test the validity, test-retest reliability, and internal consistency of the Hungarian version of the FIS. One hundred and eleven MS patients and 85 healthy control (HC) subjects completed the FIS and the Beck Depression Inventory, a large majority of them on two occasions, 3 months apart. The total FIS score and subscale scores differed statistically between the MS patients and the HC subjects in both FIS sessions. In the test-retest reliability assessment, statistically, the intraclass correlation coefficients were high in both the MS and HC groups. Cronbach's alpha values were also notably high. The results of this study indicate that the FIS can be regarded as a valid and reliable scale with which to improve our understanding of the impact of fatigue on the health-related quality of life in MS patients without severe disability.
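Cronbach's alpha, the internal-consistency statistic reported for the FIS above, can be computed directly from an item-score matrix. The respondent data below are invented for illustration; only the formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores) is standard.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = len(scores[0])                      # number of items
    def var(xs):                            # sample variance (n-1 denominator)
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Made-up responses: 5 respondents x 4 items on a Likert-type scale
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 5, 4, 4],
          [1, 2, 1, 2],
          [3, 3, 4, 3]]
print(round(cronbach_alpha(scores), 3))  # 0.941 for this toy matrix
```

Values near or above 0.9, as in this toy matrix, are the "notably high" range the study refers to; in practice alpha is computed per subscale and per administration session.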
Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model
NASA Astrophysics Data System (ADS)
Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman
2015-01-01
The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.
Developing and Evaluating a Machine-Scorable, Constrained Constructed-Response Item.
ERIC Educational Resources Information Center
Braun, Henry I.; And Others
The use of constructed response items in large scale standardized testing has been hampered by the costs and difficulties associated with obtaining reliable scores. The advent of expert systems may signal the eventual removal of this impediment. This study investigated the accuracy with which expert systems could score a new, non-multiple choice…
ERIC Educational Resources Information Center
Liu, Jinghua; Guo, Hongwen; Dorans, Neil J.
2014-01-01
Maintaining score interchangeability and scale consistency is crucial for any testing programs that administer multiple forms across years. The use of a multiple linking design, which involves equating a new form to multiple old forms and averaging the conversions, has been proposed to control scale drift. However, the use of multiple linking…
Airlie, J; Baker, G A; Smith, S J; Young, C A
2001-06-01
To develop a scale to measure self-efficacy in neurologically impaired patients with multiple sclerosis and to assess the scale's psychometric properties. Cross-sectional questionnaire study in a clinical setting, with the retest questionnaire returned by mail after completion at home. Regional multiple sclerosis (MS) outpatient clinic or the Clinical Trials Unit (CTU) at a large neuroscience centre in the UK. One hundred persons with MS attending the Walton Centre for Neurology and Neurosurgery and Clatterbridge Hospital, Wirral, as outpatients. Cognitively impaired patients were excluded at an initial clinic assessment. Patients were asked to provide demographic data and complete the self-efficacy scale along with the following validated scales: Hospital Anxiety and Depression Scale, Rosenberg Self-Esteem Scale, Impact, Stigma and Mastery and Rankin Scales. The Rankin Scale and Barthel Index were also assessed by the physician. A new 11-item self-efficacy scale was constructed, consisting of two domains: control and personal agency. The internal consistency of the scale was confirmed using Cronbach's alpha (alpha = 0.81). The test-retest reliability of the scale over two weeks was acceptable, with an intraclass correlation coefficient of 0.79. Construct validity was investigated using Pearson's product-moment correlation coefficient, yielding significant correlations with depression (r = -0.52), anxiety (r = -0.50) and mastery (r = 0.73). Multiple regression analysis demonstrated that these factors accounted for 70% of the variance in scores on the self-efficacy scale, with mastery, anxiety and perceived disability being independently significant. Assessment of the psychometric properties of this new self-efficacy scale suggests that it possesses good validity and reliability in patients with multiple sclerosis.
Coral mass spawning predicted by rapid seasonal rise in ocean temperature
Maynard, Jeffrey A.; Edwards, Alasdair J.; Guest, James R.; Rahbek, Carsten
2016-01-01
Coral spawning times have been linked to multiple environmental factors; however, to what extent these factors act as generalized cues across multiple species and large spatial scales is unknown. We used a unique dataset of coral spawning from 34 reefs in the Indian and Pacific Oceans to test if month of spawning and peak spawning month in assemblages of Acropora spp. can be predicted by sea surface temperature (SST), photosynthetically available radiation, wind speed, current speed, rainfall or sunset time. Contrary to the classic view that high mean SST initiates coral spawning, we found rapid increases in SST to be the best predictor in both cases (month of spawning: R2 = 0.73, peak: R2 = 0.62). Our findings suggest that a rapid increase in SST provides the dominant proximate cue for coral mass spawning over large geographical scales. We hypothesize that coral spawning is ultimately timed to ensure optimal fertilization success. PMID:27170709
Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro
2013-05-21
We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. The strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but also often subjective and makeshift. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. For a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data from any instrument or any experimental condition.
NASA Astrophysics Data System (ADS)
Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan
2015-10-01
Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profiles of large-scale components and parts. The accuracy and speed of laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of laser stripe center extraction, based on image evaluation of Gaussian-fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray-level distribution of the laser stripe, an evaluation of Gaussian-fitting structural similarity is computed to provide a threshold value for center compensation. Then, using the relationships between the gray-level distribution of the laser stripe and the multiple source factors, a compensation method for center extraction is presented. Finally, measurement experiments on a large-scale aviation composite component are carried out. The experimental results verify the feasibility of the proposed center extraction method and the improved accuracy of large-scale triangulation scanning measurements.
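The core of Gaussian-fitting center extraction can be sketched as follows: the logarithm of a Gaussian intensity profile is a parabola, so a sub-pixel stripe center can be refined from the three samples around the brightest pixel. This is a minimal illustrative sketch; it omits the paper's structural-similarity evaluation and multi-factor compensation, and the synthetic profile below is an assumption.

```python
import math

def gaussian_center(profile):
    """Sub-pixel peak of a Gaussian-like intensity profile.

    Takes log-intensity at the brightest pixel and its two neighbors,
    then returns the vertex of the parabola through those three points.
    Assumes the peak is not at either end of the profile.
    """
    i = max(range(len(profile)), key=lambda j: profile[j])
    a = math.log(profile[i - 1])
    b = math.log(profile[i])
    c = math.log(profile[i + 1])
    return i + 0.5 * (a - c) / (a - 2 * b + c)  # parabola vertex offset

# Synthetic noise-free Gaussian stripe centered at pixel 10.3, sigma = 2
profile = [math.exp(-((x - 10.3) ** 2) / (2 * 2.0 ** 2)) for x in range(21)]
print(round(gaussian_center(profile), 3))  # 10.3
```

For a noise-free Gaussian the three-point log-parabola fit is exact; the deviations the paper addresses arise precisely when reflectivity, transmission, and illumination effects make the real stripe profile depart from this ideal shape.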
Predicting the propagation of concentration and saturation fronts in fixed-bed filters.
Callery, O; Healy, M G
2017-10-15
The phenomenon of adsorption is widely exploited across a range of industries to remove contaminants from gases and liquids. Much recent research has focused on identifying low-cost adsorbents with the potential to be used as alternatives to expensive industry standards like activated carbons. Evaluating these emerging adsorbents entails a considerable amount of labor-intensive and costly testing and analysis. This study proposes a simple, low-cost method to rapidly assess the potential of novel media for use in large-scale adsorption filters. The filter media investigated in this study were low-cost adsorbents found to be capable of removing dissolved phosphorus from solution, namely: (i) aluminum drinking water treatment residual, and (ii) crushed concrete. Data collected from multiple small-scale column tests were used to construct a model capable of describing and predicting the progression of adsorbent saturation and the associated effluent concentration breakthrough curves. This model was used to predict the performance of long-term, large-scale filter columns packed with the same media. The approach proved highly successful: just 24-36 h of experimental data from the small-scale column experiments provided sufficient information to predict the performance of the large-scale filters for up to three months. Copyright © 2017 Elsevier Ltd. All rights reserved.
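A breakthrough-curve model of the kind commonly fitted to small-column data can be sketched with a Thomas-type logistic form, which predicts the effluent-to-influent concentration ratio C/C0 over time. The functional form is a standard fixed-bed model, but the parameter names and values below are invented for illustration and are not the study's fitted values.

```python
import math

def breakthrough(t, k_th, q0, m, c0, flow):
    """Thomas-model effluent/influent ratio C/C0 at time t.

    k_th: rate constant, q0: adsorption capacity per unit mass,
    m: adsorbent mass, c0: influent concentration, flow: flow rate.
    C/C0 = 1 / (1 + exp(k_th * (q0*m - c0*flow*t) / flow))
    """
    return 1.0 / (1.0 + math.exp(k_th * (q0 * m - c0 * flow * t) / flow))

# Hypothetical parameters in consistent units
k_th, q0, m, c0, flow = 0.002, 5.0, 100.0, 10.0, 0.5

# By construction, C/C0 = 0.5 when the cumulative solute load c0*flow*t
# equals the bed capacity q0*m, i.e. at t = q0*m / (c0*flow)
half = q0 * m / (c0 * flow)
print(round(breakthrough(half, k_th, q0, m, c0, flow), 3))  # 0.5
```

Fitting k_th and q0 to short small-column runs and then re-evaluating the same expression with the large filter's m and flow is one simple way to scale such a curve up, which is the spirit of the approach described in the abstract.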
To What Extent Do Teachers in European Countries Differ in Their Professional Community Practices?
ERIC Educational Resources Information Center
Lomos, Catalina
2017-01-01
Within comparative school effectiveness research facilitated by large-scale data across countries, this article presents the results of the testing for measurement invariance of the latent concept of Professional Community (PC) across 23 European countries and more than 35,000 teachers in secondary schools. The newly proposed Multiple-Group Factor…
NASA Astrophysics Data System (ADS)
Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia
2017-04-01
Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. The services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. Here we systematically investigate whether, and to what extent, research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.
Lix, Lisa M; Wu, Xiuyun; Hopman, Wilma; Mayo, Nancy; Sajobi, Tolulope T; Liu, Juxin; Prior, Jerilynn C; Papaioannou, Alexandra; Josse, Robert G; Towheed, Tanveer E; Davison, K Shawn; Sawatzky, Richard
2016-01-01
Self-reported health status measures, like the Short Form 36-item Health Survey (SF-36), can provide rich information about the overall health of a population and its components, such as physical, mental, and social health. However, differential item functioning (DIF), which arises when population sub-groups with the same underlying (i.e., latent) level of health have different measured item response probabilities, may compromise the comparability of these measures. The purpose of this study was to test for DIF on the SF-36 physical functioning (PF) and mental health (MH) sub-scale items in a Canadian population-based sample. Study data were from the prospective Canadian Multicentre Osteoporosis Study (CaMos), which collected baseline data in 1996-1997. DIF was tested using a multiple indicators multiple causes (MIMIC) method. Confirmatory factor analysis defined the latent variable measurement model for the item responses and latent variable regression with demographic and health status covariates (i.e., sex, age group, body weight, self-perceived general health) produced estimates of the magnitude of DIF effects. The CaMos cohort consisted of 9423 respondents; 69.4% were female and 51.7% were less than 65 years. Eight of 10 items on the PF sub-scale and four of five items on the MH sub-scale exhibited DIF. Large DIF effects were observed on PF sub-scale items about vigorous and moderate activities, lifting and carrying groceries, walking one block, and bathing or dressing. On the MH sub-scale items, all DIF effects were small or moderate in size. SF-36 PF and MH sub-scale scores were not comparable across population sub-groups defined by demographic and health status variables due to the effects of DIF, although the magnitude of this bias was not large for most items. We recommend testing and adjusting for DIF to ensure comparability of the SF-36 in population-based investigations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Seogjoo; Hoyer, Stephan; Fleming, Graham
2014-10-31
A generalized master equation (GME) governing quantum evolution of modular exciton density (MED) is derived for large-scale light-harvesting systems composed of weakly interacting modules of multiple chromophores. The GME-MED offers a practical framework to incorporate real-time coherent quantum dynamics calculations of small length scales into dynamics over large length scales, and also provides a non-Markovian generalization and rigorous derivation of the Pauli master equation employing multichromophoric Förster resonance energy transfer rates. A test of the GME-MED for four sites of the Fenna-Matthews-Olson complex demonstrates how coherent dynamics of excitonic populations over coupled chromophores can be accurately described by transitions between subgroups (modules) of delocalized excitons. Application of the GME-MED to the exciton dynamics between a pair of light harvesting complexes in purple bacteria demonstrates its promise as a computationally efficient tool to investigate large scale exciton dynamics in complex environments.
Huang, Yi-Shao; Liu, Wel-Ping; Wu, Min; Wang, Zheng-Wu
2014-09-01
This paper presents a novel observer-based decentralized hybrid adaptive fuzzy control scheme for a class of large-scale continuous-time multiple-input multiple-output (MIMO) uncertain nonlinear systems whose state variables are unmeasurable. The scheme integrates fuzzy logic systems, state observers, and strictly positive real conditions to deal with three issues in the control of a large-scale MIMO uncertain nonlinear system: algorithm design, controller singularity, and transient response. The design of the hybrid adaptive fuzzy controller is then extended to address a general large-scale uncertain nonlinear system. It is shown that the resultant closed-loop large-scale system remains asymptotically stable and the tracking error converges to zero. The advantages of the scheme are demonstrated by simulations. Copyright © 2014. Published by Elsevier Ltd.
Performance of fuselage pressure structure
NASA Technical Reports Server (NTRS)
Maclin, James R.
1992-01-01
There are currently more than 1,000 Boeing airplanes around the world over 20 years old. That number is expected to double by the year 1995. With these statistics comes the reality that structural airworthiness will be in the forefront of aviation issues well into the next century. The results of previous and recent test programs Boeing has implemented to study the structural performance of older airplanes, particularly pressurized fuselage sections, are described. Testing included flat panels with multiple site damage (MSD), a full-scale 737 and two 747s, as well as panels representing a 737 and a 777, and a generic aircraft in large pressure-test fixtures. Because damage is a normal part of aging, the focus is on the degree to which structural integrity is maintained after failure or partial failure of any structural element, including multiple site damage (MSD) and multiple element damage (MED).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan
2015-02-24
Scientifically defensible predictions of field-scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results from batch, column, and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.
NASA Astrophysics Data System (ADS)
Mekanik, F.; Imteaz, M. A.; Gato-Trinidad, S.; Elmahdi, A.
2013-10-01
In this study, the application of Artificial Neural Networks (ANN) and multiple regression analysis (MR) to forecast long-term seasonal spring rainfall in Victoria, Australia was investigated using lagged El Niño Southern Oscillation (ENSO) and Indian Ocean Dipole (IOD) as potential predictors. The use of dual (combined lagged ENSO-IOD) input sets for calibrating and validating ANN and MR models is proposed to investigate the simultaneous effect of past values of these two major climate modes on long-term spring rainfall prediction. The MR models that did not violate the limits of statistical significance and multicollinearity were selected for future spring rainfall forecasts. The ANN was developed in the form of a multilayer perceptron using the Levenberg-Marquardt algorithm. Both MR and ANN modelling were assessed statistically using mean square error (MSE), mean absolute error (MAE), Pearson correlation (r) and the Willmott index of agreement (d). The developed MR and ANN models were tested on out-of-sample test sets; the MR models showed very poor generalisation ability for east Victoria, with correlation coefficients of -0.99 to -0.90, compared to ANN with correlation coefficients of 0.42-0.93; ANN models also showed better generalisation ability for central and west Victoria, with correlation coefficients of 0.68-0.85 and 0.58-0.97 respectively. The ability of multiple regression models to forecast out-of-sample sets is comparable with ANN for Daylesford in central Victoria and Kaniva in west Victoria (r = 0.92 and 0.67 respectively). The errors of the testing sets for ANN models are generally lower compared to multiple regression models. The statistical analysis suggests the potential of ANN over MR models for rainfall forecasting using large-scale climate modes.
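The four skill statistics named in this abstract (MSE, MAE, Pearson r, and the Willmott index of agreement d) can all be computed in a few lines. A minimal sketch with NumPy; the function name and the example arrays are illustrative, not from the study:

```python
import numpy as np

def forecast_skill(obs, pred):
    """Skill metrics commonly used to compare rainfall forecast models:
    MSE, MAE, Pearson correlation r, and Willmott's index of agreement d."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    err = pred - obs
    mse = np.mean(err ** 2)
    mae = np.mean(np.abs(err))
    r = np.corrcoef(obs, pred)[0, 1]
    # Willmott (1981) index of agreement: 1 = perfect, lower = worse
    obar = obs.mean()
    d = 1.0 - np.sum(err ** 2) / np.sum(
        (np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return {"MSE": mse, "MAE": mae, "r": r, "d": d}
```

Unlike r, which rewards any linear association (including a biased one), d penalizes systematic offsets between predictions and observations, which is why the two are usually reported together.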
Viewpoint: observations on scaled average bioequivalence.
Patterson, Scott D; Jones, Byron
2012-01-01
The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed as an alternative statistical analysis procedure for such products by multiple regulatory agencies. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we will consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.
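The TOST decision rule referenced above is simple to state: conclude average bioequivalence when the 90% confidence interval for the test-reference difference in log means lies entirely within the regulatory limits (log 0.8, log 1.25). A minimal sketch with SciPy, assuming log-transformed pharmacokinetic summary statistics; the function and its inputs are illustrative, not taken from the paper:

```python
import numpy as np
from scipy import stats

def tost_abe(diff_logmeans, se, df, theta=np.log(1.25)):
    """Two one-sided tests (TOST) for average bioequivalence:
    pass if the 90% CI for the test-reference difference in log
    means lies entirely within (-theta, +theta)."""
    t_crit = stats.t.ppf(0.95, df)          # 5% in each tail -> 90% CI
    lo = diff_logmeans - t_crit * se
    hi = diff_logmeans + t_crit * se
    passed = (lo > -theta) and (hi < theta)
    # back-transform the CI to the ratio scale (0.80-1.25 limits)
    return passed, (float(np.exp(lo)), float(np.exp(hi)))
```

Scaled average bioequivalence (SABE) widens theta as a function of the reference formulation's intra-subject variability, which is where the constraint bias and generic-generic switching issues studied in this paper enter.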
Algorithm of OMA for large-scale orthology inference
Roth, Alexander CJ; Gonnet, Gaston H; Dessimoz, Christophe
2008-01-01
Background: OMA is a project that aims to identify orthologs within publicly available, complete genomes. With 657 genomes analyzed to date, OMA is one of the largest projects of its kind. Results: The algorithm of OMA improves upon the standard bidirectional best-hit approach in several respects: it uses evolutionary distances instead of scores, considers distance inference uncertainty, includes many-to-many orthologous relations, and accounts for differential gene losses. Herein, we describe the algorithm for inference of orthology in detail and provide the rationale for parameter selection through multiple tests. Conclusion: OMA contains several novel improvements for orthology inference and provides a unique dataset of large-scale orthology assignments. PMID:19055798
3D plasmonic nanoantennas integrated with MEA biosensors.
Dipalo, Michele; Messina, Gabriele C; Amin, Hayder; La Rocca, Rosanna; Shalabaeva, Victoria; Simi, Alessandro; Maccione, Alessandro; Zilio, Pierfrancesco; Berdondini, Luca; De Angelis, Francesco
2015-02-28
Neuronal signaling in brain circuits occurs at multiple scales ranging from molecules and cells to large neuronal assemblies. However, current sensing neurotechnologies are not designed for parallel access of signals at multiple scales. With the aim of combining nanoscale molecular sensing with electrical neural activity recordings within large neuronal assemblies, in this work three-dimensional (3D) plasmonic nanoantennas are integrated with multielectrode arrays (MEA). Nanoantennas are fabricated by fast ion beam milling on optical resist; gold is deposited on the nanoantennas in order to connect them electrically to the MEA microelectrodes and to obtain plasmonic behavior. The optical properties of these 3D nanostructures are studied through finite elements method (FEM) simulations that show a high electromagnetic field enhancement. This plasmonic enhancement is confirmed by surface enhancement Raman spectroscopy of a dye performed in liquid, which presents an enhancement of almost 100 times the incident field amplitude at resonant excitation. Finally, the reported MEA devices are tested on cultured rat hippocampal neurons. Neurons develop by extending branches on the nanostructured electrodes and extracellular action potentials are recorded over multiple days in vitro. Raman spectra of living neurons cultured on the nanoantennas are also acquired. These results highlight that these nanostructures could be potential candidates for combining electrophysiological measures of large networks with simultaneous spectroscopic investigations at the molecular level.
Investigation of Vapor Cooling Enhancements for Applications on Large Cryogenic Systems
NASA Technical Reports Server (NTRS)
Ameen, Lauren; Zoeckler, Joseph
2017-01-01
The need to demonstrate and evaluate the effectiveness of heat interception methods for use on a relevant cryogenic propulsion stage at a system level has been identified. Evolvable Cryogenics (eCryo) Structural Heat Intercept, Insulation and Vibration Evaluation Rig (SHIIVER) will be designed with vehicle specific geometries (SLS Exploration Upper Stage (EUS) as guidance) and will be subjected to simulated space environments. One method of reducing structure-born heat leak being investigated utilizes vapor-based heat interception. Vapor-based heat interception could potentially reduce heat leak into liquid hydrogen propulsion tanks, increasing potential mission length or payload capability. Due to the high number of unknowns associated with the heat transfer mechanism and integration of vapor-based heat interception on a realistic large-scale skirt design, a sub-scale investigation was developed. The sub-project effort is known as the Small-scale Laboratory Investigation of Cooling Enhancements (SLICE). The SLICE aims to study, design, and test sub-scale multiple attachments and flow configuration concepts for vapor-based heat interception of structural skirts. SLICE will focus on understanding the efficiency of the heat transfer mechanism to the boil-off hydrogen vapor by varying the fluid network designs and configurations. Various analyses were completed in MATLAB, Excel VBA, and COMSOL Multiphysics to understand the optimum flow pattern for heat transfer and fluid dynamics. Results from these analyses were used to design and fabricate test article subsections of a large forward skirt with vapor cooling applied. The SLICE testing is currently being performed to collect thermal mechanical performance data on multiple skirt heat removal designs while varying inlet vapor conditions necessary to intercept a specified amount of heat for a given system. 
Initial results suggest that applying vapor cooling provides a 50% reduction in conductive heat transmission along the skirt to the tank. The information obtained by SLICE will be used by the SHIIVER engineering team to design and implement vapor-based heat removal technology into the SHIIVER forward skirt hardware design.
ERIC Educational Resources Information Center
Kim, Ah-Young
2015-01-01
Previous research in cognitive diagnostic assessment (CDA) of L2 reading ability has been frequently conducted using large-scale English proficiency exams (e.g., TOEFL, MELAB). Using CDA, it is possible to analyze individual learners' strengths and weaknesses in multiple attributes (i.e., knowledge, skill, strategy) measured at the item level.…
NASA Astrophysics Data System (ADS)
Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Zhang, H.; Avad, R.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Bruni, P.; Romeo, G. Cara; Castellini, G.; Chiarini, M.; Cifarelli, L.; Cindolo, F.; Contin, A.; Gialas, I.; Giusti, P.; Lacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Nemoz, C.; Palmonari, F.; Polini, A.; Sartorelli, G.; Timellini, R.; Garcia, Y. Zamora; Zichichi, A.; Bargende, A.; Crittenden, J.; Desch, K.; Diekmann, B.; Doeker, T.; Eckert, M.; Feld, L.; Frey, A.; Geerts, M.; Geitz, G.; Grothe, M.; Haas, T.; Hartmann, H.; Haun, D.; Heinloth, K.; Hilger, E.; Jakob, H.-P.; Katz, U. F.; Mari, S. M.; Mass, A.; Mengel, S.; Mollen, J.; Paul, E.; Rembser, Ch.; Schattevoy, R.; Schramm, D.; Stamm, V.; Wedemeyer, R.; Campbell-Robson, S.; Cassidy, A.; Dyce, N.; Foster, B.; George, S.; Gilmore, R.; Heath, G. P.; Heath, H. F.; Llewellyn, T. J.; Morgado, C. J. S.; Norman, D. J. P.; O'Mara, J. A.; Tapper, R. J.; Wilson, S. S.; Yoshida, R.; Rau, R. R.; Arneodo, M.; Iannotti, L.; Schioppa, M.; Susinno, G.; Bernstein, A.; Caldwell, A.; Parsons, J. A.; Ritz, S.; Sciulli, F.; Straub, P. B.; Wai, L.; Yang, S.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Piotrzkowski, K.; Zachara, M.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Eskreys, K.; Jeleń, K.; Kisielewska, D.; Kowalski, T.; Rulikowska-Zarębska, E.; Suszycki, L.; Zając, J.; Kotański, A.; Przybycień, M.; Bauerdick, I. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Coldewey, C.; Deppe, O.; Desler, K.; Drews, G.; Flasińki, M.; Gilkinson, D. J.; Glasman, C.; Göttlicher, P.; Große-Knetter, J.; Gutjahr, B.; Hain, W.; Hasell, D.; Heßling, H.; Hultschig, H.; Iga, Y.; Joos, P.; Kasemann, M.; Klanner, R.; Koch, W.; Kopke, L.; Kötz, U.; Kowalski, H.; Labs, J.; Ladage, A.; Löhr, B.; Löwe, M.; Lüke, D.; Mańczak, O.; Ng, J. S. 
T.; Nickel, S.; Notz, D.; Ohrenberg, K.; Roco, M.; Rohde, M.; Roldán, J.; Schneckloth, U.; Schulz, W.; Selonke, F.; Stiliaris, E.; Surrow, B.; Voß, T.; Westphal, D.; Wolf, G.; Youngman, C.; Zhou, J. F.; Grabosch, H. J.; Kharchilava, A.; Leich, A.; Mattingly, M.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Pelfer, P.; Anzivino, G.; Maccarrone, G.; de Pasquale, S.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Freidhof, A.; Söldner-Rembold, S.; Schroeder, J.; Trefzger, T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; Fleck, I.; Jamieson, V. A.; Saxon, D. H.; Utley, M. L.; Wilson, A. S.; Dannemann, A.; Holm, U.; Horstmann, D.; Neumann, T.; Sinkus, R.; Wick, K.; Badura, E.; Burow, B. D.; Hagge, L.; Lohrmann, E.; Mainusch, J.; Milewski, J.; Nakahata, M.; Pavel, N.; Poelz, G.; Schott, W.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Gallo, E.; Harris, V. L.; Hung, B. Y. H.; Long, K. R.; Miller, D. B.; Morawitz, P. P. O.; Prinias, A.; Sedgbeer, J. K.; Whitfield, A. F.; Mallik, U.; McCliment, E.; Wang, M. Z.; Wang, S. M.; Wu, J. T.; Zhang, Y.; Cloth, P.; Filges, D.; An, S. H.; Hong, S. M.; Nam, S. W.; Park, S. K.; Suh, M. H.; Yon, S. H.; Imlay, R.; Kartik, S.; Kim, H.-I.; McNeil, R. R.; Metcalf, W.; Nadendla, V. K.; Barreiro, F.; Cases, G.; Graciani, R.; Hernández, J. M.; Hervás, L.; Labarga, L.; Del Peso, J.; Puga, J.; Terron, J.; de Trocóniz, I. F.; Smith, G. R.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Lim, J. N.; Matthews, C. G.; Patel, P. M.; Sinclair, L. E.; Stairs, D. G.; St. Laurent, M.; Ullmann, R.; Zacek, G.; Bashkirov, V.; Dolgoshein, B. A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Gladilin, L. K.; Golubkov, Y. A.; Kobrin, V. D.; Kuzmin, V. A.; Proskuryakov, A. S.; Savin, A. A.; Shcheglova, L. M.; Solomin, A. N.; Zotov, N. P.; Botje, M.; Chlebana, F.; Dake, A.; Engelen, J.; de Kamps, M.; Kooijman, P.; Kruse, A.; Tiecke, H.; Verkerke, W.; Vreeswijk, M.; Wiggers, L.; de Wolf, E.; van Woudenberg, R.; Acosta, D.; Bylsma, B.; Durkin, L. 
S.; Honscheid, K.; Li, C.; Ling, T. Y.; McLean, K. W.; Murray, W. N.; Park, I. H.; Romanowski, T. A.; Seidlein, R.; Bailey, D. S.; Blair, G. A.; Byrne, A.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Daniels, D.; Devenish, R. C. E.; Harnew, N.; Lancaster, M.; Luffman, P. E.; Lindemann, L.; McFall, J. D.; Nath, C.; Quadt, A.; Uijterwaal, H.; Walczak, R.; Wilson, F. F.; Yip, T.; Abbiendi, G.; Bertolin, A.; Brugnera, R.; Carlin, R.; Dal Corso, F.; de Giorgi, M.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Butterworth, J. M.; Field, R. G.; Oh, B. Y.; Whitmore, J. J.; D'Agostini, G.; Marini, G.; Nigro, A.; Tassi, E.; Hart, J. C.; McCubbin, N. A.; Prytz, K.; Shah, T. P.; Short, T. L.; Barberis, E.; Cartiglia, N.; Dubbs, T.; Heusch, C.; van Hook, M.; Hubbard, B.; Lockman, W.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Biltzinger, J.; Seifert, R. J.; Walenta, A. H.; Zech, G.; Abramowicz, H.; Briskin, G.; Dagan, S.; Levy, A.; Hasegawa, T.; Hazumi, M.; Ishii, T.; Kuze, M.; Mine, S.; Nagasawa, Y.; Nakao, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamazaki, Y.; Chiba, M.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Nakamitsu, Y.; Yamauchi, K.; Cirio, R.; Costa, M.; Ferrero, M. I.; Lamberti, L.; Maselli, S.; Peroni, C.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Bandyopadhyay, D.; Benard, F.; Brkic, M.; Crombie, M. B.; Gingrich, D. M.; Hartner, G. F.; Joo, K. K.; Levman, G. M.; Martin, J. F.; Orr, R. S.; Sampson, C. R.; Teuscher, R. J.; Catterall, C. D.; Jones, T. W.; Kaziewicz, P. B.; Lane, J. B.; Saunders, R. L.; Shulman, J.; Blankenship, K.; Kochocki, J.; Lu, B.; Mo, L. W.; Bogusz, W.; Charchuła, K.; Ciborowski, J.; Gajewski, J.; Grzelak, G.; Kasprzak, M.; Krzyżanowski, M.; Muchorowski, K.; Nowak, R. J.; Pawlak, J. M.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Żarnecki, A. F.; Adamus, M.; Eisenberg, Y.; Karshon, U.; Revel, D.; Zer-Zion, D.; Ali, I.; Badgett, W. 
F.; Behrens, B.; Dasu, S.; Fordham, C.; Foudas, C.; Goussiou, A.; Loveless, R. J.; Reeder, D. D.; Silverstein, S.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Tsurugai, T.; Bhadra, S.; Cardy, M. L.; Fagerstroem, C.-P.; Frisken, W. R.; Furutani, K. M.; Khakzad, M.; Schmidke, W. B.
1995-03-01
Charged particle production has been measured in Deep Inelastic Scattering (DIS) events using the ZEUS detector over a large range of Q² from 10 to 1280 GeV². The evolution with Q of the charged multiplicity and scaled momentum has been investigated in the current fragmentation region of the Breit frame. The data are used to study QCD coherence effects in DIS and are compared with corresponding e⁺e⁻ data in order to test the universality of quark fragmentation.
Data for Room Fire Model Comparisons
Peacock, Richard D.; Davis, Sanford; Babrauskas, Vytenis
1991-01-01
With the development of models to predict fire growth and spread in buildings, there has been a concomitant evolution in the measurement and analysis of experimental data in real-scale fires. This report presents the types of analyses that can be used to examine large-scale room fire test data to prepare the data for comparison with zone-based fire models. Five sets of experimental data which can be used to test the limits of a typical two-zone fire model are detailed. A standard set of nomenclature describing the geometry of the building and the quantities measured in each experiment is presented. Availability of ancillary data (such as smaller-scale test results) is included. These descriptions, along with the data (available in computer-readable form), should allow comparisons between the experiments and model predictions. The base of experimental data ranges in complexity from one-room tests with individual furniture items to a series of tests conducted in a multiple-story hotel equipped with a zoned smoke control system. PMID:28184121
Forum: The Rise of International Large-Scale Assessments and Rationales for Participation
ERIC Educational Resources Information Center
Addey, Camilla; Sellar, Sam; Steiner-Khamsi, Gita; Lingard, Bob; Verger, Antoni
2017-01-01
This Forum discusses the significant growth of international large-scale assessments (ILSAs) since the mid-1990s. Addey and Sellar's contribution ("A Framework for Analysing the Multiple Rationales for Participating in International Large-Scale Assessments") outlines a framework of rationales for participating in ILSAs and examines the…
Large Terrain Modeling and Visualization for Planets
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher
2011-01-01
Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in real-time simulation is handling large multi-resolution terrain data sets, both within models and for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage for dealing with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail method that delivers performance of multiple frames per second even for planetary-scale terrain models.
Geospatial Optimization of Siting Large-Scale Solar Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Quinby, Ted; Caulfield, Emmet
2014-03-01
Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
Designing fire safe interiors.
Belles, D W
1992-01-01
Any product that causes a fire to grow large is deficient in fire safety performance. A large fire in any building represents a serious hazard. Multiple-death fires almost always are linked to fires that grow quickly to a large size. Interior finishes have large, continuous surfaces over which fire can spread. They are regulated to slow initial fire growth, and must be qualified for use on the basis of fire tests. To obtain meaningful results, specimens must be representative of the actual installation. Variables such as the substrate, the adhesive, and product thickness and density can affect product performance. The tunnel test may not adequately evaluate some products, such as foam plastics or textile wall coverings, thermoplastic materials, or materials of minimal mass. Where questions exist, products should be evaluated on a full-scale basis. Curtains and draperies are examples of products that ignite easily and spread flames readily. The present method for testing curtains and draperies evaluates one fabric at a time. Although a fabric tested alone may perform well, fabrics that meet test standards individually sometimes perform poorly when tested in combination. Contents and furnishings constitute the major fuels in many fires. Contents may involve paper products and other lightweight materials that are easily ignited and capable of fast fire growth. Similarly, a small source may ignite many items of furniture that are capable of sustained fire growth. Upholstered furniture can reach peak burning rates in less than 5 minutes. Furnishings have been associated with many multiple-death fires.
Teledyne Taber 206-1000 and 2210-3000 pressure transducer proof test and burst test
NASA Technical Reports Server (NTRS)
Ricks, G. A.
1989-01-01
The range accuracy and structural integrity of the Teledyne Taber 206-1000 and 2210-3000 pressure transducers are verified, and multiple uses are studied to determine if they have a significant effect on the transducers. Burst pressure for these transducers was also established. The Teledyne Taber 206-1000 pressure transducer is used to measure chamber pressure on full-scale space shuttle rocket motors. The Teledyne Taber 2210-3000 pressure transducer is used to measure igniter pressure. The Teledyne Taber transducer has very good temperature stability and was used on all full-scale solid rocket motors, so there is a large data base established using this transducer.
Large-region acoustic source mapping using a movable array and sparse covariance fitting.
Zhao, Shengkui; Tuna, Cagdas; Nguyen, Thi Ngoc Tho; Jones, Douglas L
2017-01-01
Large-region acoustic source mapping is important for city-scale noise monitoring. Approaches using a single-position measurement scheme to scan large regions using small arrays cannot provide clean acoustic source maps, while deploying large arrays spanning the entire region of interest is prohibitively expensive. A multiple-position measurement scheme is applied to scan large regions at multiple spatial positions using a movable array of small size. Based on the multiple-position measurement scheme, a sparse-constrained multiple-position vectorized covariance matrix fitting approach is presented. In the proposed approach, the overall sample covariance matrix of the incoherent virtual array is first estimated using the multiple-position array data and then vectorized using the Khatri-Rao (KR) product. A linear model is then constructed for fitting the vectorized covariance matrix and a sparse-constrained reconstruction algorithm is proposed for recovering source powers from the model. The user parameter settings are discussed. The proposed approach is tested on a 30 m × 40 m region and a 60 m × 40 m region using simulated and measured data. Much cleaner acoustic source maps and lower sound pressure level errors are obtained compared to the beamforming approaches and the previous sparse approach [Zhao, Tuna, Nguyen, and Jones, Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Processing (ICASSP) (2016)].
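The vectorized covariance-fitting model described above can be illustrated on a toy narrowband example. A minimal sketch with NumPy/SciPy that builds the Khatri-Rao linear model vec(R) = (A* ⊙ A)p + σ²vec(I) and recovers source powers; non-negative least squares stands in for the paper's sparse-constrained reconstruction algorithm, and the array geometry, direction grid, and source powers are all invented for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Toy scene: a nonuniform 8-sensor line array (positions in half-wavelengths)
# scanning a 12-point grid of candidate directions.
M, G = 8, 12
pos = 0.5 * np.array([0.0, 1.0, 4.0, 9.0, 15.0, 22.0, 32.0, 34.0])
ang = np.deg2rad(np.linspace(-60.0, 60.0, G))
A = np.exp(2j * np.pi * pos[:, None] * np.sin(ang)[None, :])  # steering matrix

p_true = np.zeros(G)
p_true[[3, 9]] = [1.0, 0.5]                        # two sources, known powers
R = (A * p_true) @ A.conj().T + 0.01 * np.eye(M)   # covariance + noise floor

# vec(R) = (A* ⊙ A) p + sigma^2 vec(I), with ⊙ the Khatri-Rao (columnwise
# Kronecker) product; append vec(I) so the noise power is fitted as well.
KR = np.stack([np.kron(A[:, g].conj(), A[:, g]) for g in range(G)], axis=1)
H = np.hstack([KR, np.eye(M).reshape(-1, 1)])
b = R.flatten(order="F")                           # column-major vec(R)

# Real-stack the complex system and solve with a non-negativity constraint.
H_r = np.vstack([H.real, H.imag])
b_r = np.concatenate([b.real, b.imag])
p_hat, _ = nnls(H_r, b_r)
print("recovered source powers:", np.round(p_hat[:G], 3))
```

In the multiple-position scheme of the paper, the rows of this linear model would come from the sample covariances estimated at each array position of the incoherent virtual array rather than from a single fixed array.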
Explorative Function in Williams Syndrome Analyzed through a Large-Scale Task with Multiple Rewards
ERIC Educational Resources Information Center
Foti, F.; Petrosini, L.; Cutuli, D.; Menghini, D.; Chiarotti, F.; Vicari, S.; Mandolesi, L.
2011-01-01
This study aimed to evaluate spatial function in subjects with Williams syndrome (WS) by using a large-scale task with multiple rewards and comparing the spatial abilities of WS subjects with those of mental age-matched control children. In the present spatial task, WS participants had to explore an open space to search nine rewards placed in…
Statistical significance test for transition matrices of atmospheric Markov chains
NASA Technical Reports Server (NTRS)
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
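The Monte Carlo idea sketched in this abstract is straightforward: permute the sequence of regime labels to destroy temporal order, and compare each observed transition count against the resulting null distribution. A minimal sketch with NumPy; the two-regime example sequence is synthetic, not from the study:

```python
import numpy as np

def transition_counts(seq, k):
    """Count regime-to-regime transitions in an integer label sequence."""
    C = np.zeros((k, k), dtype=int)
    np.add.at(C, (seq[:-1], seq[1:]), 1)
    return C

def mc_transition_test(seq, k, n_sim=2000, seed=0):
    """Monte Carlo significance test for a transition matrix: permute the
    labels to destroy temporal order, then compare each observed count
    against its null distribution (one-sided p-values in both directions)."""
    rng = np.random.default_rng(seed)
    obs = transition_counts(seq, k)
    null = np.stack([transition_counts(rng.permutation(seq), k)
                     for _ in range(n_sim)])
    p_more = (null >= obs).mean(axis=0)   # unusually frequent transitions
    p_less = (null <= obs).mean(axis=0)   # unusually rare transitions
    return obs, p_more, p_less
```

Because the permutation preserves the number of maps in each cluster, the test remains valid for small or unevenly sized clusters, which is the advantage over the empirical formulae mentioned above.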
FEATURE 3, LARGE GUN POSITION, SHOWING MULTIPLE COMPARTMENTS, VIEW FACING SOUTH (with scale stick). - Naval Air Station Barbers Point, Anti-Aircraft Battery Complex-Large Gun Position, East of Coral Sea Road, northwest of Hamilton Road, Ewa, Honolulu County, HI
Peñaloza López, Yolanda Rebeca; Orozco Peña, Xóchitl Daisy; Pérez Ruiz, Santiago Jesús
2018-04-03
To evaluate central auditory processing disorder (CAPD) in patients with multiple sclerosis, emphasizing auditory laterality, by applying psychoacoustic tests, and to identify its relationship with functions rated on the Expanded Disability Status Scale (EDSS). Depression scales (HADS), the EDSS, and 9 psychoacoustic tests to study CAPD were applied to 26 individuals with multiple sclerosis and 26 controls. Correlation tests were performed between the EDSS and the psychoacoustic tests. Seven of the 9 psychoacoustic tests differed significantly (P<.05), for the right or left ear (14/19 explorations), with respect to controls. In dichotic digits there was a left-ear advantage, in contrast to the usual right-ear predominance. There was significant correlation between five psychoacoustic tests and specific EDSS functions. The left-ear advantage detected, interpreted as an expression of deficient influences of the corpus callosum and of attention in multiple sclerosis, should be investigated further. There was a correlation between psychoacoustic tests and specific EDSS functions. Copyright © 2018 Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. Publicado por Elsevier España, S.L.U. All rights reserved.
ERIC Educational Resources Information Center
Argüelles Álvarez, Irina
2013-01-01
The new requirement placed on students in tertiary settings in Spain to demonstrate a B1 or a B2 proficiency level of English, in accordance with the Common European Framework of Reference for Languages (CEFRL), has led most Spanish universities to develop a program of certification or accreditation of the required level. The first part of this…
NASA Astrophysics Data System (ADS)
Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii
2017-02-01
Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large-scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a "Big Data" problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with potential to apply the platform at a larger scale (e.g. country level) and with multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (28,100 km2, with 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex workflows of satellite data processing required for large-scale applications such as crop mapping. The study discusses strengths and weaknesses of the classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree, and random forest classifiers available in GEE.
Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications
NASA Technical Reports Server (NTRS)
Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.
2017-01-01
Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.
Trends in computer applications in science assessment
NASA Astrophysics Data System (ADS)
Kumar, David D.; Helgeson, Stanley L.
1995-03-01
Seven computer applications to science assessment are reviewed. Conventional test administration includes record keeping, grading, and managing test banks. Multiple-choice testing involves forced selection of an answer from a menu, whereas constructed-response testing involves options for students to present their answers within a set standard deviation. Adaptive testing attempts to individualize the test to minimize the number of items and time needed to assess a student's knowledge. Figural response testing assesses science proficiency in pictorial or graphic mode and requires the student to construct a mental image rather than selecting a response from a multiple-choice menu. Simulations have been found useful for performance assessment on a large-scale basis in part because they make it possible to independently specify different aspects of a real experiment. An emerging approach to performance assessment is solution pathway analysis, which permits the analysis of the steps a student takes in solving a problem. Virtually all computer-based testing systems improve the quality and efficiency of record keeping and data analysis.
Dretsch, Michael N; Silverberg, Noah D; Iverson, Grant L
2015-09-01
The extent to which multiple past concussions are associated with lingering symptoms or mental health problems in military service members is not well understood. The purpose of this study was to examine the association between lifetime concussion history, cognitive functioning, general health, and psychological health in a large sample of fit-for-duty U.S. Army soldiers preparing for deployment. Data on 458 active-duty soldiers were collected and analyzed. A computerized cognitive screening battery (CNS-Vital Signs(®)) was used to assess complex attention (CA), reaction time (RT), processing speed (PS), cognitive flexibility (CF), and memory. Health questionnaires included the Neurobehavioral Symptom Inventory (NSI), PTSD Checklist-Military Version (PCL-M), Zung Depression and Anxiety Scales (ZDS; ZAS), Perceived Stress Scale (PSS), Pittsburgh Sleep Quality Index (PSQI), Epworth Sleepiness Scale (ESS), and the Alcohol Use Disorders Identification Test (AUDIT). Soldiers with a history of multiple concussions (i.e., three or more concussions) had significantly greater post-concussive symptom scores compared with those with zero (d=1.83, large effect), one (d=0.64, medium effect), and two (d=0.64, medium effect) prior concussions. Although the group with three or more concussions also reported more traumatic stress symptoms, the results revealed that traumatic stress was a mediator between concussions and post-concussive symptom severity. There were no significant differences on neurocognitive testing across groups based on number of concussions. These results add to the accumulating evidence suggesting that most individuals recover from one or two prior concussions, but there is a greater risk for ongoing symptoms if one exceeds this number of injuries.
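The effect sizes quoted above (d = 1.83, d = 0.64) are standardized mean differences. A minimal sketch of Cohen's d with a pooled standard deviation, using made-up data rather than the study's scores:

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d: difference in means divided by the pooled
    sample standard deviation (ddof=1 in both groups)."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) +
                  (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)
```

By the usual conventions, d near 0.2 is a small effect, 0.5 medium, and 0.8 or above large, which is how the abstract labels its values.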
NASA Astrophysics Data System (ADS)
Udomsungworagul, A.; Charnsethikul, P.
2018-03-01
This article introduces a methodology for solving large-scale two-phase linear programming, with a case study of multi-time-period animal diet problems under uncertainty in both the nutrient content of raw materials and finished-product demand. Assumptions allowing multiple product formulas to be manufactured in the same time period and allowing raw-material and finished-product inventories to be held have been added. Dantzig-Wolfe decomposition, Benders decomposition, and column generation techniques have been combined and applied to solve the problem. The proposed procedure was programmed using VBA and the Solver tool in Microsoft Excel. A case study was used and tested in terms of efficiency and effectiveness trade-offs.
Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu
2016-01-01
Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range (<1 Hz). However, it is difficult to determine the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, the lfSSBRs were evoked in the triple network system and sensory-motor system, indicating that large scale networks can be modulated via frequency tagging. Furthermore, the inter- and intranetwork synchronizations as well as coherence were increased at the fundamental frequency and the first harmonic, rather than at other frequency bands, indicating a frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish attention conditions. This study provides insights into the advantage and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.
Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M; Foulds, Abigail L; Duell, Jessica; Sharma, Ravi
2015-01-01
Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. To assess the effect of community economic development efforts on neighborhood residents' perceptions on violence, safety, and economic benefits. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates.
NASA Astrophysics Data System (ADS)
Raghav, Anil N.; Kule, Ankita
2018-05-01
Large-scale magnetic clouds such as coronal mass ejections (CMEs) are fundamental drivers of space weather. The interaction of multiple CMEs in interplanetary space affects their dynamic evolution and geo-effectiveness. Complex and merged multiple magnetic clouds appear as the in situ signature of interacting CMEs. Alfvén waves are speculated to be one of the major possible energy exchange/dissipation mechanisms during the interaction. However, no such observational evidence has been found in the literature. Case studies of CME-CME collision events suggest that the magnetic and thermal energy of the CME is converted into kinetic energy. Moreover, the magnetic reconnection process is thought to be responsible for the merging of multiple magnetic clouds. Here, we present unambiguous evidence of sunward torsional Alfvén waves in the interaction region after the super-elastic collision of multiple CMEs. The Walén relation is used to confirm the presence of Alfvén waves in the interaction region of multiple CMEs/magnetic clouds. We conclude that Alfvén waves and magnetic reconnection are possible energy exchange/dissipation mechanisms during large-scale magnetic cloud collisions. This study has significant implications not only for CME-magnetosphere interactions but also for the interstellar medium, where interactions of large-scale magnetic clouds are possible.
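The Walén relation mentioned above compares velocity fluctuations with Alfvén-velocity fluctuations derived from the magnetic field; a regression slope with magnitude near 1 indicates Alfvénic fluctuations, and its sign indicates the propagation sense. The sketch below is a simplified stand-in for that test using synthetic SI-unit data, not the authors' analysis pipeline:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (SI)

def walen_slope(v, b, rho):
    """Regression slope between velocity fluctuations dV and
    Alfven-velocity fluctuations dV_A = dB / sqrt(mu0 * rho).
    |slope| close to 1 suggests Alfven waves; the sign indicates
    propagation parallel or anti-parallel to the mean field.
    v: (n, 3) m/s, b: (n, 3) tesla, rho: (n,) kg/m^3."""
    va = b / np.sqrt(MU0 * rho)[:, None]
    dv = v - v.mean(axis=0)
    dva = va - va.mean(axis=0)
    # least-squares slope through the origin over all components
    return (dv * dva).sum() / (dva * dva).sum()
```

For a purely Alfvénic synthetic interval (velocity fluctuations exactly equal to the Alfvén-velocity fluctuations plus a constant bulk flow), the slope is 1.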
ERIC Educational Resources Information Center
Arnold, Erik P.
2014-01-01
A multiple-case qualitative study of five school districts that had implemented various large-scale technology initiatives was conducted to describe what superintendents do to gain acceptance of those initiatives. The large-scale technology initiatives in the five participating districts included 1:1 District-Provided Device laptop and tablet…
Detecting anomalies in CMB maps: a new method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com
2015-10-01
Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_ℓm's with random coefficients, and they test the null hypothesis that the a_ℓm's are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
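The linear random-coefficient statistic described above can be sketched in a few lines: under the null hypothesis that the modes are independent, zero-mean Gaussians, the normalized combination is standard normal, so no mode (or anomaly) has to be hand-picked in advance. This is an illustrative reconstruction of the idea on a flat array of modes, not the authors' implementation:

```python
import numpy as np

def linear_random_stat(alm, var, rng):
    """Random-coefficient linear combination of the a_lm modes,
    normalized so that it is N(0, 1) under the null hypothesis
    that the modes are independent, zero-mean Gaussians with
    the given (m-independent) variances."""
    c = rng.standard_normal(alm.size)
    return (c * alm).sum() / np.sqrt((c ** 2 * var).sum())
```

Physical mode couplings correlate the a_ℓm's and inflate the tails of repeated draws of this statistic, which is what makes the combined test sensitive without an a posteriori choice of anomaly.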
Reynolds, Matthew R; Scheiber, Caroline; Hajovsky, Daniel B; Schwartz, Bryanna; Kaufman, Alan S
2015-01-01
The gender similarities hypothesis of J. S. Hyde (2005), based on large-scale reviews of studies, concludes that boys and girls are more alike than different on most psychological variables, including academic skills such as reading and math. Writing is an academic skill that may be an exception. The authors investigated gender differences in academic achievement using a large, nationally stratified sample of children and adolescents ranging in age from 7-19 years (N = 2,027). Achievement data were from the conormed sample for the Kaufman intelligence and achievement tests. Multiple-indicator, multiple-cause, and multigroup mean and covariance structure models were used to test for mean differences. Girls had higher latent reading ability and higher scores on a test of math computation, but the effect sizes were consistent with the gender similarities hypothesis. Conversely, girls scored higher on spelling and written expression, with effect sizes inconsistent with the gender similarities hypothesis. The findings remained the same after controlling for cognitive ability. Girls outperform boys on tasks of writing.
SDG and qualitative trend based model multiple scale validation
NASA Astrophysics Data System (ADS)
Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike
2017-09-01
Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with outputs of the simulation model at different scales. Finally, effectiveness is demonstrated by validating a reactor model.
Dudbridge, Frank; Koeleman, Bobby P C
2004-09-01
Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.
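The combined-evidence approach described above can be sketched as a truncated-product-style statistic (here, the sum of -log p over the k most significant tests) calibrated against a permutation null. The article's refinement, fitting extreme-value and beta distributions to that null sample once per marker panel so the calibration is reusable, is omitted from this sketch; the `stat_fn` signature and the surrogate p-values in the example are illustrative assumptions:

```python
import numpy as np

def topk_evidence(pvals, k):
    """Combined evidence: sum of -log p over the k smallest p-values."""
    return -np.log(np.sort(pvals)[:k]).sum()

def permutation_null(stat_fn, y, X, k, n_perm=500, seed=0):
    """Empirical null distribution of the top-k combined evidence
    under permutation of the phenotype labels. stat_fn(y, X) must
    return one p-value per marker; permuting y preserves the
    correlation structure among markers, which is why permutation
    (rather than independence-based theory) is needed here."""
    rng = np.random.default_rng(seed)
    null = np.empty(n_perm)
    for i in range(n_perm):
        null[i] = topk_evidence(stat_fn(rng.permutation(y), X), k)
    return null
```

An observed evidence value is then compared against this null (or, as in the article, against an extreme-value distribution fitted to it).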
Do large-scale assessments measure students' ability to integrate scientific knowledge?
NASA Astrophysics Data System (ADS)
Lee, Hee-Sun
2010-03-01
Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of student open-ended responses, open-ended items become a valid and reliable tool to assess students' knowledge integration ability.
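The Rasch Partial Credit Model used in the analysis above assigns each polytomous item a set of step difficulties; the probability of reaching score category x is a softmax over cumulative sums of (ability - step difficulty). A minimal sketch of the category-probability computation (not the estimation machinery used on the 8400-student dataset):

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Rasch Partial Credit Model: probabilities of score
    categories 0..m for ability theta and step difficulties
    deltas (length m). Category 0 corresponds to the empty sum."""
    steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    e = np.exp(steps - steps.max())  # numerically stabilized softmax
    return e / e.sum()
```

For a dichotomous item (one step), this reduces to the familiar Rasch logistic curve, which is why the same framework covers both the multiple-choice and partial-credit open-ended items.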
Multi-scale signed envelope inversion
NASA Astrophysics Data System (ADS)
Chen, Guo-Xin; Wu, Ru-Shan; Wang, Yu-Qing; Chen, Sheng-Chang
2018-06-01
Envelope inversion based on modulation signal mode was proposed to reconstruct large-scale structures of underground media. In order to solve the shortcomings of conventional envelope inversion, multi-scale envelope inversion was proposed using new envelope Fréchet derivative and multi-scale inversion strategy to invert strong contrast models. In multi-scale envelope inversion, amplitude demodulation was used to extract the low frequency information from envelope data. However, only to use amplitude demodulation method will cause the loss of wavefield polarity information, thus increasing the possibility of inversion to obtain multiple solutions. In this paper we proposed a new demodulation method which can contain both the amplitude and polarity information of the envelope data. Then we introduced this demodulation method into multi-scale envelope inversion, and proposed a new misfit functional: multi-scale signed envelope inversion. In the numerical tests, we applied the new inversion method to the salt layer model and SEG/EAGE 2-D Salt model using low-cut source (frequency components below 4 Hz were truncated). The results of numerical test demonstrated the effectiveness of this method.
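The amplitude demodulation that envelope inversion builds on is conventionally computed as the magnitude of the analytic signal. The sketch below shows that unsigned demodulation (an FFT-based Hilbert transform, equivalent in spirit to scipy.signal.hilbert); it is exactly the step that discards wavefield polarity, which the signed variant proposed in the abstract is designed to keep. The signed demodulation itself is not reproduced here:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal of a real 1-D array."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def envelope(x):
    """Conventional (unsigned) envelope: |analytic signal|.
    Low frequencies of this envelope carry structure below the
    source band, but the waveform's polarity is lost."""
    return np.abs(analytic_signal(x))
```

For a carrier modulated by a slowly varying positive amplitude, the envelope recovers the modulation even when the carrier band is missing from the source spectrum.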
Christopher P. Bloch; Michael R. Willi
2006-01-01
Large-scale natural disturbances, such as hurricanes, can have profound effects on animal populations. Nonetheless, generalizations about the effects of disturbance are elusive, and few studies consider long-term responses of a single population or community to multiple large-scale disturbance events. In the last 20 y, two major hurricanes (Hugo and Georges) have struck...
R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove
2016-01-01
The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Edward Geoffrey; Shadid, John N.; Cyr, Eric C.
2018-05-01
Here, we report that multiple physical time-scales can arise in electromagnetic simulations when dissipative effects are introduced through boundary conditions, when currents follow external time-scales, and when material parameters vary spatially. In such scenarios, the time-scales of interest may be much slower than the fastest time-scales supported by the Maxwell equations, therefore making implicit time integration an efficient approach. The use of implicit temporal discretizations results in linear systems in which fast time-scales, which severely constrain the stability of an explicit method, can manifest as so-called stiff modes. This study proposes a new block preconditioner for structure-preserving (also termed physics-compatible) discretizations of the Maxwell equations in first-order form. The intent of the preconditioner is to enable the efficient solution of multiple-time-scale Maxwell-type systems. An additional benefit of the developed preconditioner is that it requires only a traditional multigrid method for its subsolves and compares well against alternative approaches that rely on specialized edge-based multigrid routines that may not be readily available. Lastly, results demonstrate parallel scalability at large electromagnetic wave CFL numbers on a variety of test problems.
Implementation of Fiber Optic Sensing System on Sandwich Composite Cylinder Buckling Test
NASA Technical Reports Server (NTRS)
Pena, Francisco; Richards, W. Lance; Parker, Allen R.; Piazza, Anthony; Schultz, Marc R.; Rudd, Michelle T.; Gardner, Nathaniel W.; Hilburger, Mark W.
2018-01-01
The National Aeronautics and Space Administration (NASA) Engineering and Safety Center Shell Buckling Knockdown Factor Project is a multicenter project tasked with developing new analysis-based shell buckling design guidelines and design factors (i.e., knockdown factors) through high-fidelity buckling simulations and advanced test technologies. To validate these new buckling knockdown factors for future launch vehicles, the Shell Buckling Knockdown Factor Project is carrying out structural testing on a series of large-scale metallic and composite cylindrical shells at the NASA Marshall Space Flight Center (Marshall Space Flight Center, Alabama). A fiber optic sensor system was used to measure strain on a large-scale sandwich composite cylinder that was tested under multiple axial compressive loads of up to more than 850,000 lb and equivalent bending loads of over 22 million in-lb. During the structural testing of the composite cylinder, strain data were collected from optical cables containing distributed fiber Bragg gratings using a custom fiber optic sensor system interrogator developed at the NASA Armstrong Flight Research Center. A total of 16 fiber-optic strands, each containing nearly 1,000 strain-measuring fiber Bragg gratings, were installed on the inner and outer cylinder surfaces to monitor the test article's global structural response through high-density real-time and post-test strain measurements. The distributed sensing system provided evidence of local epoxy failure at the attachment-ring-to-barrel interface that would not have been detected with conventional instrumentation. Results from the fiber optic sensor system were used to further refine and validate structural models for buckling of the large-scale composite structures.
This paper discusses the techniques employed for real-time structural monitoring of the composite cylinder for structural load introduction and distributed bending-strain measurements over a large section of the cylinder by utilizing unique sensing capabilities of fiber optic sensors.
Kernel Machine SNP-set Testing under Multiple Candidate Kernels
Wu, Michael C.; Maity, Arnab; Lee, Seunggeun; Simmons, Elizabeth M.; Harmon, Quaker E.; Lin, Xinyi; Engel, Stephanie M.; Molldrem, Jeffrey J.; Armistead, Paul M.
2013-01-01
Joint testing for the cumulative effect of multiple single nucleotide polymorphisms grouped on the basis of prior biological knowledge has become a popular and powerful strategy for the analysis of large scale genetic association studies. The kernel machine (KM) testing framework is a useful approach that has been proposed for testing associations between multiple genetic variants and many different types of complex traits by comparing pairwise similarity in phenotype between subjects to pairwise similarity in genotype, with similarity in genotype defined via a kernel function. An advantage of the KM framework is its flexibility: choosing different kernel functions allows for different assumptions concerning the underlying model and can allow for improved power. In practice, it is difficult to know which kernel to use a priori since this depends on the unknown underlying trait architecture and selecting the kernel which gives the lowest p-value can lead to inflated type I error. Therefore, we propose practical strategies for KM testing when multiple candidate kernels are present based on constructing composite kernels and based on efficient perturbation procedures. We demonstrate through simulations and real data applications that the procedures protect the type I error rate and can lead to substantially improved power over poor choices of kernels and only modest differences in power versus using the best candidate kernel. PMID:23471868
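The kernel machine framework above compares phenotype similarity with genotype similarity through a score statistic of the form Q = r'Kr. The sketch below uses a linear kernel, no covariates, and a permutation p-value as a stand-in for the mixture-of-chi-square analytic null (and for the perturbation procedures the paper proposes for multiple candidate kernels); it illustrates the statistic, not the paper's software:

```python
import numpy as np

def linear_kernel(G):
    """Linear kernel: pairwise genotype similarity K = G G^T."""
    return G @ G.T

def km_score_stat(y, K):
    """Kernel machine score statistic for a continuous trait with
    no covariates: Q = r^T K r, with r the centered phenotype."""
    r = y - y.mean()
    return r @ K @ r

def km_perm_pvalue(y, K, n_perm=999, seed=0):
    """Permutation p-value for Q: permuting phenotypes breaks the
    genotype-phenotype link while preserving the kernel."""
    rng = np.random.default_rng(seed)
    q_obs = km_score_stat(y, K)
    q_null = np.array([km_score_stat(rng.permutation(y), K)
                       for _ in range(n_perm)])
    return (1 + (q_null >= q_obs).sum()) / (1 + n_perm)
```

A composite kernel in the paper's sense would be a weighted sum of candidate kernels (e.g. 0.5 * K_linear + 0.5 * K_other), which slots directly into the same score statistic.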
Multiple well-shutdown tests and site-scale flow simulation in fractured rocks
Tiedeman, Claire; Lacombe, Pierre J.; Goode, Daniel J.
2010-01-01
A new method was developed for conducting aquifer tests in fractured-rock flow systems that have a pump-and-treat (P&T) operation for containing and removing groundwater contaminants. The method involves temporary shutdown of individual pumps in wells of the P&T system. Conducting aquifer tests in this manner has several advantages, including (1) no additional contaminated water is withdrawn, and (2) hydraulic containment of contaminants remains largely intact because pumping continues at most wells. The well-shutdown test method was applied at the former Naval Air Warfare Center (NAWC), West Trenton, New Jersey, where a P&T operation is designed to contain and remove trichloroethene and its daughter products in the dipping fractured sedimentary rocks underlying the site. The detailed site-scale subsurface geologic stratigraphy, a three-dimensional MODFLOW model, and inverse methods in UCODE_2005 were used to analyze the shutdown tests. In the model, a deterministic method was used for representing the highly heterogeneous hydraulic conductivity distribution and simulations were conducted using an equivalent porous media method. This approach was very successful for simulating the shutdown tests, contrary to a common perception that flow in fractured rocks must be simulated using a stochastic or discrete fracture representation of heterogeneity. Use of inverse methods to simultaneously calibrate the model to the multiple shutdown tests was integral to the effectiveness of the approach.
Patterns and controlling factors of species diversity in the Arctic Ocean
Yasuhara, Moriaki; Hunt, Gene; van Dijken, Gert; Arrigo, Kevin R.; Cronin, Thomas M.; Wollenburg, Jutta E.
2012-01-01
Aim: The Arctic Ocean is one of the last near-pristine regions on Earth, and, although human activities are expected to impact Arctic ecosystems, we know very little about baseline patterns of Arctic Ocean biodiversity. This paper aims to describe Arctic Ocean-wide patterns of benthic biodiversity and to explore factors related to the large-scale species diversity patterns. Location: Arctic Ocean. Methods: We used large ostracode and foraminiferal datasets to describe the biodiversity patterns and applied comprehensive ecological modelling to test the degree to which these patterns are potentially governed by environmental factors, such as temperature, productivity, seasonality, ice cover and others. To test environmental control of the observed diversity patterns, subsets of samples for which all environmental parameters were available were analysed with multiple regression and model averaging. Results: Well-known negative latitudinal species diversity gradients (LSDGs) were found in metazoan Ostracoda, but the LSDGs were unimodal with an intermediate maximum with respect to latitude in protozoan foraminifera. Depth species diversity gradients were unimodal, with peaks in diversity shallower than those in other oceans. Our modelling results showed that several factors are significant predictors of diversity, but the significant predictors differed among shallow-marine ostracodes, deep-sea ostracodes and deep-sea foraminifera. Main conclusions: On the basis of these Arctic Ocean-wide comprehensive datasets, we document large-scale diversity patterns with respect to latitude and depth. Our modelling results suggest that the underlying mechanisms causing these species diversity patterns are unexpectedly complex.
The environmental parameters of temperature, surface productivity, seasonality of productivity, salinity and ice cover can all play a role in shaping large-scale diversity patterns, but their relative importance may depend on the ecological preferences of taxa and the oceanographic context of regions. These results suggest that a multiplicity of variables appears to be related to community structure in this system.
Improvement of individual camouflage through background choice in ground-nesting birds.
Stevens, Martin; Troscianko, Jolyon; Wilson-Aggarwal, Jared K; Spottiswoode, Claire N
2017-09-01
Animal camouflage is a longstanding example of adaptation. Much research has tested how camouflage prevents detection and recognition, largely focusing on changes to an animal's own appearance over evolution. However, animals could also substantially alter their camouflage by behaviourally choosing appropriate substrates. Recent studies suggest that individuals from several animal taxa could select backgrounds or positions to improve concealment. Here, we test whether individual wild animals choose backgrounds in complex environments, and whether this improves camouflage against predator vision. We studied nest site selection by nine species of ground-nesting birds (nightjars, plovers and coursers) in Zambia, and used image analysis and vision modeling to quantify egg and plumage camouflage to predator vision. Individual birds chose backgrounds that enhanced their camouflage, being better matched to their chosen backgrounds than to other potential backgrounds with respect to multiple aspects of camouflage. This occurred at all three spatial scales tested (a few cm and five meters from the nest, and compared to other sites chosen by conspecifics), and was the case for the eggs of all bird groups studied, and for adult nightjar plumage. Thus, individual wild animals improve their camouflage through active background choice, with choices highly refined across multiple spatial scales.
ERIC Educational Resources Information Center
Akbiyik, Derya Iren; Sumbuloglu, Vildan; Guney, Zafer; Armutlu, Kadriye; Korkmaz, Nilufer; Keser, Ilke; Yuksel, Muazzez Merve; Karabudak, Rana
2009-01-01
The aim of the study was to translate and test the reliability and validity of the Leeds Multiple Sclerosis Quality of Life Scale (LMSQoL) in Turkish patients with multiple sclerosis (MS). Demographic data of MS patients who were registered at and followed up by a university hospital were recorded. The LMSQoL and Turkish Quality of Life…
A two-stage design for multiple testing in large-scale association studies.
Wen, Shu-Hui; Tzeng, Jung-Ying; Kao, Jau-Tsuen; Hsiao, Chuhsing Kate
2006-01-01
Modern association studies often involve a large number of markers and hence may encounter the problem of testing multiple hypotheses. Traditional procedures are usually over-conservative and have low power to detect mild genetic effects. From the design perspective, we propose a two-stage selection procedure to address this concern. Our main principle is to reduce the total number of tests by removing clearly unassociated markers in the first-stage test. Next, conditional on the findings of the first stage, which uses a less stringent nominal level, a more conservative test is conducted in the second stage using the augmented data and the data from the first stage. Previous studies have suggested using independent samples to avoid inflated errors. However, we found that, after accounting for the dependence between these two samples, the true discovery rate increases substantially. In addition, the cost of genotyping can be greatly reduced via this approach. Results from a study of hypertriglyceridemia and simulations suggest the two-stage method has a higher overall true positive rate (TPR) with a controlled overall false positive rate (FPR) when compared with single-stage approaches. We also report the analytical form of its overall FPR, which may be useful in guiding study design to achieve a high TPR while retaining the desired FPR.
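The two-stage principle described above, a liberal first-stage screen followed by a stricter test on the augmented data, can be sketched as follows. The z-tests, thresholds, and simulated data are illustrative assumptions, not the authors' exact procedure (which accounts analytically for the dependence between the two stages).

```python
import math
import numpy as np

def z_pvalue(x):
    # Two-sided p-value for a one-sample z-test of zero mean, unit variance.
    z = x.mean() * math.sqrt(len(x))
    return math.erfc(abs(z) / math.sqrt(2))

def two_stage(stage1, stage2, alpha1=0.1, alpha2=0.001):
    # Stage 1: a liberal screen on the first sample removes clearly
    # unassociated markers.  Stage 2: a stricter test on the augmented
    # (stage-1 + stage-2) data for the survivors.
    hits = []
    for j in range(stage1.shape[1]):
        if z_pvalue(stage1[:, j]) < alpha1:
            combined = np.concatenate([stage1[:, j], stage2[:, j]])
            if z_pvalue(combined) < alpha2:
                hits.append(j)
    return hits

rng = np.random.default_rng(0)
m = 200                                  # markers; only marker 0 carries an effect
s1 = rng.normal(size=(100, m))
s2 = rng.normal(size=(300, m))
s1[:, 0] += 0.5
s2[:, 0] += 0.5
hits = two_stage(s1, s2)
```

Note that stage 2 reuses the stage-1 observations for the surviving markers; this reuse of data across stages is exactly the dependence the paper shows can raise the true discovery rate.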
Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.
2012-01-01
Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
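An N-mixture model of the kind evaluated above treats repeated counts at a site as binomial draws from a latent Poisson abundance. The sketch below evaluates the marginal likelihood directly by summing out the latent abundance; the simulation parameters are illustrative assumptions, and the study itself fits the model in a Bayesian framework with detection covariates rather than by this direct evaluation.

```python
import math
import random

def nmix_loglik(counts, lam, p, n_max=60):
    # N-mixture likelihood: latent site abundance N_i ~ Poisson(lam),
    # repeated counts y_it | N_i ~ Binomial(N_i, p); N_i is summed out.
    ll = 0.0
    for y in counts:
        site = 0.0
        for n in range(max(y), n_max + 1):
            pois = math.exp(-lam) * lam ** n / math.factorial(n)
            binom = math.prod(
                math.comb(n, yt) * p ** yt * (1 - p) ** (n - yt) for yt in y
            )
            site += pois * binom
        ll += math.log(site)
    return ll

# Simulate 50 sites x 3 repeat visits with abundance lam = 5, detection p = 0.7
rng = random.Random(0)

def rpois(lam):
    # Knuth's Poisson sampler (adequate for small lam)
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= L:
            return k
        k += 1

counts = []
for _ in range(50):
    N = rpois(5.0)
    counts.append([sum(rng.random() < 0.7 for _ in range(N)) for _ in range(3)])
```

With enough sites and repeat visits, the likelihood surface separates abundance from detection probability, which is what lets the study report both quantities from raw counts.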
Recording large-scale neuronal ensembles with silicon probes in the anesthetized rat.
Schjetnan, Andrea Gomez Palacio; Luczak, Artur
2011-10-19
Large-scale electrophysiological recordings from neuronal ensembles offer the opportunity to investigate how the brain orchestrates the wide variety of behaviors from the spiking activity of its neurons. One of the most effective methods to monitor spiking activity from a large number of neurons in multiple local neuronal circuits simultaneously is by using silicon electrode arrays. Action potentials produce large transmembrane voltage changes in the vicinity of cell somata. These output signals can be measured by placing a conductor in close proximity of a neuron. If there are many active (spiking) neurons in the vicinity of the tip, the electrode records a combined signal from all of them, where the contribution of a single neuron is weighted by its 'electrical distance'. Silicon probes are ideal recording electrodes to monitor multiple neurons because of a large number of recording sites (64+) and a small volume. Furthermore, multiple sites can be arranged over a distance of millimeters, thus allowing for the simultaneous recording of neuronal activity in the various cortical layers or in multiple cortical columns (Fig. 1). Importantly, the geometrically precise distribution of the recording sites also allows for the determination of the spatial relationship of the isolated single neurons. Here, we describe an acute, large-scale neuronal recording from the left and right forelimb somatosensory cortex simultaneously in an anesthetized rat with silicon probes (Fig. 2).
Multi-Column Experimental Test Bed Using CaSDB MOF for Xe/Kr Separation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welty, Amy Keil; Greenhalgh, Mitchell Randy; Garn, Troy Gerry
Processing of spent nuclear fuel produces off-gas from which several volatile radioactive components must be separated for further treatment or storage. As part of the Off-gas Sigma Team, parallel research at INL and PNNL has produced several promising sorbents for the selective capture of xenon and krypton from these off-gas streams. In order to design full-scale treatment systems, sorbents that are promising on a laboratory scale must be proven under process conditions to be considered for pilot and then full-scale use. To that end, a bench-scale multi-column system with the capability to test multiple sorbents was designed and constructed at INL. This report details bench-scale testing of CaSDB MOF, produced at PNNL, and compares the results to those reported last year using INL engineered sorbents. Two multi-column tests were performed with the CaSDB MOF installed in the first column, followed by HZ-PAN installed in the second column. The CaSDB MOF column was placed in a Stirling cryocooler while the cryostat was employed for the HZ-PAN column. Test temperatures of 253 K and 191 K were selected for the first column while the second column was held at 191 K for both tests. Calibrated volume sample bombs were utilized for gas stream analyses. At the conclusion of each test, samples were collected from each column and analyzed for gas composition. While CaSDB MOF does appear to have good capacity for Xe, the short time to initial breakthrough would make design of a continuous adsorption/desorption cycle difficult, requiring either very large columns or a large number of smaller columns. Because of the tenacity with which Xe and Kr adhere to the material once adsorbed, this CaSDB MOF may be more suitable for use as a long-term storage solution. Further testing is recommended to determine if CaSDB MOF is suitable for this purpose.
Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming
ERIC Educational Resources Information Center
Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.
2013-01-01
Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…
DIALIGN P: fast pair-wise and multiple sequence alignment using parallel processors.
Schmollinger, Martin; Nieselt, Kay; Kaufmann, Michael; Morgenstern, Burkhard
2004-09-09
Parallel computing is frequently used to speed up computationally expensive tasks in Bioinformatics. Herein, a parallel version of the multi-alignment program DIALIGN is introduced. We propose two ways of dividing the program into independent sub-routines that can be run on different processors: (a) pair-wise sequence alignments that are used as a first step to multiple alignment account for most of the CPU time in DIALIGN. Since alignments of different sequence pairs are completely independent of each other, they can be distributed to multiple processors without any effect on the resulting output alignments. (b) For alignments of large genomic sequences, we use a heuristic by splitting up sequences into sub-sequences based on a previously introduced anchored alignment procedure. For our test sequences, this combined approach reduces the program running time of DIALIGN by up to 97%. By distributing sub-routines to multiple processors, the running time of DIALIGN can be greatly reduced. With these improvements, it is possible to apply the program in large-scale genomics and proteomics projects that were previously beyond its scope.
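The first parallelization strategy, distributing independent pairwise jobs to multiple processors, can be sketched with Python's multiprocessing. The edit-distance scorer below is a toy stand-in for DIALIGN's segment-based pairwise alignment, an assumption made only to keep the example self-contained; the point is that the pairwise jobs share no state, so farming them out cannot change the results.

```python
from itertools import combinations
from multiprocessing import Pool

def edit_distance(a, b):
    # Toy stand-in for a pairwise alignment score (DIALIGN itself scores
    # gap-free segment pairs, not edit operations).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def align_pair(pair):
    (i, a), (j, b) = pair
    return (i, j), edit_distance(a, b)

def all_pairs_parallel(seqs, workers=4):
    # Each of the n*(n-1)/2 pairwise jobs is independent, so they can be
    # distributed to worker processes without affecting the output.
    pairs = list(combinations(enumerate(seqs), 2))
    with Pool(workers) as pool:
        return dict(pool.map(align_pair, pairs))

if __name__ == "__main__":
    seqs = ["GATTACA", "GATTTACA", "ACGT", "ACGGT"]
    print(all_pairs_parallel(seqs))
```

The `__main__` guard is required because multiprocessing re-imports the module in worker processes on some platforms.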
2009-06-01
… simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based … multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.
Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.
2009-01-01
A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based, Mind Research Network (MRN), database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147
Geller, Ruth; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi
2015-01-01
Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions of violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Conclusions. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates. PMID:26273310
ERIC Educational Resources Information Center
van Barneveld, Christina; Brinson, Karieann
2017-01-01
The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…
Critical Issues in Large-Scale Assessment: A Resource Guide.
ERIC Educational Resources Information Center
Redfield, Doris
The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…
NASA Astrophysics Data System (ADS)
Chen, Xiao; Dong, Gang; Jiang, Hua
2017-04-01
The instabilities of a three-dimensional sinusoidally premixed flame induced by an incident shock wave with Mach = 1.7 and its reshock waves were studied by using the Navier-Stokes (NS) equations with a single-step chemical reaction and a high-resolution, 9th-order weighted essentially non-oscillatory scheme. The computational results were validated by the grid independence test and the experimental results in the literature. The computational results show that after the passage of the incident shock wave the flame interface develops in a symmetric structure accompanied by large-scale transverse vortex structures. After the interactions by successive reshock waves, the flame interface is gradually destabilized and broken up, and the large-scale vortex structures are gradually transformed into small-scale vortex structures. The small-scale vortices tend to be isotropic later. The results also reveal that the evolution of the flame interface is affected by both the mixing process and the chemical reaction. In order to identify the relationship between the mixing and the chemical reaction, a dimensionless parameter, η, defined as the ratio of the mixing time scale to the chemical reaction time scale, is introduced. It is found that at each interaction stage the effect of chemical reaction is enhanced with time. The enhanced effect of chemical reaction at the interaction stage by the incident shock wave is greater than that at the interaction stages by reshock waves. The result suggests that the parameter η can reasonably characterize the features of flame interface development induced by the multiple shock waves.
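The dimensionless parameter can be written out explicitly; the time-scale symbols below are assumed names, since the abstract defines η only in words:

```latex
\eta = \frac{\tau_{\mathrm{mix}}}{\tau_{\mathrm{chem}}}
```

so a larger η means the mixing time scale is long relative to the chemical time scale, i.e., the chemistry is comparatively fast, which is consistent with the reported strengthening of reaction effects over time at each interaction stage.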
Bioinspired legged-robot based on large deformation of flexible skeleton.
Mayyas, Mohammad
2014-11-11
In this article we present STARbot, a bioinspired legged robot capable of multiple locomotion modalities by using large deformation of its skeleton. We construct STARbot by using origami-style folding of flexible laminates. The long-term goal is to provide a robotic platform with maximum mobility on multiple surfaces. This paper particularly studies the quasistatic model of STARbot's leg under different conditions. We describe the large elastic deformation of a leg under external force, payload, and friction by using a set of non-dimensional, nonlinear approximate equations. We developed a test mechanism that models the motion of a leg in STARbot. We augmented several foot shapes and then tested them on soft to rough grounds. Both simulation and experimental findings were in good agreement. We utilized the model to develop several scales of tri- and quad-STARbot. We demonstrated the capability of these robots to locomote by combining their leg deformations with their foot motions. The combination provided a design platform for an active suspension STARbot with controlled foot locomotion. This included the ability of STARbot to change size, run over obstacles, walk and slide. Furthermore, in this paper we discuss a cost-effective method for manufacturing and producing STARbot.
Goldstein, Elizabeth; Farquhar, Marybeth; Crofton, Christine; Darby, Charles; Garfinkel, Steven
2005-12-01
To describe the developmental process for the CAHPS Hospital Survey. A pilot was conducted in three states with 19,720 hospital discharges. A rigorous, multi-step process was used to develop the CAHPS Hospital Survey. It included a public call for measures, multiple Federal Register notices soliciting public input, a review of the relevant literature, meetings with hospitals, consumers and survey vendors, cognitive interviews with consumers, a large-scale pilot test in three states, consumer testing, and numerous small-scale field tests. The current version of the CAHPS Hospital Survey has survey items in seven domains, two overall ratings of the hospital and five items used for adjusting for the mix of patients across hospitals and for analytical purposes. The CAHPS Hospital Survey is a core set of questions that can be administered as a stand-alone questionnaire or combined with a broader set of hospital specific items.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maurer, Simon A.; Clin, Lucien; Ochsenfeld, Christian, E-mail: christian.ochsenfeld@uni-muenchen.de
2014-06-14
Our recently developed QQR-type integral screening is introduced in our Cholesky-decomposed pseudo-densities Møller-Plesset perturbation theory of second order (CDD-MP2) method. We use the resolution-of-the-identity (RI) approximation in combination with efficient integral transformations employing sparse matrix multiplications. The RI-CDD-MP2 method shows an asymptotic cubic scaling behavior with system size and a small prefactor that results in an early crossover to conventional methods for both small and large basis sets. We also explore the use of local fitting approximations, which allow us to further reduce the scaling behavior for very large systems. The reliability of our method is demonstrated on test sets for interaction and reaction energies of medium-sized systems and on a diverse selection from our own benchmark set for total energies of larger systems. Timings on DNA systems show that fast calculations for systems with more than 500 atoms are feasible using a single processor core. Parallelization extends the range of accessible system sizes on one computing node with multiple cores to more than 1000 atoms in a double-zeta basis and more than 500 atoms in a triple-zeta basis.
Planning and executing complex large-scale exercises.
McCormick, Lisa C; Hites, Lisle; Wakelee, Jessica F; Rucks, Andrew C; Ginter, Peter M
2014-01-01
Increasingly, public health departments are designing and engaging in complex operations-based full-scale exercises to test multiple public health preparedness response functions. The Department of Homeland Security's Homeland Security Exercise and Evaluation Program (HSEEP) supplies benchmark guidelines that provide a framework for both the design and the evaluation of drills and exercises; however, the HSEEP framework does not seem to have been designed to manage the development and evaluation of multiple, operations-based, parallel exercises combined into 1 complex large-scale event. Lessons learned from the planning of the Mississippi State Department of Health Emergency Support Function-8 involvement in National Level Exercise 2011 were used to develop an expanded exercise planning model that is HSEEP compliant but accounts for increased exercise complexity and is more functional for public health. The Expanded HSEEP (E-HSEEP) model was developed through changes in the HSEEP exercise planning process in areas of Exercise Plan, Controller/Evaluator Handbook, Evaluation Plan, and After Action Report and Improvement Plan development. The E-HSEEP model was tested and refined during the planning and evaluation of Mississippi's State-level Emergency Support Function-8 exercises in 2012 and 2013. As a result of using the E-HSEEP model, Mississippi State Department of Health was able to capture strengths, lessons learned, and areas for improvement, and identify microlevel issues that may have been missed using the traditional HSEEP framework. The South Central Preparedness and Emergency Response Learning Center is working to create an Excel-based E-HSEEP tool that will allow practice partners to build a database to track corrective actions and conduct many different types of analyses and comparisons.
Runaway of energetic test ions in a toroidal plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eilerman, S., E-mail: eilerman@wisc.edu; Anderson, J. K.; Sarff, J. S.
2015-02-15
Ion runaway in the presence of a large-scale, reconnection-driven electric field has been conclusively measured in the Madison Symmetric Torus reversed-field pinch (RFP). Measurements of the acceleration of a beam of fast ions agree well with test particle and Fokker-Planck modeling of the runaway process. However, the runaway mechanism does not explain all measured ion heating in the RFP, particularly previous measurements of strong perpendicular heating. It is likely that multiple energization mechanisms occur simultaneously and with differing significance for magnetically coupled thermal ions and magnetically decoupled tail and beam ions.
Seismic behavior of outrigger truss-wall shear connections using multiple steel angles
NASA Astrophysics Data System (ADS)
Li, Xian; Wang, Wei; Lü, Henglin; Zhang, Guangchang
2016-06-01
An experimental investigation on the seismic behavior of a type of outrigger truss-reinforced concrete wall shear connection using multiple steel angles is presented. Six large-scale shear connection models, which involved a portion of reinforced concrete wall and a shear tab welded onto a steel endplate with three steel angles, were constructed and tested under combined actions of cyclic axial load and eccentric shear. The effects of embedment lengths of steel angles, wall boundary elements, types of anchor plates, and thicknesses of endplates were investigated. The test results indicate that properly detailed connections exhibit desirable seismic behavior and fail due to the ductile fracture of steel angles. Wall boundary elements provide beneficial confinement to the concrete surrounding steel angles and thus increase the strength and stiffness of connections. Connections using whole anchor plates are prone to suffer concrete pry-out failure while connections with thin endplates have a relatively low strength and fail due to large inelastic deformations of the endplates. The current design equations proposed by Chinese Standard 04G362 and Code GB50011 significantly underestimate the capacities of the connection models. A revised design method to account for the influence of previously mentioned test parameters was developed.
Photometry of icy satellites: How important is multiple scattering in diluting shadows?
NASA Technical Reports Server (NTRS)
Buratti, B.; Veverka, J.
1984-01-01
Voyager observations have shown that the photometric properties of icy satellites are influenced significantly by large-scale roughness elements on their surfaces. While recent progress has been made in treating the photometric effects of macroscopic roughness, even the most complete models do not fully account for the effects of multiple scattering. Multiple scattering dilutes shadows caused by large-scale features, yet for any specific model it is difficult to calculate the amount of dilution as a function of albedo. Accordingly, laboratory measurements were undertaken using the Cornell Goniometer to evaluate the magnitude of the effect.
He, Xinhua; Hu, Wenfa
2014-01-01
This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in a large-scale disaster-affected area. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; that the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; that, depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes; and that emergency rescue services queue at multiple rescue-demand locations. This emergency system is modeled as a minimal queuing-response-time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
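The abstract's minimal queuing-response-time objective can be illustrated with a much-simplified toy: a minimal sketch, assuming each candidate depot behaves as an independent M/M/1 queue (the paper's multi-echelon, multi-route model is far richer; the function names and rates below are hypothetical):

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time W = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

def best_depot(arrival_rate, service_rates):
    """Pick the depot index that minimizes mean response time for one demand stream."""
    times = [mm1_response_time(arrival_rate, mu) for mu in service_rates]
    return min(range(len(times)), key=times.__getitem__)

# 4 rescue requests/hour; candidate depots serve 5, 6, or 10 requests/hour
print(best_depot(4.0, [5.0, 6.0, 10.0]))  # 2 (the fastest server wins)
```

A genetic algorithm, as used in the paper, would search over assignments of many demand locations to many servers with such response times inside the fitness function.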
Uszynski, Marcin Kacper; Purtill, Helen; Donnelly, Alan; Coote, Susan
2016-07-01
This study aimed firstly to investigate the feasibility of the study protocol and outcome measures, secondly to obtain data to inform the power calculations for a larger randomised controlled trial, and finally to investigate whether whole-body vibration (WBV) is more effective than standard exercises (EXE) of the same duration and intensity in people with multiple sclerosis (PwMS). Randomised controlled feasibility study. Outpatient MS centre. Twenty-seven PwMS (mean (SD) age 48.1 (11.2)) with minimal gait impairments. Twelve weeks of WBV or standard EXE, three times weekly. Participants were assessed with isokinetic muscle strength, vibration threshold, the Timed Up and Go test (TUG), the Mini-BESTest (MBT), the 6-Minute Walk Test (6MWT), the Multiple Sclerosis Impact Scale 29 (MSIS-29), the Modified Fatigue Impact Scale (MFIS) and a Verbal Analogue Scale for sensation (VAS) before and after the 12-week intervention. The WBV intervention was found feasible, with a low drop-out rate (11.1%) and high compliance (90%). Data suggest that a sample of 52 in each group would be sufficient to detect a moderate effect size on the 6-minute walk test, with 80% power and 5% significance. Large effect sizes in favour of standard exercise were found for vibration threshold at the 5th metatarsophalangeal joint and heel (P=0.014, r=0.5 and P=0.005, r=0.56, respectively). No between-group differences were found for muscle strength, balance or gait (P>0.05). Data suggest that the protocol is feasible, and there were no adverse effects. A trial including 120 people would be needed to detect an effect on walking endurance. © The Author(s) 2015.
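The reported requirement of 52 participants per group can be reproduced with the standard two-sample normal-approximation sample-size formula, assuming a standardized effect size of about d = 0.55 (the abstract says only "moderate"; the exact d is my assumption):

```python
import math
from statistics import NormalDist

def sample_size_per_group(d, power=0.80, alpha=0.05):
    """Two-sample comparison, normal approximation:
    n per group = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2, rounded up."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # 1.96 for alpha = 0.05, two-sided
    z_b = z.inv_cdf(power)           # 0.84 for 80% power
    return math.ceil(2 * (z_a + z_b) ** 2 / d ** 2)

print(sample_size_per_group(0.55))  # 52
```

The small-sample t-distribution correction would add a participant or two, which is why power-analysis software can give slightly larger numbers.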
NASA Technical Reports Server (NTRS)
Hill, Gerald M.; Evans, Richard K.
2009-01-01
A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design that utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.
Evaluating scaling models in biology using hierarchical Bayesian approaches
Price, Charles A; Ogle, Kiona; White, Ethan P; Weitz, Joshua S
2009-01-01
Theoretical models for allometric relationships between organismal form and function are typically tested by comparing a single predicted relationship with empirical data. Several prominent models, however, predict more than one allometric relationship, and comparisons among alternative models have not taken this into account. Here we evaluate several different scaling models of plant morphology within a hierarchical Bayesian framework that simultaneously fits multiple scaling relationships to three large allometric datasets. The scaling models include: inflexible universal models derived from biophysical assumptions (e.g. elastic similarity or fractal networks), a flexible variation of a fractal network model, and a highly flexible model constrained only by basic algebraic relationships. We demonstrate that variation in intraspecific allometric scaling exponents is inconsistent with the universal models, and that more flexible approaches that allow for biological variability at the species level outperform universal models, even when accounting for relative increases in model complexity. PMID:19453621
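Each individual scaling relationship of the kind the paper fits jointly has the power-law form y = a·x^b, conventionally estimated by ordinary least squares on log-transformed axes. A minimal single-relationship sketch (the paper's hierarchical Bayesian machinery fits several such relationships simultaneously, with species-level variability; the data here are synthetic):

```python
import math

def fit_allometry(x, y):
    """OLS fit of log y = log a + b * log x; returns (a, b)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# An exact power law y = 2 * x^0.75 is recovered
x = [1.0, 2.0, 4.0, 8.0, 16.0]
y = [2.0 * v ** 0.75 for v in x]
a, b = fit_allometry(x, y)
print(round(a, 6), round(b, 6))  # 2.0 0.75
```

A "universal" model such as elastic similarity fixes b in advance (e.g. 3/4 or 2/3), whereas the flexible models in the paper let b vary and are penalized for the added complexity.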
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
Resilience of Florida Keys coral communities following large scale disturbances
The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...
Large-Scale Wind Turbine Testing in the NASA 24.4-m (80-ft) by 36.6-m (120-ft) Wind Tunnel
NASA Technical Reports Server (NTRS)
Zell, Peter T.; Imprexia, Cliff (Technical Monitor)
2000-01-01
The 80- by 120-Foot Wind Tunnel at NASA Ames Research Center in California provides a unique capability to test large-scale wind turbines under controlled conditions. This special capability is now available for domestic and foreign entities wishing to test large-scale wind turbines. The presentation will focus on facility capabilities to perform wind turbine tests and typical research objectives for this type of testing.
1984-06-01
Aquatic Plant Control Research Program: Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 5, Synthesis Report (Technical Report A-78-2). Army Engineer Waterways Experiment Station, Vicksburg, MS; U.S. Army Corps of Engineers, Washington, DC.
Deformation and Failure Mechanisms of Shape Memory Alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daly, Samantha Hayes
2015-04-15
The goal of this research was to understand the fundamental mechanics that drive the deformation and failure of shape memory alloys (SMAs). SMAs are difficult materials to characterize because of the complex phase transformations that give rise to their unique properties, including shape memory and superelasticity. These phase transformations occur across multiple length scales (one example being the martensite-austenite twinning that underlies macroscopic strain localization) and result in a large hysteresis. In order to optimize the use of this hysteretic behavior in energy storage and damping applications, we must first have a quantitative understanding of this transformation behavior. Prior results on shape memory alloys have been largely qualitative (i.e., mapping phase transformations through cracked oxide coatings or surface morphology). The PI developed and utilized new approaches to provide a quantitative, full-field characterization of phase transformation, conducting a comprehensive suite of experiments across multiple length scales and tying these results to theoretical and computational analysis. The research funded by this award utilized new combinations of scanning electron microscopy, diffraction, digital image correlation, and custom testing equipment and procedures to study phase transformation processes at a wide range of length scales, with a focus at small length scales with spatial resolution on the order of 1 nanometer. These experiments probe the basic connections between length scales during phase transformation. In addition to the insights gained on the fundamental mechanisms driving transformations in shape memory alloys, the unique experimental methodologies developed under this award are applicable to a wide range of solid-to-solid phase transformations and other strain localization mechanisms.
Hobart, J; Cano, S
2009-02-01
In this monograph we examine the added value of new psychometric methods (Rasch measurement and Item Response Theory) over traditional psychometric approaches by comparing and contrasting their psychometric evaluations of existing sets of rating scale data. We have concentrated on Rasch measurement rather than Item Response Theory because we believe that it is the more advantageous method for health measurement from a conceptual, theoretical and practical perspective. Our intention is to provide an authoritative document that describes the principles of Rasch measurement and the practice of Rasch analysis in a clear, detailed, non-technical form that is accurate and accessible to clinicians and researchers in health measurement. A comparison was undertaken of traditional and new psychometric methods in five large sets of rating scale data: (1) evaluation of the Rivermead Mobility Index (RMI) in data from 666 participants in the Cannabis in Multiple Sclerosis (CAMS) study; (2) evaluation of the Multiple Sclerosis Impact Scale (MSIS-29) in data from 1725 people with multiple sclerosis; (3) evaluation of test-retest reliability of MSIS-29 in data from 150 people with multiple sclerosis; (4) examination of the use of Rasch analysis to equate scales purporting to measure the same health construct in 585 people with multiple sclerosis; and (5) comparison of relative responsiveness of the Barthel Index and Functional Independence Measure in data from 1400 people undergoing neurorehabilitation. Both Rasch measurement and Item Response Theory are conceptually and theoretically superior to traditional psychometric methods. Findings from each of the five studies show that Rasch analysis is empirically superior to traditional psychometric methods for evaluating rating scales, developing rating scales, analysing rating scale data, understanding and measuring stability and change, and understanding the health constructs we seek to quantify. 
There is considerable added value in using Rasch analysis rather than traditional psychometric methods in health measurement. Future research directions include the need to reproduce our findings in a range of clinical populations, detailed head-to-head comparisons of Rasch analysis and Item Response Theory, and the application of Rasch analysis to clinical practice.
Advances in DNA sequencing technologies for high resolution HLA typing.
Cereb, Nezih; Kim, Hwa Ran; Ryu, Jaejun; Yang, Soo Young
2015-12-01
This communication describes our experience in large-scale G group-level high resolution HLA typing using three different DNA sequencing platforms: ABI 3730xl, Illumina MiSeq and PacBio RS II. Recent advances in DNA sequencing technologies, so-called next generation sequencing (NGS), have brought breakthroughs in deciphering the genetic information of all living species at large scale and at an affordable cost. The NGS DNA indexing system allows sequencing multiple genes for a large number of individuals in a single run. Our laboratory has adopted and used these technologies for HLA molecular testing services. We found that each sequencing technology has its own strengths and weaknesses, and their sequencing performances complement each other. HLA genes are highly complex, and genotyping them is quite challenging. Using these three sequencing platforms, we were able to meet all requirements for G group-level high resolution and high volume HLA typing. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
1981-06-01
Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 1 of a series. University of South Florida, Tampa, Dept. of Biology; Army Engineer Waterways Experiment Station, Vicksburg, MS.
1982-02-01
Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants in Lake Conway, Florida; Report 2, First Year Poststocking Results. Florida Game and Fresh Water Fish Commission, Orlando, FL.
1982-02-01
Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants in Lake Conway, Florida; Report 2, First Year Poststocking Results (Miller, D., 1982). Orange County Pollution Control Dept., Orlando, FL.
1983-01-01
Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 1, Baseline Studies, Volume I. Miller and Miller, Inc., Orlando, FL. Also cited: Report 4, Third Year Poststocking (Boyd, J., 1983).
Implementation and Performance Issues in Collaborative Optimization
NASA Technical Reports Server (NTRS)
Braun, Robert; Gage, Peter; Kroo, Ilan; Sobieski, Ian
1996-01-01
Collaborative optimization is a multidisciplinary design architecture that is well-suited to large-scale multidisciplinary optimization problems. This paper compares the approach with other architectures and examines the details of the formulation and some aspects of its performance. A particular version of the architecture is proposed to better accommodate the occurrence of multiple feasible regions. The use of system-level inequality constraints is shown to increase the convergence rate. A series of simple test problems, demonstrated to challenge related optimization architectures, is successfully solved with collaborative optimization.
Gas-Centered Swirl Coaxial Liquid Injector Evaluations
NASA Technical Reports Server (NTRS)
Cohn, A. K.; Strakey, P. A.; Talley, D. G.
2005-01-01
Development of liquid rocket engines is expensive: extensive testing at large scale is usually required, a large number of tests is needed to verify engine lifetime, and only limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost-effective and could serve as a necessary (but not sufficient) condition for demonstrating long engine lifetime, reducing the overall cost and risk of large-scale testing. The goal is to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors, and to determine relationships between cold-flow and hot-fire data.
NASA Astrophysics Data System (ADS)
Abidin, Anas Z.; Chockanathan, Udaysankar; DSouza, Adora M.; Inglese, Matilde; Wismüller, Axel
2017-03-01
Clinically Isolated Syndrome (CIS) is often considered to be the first neurological episode associated with multiple sclerosis (MS). At an early stage, the inflammatory demyelination occurring in the CNS can manifest as a change in neuronal metabolism, with multiple asymptomatic white matter lesions detected in clinical MRI. Such damage may induce topological changes in brain networks, which can be captured by advanced functional MRI (fMRI) analysis techniques. We test this hypothesis by capturing the effective relationships of 90 brain regions, defined in the Automated Anatomical Labeling (AAL) atlas, using a large-scale Granger Causality (lsGC) framework. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at both a global and a local level. We test for differences in these properties in network graphs obtained for 18 subjects (10 male and 8 female; 9 with CIS and 9 healthy controls). Global network properties captured trending differences in modularity and clustering coefficient (p<0.1). Additionally, local network properties, such as local efficiency and the strength of connections, captured statistically significant (p<0.01) differences in some regions of the inferior frontal and parietal lobes. We conclude that multivariate analysis of fMRI time-series can reveal interesting information about changes occurring in the brain in the early stages of MS.
Multiple Linking in Equating and Random Scale Drift. Research Report. ETS RR-11-46
ERIC Educational Resources Information Center
Guo, Hongwen; Liu, Jinghua; Dorans, Neil; Feigenbaum, Miriam
2011-01-01
Maintaining score stability is crucial for an ongoing testing program that administers several tests per year over many years. One way to stall the drift of the score scale is to use an equating design with multiple links. In this study, we use the operational and experimental SAT® data collected from 44 administrations to investigate the effect…
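A single link in such an equating chain can be illustrated with mean-sigma linear equating, the simplest of the classical methods (the study concerns designs with multiple links across many administrations; the score vectors below are invented):

```python
from statistics import mean, stdev

def linear_equate(x_scores, y_scores):
    """Mean-sigma linear equating: return f mapping form-X scores onto the
    form-Y scale via f(x) = (sd_y / sd_x) * (x - mean_x) + mean_y."""
    mx, my = mean(x_scores), mean(y_scores)
    sx, sy = stdev(x_scores), stdev(y_scores)
    return lambda x: sy / sx * (x - mx) + my

x = [40, 50, 60, 70, 80]   # form X (harder form: lower raw scores)
y = [50, 60, 70, 80, 90]   # form Y
f = linear_equate(x, y)
print(f(60))  # 70.0: a raw 60 on form X reports as 70 on the form-Y scale
```

With multiple links, each administration's conversion is composed through a chain of such functions, and small errors at each link are where scale drift can accumulate.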
A fracture criterion for widespread cracking in thin-sheet aluminum alloys
NASA Technical Reports Server (NTRS)
Newman, J. C., Jr.; Dawicke, D. S.; Sutton, M. A.; Bigelow, C. A.
1993-01-01
An elastic-plastic finite-element analysis was used with a critical crack-tip-opening angle (CTOA) fracture criterion to model stable crack growth in thin-sheet 2024-T3 aluminum alloy panels with single and multiple-site damage (MSD) cracks. Comparisons were made between critical angles determined from the analyses and those measured with photographic methods. Calculated load against crack extension and load against crack-tip displacement on single crack specimens agreed well with test data even for large-scale plastic deformations. The analyses were also able to predict the stable tearing behavior of large lead cracks in the presence of stably tearing MSD cracks. Small MSD cracks significantly reduced the residual strength for large lead cracks.
Accounting for aquifer heterogeneity from geological data to management tools.
Blouin, Martin; Martel, Richard; Gloaguen, Erwan
2013-01-01
A nested workflow of multiple-point geostatistics (MPG) and sequential Gaussian simulation (SGS) was tested on a study area of 6 km² located about 20 km northwest of Quebec City, Canada. In order to assess its geological and hydrogeological parameter heterogeneity and to provide tools for evaluating uncertainties in aquifer management, direct and indirect field measurements are used as inputs to the geostatistical simulations to reproduce large- and small-scale heterogeneities. To do so, the lithological information is first associated with equivalent hydrogeological facies (hydrofacies) according to hydraulic properties measured at several wells. Then, heterogeneous hydrofacies (HF) realizations are generated with the MPG algorithm, using a prior geological model as the training image (TI). The hydraulic conductivity (K) heterogeneity within each HF is finally modeled using the SGS algorithm. Different K models are integrated in a finite-element hydrogeological model to calculate multiple transport simulations. Different scenarios exhibit variations in mass transport path and dispersion associated with the large- and small-scale heterogeneity, respectively. Three-dimensional maps showing the probability of exceeding different thresholds are presented as examples of management tools. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
Science Competencies That Go Unassessed
ERIC Educational Resources Information Center
Gilmer, Penny J.; Sherdan, Danielle M.; Oosterhof, Albert; Rohani, Faranak; Rouby, Aaron
2011-01-01
Present large-scale assessments require the use of item formats, such as multiple choice, that can be administered and scored efficiently. This limits competencies that can be measured by these assessments. An alternative approach to large-scale assessments is being investigated that would include the use of complex performance assessments. As…
Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldenson, N.; Mauger, G.; Leung, L. R.
Internal variability in the climate system can contribute substantial uncertainty in climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produce estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.
The scale-dependent market trend: Empirical evidences using the lagged DFA method
NASA Astrophysics Data System (ADS)
Li, Daye; Kou, Zhun; Sun, Qiankun
2015-09-01
In this paper we conduct an empirical study testing the efficiency of 44 important market indexes at multiple scales. A modified method based on lagged detrended fluctuation analysis is utilized to maximize the information on long-term correlations from non-zero lags while keeping the margin of error small when measuring the local Hurst exponent. Our empirical results illustrate that a common pattern can be found in the majority of the measured market indexes, which tend to be persistent (local Hurst exponent > 0.5) at small time scales, whereas they display significant anti-persistent characteristics at large time scales. Moreover, not only the stock markets but also the foreign exchange markets share this pattern. Considering that the exchange markets are only weakly synchronized with economic cycles, it can be concluded that economic cycles can cause anti-persistence at large time scales but that other factors are also at work. The empirical results support the view that financial markets are multi-fractal, and they indicate that deviations from efficiency, and the type of model needed to describe the trend of market prices, depend on the forecasting horizon.
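The underlying estimator can be sketched with standard (unlagged) DFA-1; the paper's lagged variant and local-exponent estimation differ, so this is only the textbook core, on synthetic noise:

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64)):
    """Estimate the scaling (Hurst-like) exponent by DFA-1: the slope of
    log F(s) vs log s, where F(s) is the RMS of linearly detrended
    fluctuations of the integrated series in non-overlapping windows of size s."""
    y = np.cumsum(x - np.mean(x))            # integrated (profile) series
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)     # linear detrend within the window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
h = dfa_hurst(rng.standard_normal(4096))
print(round(h, 2))  # close to 0.5 for uncorrelated noise
```

Values above 0.5 indicate persistence and values below 0.5 anti-persistence, which is the dichotomy the paper reports across small and large time scales.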
Quantitative evidence for the effects of multiple drivers on continental-scale amphibian declines
Grant, Evan H. Campbell; Miller, David A. W.; Schmidt, Benedikt R.; Adams, Michael J.; Amburgey, Staci M.; Chambert, Thierry A.; Cruickshank, Sam S.; Fisher, Robert N.; Green, David M.; Hossack, Blake R.; Johnson, Pieter T.J.; Joseph, Maxwell B.; Rittenhouse, Tracy A. G.; Ryan, Maureen E.; Waddle, J. Hardin; Walls, Susan C.; Bailey, Larissa L.; Fellers, Gary M.; Gorman, Thomas A.; Ray, Andrew M.; Pilliod, David S.; Price, Steven J.; Saenz, Daniel; Sadinski, Walt; Muths, Erin L.
2016-01-01
Since amphibian declines were first proposed as a global phenomenon over a quarter century ago, the conservation community has made little progress in halting or reversing these trends. The early search for a “smoking gun” was replaced with the expectation that declines are caused by multiple drivers. While field observations and experiments have identified factors leading to increased local extinction risk, evidence for effects of these drivers is lacking at large spatial scales. Here, we use observations of 389 time-series of 83 species and complexes from 61 study areas across North America to test the effects of 4 of the major hypothesized drivers of declines. While we find that local amphibian populations are being lost from metapopulations at an average rate of 3.79% per year, these declines are not related to any particular threat at the continental scale; likewise the effect of each stressor is variable at regional scales. This result - that exposure to threats varies spatially, and populations vary in their response - provides little generality in the development of conservation strategies. Greater emphasis on local solutions to this globally shared phenomenon is needed.
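The reported 3.79% mean annual rate of local population loss compounds over time; as a back-of-envelope illustration (not part of the paper's analysis, and assuming the rate stays constant):

```python
import math

annual_loss = 0.0379            # reported mean annual rate of local population loss
survival = 1.0 - annual_loss

# Years until half of the local populations would be lost at a constant rate
half_life = math.log(0.5) / math.log(survival)
print(round(half_life, 1))  # about 17.9 years
```

This kind of compounding is why a seemingly small annual rate motivates the paper's call for conservation action despite the absence of a single dominant driver.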
paraGSEA: a scalable approach for large-scale gene expression profiling
Peng, Shaoliang; Yang, Shunyun
2017-01-01
An increasing number of studies use gene expression similarity to identify functional connections among genes, diseases and drugs. Gene Set Enrichment Analysis (GSEA) is a powerful analytical method for interpreting gene expression data. However, due to its enormous computational overhead in the significance-estimation and multiple-hypothesis-testing steps, its computational scalability and efficiency are poor on large-scale datasets. We propose paraGSEA for efficient large-scale transcriptome data analysis. Through optimization, the overall time complexity of paraGSEA is reduced from O(mn) to O(m+n), where m is the length of the gene sets and n is the length of the gene expression profiles, contributing a more than 100-fold increase in performance compared with other popular GSEA implementations such as GSEA-P, SAM-GS and GSEA2. By further parallelization, a near-linear speed-up is gained on both workstations and clusters, with high scalability and performance on large-scale datasets. The analysis time of the whole LINCS phase I dataset (GSE92742) was reduced to nearly half an hour on a 1000-node cluster on Tianhe-2, or within 120 hours on a 96-core workstation. The source code of paraGSEA is licensed under the GPLv3 and available at http://github.com/ysycloud/paraGSEA. PMID:28973463
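The statistic at the heart of GSEA is a running-sum enrichment score over a ranked gene list; a minimal unweighted sketch (paraGSEA's optimized formulation and the weighted GSEA statistic differ; the gene names are invented):

```python
def enrichment_score(ranked_genes, gene_set):
    """Unweighted GSEA-style running sum: walk down the ranked list,
    stepping up at hits and down at misses; ES is the extreme deviation."""
    hits = [g in gene_set for g in ranked_genes]
    n_hit = sum(hits)
    n_miss = len(ranked_genes) - n_hit
    up, down = 1.0 / n_hit, 1.0 / n_miss
    running, extreme = 0.0, 0.0
    for h in hits:
        running += up if h else -down
        if abs(running) > abs(extreme):
            extreme = running
    return extreme

ranked = ["g1", "g2", "g3", "g4", "g5", "g6"]
print(enrichment_score(ranked, {"g1", "g2"}))  # 1.0: set concentrated at the top
```

The expensive part that paraGSEA optimizes is repeating this computation over thousands of permutations and gene sets to estimate significance, which is where the O(mn) to O(m+n) reduction pays off.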
Multilevel Hierarchical Kernel Spectral Clustering for Real-Life Large Scale Complex Networks
Mall, Raghvendra; Langone, Rocco; Suykens, Johan A. K.
2014-01-01
Kernel spectral clustering corresponds to a weighted kernel principal component analysis problem in a constrained optimization framework. The primal formulation leads to an eigen-decomposition of a centered Laplacian matrix at the dual level. The dual formulation allows building a model on a representative subgraph of the large-scale network in the training phase, and the model parameters are estimated in the validation stage. The KSC model has a powerful out-of-sample extension property which allows cluster affiliation for unseen nodes of the big data network. In this paper we exploit the structure of the projections in the eigenspace during the validation stage to automatically determine a set of increasing distance thresholds. We use these distance thresholds in the test phase to obtain multiple levels of hierarchy for the large-scale network. The hierarchical structure in the network is determined in a bottom-up fashion. We empirically showcase that real-world networks have a multilevel hierarchical organization which cannot be detected efficiently by several state-of-the-art large-scale hierarchical community detection techniques such as the Louvain, OSLOM and Infomap methods. We show that a major advantage of our proposed approach is the ability to locate good-quality clusters at both the finer and coarser levels of hierarchy using internal cluster quality metrics on 7 real-life networks. PMID:24949877
Estimating False Discovery Proportion Under Arbitrary Covariance Dependence*
Fan, Jianqing; Han, Xu; Gu, Weijie
2012-01-01
Multiple hypothesis testing is a fundamental problem in high-dimensional inference, with wide applications in many scientific fields. In genome-wide association studies, tens of thousands of tests are performed simultaneously to find whether any SNPs are associated with some traits, and those tests are correlated. When test statistics are correlated, false discovery control becomes very challenging under arbitrary dependence. In the current paper, we propose a novel method based on principal factor approximation, which successfully subtracts the common dependence and significantly weakens the correlation structure, to deal with an arbitrary dependence structure. We derive an approximate expression for the false discovery proportion (FDP) in large-scale multiple testing when a common threshold is used, and provide a consistent estimate of the realized FDP. This result has important applications in controlling FDR and FDP. Our estimate of realized FDP compares favorably with Efron's (2007) approach, as demonstrated in simulated examples. Our approach is further illustrated by some real data applications. We also propose a dependence-adjusted procedure, which is more powerful than the fixed-threshold procedure. PMID:24729644
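For readers unfamiliar with the quantity being estimated, a small simulation shows the realized FDP under a common p-value threshold. The nulls here are independent, so this does not capture the arbitrary-dependence setting that is the paper's actual contribution; the data are invented:

```python
import numpy as np

# Illustrative only: realized false discovery proportion FDP = V/R when
# every p-value below a common threshold t is rejected. Nulls here are
# independent; the paper's method addresses the harder correlated case.
rng = np.random.default_rng(0)
n_null, n_alt, t = 900, 100, 0.05
p_null = rng.uniform(size=n_null)            # true nulls: Uniform(0, 1)
p_alt = rng.uniform(0.0, 0.01, size=n_alt)   # signals: small p-values
V = int(np.sum(p_null < t))                  # false discoveries
R = V + int(np.sum(p_alt < t))               # total discoveries
fdp = V / max(R, 1)                          # realized FDP
```

With 900 uniform nulls, roughly 5% fall below t = 0.05 by chance, so the realized FDP is nonzero even though every "signal" is detected.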
NASA Astrophysics Data System (ADS)
Zhang, M.; Liu, S.
2017-12-01
Despite extensive studies of hydrological responses to forest cover change in small watersheds, the hydrological responses to forest change, and the associated mechanisms, across multiple spatial scales are not fully understood. This review therefore examined about 312 watersheds worldwide to provide a generalized framework for evaluating hydrological responses to forest cover change and to identify the contributions of spatial scale, climate, forest type and hydrological regime in determining the intensity of forest-change-related hydrological responses in small (<1000 km²) and large (≥1000 km²) watersheds. Key findings include: 1) the increase in annual runoff associated with forest cover loss is statistically significant at multiple spatial scales, whereas the effect of forest cover gain is statistically inconsistent; 2) the sensitivity of annual runoff to forest cover change tends to attenuate with increasing watershed size, but only in large watersheds; 3) annual runoff is more sensitive to forest cover change in water-limited watersheds than in energy-limited watersheds across all spatial scales; and 4) small mixed-forest-dominated watersheds and large snow-dominated watersheds are more hydrologically resilient to forest cover change. These findings improve the understanding of hydrological responses to forest cover change at different spatial scales and provide a scientific underpinning for future watershed management in the context of climate change and increasing anthropogenic disturbances.
Kalron, Alon; Rosenblum, Uri; Frid, Lior; Achiron, Anat
2017-03-01
Evaluate the effects of a Pilates exercise programme on walking and balance in people with multiple sclerosis, and compare this exercise approach with conventional physical therapy sessions. Randomized controlled trial. Multiple Sclerosis Center, Sheba Medical Center, Tel-Hashomer, Israel. Forty-five people with multiple sclerosis, 29 females; mean (SD) age was 43.2 (11.6) years; mean (SD) Expanded Disability Status Scale score was 4.3 (1.3). Participants received 12 weekly training sessions of either Pilates (n=22) or standardized physical therapy (n=23) on an outpatient basis. Outcome measures were spatio-temporal parameters of walking and posturography parameters during static stance. Functional tests included the Timed Up and Go Test, the 2- and 6-minute walk tests, the Functional Reach Test, the Berg Balance Scale and the Four Square Step Test. Self-report measures included the Multiple Sclerosis Walking Scale and the Modified Fatigue Impact Scale. At the end of the programme, both groups had significantly increased their walking speed (P=0.021) and mean step length (P=0.023). According to the 2-minute and 6-minute walking tests, both groups had increased their walking distance by the end of the intervention; the mean (SD) increase in the Pilates and physical therapy groups was 39.1 (78.3) and 25.3 (67.2) meters, respectively. There was no group × time interaction effect in any of the instrumented or clinical balance and gait measures. Pilates is a possible treatment option for people with multiple sclerosis seeking to improve their walking and balance capabilities; however, this approach has no significant advantage over standardized physical therapy.
Ongoing research experiments at the former Soviet nuclear test site in eastern Kazakhstan
Leith, William S.; Kluchko, Luke J.; Konovalov, Vladimir; Vouille, Gerard
2002-01-01
Degelen mountain, located in eastern Kazakhstan near the city of Semipalatinsk, was once the Soviets' most active underground nuclear test site. Two hundred fifteen nuclear tests were conducted in 181 tunnels driven horizontally into its many ridges--almost twice the number of tests as at any other Soviet underground nuclear test site. It was also the site of the first Soviet underground nuclear test--a 1-kiloton device detonated on October 11, 1961. Until recently, the details of testing at Degelen were kept secret and were the subject of considerable speculation. However, in 1991 the Semipalatinsk test site became part of the newly independent Republic of Kazakhstan, and in 1995 the Kazakhstani government concluded an agreement with the U.S. Department of Defense to eliminate the nuclear testing infrastructure in Kazakhstan. This agreement, which calls for the "demilitarization of the infrastructure directly associated with the nuclear weapons test tunnels," has been implemented as the "Degelen Mountain Tunnel Closure Program." The U.S. Defense Threat Reduction Agency, in partnership with the Department of Energy, has permitted the use of the tunnel closure project at the former nuclear test site as a foundation on which to support cost-effective, research-and-development-funded experiments. These experiments are principally designed to improve U.S. capabilities to monitor and verify the Comprehensive Test Ban Treaty (CTBT), but they have also provided a new source of information on the effects of nuclear and chemical explosions on hard, fractured rock environments. These new data extend and confirm the results of recent Russian publications on the rock environment at the site and the mechanical effects of large-scale chemical and nuclear testing. In 1998, a large-scale tunnel closure experiment, Omega-1, was conducted in Tunnel 214 at Degelen mountain.
In this experiment, a 100-ton chemical explosive blast was used to test technologies for monitoring the Comprehensive Nuclear Test Ban Treaty and to calibrate a portion of the CTBT's International Monitoring System. The experiment also provided important benchmark data on the mechanical behavior of hard, dense, fractured rock, and demonstrated the feasibility of fielding large-scale calibration explosions, which are specified as a "confidence-building measure" in the CTBT Protocol. Two other large-scale explosion experiments, Omega-2 and Omega-3, are planned for the summers of 1999 and 2000. Like the Tunnel 214 test, the 1999 experiment will include close-in monitoring of near-source effects, as well as contributing to the calibration of key seismic stations for the Comprehensive Test Ban Treaty. The Omega-3 test will examine the effect of multiple blasts on the fractured rock environment.
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis nevertheless remain: management of large-scale and complex data sets; MS peak identification and indexing; and high-dimensional peak differential analysis with concurrent statistical testing and false discovery rate (FDR) control. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution that gives experimental biologists easy access to "cloud" computing capabilities for analyzing MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
Large-scale, high-density (up to 512 channels) recording of local circuits in behaving animals
Berényi, Antal; Somogyvári, Zoltán; Nagy, Anett J.; Roux, Lisa; Long, John D.; Fujisawa, Shigeyoshi; Stark, Eran; Leonardo, Anthony; Harris, Timothy D.
2013-01-01
Monitoring representative fractions of neurons from multiple brain circuits in behaving animals is necessary for understanding neuronal computation. Here, we describe a system that allows high-channel-count recordings from a small volume of neuronal tissue using a lightweight signal multiplexing headstage that permits free behavior of small rodents. The system integrates multishank, high-density recording silicon probes, ultraflexible interconnects, and a miniaturized microdrive. These improvements allowed for simultaneous recordings of local field potentials and unit activity from hundreds of sites without confining free movements of the animal. The advantages of large-scale recordings are illustrated by determining the electroanatomic boundaries of layers and regions in the hippocampus and neocortex and constructing a circuit diagram of functional connections among neurons in real anatomic space. These methods will allow the investigation of circuit operations and behavior-dependent interregional interactions for testing hypotheses of neural networks and brain function. PMID:24353300
Mejias, Jorge F; Murray, John D; Kennedy, Henry; Wang, Xiao-Jing
2016-11-01
Interactions between top-down and bottom-up processes in the cerebral cortex hold the key to understanding attentional processes, predictive coding, executive control, and a gamut of other brain functions. However, the underlying circuit mechanism remains poorly understood and represents a major challenge in neuroscience. We approached this problem using a large-scale computational model of the primate cortex constrained by new directed and weighted connectivity data. In our model, the interplay between feedforward and feedback signaling depends on the cortical laminar structure and involves complex dynamics across multiple (intralaminar, interlaminar, interareal, and whole cortex) scales. The model was tested by reproducing, as well as providing insights into, a wide range of neurophysiological findings about frequency-dependent interactions between visual cortical areas, including the observation that feedforward pathways are associated with enhanced gamma (30 to 70 Hz) oscillations, whereas feedback projections selectively modulate alpha/low-beta (8 to 15 Hz) oscillations. Furthermore, the model reproduces a functional hierarchy based on frequency-dependent Granger causality analysis of interareal signaling, as reported in recent monkey and human experiments, and suggests a mechanism for the observed context-dependent hierarchy dynamics. Together, this work highlights the necessity of multiscale approaches and provides a modeling platform for studies of large-scale brain circuit dynamics and functions.
Sensor Measurement Strategies for Monitoring Offshore Wind and Wave Energy Devices
NASA Astrophysics Data System (ADS)
O'Donnell, Deirdre; Srbinovsky, Bruno; Murphy, Jimmy; Popovici, Emanuel; Pakrashi, Vikram
2015-07-01
While the potential of offshore wind and wave energy devices is well established and accepted, operations and maintenance issues are still not well researched or understood. In this regard, scaled model testing of such devices at various technology readiness levels has gained popularity over time. The dynamic responses of these devices are typically measured by different instruments during such scaled tests, but agreed guidelines for sensor choice, measurement and placement are still not in place. This paper compares the dynamic responses recorded by several of these sensors during scaled ocean wave testing to highlight the importance of sensor measurement strategies. The possibility of using multiple, cheaper sensors of seemingly inferior performance, as opposed to deploying a small number of expensive and accurate sensors, is also explored. An energy-aware adaptive sampling theory is applied to highlight the possibility of more efficient computing when large volumes of data are available from the tested structures. Efficient sensor measurement strategies are expected to have a positive impact on the development of a device at different technology readiness levels, and to help reduce operation and maintenance costs if such an approach is adopted for devices in operation.
Behrangrad, Shabnam; Kordi Yoosefinejad, Amin
2018-03-01
The purpose of this study is to investigate the validity and reliability of the Persian version of the Multidimensional Assessment of Fatigue Scale (MAFS) in an Iranian population with multiple sclerosis. A self-reported survey on fatigue, including the MAFS, the Fatigue Impact Scale and demographic measures, was completed by 130 patients with multiple sclerosis and 60 healthy persons sampled by a convenience method. Test-retest reliability was evaluated with administrations 3 days apart. Construct validity of the MAFS was assessed against the Fatigue Impact Scale. The MAFS had high internal consistency (Cronbach's alpha >0.9) and 3-day test-retest reliability (intraclass correlation coefficient = 0.99). Correlation between the Fatigue Impact Scale and the MAFS was high (r = 0.99). Correlation between MAFS scores and the Expanded Disability Status Scale was also strong (r = 0.85). Questionnaire items showed acceptable item-scale correlation (0.968-0.993). The Persian version of the MAFS appears to be a valid and reliable questionnaire and an appropriate short multidimensional instrument for assessing fatigue in patients with multiple sclerosis in clinical practice and research. Implications for Rehabilitation: The Persian version of the Multidimensional Assessment of Fatigue Scale is a valid and reliable instrument for assessing and monitoring fatigue in Persian-speaking patients with multiple sclerosis. It is very easy to administer and time-efficient in comparison with other instruments evaluating fatigue in patients with multiple sclerosis.
1982-08-01
University of Florida, Gainesville, Department of Environmental Engineering. The report covers the Lake Conway ecosystem and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES. It should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control…"
1983-07-01
Aquatic Plant Control Research Program, Technical Report A-78-2: "Large-Scale Operations Management Test of Use of the White Amur…" Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. 39180.
The Positive and Negative Consequences of Multiple-Choice Testing
ERIC Educational Resources Information Center
Roediger, Henry L.; Marsh, Elizabeth J.
2005-01-01
Multiple-choice tests are commonly used in educational settings but with unknown effects on students' knowledge. The authors examined the consequences of taking a multiple-choice test on a later general knowledge test in which students were warned not to guess. A large positive testing effect was obtained: Prior testing of facts aided final…
Fire management over large landscapes: a hierarchical approach
Kenneth G. Boykin
2008-01-01
Management planning for fires becomes increasingly difficult as scale increases. Stratification provides land managers with multiple scales in which to prepare plans. Using statistical techniques, Geographic Information Systems (GIS), and meetings with land managers, we divided a large landscape of over 2 million acres (White Sands Missile Range) into parcels useful in...
ERIC Educational Resources Information Center
Turner, Henry J.
2014-01-01
This dissertation of practice utilized a multiple case-study approach to examine distributed leadership within five school districts that were attempting to gain acceptance of a large-scale 1:1 technology initiative. Using frame theory and distributed leadership theory as theoretical frameworks, this study interviewed each district's…
NASA Astrophysics Data System (ADS)
Pan, Zhenying; Yu, Ye Feng; Valuckas, Vytautas; Yap, Sherry L. K.; Vienne, Guillaume G.; Kuznetsov, Arseniy I.
2018-05-01
Cheap large-scale fabrication of ordered nanostructures is important for multiple applications in photonics and biomedicine including optical filters, solar cells, plasmonic biosensors, and DNA sequencing. Existing methods are either expensive or have strict limitations on the feature size and fabrication complexity. Here, we present a laser-based technique, plasmonic nanoparticle lithography, which is capable of rapid fabrication of large-scale arrays of sub-50 nm holes on various substrates. It is based on near-field enhancement and melting induced under ordered arrays of plasmonic nanoparticles, which are brought into contact or in close proximity to a desired material and acting as optical near-field lenses. The nanoparticles are arranged in ordered patterns on a flexible substrate and can be attached and removed from the patterned sample surface. At optimized laser fluence, the nanohole patterning process does not create any observable changes to the nanoparticles and they have been applied multiple times as reusable near-field masks. This resist-free nanolithography technique provides a simple and cheap solution for large-scale nanofabrication.
NASA Astrophysics Data System (ADS)
Gong, L.
2013-12-01
Large-scale hydrological models and land surface models are essentially the only tools for assessing future water resources in climate change impact studies. These models estimate discharge with large uncertainties, owing to the complex interaction between climate and hydrology, the limited quality and availability of data, and model uncertainties. A new, purely data-based scale-extrapolation method is proposed to estimate water resources for a large basin solely from selected small sub-basins, which are typically two orders of magnitude smaller than the large basin. These small sub-basins contain sufficient information, not only on climate and land surface but also on hydrological characteristics, for the large basin. In the Baltic Sea drainage basin, the best discharge estimation for the gauged area was achieved with sub-basins that cover 2-4% of the gauged area. Multiple sets of sub-basins resemble the climate and hydrology of the basin equally well, and these sets estimate annual discharge for the gauged area consistently well, with a 5% average error. Because the scale-extrapolation method is completely data-based, it does not force any modelling error into the prediction. The multiple predictions are expected to bracket the inherent variations and uncertainties of the climate and hydrology of the basin. The method can be applied in both ungauged basins and ungauged periods, with uncertainty estimation.
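The core arithmetic of such a data-based extrapolation can be sketched in a few lines. The function and numbers below are invented for illustration; they are not the method's published algorithm:

```python
# Hypothetical sketch: estimate a large basin's annual discharge from the
# pooled specific discharge (runoff per unit area) of a few representative
# sub-basins covering a small fraction of the basin. Units are arbitrary.
def extrapolate_discharge(sub_discharge, sub_area, basin_area):
    specific = sum(sub_discharge) / sum(sub_area)  # pooled runoff per km²
    return specific * basin_area

# Two sub-basins totalling 300 km² stand in for a 10,000 km² basin.
estimate = extrapolate_discharge([1.0, 2.0], [100.0, 200.0], 10000.0)
```

Repeating the estimate with several alternative sub-basin sets, as the abstract describes, would yield a spread of predictions that brackets the uncertainty.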
NASA Astrophysics Data System (ADS)
Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong
2018-04-01
The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three precipitation radars (ground-based, ship-borne, and spaceborne) and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar, multi-model framework allows for more stringent model validation. The emphasis is on testing the models' ability to simulate subtle differences observed at different radar sites as the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also reveal a common deficiency of the CRM simulations: they underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validation with multiple radars and models also enables quantitative comparison in CRM sensitivity studies using different large-scale forcings, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests correlate better with one another than the radar/model comparisons do, indicating robust model performance in this respect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.
2011-07-01
AFRL-SA-WP-TR-2011-0006: "Multiple Aptitude Normative Intelligence Testing That Distinguishes U.S. Air Force MQ-1 Predator Sensor…" The test battery is fashioned after the Wechsler Adult Intelligence Scale (Ref 11), the most widely used individually administered test of intellectual ability. References: Multidimensional Aptitude Battery-II Manual, Sigma Assessment Systems Inc., London, 2003; Wechsler D., Wechsler Adult Intelligence Scale® – Third…
MacKillop, James; Weafer, Jessica; Gray, Joshua; Oshri, Assaf; Palmer, Abraham; de Wit, Harriet
2016-01-01
Rationale: Impulsivity has been strongly linked to addictive behaviors, but it can be operationalized in a number of ways that vary considerably in overlap, suggesting multidimensionality. Objective: This study tested the hypothesis that the latent structure among multiple measures of impulsivity would reflect three broad categories: impulsive choice, reflecting discounting of delayed rewards; impulsive action, reflecting the ability to inhibit a prepotent motor response; and impulsive personality traits, reflecting self-reported attributions of self-regulatory capacity. Methods: The study used a cross-sectional confirmatory factor analysis of multiple impulsivity assessments. Participants were 1252 young adults (62% female) with low levels of addictive behavior who were assessed in individual laboratory rooms at the University of Chicago and the University of Georgia. The battery comprised a delay discounting task, the Monetary Choice Questionnaire, the Conners Continuous Performance Test, the Go/NoGo Task, the Stop Signal Task, the Barratt Impulsivity Scale, and the UPPS-P Impulsive Behavior Scale. Results: The hypothesized three-factor model provided the best fit to the data, although Sensation Seeking was excluded from the final model. The three latent factors were largely unrelated to each other and were variably associated with substance use. Conclusions: These findings support the hypothesis that diverse measures of impulsivity can broadly be organized into three categories that are largely distinct from one another. They warrant investigation among individuals with clinical levels of addictive behavior and may be applied to understanding the underlying biological mechanisms of these categories. PMID:27449350
Jayawardene, Wasantha Parakrama; YoussefAgha, Ahmed Hassan
2014-01-01
This study aimed to identify sequential patterns of drug use initiation, including prescription drug misuse (PDM), among 12th-grade students in Indiana. The study also tested the suitability of the data mining method Market Basket Analysis (MBA) for detecting common drug use initiation sequences in large-scale surveys. Data from the 2007 to 2009 Annual Surveys of Alcohol, Tobacco, and Other Drug Use by Indiana Children and Adolescents were used. A close-ended, self-administered questionnaire asked adolescents about their use of 21 substance categories and the age of first use. The "support%" and "confidence%" statistics of Market Basket Analysis detected multiple and substitute addictions, respectively. The lifetime prevalence of using any addictive substance was 73.3% and has been decreasing during the past few years, whereas the lifetime prevalence of PDM was 19.2% and has been increasing. Males and whites were more likely to use drugs and engage in multiple addictions. Market Basket Analysis identified common drug use initiation sequences involving 11 drugs. High levels of support existed for associations among alcohol, cigarettes, and marijuana, whereas associations that included prescription drugs had medium levels of support. Market Basket Analysis is useful for detecting common substance use initiation sequences in large-scale surveys. Before initiating prescription drugs, physicians should consider an adolescent's risk of addiction. Prevention programs should address multiple addictions, substitute addictions, common sequences in drug use initiation, sex and racial differences in PDM, and the normative beliefs of parents and adolescents in relation to PDM.
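The two MBA statistics named above have simple definitions, sketched here on invented survey data (not the study's dataset): support% is the share of respondents reporting all items in a set, and confidence% is the share of those reporting the antecedent who also report the consequent:

```python
# Invented data; illustrates the Market Basket Analysis statistics only.
def support(transactions, items):
    items = set(items)
    # fraction of transactions containing every item in the set
    return sum(items <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    a = set(antecedent)
    # of the transactions containing the antecedent, the fraction
    # that also contain the consequent
    return support(transactions, a | set(consequent)) / support(transactions, a)

surveys = [{"alcohol", "cigarettes", "marijuana"},
           {"alcohol", "cigarettes"},
           {"alcohol"},
           {"cigarettes", "marijuana"}]
```

Here `support(surveys, {"alcohol", "cigarettes"})` is 0.5 (2 of 4 respondents report both), and the confidence of alcohol → cigarettes is 2/3 (2 of the 3 alcohol reporters also report cigarettes).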
NASA Astrophysics Data System (ADS)
Sreekanth, J.; Moore, Catherine
2018-04-01
The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens and associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed to improve the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales, and it allows different processes to be simulated at different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism that supports the transfer of information from child models to parent models, and from parent models to child models, in a computationally efficient manner. The feedback mechanism is simple and flexible and ensures that the salient small-scale features influencing larger-scale predictions are transferred back to the larger scale without requiring live coupling of models. The method allows multiple groundwater flow and transport processes to be modelled with separate groundwater models, each built for the appropriate spatial and temporal scale, within a stochastic framework, while removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.
Wang, Chun; Zheng, Yi; Chang, Hua-Hua
2014-01-01
With the advent of web-based technology, online testing is becoming a mainstream mode in large-scale educational assessments. Most online tests are administered continuously in a testing window, which may pose test security problems because examinees who take the test earlier may share information with those who take it later. Researchers have proposed various statistical indices to assess test security; the one most often used is the average test-overlap rate, which was further generalized to the item pooling index (Chang & Zhang, 2002, 2003). These indices, however, are all defined as means (that is, the expected proportion of common items among examinees) and were originally proposed for computerized adaptive testing (CAT). Recently, multistage testing (MST) has become a popular alternative to CAT. The unique features of MST make it important to report not only the mean but also the standard deviation (SD) of the test overlap rate, as we advocate in this paper. The SD of the test overlap rate adds important information to the test security profile because, for the same mean, a large SD reflects that certain groups of examinees share more common items than other groups. In this study, we analytically derived the lower bounds of the SD under MST, with the results under CAT as a benchmark. It is shown that when the mean overlap rate is the same between MST and CAT, the SD of test overlap tends to be larger in MST. A simulation study was conducted to provide empirical evidence. We also compared the security of MST under single-pool versus multiple-pool designs; both analytical and simulation studies show that the non-overlapping multiple-pool design slightly increases the security risk.
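The mean and SD of the test overlap rate discussed above can be computed directly from examinees' item sets. The sketch below uses invented three-item tests and is not tied to the paper's MST derivations:

```python
from itertools import combinations

# For each pair of examinees, overlap rate = shared items / test length.
# The security profile reports the mean and SD over all pairs.
def overlap_stats(item_sets):
    length = len(next(iter(item_sets)))   # fixed test length assumed
    rates = [len(a & b) / length for a, b in combinations(item_sets, 2)]
    mean = sum(rates) / len(rates)
    var = sum((r - mean) ** 2 for r in rates) / len(rates)
    return mean, var ** 0.5

tests = [{1, 2, 3}, {1, 2, 4}, {1, 5, 6}]  # invented item assignments
mean_rate, sd_rate = overlap_stats(tests)
```

For these three examinees the pairwise rates are 2/3, 1/3 and 1/3, so the mean is 4/9; a nonzero SD flags that one pair shares noticeably more items than the others, which is exactly the security signal the paper argues the mean alone hides.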
NASA Astrophysics Data System (ADS)
Sentić, Stipo; Sessions, Sharon L.
2017-06-01
The weak temperature gradient (WTG) approximation is a method of parameterizing the influences of the large scale on local convection in limited domain simulations. WTG simulations exhibit multiple equilibria in precipitation; depending on the initial moisture content, simulations can precipitate or remain dry for otherwise identical boundary conditions. We use a hypothesized analogy between multiple equilibria in precipitation in WTG simulations, and dry and moist regions of organized convection to study tropical convective organization. We find that the range of wind speeds that support multiple equilibria depends on sea surface temperature (SST). Compared to the present SST, low SSTs support a narrower range of multiple equilibria at higher wind speeds. In contrast, high SSTs exhibit a narrower range of multiple equilibria at low wind speeds. This suggests that at high SSTs, organized convection might occur with lower surface forcing. To characterize convection at different SSTs, we analyze the change in relationships between precipitation rate, atmospheric stability, moisture content, and the large-scale transport of moist entropy and moisture with increasing SSTs. We find an increase in large-scale export of moisture and moist entropy from dry simulations with increasing SST, which is consistent with a strengthening of the up-gradient transport of moisture from dry regions to moist regions in organized convection. Furthermore, the changes in diagnostic relationships with SST are consistent with more intense convection in precipitating regions of organized convection for higher SSTs.
Large-Scale Hybrid Motor Testing. Chapter 10
NASA Technical Reports Server (NTRS)
Story, George
2006-01-01
Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase-sized portable test stands assembled for demonstration of hybrids, which show the safety of hybrid rockets to audiences. These small show motors and small laboratory-scale motors can give comparative burn rate data for development of different fuel/oxidizer combinations; however, the questions that are always asked when hybrids are mentioned for large-scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity of conducting large-scale hybrid rocket motor tests to validate the burn rate from small motors to application size has been documented in several places. Comparison of small-scale hybrid data to larger-scale data indicates that the fuel burn rate goes down with increasing port size, even at the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. The reason this occurs is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, the larger port sizes reduce the interaction (radiation, mixing, and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf-thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf-thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability, and scaling concepts that went into the development of those large motors.
ERIC Educational Resources Information Center
Si, Yajuan; Reiter, Jerome P.
2013-01-01
In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…
Keith G. Tidball
2014-01-01
Community-based natural resource management in the form of "greening" after large-scale system shocks and surprises is argued to provide multiple benefits via engagement with living elements of social-ecological systems and subsequent enhanced resilience at multiple scales. The importance of so-called social-ecological symbols, especially the...
Periodic Hydraulic Testing for Discerning Fracture Network Connections
NASA Astrophysics Data System (ADS)
Becker, M.; Le Borgne, T.; Bour, O.; Guihéneuf, N.; Cole, M.
2015-12-01
Discrete fracture network (DFN) models often predict highly variable hydraulic connections between injection and pumping wells used for enhanced oil recovery, geothermal energy extraction, and groundwater remediation. Such connections can be difficult to verify in fractured rock systems because standard pumping or pulse interference tests interrogate too large a volume to pinpoint specific connections. Three field examples are presented in which periodic hydraulic tests were used to obtain information about hydraulic connectivity in fractured bedrock. The first site, a sandstone in New York State, involves only a single fracture at a scale of about 10 m. The second site, a granite in Brittany, France, involves a fracture network at about the same scale. The third site, a granite/schist in the U.S. State of New Hampshire, involves a complex network at a scale of 30-60 m. In each case, periodic testing provided an enhanced view of hydraulic connectivity over previous constant-rate tests. Periodic testing is particularly adept at measuring hydraulic diffusivity, which is a more effective parameter than permeability for identifying the complexity of flow pathways between measurement locations. Periodic tests were also conducted at multiple frequencies, which provides a range of radii of hydraulic penetration away from the oscillated well. By varying the radius of penetration, we attempt to interrogate the structure of the fracture network. Periodic tests, therefore, may be uniquely suited for verifying and/or calibrating DFN models.
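The frequency-dependent radius of penetration invoked above follows the usual diffusion scaling: for a pressure oscillation of period P in a medium of hydraulic diffusivity D, the signal penetrates a distance of order sqrt(D·P). A hedged sketch with illustrative values (not the parameters of the three field sites, and the exact prefactor depends on geometry and detection threshold):

```python
import math

def penetration_radius(diffusivity, period_s):
    """Approximate radius of influence of a periodic hydraulic test.

    Uses the standard diffusion scaling r ~ sqrt(D * P); the exact
    prefactor depends on flow geometry and the detection threshold.
    """
    return math.sqrt(diffusivity * period_s)

# Illustrative: D = 1.0 m^2/s, oscillation periods from 10 s to 1000 s.
for period in (10.0, 100.0, 1000.0):
    r = penetration_radius(1.0, period)
    print(f"P = {period:6.0f} s -> r ~ {r:6.1f} m")
```

Sweeping the oscillation period thus sweeps the interrogated volume, which is the basis of the multi-frequency strategy described in the abstract.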
Fracture behavior of large-scale thin-sheet aluminum alloy
NASA Technical Reports Server (NTRS)
Dewit, Roland; Fields, Richard J.; Mordfin, Leonard; Low, Samuel R.; Harne, Donald
1994-01-01
A series of fracture tests on large-scale, pre-cracked, aluminum alloy panels is being carried out to examine and to characterize the process by which cracks propagate and link up in this material. Extended grips and test fixtures were specially designed to enable the panel specimens to be loaded in tension in a 1780-kN-capacity universal testing machine. Twelve panel specimens, each consisting of a single sheet of bare 2024-T3 aluminum alloy, 3988 mm high, 2286 mm wide, and 1.016 mm thick, are being fabricated with simulated through-cracks oriented horizontally at mid-height. Using existing information, a test matrix has been set up that explores regions of failure that are controlled by fracture mechanics, with additional tests near the boundary between plastic collapse and fracture. In addition, a variety of multiple site damage (MSD) configurations have been included to distinguish between various proposed linkage mechanisms. All tests but one use anti-buckling guides. At this writing, seven specimens have been tested. Three were fabricated with a single central crack, three others had multiple cracks on each side of the central crack, and one had a single crack but no anti-buckling guides. Each fracture event was recorded on film, video, computer, magnetic tape, and occasionally by optical microscopy. The visual records showed the crack tip with a load meter in the field of view, using motion picture film for one crack tip and SVHS video tape for the other. The computer recorded the output of the testing machine load cell, the stroke, and twelve strain gages at 1.5-second intervals. A wideband FM magnetic tape recorder was used to record data from the same sources. The data were analyzed by two different procedures: (1) the plastic zone model based on the residual strength diagram; and (2) the R-curve. The first three tests were used to determine the basic material properties, and these results were then used in the analysis of the two subsequent tests with MSD cracks.
There is good agreement between measured values and results obtained from the model.
Multi-scale occupancy estimation and modelling using multiple detection methods
Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.
2008-01-01
Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods.
The models can be viewed as another variation of Pollock's robust design and may be applicable to a wide variety of scenarios where species occur in an area but are not always near the sampled locations. The estimation approach is likely to be especially useful in multispecies conservation programmes by providing efficient estimates using multiple detection devices and by providing device-specific detection probability estimates for use in survey design.
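The two spatial scales and the method-specific detection probabilities described above combine multiplicatively: with large-scale occupancy ψ, local presence θ given use, and per-method detection probabilities p_m, the chance that at least one method records the species at a station is ψ·θ·(1 − ∏(1 − p_m)). A minimal sketch with made-up parameter values, not estimates from the skunk or salamander data:

```python
def prob_any_detection(psi, theta, method_ps):
    """P(at least one method detects the species at a station).

    psi       -- large-scale occupancy (species uses the sample unit)
    theta     -- small-scale presence given use (species at the station)
    method_ps -- per-method detection probabilities given presence
    """
    p_miss_all = 1.0
    for p in method_ps:
        p_miss_all *= 1.0 - p
    return psi * theta * (1.0 - p_miss_all)

# Illustrative values: two devices with detection probabilities 0.3 and 0.5.
p = prob_any_detection(psi=0.8, theta=0.6, method_ps=[0.3, 0.5])
```

The model's estimation machinery works the other way around, recovering ψ, θ, and the p_m from detection histories, but this forward calculation shows why adding methods raises the effective detection probability.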
Voices from Test-Takers: Further Evidence for Language Assessment Validation and Use
ERIC Educational Resources Information Center
Cheng, Liying; DeLuca, Christopher
2011-01-01
Test-takers' interpretations of validity as related to test constructs and test use have been widely debated in large-scale language assessment. This study contributes further evidence to this debate by examining 59 test-takers' perspectives in writing large-scale English language tests. Participants wrote about their test-taking experiences in…
Strain localization in models and nature: bridging the gaps.
NASA Astrophysics Data System (ADS)
Burov, E.; Francois, T.; Leguille, J.
2012-04-01
Mechanisms of strain localization and their role in tectonic evolution are still largely debated. Indeed, laboratory data on strain localization processes are not abundant; they do not cover the entire range of possible mechanisms and have to be extrapolated, sometimes with great uncertainty, to geological scales, while observations of localization processes at outcrop scale are scarce, not always representative, and usually difficult to quantify. Numerical thermo-mechanical models allow us to investigate the relative importance of some of the localization processes, whether they are hypothesized or observed at laboratory or outcrop scale. The numerical models can test different observationally or analytically derived laws in terms of their applicability to natural scales and tectonic processes. The models are limited, however, in their capacity to reproduce physical mechanisms, and necessarily simplify the softening laws, leading to "numerical" localization. Numerical strain localization is also limited by grid resolution and by the ability of specific numerical codes to handle large strains and the complexity of the associated physical phenomena. Hence, multiple iterations between observations and models are needed to elucidate the causes of strain localization in nature. Here we investigate the relative impact of different weakening laws on localization of deformation using large-strain thermo-mechanical models. Using several "generic" rifting and collision settings, we test the implications of structural softening, tectonic heritage, shear heating, friction angle and cohesion softening, and ductile softening (mimicking grain-size reduction), as well as a number of other mechanisms such as fluid-assisted phase changes.
The results suggest that different mechanisms of strain localization may interfere in nature, yet in most cases it is not easy to establish quantifiable links between the laboratory data and the best-fitting parameters of the effective softening laws that allow large-scale tectonic evolution to be reproduced. For example, one of the most effective and widely used mechanisms of "numerical" strain localization is friction angle softening; yet precisely this law appears to be the most difficult to justify on physical and observational grounds.
Intrinsic fluctuations of the proton saturation momentum scale in high multiplicity p+p collisions
McLerran, Larry; Tribedy, Prithwish
2015-11-02
High multiplicity events in p+p collisions are studied using the theory of the Color Glass Condensate. Here, we show that intrinsic fluctuations of the proton saturation momentum scale are needed in addition to the sub-nucleonic color charge fluctuations to explain the very high multiplicity tail of distributions in p+p collisions. It is presumed that the origin of such intrinsic fluctuations is non-perturbative in nature. Classical Yang-Mills simulations using the IP-Glasma model are performed to make quantitative estimations. Furthermore, we find that fluctuations as large as O(1) of the average values of the saturation momentum scale can lead to the rare high multiplicity events seen in p+p data at RHIC and LHC energies. Using the available data on multiplicity distributions, we try to constrain the distribution of the proton saturation momentum scale and make predictions for the multiplicity distribution in 13 TeV p+p collisions.
Role of optometry school in single day large scale school vision testing
Anuradha, N; Ramani, Krishnakumar
2015-01-01
Background: School vision testing aims at identification and management of refractive errors. Large-scale school vision testing using conventional methods is time-consuming and demands a lot of chair time from eye care professionals. A new strategy involving a school of optometry in single-day large-scale school vision testing is discussed. Aim: The aim was to describe a new approach for performing vision testing of school children on a large scale in a single day. Materials and Methods: A single-day vision testing strategy was implemented wherein 123 members (20 teams comprising optometry students and headed by optometrists) conducted vision testing for children in 51 schools. School vision testing included basic vision screening, refraction, frame measurements, frame choice, and referrals for other ocular problems. Results: A total of 12,448 children were screened, among whom 420 (3.37%) were identified as having refractive errors; 28 (1.26%) of these children belonged to the primary, 163 (9.80%) to the middle, 129 (4.67%) to the secondary, and 100 (1.73%) to the higher secondary levels of education. 265 (2.12%) children were referred for further evaluation. Conclusion: Single-day large-scale school vision testing can be adopted by schools of optometry to reach a higher number of children within a short span. PMID:25709271
Geospatial optimization of siting large-scale solar projects
Macknick, Jordan; Quinby, Ted; Caulfield, Emmet; Gerritsen, Margot; Diffendorfer, James E.; Haines, Seth S.
2014-01-01
guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.
Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner
2016-01-01
Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, which are subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
Physical activity correlates with neurological impairment and disability in multiple sclerosis.
Motl, Robert W; Snook, Erin M; Wynn, Daniel R; Vollmer, Timothy
2008-06-01
This study examined the correlation of physical activity with neurological impairment and disability in persons with multiple sclerosis (MS). Eighty individuals with MS wore an accelerometer for 7 days and completed the Symptom Inventory (SI), Performance Scales (PS), and Expanded Disability Status Scale. There were large negative correlations between the accelerometer and SI (r = -0.56; rho = -0.58) and Expanded Disability Status Scale (r = -0.60; rho = -0.69) and a moderate negative correlation between the accelerometer and PS (r = -0.39; rho = -0.48) indicating that physical activity was associated with reduced neurological impairment and disability. Such findings provide a preliminary basis for using an accelerometer and the SI and PS as outcome measures in large-scale prospective and experimental examinations of the effect of physical activity behavior on disability and dependence in MS.
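The abstract reports both Pearson (r) and Spearman (rho) coefficients, which can differ when the association is monotone but nonlinear. A self-contained sketch of both statistics on toy data (not the study's accelerometer values):

```python
def pearson(x, y):
    """Pearson product-moment correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(x):
    """Ranks (1-based), with ties sharing their average rank."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Toy monotone but nonlinear relation: higher activity, lower disability.
activity = [1.0, 2.0, 3.0, 4.0, 5.0]
disability = [9.0, 7.0, 6.0, 4.0, 1.0]
r = pearson(activity, disability)
rho = spearman(activity, disability)
```

Reporting both, as the abstract does, guards against a linear measure understating a monotone association.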
NASA Astrophysics Data System (ADS)
Wu, Qiujie; Tan, Liu; Xu, Sen; Liu, Dabin; Min, Li
2018-04-01
Numerous accidents involving emulsion explosive (EE) are attributed to uncontrolled thermal decomposition of ammonium nitrate emulsion (ANE, the intermediate of EE) and EE at large scale. To study the thermal decomposition characteristics of ANE and EE at different scales, a large-scale test, the modified vented pipe test (MVPT), and two laboratory-scale tests, differential scanning calorimetry (DSC) and accelerating rate calorimetry (ARC), were applied in the present study. Both the scale effect and the water effect play an important role in the thermal stability of ANE and EE. The measured decomposition temperatures of ANE and EE in the MVPT are 146°C and 144°C, respectively, much lower than those in DSC and ARC. As the size of the same sample in DSC, ARC, and MVPT successively increases, the onset temperatures decrease. In the same test, the measured onset temperature of ANE is higher than that of EE, and the water content of the sample has a stabilizing effect. The large-scale MVPT can provide information relevant to real-life operations. Large-scale operations carry more risk, and continuous overheating should be avoided.
The positive and negative consequences of multiple-choice testing.
Roediger, Henry L; Marsh, Elizabeth J
2005-09-01
Multiple-choice tests are commonly used in educational settings but with unknown effects on students' knowledge. The authors examined the consequences of taking a multiple-choice test on a later general knowledge test in which students were warned not to guess. A large positive testing effect was obtained: Prior testing of facts aided final cued-recall performance. However, prior testing also had negative consequences. Prior reading of a greater number of multiple-choice lures decreased the positive testing effect and increased production of multiple-choice lures as incorrect answers on the final test. Multiple-choice testing may inadvertently lead to the creation of false knowledge.
Measuring emotions during epistemic activities: the Epistemically-Related Emotion Scales.
Pekrun, Reinhard; Vogl, Elisabeth; Muis, Krista R; Sinatra, Gale M
2017-09-01
Measurement instruments assessing multiple emotions during epistemic activities are largely lacking. We describe the construction and validation of the Epistemically-Related Emotion Scales, which measure surprise, curiosity, enjoyment, confusion, anxiety, frustration, and boredom occurring during epistemic cognitive activities. The instrument was tested in a multinational study of emotions during learning from conflicting texts (N = 438 university students from the United States, Canada, and Germany). The findings document the reliability, internal validity, and external validity of the instrument. A seven-factor model best fit the data, suggesting that epistemically-related emotions should be conceptualised in terms of discrete emotion categories, and the scales showed metric invariance across the North American and German samples. Furthermore, emotion scores changed over time as a function of conflicting task information and related significantly to perceived task value and use of cognitive and metacognitive learning strategies.
Leading a change process to improve health service delivery.
Bahamon, Claire; Dwyer, Joseph; Buxbaum, Ann
2006-01-01
In the fields of health and development, donors channel multiple resources into the design of new practices and technologies, as well as small-scale programmes to test them. But successful practices are rarely scaled up to the level where they beneficially impact large, impoverished populations. An effective process for change is to use the experience with new practices gained at the programme level for full-scale implementation. To make an impact, new practices need to be applied, and supported by management systems, at many organizational levels. At every level, potential implementers and likely beneficiaries must first recognize characteristics of the new practices that would benefit them. An effective change process, led by a dedicated internal change agent, comprises several well-defined phases that successively broaden and institutionalize the use of new practices. PMID:16917654
Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh
2014-01-01
This review describes the adaptation of the plant virus-based transient expression system magnICON(®) for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g., antigens comprising individualized vaccines to treat non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.
BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulation is a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos, and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can be months on large CPU clusters, making the Discrete Element Method (DEM) infeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code BlazeDEM3D-GPU, which can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single GPU or multiple GPUs.
NASA Technical Reports Server (NTRS)
Evans, Richard K.; Hill, Gerald M.
2012-01-01
Very large space environment test facilities present unique engineering challenges in the design of facility data systems. Data systems of this scale must be versatile enough to meet the wide range of data acquisition and measurement requirements from a diverse set of customers and test programs, but also must minimize design changes to maintain reliability and serviceability. This paper presents an overview of the common architecture and capabilities of the facility data acquisition systems available at two of the world's largest space environment test facilities located at the NASA Glenn Research Center's Plum Brook Station in Sandusky, Ohio; namely, the Space Propulsion Research Facility (commonly known as the B-2 facility) and the Space Power Facility (SPF). The common architecture of the data systems is presented along with details on system scalability and efficient measurement systems analysis and verification. The architecture highlights a modular design, which utilizes fully-remotely managed components, enabling the data systems to be highly configurable and support multiple test locations with a wide-range of measurement types and very large system channel counts.
Calving distributions of individual bulls in multiple-sire pastures
USDA-ARS?s Scientific Manuscript database
The objective of this project was to quantify patterns in the calving rate of sires in multiple-sire pastures over seven years at a large-scale cow-calf operation. Data consisted of reproductive and genomic records from multiple-sire breeding pastures (n=33) at the United States Meat Animal Research...
Misra, Sanchit; Pamnany, Kiran; Aluru, Srinivas
2015-01-01
Construction of whole-genome networks from large-scale gene expression data is an important problem in systems biology. While several techniques have been developed, most cannot handle network reconstruction at the whole-genome scale, and the few that can, require large clusters. In this paper, we present a solution on the Intel Xeon Phi coprocessor, taking advantage of its multi-level parallelism including many x86-based cores, multiple threads per core, and vector processing units. We also present a solution on the Intel® Xeon® processor. Our solution is based on TINGe, a fast parallel network reconstruction technique that uses mutual information and permutation testing for assessing statistical significance. We demonstrate the first ever inference of a plant whole genome regulatory network on a single chip by constructing a 15,575 gene network of the plant Arabidopsis thaliana from 3,137 microarray experiments in only 22 minutes. In addition, our optimization for parallelizing mutual information computation on the Intel Xeon Phi coprocessor holds out lessons that are applicable to other domains.
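The core statistical step named above, mutual information with permutation testing, can be illustrated at toy scale. This sketch uses naive equal-width binning and a plain permutation loop; the function names and binning choice are illustrative, and TINGe's actual implementation is far more optimized:

```python
import math
import random

def mutual_information(xs, ys, bins=3):
    """MI (in nats) between two samples after equal-width binning."""
    def binned(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((t - lo) / w), bins - 1) for t in v]
    bx, by = binned(xs), binned(ys)
    n = len(xs)
    joint, px, py = {}, {}, {}
    for a, b in zip(bx, by):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        pj = c / n
        mi += pj * math.log(pj / ((px[a] / n) * (py[b] / n)))
    return mi

def permutation_pvalue(xs, ys, n_perm=200, seed=1):
    """Fraction of permuted MIs at least as large as the observed MI."""
    rng = random.Random(seed)
    observed = mutual_information(xs, ys)
    hits = 0
    ys = list(ys)
    for _ in range(n_perm):
        rng.shuffle(ys)  # break the pairing to sample the null
        if mutual_information(xs, ys) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Strongly dependent pair -> small p-value; an edge would be kept.
x = [i / 10 for i in range(30)]
y = [t * t for t in x]
p = permutation_pvalue(x, y)
```

Repeating this test for every gene pair, with a multiple-testing correction, is what makes whole-genome reconstruction so computationally demanding and why the paper's many-core parallelization matters.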
Meador, M.R.; Whittier, T.R.; Goldstein, R.M.; Hughes, R.M.; Peck, D.V.
2008-01-01
Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data collection, analyses, and interpretation. The index of biotic integrity (IBI) has been widely used in eastern and central North America, where fish assemblages are complex and largely composed of native species, but IBI development has been hindered in the western United States because of relatively low fish species richness and greater relative abundance of alien fishes. Approaches to developing IBIs rarely provide a consistent means of assessing biological condition across multiple ecoregions. We conducted an evaluation of IBIs recently proposed for three ecoregions of the western United States using an independent data set covering a large geographic scale. We standardized the regional IBIs and developed biological condition criteria, assessed the responsiveness of IBIs to basin-level land uses, and assessed their precision and concordance with basin-scale IBIs. Standardized IBI scores from 318 sites in the western United States comprising mountain, plains, and xeric ecoregions were significantly related to combined urban and agricultural land uses. Standard deviations and coefficients of variation revealed relatively low variation in IBI scores based on multiple sampling reaches at sites. A relatively high degree of corroboration with independent, locally developed IBIs indicates that the regional IBIs are robust across large geographic scales, providing precise and accurate assessments of biological condition for western U.S. streams. © Copyright by the American Fisheries Society 2008.
Influence of spasticity on mobility and balance in persons with multiple sclerosis.
Sosnoff, Jacob J; Gappmaier, Eduard; Frame, Amy; Motl, Robert W
2011-09-01
Spasticity is a motor disorder characterized by a velocity-dependent increase in tonic stretch reflexes that presumably affects mobility and balance. This investigation examined the hypothesis that persons with multiple sclerosis (MS) who have spasticity of the lower legs would have more impairment of mobility and balance compared to those without spasticity. Participants were 34 ambulatory persons with a definite diagnosis of MS. The Expanded Disability Status Scale (EDSS) was used to characterize disability in the study sample. All participants underwent measurements of spasticity in the gastroc-soleus muscles of both legs (modified Ashworth scale), walking speed (timed 25-foot walk), mobility (Timed Up and Go), walking endurance (6-minute walk test), self-reported impact of MS on walking ability (Multiple Sclerosis Walking Scale-12), and balance (Berg Balance Test and Activities-specific Balance Confidence Scale). Fifteen participants had spasticity of the gastroc-soleus muscles based on modified Ashworth scale scores. The spasticity group had higher median EDSS scores, indicating greater disability (P = 0.03). Mobility and balance were significantly more impaired in the group with spasticity compared to the group without spasticity: timed 25-foot walk (P = 0.02, d = -0.74), Timed Up and Go (P = 0.01, d = -0.84), 6-minute walk test (P < 0.01, d = 1.03), Multiple Sclerosis Walking Scale-12 (P = 0.04, d = -0.76), Berg Balance Test (P = 0.02, d = -0.84), and Activities-specific Balance Confidence Scale (P = 0.04, d = -0.59). Spasticity in the gastroc-soleus muscles appears to have a negative effect on mobility and balance in persons with MS. The relationship between spasticity and disability in persons with MS requires further exploration.
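The group differences above are reported as Cohen's d, the mean difference standardized by the pooled SD. A minimal sketch with invented walk distances (not the study's raw data):

```python
def cohens_d(group_a, group_b):
    """Cohen's d with the pooled standard deviation in the denominator."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (ma - mb) / pooled_sd

# Illustrative 6-minute-walk distances in meters, one value per person.
no_spasticity = [450.0, 470.0, 430.0, 460.0]
spasticity = [380.0, 400.0, 360.0, 390.0]
d = cohens_d(no_spasticity, spasticity)
```

The sign of d depends on which group is subtracted from which, which is why the abstract's mix of positive and negative d values for "more impaired" comparisons has to be read against each measure's direction.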
Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle A T; Yandell, Mark; Huff, Chad D
2018-04-06
High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduces strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females) using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits.
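XPAT's gene-based tests are not specified in this abstract; as a hedged sketch of the simplest gene-based case-control test, a one-sided Fisher's exact test on a 2x2 carrier table can be written with only the standard library (the counts below are invented for illustration, not from the study):

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    probability, under the null hypergeometric model, of observing at
    least `a` carriers among cases given the table margins."""
    row1, row2 = a + b, c + d          # cases, controls
    col1 = a + c                       # total carriers
    n = row1 + row2
    denom = comb(n, col1)
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        if col1 - x <= row2:
            p += comb(row1, x) * comb(row2, col1 - x) / denom
    return p

# Hypothetical gene-burden table: rare-variant carriers vs non-carriers
# in cases and controls (counts are illustrative only).
p = fisher_exact_greater(12, 771, 20, 3487)
```

A cross-platform toolkit must additionally ensure that carrier status is called consistently across sequencing platforms before any such per-gene test is meaningful.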
A multiple scales approach to maximal superintegrability
NASA Astrophysics Data System (ADS)
Gubbiotti, G.; Latini, D.
2018-07-01
In this paper we present a simple, algorithmic test to establish if a Hamiltonian system is maximally superintegrable or not. This test is based on a very simple corollary of a theorem due to Nekhoroshev and on a perturbative technique called the multiple scales method. If the outcome is positive, this test can be used to suggest maximal superintegrability, whereas when the outcome is negative it can be used to disprove it. This method can be regarded as a finite dimensional analog of the multiple scales method as a way to produce soliton equations. We use this technique to show that the real counterpart of a mechanical system found by Jules Drach in 1935 is, in general, not maximally superintegrable. We give some hints on how this approach could be applied to classify maximally superintegrable systems by presenting a direct proof of the well-known Bertrand’s theorem.
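For readers unfamiliar with the multiple scales method invoked here, a standard textbook illustration (the weakly damped oscillator, not the paper's Hamiltonian setting) shows the mechanics of introducing slow times and removing secular terms:

```latex
\ddot{x} + \epsilon \dot{x} + x = 0, \qquad 0 < \epsilon \ll 1 .
```

With $T_0 = t$, $T_1 = \epsilon t$, $x = x_0(T_0,T_1) + \epsilon\, x_1(T_0,T_1) + \cdots$ and $\mathrm{d}/\mathrm{d}t = \partial_{T_0} + \epsilon\,\partial_{T_1}$, one finds

```latex
\begin{aligned}
O(1):\quad & \partial_{T_0}^{2} x_0 + x_0 = 0
  \;\Rightarrow\; x_0 = A(T_1)\, e^{i T_0} + \text{c.c.},\\
O(\epsilon):\quad & \partial_{T_0}^{2} x_1 + x_1
  = -\,i\bigl(2\,\partial_{T_1} A + A\bigr)\, e^{i T_0} + \text{c.c.},
\end{aligned}
```

so removing the secular term forces $2A' + A = 0$, i.e. $A(T_1) = A(0)\,e^{-T_1/2}$: the slow time captures behavior that a naive expansion in $\epsilon$ would misrepresent.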
Wan, Shixiang; Zou, Quan
2017-01-01
Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The rapid growth of next-generation sequencing output has created a shortage of efficient alignment approaches for ultra-large biological sequence datasets of different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g. files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein datasets of more than 1 GB showed that HAlign-II saves time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II with open-source codes and datasets was established at http://lab.malab.cn/soft/halign.
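HAlign-II's internal algorithms are not reproduced in this abstract; as a minimal, hedged sketch of the pairwise dynamic-programming step that underlies most MSA strategies (including center-star approaches), consider Needleman-Wunsch global alignment scoring:

```python
def needleman_wunsch(s, t, match=1, mismatch=-1, gap=-1):
    """Global alignment score by dynamic programming.
    dp[i][j] holds the best score for aligning s[:i] with t[:j]."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * gap            # align s[:i] against all gaps
    for j in range(1, n + 1):
        dp[0][j] = j * gap            # align t[:j] against all gaps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = match if s[i - 1] == t[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # substitution
                           dp[i - 1][j] + gap,       # gap in t
                           dp[i][j - 1] + gap)       # gap in s
    return dp[m][n]

score = needleman_wunsch("GATTACA", "GCATGCU")
```

Distributed tools parallelize many such alignments (and smarter variants) across a cluster; the quadratic cost per pair is exactly why ultra-large inputs demand frameworks like Spark.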
NASA Technical Reports Server (NTRS)
D'Souza, Christopher; Milenkovich, Zoran; Wilson, Zachary; Huich, David; Bendle, John; Kibler, Angela
2011-01-01
The Space Operations Simulation Center (SOSC) at the Lockheed Martin (LM) Waterton Campus in Littleton, Colorado is a dynamic test environment focused on Autonomous Rendezvous and Docking (AR&D) development testing and risk reduction activities. The SOSC supports multiple program pursuits and accommodates testing Guidance, Navigation, and Control (GN&C) algorithms for relative navigation, hardware testing and characterization, as well as software and test process development. The SOSC consists of a high bay (60 meters long by 15.2 meters wide by 15.2 meters tall) with dual six degree-of-freedom (6DOF) motion simulators and a single fixed base 6DOF robot. The large testing area (maximum sensor-to-target effective range of 60 meters) allows for large-scale, flight-like simulations of proximity maneuvers and docking events. The facility also has two apertures for access to external extended-range outdoor target test operations. In addition, the facility contains four Mission Operations Centers (MOCs) with connectivity to dual high bay control rooms and a data/video interface room. The high bay is rated at Class 300,000 cleanliness (maximum particle counts for particles ≥0.5 µm) and includes orbital lighting simulation capabilities.
NASA Astrophysics Data System (ADS)
Thorslund, J.; Jarsjo, J.; Destouni, G.
2017-12-01
The quality of freshwater resources is increasingly impacted by human activities. Humans also extensively change the structure of landscapes, which may alter natural hydrological processes. To manage and maintain freshwater of good water quality, it is critical to understand how pollutants are released into, transported and transformed within the hydrological system. Some key scientific questions include: What are net downstream impacts of pollutants across different hydroclimatic and human disturbance conditions, and on different scales? What are the functions within and between components of the landscape, such as wetlands, on mitigating pollutant load delivery to downstream recipients? We explore these questions by synthesizing results from several relevant case study examples of intensely human-impacted hydrological systems. These case study sites have been specifically evaluated in terms of net impact of human activities on pollutant input to the aquatic system, as well as flow-path distributions through wetlands as a potential ecosystem service of pollutant mitigation. Results show that although individual wetlands have high retention capacity, efficient net retention effects were not always achieved at a larger landscape scale. Evidence suggests that the function of wetlands as mitigation solutions to pollutant loads is largely controlled by large-scale parallel and circular flow-paths, through which multiple wetlands are interconnected in the landscape. To achieve net mitigation effects at large scale, a large fraction of the polluted large-scale flows must be transported through multiple connected wetlands. Although such large-scale flow interactions are critical for assessing water pollution spreading and fate through the landscape, our synthesis shows a frequent lack of knowledge at such scales.
We suggest ways forward for addressing the mismatch between the large scales at which key pollutant pressures and water quality changes take place and the relatively small scale at which most studies and implementations are currently made. These suggestions can help bridge critical knowledge gaps, as needed for improving water quality predictions and mitigation solutions under human and environmental changes.
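The argument that landscape-scale mitigation requires routing a large fraction of flow through multiple connected wetlands can be made concrete with simple routing arithmetic (the retention fractions and routing fraction below are hypothetical, not from the case studies):

```python
def net_retention(f, r, n):
    """Fraction of the total pollutant load removed when a fraction f
    of the large-scale flow passes through a chain of n wetlands,
    each retaining a fraction r of the load entering it."""
    return f * (1 - (1 - r) ** n)

# Hypothetical: each wetland retains 60% of its inflow load.
single = net_retention(1.0, 0.6, 1)      # all flow through one wetland
landscape = net_retention(0.2, 0.6, 3)   # only 20% routed through a chain of 3
```

Even with three highly retentive wetlands in series, routing only 20% of the basin's flow through the chain caps net landscape retention below 19%, illustrating why individually efficient wetlands need not produce efficient net retention at larger scales.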
Self-consistency tests of large-scale dynamics parameterizations for single-column modeling
Edman, Jacob P.; Romps, David M.
2015-03-18
Large-scale dynamics parameterizations are tested numerically in cloud-resolving simulations, including a new version of the weak-pressure-gradient approximation (WPG) introduced by Edman and Romps (2014), the weak-temperature-gradient approximation (WTG), and a prior implementation of WPG. We perform a series of self-consistency tests with each large-scale dynamics parameterization, in which we compare the result of a cloud-resolving simulation coupled to WTG or WPG with an otherwise identical simulation with prescribed large-scale convergence. In self-consistency tests based on radiative-convective equilibrium (RCE; i.e., no large-scale convergence), we find that simulations either weakly coupled or strongly coupled to either WPG or WTG are self-consistent, but WPG-coupled simulations exhibit a nonmonotonic behavior as the strength of the coupling to WPG is varied. We also perform self-consistency tests based on observed forcings from two observational campaigns: the Tropical Warm Pool International Cloud Experiment (TWP-ICE) and the ARM Southern Great Plains (SGP) Summer 1995 IOP. In these tests, we show that the new version of WPG improves upon prior versions of WPG by eliminating a potentially troublesome gravity-wave resonance.
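The weak-temperature-gradient approximation referenced above is commonly written as a diagnostic for the large-scale vertical velocity that relaxes the domain-mean temperature anomaly (one standard form; the symbols are generic and not necessarily the authors' exact notation):

```latex
w_{\mathrm{WTG}} \, \frac{\partial \bar{\theta}}{\partial z}
  \;=\; \frac{\bar{\theta} - \theta_{\mathrm{ref}}}{\tau},
```

where $\bar{\theta}$ is the domain-mean potential temperature, $\theta_{\mathrm{ref}}$ a reference profile, and $\tau$ a relaxation time scale; $w_{\mathrm{WTG}}$ then supplies the implied large-scale convergence to the cloud-resolving model. WPG instead couples the column to a parameterized pressure-gradient (gravity-wave) response, which is where the resonance issue addressed by the new version arises.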
Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J
2017-01-01
Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.
The Saskatchewan River Basin - a large scale observatory for water security research (Invited)
NASA Astrophysics Data System (ADS)
Wheater, H. S.
2013-12-01
The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.
Gravitational lenses and large scale structure
NASA Technical Reports Server (NTRS)
Turner, Edwin L.
1987-01-01
Four possible statistical tests of the large scale distribution of cosmic material are described. Each is based on gravitational lensing effects. The current observational status of these tests is also summarized.
Dalgas, U; Langeskov-Christensen, M; Skjerbæk, A; Jensen, E; Baert, I; Romberg, A; Santoyo Medina, C; Gebara, B; Maertens de Noordhout, B; Knuts, K; Béthoux, F; Rasova, K; Severijns, D; Bibby, B M; Kalron, A; Norman, B; Van Geel, F; Wens, I; Feys, P
2018-04-15
The relationship between fatigue impact and walking capacity and perceived walking ability in patients with multiple sclerosis (MS) is inconclusive in the existing literature. A better understanding might guide new treatment avenues for fatigue and/or walking capacity in patients with MS. To investigate the relationship between the subjective impact of fatigue and objective walking capacity as well as subjective walking ability in MS patients. A cross-sectional multicenter study design was applied. Ambulatory MS patients (n = 189, age: 47.6 ± 10.5 years; gender: 115/74 women/men; Expanded Disability Status Scale (EDSS): 4.1 ± 1.8 [range: 0-6.5]) were tested at 11 sites. Objective tests of walking capacity included short walking tests (Timed 25-Foot Walk (T25FW), 10-Metre Walk Test (10mWT) at usual and fastest speed, and the timed up and go (TUG)) and long walking tests (2- and 6-Minute Walk Tests (2MWT and 6MWT)). Subjective walking ability was tested applying the Multiple Sclerosis Walking Scale-12 (MSWS-12). Fatigue impact was measured by the self-reported modified fatigue impact scale (MFIS), consisting of a total score (MFIS-total) and three subscales (MFIS-physical, MFIS-cognitive and MFIS-psychosocial). Uni- and multivariate regression analyses were performed to evaluate the relation between walking and fatigue impact. MFIS-total was negatively related to long (6MWT, r = -0.14, p = 0.05) and short composite (TUG, r = -0.22, p = 0.003) walking measures. MFIS-physical showed a significant albeit weak relationship to walking speed in all walking capacity tests (r = -0.22 to -0.33, p < .0001), which persisted in the multivariate linear regression analysis. Subjective walking ability (MSWS-12) was related to MFIS-total (r = 0.49, p < 0.0001), as well as to all other subscales of MFIS (r = 0.24-0.63, p < 0.001), showing stronger relationships than objective measures of walking.
The physical impact of fatigue is weakly related to objective walking capacity, while general, physical, cognitive and psychosocial fatigue impact are weakly to moderately related to subjective walking ability, when analysed in a large heterogeneous sample of MS patients. Copyright © 2018 Elsevier B.V. All rights reserved.
Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.
Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao
2016-04-01
To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of the complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has a much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.
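The MFRG details are in the paper itself; as a hedged sketch of the dimension-reduction idea behind functional regression, each subject's genotype profile can be projected onto a small basis indexed by normalized genomic position (the cosine basis and normalization here are illustrative assumptions, not the authors' exact specification):

```python
import math

def functional_coefficients(positions, genotypes, k_basis=3):
    """Project a genotype profile x(t) (variant dosages at normalized
    genomic positions t in [0, 1]) onto a cosine basis. The resulting
    low-dimensional coefficient vector stands in for the raw variants."""
    coefs = []
    for k in range(k_basis):
        c = sum(g * math.cos(math.pi * k * t)
                for t, g in zip(positions, genotypes))
        coefs.append(c / len(positions))
    return coefs

# Hypothetical subject: 3 variants at positions 0, 0.5, 1 with dosages 1, 0, 2.
coefs = functional_coefficients([0.0, 0.5, 1.0], [1, 0, 2])
```

An interaction test can then regress the trait on products of the two genes' coefficient vectors rather than on all pairwise variant products, which is what yields the power gains reported above.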
1980-10-01
Development; Problem Identification and Assessment for Aquatic Plant Management; Natural Succession of Aquatic Plants; Large-Scale Operations Management Test...of Insects and Pathogens for Control of Waterhyacinth in Louisiana; Large-Scale Operations Management Test to Evaluate Prevention Methodology for...Control of Eurasian Watermilfoil in Washington; Large-Scale Operations Management Test Using the White Amur at Lake Conway, Florida; and Aquatic Plant Control Activities in the Panama Canal Zone.
The Expanded Large Scale Gap Test
1987-03-01
NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. The expanded test was introduced, in part, to reduce the spread in the LSGT 50% gap value for the worst charges, such as those with the highest or lowest densities.
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-01-01
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-ranked primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
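The "stringent filtering constraints" mentioned above typically include GC content and melting temperature windows. A toy version of that single-primer filtering step (the thresholds and the simple Wallace-rule Tm are common defaults used here as assumptions, not MRPrimerW's exact criteria) looks like:

```python
def gc_content(p):
    """Fraction of G and C bases in primer p."""
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(p):
    """Wallace-rule melting temperature: rough estimate for short oligos."""
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def passes_filters(p, gc_lo=0.4, gc_hi=0.6, tm_lo=50, tm_hi=65):
    """Toy filtering step; real designers also check hairpins, dimers,
    and, as in the tool described above, genome-wide homology."""
    return gc_lo <= gc_content(p) <= gc_hi and tm_lo <= wallace_tm(p) <= tm_hi

ok = passes_filters("ATGCGTACGTTAGCCTAGAT")  # hypothetical 20-mer
```

The expensive part that distinguishes tools like this one is not these local filters but the exhaustive homology test against all off-target sequences, which is why a MapReduce-style algorithm is needed.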
Electrical characterization of a Mapham inverter using pulse testing techniques
NASA Technical Reports Server (NTRS)
Baumann, E. D.; Myers, I. T.; Hammoud, A. N.
1990-01-01
The use of a multiple pulse testing technique to determine the electrical characteristics of large megawatt-level power systems for aerospace missions is proposed. An innovative test method based on the multiple pulse technique is demonstrated on a 2-kW Mapham inverter. This technique shows that large power systems can be characterized at electrical equilibrium and rated power without large, costly power supplies, and the heat generation that occurs when systems are tested in a continuous mode is eliminated. The results indicate good agreement between this pulse testing technique and steady-state testing.
CROSS-SCALE CORRELATIONS AND THE DESIGN AND ANALYSIS OF AVIAN HABITAT SELECTION STUDIES
It has long been suggested that birds select habitat hierarchically, progressing from coarser to finer spatial scales. This hypothesis, in conjunction with the realization that many organisms likely respond to environmental patterns at multiple spatial scales, has led to a large ...
Real-time terrain storage generation from multiple sensors towards mobile robot operation interface.
Song, Wei; Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun; Um, Kyhyun
2014-01-01
A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots.
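The voxel-based flag map described above can be sketched in a few lines: quantize each incoming point to a voxel key and keep it only if that voxel is not yet occupied (the voxel size and points below are illustrative, not the system's parameters):

```python
def register_points(points, voxel_size=0.5):
    """Incremental registration with a voxel 'flag map': a point is kept
    only if its voxel has not been flagged yet, discarding redundant
    points from overlapping scans."""
    flags = set()            # occupied voxel keys
    kept = []
    for x, y, z in points:
        key = (int(x // voxel_size),
               int(y // voxel_size),
               int(z // voxel_size))
        if key not in flags:
            flags.add(key)
            kept.append((x, y, z))
    return kept

cloud = [(0.1, 0.1, 0.0),
         (0.2, 0.15, 0.05),   # same voxel as the first point: dropped
         (1.3, 0.1, 0.0)]
kept = register_points(cloud)
```

A set lookup per point keeps the cost linear in the stream length, which is what makes real-time registration of large-scale clouds feasible before the heavier mesh and texture stages.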
Drive-by large-region acoustic noise-source mapping via sparse beamforming tomography.
Tuna, Cagdas; Zhao, Shengkui; Nguyen, Thi Ngoc Tho; Jones, Douglas L
2016-10-01
Environmental noise is a risk factor for human physical and mental health, demanding an efficient large-scale noise-monitoring scheme. The current technology, however, involves extensive sound pressure level (SPL) measurements at a dense grid of locations, making it impractical on a city-wide scale. This paper presents an alternative approach using a microphone array mounted on a moving vehicle to generate two-dimensional acoustic tomographic maps that yield the locations and SPLs of the noise-sources sparsely distributed in the neighborhood traveled by the vehicle. The far-field frequency-domain delay-and-sum beamforming output power values computed at multiple locations as the vehicle drives by are used as tomographic measurements. The proposed method is tested with acoustic data collected by driving an electric vehicle with a rooftop-mounted microphone array along a straight road next to a large open field, on which various pre-recorded noise-sources were produced by a loudspeaker at different locations. The accuracy of the tomographic imaging results demonstrates the promise of this approach for rapid, low-cost environmental noise-monitoring.
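As a hedged sketch of the delay-and-sum measurement underlying the tomographic inputs (a time-domain, integer-sample-delay toy with invented signals; the paper works with frequency-domain beamforming), steering with the correct delays maximizes output power:

```python
def das_power(signals, delays):
    """Delay-and-sum beamformer power: advance each channel by its
    integer-sample steering delay, sum across channels, and return
    the mean-square power of the summed output."""
    n = min(len(s) - d for s, d in zip(signals, delays))
    out = [sum(s[d + i] for s, d in zip(signals, delays)) for i in range(n)]
    return sum(v * v for v in out) / n

# Hypothetical 2-channel example: the same pulse arrives 3 samples
# later on channel 2; steering with the true delay re-aligns it.
pulse = [0, 1, 2, 1, 0, 0, 0, 0, 0, 0]
ch1 = pulse
ch2 = [0, 0, 0] + pulse[:-3]
aligned = das_power([ch1, ch2], [0, 3])
mismatched = das_power([ch1, ch2], [0, 0])
```

Scanning candidate source locations, computing the implied delays, and recording the output power at each steering point is what produces the per-position measurements that the sparse tomographic reconstruction then inverts.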
Comparing multi-module connections in membrane chromatography scale-up.
Yu, Zhou; Karkaria, Tishtar; Espina, Marianela; Hunjun, Manjeet; Surendran, Abera; Luu, Tina; Telychko, Julia; Yang, Yan-Ping
2015-07-20
Membrane chromatography is increasingly used for protein purification in the biopharmaceutical industry. Membrane adsorbers are often pre-assembled by manufacturers as ready-to-use modules. In large-scale protein manufacturing settings, the use of multiple membrane modules for a single batch is often required due to the large quantity of feed material. The question as to how multiple modules can be connected to achieve optimum separation and productivity has been previously approached using model proteins and mass transport theories. In this study, we compare the performance of multiple membrane modules in series and in parallel in the production of a protein antigen. Series connection was shown to provide superior separation compared to parallel connection in the context of competitive adsorption. Copyright © 2015 Elsevier B.V. All rights reserved.
Katul, Gabriel G; Porporato, Amilcare; Nikora, Vladimir
2012-12-01
The existence of a "-1" power-law scaling at low wavenumbers in the longitudinal velocity spectrum of wall-bounded turbulence was explained by multiple mechanisms; however, experimental support has not been uniform across laboratory studies. This letter shows that Heisenberg's eddy viscosity approach can provide a theoretical framework that bridges these multiple mechanisms and explains the elusiveness of the "-1" power law in some experiments. Novel theoretical outcomes are conjectured about the role of intermittency and very-large scale motions in modifying the k⁻¹ scaling.
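The contested scaling itself follows from a standard dimensional argument: in the log layer, for wavenumbers between the inverse outer scale $\delta$ and the inverse wall distance $z$, the friction velocity $u_*$ is the only available velocity scale, giving (generic form, not the letter's Heisenberg-based derivation)

```latex
E_{uu}(k) \;=\; C_1\, u_*^{2}\, k^{-1}, \qquad \delta^{-1} \ll k \ll z^{-1},
```

with $C_1$ an order-one constant. The letter's contribution concerns why this range is robust in some flows yet elusive in others, via the roles of intermittency and very-large-scale motions.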
Wills, Paul S.; Pfeiffer, Timothy; Baptiste, Richard; Watten, Barnaby J.
2016-01-01
Control of alkalinity, dissolved carbon dioxide (dCO2), and pH are critical in marine recirculating aquaculture systems (RAS) in order to maintain health and maximize growth. A small-scale prototype aragonite sand-filled fluidized bed reactor was tested under varying conditions of alkalinity and dCO2 to develop and model the response of dCO2 across the reactor. A large-scale reactor was then incorporated into an operating marine recirculating aquaculture system to observe the reactor as the system moved toward equilibrium. The relationships between alkalinity, dCO2, and pH across the reactor are described by multiple regression equations. The change in dCO2 across the small-scale reactor indicated a strong likelihood that an equilibrium alkalinity would be maintained by using a fluidized bed aragonite reactor. The large-scale reactor verified this observation and established equilibrium at an alkalinity of approximately 135 mg/L as CaCO3, dCO2 of 9 mg/L, and a pH of 7.0 within 4 days that was stable during a 14 day test period. The fluidized bed aragonite reactor has the potential to simplify alkalinity and pH control, and aid in dCO2 control in RAS design and operation. Aragonite sand, purchased in bulk, is less expensive than sodium bicarbonate and could reduce overall operating production costs.
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
From a meso- to micro-scale connectome: array tomography and mGRASP
Rah, Jong-Cheol; Feng, Linqing; Druckmann, Shaul; Lee, Hojin; Kim, Jinhyun
2015-01-01
Mapping mammalian synaptic connectivity has long been an important goal of neuroscience because knowing how neurons and brain areas are connected underpins an understanding of brain function. Meeting this goal requires advanced techniques with single-synapse resolution and large-scale capacity, especially at the multiple scales tethering the meso- and micro-scale connectomes. Among several advanced LM-based connectome technologies, array tomography (AT) and mammalian GFP-Reconstitution Across Synaptic Partners (mGRASP) can provide relatively high-throughput mapping of synaptic connectivity at multiple scales. AT- and mGRASP-assisted circuit mapping (ATing and mGRASPing), combined with techniques such as retrograde viruses, brain clearing, and activity indicators, will help unlock the secrets of complex neural circuits. Here, we discuss these useful new tools to enable mapping of brain circuits at multiple scales, some functional implications of spatial synaptic distribution, and future challenges and directions of these endeavors. PMID:26089781
Qu, Long; Guennel, Tobias; Marshall, Scott L
2013-12-01
Following the rapid development of genome-scale genotyping technologies, genetic association mapping has become a popular tool to detect genomic regions responsible for certain (disease) phenotypes, especially in early-phase pharmacogenomic studies with limited sample size. In response to such applications, a good association test needs to be (1) applicable to a wide range of possible genetic models, including, but not limited to, the presence of gene-by-environment or gene-by-gene interactions and non-linearity of a group of marker effects, (2) accurate in small samples, fast to compute on the genomic scale, and amenable to large-scale multiple testing corrections, and (3) reasonably powerful to locate causal genomic regions. The kernel machine method represented in linear mixed models provides a viable solution by transforming the problem into testing the nullity of variance components. In this study, we consider score-based tests by choosing a statistic linear in the score function. When the model under the null hypothesis has only one error variance parameter, our test is exact in finite samples. When the null model has more than one variance parameter, we develop a new moment-based approximation that performs well in simulations. Through simulations and analysis of real data, we demonstrate that the new test possesses most of the aforementioned characteristics, especially when compared to existing quadratic score tests or restricted likelihood ratio tests. © 2013, The International Biometric Society.
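The core idea of the kernel machine approach, testing a variance component through a statistic that is quadratic in the null-model residuals, can be sketched in a few lines. This is a simplified illustration under a plain linear kernel with a single error variance; the function name and toy data are hypothetical, and this is not the authors' implementation:

```python
import numpy as np

def kernel_score_stat(y, X, K):
    """Score statistic for H0: tau = 0 in y = X b + h + e, h ~ N(0, tau * K).

    Under H0 the model reduces to ordinary least squares with one error
    variance, so the score for tau is quadratic in the OLS residuals.
    """
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # null-model fit
    r = y - X @ beta                              # OLS residuals
    sigma2 = r @ r / (n - X.shape[1])             # error variance estimate
    return (r @ K @ r) / (2.0 * sigma2)           # quadratic score (up to constants)

# Toy example: linear kernel built from 5 genetic markers (made-up data)
rng = np.random.default_rng(0)
n, p = 50, 5
G = rng.integers(0, 3, size=(n, p)).astype(float)   # genotype matrix (0/1/2)
K = G @ G.T                                         # linear kernel
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
y = rng.normal(size=n)
Q = kernel_score_stat(y, X, K)
```

In practice the null distribution of such a statistic is approximated by moment matching, which is where the paper's new approximation for multi-variance-parameter null models comes in.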
Gandolfi, Marialuisa; Munari, Daniele; Geroin, Christian; Gajofatto, Alberto; Benedetti, Maria Donata; Midiri, Alessandro; Carla, Fontana; Picelli, Alessandro; Waldner, Andreas; Smania, Nicola
2015-10-01
Impaired sensory integration contributes to balance disorders in patients with multiple sclerosis (MS). The objective of this paper is to compare the effects of sensory integration balance training against conventional rehabilitation on balance disorders, the level of balance confidence perceived, quality of life, fatigue, frequency of falls, and sensory integration processing in a large sample of patients with MS. This single-blind, randomized, controlled trial involved 80 outpatients with MS (EDSS: 1.5-6.0) and subjective symptoms of balance disorders. The experimental group (n = 39) received specific training to improve central integration of afferent sensory inputs; the control group (n = 41) received conventional rehabilitation (15 treatment sessions of 50 minutes each). Before treatment, after treatment, and at one month post-treatment, patients were evaluated by a blinded rater using the Berg Balance Scale (BBS), Activities-specific Balance Confidence Scale (ABC), Multiple Sclerosis Quality of Life-54, Fatigue Severity Scale (FSS), number of falls, and the Sensory Organization Balance Test (SOT). The experimental training produced greater improvements than the control training on the BBS (p < 0.001), the FSS (p < 0.002), number of falls (p = 0.002) and SOT (p < 0.05). Specific training to improve central integration of afferent sensory inputs may ameliorate balance disorders in patients with MS. Clinical Trial Registration (NCT01040117). © The Author(s), 2015.
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
Küçük, Fadime; Kara, Bilge; Poyraz, Esra Çoşkuner; İdiman, Egemen
2016-01-01
[Purpose] The aim of this study was to determine the effects of clinical Pilates in multiple sclerosis patients. [Subjects and Methods] Twenty multiple sclerosis patients were enrolled in this study. The participants were divided into two groups as the clinical Pilates and control groups. Cognition (Multiple Sclerosis Functional Composite), balance (Berg Balance Scale), physical performance (timed performance tests, Timed up and go test), tiredness (Modified Fatigue Impact scale), depression (Beck Depression Inventory), and quality of life (Multiple Sclerosis International Quality of Life Questionnaire) were measured before and after treatment in all participants. [Results] There were statistically significant differences in balance, timed performance, tiredness and Multiple Sclerosis Functional Composite tests between before and after treatment in the clinical Pilates group. We also found significant differences in timed performance tests, the Timed up and go test and the Multiple Sclerosis Functional Composite between before and after treatment in the control group. According to the difference analyses, there were significant differences in Multiple Sclerosis Functional Composite and Multiple Sclerosis International Quality of Life Questionnaire scores between the two groups in favor of the clinical Pilates group. There were statistically significant clinical differences in favor of the clinical Pilates group in comparison of measurements between the groups. Clinical Pilates improved cognitive functions and quality of life compared with traditional exercise. [Conclusion] In Multiple Sclerosis treatment, clinical Pilates should be used as a holistic approach by physical therapists. PMID:27134355
Quantification of Treatment Effect Modification on Both an Additive and Multiplicative Scale
Girerd, Nicolas; Rabilloud, Muriel; Pibarot, Philippe; Mathieu, Patrick; Roy, Pascal
2016-01-01
Background In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have an ardent interest in assessing absolute benefit associated with treatments. In older patients, some studies have reported lower relative treatment effect, which might translate into similar or even greater absolute treatment effect given their high baseline hazard for clinical events. Methods The effect of treatment and the effect modification of treatment were respectively assessed using a multiplicative and an additive hazard model in an analysis adjusted for propensity score in the context of coronary surgery. Results The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (Hazard ratio for interaction/year = 1.03, 95%CI: 1.00 to 1.06, p = 0.05) whereas the additive model reported a similar absolute hazard reduction with increasing age (Delta for interaction/year = 0.10, 95%CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of the follow-up in patients aged ≤ 60 and in patients > 70. Conclusions The present example demonstrates that a lower treatment effect in older patients on a relative scale can conversely translate into a similar treatment effect on an additive scale due to large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate for a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification. PMID:27045168
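The paper's central point, that a fixed hazard ratio implies a larger absolute benefit where the baseline hazard is higher, is easy to see under constant hazards (exponential survival). The numbers below are hypothetical and purely illustrative, not taken from the study:

```python
import math

def absolute_risk_reduction(h0, hr, t):
    """ARR at time t under constant hazards: S(t) = exp(-h * t).

    h0 is the control-group hazard; the treated hazard is h0 * hr.
    """
    s_control = math.exp(-h0 * t)
    s_treated = math.exp(-h0 * hr * t)
    return s_treated - s_control

# Same relative effect (HR = 0.8) applied to two baseline hazards
young = absolute_risk_reduction(h0=0.01, hr=0.8, t=10)  # low baseline risk
old = absolute_risk_reduction(h0=0.05, hr=0.8, t=10)    # high baseline risk
```

Here the older, higher-risk group gains a larger absolute risk reduction (and hence a smaller number needed to treat, NNT = 1/ARR) from the identical hazard ratio, which is the mechanism the additive-scale analysis makes visible.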
Orthographic and Phonological Neighborhood Databases across Multiple Languages.
Marian, Viorica
2017-01-01
The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
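An orthographic neighborhood in the classic Coltheart sense, all same-length words differing from a target by exactly one letter substitution, is straightforward to compute from any word list. The tiny lexicon below is made up for illustration; this sketch is not CLEARPOND's implementation:

```python
def orthographic_neighbors(word, lexicon):
    """Coltheart's N: same-length words differing by exactly one letter."""
    return [w for w in lexicon
            if len(w) == len(word) and w != word
            and sum(a != b for a, b in zip(word, w)) == 1]

# Hypothetical mini-lexicon
lex = {"cat", "cot", "cut", "car", "bat", "cart", "dog"}
neighbors = orthographic_neighbors("cat", lex)
```

Phonological neighborhoods are computed the same way over phoneme transcriptions rather than letters, which is how databases like CLEARPOND extend the measure within and across languages.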
Fracture Testing of Large-Scale Thin-Sheet Aluminum Alloy (MS Word file)
DOT National Transportation Integrated Search
1996-02-01
Word Document; A series of fracture tests on large-scale, precracked, aluminum alloy panels was carried out to examine and characterize the process by which cracks propagate and link up in this material. Extended grips and test fixtures were special...
Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan
2013-06-27
Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using the Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log running metrics during data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box.
Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html.
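Improvement (2), splitting large sequence files for better downstream load balance, amounts to chunking a FASTQ file (four lines per read) into equal-sized pieces that can be aligned in parallel. The following is a minimal in-memory sketch of that idea, not Rainbow's actual code, and the example reads are fabricated:

```python
def split_fastq(lines, reads_per_chunk):
    """Split FASTQ content (4 lines per read) into equally sized chunks
    so each chunk can be processed on a separate worker instance."""
    reads = [lines[i:i + 4] for i in range(0, len(lines), 4)]
    return [sum(reads[i:i + reads_per_chunk], [])
            for i in range(0, len(reads), reads_per_chunk)]

# Two fake reads with single-line sequences
fq = ["@r1", "ACGT", "+", "IIII",
      "@r2", "TTTT", "+", "IIII"]
chunks = split_fastq(fq, 1)
```

A production splitter would stream from disk rather than hold the file in memory, but the load-balancing principle, equal numbers of reads per worker, is the same.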
Large-scale gene function analysis with the PANTHER classification system.
Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D
2013-08-01
The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
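Among the statistical tests systems like PANTHER expose is classic overrepresentation analysis, whose core is a hypergeometric tail probability: how surprising is it to see at least k annotated genes in a list of n, when K of the N genes in the reference carry the annotation? The sketch below shows the generic test, not PANTHER's exact code:

```python
from math import comb

def overrep_pvalue(k, n, K, N):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of drawing
    at least k annotated genes in a sample of n, when K of the N genes
    in the reference set carry the annotation."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)
```

In a genome-wide analysis this p-value is computed for every ontology term or pathway and then corrected for multiple testing across terms.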
Multi-color electron microscopy by element-guided identification of cells, organelles and molecules.
Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I; de Boer, Pascal; Hagen, Kees C W; Hoogenboom, Jacob P; Giepmans, Ben N G
2017-04-07
Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased biomedical relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale 'color-EM' as a promising tool to unravel molecular (de)regulation in biomedicine.
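The "palette of colors" described above comes from assigning each elemental map to a color channel of the grey-scale EM image. A minimal sketch of that compositing step is shown below; it is an illustration of the general idea, not the authors' pipeline, and the arrays are placeholders for real EDX maps:

```python
import numpy as np

def elemental_color_overlay(maps):
    """Combine up to three EDX element maps into an RGB image: each map
    is normalized to [0, 1] and assigned to one color channel, so pixel
    color reflects local elemental composition."""
    h, w = maps[0].shape
    rgb = np.zeros((h, w, 3))
    for ch, m in enumerate(maps[:3]):
        span = m.max() - m.min()
        rgb[..., ch] = (m - m.min()) / span if span else 0.0
    return rgb

# Fabricated 2x2 "maps" for two elements (e.g., gold and cadmium labels)
gold = np.array([[0.0, 1.0], [2.0, 3.0]])
cadmium = np.zeros((2, 2))
overlay = elemental_color_overlay([gold, cadmium])
```

In practice the color overlay would be blended with the high-resolution grey-scale EM image so ultrastructure and composition are visible together.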
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics-MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
Bose-Einstein correlations in pp and PbPb collisions with ALICE at the LHC
Kisiel, Adam
2018-05-14
We report on the results of identical pion femtoscopy at the LHC. The Bose-Einstein correlation analysis was performed on the large-statistics ALICE p+p datasets at √s = 0.9 TeV and 7 TeV collected during 2010 LHC running and the first Pb+Pb dataset at √s_NN = 2.76 TeV. Detailed pion femtoscopy studies in heavy-ion collisions have shown that emission region sizes ("HBT radii") decrease with increasing pair momentum, which is understood as a manifestation of the collective behavior of matter. 3D radii were also found to universally scale with event multiplicity. In p+p collisions at 7 TeV one measures multiplicities which are comparable with those registered in peripheral AuAu and CuCu collisions at RHIC, so direct comparisons and tests of scaling laws are now possible. We show the results of double-differential 3D pion HBT analysis, as a function of multiplicity and pair momentum. The results for two collision energies are compared to results obtained in heavy-ion collisions at similar multiplicity and p+p collisions at lower energy. We identify the relevant scaling variables for the femtoscopic radii and discuss the similarities and differences to results from heavy ions. The observed trends give insight into the soft particle production mechanism in p+p collisions and suggest that a self-interacting collective system may be created in sufficiently high multiplicity events. First results for central Pb+Pb collisions are also shown. A significant increase of the reaction zone volume and lifetime in comparison to RHIC is observed. Signatures of collective hydrodynamics-like behavior of the system are also apparent, and are compared to model predictions.
[Effect of preventive treatment on cognitive performance in patients with multiple sclerosis].
Shorobura, Maria S
2018-01-01
Introduction: Cognitive, emotional and psychopathological changes play a significant role in the clinical picture of multiple sclerosis and influence the effectiveness of drug therapy, working capacity, quality of life, and the rehabilitation of patients with multiple sclerosis. The aim: To investigate changes in cognitive function in patients with multiple sclerosis, namely information processing speed and working memory, before and after treatment with an immunomodulating drug. Materials and methods: 33 patients with a reliably established diagnosis of multiple sclerosis, who underwent preventive examinations and treatment from 2012 to 2016, were examined. All patients underwent clinical-neurological examination (neurological status using the EDSS scale), and cognitive status was evaluated using the PASAT auditory test. Patients were screened before, during and after therapy. Statistical analysis of the results was performed in Statistica 8.0 using Student's t-test (t), the Mann-Whitney test (Z), Pearson and Spearman correlation coefficients (r, R), the Wilcoxon criterion (T), and Chi-square (X²). Results: In patients with multiple sclerosis, increasing age was associated with higher EDSS scores and lower PASAT scores before treatment. Disease duration also affected EDSS and PASAT scores. PASAT scores did not decrease significantly over the course of treatment. Conclusions: Glatiramer acetate has a positive effect on cognitive function, information processing speed and working memory in patients with multiple sclerosis, which is one of the important components of the therapeutic effect of this drug.
Multiple pathways of commodity crop expansion in tropical forest landscapes
NASA Astrophysics Data System (ADS)
Meyfroidt, Patrick; Carlson, Kimberly M.; Fagan, Matthew E.; Gutiérrez-Vélez, Victor H.; Macedo, Marcia N.; Curran, Lisa M.; DeFries, Ruth S.; Dyer, George A.; Gibbs, Holly K.; Lambin, Eric F.; Morton, Douglas C.; Robiglio, Valentina
2014-07-01
Commodity crop expansion, for both global and domestic urban markets, follows multiple land change pathways entailing direct and indirect deforestation, and results in various social and environmental impacts. Here we compare six published case studies of rapid commodity crop expansion within forested tropical regions. Across cases, between 1.7% and 89.5% of new commodity cropland was sourced from forestlands. Four main factors controlled pathways of commodity crop expansion: (i) the availability of suitable forestland, which is determined by forest area, agroecological or accessibility constraints, and land use policies, (ii) economic and technical characteristics of agricultural systems, (iii) differences in constraints and strategies between small-scale and large-scale actors, and (iv) variable costs and benefits of forest clearing. When remaining forests were unsuitable for agriculture and/or policies restricted forest encroachment, a larger share of commodity crop expansion occurred by conversion of existing agricultural lands, and land use displacement was smaller. Expansion strategies of large-scale actors emerge from context-specific balances between the search for suitable lands; transaction costs or conflicts associated with expanding into forests or other state-owned lands versus smallholder lands; net benefits of forest clearing; and greater access to infrastructure in already-cleared lands. 
We propose five hypotheses to be tested in further studies: (i) land availability mediates expansion pathways and the likelihood that land use is displaced to distant, rather than to local places; (ii) use of already-cleared lands is favored when commodity crops require access to infrastructure; (iii) in proportion to total agricultural expansion, large-scale actors generate more clearing of mature forests than smallholders; (iv) property rights and land tenure security influence the actors participating in commodity crop expansion, the form of land use displacement, and livelihood outcomes; (v) intensive commodity crops may fail to spare land when inducing displacement. We conclude that understanding pathways of commodity crop expansion is essential to improve land use governance.
Development of a 3D printer using scanning projection stereolithography
Lee, Michael P.; Cooper, Geoffrey J. T.; Hinkley, Trevor; Gibson, Graham M.; Padgett, Miles J.; Cronin, Leroy
2015-01-01
We have developed a system for the rapid, low-cost fabrication of 3D devices in the laboratory, with micro-scale features on cm-scale objects. Our system is inspired by maskless lithography, in which a digital micromirror device (DMD) projects patterns with resolution up to 10 µm onto a layer of photoresist. Large-area objects can be fabricated by stitching projected images over a 5 cm² area. The addition of a z-stage allows multiple layers to be stacked to create 3D objects, removing the need for any developing or etching steps and yielding true 3D devices that are robust, configurable, and scalable. We demonstrate the system's applications by printing a range of micro-scale objects as well as a fully functioning microfluidic droplet device, whose integrity we test by pumping dye through its channels. PMID:25906401
A measure of association for ordered categorical data in population-based studies
Nelson, Kerrie P; Edwards, Don
2016-01-01
Ordinal classification scales are commonly used to define a patient’s disease status in screening and diagnostic tests such as mammography. Challenges arise in agreement studies when evaluating the association between many raters’ classifications of patients’ disease or health status when an ordered categorical scale is used. In this paper, we describe a population-based approach and chance-corrected measure of association to evaluate the strength of relationship between multiple raters’ ordinal classifications where any number of raters can be accommodated. In contrast to Shrout and Fleiss’ intraclass correlation coefficient, the proposed measure of association is invariant with respect to changes in disease prevalence. We demonstrate how unique characteristics of individual raters can be explored using random effects. Simulation studies are conducted to demonstrate the properties of the proposed method under varying assumptions. The methods are applied to two large-scale agreement studies of breast cancer screening and prostate cancer severity. PMID:27184590
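For contrast with the proposed population-based measure, a chance-corrected agreement index for multiple raters can be computed directly from a classification table. The hand-rolled Fleiss-style kappa below only illustrates the idea of correcting observed agreement for chance; the paper's model-based measure differs and, unlike this one, is invariant to disease prevalence. The table of counts is invented.

```python
import numpy as np

def fleiss_kappa(counts):
    """Chance-corrected agreement for multiple raters.

    counts: (n_subjects, n_categories) array; counts[i, j] is the number
    of raters assigning subject i to category j.  Assumes an equal
    number of raters per subject.
    """
    counts = np.asarray(counts, dtype=float)
    n, _ = counts.shape
    m = counts.sum(axis=1)[0]            # raters per subject
    p_j = counts.sum(axis=0) / (n * m)   # category prevalences
    # Observed per-subject agreement: fraction of concordant rater pairs.
    P_i = (np.sum(counts**2, axis=1) - m) / (m * (m - 1))
    P_bar = P_i.mean()
    P_e = np.sum(p_j**2)                 # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# 4 subjects rated into 3 ordered categories by 5 raters each.
table = [[5, 0, 0],
         [0, 5, 0],
         [0, 0, 5],
         [2, 2, 1]]
print(round(fleiss_kappa(table), 3))  # 0.699
```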
Ding, Edwin; Lefrancois, Simon; Kutz, Jose Nathan; Wise, Frank W.
2011-01-01
The mode-locking of dissipative soliton fiber lasers using large mode area fiber supporting multiple transverse modes is studied experimentally and theoretically. The averaged mode-locking dynamics in a multi-mode fiber are studied using a distributed model. The co-propagation of multiple transverse modes is governed by a system of coupled Ginzburg–Landau equations. Simulations show that stable and robust mode-locked pulses can be produced. However, the mode-locking can be destabilized by excessive higher-order mode content. Experiments using large core step-index fiber, photonic crystal fiber, and chirally-coupled core fiber show that mode-locking can be significantly disturbed in the presence of higher-order modes, resulting in lower maximum single-pulse energies. In practice, spatial mode content must be carefully controlled to achieve full pulse energy scaling. This paper demonstrates that mode-locking performance is very sensitive to the presence of multiple waveguide modes when compared to systems such as amplifiers and continuous-wave lasers. PMID:21731106
A unified model explains commonness and rarity on coral reefs.
Connolly, Sean R; Hughes, Terry P; Bellwood, David R
2017-04-01
Abundance patterns in ecological communities have important implications for biodiversity maintenance and ecosystem functioning. However, ecological theory has been largely unsuccessful at capturing multiple macroecological abundance patterns simultaneously. Here, we propose a parsimonious model that unifies widespread ecological relationships involving local aggregation, species-abundance distributions, and species associations, and we test this model against the metacommunity structure of reef-building corals and coral reef fishes across the western and central Pacific. For both corals and fishes, the unified model simultaneously captures extremely well local species-abundance distributions, interspecific variation in the strength of spatial aggregation, patterns of community similarity, species accumulation, and regional species richness, performing far better than alternative models also examined here and in previous work on coral reefs. Our approach contributes to the development of synthetic theory for large-scale patterns of community structure in nature, and to addressing ongoing challenges in biodiversity conservation at macroecological scales. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.
Cold Flow Propulsion Test Complex Pulse Testing
NASA Technical Reports Server (NTRS)
McDougal, Kris
2016-01-01
When the propellants in a liquid rocket engine burn, the rocket not only launches and moves in space, it causes forces that interact with the vehicle itself. When these interactions occur under specific conditions, the vehicle's structures and components can become unstable. One instability of primary concern is termed pogo (named after the movement of a pogo stick), in which the oscillations (cycling movements) cause large loads, or pressure, against the vehicle, tanks, feedlines, and engine. Marshall Space Flight Center (MSFC) has developed a unique test technology to understand and quantify the complex fluid movements and forces in a liquid rocket engine that contribute strongly to both engine and integrated vehicle performance and stability. This new test technology was established in the MSFC Cold Flow Propulsion Test Complex to allow injection and measurement of scaled propellant flows and measurement of the resulting forces at multiple locations throughout the engine.
Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis
NASA Astrophysics Data System (ADS)
Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi
2017-03-01
Digital histopathology images exceeding 1 gigapixel are drawing increasing attention in clinical practice, biomedical research, and computer vision. Among the many observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading, and consequently the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report human-verified segmentations with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis, which could facilitate more quantitatively validated studies in current and future histopathology image analysis.
2012-03-13
This final report for Contract W911NF-09-C-0135 transmits results in two parts: the first covers the source approach and prototype development, and the second (Part II) is the "Altairnano Lithium Ion Nano-scaled Titanate Oxide Cell and Module Abuse Test Report."
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1974-01-01
Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1975-01-01
Based on the premises that magnetic suspension techniques can play a useful role in large-scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
ERIC Educational Resources Information Center
Decker, Dawn M.; Hixson, Michael D.; Shaw, Amber; Johnson, Gloria
2014-01-01
The purpose of this study was to examine whether using a multiple-measure framework yielded better classification accuracy than oral reading fluency (ORF) or maze alone in predicting pass/fail rates for middle-school students on a large-scale reading assessment. Participants were 178 students in Grades 7 and 8 from a Midwestern school district.…
Test aspects of the JPL Viterbi decoder
NASA Technical Reports Server (NTRS)
Breuer, M. A.
1989-01-01
The generation of test vectors and design-for-test aspects of the Jet Propulsion Laboratory (JPL) Very Large Scale Integration (VLSI) Viterbi decoder chip are discussed. Each processor integrated circuit (IC) contains over 20,000 gates. To achieve a high degree of testability, a scan architecture is employed. The logic has been partitioned so that very few test vectors are required to test the entire chip. In addition, since several blocks of logic are replicated numerous times on this chip, test vectors need only be generated for each block rather than for the entire circuit. These unique blocks of logic have been identified and test sets generated for them. The approach employed was to use pseudo-exhaustive test vectors whenever feasible; that is, each cone of logic is tested exhaustively. Using this approach, no detailed logic design or fault model is required. All faults that modify the function of a block of combinational logic are detected, including all irredundant single and multiple stuck-at faults.
Morphological evidence for discrete stocks of yellow perch in Lake Erie
Kocovsky, Patrick M.; Knight, Carey T.
2012-01-01
Identification and management of unique stocks of exploited fish species are high-priority management goals in the Laurentian Great Lakes. We analyzed whole-body morphometrics of 1430 yellow perch Perca flavescens captured during 2007–2009 from seven known spawning areas in Lake Erie to determine if morphometrics vary among sites and management units to assist in identification of spawning stocks of this heavily exploited species. Truss-based morphometrics (n = 21 measurements) were analyzed using principal component analysis followed by ANOVA of the first three principal components to determine whether yellow perch from the several sampling sites varied morphometrically. Duncan's multiple range test was used to determine which sites differed from one another to test whether morphometrics varied at scales finer than management unit. Morphometrics varied significantly among sites and annually, but differences among sites were much greater. Sites within the same management unit typically differed significantly from one another, indicating morphometric variation at a scale finer than management unit. These results are largely congruent with recently-published studies on genetic variation of yellow perch from many of the same sampling sites. Thus, our results provide additional evidence that there are discrete stocks of yellow perch in Lake Erie and that management units likely comprise multiple stocks.
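The analysis pipeline described (principal component analysis of the truss measurements, followed by ANOVA on the leading component across sites) can be sketched with simulated data. The three "sites", sample sizes, and effect sizes below are invented, and Duncan's multiple range test is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated truss-based morphometrics: 30 fish per site, 21 measurements,
# with a small mean shift between sites (not the Lake Erie data).
n_per_site, n_meas = 30, 21
sites = [rng.normal(loc=mu, scale=1.0, size=(n_per_site, n_meas))
         for mu in (0.0, 0.3, 0.6)]
X = np.vstack(sites)

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                      # scores on the first component

# One-way ANOVA of PC1 scores across the three sites.
f_stat, p_val = stats.f_oneway(pc1[:30], pc1[30:60], pc1[60:])
print(p_val < 0.05)
```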
A successful trap design for capturing large terrestrial snakes
Shirley J. Burgdorf; D. Craig Rudolph; Richard N. Conner; Daniel Saenz; Richard R. Schaefer
2005-01-01
Large scale trapping protocols for snakes can be expensive and require large investments of personnel and time. Typical methods, such as pitfall and small funnel traps, are not useful or suitable for capturing large snakes. A method was needed to survey multiple blocks of habitat for the Louisiana Pine Snake (Pituophis ruthveni), throughout its...
NASA Astrophysics Data System (ADS)
Camera, Corrado; Bruggeman, Adriana; Hadjinicolaou, Panos; Pashiardis, Stelios; Lange, Manfred A.
2014-01-01
High-resolution gridded daily data sets are essential for natural resource management and for analyzing climate changes and their effects. This study evaluates the performance of 15 simple or complex interpolation techniques in reproducing daily precipitation at a resolution of 1 km² over topographically complex areas. Methods are tested for two different observation densities and for different rainfall amounts, using rainfall data recorded at 74 and 145 observational stations, respectively, spread over the 5760 km² of the Republic of Cyprus in the Eastern Mediterranean. Regression analyses utilizing geographical co-predictors and neighboring interpolation techniques were evaluated both in isolation and combined: linear multiple regression (LMR) and geographically weighted regression (GWR) methods, including step-wise selection of covariables, as well as inverse distance weighting (IDW), kriging, and 3D thin plate splines (TPS). The relative rank of the different techniques changes with station density and rainfall amount. Our results indicate that TPS performs well for low station density and large-scale events, and also when coupled with regression models, but poorly for high station density; the opposite is observed for IDW. Simple IDW performs best for local events, while a combination of step-wise GWR and IDW proves to be the best method for large-scale events and high station density. This study indicates that step-wise regression with a variable set of geographic parameters can improve the interpolation of large-scale events because it facilitates the representation of local climate dynamics.
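Of the techniques compared, inverse distance weighting is the simplest to sketch. The station coordinates, rainfall values, and power parameter below are illustrative; the study's actual IDW configuration (search radius, power) is not given in the abstract.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_grid, power=2.0):
    """Inverse distance weighting: each grid value is a weighted mean of
    station observations, with weights 1 / distance**power."""
    xy_obs = np.asarray(xy_obs, float)
    z_obs = np.asarray(z_obs, float)
    xy_grid = np.asarray(xy_grid, float)
    # Pairwise distances between grid points and stations.
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.where(d == 0, 1e-12, d)       # avoid division by zero at stations
    w = 1.0 / d**power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

stations = [[0, 0], [10, 0], [0, 10]]    # km, hypothetical
rain_mm = [12.0, 4.0, 8.0]
grid = [[0, 0], [5, 5]]                  # first point coincides with a station
out = idw(stations, rain_mm, grid)
print(out)  # ~[12.0, 8.0]: exact at the station, mean at the equidistant point
```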
NASA Astrophysics Data System (ADS)
Pratiwi, W. N.; Rochintaniawati, D.; Agustin, R. R.
2018-05-01
This research investigated the effect of multiple intelligence-based learning on students' concept mastery and interest in learning matter. A one-group pre-test/post-test design was used with a sample, selected to suit the research situation, of n = 13 seventh-grade students at a private school in Bandar Seri Begawan. Concept mastery was measured with an achievement test administered at pre-test and post-test, while interest was measured with a Likert scale. The analysis shows a normalized gain of 0.61, a medium improvement; in other words, students' concept mastery of matter increased after instruction through multiple intelligence-based learning. The Likert interest scale shows that most students had high interest in learning matter after being taught this way. We therefore conclude that multiple intelligence-based learning helped improve students' concept mastery and raised their interest in learning matter.
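The reported gain is Hake's normalized gain, ⟨g⟩ = (post − pre) / (max − pre). The class means below are invented to land on the reported value of 0.61, since the abstract does not give the underlying scores.

```python
def normalized_gain(pre_pct, post_pct, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre),
    i.e. the fraction of the possible improvement actually achieved."""
    return (post_pct - pre_pct) / (max_score - pre_pct)

# Hypothetical example: a class average rising from 50% to 80.5% gives a
# "medium" gain of 0.61, matching the band reported in the abstract.
g = normalized_gain(50.0, 80.5)
print(round(g, 2))  # 0.61
```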
Food waste impact on municipal solid waste angle of internal friction.
Cho, Young Min; Ko, Jae Hac; Chi, Liqun; Townsend, Timothy G
2011-01-01
The impact of food waste content on the municipal solid waste (MSW) friction angle was studied. Using reconstituted fresh MSW specimens with different food waste content (0%, 40%, 58%, and 80%), 48 small-scale (100-mm-diameter) direct shear tests and 12 large-scale (430 mm × 430 mm) direct shear tests were performed. A stress-controlled large-scale direct shear test device allowing approximately 170-mm sample horizontal displacement was designed and used. At both testing scales, the mobilized internal friction angle of MSW decreased considerably as food waste content increased. As food waste content increased from 0% to 40% and from 40% to 80%, the mobilized internal friction angles (estimated using the mobilized peak (ultimate) shear strengths of the small-scale direct shear tests) decreased from 39° to 31° and from 31° to 7°, respectively, while those of large-scale tests decreased from 36° to 26° and from 26° to 15°, respectively. Most friction angle measurements produced in this study fell within the range of those previously reported for MSW. Copyright © 2010 Elsevier Ltd. All rights reserved.
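The mobilized internal friction angle reported here can be back-calculated from a direct shear test as φ = arctan(τ/σₙ), assuming zero cohesion. The stress values below are illustrative, chosen only to reproduce the 39° figure; they are not the paper's data.

```python
import math

def mobilized_friction_angle(shear_kpa, normal_kpa):
    """Mobilized internal friction angle (degrees) from a direct shear
    test, assuming zero cohesion: phi = arctan(tau / sigma_n)."""
    return math.degrees(math.atan(shear_kpa / normal_kpa))

# A shear strength of ~81 kPa at 100 kPa normal stress corresponds to
# roughly the 39 degrees reported for 0% food waste (hypothetical values).
phi = mobilized_friction_angle(81.0, 100.0)
print(round(phi))  # 39
```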
Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models
NASA Technical Reports Server (NTRS)
Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.
2018-01-01
The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05-NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests in the Icing Research Tunnel (IRT) was conducted to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; ...
2016-02-18
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; Fenelon, Joseph M.
2016-01-01
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. Testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
J. D. Shaw; J. N. Long; M. T. Thompson; R. J. DeRose
2010-01-01
A complex of drought, insects, and disease is causing widespread mortality in multiple forest types across western North America. These forest types range from dry Pinus-Juniperus woodlands to moist, montane Picea-Abies forests. Although large-scale mortality events are known from the past and considered part of natural cycles, recent events have largely been...
Rilov, Gil; Schiel, David R
2011-01-01
Predicting the strength and context-dependency of species interactions across multiple scales is a core area in ecology. This is especially challenging in the marine environment, where populations of most predators and prey are generally open, because of their pelagic larval phase, and recruitment of both is highly variable. In this study we use a comparative-experimental approach on small and large spatial scales to test the relationship between predation intensity and prey recruitment and their relative importance in shaping populations of a dominant rocky intertidal space occupier, mussels, in the context of seascape (availability of nearby subtidal reef habitat). Predation intensity on transplanted mussels was tested inside and outside cages and recruitment was measured with standard larval settlement collectors. We found that on intertidal rocky benches with contiguous subtidal reefs in New Zealand, mussel larval recruitment is usually low but predation on recruits by subtidal consumers (fish, crabs) is intense during high tide. On nearby intertidal rocky benches with adjacent sandy subtidal habitats, larval recruitment is usually greater but subtidal predators are typically rare and predation is weaker. Multiple regression analysis showed that predation intensity accounts for most of the variability in the abundance of adult mussels compared to recruitment. This seascape-dependent, predation-recruitment relationship could scale up to explain regional community variability. We argue that community ecology models should include seascape context-dependency and its effects on recruitment and species interactions for better predictions of coastal community dynamics and structure.
Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design
ERIC Educational Resources Information Center
Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff
2016-01-01
Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…
Garnier, Aurélie; Pennekamp, Frank; Lemoine, Mélissa; Petchey, Owen L
2017-12-01
Global environmental change has negative impacts on ecological systems, impacting the stable provision of functions, goods, and services. Whereas effects of individual environmental changes (e.g. temperature change or change in resource availability) are reasonably well understood, we lack information about if and how multiple changes interact. We examined interactions among four types of environmental disturbance (temperature, nutrient ratio, carbon enrichment, and light) in a fully factorial design using a microbial aquatic ecosystem and observed responses of dissolved oxygen saturation at three temporal scales (resistance, resilience, and return time). We tested whether multiple disturbances combine in a dominant, additive, or interactive fashion, and compared the predictability of dissolved oxygen across scales. Carbon enrichment and shading reduced oxygen concentration in the short term (i.e. resistance); although no other effects or interactions were statistically significant, resistance decreased as the number of disturbances increased. In the medium term, only enrichment accelerated recovery, but none of the other effects (including interactions) were significant. In the long term, enrichment and shading lengthened return times, and we found significant two-way synergistic interactions between disturbances. The best performing model (dominant, additive, or interactive) depended on the temporal scale of response. In the short term (i.e. for resistance), the dominance model predicted resistance of dissolved oxygen best, due to a large effect of carbon enrichment, whereas none of the models could predict the medium term (i.e. resilience). The long-term response was best predicted by models including interactions among disturbances. Our results indicate the importance of accounting for the temporal scale of responses when researching the effects of environmental disturbances on ecosystems. © 2017 The Authors. 
Global Change Biology Published by John Wiley & Sons Ltd.
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large-scale construction project in which multiple parallel projects and a fuzzy random environment are considered. Taking into account the most typical goals in project management, a cost/weighted-makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach transforms the fuzzy random parameters into fuzzy variables, which are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. A combinatorial priority-based hybrid particle swarm optimization algorithm is then developed to solve the proposed model, in which combinatorial particle swarm optimization assigns modes to activities and priority-based particle swarm optimization schedules them. Finally, the results and analysis of a practical example at a large-scale hydropower construction project demonstrate the practicality and efficiency of the proposed model and optimization method.
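The "priority-based" half of such a scheme needs a decoder that turns a particle's priority vector into a feasible schedule; a standard choice is a serial schedule-generation scheme under precedence and resource constraints. The sketch below is a generic single-mode, single-resource version with an invented four-activity project, not the paper's multimode algorithm.

```python
def serial_sgs(duration, preds, demand, capacity, priority):
    """Serial schedule-generation scheme: schedule activities in priority
    order, each at its earliest precedence- and resource-feasible start."""
    n = len(duration)
    start, finish = {}, {}
    done = set()
    while len(done) < n:
        # Highest-priority activity whose predecessors are all scheduled.
        eligible = [j for j in range(n)
                    if j not in done and all(p in done for p in preds[j])]
        j = max(eligible, key=lambda a: priority[a])
        t = max([finish[p] for p in preds[j]], default=0)

        def usage(tt):  # resource usage by already-scheduled activities
            return sum(demand[i] for i in done if start[i] <= tt < finish[i])

        # Delay the start until the resource capacity is respected.
        while any(usage(tt) + demand[j] > capacity
                  for tt in range(t, t + duration[j])):
            t += 1
        start[j], finish[j] = t, t + duration[j]
        done.add(j)
    return start, max(finish.values())

duration = [3, 2, 2, 1]
preds = [[], [0], [0], [1, 2]]        # activity 3 needs both 1 and 2
demand = [2, 2, 2, 1]
capacity = 3                           # activities 1 and 2 cannot overlap
start, makespan = serial_sgs(duration, preds, demand, capacity,
                             priority=[4, 3, 2, 1])
print(makespan)  # 8
```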
Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate
Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.
2015-01-01
Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906
NASA Astrophysics Data System (ADS)
Hartmann, Alfred; Redfield, Steve
1989-04-01
This paper discusses the design of large-scale (1000 × 1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performances of alternative multiple access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access (CSMA)) are compared with the performance of the classic arbitrated bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication at arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
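The ALOHA and CSMA protocols compared above have standard textbook throughput formulas, which can be sketched as follows (the propagation-delay parameter `a` and its default value are illustrative assumptions, not figures from the paper):

```python
import math

def throughput_pure_aloha(G):
    # Unslotted (pure) ALOHA: S = G * exp(-2G); peaks at 1/(2e) ~ 0.184
    return G * math.exp(-2.0 * G)

def throughput_slotted_aloha(G):
    # Slotted ALOHA: S = G * exp(-G); peaks at 1/e ~ 0.368 at G = 1
    return G * math.exp(-G)

def throughput_np_csma(G, a=0.01):
    # Non-persistent CSMA with normalized propagation delay a
    # (standard textbook form; a = 0.01 is an illustrative value)
    return G * math.exp(-a * G) / (G * (1 + 2 * a) + math.exp(-a * G))
```

These curves make the paper's "ease of implementation versus speed" trade-off concrete: the simplest protocol (pure ALOHA) has the lowest peak throughput, while carrier sensing buys higher throughput at the cost of added complexity.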
1981-11-01
AD-A109 516. Florida Univ Gainesville, Dept of Environmental Engineering. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 2 of a series (in 7 volumes). Baseline Studies, Volume I: The Aquatic Macrophytes of ...
1982-08-01
AD-A-11 701. Orange County Pollution Control Dept, Orlando, FL. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 3 of a series. Second Year Poststocking Results, Volume VI: The Water ...
Large-scale dynamics associated with clustering of extratropical cyclones affecting Western Europe
NASA Astrophysics Data System (ADS)
Pinto, Joaquim G.; Gómara, Iñigo; Masato, Giacomo; Dacre, Helen F.; Woollings, Tim; Caballero, Rodrigo
2015-04-01
Some recent winters in Western Europe have been characterized by the occurrence of multiple extratropical cyclones following a similar path. The occurrence of such cyclone clusters leads to large socio-economic impacts due to damaging winds, storm surges, and floods. Recent studies have statistically characterized the clustering of extratropical cyclones over the North Atlantic and Europe and hypothesized potential physical mechanisms responsible for their formation. Here we analyze 4 months characterized by multiple cyclones over Western Europe (February 1990, January 1993, December 1999, and January 2007). The evolution of the eddy driven jet stream, Rossby wave-breaking, and upstream/downstream cyclone development are investigated to infer the role of the large-scale flow and to determine if clustered cyclones are related to each other. Results suggest that optimal conditions for the occurrence of cyclone clusters are provided by a recurrent extension of an intensified eddy driven jet toward Western Europe lasting at least 1 week. Multiple Rossby wave-breaking occurrences on both the poleward and equatorward flanks of the jet contribute to the development of these anomalous large-scale conditions. The analysis of the daily weather charts reveals that upstream cyclone development (secondary cyclogenesis, where new cyclones are generated on the trailing fronts of mature cyclones) is strongly related to cyclone clustering, with multiple cyclones developing on a single jet streak. The present analysis permits a deeper understanding of the physical reasons leading to the occurrence of cyclone families over the North Atlantic, enabling a better estimation of the associated cumulative risk over Europe.
Effects of multiple-scale driving on turbulence statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Hyunju; Cho, Jungyeon, E-mail: hyunju527@gmail.com, E-mail: jcho@cnu.ac.kr
2014-01-01
Turbulence is ubiquitous in astrophysical fluids such as the interstellar medium and the intracluster medium. In turbulence studies, it is customary to assume that the fluid is driven on a single scale. However, in astrophysical fluids, there can be many different driving mechanisms that act on different scales. If there are multiple energy-injection scales, the processes of energy cascade and turbulence dynamo will differ from the case of a single energy-injection scale. In this work, we perform three-dimensional incompressible/compressible magnetohydrodynamic turbulence simulations. We drive turbulence in Fourier space in two wavenumber ranges, 2 ≤ k ≤ √12 (large scale) and 15 ≲ k ≲ 26 (small scale). We inject a different amount of energy in each range by changing the amplitude of the forcing in that range. We present the time evolution of the kinetic and magnetic energy densities and discuss the turbulence dynamo in the presence of energy injections at two scales. We show how the kinetic, magnetic, and density spectra are affected by the two-scale energy injections, and we discuss the observational implications. In the case ε_L < ε_S, where ε_L and ε_S are the energy-injection rates at the large and small scales, respectively, our results show that even a tiny amount of large-scale energy injection can significantly change the properties of turbulence. On the other hand, when ε_L ≳ ε_S, the small-scale driving does not influence the turbulence statistics much unless ε_L ∼ ε_S.
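Selecting the two forcing shells in Fourier space can be illustrated with a toy numpy sketch. The grid size and the random-phase injection below are illustrative assumptions; the simulations in the abstract use a full MHD solver with a more careful forcing scheme.

```python
import numpy as np

def shell_masks(n=64):
    """Boolean masks selecting the large-scale (2 <= |k| <= sqrt(12)) and
    small-scale (15 <= |k| <= 26) forcing shells on an n^3 periodic grid."""
    k1 = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers 0..n/2, -n/2..-1
    kx, ky, kz = np.meshgrid(k1, k1, k1, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)
    large = (kmag >= 2) & (kmag <= np.sqrt(12.0))
    small = (kmag >= 15) & (kmag <= 26)
    return large, small

def inject(field_hat, mask, eps, rng):
    """Add random phases inside the masked shell, with amplitude set by the
    injection rate eps (a toy stand-in for the paper's forcing)."""
    phase = rng.standard_normal(field_hat.shape) + 1j * rng.standard_normal(field_hat.shape)
    out = field_hat.copy()
    out[mask] += np.sqrt(eps) * phase[mask]
    return out
```

Because the two shells are disjoint, the relative amplitudes passed to `inject` directly control the ratio ε_L/ε_S discussed in the abstract.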
Multi-group measurement invariance of the multiple sclerosis walking scale-12?
Motl, Robert W; Mullen, Sean; McAuley, Edward
2012-03-01
One primary assumption underlying the interpretation of composite multiple sclerosis walking scale-12 (MSWS-12) scores across levels of disability status is multi-group measurement invariance. This assumption was tested in the present study between samples that differed in self-reported disability status. Participants (n = 867) completed a battery of questionnaires that included the MSWS-12 and patient-determined disease step (PDDS) scale. The multi-group invariance was tested between samples that had PDDS scores of ≤2 (i.e. no mobility limitation; n = 470) and PDDS scores ≥3 (onset of mobility limitation; n = 397) using Mplus 6.0. The omnibus test of equal covariance matrices indicated that the MSWS-12 was not invariant between the two samples that differed in disability status. The source of non-invariance occurred with the initial equivalence test of the factor structure itself. We provide evidence that questions the unambiguous interpretation of scores from the MSWS-12 as a measure of walking impairment between samples of persons with multiple sclerosis who differ in disability status.
Portable parallel stochastic optimization for the design of aeropropulsion components
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Rhodes, G. S.
1994-01-01
This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive, and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to initialize the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware, as well as of portable, parallel programming environments. The second effort was to implement the MSO methodology for an example problem using the portable parallel programming environment Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate that the MSO methodology is well suited to large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel.
Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications for which MSO can be applied, including NASA's High-Speed-Civil Transport, and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
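The speedup and efficiency figures quoted above follow directly from the standard definitions; for instance, a speedup of 19 on 20 workstations corresponds to 95% parallel efficiency. A minimal sketch:

```python
def speedup(t_serial, t_parallel):
    # Parallel speedup S = T1 / Tp
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    # Parallel efficiency E = S / p; 1.0 is perfect linear scaling
    return speedup(t_serial, t_parallel) / p
```

With illustrative timings, `efficiency(19.0, 1.0, 20)` gives 0.95, matching the "almost 19 times for 20 workstations" result reported above.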
Wang, Lu-Yong; Fasulo, D
2006-01-01
Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g. Fisher's exact test) are typically used to screen out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate testing. Model-based methods, in turn, are impractical for large-scale datasets, and genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust way to screen SNPs on the scale of thousands. For still larger data, such as Affymetrix Human Mapping 100K GeneChip data, a faster method is required to screen SNPs in whole-genome association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.
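Boosting-based SNP screening can be sketched with a minimal AdaBoost using one-SNP decision stumps, scoring each SNP by the total weight of the rounds in which it was selected. This is a toy stand-in under stated assumptions (0/1/2 genotype coding, ±1 labels), not the paper's actual algorithm.

```python
import numpy as np

def adaboost_snp_screen(X, y, n_rounds=20):
    """Minimal AdaBoost with one-SNP decision stumps.
    X: (n_samples, n_snps) genotypes coded 0/1/2; y: labels in {-1, +1}.
    Returns per-SNP scores (summed round weights alpha); high-scoring
    SNPs are kept as candidates.  A toy sketch, not the paper's method."""
    n, p = X.shape
    w = np.full(n, 1.0 / n)                 # sample weights
    scores = np.zeros(p)
    for _ in range(n_rounds):
        best = (0, 0.5, 1, 1.0)            # (snp, threshold, sign, error)
        for j in range(p):
            for t in (0.5, 1.5):           # splits between genotype classes
                for s in (1, -1):
                    pred = np.where(X[:, j] > t, s, -s)
                    err = w[pred != y].sum()
                    if err < best[3]:
                        best = (j, t, s, err)
        j, t, s, err = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] > t, s, -s)
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        scores[j] += alpha
    return scores
```

Because the reweighting focuses later rounds on currently misclassified samples, SNPs with little marginal effect can still accumulate score, which is the motivation for boosting over purely univariate screening.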
PROPERTIES IMPORTANT TO MIXING FOR WTP LARGE SCALE INTEGRATED TESTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koopman, D.; Martino, C.; Poirier, M.
2012-04-26
Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation.
The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i.e., Newtonian or non-Newtonian). The most important properties for testing with Newtonian slurries are the Archimedes number distribution and the particle concentration. For some test objectives, the shear strength is important. In the testing to collect data for CFD V and V and CFD comparison, the liquid density and liquid viscosity are important. In the high temperature testing, the liquid density and liquid viscosity are important. The Archimedes number distribution combines effects of particle size distribution, solid-liquid density difference, and kinematic viscosity. The most important properties for testing with non-Newtonian slurries are the slurry yield stress, the slurry consistency, and the shear strength. The solid-liquid density difference and the particle size are also important. It is also important to match multiple properties within the same simulant to achieve behavior representative of the waste. Other properties such as particle shape, concentration, surface charge, and size distribution breadth, as well as slurry cohesiveness and adhesiveness, liquid pH and ionic strength also influence the simulant properties either directly or through other physical properties such as yield stress.
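The Archimedes number mentioned above can be computed from particle and liquid properties. A minimal sketch using one standard definition (the report may use the equivalent kinematic-viscosity form; the example values in the test are illustrative, not from the report):

```python
def archimedes_number(d, rho_s, rho_l, mu, g=9.81):
    """Archimedes number Ar = g * d^3 * rho_l * (rho_s - rho_l) / mu^2
    for a particle of diameter d (m) in a liquid of density rho_l (kg/m^3)
    and dynamic viscosity mu (Pa*s).  One standard definition; dividing
    by rho_l^2 inside gives the equivalent kinematic-viscosity form."""
    return g * d**3 * rho_l * (rho_s - rho_l) / mu**2
```

Evaluating Ar over a measured particle size distribution yields the Archimedes number distribution that the report identifies as the key Newtonian-slurry property.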
Total-dose radiation effects data for semiconductor devices, volume 3
NASA Technical Reports Server (NTRS)
Price, W. E.; Martin, K. E.; Nichols, D. K.; Gauthier, M. K.; Brown, S. F.
1982-01-01
Volume 3 of this three-volume set provides a detailed analysis of the data in Volumes 1 and 2, most of which was generated for the Galileo Orbiter Program in support of NASA space programs. Volume 1 includes total ionizing dose radiation test data on diodes, bipolar transistors, field effect transistors, and miscellaneous discrete solid-state devices. Volume 2 includes similar data on integrated circuits and a few large-scale integrated circuits. The data of Volumes 1 and 2 are combined in graphic format in Volume 3 to provide a comparison of radiation sensitivities of devices of a given type and different manufacturer, a comparison of multiple tests for a single date code, a comparison of multiple tests for a single lot, and a comparison of radiation sensitivities vs time (date codes). All data were generated using a steady-state 2.5-MeV electron source (Dynamitron) or a Cobalt-60 gamma ray source. The data that compose Volume 3 represent 26 different device types, 224 tests, and a total of 1040 devices. A comparison of the effects of steady-state electrons and Cobalt-60 gamma rays is also presented.
Komatsu, Masayo; Nezu, Satoko; Tomioka, Kimiko; Hazaki, Kan; Harano, Akihiro; Morikawa, Masayuki; Takagi, Masahiro; Yamada, Masahiro; Matsumoto, Yoshitaka; Iwamoto, Junko; Ishizuka, Rika; Saeki, Keigo; Okamoto, Nozomi; Kurumatani, Norio
2013-01-01
To investigate factors associated with activities of daily living (ADL) in independently living elderly persons in a community. The potential subjects were 4,472 individuals aged 65 years and older who voluntarily participated in a large cohort study, the Fujiwara-kyo study. We used self-administered questionnaires consisting of an ADL questionnaire with the Physical Fitness Test established by the Ministry of Education, Culture, Sports, Science and Technology (12 ADL items) to determine the index of higher-level physical independence, demographics, the Geriatric Depression Scale, and so on. The mini-mental state examination, measurement of physical fitness, and blood tests were also carried out. A lower ADL level was defined as a total score on the 12 ADL items (range, 12-36 points) below the first quartile of the total score for all subjects. Factors associated with a low ADL level were examined by multiple logistic regression. A total of 4,198 individuals remained as subjects for analysis. The male, female, and 5-year age groups showed significant differences in the median score of the 12 ADL items between any two groups. The highest odds ratio among factors associated with a lower ADL level by multiple logistic regression with mutually adjusted independent variables was 4.49 (95%CI: 2.82-7.17) for the groups reporting "very sharp pain" or "strong pain" during the last month. Low physical ability, self-awareness of limb weakness, a BMI over 25, low physical activity, cerebrovascular disorder, depression, low cognitive function, being unable "to see normally" or "to hear someone", and "muscle, bone and joint pain" were independently associated with a lower ADL level. Multiple factors are associated with a lower ADL level assessed on the basis of the 12 ADL items.
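Odds ratios such as the 4.49 (95%CI: 2.82-7.17) above come from multiple logistic regression. As a simpler illustration of how such estimates and intervals are formed, an unadjusted odds ratio with a Wald confidence interval can be computed from a 2x2 table (the counts in the test are made up, not from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Unadjusted version; the study itself adjusted for covariates
    via multiple logistic regression."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

In a logistic regression, the same quantity appears as the exponentiated coefficient of the exposure term, with the CI taken on the log-odds scale in the same way.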
Owen, Sheldon F.; Berl, Jacob L.; Edwards, John W.; Ford, W. Mark; Wood, Petra Bohall
2015-01-01
We studied a raccoon (Procyon lotor) population within a managed central Appalachian hardwood forest in West Virginia to investigate the effects of intensive forest management on raccoon spatial requirements and habitat selection. Raccoon home-range (95% utilization distribution) and core-area (50% utilization distribution) size differed between sexes with males maintaining larger (2×) home ranges and core areas than females. Home-range and core-area size did not differ between seasons for either sex. We used compositional analysis to quantify raccoon selection of six different habitat types at multiple spatial scales. Raccoons selected riparian corridors (riparian management zones [RMZ]) and intact forests (> 70 y old) at the core-area spatial scale. RMZs likely were used by raccoons because they provided abundant denning resources (i.e., large-diameter trees) as well as access to water. Habitat composition associated with raccoon foraging locations indicated selection for intact forests, riparian areas, and regenerating harvest (stands <10 y old). Although raccoons were able to utilize multiple habitat types for foraging resources, a selection of intact forest and RMZs at multiple spatial scales indicates the need of mature forest (with large-diameter trees) for this species in managed forests in the central Appalachians.
Software Manages Documentation in a Large Test Facility
NASA Technical Reports Server (NTRS)
Gurneck, Joseph M.
2001-01-01
The 3MCS computer program assists an instrumentation engineer in performing the three essential functions of design, documentation, and configuration management of measurement and control systems in a large test facility. Services provided by 3MCS are acceptance of input from multiple engineers and technicians working at multiple locations; standardization of drawings; automated cross-referencing; identification of errors; listing of components and resources; downloading of test settings; and provision of information to customers.
Do Streaks Matter in Multiple-Choice Tests?
ERIC Educational Resources Information Center
Kiss, Hubert János; Selei, Adrienn
2018-01-01
Success in life is determined to a large extent by school performance, which in turn depends heavily on grades obtained in exams. In this study, we investigate a particular type of exam: multiple-choice tests. More concretely, we study if patterns of correct answers in multiple-choice tests affect performance. We design an experiment to study if…
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
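The feature indexing and searching stages of the retrieval pipeline can be illustrated with a minimal cosine-similarity search in numpy. This is a toy sketch (function names are ours); production systems described in the review use approximate nearest-neighbor indexes to scale to millions of images.

```python
import numpy as np

def build_index(features):
    """L2-normalize a (n_images, dim) feature matrix so that cosine
    similarity reduces to a dot product (a minimal stand-in for the
    indexing stage of a large-scale retrieval pipeline)."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return features / np.clip(norms, 1e-12, None)

def search(index, query, k=5):
    """Return indices of the top-k most similar images by cosine similarity."""
    q = query / max(np.linalg.norm(query), 1e-12)
    sims = index @ q
    return np.argsort(-sims)[:k]
```

Swapping the brute-force `argsort` for an inverted file or hashing structure is what makes the same interface viable at the scales the review targets.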
A stochastic model of weather states and concurrent daily precipitation at multiple precipitation stations is described. Four algorithms are investigated for classification of daily weather states: k-means, fuzzy clustering, principal components, and principal components coupled with ...
Telescopic multi-resolution augmented reality
NASA Astrophysics Data System (ADS)
Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold
2014-05-01
To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known principles of physics, biology, and chemistry. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system that is driven by real-world measurements. Such an interdisciplinary approach enables us to achieve not only multiple-scale, telescopic visualization of real and virtual information but also the conduct of thought experiments through AR. As a result of this consistency, the framework allows us to explore a large-dimensionality parameter space of measured and unmeasured regions. Toward this goal, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interactions requires identification of observable image features that can indicate the presence of information in multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training as well as 'make believe' entertainment in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in a zoom-in-and-out, consistent, multi-scale approximation.
Olszewski, John; Winona, Linda; Oshima, Kevin H
2005-04-01
The use of ultrafiltration as a concentration method to recover viruses from environmental waters was investigated. Two ultrafiltration systems (hollow fiber and tangential flow) in a large- (100 L) and small-scale (2 L) configuration were able to recover greater than 50% of multiple viruses (bacteriophage PP7 and T1 and poliovirus type 2) from varying water turbidities (10-157 nephelometric turbidity units (NTU)) simultaneously. Mean recoveries (n = 3) in ground and surface water by the large-scale hollow fiber ultrafiltration system (100 L) were comparable to recoveries observed in the small-scale system (2 L). Recovery of seeded viruses in highly turbid waters from small-scale tangential flow (2 L) (screen and open channel) and hollow fiber ultrafilters (2 L) (small pilot) were greater than 70%. Clogging occurred in the hollow fiber pencil module and when particulate concentrations exceeded 1.6 g/L and 5.5 g/L (dry mass) in the screen and open channel filters, respectively. The small pilot module was able to filter all concentrates without clogging. The small pilot hollow fiber ultrafilter was used to test recovery of seeded viruses from surface waters from different geographical regions in 10-L volumes. Recoveries >70% were observed from all locations.
On the Interactions Between Planetary and Mesoscale Dynamics in the Oceans
NASA Astrophysics Data System (ADS)
Grooms, I.; Julien, K. A.; Fox-Kemper, B.
2011-12-01
Multiple-scales asymptotic methods are used to investigate the interaction of planetary and mesoscale dynamics in the oceans. We find three regimes. In the first, the slow, large-scale planetary flow sets up a baroclinically unstable background which leads to vigorous mesoscale eddy generation, but the eddy dynamics do not affect the planetary dynamics. In the second, the planetary flow feels the effects of the eddies, but appears to be unable to generate them. The first two regimes rely on horizontally isotropic large-scale dynamics. In the third regime, large-scale anisotropy, as exists for example in the Antarctic Circumpolar Current and in western boundary currents, allows the large-scale dynamics to both generate and respond to mesoscale eddies. We also discuss how the investigation may be brought to bear on the problem of parameterization of unresolved mesoscale dynamics in ocean general circulation models.
NASA Technical Reports Server (NTRS)
Schlundt, D. W.
1976-01-01
The installed performance degradation of a swivel nozzle thrust deflector system at increased vectoring angles, observed during a large-scale test program, was investigated and improved. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations and a twin-swivel nozzle design model, both scaled to 0.15 of the size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions, from 0 deg (cruise) through the 90 deg (vertical) position used in the VTOL mode.
Measuring Growth with Vertical Scales
ERIC Educational Resources Information Center
Briggs, Derek C.
2013-01-01
A vertical score scale is needed to measure growth across multiple tests in terms of absolute changes in magnitude. Since the warrant for subsequent growth interpretations depends upon the assumption that the scale has interval properties, the validation of a vertical scale would seem to require methods for distinguishing interval scales from…
ERIC Educational Resources Information Center
Cizek, Gregory J.
2009-01-01
Reliability and validity are two characteristics that must be considered whenever information about student achievement is collected. However, those characteristics--and the methods for evaluating them--differ in large-scale testing and classroom testing contexts. This article presents the distinctions between reliability and validity in the two…
DOT National Transportation Integrated Search
1994-10-01
A shake test was performed on the Large Scale Dynamic Rig in the 40- by 80-Foot Wind Tunnel in support of the McDonnell Douglas Advanced Rotor Technology (MDART) Test Program. The shake test identifies the hub modes and the dynamic calibration matrix...
Developing a Strategy for Using Technology-Enhanced Items in Large-Scale Standardized Tests
ERIC Educational Resources Information Center
Bryant, William
2017-01-01
As large-scale standardized tests move from paper-based to computer-based delivery, opportunities arise for test developers to make use of items beyond traditional selected and constructed response types. Technology-enhanced items (TEIs) have the potential to provide advantages over conventional items, including broadening construct measurement,…
Kangas, Antti J; Soininen, Pasi; Lawlor, Debbie A; Davey Smith, George; Ala-Korpela, Mika
2017-01-01
Detailed metabolic profiling in large-scale epidemiologic studies has uncovered novel biomarkers for cardiometabolic diseases and clarified the molecular associations of established risk factors. A quantitative metabolomics platform based on nuclear magnetic resonance spectroscopy has found widespread use, already profiling over 400,000 blood samples. Over 200 metabolic measures are quantified per sample; in addition to many biomarkers routinely used in epidemiology, the method simultaneously provides fine-grained lipoprotein subclass profiling and quantification of circulating fatty acids, amino acids, gluconeogenesis-related metabolites, and many other molecules from multiple metabolic pathways. Here we focus on applications of magnetic resonance metabolomics for quantifying circulating biomarkers in large-scale epidemiology. We highlight the molecular characterization of risk factors, use of Mendelian randomization, and the key issues of study design and analyses of metabolic profiling for epidemiology. We also detail how integration of metabolic profiling data with genetics can enhance drug development. We discuss why quantitative metabolic profiling is becoming widespread in epidemiology and biobanking. Although large-scale applications of metabolic profiling are still novel, it seems likely that comprehensive biomarker data will contribute to etiologic understanding of various diseases and abilities to predict disease risks, with the potential to translate into multiple clinical settings. PMID:29106475
Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang
2008-01-01
Background: Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results: The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion: The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the EPISNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
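The flavor of the epistasis tests described above can be sketched with a toy two-locus interaction test: compare a full model with one mean per two-locus genotype cell against a purely additive model using an F-test. This is a simplified illustration on simulated data, not the EPISNP code or the full extended Kempthorne decomposition.

```python
# Toy two-locus epistasis test (illustrative only, not EPISNP):
# F-test of a full 9-cell genotype-mean model vs. an additive model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 600
snp_a = rng.integers(0, 3, n)   # genotypes coded 0/1/2
snp_b = rng.integers(0, 3, n)
# simulated trait with an additive-by-additive interaction effect
y = 0.5 * snp_a + 0.3 * snp_b + 0.4 * snp_a * snp_b + rng.normal(0, 1, n)

def rss(X, y):
    """Residual sum of squares from an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

ones = np.ones(n)
X_add = np.column_stack([ones, snp_a, snp_b])        # additive model (3 params)
cells = snp_a * 3 + snp_b                            # two-locus genotype cell 0..8
X_full = (cells[:, None] == np.arange(9)).astype(float)  # full model (9 cell means)

rss_add, rss_full = rss(X_add, y), rss(X_full, y)
df1 = X_full.shape[1] - X_add.shape[1]               # extra parameters
df2 = n - X_full.shape[1]
F = ((rss_add - rss_full) / df1) / (rss_full / df2)
p = stats.f.sf(F, df1, df2)
print(f"F = {F:.2f}, p = {p:.3g}")
```

With the simulated interaction effect present, the F-test flags the SNP pair as epistatic; dropping the `0.4 * snp_a * snp_b` term makes the test non-significant.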
Bioinspired Wood Nanotechnology for Functional Materials.
Berglund, Lars A; Burgert, Ingo
2018-05-01
It is a challenging task to realize the vision of hierarchically structured nanomaterials for large-scale applications. Herein, the biomaterial wood as a large-scale biotemplate for functionalization at multiple scales is discussed, to provide an increased property range to this renewable and CO2-storing bioresource, which is available at low cost and in large quantities. The Progress Report reviews the emerging field of functional wood materials in view of the specific features of the structural template and novel nanotechnological approaches for the development of wood-polymer composites and wood-mineral hybrids for advanced property profiles and new functions. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gap Test Calibrations and Their Scaling
NASA Astrophysics Data System (ADS)
Sandusky, Harold
2011-06-01
Common tests for measuring the threshold for shock initiation are the NOL large scale gap test (LSGT) with a 50.8-mm diameter donor/gap and the expanded large scale gap test (ELSGT) with a 95.3-mm diameter donor/gap. Despite the same specifications for the explosive donor and polymethyl methacrylate (PMMA) gap in both tests, calibration of shock pressure in the gap versus distance from the donor scales by a factor of 1.75, not the 1.875 difference in their sizes. Recently reported model calculations suggest that the scaling discrepancy results from the viscoelastic properties of PMMA in combination with different methods for obtaining shock pressure. This is supported by the consistent scaling of these donors when calibrated in water-filled aquariums. Calibrations with water gaps will be provided and compared with PMMA gaps. Scaling for other donor systems will also be provided. Shock initiation data with water gaps will be reviewed.
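The scaling discrepancy quoted above is easy to check numerically; the "1.875" size ratio is simply the ratio of the two donor/gap diameters (values taken from the abstract):

```python
# Geometric diameter ratio of the two gap tests vs. the empirically
# calibrated pressure-distance scaling factor quoted in the abstract.
d_lsgt = 50.8     # LSGT donor/gap diameter, mm
d_elsgt = 95.3    # ELSGT donor/gap diameter, mm

geometric = d_elsgt / d_lsgt            # ~1.876, the "1.875" size ratio
empirical = 1.75                        # calibrated scaling factor
discrepancy = (geometric - empirical) / geometric

print(f"geometric ratio: {geometric:.3f}, discrepancy: {discrepancy:.1%}")
```

The roughly 7% gap between the geometric and calibrated factors is the discrepancy attributed to PMMA viscoelasticity and differing pressure-measurement methods.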
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-07-08
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
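The homology-testing idea at the core of MRPrimerW can be illustrated with a deliberately naive sketch: reject a candidate primer if it matches any off-target sequence within a mismatch tolerance. This is only a conceptual toy — MRPrimerW's actual algorithm is a redesigned MapReduce pipeline, and the sequences and threshold below are made up.

```python
# Naive homology filter (conceptual sketch, not MRPrimerW's algorithm):
# a primer fails if any off-target window matches it within
# `max_mismatches` mismatches.
def mismatches(a, b):
    return sum(x != y for x, y in zip(a, b))

def passes_homology_test(primer, off_targets, max_mismatches=2):
    k = len(primer)
    for seq in off_targets:
        for i in range(len(seq) - k + 1):
            if mismatches(primer, seq[i:i + k]) <= max_mismatches:
                return False   # too similar to an off-target site
    return True

off_targets = ["ACGTACGTAGGCTTACGGATCC", "TTGACCGGTTAACCGGTTAACC"]
print(passes_homology_test("ACGTACGTAGGCTTACG", off_targets))  # exact hit -> False
print(passes_homology_test("GGGGGGGGCCCCCCCCA", off_targets))  # no near-hit -> True
```

The quadratic scan here is exactly the "large-scale computational overhead" the abstract mentions; MRPrimerW avoids it by precomputing validated primers offline.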
Gap Test Calibrations and Their Scaling
NASA Astrophysics Data System (ADS)
Sandusky, Harold
2012-03-01
Common tests for measuring the threshold for shock initiation are the NOL large scale gap test (LSGT) with a 50.8-mm diameter donor/gap and the expanded large scale gap test (ELSGT) with a 95.3-mm diameter donor/gap. Despite the same specifications for the explosive donor and polymethyl methacrylate (PMMA) gap in both tests, calibration of shock pressure in the gap versus distance from the donor scales by a factor of 1.75, not the 1.875 difference in their sizes. Recently reported model calculations suggest that the scaling discrepancy results from the viscoelastic properties of PMMA in combination with different methods for obtaining shock pressure. This is supported by the consistent scaling of these donors when calibrated in water-filled aquariums. Calibrations and their scaling are compared for other donors with PMMA gaps and for various donors in water.
Validity and reliability of a pilot scale for assessment of multiple system atrophy symptoms.
Matsushima, Masaaki; Yabe, Ichiro; Takahashi, Ikuko; Hirotani, Makoto; Kano, Takahiro; Horiuchi, Kazuhiro; Houzen, Hideki; Sasaki, Hidenao
2017-01-01
Multiple system atrophy (MSA) is a rare progressive neurodegenerative disorder for which a brief yet sensitive scale is required for use in clinical trials and general screening. We previously compared several scales for the assessment of MSA symptoms and devised an eight-item pilot scale with a large standardized response mean [handwriting, finger taps, transfers, standing with feet together, turning trunk, turning 360°, gait, body sway]. The aim of the present study was to investigate the validity and reliability of this simple pilot scale for assessment of multiple system atrophy symptoms. Thirty-two patients with MSA (15 male/17 female; 20 cerebellar subtype [MSA-C]/12 parkinsonian subtype [MSA-P]) were prospectively registered between January 1, 2014 and February 28, 2015. Patients were evaluated by two independent raters using the Unified MSA Rating Scale (UMSARS), the Scale for the Assessment and Rating of Ataxia (SARA), and the pilot scale. Correlations between UMSARS, SARA, and pilot scale scores, intraclass correlation coefficients (ICCs), and Cronbach's alpha coefficients were calculated. Pilot scale scores correlated significantly with scores for UMSARS Parts I, II, and IV as well as with SARA scores. Intra-rater and inter-rater ICCs and Cronbach's alpha coefficients remained high (> 0.94) for all measures. The results of the present study indicate the validity and reliability of the eight-item pilot scale, particularly for the assessment of symptoms in patients with early-stage multiple system atrophy.
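Cronbach's alpha, one of the reliability statistics reported above, is straightforward to compute from a subjects-by-items (or subjects-by-raters) score matrix. A minimal sketch with toy data, not the study's:

```python
# Cronbach's alpha from a score matrix: rows = subjects,
# columns = items (or raters). Toy data for illustration only.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of items/raters
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent raters give alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

Values above roughly 0.9, like the > 0.94 figures in the abstract, are conventionally read as excellent internal consistency.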
ERIC Educational Resources Information Center
Puhan, Gautam
2009-01-01
The purpose of this study is to determine the extent of scale drift on a test that employs cut scores. It was essential to examine scale drift for this testing program because new forms in this testing program are often put on scale through a series of intermediate equatings (known as equating chains). This process may cause equating error to…
1981-11-01
Large-Scale Operations Management Test of the Use of the White Amur for Control of Problem Aquatic Plants; Report 2 of a series, First Year Poststock… (1981). A Model for Evaluation of…
NASA Astrophysics Data System (ADS)
Danovaro, Roberto; Carugati, Laura; Corinaldesi, Cinzia; Gambi, Cristina; Guilini, Katja; Pusceddu, Antonio; Vanreusel, Ann
2013-08-01
The deep sea is the largest biome of the biosphere. Knowledge of the spatial variability of deep-sea biodiversity is one of the main challenges of marine ecology and evolutionary biology. The choice of the observational spatial scale is assumed to play a key role in understanding the processes structuring deep-sea benthic communities, and one of the most typical features of marine biodiversity distribution is the existence of bathymetric gradients. However, analyses of biodiversity bathymetric gradients and the associated changes in species composition (beta diversity) have typically compared large depth ranges (with intervals of 500 to 1000 or even 2000 m depth among sites). To test whether significant changes in alpha and beta diversity also occur along fine-scale bathymetric gradients (i.e., within depth intervals of a few hundred meters), the variability of deep-sea nematode biodiversity and assemblage composition was investigated along a bathymetric transect (200-1200 m depth) with intervals of 200 m between sampling depths. A hierarchical sampling strategy was used for the analysis of nematode species richness, beta diversity, functional (trophic) diversity, and related environmental variables. The results indicate the lack of significant differences in taxonomic and functional diversity across sampling depths, but the presence of high beta diversity at all spatial scales investigated: between cores collected from the same box corer (on average 56%), among deployments at the same depth (58%), and between all sampling depths (62%). Such high beta diversity is influenced by the presence of small-scale patchiness in the deep sea and is also related to the large number of rare or very rare species (typically accounting for >80% of total species richness). Moreover, the number of ubiquitous nematode species across all sampling depths is quite low (ca. 15%).
Multiple regression analyses provide evidence that such patterns could be related to the different availability, composition and size spectra of food particles in the sediments. Additionally, though to a lesser extent, our results indicate that selective predation can influence the nematode trophic composition. These findings suggest that a multiple-scale analysis based on a nested sampling design could significantly improve our knowledge of bathymetric patterns of deep-sea biodiversity and its drivers.
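Beta diversity figures like those quoted above (56-62%) come from pairwise dissimilarity between samples. A minimal illustration using Jaccard dissimilarity on presence/absence data (the genus lists are hypothetical, and the study's own estimator may differ):

```python
# Pairwise beta diversity as Jaccard dissimilarity on presence/absence
# species lists (schematic; hypothetical nematode genus lists).
def jaccard_dissimilarity(sample_a, sample_b):
    a, b = set(sample_a), set(sample_b)
    shared = len(a & b)
    total = len(a | b)
    return 1.0 - shared / total

core_a = {"Sabatieria", "Daptonema", "Halalaimus", "Acantholaimus"}
core_b = {"Sabatieria", "Daptonema", "Desmoscolex", "Monhystera",
          "Thalassomonhystera"}
print(f"{jaccard_dissimilarity(core_a, core_b):.2f}")  # 2 shared of 7 total
```

High values even between adjacent cores, as in the abstract, indicate that most species turn over at very small spatial scales.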
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-25
The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.
Integrating neuroinformatics tools in TheVirtualBrain.
Woodman, M Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor
2014-01-01
TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.
A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering
NASA Astrophysics Data System (ADS)
Ackerman, T. P.
2017-12-01
Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.
Demonstration of Active Power Controls by Utility-Scale PV Power Plant in an Island Grid: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorgian, Vahan; O'Neill, Barbara
The National Renewable Energy Laboratory (NREL), AES, and the Puerto Rico Electric Power Authority conducted a demonstration project on a utility-scale photovoltaic (PV) plant to test the viability of providing important ancillary services from this facility. As solar generation increases globally, there is a need for innovation and increased operational flexibility. A typical PV power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. In this way, it may mitigate the impact of its variability on the grid and contribute to important system requirements more like traditional generators. In 2015, testing was completed on a 20-MW AES plant in Puerto Rico, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to provide various types of new grid-friendly controls. This data showed how active power controls can extend PV's value beyond that of a simple intermittent energy resource by providing additional ancillary services for an isolated island grid. Specifically, the tests conducted included PV plant participation in automatic generation control, provision of droop response, and fast frequency response.
Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh
2012-10-10
A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
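The Bayes-decision step described above can be sketched schematically: kernel-density estimates of the class-conditional correlation distributions are combined with priors via Bayes' rule to give soft class memberships. The classes, correlation values, and distributions below are invented for illustration and are not the published TFA pipeline.

```python
# Schematic Bayes-decision step: class-conditional KDEs over
# target-factor correlations, combined into posteriors.
# All training "correlations" here are simulated, not real fire-debris data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
corr_gasoline = np.clip(rng.normal(0.85, 0.05, 200), -1, 1)
corr_absent   = np.clip(rng.normal(0.40, 0.15, 200), -1, 1)

kde = {"gasoline": gaussian_kde(corr_gasoline),
       "absent": gaussian_kde(corr_absent)}
prior = {"gasoline": 0.5, "absent": 0.5}

def posterior(r):
    """Soft class membership for an observed target-factor correlation r."""
    joint = {c: kde[c](r)[0] * prior[c] for c in kde}
    z = sum(joint.values())
    return {c: joint[c] / z for c in joint}

print(posterior(0.9))   # high correlation -> mostly "gasoline"
print(posterior(0.3))   # low correlation  -> mostly "absent"
```

The "soft" output is the whole posterior rather than a hard label, which is what lets the method report a probability that a given ASTM class is present.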
Integrity of Bolted Angle Connections Subjected to Simulated Column Removal
Weigand, Jonathan M.; Berman, Jeffrey W.
2016-01-01
Large-scale tests of steel gravity framing systems (SGFSs) have shown that the connections are critical to system integrity when a column suffers damage that compromises its ability to carry gravity loads. When supporting columns were removed, the SGFSs redistributed gravity loads through the development of an alternate load path in a sustained tensile configuration resulting from large vertical deflections. The ability of the system to sustain such an alternate load path depends on the capacity of the gravity connections to remain intact after undergoing large rotation and axial extension demands for which they were not designed. This study experimentally evaluates the performance of steel bolted angle connections subjected to loading consistent with an interior column removal. The characteristic connection behaviors are described, and the performance of multiple connection configurations is compared in terms of their peak resistances and deformation capacities. PMID:27110059
Aerodynamic design on high-speed trains
NASA Astrophysics Data System (ADS)
Ding, San-San; Li, Qiang; Tian, Ai-Qin; Du, Jian; Liu, Jia-Li
2016-04-01
Compared with the traditional train, the operational speed of the high-speed train has greatly increased, and the dynamic environment of the train has changed from one of mechanical domination to one of aerodynamic domination. The aerodynamic problem has become the key technological challenge of high-speed trains and significantly affects the economy, environment, safety, and comfort. In this paper, the relationships among the aerodynamic design principle, aerodynamic performance indexes, and design variables are first studied, and the research methods of train aerodynamics are proposed, including numerical simulation, a reduced-scale test, and a full-scale test. Technological schemes of train aerodynamics involve the optimization design of the streamlined head and the smooth design of the body surface. Optimization design of the streamlined head includes concept design, project design, numerical simulation, and a reduced-scale test. Smooth design of the body surface is mainly used for the key parts, such as the electric-current collecting system, wheel truck compartment, and windshield. The aerodynamic design method established in this paper has been successfully applied to various high-speed trains (CRH380A, CRH380AM, CRH6, CRH2G, and the Standard electric multiple unit (EMU)) that have met expected design objectives. The research results can provide an effective guideline for the aerodynamic design of high-speed trains.
An Approach to Scoring and Equating Tests with Binary Items: Piloting With Large-Scale Assessments
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2016-01-01
This article describes an approach to test scoring, referred to as "delta scoring" (D-scoring), for tests with dichotomously scored items. The D-scoring uses information from item response theory (IRT) calibration to facilitate computations and interpretations in the context of large-scale assessments. The D-score is computed from the…
Convergence between biological, behavioural and genetic determinants of obesity.
Ghosh, Sujoy; Bouchard, Claude
2017-12-01
Multiple biological, behavioural and genetic determinants or correlates of obesity have been identified to date. Genome-wide association studies (GWAS) have contributed to the identification of more than 100 obesity-associated genetic variants, but their roles in causal processes leading to obesity remain largely unknown. Most variants are likely to have tissue-specific regulatory roles through joint contributions to biological pathways and networks, through changes in gene expression that influence quantitative traits, or through the regulation of the epigenome. The recent availability of large-scale functional genomics resources provides an opportunity to re-examine obesity GWAS data to begin elucidating the function of genetic variants. Interrogation of knockout mouse phenotype resources provides a further avenue to test for evidence of convergence between genetic variation and biological or behavioural determinants of obesity.
PyHLA: tests for the association between HLA alleles and diseases.
Fan, Yanhui; Song, You-Qiang
2017-02-06
Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
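A single-allele association test of the kind PyHLA automates can be sketched with Fisher's exact test on a 2×2 carrier table plus a Bonferroni correction. The allele names and counts below are hypothetical, and PyHLA itself also offers permutation-based corrections and further models:

```python
# Per-allele association test with Bonferroni correction
# (illustrative sketch; counts and alleles are hypothetical).
from scipy.stats import fisher_exact

# allele -> (case carriers, case non-carriers,
#            control carriers, control non-carriers)
counts = {
    "DRB1*15:01": (60, 40, 30, 70),
    "A*02:01":    (50, 50, 48, 52),
}

n_tests = len(counts)
for allele, (a, b, c, d) in counts.items():
    odds, p = fisher_exact([[a, b], [c, d]])
    p_bonf = min(1.0, p * n_tests)     # Bonferroni-adjusted p-value
    print(f"{allele}: OR={odds:.2f}, p={p:.2e}, p_adj={p_bonf:.2e}")
```

With these toy counts, only the first allele survives correction, mirroring the multiple-testing workflow the abstract describes.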
ERIC Educational Resources Information Center
Burns, G. Leonard; Walsh, James A.; Servera, Mateu; Lorenzo-Seva, Urbano; Cardo, Esther; Rodriguez-Fornells, Antoni
2013-01-01
Exploratory structural equation modeling (SEM) was applied to a multiple indicator (26 individual symptom ratings) by multitrait (ADHD-IN, ADHD-HI and ODD factors) by multiple source (mothers, fathers and teachers) model to test the invariance, convergent and discriminant validity of the Child and Adolescent Disruptive Behavior Inventory with 872…
NASA Astrophysics Data System (ADS)
Nardi, Albert; Idiart, Andrés; Trinchero, Paolo; de Vries, Luis Manuel; Molinero, Jorge
2014-08-01
This paper presents the development, verification and application of an efficient interface, denoted as iCP, which couples two standalone simulation programs: the general purpose Finite Element framework COMSOL Multiphysics® and the geochemical simulator PHREEQC. The main goal of the interface is to maximize the synergies between the aforementioned codes, providing a numerical platform that can efficiently simulate a wide range of multiphysics problems coupled with geochemistry. iCP is written in Java and uses the IPhreeqc C++ dynamic library and the COMSOL Java-API. Given the large computational requirements of the aforementioned coupled models, special emphasis has been placed on numerical robustness and efficiency. To this end, the geochemical reactions are solved in parallel by balancing the computational load over multiple threads. First, a benchmark exercise is used to test the reliability of iCP regarding flow and reactive transport. Then, a large scale thermo-hydro-chemical (THC) problem is solved to show the code capabilities. The results of the verification exercise are successfully compared with those obtained using PHREEQC, and the application case demonstrates the scalability of a large scale model, at least up to 32 threads.
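The operator-splitting and parallel-chemistry idea described above can be sketched in a few lines: a global transport step alternates with cell-local chemistry solved concurrently across threads. Here toy diffusion and first-order decay stand in for COMSOL and PHREEQC; nothing below uses the actual iCP interface.

```python
# Operator-splitting sketch: global transport step, then cell-local
# "chemistry" distributed over a thread pool (toy physics, not iCP).
from concurrent.futures import ThreadPoolExecutor

def transport_step(conc, dt=0.1):
    """Toy 1-D diffusion with reflective boundaries."""
    n = len(conc)
    return [conc[i] + dt * ((conc[i - 1] if i > 0 else conc[i])
                            - 2 * conc[i]
                            + (conc[i + 1] if i < n - 1 else conc[i]))
            for i in range(n)]

def chemistry_step(c, dt=0.1, k=0.5):
    """Toy first-order decay, solved independently per cell."""
    return c * (1.0 - k * dt)

conc = [1.0, 0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0]
with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(10):
        conc = transport_step(conc)                   # global PDE step
        conc = list(pool.map(chemistry_step, conc))   # per-cell chemistry
print([round(c, 4) for c in conc])
```

Because each cell's chemistry is independent within a time step, the chemistry work load-balances naturally across threads, which is the scalability property the abstract reports for iCP.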
NASA Astrophysics Data System (ADS)
Rastorguev, A. S.; Utkin, N. D.; Chumak, O. V.
2017-08-01
Agekyan's λ-factor that allows for the effect of multiplicity of stellar encounters with large impact parameters has been used for the first time to directly calculate the diffusion coefficients in the phase space of a stellar system. Simple estimates show that the cumulative effect, i.e., the total contribution of distant encounters to the change in the velocity of a test star, given the multiplicity of stellar encounters, is finite, and the logarithmic divergence inherent in the classical description of diffusion is removed, as was shown previously by Kandrup using a different, more complex approach. In this case, the expressions for the diffusion coefficients, as in the classical description, contain the logarithm of the ratio of two independent quantities: the mean interparticle distance and the impact parameter of a close encounter. However, the physical meaning of this logarithmic factor changes radically: it reflects not the divergence but the presence of two characteristic length scales inherent in the stellar medium.
Valdivia, Nelson; Díaz, María J.; Holtheuer, Jorge; Garrido, Ignacio; Huovinen, Pirjo; Gómez, Iván
2014-01-01
Understanding the variation of biodiversity along environmental gradients and multiple spatial scales is relevant for theoretical and management purposes. Here, we analysed the spatial variability in diversity and structure of intertidal and subtidal macrobenthic Antarctic communities along vertical environmental stress gradients and across multiple horizontal spatial scales. Since biotic interactions and local topographic features are likely major factors for coastal assemblages, we tested the hypothesis that fine-scale processes influence the effects of the vertical environmental stress gradients on the macrobenthic diversity and structure. We used nested sampling designs in the intertidal and subtidal habitats, including horizontal spatial scales ranging from a few centimetres to 1000s of metres along the rocky shore of Fildes Peninsula, King George Island. In both intertidal and subtidal habitats, univariate and multivariate analyses showed a marked vertical zonation in taxon richness and community structure. These patterns depended on the horizontal spatial scale of observation, as all analyses showed a significant interaction between height (or depth) and the finer spatial scale analysed. Variance and pseudo-variance components supported our prediction for taxon richness, community structure, and the abundance of dominant species such as the filamentous green alga Urospora penicilliformis (intertidal), the herbivore Nacella concinna (intertidal), the large kelp-like Himantothallus grandifolius (subtidal), and the crustose red alga Lithothamnion spp. (subtidal). We suggest that in coastal ecosystems strongly governed by physical factors, fine-scale processes (e.g. biotic interactions and refugia availability) are still relevant for the structuring and maintenance of the local communities.
The spatial patterns found in this study serve as a necessary benchmark to understand the dynamics and adaptation of natural assemblages in response to observed and predicted environmental changes in Antarctica. PMID:24956114
Development and Initial Testing of the Tiltrotor Test Rig
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Sheikman, A. L.
2018-01-01
The NASA Tiltrotor Test Rig (TTR) is a new, large-scale proprotor test system, developed jointly with the U.S. Army and Air Force for the National Full-Scale Aerodynamics Complex (NFAC). The TTR is designed to test advanced proprotors up to 26 feet in diameter at speeds up to 300 knots, and even larger rotors at lower airspeeds. This combination of size and speed is unprecedented and is necessary for research into 21st-century tiltrotors and other advanced rotorcraft concepts. The TTR will provide critical data for validation of state-of-the-art design and analysis tools.
ERIC Educational Resources Information Center
Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent
2015-01-01
This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…
ERIC Educational Resources Information Center
Toro, Maritsa
2011-01-01
The statistical assessment of dimensionality provides evidence of the underlying constructs measured by a survey or test instrument. This study focuses on educational measurement, specifically tests comprised of items described as multidimensional. That is, items that require examinee proficiency in multiple content areas and/or multiple cognitive…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...
2017-02-16
Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large-enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students not only to learn and succeed in acquiring content, but also to practice important 21st-century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large-enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D
2018-01-01
Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa.
Third, we provide an example of statistical analyses and sampling design that once temporal sampling is incorporated will be useful to detect changes of marine benthic communities across multiple spatial and temporal scales.
Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F.; Byrne, Maria; Malcolm, Hamish A.; Williams, Stefan B.; Steinberg, Peter D.
2018-01-01
Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate ‘no-take’ and ‘general-use’ (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5–10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa.
Third, we provide an example of statistical analyses and sampling design that once temporal sampling is incorporated will be useful to detect changes of marine benthic communities across multiple spatial and temporal scales. PMID:29547656
Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping
NASA Astrophysics Data System (ADS)
Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.
2016-06-01
When it comes to large-scale mapping of limited areas, especially cultural heritage sites, the requirements become demanding. Optical and non-optical sensors, e.g. LiDAR units, are now built at sizes and weights that unmanned aerial platforms can lift. At the same time there is increasing emphasis on solutions that enable users to access 3D information faster and more cheaply. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase in available computing power, this challenge merits, and is receiving, further investigation. In this paper a short review of UAS technologies today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As a test data set we use rich optical and thermal data from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, acquired with different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.
MODFLOW-LGR: Practical application to a large regional dataset
NASA Astrophysics Data System (ADS)
Barnes, D.; Coulibaly, K. M.
2011-12-01
In many areas of the US, including southwest Florida, large regional-scale groundwater models have been developed to aid in decision making and water resources management. These models are subsequently used as a basis for site-specific investigations. Because the large scale of these regional models is not appropriate for local application, refinement is necessary to analyze the local effects of pumping wells and groundwater related projects at specific sites. The most commonly used approach to date is Telescopic Mesh Refinement, or TMR. It allows the extraction of a subset of the large regional model with boundary conditions derived from the regional model results. The extracted model is then updated and refined for local use using a variable-sized grid focused on the area of interest. MODFLOW-LGR, local grid refinement, is an alternative approach which allows model discretization at a finer resolution in areas of interest and provides coupling between the larger "parent" model and the locally refined "child" model. In the present work, these two approaches are tested on a mining impact assessment case in southwest Florida using a large regional dataset (The Lower West Coast Surficial Aquifer System Model). Various metrics for performance are considered. They include: computation time, water balance (as compared to the variable-sized grid), calibration, implementation effort, and application advantages and limitations. The results indicate that MODFLOW-LGR is a useful tool to improve local resolution of regional scale models. While performance metrics, such as computation time, are case-dependent (model size, refinement level, stresses involved), implementation effort, particularly when regional models of suitable scale are available, can be minimized. The creation of multiple child models within a larger scale parent model makes it possible to reuse the same calibrated regional dataset with minimal modification.
In cases similar to the Lower West Coast model, where a model is larger than optimal for direct application as a parent grid, a combination of TMR and LGR approaches should be used to develop a suitable parent grid.
NASA Astrophysics Data System (ADS)
Tang, Zhanqi; Jiang, Nan
2018-05-01
This study reports modifications of scale interaction and arrangement in a turbulent boundary layer perturbed by a wall-mounted circular cylinder. Hot-wire measurements were taken at multiple streamwise and wall-normal locations downstream of the cylindrical element. The streamwise fluctuating signals were decomposed into large-, small-, and dissipative-scale signatures by corresponding cutoff filters. The scale interaction under the cylindrical perturbation was elaborated by comparing the small- and dissipative-scale amplitude/frequency modulation effects downstream of the cylinder with those observed in the unperturbed case. We found that the large-scale fluctuations exert a stronger amplitude modulation on both the small and dissipative scales in the near-wall region. At wall-normal positions near the cylinder height, the small-scale amplitude modulation coefficients are redistributed by the cylinder wake. A similar observation was noted for small-scale frequency modulation; however, the dissipative-scale frequency modulation seems to be independent of the cylindrical perturbation. The phase-relationship observation indicated that the cylindrical perturbation shortens the time shifts between both the small- and dissipative-scale variations (amplitude and frequency) and the large-scale fluctuations. The integral-time-scale dependence of the phase relationship between the small/dissipative scales and the large scales is also discussed. Furthermore, the discrepancy between small- and dissipative-scale time shifts relative to the large-scale motions was examined, indicating that the small-scale amplitude/frequency leads the dissipative scales.
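The amplitude-modulation diagnostic used in studies like this one (correlating the large-scale signal with the filtered envelope of the small-scale signal) can be illustrated on a synthetic time series. The spectral cutoffs, signal construction, and helper names below are our illustrative assumptions, not the study's actual processing:

```python
import numpy as np

def lowpass(x, cutoff_bins):
    # Crude spectral low-pass: zero all FFT bins at/above the cutoff.
    X = np.fft.rfft(x)
    X[cutoff_bins:] = 0
    return np.fft.irfft(X, n=len(x))

def envelope(x):
    # Amplitude envelope via the analytic signal (FFT-based Hilbert transform).
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0  # n is even here
    return np.abs(np.fft.ifft(X * h))

n = 4096
t = np.arange(n)
large = np.sin(2 * np.pi * t / 512)                      # large-scale motion
small = (1 + 0.5 * large) * np.sin(2 * np.pi * t / 16)   # modulated small scale
u = large + small                                        # synthetic signal

uL = lowpass(u, cutoff_bins=32)              # large-scale component
uS = u - uL                                  # small-scale residue
envS = lowpass(envelope(uS), cutoff_bins=32) # slow variation of the envelope

# Amplitude-modulation coefficient: correlation of the large scales with
# the low-pass-filtered small-scale envelope.
R_am = np.corrcoef(uL, envS)[0, 1]
```

Because the synthetic small scale is modulated by the large scale by construction, R_am comes out near 1; in measured boundary-layer data the coefficient varies with wall-normal position, which is what the abstract's comparisons quantify.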
Pain and Cognition in Multiple Sclerosis.
Scherder, R; Kant, N; Wolf, E; Pijnenburg, A C M; Scherder, E
2017-10-01
The goal of the present study was to examine the relationship between pain and cognition in patients with multiple sclerosis. Cross-sectional. Nursing home and personal environment of the investigators. Two groups of participants were included: 91 patients with multiple sclerosis and 80 matched control participants. The level of pain was measured by the following pain scales: Number of Words Chosen-Affective, Colored Analogue Scale for pain intensity and suffering from pain, and the Faces Pain Scale. Mood was tested by administering the Beck Depression Inventory and the Symptom Check List-90 anxiety and depression subscale. Global cognitive functioning was assessed by the Mini Mental State Examination. Memory and executive functions were assessed by several neuropsychological tests. Multiple sclerosis (MS) patients scored significantly lower than control participants on the majority of the neuropsychological tests. The MS patients experienced more pain compared with control participants, despite the fact that they were taking significantly more pain medication. No significant correlation was observed between cognition and pain in MS patients. Verbal working memory explained 10% of pain intensity (trend). Mood appeared to be a significant predictor of pain in patients with multiple sclerosis. The lack of a relationship between cognition and pain might be explained by the fact that, compared with control participants, patients with multiple sclerosis activate other non-pain-related areas to perform executive functions and memory tasks.
Interrelationships of locus of control content dimensions and hopelessness.
Ward, L C; Thomas, L L
1985-07-01
Items from three locus of control (LOC) tests and the Beck Hopelessness Scale were administered to 197 college students. Factor analyses produced multiple factors for each LOC test, but the Beck scale proved to be unidimensional. Factor scales were constructed for each test, and scores were factor analyzed to discover common content. Each LOC test contained a salient dimension that described belief in luck, chance, or fate, and corresponding scales were well correlated. Internal control was the second common theme, with variations according to whether control was attributed to oneself or to people in general. The third common component expressed a personal helplessness or powerlessness. Each common factor was loaded by the Hopelessness Scale, which also correlated with all but one LOC factor scale.
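The dimensionality assessment described above (multiple factors for each LOC test, a single one for the Beck scale) can be sketched by eigendecomposing an item correlation matrix and applying the Kaiser criterion. The simulated responses and the single "chance" factor below are our toy assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # simulated respondents

# Six items driven by one common latent factor (e.g. belief in chance),
# plus two unrelated items.
common = rng.normal(size=n)
items = np.column_stack(
    [common + rng.normal(scale=0.8, size=n) for _ in range(6)]
    + [rng.normal(size=n) for _ in range(2)]
)

R = np.corrcoef(items, rowvar=False)          # 8 x 8 item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
n_factors = int((eigvals > 1.0).sum())        # Kaiser eigenvalue-greater-than-one rule
```

In practice the eigenvalue rule would be complemented by a scree plot or parallel analysis, and a rotated factor solution would be examined to interpret the common content, as the abstract describes.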
An Update on ToxCast™ | Science Inventory | US EPA
In its first phase, ToxCast™ is profiling over 300 well-characterized chemicals (primarily pesticides) in over 400 HTS endpoints. These endpoints include biochemical assays of protein function, cell-based transcriptional reporter assays, multi-cell interaction assays, transcriptomics on primary cell cultures, and developmental assays in zebrafish embryos. Almost all of the compounds being examined in Phase 1 of ToxCast™ have been tested in traditional toxicology tests, including developmental toxicity, multi-generation studies, and sub-chronic and chronic rodent bioassays. Lessons learned to date for ToxCast: large amounts of quality HTS data can be economically obtained; large-scale data sets will be required to understand potential for biological activity; there is value in having multiple assays with overlapping coverage of biological pathways and a variety of methodologies; concentration-response will be important for ultimate interpretation; data transparency will be important for acceptance; metabolic capabilities and coverage of developmental toxicity pathways will need additional attention; the gold standard needs to be defined; and partnerships are needed to bring critical mass and expertise.
SANDO syndrome in a cohort of 107 patients with CPEO and mitochondrial DNA deletions.
Hanisch, Frank; Kornhuber, Malte; Alston, Charlotte L; Taylor, Robert W; Deschauer, Marcus; Zierz, Stephan
2015-06-01
The sensory ataxic neuropathy with dysarthria and ophthalmoparesis (SANDO) syndrome is a subgroup of mitochondrial chronic progressive external ophthalmoplegia (CPEO)-plus disorders associated with multiple mitochondrial DNA (mtDNA) deletions. There is no systematic survey on SANDO in patients with CPEO with either single or multiple large-scale mtDNA deletions. In this retrospective analysis, we characterised the frequency, the genetic and clinical phenotype of 107 index patients with mitochondrial CPEO (n=66 patients with single and n=41 patients with multiple mtDNA deletions) and assessed these for clinical evidence of a SANDO phenotype. Patients with multiple mtDNA deletions were additionally screened for mutations in the nuclear-encoded POLG, SLC25A4, PEO1 and RRM2B genes. The clinical, histological and genetic data of 11 patients with SANDO were further analysed. None of the 66 patients with single, large-scale mtDNA deletions fulfilled the clinical criteria of SANDO syndrome. In contrast, 9 of 41 patients (22%) with multiple mtDNA deletions and two additional family members fulfilled the clinical criteria for SANDO. Within this subgroup, multiple mtDNA deletions were associated with the following nuclear mutations: POLG (n=6), PEO1 (n=2), unidentified (n=2). The combination of sensory ataxic neuropathy with ophthalmoparesis (SANO) was observed in 70% of patients with multiple mtDNA deletions but only in 4% with single deletions. The combination of CPEO and sensory ataxic neuropathy (SANO, incomplete SANDO) was found in 43% of patients with multiple mtDNA deletions but not in patients with single deletions. The SANDO syndrome seems to indicate a cluster of symptoms within the wide range of multisystemic symptoms associated with mitochondrial CPEO. SANO seems to be the most frequent phenotype associated with multiple mtDNA deletions in our cohort, but is rarely, if ever, associated with single, large-scale mtDNA deletions.
Hering, Daniel; Carvalho, Laurence; Argillier, Christine; Beklioglu, Meryem; Borja, Angel; Cardoso, Ana Cristina; Duel, Harm; Ferreira, Teresa; Globevnik, Lidija; Hanganu, Jenica; Hellsten, Seppo; Jeppesen, Erik; Kodeš, Vit; Solheim, Anne Lyche; Nõges, Tiina; Ormerod, Steve; Panagopoulos, Yiannis; Schmutz, Stefan; Venohr, Markus; Birk, Sebastian
2015-01-15
Water resources globally are affected by a complex mixture of stressors resulting from a range of drivers, including urban and agricultural land use, hydropower generation and climate change. Understanding how stressors interfere and impact upon ecological status and ecosystem services is essential for developing effective River Basin Management Plans and shaping future environmental policy. This paper details the nature of these problems for Europe's water resources and the need to find solutions at a range of spatial scales. In terms of the latter, we describe the aims and approaches of the EU-funded project MARS (Managing Aquatic ecosystems and water Resources under multiple Stress) and the conceptual and analytical framework that it is adopting to provide this knowledge, understanding and tools needed to address multiple stressors. MARS is operating at three scales: At the water body scale, the mechanistic understanding of stressor interactions and their impact upon water resources, ecological status and ecosystem services will be examined through multi-factorial experiments and the analysis of long time-series. At the river basin scale, modelling and empirical approaches will be adopted to characterise relationships between multiple stressors and ecological responses, functions, services and water resources. The effects of future land use and mitigation scenarios in 16 European river basins will be assessed. At the European scale, large-scale spatial analysis will be carried out to identify the relationships amongst stress intensity, ecological status and service provision, with a special focus on large transboundary rivers, lakes and fish. 
The project will support managers and policy makers in the practical implementation of the Water Framework Directive (WFD), of related legislation and of the Blueprint to Safeguard Europe's Water Resources by advising the 3rd River Basin Management Planning cycle, the revision of the WFD and by developing new tools for diagnosing and predicting multiple stressors.
Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors
Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Vredenburg, Vance T.; Rosenblum, Erica Bree; Briggs, Cheryl J.
2016-01-01
Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.
Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors.
Knapp, Roland A; Fellers, Gary M; Kleeman, Patrick M; Miller, David A W; Vredenburg, Vance T; Rosenblum, Erica Bree; Briggs, Cheryl J
2016-10-18
Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth's amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species' adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.
Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors
Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Rosenblum, Erica Bree; Briggs, Cheryl J.
2016-01-01
Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale. PMID:27698128
The Washback Effect of Konkoor on Teachers' Attitudes toward Their Teaching
ERIC Educational Resources Information Center
Birjandi, Parviz; Shirkhani, Servat
2012-01-01
Large-scale tests have been considered by many scholars in the field of language testing and teaching to influence teaching and learning considerably. The present study looks at the effect of a large-scale test (Konkoor) on the attitudes of teachers in high schools. Konkoor is the university entrance examination in Iran, which is taken by at least…
ERIC Educational Resources Information Center
Qi, Sen; Mitchell, Ross E.
2012-01-01
The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…
Sebastião, Emerson; Sandroff, Brian M; Learmonth, Yvonne C; Motl, Robert W
2016-07-01
To examine the validity of the timed Up and Go (TUG) test as a measure of functional mobility in persons with multiple sclerosis (MS) by using a comprehensive framework based on construct validity (ie, convergent and divergent validity). Cross-sectional study. Hospital setting. Community-residing persons with MS (N=47). Not applicable. Main outcome measures included the TUG test, timed 25-foot walk test, 6-minute walk test, Multiple Sclerosis Walking Scale-12, Late-Life Function and Disability Instrument, posturography evaluation, Activities-specific Balance Confidence scale, Symbol Digits Modalities Test, Expanded Disability Status Scale, and the number of steps taken per day. The TUG test was strongly associated with other valid outcome measures of ambulatory mobility (Spearman rank correlation, rs=.71-.90) and disability status (rs=.80), moderately to strongly associated with balance confidence (rs=.66), and weakly associated with postural control (ie, balance) (rs=.31). The TUG test was moderately associated with cognitive processing speed (rs=.59), but not associated with other nonambulatory measures (ie, Late-Life Function and Disability Instrument-upper extremity function). Our findings support the validity of the TUG test as a measure of functional mobility. This warrants its inclusion in patients' assessment alongside other valid measures of functional mobility in both clinical and research practice in persons with MS.
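Convergent and divergent validity checks of the kind reported here are quantified with Spearman rank correlations. A minimal sketch follows; the scores are hypothetical and the helper ignores ties, unlike production implementations:

```python
import numpy as np

def spearman(x, y):
    # Spearman rho = Pearson correlation of the ranks
    # (no tie handling; for illustration only).
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical scores: a strictly monotone relation yields rho = 1
# even when the raw relation is nonlinear.
tug_seconds = np.array([7.2, 8.1, 9.5, 11.0, 14.3, 20.8])
walk_test = tug_seconds ** 3
rho = spearman(tug_seconds, walk_test)
```

Because Spearman correlation depends only on ranks, any monotone association between two measures produces rho = 1, which suits construct-validity comparisons where linearity is not assumed.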
Karasek, Robert; Choi, BongKyoo; Ostergren, Per-Olof; Ferrario, Marco; De Smet, Patrick
2007-01-01
Little has been known about the comparative scale properties of "JCQ-like" questionnaires with respect to the Job Content Questionnaire (JCQ). This study assessed the validity and reliability of two methods for generating comparable scale scores between the JCQ and JCQ-like questionnaires in sub-populations of the large Job Stress, Absenteeism and Coronary Heart Disease European Cooperative (JACE) study: the Swedish version of the Demand-Control Questionnaire (DCQ) and a transformed Multinational Monitoring of Trends and Determinants in Cardiovascular Disease Project (MONICA) questionnaire. A random population sample of all Malmo males and females aged 52-58 years (n = 682) was given a new test questionnaire containing both instruments (the JCQ and the DCQ). Comparability-facilitating algorithms were created (Method I). For the transformed Milan MONICA questionnaire, a simple weighting system was used (Method II). The converted scale scores from the JCQ-like questionnaires were found to be reliable and highly correlated with those of the original JCQ. However, agreement on the high job strain group between the JCQ and the DCQ, with and without Method I applied, was only moderate (by kappa). Use of a multiple-level job strain scale generated higher levels of job strain agreement, as did a new job strain definition that excludes the intermediate levels of the job strain distribution. The two methods were valid and generally reliable.
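The moderate agreement reported above is expressed as Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch of the standard unweighted kappa for a binary high-strain/not-high-strain classification follows; the classifications are hypothetical and not taken from the JACE data:

```python
def cohens_kappa(labels_a, labels_b):
    """Unweighted Cohen's kappa for two raters or instruments."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    cats = sorted(set(labels_a) | set(labels_b))
    # observed agreement
    po = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # chance agreement from the marginal proportions of each category
    pe = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical high-strain (1) vs. not (0) classifications from two scales
jcq = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
dcq = [1, 0, 0, 0, 1, 0, 1, 0, 1, 0]
print(round(cohens_kappa(jcq, dcq), 2))  # → 0.58
```

A kappa near 0.6 is conventionally read as "moderate" agreement, the level the abstract describes for the high job strain group.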
Integrated Data Modeling and Simulation on the Joint Polar Satellite System Program
NASA Technical Reports Server (NTRS)
Roberts, Christopher J.; Boyce, Leslye; Smith, Gary; Li, Angela; Barrett, Larry
2012-01-01
The Joint Polar Satellite System is a modern, large-scale, complex, multi-mission aerospace program, and presents a variety of design, testing and operational challenges due to: (1) System Scope: multi-mission coordination, role, responsibility and accountability challenges stemming from porous/ill-defined system and organizational boundaries (including foreign policy interactions); (2) Degree of Concurrency: design, implementation, integration, verification and operation occurring simultaneously, at multiple scales in the system hierarchy; (3) Multi-Decadal Lifecycle: technical obsolescence, reliability and sustainment concerns, including those related to the organizational and industrial base. Additionally, these systems tend to become embedded in the broader societal infrastructure, resulting in new system stakeholders with perhaps different preferences; (4) Barriers to Effective Communications: process and cultural issues that emerge due to geographic dispersion and as one spans boundaries including gov./contractor, NASA/Other USG, and international relationships.
Evaluating large-scale health programmes at a district level in resource-limited countries.
Svoronos, Theodore; Mate, Kedar S
2011-11-01
Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
Fire extinguishing tests -80 with methyl alcohol gasoline
NASA Astrophysics Data System (ADS)
Holmstedt, G.; Ryderman, A.; Carlsson, B.; Lennmalm, B.
1980-10-01
Large-scale tests and laboratory experiments were carried out to estimate the extinguishing effectiveness of three alcohol-resistant aqueous film-forming foams (AFFF), two alcohol-resistant fluoroprotein foams, and two detergent foams in various pool fires: gasoline, isopropyl alcohol, acetone, methyl-ethyl ketone, methyl alcohol, and M15 (a gasoline, methyl alcohol, isobutene mixture). The scaling down of large-scale tests for developing a reliable laboratory method was especially examined. The tests were performed with semidirect foam application, in pools of 50, 11, 4, 0.6, and 0.25 sq m. Burning time, temperature distribution in the liquid, and thermal radiation were determined. An M15 fire can be extinguished with a detergent foam, but it is impossible to extinguish fires in polar solvents, such as methyl alcohol, acetone, and isopropyl alcohol, with detergent foams. AFFF gave the best results, and performance with small pools can hardly be correlated with results from large-scale fires.
Rilov, Gil; Schiel, David R.
2011-01-01
Predicting the strength and context-dependency of species interactions across multiple scales is a core area in ecology. This is especially challenging in the marine environment, where populations of most predators and prey are generally open, because of their pelagic larval phase, and recruitment of both is highly variable. In this study we use a comparative-experimental approach on small and large spatial scales to test the relationship between predation intensity and prey recruitment and their relative importance in shaping populations of a dominant rocky intertidal space occupier, mussels, in the context of seascape (availability of nearby subtidal reef habitat). Predation intensity on transplanted mussels was tested inside and outside cages and recruitment was measured with standard larval settlement collectors. We found that on intertidal rocky benches with contiguous subtidal reefs in New Zealand, mussel larval recruitment is usually low but predation on recruits by subtidal consumers (fish, crabs) is intense during high tide. On nearby intertidal rocky benches with adjacent sandy subtidal habitats, larval recruitment is usually greater but subtidal predators are typically rare and predation is weaker. Multiple regression analysis showed that predation intensity accounts for most of the variability in the abundance of adult mussels compared to recruitment. This seascape-dependent, predation-recruitment relationship could scale up to explain regional community variability. We argue that community ecology models should include seascape context-dependency and its effects on recruitment and species interactions for better predictions of coastal community dynamics and structure. PMID:21887351
Kubsik, Anna; Klimkiewicz, Paulina; Klimkiewicz, Robert; Jankowska, Katarzyna; Jankowska, Agnieszka; Woldańska-Okońska, Marta
2014-07-01
Multiple sclerosis is a chronic, inflammatory, demyelinating disease of the central nervous system characterized by diverse symptomatology. It most often affects people at a young age, gradually leading to disability, and new therapies are sought to alleviate the neurological deficits caused by the disease. One alternative method of therapy is high-tone power therapy. This article compares high-tone power therapy and kinesitherapy in the rehabilitation of patients with multiple sclerosis. The aim of this study was to evaluate the effectiveness of high-tone power therapy and kinesitherapy exercises on the functional status of patients with multiple sclerosis. The study involved 20 patients with multiple sclerosis, of both sexes, treated at the Department of Rehabilitation and Physical Medicine in Lodz. Patients were randomly divided into two groups. In group I, high-tone power therapy was applied for 60 minutes, while in group II kinesitherapy exercises were used. Treatment time for both groups was 15 days. Functional status was assessed with the Expanded Disability Status Scale of Kurtzke (EDSS) and the Barthel ADL Index. Quality of life was assessed using the MSQOL-54 questionnaire, gait and balance with the Tinetti scale, pain with the VAS and Laitinen scales, and changes in muscle tone with the Ashworth scale. Both groups improved on the scales administered before and after therapy. In group I, which received high-tone power therapy, statistically significant results were reported in 9 of 10 tested parameters, while in group II, which used kinesitherapy exercises, an improvement was seen in 6 of 10 tested parameters. Comparing the results of the two groups against each other did not show statistically significant differences. High-tone power therapy has a beneficial effect on the functional status of patients with multiple sclerosis, and the number of improved parameters supports its use in their comprehensive rehabilitation. Kinesitherapy exercises also favorably affect the functional status of patients with MS and are essential in their rehabilitation. In neither group were adverse effects observed.
Future of applied watershed science at regional scales
Lee Benda; Daniel Miller; Steve Lanigan; Gordon Reeves
2009-01-01
Resource managers must deal increasingly with land use and conservation plans applied at large spatial scales (watersheds, landscapes, states, regions) involving multiple interacting federal agencies and stakeholders. Access to a geographically focused and application-oriented database would allow users in different locations and with different concerns to quickly...
Spotted Towhee population dynamics in a riparian restoration context
Stacy L. Small; Frank R., III Thompson; Geoffery R. Geupel; John Faaborg
2007-01-01
We investigated factors at multiple scales that might influence nest predation risk for Spotted Towhees (Pipilo maculates) along the Sacramento River, California, within the context of large-scale riparian habitat restoration. We used the logistic-exposure method and Akaike's information criterion (AIC) for model selection to compare predator...
Tanigawa, Makoto; Stein, Jason; Park, John; Kosa, Peter; Cortese, Irene; Bielekova, Bibiana
2017-01-01
While magnetic resonance imaging contrast-enhancing lesions represent an excellent screening tool for disease-modifying treatments in relapsing-remitting multiple sclerosis (RRMS), this biomarker is insensitive for testing therapies against compartmentalized inflammation in progressive multiple sclerosis (MS). Therefore, alternative sensitive outcomes are needed. Using machine learning, clinician-acquired disability scales can be combined with timed measures of neurological functions such as walking speed (e.g. 25-foot walk; 25FW) or fine finger movements (e.g. 9-hole peg test; 9HPT) into sensitive composite clinical scales, such as the recently developed combinatorial, weight-adjusted disability scale (CombiWISE). Ideally, these complementary simplified measurements of certain neurological functions could be performed regularly at patients' homes using smartphones. We asked whether tests amenable to adaptation to smartphone technology, such as finger and foot tapping have comparable sensitivity and specificity to current non-clinician-acquired disability measures. We observed that finger and foot tapping can differentiate RRMS and progressive MS in a cross-sectional study and can also measure yearly and two-year disease progression in the latter, with better power (based on z-scores) in comparison to currently utilized 9HPT and 25FW. Replacing the 9HPT and 25FW with simplified tests broadly adaptable to smartphone technology may enhance the power of composite scales for progressive MS.
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for the Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL, and CMCC), one orchestrator site (PSNC), and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic, and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements, and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. 
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
Clarification of the memory artefact in the assessment of suggestibility.
Willner, P
2008-04-01
The Gudjonsson Suggestibility Scale (GSS) assesses suggestibility by asking respondents to recall a short story, followed by exposure to leading questions and pressure to change their responses. Suggestibility, as assessed by the GSS, appears to be elevated in people with intellectual disabilities (ID). This has been shown to reflect to some extent the fact that people with ID have poor recall of the story; however, there are discrepancies in this relationship. The aim of the present study was to investigate whether a closer match between memory and suggestibility would be found using a measure of recognition memory rather than free recall. Three modifications to the procedure were presented to users of a learning disabilities day service. In all three experiments, a measure of forced-choice recognition memory was built into the suggestibility test. In experiments 1 and 2, the GSS was presented using either divided presentation (splitting the story into two halves, with memory and suggestibility tests after each half) or multiple presentation (the story was presented three times before presentation of the memory and suggestibility tests). Participants were tested twice, once with the standard version of the test and once with one of the modified versions. In experiment 3, an alternative suggestibility scale (ASS3) was created, based on real events in a learning disabilities day service. The ASS3 was presented to one group of participants who had been present at the events, and a second group who attended a different day service, to whom the events were unfamiliar. As observed previously, suggestibility was not closely related to free recall performance: recall was increased equally by all three manipulations, but they produced, respectively, no effect, a modest effect and a large effect on suggestibility. 
However, the effects on suggestibility were closely related to performance on the forced-choice recognition memory task: divided presentation of the GSS2 had no effect on either of these measures; multiple presentation of the GSS2 produced a modest increase in recognition memory and a modest decrease in suggestibility; and replacing the GSS with the ASS3 produced a large increase in recognition memory and a large decrease in suggestibility. The results support earlier findings that the GSS is likely to overestimate how suggestible a person will be in relation to a personally significant event. This reflects poor recognition memory for the material being tested, rather than increased suggestibility per se. People with ID may in fact be relatively non-suggestible for well-remembered events, which would include personally significant events, particularly those witnessed recently.
NASA Astrophysics Data System (ADS)
Usman, Muhammad
2018-04-01
Bismide semiconductor materials and heterostructures are considered promising candidates for the design and implementation of photonic, thermoelectric, photovoltaic, and spintronic devices. This work presents a detailed theoretical study of the electronic and optical properties of strongly coupled GaBixAs1 -x /GaAs multiple quantum well (MQW) structures. Based on a systematic set of large-scale atomistic tight-binding calculations, our results reveal that the impact of atomic-scale fluctuations in alloy composition is stronger than the interwell coupling effect, and plays an important role in the electronic and optical properties of the investigated MQW structures. Independent of QW geometry parameters, alloy disorder leads to a strong confinement of charge carriers, a large broadening of the hole energies, and a red-shift in the ground-state transition wavelength. Polarization-resolved optical transition strengths exhibit a striking effect of disorder, where the inhomogeneous broadening could exceed an order of magnitude for MQWs, in comparison to a factor of about 3 for single QWs. The strong influence of alloy disorder effects persists when small variations in the size and composition of MQWs typically expected in a realistic experimental environment are considered. The presented results highlight the limited scope of continuum methods and emphasize the need for large-scale atomistic approaches to design devices with tailored functionalities based on the novel properties of bismide materials.
NASA Astrophysics Data System (ADS)
Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker
2018-04-01
A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as `field' or `global' significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. 
The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
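The abstract describes a field significance test that "controls the proportion of falsely rejected local tests". The paper's exact procedure is not given here, but a common approach in this spirit is the Benjamini-Hochberg false discovery rate step-up applied to the grid of local p-values; the field is declared significant when any local rejections survive. A minimal sketch with hypothetical per-cell p-values:

```python
def fdr_rejections(p_values, alpha=0.1):
    """Benjamini-Hochberg step-up: return indices of local tests rejected
    while controlling the false discovery rate at level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        # reject the k smallest p-values, where k is the largest rank
        # with p_(k) <= alpha * k / m
        if p_values[i] <= alpha * rank / m:
            k_max = rank
    return sorted(order[:k_max])

# Hypothetical local p-values from per-grid-cell (resampling) tests
p = [0.001, 0.008, 0.039, 0.041, 0.27, 0.61, 0.9, 0.015]
sig_cells = fdr_rejections(p, alpha=0.1)
print(sig_cells)  # → [0, 1, 2, 3, 7]
```

Because the rejection set is controlled jointly, the spatial pattern of the surviving cells can be interpreted directly, which is the point the abstract makes about meaningful local patterns.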
The relativistic feedback discharge model of terrestrial gamma ray flashes
NASA Astrophysics Data System (ADS)
Dwyer, Joseph R.
2012-02-01
As thunderclouds charge, the large-scale fields may approach the relativistic feedback threshold, above which the production of relativistic runaway electron avalanches becomes self-sustaining through the generation of backward propagating runaway positrons and backscattered X-rays. Positive intracloud (IC) lightning may force the large-scale electric fields inside thunderclouds above the relativistic feedback threshold, causing the number of runaway electrons, and the resulting X-ray and gamma ray emission, to grow exponentially, producing very large fluxes of energetic radiation. As the flux of runaway electrons increases, ionization eventually causes the electric field to discharge, bringing the field below the relativistic feedback threshold again and reducing the flux of runaway electrons. These processes are investigated with a new model that includes the production, propagation, diffusion, and avalanche multiplication of runaway electrons; the production and propagation of X-rays and gamma rays; and the production, propagation, and annihilation of runaway positrons. In this model, referred to as the relativistic feedback discharge model, the large-scale electric fields are calculated self-consistently from the charge motion of the drifting low-energy electrons and ions, produced from the ionization of air by the runaway electrons, including two- and three-body attachment and recombination. Simulation results show that when relativistic feedback is considered, bright gamma ray flashes are a natural consequence of upward +IC lightning propagating in large-scale thundercloud fields. Furthermore, these flashes have the same time structures, including both single and multiple pulses, intensities, angular distributions, current moments, and energy spectra as terrestrial gamma ray flashes, and produce large current moments that should be observable in radio waves.
Space transportation booster engine thrust chamber technology, large scale injector
NASA Technical Reports Server (NTRS)
Schneider, J. A.
1993-01-01
The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.
Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.
Tran, Ngoc Tam L; Huang, Chun-Hsi
2017-05-01
We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool is an extended version of our web-based tool version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable with the expandable AWS resources.
Nelson, Jason M; Lindstrom, Will; Foels, Patricia A
2015-01-01
Test anxiety and its correlates were examined with college students with and without specific reading disability (RD; n = 50 in each group). Results indicated that college students with RD reported higher test anxiety than did those without RD, and the magnitude of these differences was in the medium range on two test anxiety scales. Relative to college students without RD, up to 5 times as many college students with RD reported clinically significant test anxiety. College students with RD reported significantly higher cognitively based test anxiety than physically based test anxiety. Reading skills, verbal ability, and processing speed were not correlated with test anxiety. General intelligence, nonverbal ability, and working memory were negatively correlated with test anxiety, and the magnitude of these correlations was medium to large. When these three cognitive constructs were considered together in multiple regression analyses, only working memory and nonverbal ability emerged as significant predictors and varied based on the test anxiety measure. Implications for assessment and intervention are discussed. © Hammill Institute on Disabilities 2013.
Validating Bayesian truth serum in large-scale online human experiments.
Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad
2017-01-01
Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be shown to be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
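In Prelec's formulation, each respondent both answers the question and predicts the population's answer frequencies; the BTS score combines an information score (is the endorsed answer surprisingly common relative to the geometric mean of everyone's predictions?) with a prediction score penalizing predictions that diverge from the empirical frequencies. A minimal sketch with hypothetical data:

```python
from math import log

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian truth serum scores.
    answers[r]: index of the choice endorsed by respondent r.
    predictions[r][k]: r's predicted population frequency of choice k."""
    n = len(answers)
    m = len(predictions[0])
    eps = 1e-9  # guard against log(0)
    xbar = [max(sum(a == k for a in answers) / n, eps) for k in range(m)]
    # log of the geometric mean of predicted frequencies for each choice
    log_ybar = [sum(log(max(predictions[r][k], eps)) for r in range(n)) / n
                for k in range(m)]
    scores = []
    for r in range(n):
        info = log(xbar[answers[r]]) - log_ybar[answers[r]]
        pred = sum(xbar[k] * (log(max(predictions[r][k], eps)) - log(xbar[k]))
                   for k in range(m))
        scores.append(info + alpha * pred)
    return scores

# Three respondents, two choices; data invented for the example
answers = [0, 0, 1]
predictions = [[0.7, 0.3], [0.6, 0.4], [0.5, 0.5]]
print([round(s, 3) for s in bts_scores(answers, predictions)])
```

The prediction score is a negative KL divergence, so it is zero exactly when a respondent predicts the empirical frequencies; truthful answering is the Bayesian Nash equilibrium this scoring is designed to reward, which is why the method's guarantees lean on large samples.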
GPA, GMAT, and Scale: A Method of Quantification of Admissions Criteria.
ERIC Educational Resources Information Center
Sobol, Marion G.
1984-01-01
Multiple regression analysis was used to establish a scale, measuring college student involvement in campus activities, work experience, technical background, references, and goals. This scale was tested to see whether it improved the prediction of success in graduate school. (Author/MLW)
NASA Astrophysics Data System (ADS)
Menge, B. A.; Gouhier, T.; Chan, F.; Hacker, S.; Menge, D.; Nielsen, K. J.
2016-02-01
Ecology focuses increasingly on the issue of matching spatial and temporal scales responsible for ecosystem pattern and dynamics. Benthic coastal communities traditionally were studied at local scales using mostly short-term research, while environmental (oceanographic, climatic) drivers were investigated at large scales (e.g., regional to oceanic, mostly offshore) using combined snapshot and monitoring (time series) research. The comparative-experimental approach combines local-scale studies at multiple sites spanning large-scale environmental gradients in combination with monitoring of inner shelf oceanographic conditions including upwelling/downwelling wind forcing and their consequences (e.g., temperature), and inputs of subsidies (larvae, phytoplankton, detritus). Temporal scale varies depending on the questions, but can extend from years to decades. We discuss two examples of rocky intertidal ecosystem dynamics, one at a regional scale (California Current System, CCS) and one at an interhemispheric scale. In the upwelling-dominated CCS, 52% and 32% of the variance in local community structure (functional group abundances at 13 sites across 725 km) was explained by external factors (ecological subsidies, oceanographic conditions, geographic location), and species interactions, respectively. The interhemispheric study tested the intermittent upwelling hypothesis (IUH), which predicts that key ecological processes will vary unimodally along a persistent downwelling to persistent upwelling gradient. Using 14-22 sites, unimodal relationships between ecological subsidies (phytoplankton, prey recruitment), prey responses (barnacle colonization, mussel growth) and species interactions (competition rate, predation rate and effect) and the Bakun upwelling index calculated at each site accounted for 50% of the variance. Hence, external factors can account for about half of locally-expressed community structure and dynamics.
Multiple Object Retrieval in Image Databases Using Hierarchical Segmentation Tree
ERIC Educational Resources Information Center
Chen, Wei-Bang
2012-01-01
The purpose of this research is to develop a new visual information analysis, representation, and retrieval framework for automatic discovery of salient objects of user's interest in large-scale image databases. In particular, this dissertation describes a content-based image retrieval framework which supports multiple-object retrieval. The…
Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course
ERIC Educational Resources Information Center
Gallagher, Silvia Elena; Savage, Timothy
2015-01-01
Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Pollock, Steven J.
2014-01-01
Free-response research-based assessments, like the Colorado Upper-division Electrostatics Diagnostic (CUE), provide rich, fine-grained information about students' reasoning. However, because of the difficulties inherent in scoring these assessments, the majority of the large-scale conceptual assessments in physics are multiple choice. To increase…
Floating Data and the Problem with Illustrating Multiple Regression.
ERIC Educational Resources Information Center
Sachau, Daniel A.
2000-01-01
Discusses how to introduce basic concepts of multiple regression by creating a large-scale, three-dimensional regression model using the classroom walls and floor. Addresses teaching points that should be covered and reveals student reaction to the model. Finds that the greatest benefit of the model is the low fear, walk-through, nonmathematical…
Achieving ultra-high temperatures with a resistive emitter array
NASA Astrophysics Data System (ADS)
Danielson, Tom; Franks, Greg; Holmes, Nicholas; LaVeigne, Joe; Matis, Greg; McHugh, Steve; Norton, Dennis; Vengel, Tony; Lannon, John; Goodwin, Scott
2016-05-01
The rapid development of very-large format infrared detector arrays has challenged the IR scene projector community to also develop larger-format infrared emitter arrays to support the testing of systems incorporating these detectors. In addition to larger formats, many scene projector users require much higher simulated temperatures than can be generated with current technology in order to fully evaluate the performance of their systems and associated processing algorithms. Under the Ultra High Temperature (UHT) development program, Santa Barbara Infrared Inc. (SBIR) is developing a new infrared scene projector architecture capable of producing both very large format (>1024 x 1024) resistive emitter arrays and improved emitter pixel technology capable of simulating very high apparent temperatures. During earlier phases of the program, SBIR demonstrated materials with MWIR apparent temperatures in excess of 1400 K. New emitter materials have subsequently been selected to produce pixels that achieve even higher apparent temperatures. Test results from pixels fabricated using the new material set will be presented and discussed. A 'scalable' Read In Integrated Circuit (RIIC) is also being developed under the same UHT program to drive the high temperature pixels. This RIIC will utilize through-silicon via (TSV) and Quilt Packaging (QP) technologies to allow seamless tiling of multiple chips to fabricate very large arrays, and thus overcome the yield limitations inherent in large-scale integrated circuits. Results of design verification testing of the completed RIIC will be presented and discussed.
Multi-scale Homogenization of Caddisfly Metacommunities in Human-modified Landscapes
NASA Astrophysics Data System (ADS)
Simião-Ferreira, Juliana; Nogueira, Denis Silva; Santos, Anna Claudia; De Marco, Paulo; Angelini, Ronaldo
2018-04-01
Stream networks are spatially organized across multiple scales, reflecting the hierarchical arrangement of stream habitats with increasing levels of complexity from sub-catchments to entire hydrographic basins. Across these spatial scales, local stream habitats form nested subsets of increasing landscape scale and habitat size, with varying contributions of alpha and beta diversity to regional diversity. Here, we aimed to test the relative importance of multiple nested hierarchical levels of spatial scale in determining alpha and beta diversity of caddisflies in regions with different levels of landscape degradation in a core Cerrado area in Brazil. We used quantitative environmental variables to test the hypothesis that landscape homogenization affects the contribution of caddisfly alpha and beta diversity to regional diversity. We found that the contributions of alpha and beta diversity to gamma diversity varied with landscape degradation. Sub-catchments with more intense agriculture had lower diversity at multiple levels, most markedly alpha and beta diversity. We also found that environmental predictors mainly associated with water quality, channel size, and habitat integrity (lower scores indicate stream degradation) were related to community dissimilarity at the catchment scale. For effective management of headwater caddisfly biodiversity and conservation of these catchments, heterogeneous streams with more pristine riparian vegetation within the river basin need to be preserved in protected areas. Additionally, in the most degraded areas, restoration of riparian vegetation and enlargement of protected areas will be needed.
Impact of playing American professional football on long-term brain function.
Amen, Daniel G; Newberg, Andrew; Thatcher, Robert; Jin, Yi; Wu, Joseph; Keator, David; Willeumier, Kristen
2011-01-01
The authors recruited 100 active and former National Football League players, representing 27 teams and all positions. Players underwent a clinical history, brain SPECT imaging, qEEG, and multiple neuropsychological measures, including MicroCog. Relative to a healthy-comparison group, players showed global decreased perfusion, especially in the prefrontal, temporal, parietal, and occipital lobes, and cerebellar regions. Quantitative EEG findings were consistent, showing elevated slow waves in the frontal and temporal regions. Significant decreases from normal values were found in most neuropsychological tests. This is the first large-scale brain-imaging study to demonstrate significant differences consistent with a chronic brain trauma pattern in professional football players.
Recent Developments in Language Assessment and the Case of Four Large-Scale Tests of ESOL Ability
ERIC Educational Resources Information Center
Stoynoff, Stephen
2009-01-01
This review article surveys recent developments and validation activities related to four large-scale tests of L2 English ability: the iBT TOEFL, the IELTS, the FCE, and the TOEIC. In addition to describing recent changes to these tests, the paper reports on validation activities that were conducted on the measures. The results of this research…
NASA Technical Reports Server (NTRS)
Jackson, Karen E.
1990-01-01
Scale model technology represents one method of investigating the behavior of advanced, weight-efficient composite structures under a variety of loading conditions. It is necessary, however, to understand the limitations involved in testing scale model structures before the technique can be fully utilized. These limitations, or scaling effects, are characterized in the large deflection response and failure of composite beams. Scale model beams were loaded with an eccentric axial compressive load designed to produce large bending deflections and global failure. A dimensional analysis was performed on the composite beam-column loading configuration to determine a model law governing the system response. An experimental program was developed to validate the model law under both static and dynamic loading conditions. Laminate stacking sequences including unidirectional, angle ply, cross ply, and quasi-isotropic were tested to examine a diversity of composite response and failure modes. The model beams were loaded under scaled test conditions until catastrophic failure. A large deflection beam solution was developed to compare with the static experimental results and to analyze beam failure. Also, the finite element code DYCAST (DYnamic Crash Analysis of STructures) was used to model both the static and impulsive beam response. Static test results indicate that the unidirectional and cross ply beam responses scale as predicted by the model law, even under severe deformations. In general, failure modes were consistent between scale models within a laminate family; however, a significant scale effect was observed in strength. The scale effect in strength that was evident in the static tests was also observed in the dynamic tests. Scaling of load and strain time histories between the scale model beams and the prototypes was excellent for the unidirectional beams, but inconsistent results were obtained for the angle ply, cross ply, and quasi-isotropic beams.
Results show that valuable information can be obtained from testing on scale model composite structures, especially in the linear elastic response region. However, due to scaling effects in the strength behavior of composite laminates, caution must be used in extrapolating data taken from a scale model test when that test involves failure of the structure.
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Chen, Y.; Cutler, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Ren, Y.; Yamada, M.; Yoo, J.
2015-12-01
The FLARE device (flare.pppl.gov) is a new intermediate-scale plasma experiment under construction at Princeton to study magnetic reconnection in regimes directly relevant to space, solar, and astrophysical plasmas. Existing small-scale experiments have focused on the single X-line reconnection process either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. The configuration of the FLARE device is designed to provide experimental access to new regimes involving multiple X-lines, as guided by a reconnection "phase diagram" [Ji & Daughton, PoP (2011)]. Most of the major components of the FLARE device have been designed and are under construction. The device will be assembled and installed in 2016, followed by commissioning and operation in 2017. The planned research on FLARE as a user facility will be discussed, on topics including the multiple-scale nature of magnetic reconnection from global fluid scales to ion and electron kinetic scales. Results from scoping simulations based on particle and fluid codes and possible comparative research with space measurements will be presented.
Phulwaria, Mahendra; Rai, Manoj K.; Patel, Ashok Kumar; Kataria, Vinod; Shekhawat, N. S.
2012-01-01
Celastrus paniculatus, belonging to the family Celastraceae, is an important medicinal plant of India. Owing to the ever-increasing demand from the pharmaceutical industry, the species is being overexploited, thereby threatening its stock in the wild. Poor seed viability coupled with low germination restricts its propagation through sexual means. Thus, alternative approaches such as in vitro techniques are highly desirable for large-scale propagation of this medicinally important plant. Nodal segments, obtained from a 12-year-old mature plant, were used as explants for multiple shoot induction. Shoot multiplication was achieved by repeated transfer of mother explants and subculturing of in vitro produced shoot clumps on Murashige and Skoog's (MS) medium supplemented with various concentrations of 6-benzylaminopurine (BAP) alone or in combination with auxin (indole-3-acetic acid (IAA) or α-naphthalene acetic acid (NAA)). The maximum number of shoots (47.75 ± 2.58) was observed on MS medium supplemented with BAP (0.5 mg L−1) and IAA (0.1 mg L−1). In vitro raised shoots were rooted under ex vitro conditions after treating them with indole-3-butyric acid (300 mg L−1) for 3 min. Over 95% of plantlets acclimatized successfully. The genetic fidelity of the regenerated plants was assessed using random amplified polymorphic DNA. No polymorphism was detected between the regenerated plants and the mother plant, confirming the genetic fidelity of the in vitro raised plantlets. The protocol described could be effectively employed for large-scale multiplication of C. paniculatus and for its commercial supply, including to the State Forest Department.
Large-scale wind tunnel tests of a sting-supported V/STOL fighter model at high angles of attack
NASA Technical Reports Server (NTRS)
Stoll, F.; Minter, E. A.
1981-01-01
A new sting model support has been developed for the NASA/Ames 40- by 80-Foot Wind Tunnel. This addition to the facility permits testing of relatively large models to large angles of attack or angles of yaw depending on model orientation. An initial test on the sting is described. This test used a 0.4-scale powered V/STOL model designed for testing at angles of attack to 90 deg and greater. A method for correcting wake blockage was developed and applied to the force and moment data. Samples of this data and results of surface-pressure measurements are presented.
Fine-scale characteristics of interplanetary sector boundaries
NASA Technical Reports Server (NTRS)
Behannon, K. W.; Neubauer, F. M.; Barnstoff, H.
1980-01-01
The structure of the interplanetary sector boundaries observed by Helios 1 within sector transition regions was studied. Such regions consist in some cases of intermediate (nonspiral) average field orientations, as well as a number of large-angle directional discontinuities (DD's) on the fine scale (time scales of 1 hour). Such DD's are found to be more similar to tangential than to rotational discontinuities, to be oriented on average more nearly perpendicular than parallel to the ecliptic plane, to be accompanied usually by a large dip (80%) in B, and, with a most probable thickness of 3 × 10⁴ km, to be significantly thicker than those previously studied. It is hypothesized that the observed structures represent multiple traversals of the global heliospheric current sheet due to local fluctuations in the position of the sheet. There is evidence that such fluctuations are sometimes produced by wavelike motions or surface corrugations of scale length 0.05-0.1 AU superimposed on the large-scale structure.
Homogenization techniques for population dynamics in strongly heterogeneous landscapes.
Yurk, Brian P; Cobbold, Christina A
2018-12-01
An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity is influenced by patch residence times that depend on patch preference, as well as movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from the small-scale features.
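As a rough illustration of the homogenization idea summarized in this abstract, the sketch below computes an effective diffusion coefficient for a periodic two-patch landscape as the length-weighted harmonic mean of patch diffusivities. This is a classical textbook homogenization result and an assumption here, not the authors' more general coefficients, which also account for interface (habitat-edge) behaviour.

```python
# Hedged sketch: effective (homogenized) diffusion coefficient for a
# periodic landscape made of patches with different diffusivities.
# Classical result: the large-scale coefficient is the length-weighted
# harmonic mean over one spatial period.

def effective_diffusion(widths, diffusivities):
    """Length-weighted harmonic mean of patch diffusivities over one period."""
    period = sum(widths)
    return period / sum(w / d for w, d in zip(widths, diffusivities))

# Two equal-width patches, one four times more diffusive than the other
D_eff = effective_diffusion([1.0, 1.0], [4.0, 1.0])
```

Note that the harmonic mean (here 1.6) is dominated by the slow patch, which is why slow patches disproportionately limit large-scale spread.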
ERIC Educational Resources Information Center
Yonker, Julie E.
2011-01-01
With the advent of online test banks and large introductory classes, instructors have often turned to textbook publisher-generated multiple-choice question (MCQ) exams in their courses. Multiple-choice questions are often divided into categories of factual or applied, thereby implicating levels of cognitive processing. This investigation examined…
Land Systems Impacts of Hydropower Development
NASA Astrophysics Data System (ADS)
Wu, G. C.; Torn, M. S.
2016-12-01
Hydropower is often seen as the low-cost, low-carbon, and high-return technology for meeting rising electricity demand and fueling economic growth. Despite the magnitude and pace of hydropower expansion in many developing countries, the potential land use and land cover change (LULCC), particularly indirect LULCC, resulting from hydropower development is poorly understood. Hydropower-driven LULCC can have multiple impacts ranging from global and local climate modification (e.g., increased extreme precipitation events or increased greenhouse gas emissions), ecosystem degradation and fragmentation, to feedbacks on hydropower generation (e.g., increased sedimentation of the reservoir). As a result, a better understanding of both direct and indirect LULCC impacts can inform a more integrated and low-impact model for energy planning in countries with transitioning or growing energy portfolios. This study uses multi-scale remote sensing imagery (Landsat, MODIS, fine-resolution commercial imagery) to estimate LULCC from past hydropower projects intended primarily for electricity generation in 12 countries in Africa, South and Central America, South Asia, and Southeast Asia. It is important to examine multiple locations to determine how socio-political and environmental context determines the magnitude of LULCC. Previous studies have called for the need to scale-up local case studies to examine "cumulative impacts" of multiple development activities within a watershed. We use a pre-test/post-test quasi-experimental design using a time series of classified images and vegetation indices before and after hydropower plant construction as the response variable in an interrupted time series regression analysis. This statistical technique measures the "treatment" effect of hydropower development on indirect LULCC. Preliminary results show land use change and landscape fragmentation following hydropower development, primarily agricultural and urban in nature. 
These results suggest that indirect land use change should be considered in the energy planning process and design of environmental impact assessments. The large-scale land system impact assessment method used in this study can be extended to examine other intensive development projects such as road construction and mining.
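The interrupted time series regression used above can be sketched as follows. This is a minimal illustration with synthetic data; the model form (level-shift plus slope-change at an intervention time) and all numbers are assumptions, not the study's actual specification.

```python
import numpy as np

# Minimal interrupted-time-series sketch (assumed model form):
# y = b0 + b1*t + b2*step + b3*(t - t0)*step,
# where step = 1 after a hypothetical intervention at time t0.
t = np.arange(20, dtype=float)
t0 = 10
step = (t >= t0).astype(float)

rng = np.random.default_rng(0)
# Synthetic vegetation-index series with a true level drop of 2.0 at t0
y = 5.0 + 0.1 * t - 2.0 * step + rng.normal(0.0, 0.01, t.size)

X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = coef[2]   # estimated "treatment" effect on the level
```

The coefficient on the step term estimates the abrupt change attributable to the intervention, which is the quantity the quasi-experimental design above targets.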
Solar Confocal Interferometers for Sub-Picometer-Resolution Spectral Filters
NASA Technical Reports Server (NTRS)
Gary, G. Allen; Pietraszewski, Chris; West, Edward A.; Dines, Terence C.
2006-01-01
The confocal Fabry-Perot interferometer allows sub-picometer spectral resolution of Fraunhofer line profiles. Such high spectral resolution is needed to keep pace with the higher spatial resolution of the new set of large-aperture solar telescopes. The line-of-sight spatial resolution derived for line profile inversions would then track the improvements of the transverse spatial scale provided by the larger apertures. The confocal interferometer's unique properties allow a simultaneous increase in both etendue and spectral power. Methods: We have constructed and tested two confocal interferometers. Conclusions: In this paper we compare the confocal interferometer with other spectral imaging filters, provide initial design parameters, show construction details for two designs, and report on the laboratory test results for these interferometers, and propose a multiple etalon system for future testing of these units and to obtain sub-picometer spectral resolution information on the photosphere in both the visible and near-infrared.
Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente
2016-08-01
In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical-site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimation (i.e., estimate based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model showed 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The number of occupied sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results. 
Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population parameters. © 2016 Society for Conservation Biology.
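A much-simplified sketch of why the naive estimate above (sites where reproduction was actually detected) understates occupancy: with imperfect detection, an occupied site is missed with some probability. The closed-form correction below assumes a single survey method with a known per-survey detection probability over repeat surveys; the paper itself uses a hierarchical Bayesian multimethod model, and the numbers here are hypothetical.

```python
# Naive occupancy vs. detection-corrected occupancy under the simplest
# single-method model: an occupied site yields at least one detection
# across K surveys with probability p* = 1 - (1 - p)^K.

def corrected_occupancy(naive_prop, p, K):
    p_star = 1.0 - (1.0 - p) ** K   # prob. of >=1 detection at an occupied site
    return naive_prop / p_star

# e.g. reproduction detected at 25% of sites, p = 0.3 per survey, K = 3
psi_hat = corrected_occupancy(0.25, 0.3, 3)
```

Because p* < 1, the corrected proportion always exceeds the naive one, mirroring the gap between the naive counts (9 and 25 sites) and the model estimates (about 33 and 34 sites) reported above.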
Braga, Virgínia F; Mendes, Giselle C; Oliveira, Raphael T R; Soares, Carla Q G; Resende, Cristiano F; Pinto, Leandro C; Santana, Reinaldo de; Viccini, Lyderson F; Raposo, Nádia R B; Peixoto, Paulo H P
2012-03-01
This work describes an efficient micropropagation protocol for Verbena litoralis and a study of the antinociceptive and antioxidant activities of extracts of this species. For in vitro establishment, surface-sterilization procedures and PVPP were highly effective in controlling fungal and bacterial contamination and phenol oxidation. Cultivation of nodal segments on MS medium supplemented with 6-benzyladenine (7.5 µM)/α-naphthaleneacetic acid (NAA; 0.005 µM) induced multiple shoots. Elongated shoots were rooted with IAA (0.2 µM). Acclimatization rates were high and the plants showed the typical features of this species. The hexanic fraction (HF) of powdered leaves presented radical scavenging activity with IC₅₀ = 169.3 µg mL⁻¹. HF showed a non-dose-dependent analgesic activity in the writhing test; its antinociceptive activity in the hot plate test was restricted to 500 mg kg⁻¹, the highest dose tested. The results of this study showed the potential of tissue culture for conservation and large-scale multiplication and confirmed the traditional folk medicine use of V. litoralis.
Numerical propulsion system simulation
NASA Technical Reports Server (NTRS)
Lytle, John K.; Remaklus, David A.; Nichols, Lester D.
1990-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept described here will provide that framework.
NASA Astrophysics Data System (ADS)
Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.
2014-04-01
Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
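The decision-scaling logic described above (find the climate change that breaks the system first, then overlay projections) can be sketched with a toy single-reservoir model. Everything below, including the mass-balance model, inflow series, demand, capacity, and reliability target, is hypothetical and far simpler than the Melbourne bulk supply system simulation in the study.

```python
# Toy decision-scaling sketch: stress-test a reservoir over a grid of
# mean-inflow scaling factors and find the critical factor at which
# supply reliability first drops below a target.

def reliability(inflows, demand, capacity):
    """Fraction of periods in which full demand is met (simple mass balance)."""
    storage, met = capacity / 2, 0
    for q in inflows:
        storage = min(capacity, storage + q)   # inflow, spill above capacity
        supplied = min(demand, storage)
        storage -= supplied
        met += supplied >= demand
    return met / len(inflows)

base = [8, 12, 6, 14, 9, 11, 7, 13, 10, 10.0]  # synthetic inflow series
demand, capacity, target = 9.0, 20.0, 0.9

critical = None
for f in [x / 100 for x in range(100, 49, -1)]:  # scale mean inflow 1.00 -> 0.50
    if reliability([f * q for q in base], demand, capacity) < target:
        critical = f   # first inflow reduction that breaches the target
        break
```

Climate projections would then be plotted against `critical` in climatic units: only the projections drier than the decision threshold compel intervention, and no hydrological model runs per projection are needed.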
NASA Astrophysics Data System (ADS)
Mateo, Cherry May R.; Yamazaki, Dai; Kim, Hyungjun; Champathong, Adisorn; Vaze, Jai; Oki, Taikan
2017-10-01
Global-scale river models (GRMs) are core tools for providing consistent estimates of global flood hazard, especially in data-scarce regions. Due to former limitations in computational power and input datasets, most GRMs have been developed to use simplified representations of flow physics and run at coarse spatial resolutions. With increasing computational power and improved datasets, the application of GRMs to finer resolutions is becoming a reality. To support development in this direction, the suitability of GRMs for application to finer resolutions needs to be assessed. This study investigates the impacts of spatial resolution and flow connectivity representation on the predictive capability of a GRM, CaMa-Flood, in simulating the 2011 extreme flood in Thailand. Analyses show that when single downstream connectivity (SDC) is assumed, simulation results deteriorate with finer spatial resolution; Nash-Sutcliffe efficiency coefficients decreased by more than 50 % between simulation results at 10 km resolution and 1 km resolution. When multiple downstream connectivity (MDC) is represented, simulation results slightly improve with finer spatial resolution. The SDC simulations result in excessive backflows on very flat floodplains due to the restrictive flow directions at finer resolutions. MDC channels attenuated these effects by maintaining flow connectivity and flow capacity between floodplains in varying spatial resolutions. While a regional-scale flood was chosen as a test case, these findings should be universal and may have significant impacts on large- to global-scale simulations, especially in regions where mega deltas exist. These results demonstrate that a GRM can be used for higher resolution simulations of large-scale floods, provided that MDC in rivers and floodplains is adequately represented in the model structure.
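The Nash-Sutcliffe efficiency used above as the skill metric has a standard definition (1 minus the ratio of model error variance to observed variance); a minimal implementation with synthetic discharge numbers:

```python
# Nash-Sutcliffe efficiency (NSE): 1.0 is a perfect match, 0.0 means the
# model is no better than predicting the observed mean.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Synthetic discharge series (illustrative values only)
obs = [2.0, 3.0, 5.0, 9.0, 6.0, 4.0]
sim = [2.2, 2.8, 5.5, 8.4, 6.3, 4.1]
score = nse(obs, sim)
```

A drop of "more than 50 %" in this score between resolutions, as reported above, therefore reflects a large growth in simulation error relative to the natural variability of the observed flows.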
Chu, Dong; Guo, Dong; Tao, Yunli; Jiang, Defeng; Li, Jie; Zhang, Youjun
2014-01-01
The sweetpotato whitefly Bemisia tabaci Q species is a recent invader and important pest of agricultural crops in China. This research tested the hypothesis that the Q populations that establish in agricultural fields in northern China each year are derived from multiple secondary introductions and/or local populations that overwinter in greenhouses (the pest cannot survive winters in the field in northern China). Here, we report the evidence that the Q populations in agricultural fields mainly derive from multiple secondary introductions. In addition, the common use of greenhouses during the winter in certain locations in northern China helps increase the genetic diversity and the genetic structure of the pest. The genetic structure information generated from this long-term and large-scale field analysis increases our understanding of B. tabaci Q as an invasive pest and has important implications for B. tabaci Q management. PMID:24637851
Potential for geophysical experiments in large scale tests.
Dieterich, J.H.
1981-01-01
Potential research applications for large-specimen geophysical experiments include measurements of the scale dependence of physical parameters and examination of interactions with heterogeneities, especially flaws such as cracks. In addition, increased specimen size provides opportunities for improved recording resolution and greater control of experimental variables. Large-scale experiments using a special purpose low stress (100 MPa)…
Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe
2016-07-01
We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, yielding an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to a median NIHSS score of 6 (interquartile range=5) with PASS <2. The PASS scale performed on par with, while being simpler than, other scales predicting ELVO. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
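The scoring rule of the PASS scale described above is simple enough to state as code: three NIHSS-derived items, each scored abnormal (1) or normal (0), with a cut point of ≥2 suggesting ELVO. Item names below are paraphrased from the abstract; this is an illustration of the arithmetic, not a clinical tool.

```python
# PASS score sketch: three binary items, cut point >= 2 suggests ELVO.

def pass_score(consciousness_abnormal, gaze_deviation, arm_weakness):
    """Sum of the three abnormal/normal (1/0) items."""
    return int(consciousness_abnormal) + int(gaze_deviation) + int(arm_weakness)

def suggests_elvo(score, cutoff=2):
    return score >= cutoff

s = pass_score(True, True, False)   # two of three items abnormal
flag = suggests_elvo(s)             # meets the >= 2 cut point
```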
ERIC Educational Resources Information Center
Waninge, A.; Rook, R. A.; Dijkhuizen, A.; Gielen, E.; van der Schans, C. P.
2011-01-01
Caregivers of persons with profound intellectual and multiple disabilities (PIMD) often describe the quality of the daily movements of these persons in terms of flexibility or stiffness. Objective outcome measures for flexibility and stiffness are muscle tone or level of spasticity. Two instruments used to grade muscle tone and spasticity are the…
Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.
Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro
2013-01-01
Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'-45°7' N; 5°3' W-28° E). We analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contrast with both the traditional view based on the hump-shaped theory of bathymetric patterns and the commonly accepted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial area, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology based on the compilation of pre-existing, heterogeneous, and disparate data sets, particularly when focusing on indices that are very sensitive to sampling-design standardization, such as species richness.
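The kind of index comparison the study describes can be illustrated with a minimal sketch. The three indices below (richness, Shannon, Gini-Simpson) are a hypothetical subset of the 11 analyzed, and the haul data are invented:

```python
import math

def richness(counts):
    """Number of species present in a haul."""
    return sum(1 for c in counts if c > 0)

def shannon(counts):
    """Shannon diversity H' = -sum(p * ln p)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def gini_simpson(counts):
    """Gini-Simpson index 1 - sum(p^2)."""
    n = sum(counts)
    return 1 - sum((c / n) ** 2 for c in counts)

haul = [12, 5, 5, 1, 0]  # invented individuals-per-species for one trawl haul
print(richness(haul))            # 4
print(round(shannon(haul), 2))   # 1.14
```

Computing such indices per haul and correlating them across hauls is one simple way to assess the complementarity/redundancy the abstract refers to.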
Large scale modulation of high frequency acoustic waves in periodic porous media.
Boutin, Claude; Rallu, Antoine; Hans, Stephane
2012-12-01
This paper deals with the description of the large-scale modulation of high-frequency acoustic waves in gas-saturated periodic porous media. High frequencies imply local dynamics at the pore scale and therefore an absence of scale separation in the usual sense of homogenization. However, although the pressure varies spatially within the pores (according to periodic eigenmodes), the mode amplitude can present a large-scale modulation, thereby introducing another type of scale separation to which the asymptotic multi-scale procedure applies. The approach is first presented on a periodic network of interconnected Helmholtz resonators. The equations governing the modulations carried by periodic eigenmodes, at frequencies close to their eigenfrequencies, are derived. The number of cells on which the carrying periodic mode is defined is therefore a parameter of the modeling. In a second part, the asymptotic approach is developed for periodic porous media saturated by a perfect gas. Using the "multicells" periodic condition, one obtains the family of equations governing the large-scale amplitude modulation of high-frequency waves. The significant difference between modulations of simple and multiple modes is evidenced and discussed. The features of the modulation (anisotropy, width of frequency band) are also analyzed.
Higher-Order Factor Structure of the Differential Ability Scales-II: Consistency across Ages 4 to 17
ERIC Educational Resources Information Center
Keith, Timothy Z.; Low, Justin A.; Reynolds, Matthew R.; Patel, Puja G.; Ridley, Kristen P.
2010-01-01
The recently published second edition of the Differential Abilities Scale (DAS-II) is designed to measure multiple broad and general abilities from Cattell-Horn-Carroll (CHC) theory. Although the technical manual presents information supporting the test's structure, additional research is needed to determine the constructs measured by the test and…
NASA Astrophysics Data System (ADS)
Tan, Z.; Schneider, T.; Teixeira, J.; Lam, R.; Pressel, K. G.
2014-12-01
Sub-grid scale (SGS) closures in current climate models are usually decomposed into several largely independent parameterization schemes for different cloud and convective processes, such as boundary layer turbulence, shallow convection, and deep convection. These separate parameterizations usually do not converge as the resolution is increased or as physical limits are taken, which makes it difficult to represent the interactions and smooth transitions among different cloud and convective regimes. Here we present an eddy-diffusivity mass-flux (EDMF) closure that represents all sub-grid scale turbulent, convective, and cloud processes in a unified parameterization scheme. Buoyant updrafts and precipitating downdrafts are parameterized with a prognostic multiple-plume mass-flux (MF) scheme. The prognostic term for the mass flux is retained so that the life cycles of convective plumes are better represented. The interaction between updrafts and downdrafts is parameterized with a buoyancy-sorting model. Turbulent mixing outside the plumes is represented by eddy diffusion, in which the eddy diffusivity (ED) is determined from turbulent kinetic energy (TKE) calculated from a TKE balance that couples the environment with updrafts and downdrafts. Similarly, tracer variances are decomposed consistently between updrafts, downdrafts, and the environment. The closure is internally coupled with a probabilistic cloud scheme and a simple precipitation scheme. We have also developed a relatively simple two-stream radiative scheme that includes the longwave (LW) and shortwave (SW) effects of clouds and the LW effect of water vapor. We have tested this closure in a single-column model for various regimes spanning stratocumulus, shallow cumulus, and deep convection. The model is also run toward statistical equilibrium with climatologically relevant large-scale forcings. These model tests are validated against large-eddy simulations (LES) with the same forcings.
The comparison of results verifies the capacity of this closure to realistically represent different cloud and convective processes. Implementation of the closure in an idealized GCM allows us to study cloud feedbacks to climate change and the interactions between clouds, convection, and the large-scale circulation.
A blended continuous–discontinuous finite element method for solving the multi-fluid plasma model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sousa, E.M., E-mail: sousae@uw.edu; Shumlak, U., E-mail: shumlak@uw.edu
The multi-fluid plasma model represents electrons, multiple ion species, and multiple neutral species as separate fluids that interact through short-range collisions and long-range electromagnetic fields. The model spans a large range of temporal and spatial scales, which renders it stiff and presents numerical challenges. To address the large range of timescales, a blended continuous and discontinuous Galerkin method is proposed, in which the massive ion and neutral species are modeled using an explicit discontinuous Galerkin method while the electrons and electromagnetic fields are modeled using an implicit continuous Galerkin method. This approach is able to capture large-gradient ion and neutral physics, such as shock formation, while resolving high-frequency electron dynamics in a computationally efficient manner. The details of the Blended Finite Element Method (BFEM) are presented. The numerical method is benchmarked for accuracy and tested using a two-fluid one-dimensional soliton problem and an electromagnetic shock problem. The results are compared to conventional finite volume and finite element methods, and demonstrate that the BFEM is particularly effective in resolving physics in stiff problems involving realistic physical parameters, including realistic electron mass and speed of light. The benefit is illustrated by computing a three-fluid plasma application that demonstrates species separation in multi-component plasmas.
GLAD: a system for developing and deploying large-scale bioinformatics grid.
Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong
2005-03-01
Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. To date, developing bioinformatics grid applications has been extremely tedious: component algorithms and parallelization techniques must be designed and implemented for different classes of problems, and remotely located sequence database files of varying formats must be accessed across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
NASA Astrophysics Data System (ADS)
Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron
2017-12-01
We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ∼10^3 processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.
Afshar, Yaser; Sbalzarini, Ivo F.
2016-01-01
Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
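The decomposition step described above can be sketched as a regular tiling with halo overlap. This is a hedged illustration, not the actual Discrete Region Competition implementation; the tile size and one-pixel halo are assumptions:

```python
def tiles(height, width, tile, halo=1):
    """Yield (row0, row1, col0, col1) bounds of overlapping tiles covering
    a height x width image; the halo lets neighbouring tiles share pixels."""
    for r in range(0, height, tile):
        for c in range(0, width, tile):
            yield (max(r - halo, 0), min(r + tile + halo, height),
                   max(c - halo, 0), min(c + tile + halo, width))

# An 8x8 "image" split into 4x4 tiles with a 1-pixel halo; each bounds tuple
# would be shipped to a different worker in the distributed setting.
bounds = list(tiles(8, 8, 4))
print(len(bounds))  # 4
print(bounds[0])    # (0, 5, 0, 5)
```

The overlap is what lets workers exchange boundary information so that the per-tile results can be stitched into a globally consistent segmentation.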
Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data ...
Andrew P. Kinziger; Rodney J. Nakamoto; Bret C. Harvey
2014-01-01
Given the general pattern of invasions with severe ecological consequences commonly resulting from multiple introductions of large numbers of individuals on the intercontinental scale, we explored an example of a highly successful, ecologically significant invader introduced over a short distance, possibly via minimal propagule pressure. The Sacramento pikeminnow (
USDA-ARS?s Scientific Manuscript database
Many of the most dramatic and surprising effects of global change on ecological systems will occur across large spatial extents, from regions to continents. Multiple ecosystem types will be impacted across a range of interacting spatial and temporal scales. The ability of ecologists to understand an...
In 1990, EMAP's Coastal Monitoring Program conducted its first regional sampling program in the Virginian Province. This first effort focused only at large spatial scales (regional) with some stratification to examine estuarine types. In the ensuing decade, EMAP-Coastal has condu...
ERIC Educational Resources Information Center
Fantuzzo, John; Perlman, Staci; Sproul, Faith; Minney, Ashley; Perry, Marlo A.; Li, Feifei
2012-01-01
The study developed multiple independent scales of early childhood teacher experiences (ECTES). ECTES was co-constructed with preschool, kindergarten, and first grade teachers in a large urban school district. Demographic, ECTES, and teaching practices data were collected from 584 teachers. Factor analyses documented three teacher experience…
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.
1982-01-01
The Jet Propulsion Laboratory has developed a number of photovoltaic test and measurement specifications to guide the development of modules toward the requirements of future large-scale applications. Experience with these specifications and the extensive module measurement and testing that has accompanied their use is examined. Conclusions are drawn relative to three aspects of product certification: performance measurement, endurance testing and safety evaluation.
Large-scale data analysis of power grid resilience across multiple US service regions
NASA Astrophysics Data System (ADS)
Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert
2016-05-01
Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionately large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
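The concentration statistic quoted above ("top 20% of failures interrupted 84% of services") can be computed from per-failure customer counts as follows; the sample sizes here are invented for illustration:

```python
def top_share(failure_sizes, q=0.2):
    """Fraction of all customer interruptions caused by the largest
    q-fraction of failures."""
    sizes = sorted(failure_sizes, reverse=True)
    k = max(1, round(q * len(sizes)))
    return sum(sizes[:k]) / sum(sizes)

# Invented per-failure customer counts: a few very large failures and many
# small ones, mimicking the heavy-tailed pattern reported in the study.
sizes = [5000, 1200, 300, 80, 60, 40, 20, 10, 5, 5]
print(round(top_share(sizes, 0.2), 2))  # 0.92
```

With the invented data, the top 20% of failures account for about 92% of customer interruptions, illustrating the same heavy-tailed concentration the study reports.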
Langner, Robert; Cieslik, Edna C.; Rottschy, Claudia; Eickhoff, Simon B.
2016-01-01
Cognitive flexibility, a core aspect of executive functioning, is required for the speeded shifting between different tasks and sets. Using an interindividual differences approach, we examined whether cognitive flexibility, as assessed by the Delis–Kaplan card-sorting test, is associated with gray matter volume (GMV) and functional connectivity (FC) of regions of a core network of multiple cognitive demands as well as with different facets of trait impulsivity. The core multiple-demand network was derived from three large-scale neuroimaging meta-analyses and only included regions that showed consistent associations with sustained attention, working memory as well as inhibitory control. We tested to what extent self-reported impulsivity as well as GMV and resting-state FC in this core network predicted cognitive flexibility independently and incrementally. Our analyses revealed that card-sorting performance correlated positively with GMV of the right anterior insula, FC between bilateral anterior insula and midcingulate cortex/supplementary motor area as well as the impulsivity dimension “Premeditation.” Importantly, GMV, FC and impulsivity together accounted for more variance of card-sorting performance than every parameter alone. Our results therefore indicate that various factors contribute individually to cognitive flexibility, underlining the need to search across multiple modalities when aiming to unveil the mechanisms behind executive functioning. PMID:24878823
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed, paraffin-embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R² = 0.94) and immuno-MRM (R² = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
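The median-precision summary quoted above can be reproduced in outline: compute a coefficient of variation (CV%) per analyte from replicate measurements, then take the median across analytes. The replicate values below are invented:

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements of one analyte."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Invented replicate peak areas for two hypothetical peptide analytes.
analytes = {
    "peptide_A": [10.0, 11.0, 10.5],
    "peptide_B": [200.0, 210.0, 190.0],
}
median_cv = statistics.median(cv_percent(v) for v in analytes.values())
print(round(median_cv, 1))  # median CV% across analytes
```

In the study this summary was taken over 249 analytes, yielding the reported 11.4% median precision.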
Robbins, Blaine
2013-01-01
Sociologists, political scientists, and economists all suggest that culture plays a pivotal role in the development of large-scale cooperation. In this study, I used generalized trust as a measure of culture to explore if and how culture impacts intentional homicide, my operationalization of cooperation. I compiled multiple cross-national data sets and used pooled time-series linear regression, single-equation instrumental-variables linear regression, and fixed- and random-effects estimation techniques on an unbalanced panel of 118 countries and 232 observations spread over a 15-year time period. Results suggest that culture and large-scale cooperation form a tenuous relationship, while economic factors such as development, inequality, and geopolitics appear to drive large-scale cooperation.
A visualization tool to support decision making in environmental and biological planning
Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.
2014-01-01
Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.
New architecture for utility scale electricity from concentrator photovoltaics
NASA Astrophysics Data System (ADS)
Angel, Roger; Connors, Thomas; Davison, Warren; Olbert, Blain; Sivanandam, Suresh
2010-08-01
The paper describes a new system architecture optimized for utility-scale generation with concentrating photovoltaic (CPV) cells at a cost competitive with fossil fuel. We report on-sun tests of the architecture and the development, at the University of Arizona, of manufacturing processes adapted for high-volume production. The new system takes advantage of triple-junction cells to convert concentrated sunlight into electricity. These commercially available cells have twice the conversion efficiency of silicon panels (40%) and one-tenth the cost per watt when used at 1000x concentration. Telescope technology is adapted to deliver concentrated light to the cells at minimum cost. The architecture combines three novel elements: large (3.1 m x 3.1 m square) dish reflectors made as back-silvered glass monoliths; 2.5 kW receivers at each dish focus, each incorporating a spherical field lens to deliver uniform illumination to multiple cells; and a lightweight steel spaceframe structure to hold multiple dish/receiver units in coalignment and oriented to the sun. Development of the process for replicating single-piece reflector dishes is well advanced at the Steward Observatory Mirror Lab. End-to-end system tests have been completed with single cells. A lightweight steel spaceframe to hold and track eight dish/receiver units to generate 20 kW has been completed. A single 2.5 kW receiver is presently under construction and is expected to be operated in an end-to-end on-sun test with a monolithic dish before the end of 2010. The University of Arizona has granted an exclusive license to REhnu, LLC to commercialize this technology.
Development of fire test methods for airplane interior materials
NASA Technical Reports Server (NTRS)
Tustin, E. A.
1978-01-01
Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). The design post-crash and in-flight fire sources were selected on the basis of these data. Large panels of airplane interior materials were exposed to closely controlled large-scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplus 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large-scale and laboratory-scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke, and gaseous toxicant evolution must be considered.
High-Stakes Accountability: Student Anxiety and Large-Scale Testing
ERIC Educational Resources Information Center
von der Embse, Nathaniel P.; Witmer, Sara E.
2014-01-01
This study examined the relationship between student anxiety about high-stakes testing and their subsequent test performance. The FRIEDBEN Test Anxiety Scale was administered to 1,134 11th-grade students, and data were subsequently collected on their statewide assessment performance. Test anxiety was a significant predictor of test performance…
Hebert, J R; Clemow, L; Pbert, L; Ockene, I S; Ockene, J K
1995-04-01
Self-report of dietary intake could be biased by social desirability or social approval, thus affecting risk estimates in epidemiological studies. These constructs produce response-set biases, which are evident when testing in domains characterized by easily recognizable correct or desirable responses. Given the social and psychological value ascribed to diet, the assessment methodologies used most commonly in epidemiological studies are particularly vulnerable to these biases. Social desirability and social approval biases were tested by comparing nutrient scores derived from multiple 24-hour diet recalls (24HR) on seven randomly assigned days with those from two 7-day diet recalls (7DDR) (similar in some respects to commonly used food frequency questionnaires), one administered at the beginning of the test period (pre) and one at the end (post). Statistical analysis included correlation and multiple linear regression. Cross-sectionally, no relationships between social approval score and the nutritional variables existed. Social desirability score was negatively correlated with most nutritional variables. In linear regression analysis, social desirability score produced a large downward bias in nutrient estimation in the 7DDR relative to the 24HR. For total energy, this bias equalled about 50 kcal/point on the social desirability scale, or about 450 kcal over its interquartile range. The bias was approximately twice as large for women as for men and only about half as large in the post measures. Individuals with the highest 24HR-derived fat and total energy intake scores had the largest downward bias due to social desirability. We observed a large downward bias in reported food intake related to social desirability score. These results are consistent with the theoretical constructs on which the hypothesis is based. The effect of social desirability bias is discussed in terms of its influence on epidemiological estimates of effect.
Suggestions are made for future work aimed at improving dietary assessment methodologies and adjusting risk estimates for this bias.
Anderson, R.N.; Boulanger, A.; Bagdonas, E.P.; Xu, L.; He, W.
1996-12-17
The invention utilizes 3-D and 4-D seismic surveys as a means of deriving information useful in petroleum exploration and reservoir management. The methods use both single seismic surveys (3-D) and multiple seismic surveys separated in time (4-D) of a region of interest to determine large scale migration pathways within sedimentary basins, and fine scale drainage structure and oil-water-gas regions within individual petroleum producing reservoirs. Such structure is identified using pattern recognition tools which define the regions of interest. The 4-D seismic data sets may be used for data completion for large scale structure where time intervals between surveys do not allow for dynamic evolution. The 4-D seismic data sets also may be used to find variations over time of small scale structure within individual reservoirs which may be used to identify petroleum drainage pathways, oil-water-gas regions and, hence, attractive drilling targets. After spatial orientation, and amplitude and frequency matching of the multiple seismic data sets, High Amplitude Event (HAE) regions consistent with the presence of petroleum are identified using seismic attribute analysis. High Amplitude Regions are grown and interconnected to establish plumbing networks on the large scale and reservoir structure on the small scale. Small scale variations over time between seismic surveys within individual reservoirs are identified and used to identify drainage patterns and bypassed petroleum to be recovered. The location of such drainage patterns and bypassed petroleum may be used to site wells. 22 figs.
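The region-growing step described above can be sketched as a flood fill over a thresholded amplitude grid. This is an illustrative reconstruction of the "High Amplitude Event" growing idea, not the patented method; the grid values and threshold are invented:

```python
def grow_regions(grid, threshold):
    """Label 4-connected regions of cells with amplitude >= threshold."""
    rows, cols = len(grid), len(grid[0])
    labels, n_regions = {}, 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and (r, c) not in labels:
                n_regions += 1
                stack = [(r, c)]  # flood-fill a new connected region
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] >= threshold
                            and (y, x) not in labels):
                        labels[(y, x)] = n_regions
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return labels, n_regions

# Invented 3x3 amplitude grid: one connected high-amplitude region in the
# upper right and one isolated high-amplitude cell at lower left.
grid = [[0.1, 0.9, 0.8],
        [0.2, 0.1, 0.7],
        [0.9, 0.1, 0.1]]
labels, n = grow_regions(grid, 0.6)
print(n)  # 2
```

In the 3-D/4-D setting, the same idea applied to seismic attribute volumes yields the interconnected "plumbing networks" and reservoir structures described in the patent.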
Nuclear Energy Assessment Battery. Form C.
ERIC Educational Resources Information Center
Showers, Dennis Edward
This publication consists of a nuclear energy assessment battery for secondary level students. The test contains 44 multiple choice items and is organized into four major sections. Parts include: (1) a knowledge scale; (2) attitudes toward nuclear energy; (3) a behaviors and intentions scale; and (4) an anxiety scale. Directions are provided for…
Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian
2016-01-01
The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database against which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. This presentation reports the results of a series of icing wind tunnel test campaigns whose aim was to provide such a database.
epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.
Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa
2016-12-01
Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important gap in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
Dynamic effective connectivity in cortically embedded systems of recurrently coupled synfire chains.
Trengove, Chris; Diesmann, Markus; van Leeuwen, Cees
2016-02-01
As a candidate mechanism of neural representation, large numbers of synfire chains can efficiently be embedded in a balanced recurrent cortical network model. Here we study a model in which multiple synfire chains of variable strength are randomly coupled together to form a recurrent system. The system can be implemented both as a large-scale network of integrate-and-fire neurons and as a reduced model. The latter has binary-state pools as basic units but is otherwise isomorphic to the large-scale model, and provides an efficient tool for studying its behavior. Both the large-scale system and its reduced counterpart are able to sustain ongoing endogenous activity in the form of synfire waves, the proliferation of which is regulated by negative feedback caused by collateral noise. Within this equilibrium, diverse repertoires of ongoing activity are observed, including meta-stability and multiple steady states. These states arise in concert with an effective connectivity structure (ECS). The ECS admits a family of effective connectivity graphs (ECGs), parametrized by the mean global activity level. Of these graphs, the strongly connected components and their associated out-components account to a large extent for the observed steady states of the system. These results imply a notion of dynamic effective connectivity as governing neural computation with synfire chains, and related forms of cortical circuitry with complex topologies.
Performance of lap splices in large-scale column specimens affected by ASR and/or DEF.
DOT National Transportation Integrated Search
2012-06-01
This research program consisted of the design, construction, curing, deterioration, and structural load testing of 16 large-scale column specimens with a critical lap splice region, and then compared ...
Summary measures of agreement and association between many raters' ordinal classifications.
Mitani, Aya A; Freer, Phoebe E; Nelson, Kerrie P
2017-10-01
Interpretation of screening tests such as mammograms usually requires a radiologist's subjective visual assessment of images, often resulting in substantial discrepancies between radiologists' classifications of subjects' test results. In clinical screening studies to assess the strength of agreement between experts, multiple raters are often recruited to assess subjects' test results using an ordinal classification scale. However, using traditional measures of agreement in some studies is challenging because of the presence of many raters, the use of an ordinal classification scale, and unbalanced data. We assess and compare the performance of existing measures of agreement and association, as well as a newly developed model-based measure of agreement, in three large-scale clinical screening studies involving many raters' ordinal classifications. We also conduct a simulation study to demonstrate the key properties of the summary measures. The assessment of agreement and association varied according to the choice of summary measure. Some measures were influenced by the underlying prevalence of disease and raters' marginal distributions and/or were limited to balanced data sets in which every rater classifies every subject. Our simulation study indicated that popular measures of agreement and association are sensitive to the underlying disease prevalence. Model-based measures provide a flexible approach for calculating agreement and association and are robust to missing and unbalanced data as well as to the underlying disease prevalence. Copyright © 2017 Elsevier Inc. All rights reserved.
A Diagnostic Study of Pre-Service Teachers' Competency in Multiple-Choice Item Development
ERIC Educational Resources Information Center
Asim, Alice E.; Ekuri, Emmanuel E.; Eni, Eni I.
2013-01-01
Large class size is an issue in testing at all levels of education. As a remedy, multiple-choice test formats have become very popular. This case study was designed to diagnose pre-service teachers' competency in constructing questions (IQT), direct questions (DQT), and best answer (BAT) varieties of multiple-choice items. Subjects were 88…
Conjugate-Gradient Algorithms For Dynamics Of Manipulators
NASA Technical Reports Server (NTRS)
Fijany, Amir; Scheid, Robert E.
1993-01-01
Algorithms for serial and parallel computation of the forward dynamics of multiple-link robotic manipulators by the conjugate-gradient method are developed. The parallel algorithms have potential for speedup of computations on multiple linked, specialized processors implemented in very-large-scale integrated circuits. Such processors could be used to simulate dynamics, possibly faster than in real time, for purposes of planning and control.
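The forward-dynamics step described above reduces, at each time step, to solving a symmetric positive-definite linear system M(q)·q̈ = τ for the joint accelerations. The sketch below is a generic conjugate-gradient solve under that assumption, not the paper's serial/parallel algorithm; the 3-link mass matrix and force vector are hypothetical values chosen only for illustration.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for a symmetric positive-definite A without
    factorizing it -- here standing in for a manipulator mass matrix."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x        # residual
    p = r.copy()         # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # A-conjugate update of direction
        rs_old = rs_new
    return x

# Hypothetical 3-link arm: SPD mass matrix M and generalized forces tau
M = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.7],
              [0.5, 0.7, 2.0]])
tau = np.array([1.0, 0.0, -1.0])
qdd = conjugate_gradient(M, tau)  # joint accelerations solving M qdd = tau
```

Because conjugate gradient needs only matrix-vector products, each iteration parallelizes naturally, which is the property the VLSI processors above would exploit.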
Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species
Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin
1999-01-01
The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...
Feys, Peter; Moumdjian, Lousin; Van Halewyck, Florian; Wens, Inez; Eijnde, Bert O; Van Wijmeersch, Bart; Popescu, Veronica; Van Asch, Paul
2017-11-01
Exercise therapy studies in persons with multiple sclerosis (pwMS) have primarily focused on motor outcomes in the mid disease stage, while cognitive function and neural correlates were addressed only to a limited extent. This pragmatic randomized controlled study investigated the effects of a remotely supervised, community-located "start-to-run" program on physical and cognitive function, fatigue, quality of life, brain volume, and connectivity. In all, 42 pwMS were randomized to either an experimental (EXP) or a waiting list control (WLC) group. The EXP group received individualized training instructions during 12 weeks (3×/week), to be performed in their community, with the aim of participating in a running event. Measures were physical function (VO2max, sit-to-stand test, Six-Minute Walk Test (6MWT), Multiple Sclerosis Walking Scale-12 (MSWS-12)), cognitive function (Rao's Brief Repeatable Battery (BRB), Paced Auditory Serial Addition Test (PASAT)), fatigue (Fatigue Scale for Motor and Cognitive Function (FSMC)), quality of life (Multiple Sclerosis Impact Scale-29 (MSIS-29)), and imaging. Brain volumes and diffusion tensor imaging (DTI) metrics were quantified using FSL-SIENA/FIRST and FSL-TBSS. In all, 35 pwMS completed the trial. Interaction effects in favor of the EXP group were found for VO2max, the sit-to-stand test, MSWS-12, the Spatial Recall Test, FSMC, MSIS-29, and pallidum volume. VO2max improved by 1.5 mL/kg/min, MSWS-12 by 4, FSMC by 11, and MSIS-29 by 14 points. The Spatial Recall Test improved by more than 10%. Community-located run training improved aerobic capacity, functional mobility, visuospatial memory, fatigue, quality of life, and pallidum volume in pwMS.
Katzner, Todd E.; Turk, Philip J.; Duerr, Adam E.; Miller, Tricia A.; Lanzone, Michael J.; Cooper, Jeff L.; Brandes, David; Tremblay, Junior A.; Lemaître, Jérôme
2015-01-01
Large birds regularly use updrafts to subsidize flight. Although most research on soaring bird flight has focused on use of thermal updrafts, there is evidence suggesting that many species are likely to use multiple modes of subsidy. We tested the degree to which a large soaring species uses multiple modes of subsidy to provide insights into the decision-making that underlies flight behaviour. We statistically classified more than 22 000 global positioning satellite–global system for mobile communications telemetry points collected at 30-s intervals to identify the type of subsidized flight used by 32 migrating golden eagles during spring in eastern North America. Eagles used subsidized flight on 87% of their journey. They spent 41.9% ± 1.5 (, range: 18–56%) of their subsidized northbound migration using thermal soaring, 45.2% ± 2.1 (12–65%) of time gliding between thermals, and 12.9% ± 2.2 (1–55%) of time using orographic updrafts. Golden eagles responded to the variable local-scale meteorological events they encountered by switching flight behaviour to take advantage of multiple modes of subsidy. Orographic soaring occurred more frequently in morning and evening, earlier in the migration season, and when crosswinds and tail winds were greatest. Switching between flight modes allowed migration for relatively longer periods each day and frequent switching behaviour has implications for a better understanding of avian flight behaviour and of the evolution of use of subsidy in flight. PMID:26538556
Advanced Grid-Friendly Controls Demonstration Project for Utility-Scale PV Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorgian, Vahan; O'Neill, Barbara
A typical photovoltaic (PV) power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. The availability and dissemination of actual test data showing the viability of advanced utility-scale PV controls among all industry stakeholders can leverage PV's value from being simply an energy resource to providing additional ancillary services that range from variability smoothing and frequency regulation to power quality. Strategically partnering with a selected utility and/or PV power plant operator is a key condition for a successful demonstration project. The U.S. Department of Energy's (DOE's) Solar Energy Technologies Office selected the National Renewable Energy Laboratory (NREL) to be a principal investigator in a two-year project with goals to (1) identify a potential partner(s), (2) develop a detailed scope of work and test plan for a field project to demonstrate the grid-friendly capabilities of utility-scale PV power plants, (3) facilitate conducting actual demonstration tests, and (4) disseminate test results among industry stakeholders via a joint NREL/DOE publication and participation in relevant technical conferences. The project implementation took place in FY 2014 and FY 2015. In FY14, NREL established collaborations with AES and First Solar Electric, LLC, to conduct demonstration testing on their utility-scale PV power plants in Puerto Rico and Texas, respectively, and developed test plans for each partner. Both the Puerto Rico Electric Power Authority and the Electric Reliability Council of Texas expressed interest in this project because of the importance of such advanced controls for the reliable operation of their power systems under high penetration levels of variable renewable generation.
During FY15, testing was completed on both plants, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to provide various types of new grid-friendly controls.
ERIC Educational Resources Information Center
Pantzare, Anna Lind
2015-01-01
In most large-scale assessment systems, a set of rather expensive external quality controls is implemented in order to guarantee interrater reliability. This study empirically examines whether teachers' ratings of national tests in mathematics can be reliable without using monitoring, training, or other methods of external quality…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inoue, T.; Shirakata, K.; Kinjo, K.
To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.
ERIC Educational Resources Information Center
Cimbricz, Sandra K.; McConn, Matthew L.
2015-01-01
This article explores the intersection of new, large-scale standards-based testing, teacher accountability policy, and secondary curriculum and instruction in the United States. Two federally funded consortia--the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers--prove focal to this paper, as these…
ERIC Educational Resources Information Center
Copp, Derek T.
2017-01-01
Large-scale assessment (LSA) is a tool used by education authorities for several purposes, including the promotion of teacher-based instructional change. In Canada, all 10 provinces engage in large-scale testing across several grade levels and subjects, and also have the common expectation that the results data will be used to improve instruction…
ERIC Educational Resources Information Center
Kroopnick, Marc Howard
2010-01-01
When Item Response Theory (IRT) is operationally applied for large scale assessments, unidimensionality is typically assumed. This assumption requires that the test measures a single latent trait. Furthermore, when tests are vertically scaled using IRT, the assumption of unidimensionality would require that the battery of tests across grades…
Fire extinguishing tests -80 with methyl alcohol gasoline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmstedt, G.; Ryderman, A.; Carlsson, B.
1980-01-01
Large-scale tests and laboratory experiments were carried out to estimate the extinguishing effectiveness of three alcohol-resistant aqueous film-forming foams (AFFF), two alcohol-resistant fluoroprotein foams, and two detergent foams on various pool fires: gasoline, isopropyl alcohol, acetone, methyl ethyl ketone, methyl alcohol, and M15 (a gasoline, methyl alcohol, isobutene mixture). The scaling down of large-scale tests to develop a reliable laboratory method was especially examined. The tests were performed with semidirect foam application in pools of 50, 11, 4, 0.6, and 0.25 sq m. Burning time, temperature distribution in the liquid, and thermal radiation were determined. An M15 fire can be extinguished with a detergent foam, but it is impossible to extinguish fires in polar solvents, such as methyl alcohol, acetone, and isopropyl alcohol, with detergent foams. AFFF give the best results, and performance with small pools can hardly be correlated with results from large-scale fires.
Non-linear scale interactions in a forced turbulent boundary layer
NASA Astrophysics Data System (ADS)
Duvvuri, Subrahmanyam; McKeon, Beverley
2015-11-01
A strong phase-organizing influence exerted by a single synthetic large-scale spatio-temporal mode on directly-coupled (through triadic interactions) small scales in a turbulent boundary layer forced by a spatially-impulsive dynamic wall-roughness patch was previously demonstrated by the authors (J. Fluid Mech. 2015, vol. 767, R4). The experimental set-up was later enhanced to allow for simultaneous forcing of multiple scales in the flow. Results and analysis are presented from a new set of experiments in which two distinct large scales are forced in the flow by a dynamic wall-roughness patch. The internal non-linear forcing of two other scales with triadic consistency to the artificially forced large scales, corresponding to the sum and difference in wavenumbers, is dominated by the latter. This allows for a forcing-response (input-output) type analysis of the two triadic scales, and naturally lends itself to a resolvent-operator-based model (e.g. McKeon & Sharma, J. Fluid Mech. 2010, vol. 658, pp. 336-382) of the governing Navier-Stokes equations. The support of AFOSR (grant #FA 9550-12-1-0469, program manager D. Smith) is gratefully acknowledged.
Low-Cost Nested-MIMO Array for Large-Scale Wireless Sensor Applications.
Zhang, Duo; Wu, Wen; Fang, Dagang; Wang, Wenqin; Cui, Can
2017-05-12
In modern communication and radar applications, large-scale sensor arrays have increasingly been used to improve the performance of a system. However, the hardware cost and circuit power consumption scale linearly with the number of sensors, which makes the whole system expensive and power-hungry. This paper presents a low-cost nested multiple-input multiple-output (MIMO) array, which is capable of providing O(2N^2) degrees of freedom (DOF) with O(N) physical sensors. The sensor locations of the proposed array have closed-form expressions. Thus, the aperture size and number of DOF can be predicted as a function of the total number of sensors. Additionally, with the help of time-sequence-phase-weighting (TSPW) technology, only one receiver channel is required for sampling the signals received by all of the sensors, which is conducive to reducing the hardware cost and power consumption. Numerical simulation results demonstrate the effectiveness and superiority of the proposed array.
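The O(N) sensors vs. O(N^2) DOF tradeoff can be illustrated with a plain two-level nested receive array and its difference coarray. This is a simplified sketch of the general nested-array idea, not the paper's MIMO/TSPW design; the element counts are chosen only for illustration.

```python
import numpy as np

def nested_array_positions(n1, n2):
    """Sensor positions (in half-wavelength units) of a two-level nested
    array: a dense inner ULA of n1 elements plus a sparse outer ULA of n2
    elements at multiples of (n1 + 1)."""
    inner = np.arange(1, n1 + 1)             # 1, 2, ..., n1
    outer = (n1 + 1) * np.arange(1, n2 + 1)  # (n1+1), 2(n1+1), ..., n2(n1+1)
    return np.concatenate([inner, outer])

def difference_coarray_dof(positions):
    """Count the distinct lags p - q over all sensor pairs; each distinct
    lag is one usable degree of freedom for DOA estimation."""
    return len({int(p - q) for p in positions for q in positions})

pos = nested_array_positions(3, 3)  # 6 physical sensors at 1,2,3,4,8,12
dof = difference_coarray_dof(pos)   # 2*n2*(n1+1) - 1 = 23 consecutive lags
```

With n1 = n2 = N/2, the coarray DOF grows roughly as N^2/2 + N, i.e., quadratically in the physical sensor count, which is the effect the paper's MIMO construction then amplifies further.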
Stability of Rasch Scales over Time
ERIC Educational Resources Information Center
Taylor, Catherine S.; Lee, Yoonsun
2010-01-01
Item response theory (IRT) methods are generally used to create score scales for large-scale tests. Research has shown that IRT scales are stable across groups and over time. Most studies have focused on items that are dichotomously scored. Now Rasch and other IRT models are used to create scales for tests that include polytomously scored items.…
Caldwell, Robert R
2011-12-28
The challenge to understand the physical origin of the cosmic acceleration is framed as a problem of gravitation. Specifically, does the relationship between stress-energy and space-time curvature differ on large scales from the predictions of general relativity? In this article, we describe efforts to model and test a generalized relationship between the matter and the metric using cosmological observations. Late-time tracers of large-scale structure, including the cosmic microwave background, weak gravitational lensing, and clustering, are shown to provide good tests of the proposed solution. Current data come very close to providing a critical test, leaving only a small window in parameter space in the case that the generalized relationship is scale-free above galactic scales.
Moving to stay in place: behavioral mechanisms for coexistence of African large carnivores.
Vanak, Abi Tamim; Fortin, Daniel; Thaker, Maria; Ogden, Monika; Owen, Cailey; Greatwood, Sophie; Slotow, Rob
2013-11-01
Most ecosystems have multiple predator species that not only compete for shared prey, but also pose direct threats to each other. These intraguild interactions are key drivers of carnivore community structure, with ecosystem-wide cascading effects. Yet, behavioral mechanisms for coexistence of multiple carnivore species remain poorly understood. The challenges of studying large, free-ranging carnivores have resulted in mainly coarse-scale examination of behavioral strategies without information about all interacting competitors. We overcame some of these challenges by examining the concurrent fine-scale movement decisions of almost all individuals of four large mammalian carnivore species in a closed terrestrial system. We found that the intensity of intraguild interactions did not follow a simple hierarchical allometric pattern, because spatial and behavioral tactics of subordinate species changed with threat and resource levels across seasons. Lions (Panthera leo) were generally unrestricted and anchored themselves in areas rich in not only their principal prey, but also, during periods of resource limitation (dry season), rich in the main prey for other carnivores. Because of this, the greatest cost (potential intraguild predation) for subordinate carnivores was spatially coupled with the highest potential benefit of resource acquisition (prey-rich areas), especially in the dry season. Leopard (P. pardus) and cheetah (Acinonyx jubatus) overlapped with the home range of lions but minimized their risk using fine-scaled avoidance behaviors and restricted resource acquisition tactics. The cost of intraguild competition was most apparent for cheetahs, especially during the wet season, as areas with energetically rewarding large prey (wildebeest) were avoided when they overlapped highly with the activity areas of lions.
Contrary to expectation, the smallest species (African wild dog, Lycaon pictus) did not avoid only lions, but also used multiple tactics to minimize encountering all other competitors. Intraguild competition thus forced wild dogs into areas with the lowest resource availability year round. Coexistence of multiple carnivore species has typically been explained by dietary niche separation, but our multi-scaled movement results suggest that differences in resource acquisition may instead be a consequence of avoiding intraguild competition. We generate a more realistic representation of hierarchical behavioral interactions that may ultimately drive spatially explicit trophic structures of multi-predator communities.
The ranking probability approach and its usage in design and analysis of large-scale studies.
Kuo, Chia-Ling; Zaykin, Dmitri
2013-01-01
In experiments with many statistical tests there is a need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal α-level such as 0.05 is adjusted by the number of tests, L, i.e., as 0.05/L. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed α-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability P(k, r) is controlled, defined as the probability of making at least k correct rejections while rejecting the hypotheses with the r smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., P(1, 1)) is equal to the power at the adjusted α-level, to a very good approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when the number of tests is very large and the proportion of true signals is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
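The quantity at the heart of this approach, the probability that the top-ranked test is a true signal, can be estimated by simulation: draw Z statistics for L tests of which a few are non-null, and check how often the smallest P-value lands on a signal. A minimal Monte Carlo sketch, assuming two-sided Z-tests and a single typical effect size (all names and parameter values are hypothetical, not the paper's setup):

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def smallest_p_is_signal(m, m1, effect, n_sim=5000):
    """Monte Carlo estimate of the probability that the test attaining the
    smallest P-value is a true signal, given m two-sided Z-tests of which
    m1 are non-null with a common standardized effect size."""
    hits = 0
    for _ in range(n_sim):
        z = rng.standard_normal(m)
        z[:m1] += effect  # the first m1 tests are the true signals
        # two-sided P-value of a Z statistic: erfc(|z| / sqrt(2))
        p = np.array([math.erfc(abs(v) / math.sqrt(2.0)) for v in z])
        hits += int(p.argmin() < m1)  # did a signal rank first?
    return hits / n_sim

# With a fairly strong typical effect, the top-ranked test is usually a signal
prob = smallest_p_is_signal(m=100, m1=5, effect=3.0)
```

Replacing `p.argmin() < m1` with a count of signals among the r smallest P-values generalizes the same simulation to the ranking probability P(k, r).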
Demonstration of Essential Reliability Services by a 300-MW Solar Photovoltaic Power Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loutan, Clyde; Klauer, Peter; Chowdhury, Sirajul
The California Independent System Operator (CAISO), First Solar, and the National Renewable Energy Laboratory (NREL) conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to test its ability to provide essential ancillary services to the electric grid. With increasing shares of solar- and wind-generated energy on the electric grid, traditional generation resources equipped with automatic generation control (AGC) and automatic voltage regulation controls -- specifically, fossil thermal -- are being displaced. The deployment of utility-scale, grid-friendly PV power plants that incorporate advanced capabilities to support grid stability and reliability is essential for the large-scale integration of PV generation into the electric power grid, among other technical requirements. A typical PV power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. In this way, PV power plants can be used to mitigate the impact of variability on the grid, a role typically reserved for conventional generators. In August 2016, testing was completed on First Solar's 300-MW PV power plant, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to use grid-friendly controls to provide essential reliability services. These data showed how the development of advanced power controls can enable PV to become a provider of a wide range of grid services, including spinning reserves, load following, voltage support, ramping, frequency response, variability smoothing, frequency regulation, and power quality support. Specifically, the tests conducted included various forms of active power control such as AGC and frequency regulation; droop response; and reactive power, voltage, and power factor controls.
This project demonstrated that advanced power electronics and solar generation can be controlled to contribute to system-wide reliability. It was shown that the First Solar plant can provide essential reliability services related to different forms of active and reactive power controls, including plant participation in AGC, primary frequency control, ramp rate control, and voltage regulation. For AGC participation in particular, by comparing the PV plant testing results to the typical performance of individual conventional technologies, we showed that regulation accuracy by the PV plant is 24-30 points better than fast gas turbine technologies. The plant's ability to provide volt-ampere reactive control during periods of extremely low power generation was demonstrated as well. The project team developed a pioneering demonstration concept and test plan to show how various types of active and reactive power controls can leverage PV generation's value from being a simple variable energy resource to a resource that provides a wide range of ancillary services. With this project's holistic demonstration on an actual, large, utility-scale, operational PV power plant and dissemination of the obtained results, the team sought to close some gaps in perspectives among various stakeholders in California and nationwide by providing real test data.
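One of the control families tested, droop-based frequency response, maps a measured frequency deviation to an active-power adjustment around a curtailed operating point. The function below is an illustrative 5% droop curve with a deadband; the parameter values (60 Hz nominal, a 300 MW plant holding 30 MW of headroom) are assumptions for the sketch, not First Solar's actual settings.

```python
def droop_power_setpoint(freq_hz, p_avail_mw, p_curtail_mw,
                         f_nom=60.0, droop=0.05, deadband_hz=0.017):
    """Active-power setpoint from a droop curve: the plant runs curtailed
    below its available power so it has headroom to raise output when the
    frequency sags (and it lowers output when the frequency rises)."""
    p_sched = p_avail_mw - p_curtail_mw  # curtailed operating point
    df = freq_hz - f_nom
    if abs(df) <= deadband_hz:           # ignore normal frequency noise
        return p_sched
    # 5% droop: a 5% frequency deviation commands a full-rated-power swing
    dp = -(df / (droop * f_nom)) * p_avail_mw
    return min(max(p_sched + dp, 0.0), p_avail_mw)

# Under-frequency event at 59.9 Hz: the plant releases 10 MW of headroom
p_up = droop_power_setpoint(freq_hz=59.9, p_avail_mw=300.0, p_curtail_mw=30.0)
```

The clamp at the end reflects the physical limits of a PV plant: it can never exceed the momentarily available solar power, which is why headroom must be reserved in advance for upward response.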
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
NASA Astrophysics Data System (ADS)
Jin, L.; Meeks, J. L.; Hubbard, K. A.; Kurian, L. M.; Siegel, D. I.; Lautz, L. K.; Otz, M. H.
2007-12-01
Temporary storage of surface water at channel sides and in pools significantly affects water and solute transport downstream in watersheds. Beavers, natural "stream channel engineers", build dams which obstruct stream flow and temporarily store water in small to large ponds within stream channels. These ponds substantially delay water movement and increase the water residence time in the system. To study how water and solutes move through these obstructed stream channels, we conducted multiple dye-tracing tests at Cherry Creek, a main tributary to Red Canyon Creek (Wind River Range, Wyoming). First we surveyed beaver dam distributions in detail within the study reaches. We then introduced dyes four times from July 2nd to 6th, 2007, using a scale-up approach. The observation site was fixed at the mouth of Cherry Creek, and 1.5 grams of Rhodamine WT (RWT) dye was injected sequentially at upstream sites with increasing test reach length. The reach lengths scaled up from 500 m to 2.5 km. A field fluorometer recorded RWT concentrations every 15 seconds. The results show non-linear decreases of the peak concentration of the dye tracing cloud as the reach scaled up. Also, the times to (1) the arrivals of the leading edges (Tl), (2) the peak concentrations (Tp), and (3) the tailing edges (Tt), and (4) the durations of the tracer cloud (Td) behaved non-linearly as a function of length scale. For example, plots of arrivals of leading edges and tailing edges with scale distance appear to define curves of the form Tl = 27.665·e^(1.07×Distance) (r² = 0.99) and Tt = 162.62·e^(0.8551×Distance) (r² = 0.99), respectively. The greatest non-linearity occurred for the time of tailing and the least for the time of the leading edge. These observations are consistent with what would be expected with greater density of dams and/or storage volumes as the reach length increased upgradient.
As a first approximation, we are currently modeling the breakthrough curves with the solute transport code OTIS to address the relative differences in average travel velocity, longitudinal dispersion, and storage parameters from the mouth to the headwaters of the creek.
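The exponential arrival-time relations reported above (Tl = 27.665·e^(1.07·D), Tt = 162.62·e^(0.8551·D)) can be recovered by log-linear least squares. A minimal sketch, using synthetic arrival times generated from the reported leading-edge fit rather than the study's field data:

```python
import numpy as np

# Synthetic leading-edge arrival times generated from the reported fit
# T_l = 27.665 * exp(1.07 * D), where D is reach length (km)
distance = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
t_lead = 27.665 * np.exp(1.07 * distance)

# Fit T = a * exp(b * D) by regressing ln(T) on D:
# ln T = ln a + b * D
b, ln_a = np.polyfit(distance, np.log(t_lead), 1)
a = np.exp(ln_a)
```

With real tracer data, the residuals of ln(T) against D give a quick visual check of how exponential the scaling actually is.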
NASA Astrophysics Data System (ADS)
Fischer, P.
1997-12-01
Weak distortions of background galaxies are rapidly emerging as a powerful tool for the measurement of galaxy cluster mass distributions. Lensing-based studies have the advantage of being direct measurements of mass and are not model-dependent as are other techniques (X-ray, radial velocities). To date, studies have been limited by CCD field size, meaning that full coverage of the clusters out to the virial radii and beyond has not been possible. Probing this large-radius region is essential for testing models of large scale structure formation. New wide-field CCD mosaics, for the first time, allow mass measurements out to very large radius. We have obtained images for a sample of clusters with the ``Big Throughput Camera'' (BTC) on the CTIO 4m. This camera comprises four thinned SITe 2048×2048 CCDs, each 15 arcmin on a side, for a total area of one quarter of a square degree. We have developed an automated reduction pipeline which: 1) corrects for spatial distortions, 2) corrects for PSF anisotropy, 3) determines relative scaling and background levels, and 4) combines multiple exposures. In this poster we will present some preliminary results of our cluster lensing study. This will include radial mass and light profiles and 2-d mass and galaxy density maps.
Tilley, Barbara C.; LaPelle, Nancy R.; Goetz, Christopher G.; Stebbins, Glenn T.
2016-01-01
Background Cognitive pretesting, a qualitative step in scale development, precedes field testing and assesses the difficulty of instrument completion for examiners and respondents. Cognitive pretesting assesses respondent interest, attention span, discomfort, and comprehension, and highlights problems with the logical structure of questions/response options that can affect understanding. In the past this approach was not consistently used in the development or revision of movement disorders scales. Methods We applied qualitative cognitive pretesting using testing guides in development of the Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS). The guides were based on qualitative techniques, verbal probing and “think-aloud” interviewing, to identify problems with the scale from the patient and rater perspectives. English-speaking Parkinson’s disease patients and movement disorders specialists (raters) from multiple specialty clinics in the United States, Western Europe and Canada used the MDS-UPDRS and completed the testing guides. Results Two rounds of cognitive pretesting were necessary before proceeding to field testing of the revised scale to assess clinimetric properties. Scale revisions based on cognitive pretesting included changes in phrasing, simplification of some questions, and addition of a reassuring statement explaining that not all PD patients experience the symptoms described in the questions. Conclusions The strategy of incorporating cognitive pretesting into scale development and revision provides a model for other movement disorders scales. Cognitive pretesting is being used in translating the MDS-UPDRS into multiple languages to improve comprehension and acceptance and in the development of a new Unified Dyskinesia Rating Scale for Parkinson’s disease patients. PMID:24613868
A Comparison of Methods to Screen Middle School Students for Reading and Math Difficulties
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Lackner, Stacey K.
2016-01-01
The current study explored multiple ways in which middle schools can use and integrate data sources to predict proficiency on future high-stakes state achievement tests. The diagnostic accuracy of (a) prior achievement data, (b) teacher rating scale scores, (c) a composite score combining state test scores and rating scale responses, and (d) two…
Testing and analysis of flat and curved panels with multiple cracks
NASA Technical Reports Server (NTRS)
Broek, David; Jeong, David Y.; Thomson, Douglas
1994-01-01
An experimental and analytical investigation of multiple cracking in various types of test specimens is described in this paper. The testing phase consisted of a flat unstiffened panel series and curved stiffened and unstiffened panel series. The test specimens contained various configurations of initial damage. Static loading was applied to these specimens until ultimate failure, while loads and crack propagation were recorded. These data provide the basis for developing and validating methodologies for predicting linkup of multiple cracks, progression to failure, and overall residual strength. The results from twelve flat coupon and ten full-scale curved panel tests are presented. In addition, an engineering analysis procedure was developed to predict multiple crack linkup. Reasonable agreement was found between predictions and actual test results for linkup and residual strength for both flat and curved panels. The results indicate that an engineering analysis approach has the potential to quantitatively assess the effect of multiple cracks on the arrest capability of an aircraft fuselage structure.
Trans-National Scale-Up of Services in Global Health
Shahin, Ilan; Sohal, Raman; Ginther, John; Hayden, Leigh; MacDonald, John A.; Mossman, Kathryn; Parikh, Himanshu; McGahan, Anita; Mitchell, Will; Bhattacharyya, Onil
2014-01-01
Background Scaling up innovative healthcare programs offers a means to improve access, quality, and health equity across multiple health areas. Despite large numbers of promising projects, little is known about successful efforts to scale up. This study examines trans-national scale, whereby a program operates in two or more countries. Trans-national scale is a distinct measure that reflects opportunities to replicate healthcare programs in multiple countries, thereby providing services to broader populations. Methods Based on the Center for Health Market Innovations (CHMI) database of nearly 1200 health programs, the study contrasts 116 programs that have achieved trans-national scale with 1,068 single-country programs. Data was collected on the programs' health focus, service activity, legal status, and funding sources, as well as the programs' locations (rural v. urban emphasis), and founding year; differences are reported with statistical significance. Findings This analysis examines 116 programs that have achieved trans-national scale (TNS) across multiple disease areas and activity types. Compared to 1,068 single-country programs, we find that trans-nationally scaled programs are more donor-reliant; more likely to focus on targeted health needs such as HIV/AIDS, TB, malaria, or family planning rather than provide more comprehensive general care; and more likely to engage in activities that support healthcare services rather than provide direct clinical care. Conclusion This work, based on a large data set of health programs, reports on trans-national scale with comparison to single-country programs. The work is a step towards understanding when programs are able to replicate their services as they attempt to expand health services for the poor across countries and health areas. 
A subset of these programs should be the subject of case studies to understand factors that affect the scaling process, particularly seeking to identify mechanisms that lead to improved health outcomes. PMID:25375328
DEIVA: a web application for interactive visual analysis of differential gene expression profiles.
Harshbarger, Jayson; Kratz, Anton; Carninci, Piero
2017-01-07
Differential gene expression (DGE) analysis is a technique to identify statistically significant differences in RNA abundance for genes or arbitrary features between different biological states. The result of a DGE test is typically further analyzed using statistical software, spreadsheets or custom ad hoc algorithms. We identified a need for a web-based system to share DGE statistical test results, and to locate and identify genes in DGE statistical test results with a very low barrier to entry. We have developed DEIVA, a free and open-source, browser-based single-page application (SPA) with a strong emphasis on user friendliness, which enables locating and identifying single or multiple genes in an immediate, interactive, and intuitive manner. By design, DEIVA scales to very large numbers of users and datasets. Compared to existing software, DEIVA offers a unique combination of design decisions that enable inspection and analysis of DGE statistical test results with an emphasis on ease of use.
The Social Life of a Data Base
NASA Technical Reports Server (NTRS)
Linde, Charlotte; Wales, Roxana; Clancy, Dan (Technical Monitor)
2002-01-01
This paper presents the complex social life of a large data base. The topics include: 1) Social Construction of Mechanisms of Memory; 2) Data Bases: The Invisible Memory Mechanism; 3) The Human in the Machine; 4) Data of the Study: A Large-Scale Problem Reporting Data Base; 5) The PRACA Study; 6) Description of PRACA; 7) PRACA and Paper; 8) Multiple Uses of PRACA; 9) The Work of PRACA; 10) Multiple Forms of Invisibility; 11) Such Systems are Everywhere; and 12) Two Morals to the Story. This paper is in viewgraph form.
ERIC Educational Resources Information Center
Papenberg, Martin; Musch, Jochen
2017-01-01
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
Gadkar, Vijay J; Filion, Martin
2013-06-01
In various experimental systems, limiting available amounts of RNA may prevent a researcher from performing large-scale analyses of gene transcripts. One way to circumvent this is to 'pre-amplify' the starting RNA/cDNA, so that sufficient amounts are available for any downstream analysis. In the present study, we report the development of a novel protocol for constructing amplified cDNA libraries using the Phi29 DNA polymerase-based multiple displacement amplification (MDA) system. Using as little as 200 ng of total RNA, we developed a linear concatenation strategy to make the single-stranded cDNA template amenable to MDA. The concatenation, made possible by the template-switching property of the reverse transcriptase enzyme, resulted in an amplified cDNA library with intact 5' ends. MDA generated micrograms of template, allowing large-scale polymerase chain reaction analyses or other large-scale downstream applications. As the amplified cDNA library contains intact 5' ends, it is also compatible with 5' RACE analyses of specific gene transcripts. Empirical validation of this protocol is demonstrated on a highly characterized (tomato) and an uncharacterized (corn gromwell) experimental system.
USDA-ARS?s Scientific Manuscript database
The objective of this paper is to study shedding patterns of cows infected with Mycobacterium avium subsp. paratuberculosis (MAP). While multiple single-farm studies of MAP dynamics have been reported, there is no large-scale meta-analysis of both natural and experimental infections. Large difference...
Large-Scale Low-Boom Inlet Test Overview
NASA Technical Reports Server (NTRS)
Hirt, Stefanie
2011-01-01
This presentation provides a high-level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the inlet stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, as well as to evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual stream inlet intended to model potential flight hardware and a single stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, University of Illinois at Urbana-Champaign, and the University of Virginia.
Grains of connectivity: analysis at multiple spatial scales in landscape genetics.
Galpern, Paul; Manseau, Micheline; Wilson, Paul
2012-08-01
Landscape genetic analyses are typically conducted at one spatial scale. Considering multiple scales may be essential for identifying landscape features influencing gene flow. We examined landscape connectivity for woodland caribou (Rangifer tarandus caribou) at multiple spatial scales using a new approach based on landscape graphs that creates a Voronoi tessellation of the landscape. To illustrate the potential of the method, we generated five resistance surfaces to explain how landscape pattern may influence gene flow across the range of this population. We tested each resistance surface using a raster at the spatial grain of available landscape data (200 m grid squares). We then used our method to produce up to 127 additional grains for each resistance surface. We applied a causal modelling framework with partial Mantel tests, where evidence of landscape resistance is tested against an alternative hypothesis of isolation-by-distance, and found statistically significant support for landscape resistance to gene flow in 89 of the 507 spatial grains examined. We found evidence that major roads as well as the cumulative effects of natural and anthropogenic disturbance may be contributing to the genetic structure. Using only the original grid surface yielded no evidence for landscape resistance to gene flow. Our results show that using multiple spatial grains can reveal landscape influences on genetic structure that may be overlooked with a single grain, and suggest that coarsening the grain of landcover data may be appropriate for highly mobile species. We discuss how grains of connectivity and related analyses have potential landscape genetic applications in a broad range of systems. © 2012 Blackwell Publishing Ltd.
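The causal-modelling step above relies on Mantel-type correlations between distance matrices. As a rough illustration of the underlying machinery, here is a simple (non-partial) Mantel test with a permutation p-value; the function name, data, and permutation count are illustrative, not the authors' implementation:

```python
import numpy as np

def mantel(d1, d2, n_perm=999, seed=0):
    """Simple Mantel test: Pearson correlation between the upper
    triangles of two square distance matrices, with a one-sided
    permutation p-value obtained by relabelling the rows/columns
    of the first matrix."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    n = d1.shape[0]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r = np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1]
        if r >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)
```

A partial Mantel test, as used in the causal-modelling framework, additionally conditions on a third matrix (e.g. geographic distance for the isolation-by-distance alternative).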
Ogawa, Takeshi; Aihara, Takatsugu; Shimokawa, Takeaki; Yamashita, Okito
2018-04-24
Creative insight occurs with an "Aha!" experience when solving a difficult problem. Here, we investigated large-scale networks associated with insight problem solving. We recruited 232 healthy participants aged 21-69 years old. Participants completed a magnetic resonance imaging study (MRI; structural imaging and a 10 min resting-state functional MRI) and an insight test battery (ITB) consisting of written questionnaires (matchstick arithmetic task, remote associates test, and insight problem solving task). To identify the resting-state functional connectivity (RSFC) associated with individual creative insight, we conducted an exploratory voxel-based morphometry (VBM)-constrained RSFC analysis. We identified positive correlations between ITB score and grey matter volume (GMV) in the right insula and middle cingulate cortex/precuneus, and a negative correlation between ITB score and GMV in the left cerebellum crus 1 and right supplementary motor area. We applied seed-based RSFC analysis to whole brain voxels using the seeds obtained from the VBM and identified insight-positive/negative connections, i.e. a positive/negative correlation between the ITB score and individual RSFCs between two brain regions. Insight-specific connections included motor-related regions whereas creative-common connections included a default mode network. Our results indicate that creative insight requires a coupling of multiple networks, such as the default mode, semantic and cerebral-cerebellum networks.
NASA Astrophysics Data System (ADS)
Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.
2014-02-01
The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). 
Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.
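The fit statistics cited above (mean square outfit, with 1.3 as the misfit threshold) can be illustrated for a dichotomous Rasch model. A minimal sketch under assumed person measures theta and item difficulties b; the function is illustrative, not the study's analysis software:

```python
import numpy as np

def outfit_mnsq(x, theta, b):
    """Item outfit mean-square for a dichotomous Rasch model.
    x: 0/1 responses (persons x items); theta: person measures (logits);
    b: item difficulties (logits). Under the model the expected value
    is 1; values above ~1.3 are conventionally flagged as misfitting."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))  # P(x=1)
    z2 = (x - p) ** 2 / (p * (1.0 - p))  # squared standardized residuals
    return z2.mean(axis=0)               # average over persons, per item
```

Because the outfit statistic is an unweighted mean of squared standardized residuals, it is sensitive to unexpected responses by persons far from an item's difficulty, which is why it flags guessing-prone multiple-choice items.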
NASA Technical Reports Server (NTRS)
Penrose, C. J.
1987-01-01
The difficulties of modeling the complex recirculating flow fields produced by multiple-jet STOVL aircraft close to the ground have led to extensive use of experimental model tests to predict intake Hot Gas Reingestion (HGR). The reliability of model test results depends on a satisfactory set of scaling rules, which must be validated by fully comparable full-scale tests. Scaling rules devised in the U.K. in the mid-1960s gave good model/full-scale agreement for the BAe P1127 aircraft. Until recently no opportunity has occurred to check the applicability of the rules to the high-energy exhaust of current ASTOVL aircraft projects. Such an opportunity has arisen following tests on a Tethered Harrier. Comparison of this full-scale data with results from tests on a model configuration approximating the full-scale aircraft geometry has shown discrepancies between HGR levels. These discrepancies, although probably due to geometry and other model/scale differences, indicate that some reexamination of the scaling rules is needed. Therefore the scaling rules are reviewed, planned further scaling studies are described, and potential areas for further work are suggested.
Forslin, Mia; Kottorp, Anders; Kierkegaard, Marie; Johansson, Sverker
2016-11-11
To translate and culturally adapt the Acceptance of Chronic Health Conditions (ACHC) Scale into Swedish for people with multiple sclerosis, and to analyse the psychometric properties of the Swedish version. Ten people with multiple sclerosis participated in the translation and cultural adaptation of the ACHC Scale; 148 people with multiple sclerosis were included in the evaluation of the psychometric properties of the scale. Translation and cultural adaptation were carried out through translation and back-translation, expert committee evaluation, and pre-testing with cognitive interviews of people with multiple sclerosis. The psychometric properties of the Swedish version were evaluated using Rasch analysis. The Swedish version of the ACHC Scale was an acceptable equivalent to the original version. Seven of the original 10 items fitted the Rasch model and demonstrated ability to separate between groups. A 5-item version, including 2 items and 3 super-items, demonstrated better psychometric properties but lower ability to separate between groups. The Swedish version of the ACHC Scale with the original 10 items did not fit the Rasch model. Two solutions, either with 7 items (ACHC-7) or with 2 items and 3 super-items (ACHC-5), demonstrated acceptable psychometric properties. Use of the ACHC-5 Scale with super-items is recommended, since this solution adjusts for local dependency among items.
Large-Scale 3D Printing: The Way Forward
NASA Astrophysics Data System (ADS)
Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid
2018-03-01
Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.
HMC algorithm with multiple time scale integration and mass preconditioning
NASA Astrophysics Data System (ADS)
Urbach, C.; Jansen, K.; Shindler, A.; Wenger, U.
2006-01-01
We present a variant of the HMC algorithm with mass preconditioning (Hasenbusch acceleration) and multiple time scale integration. We have tested this variant for standard Wilson fermions at β=5.6 and at pion masses ranging from 380 to 680 MeV. We show that in this situation its performance is comparable to the recently proposed HMC variant with domain decomposition as preconditioner. We give an update of the "Berlin Wall" figure, comparing the performance of our variant of the HMC algorithm to other published performance data. Advantages of the HMC algorithm with mass preconditioning and multiple time scale integration are that it is straightforward to implement and can be used in combination with a wide variety of lattice Dirac operators.
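The multiple time scale integration can be sketched as a nested (Sexton-Weingarten style) leapfrog, in which the cheap "fast" force is sub-integrated inside each step of the expensive "slow" force. A toy sketch on a split harmonic oscillator, not the lattice implementation:

```python
def nested_leapfrog(q, p, f_fast, f_slow, dt, n_outer, n_inner):
    """One molecular-dynamics trajectory with a two-level integrator:
    the slow force gets n_outer leapfrog steps of size dt, and the fast
    force is sub-integrated with n_inner steps of size dt / n_inner.
    The scheme is symmetric, hence time-reversible."""
    for _ in range(n_outer):
        p = p + 0.5 * dt * f_slow(q)      # half kick with the slow force
        h = dt / n_inner
        for _ in range(n_inner):          # inner leapfrog, fast force only
            p = p + 0.5 * h * f_fast(q)
            q = q + h * p
            p = p + 0.5 * h * f_fast(q)
        p = p + 0.5 * dt * f_slow(q)      # closing half kick, slow force
    return q, p
```

In the lattice setting the expensive fermion force would play the role of the slow force and the gauge (plus preconditioner) force the fast one, with the step-size ratio tuned to the relative force magnitudes; the stiff spring constants here merely stand in for that hierarchy.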
D.R. Magness; J.M. Morton; F. Huettmann; F.S. Chapin; A.D. McGuire
2011-01-01
Rapid climate change, in conjunction with other anthropogenic drivers, has the potential to cause mass species extinction. To minimize this risk, conservation reserves need to be coordinated at multiple spatial scales because the climate envelopes of many species may shift rapidly across large geographic areas. In addition, novel species assemblages and ecological...
Samuel A. Cushman; Nicholas B. Elliot; David W. Macdonald; Andrew J. Loveridge
2015-01-01
Habitat loss and fragmentation are among the major drivers of population declines and extinction, particularly in large carnivores. Connectivity models provide practical tools for assessing fragmentation effects and developing mitigation or conservation responses. To be useful to conservation practitioners, connectivity models need to incorporate multiple scales and...
Managing landscapes at multiple scales for sustainability of ecosystem functions (Preface)
R.A. Birdsey; R. Lucas; Y. Pan; G. Sun; E.J. Gustafson; A.H. Perera
2010-01-01
The science of landscape ecology is a rapidly evolving academic field with an emphasis on studying large-scale spatial heterogeneity created by natural influences and human activities. These advances have important implications for managing and conserving natural resources. At a September 2008 IUFRO conference in Chengdu, Sichuan, P.R. China, we highlighted both the...
Reliability and Clinical Significance of Mobility and Balance Assessments in Multiple Sclerosis
ERIC Educational Resources Information Center
Learmonth, Yvonne C.; Paul, Lorna; McFadyen, Angus K.; Mattison, Paul; Miller, Linda
2012-01-01
The aim of the study was to establish the test-retest reliability, clinical significance and precision of four mobility and balance measures--the Timed 25-Foot Walk, Six-minute Walk, Timed Up and Go and the Berg Balance Scale--in individuals moderately affected by multiple sclerosis. Twenty four participants with multiple sclerosis (Extended…
Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search.
Liu, Xianglong; Huang, Lei; Deng, Cheng; Lang, Bo; Tao, Dacheng
2016-10-01
Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degenerates the discriminative power when using Hamming distance ranking. Besides, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost the search performance. To address these problems, this paper proposes a novel and generic approach to building multiple hash tables with multiple views and generating fine-grained ranking results at bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of hash functions and their complementarity for nearest neighbor search. From the tablewise aspect, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusing in a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains on single and multiple table search over the state-of-the-art methods.
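The bitwise weighting idea can be illustrated with a small sketch: rank binary codes by a weighted Hamming distance, where the per-bit weights (chosen arbitrarily here) stand in for the paper's query-adaptive hash-function quality scores:

```python
import numpy as np

def weighted_hamming_rank(query, db, weights):
    """Rank database hash codes by bitwise-weighted Hamming distance.
    query: (n_bits,) 0/1 array; db: (n_items, n_bits) 0/1 array;
    weights: per-bit weights. With all-ones weights this reduces to
    plain Hamming distance ranking."""
    mismatch = (db != query).astype(float)   # 1 where bits disagree
    dist = mismatch @ weights                # weighted Hamming distance
    return np.argsort(dist, kind="stable")   # item indices, nearest first
```

Up-weighting a bit makes disagreement on it costlier, so two items tied under plain Hamming distance can be separated, which is the fine-grained ranking effect the abstract describes.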
DOT National Transportation Integrated Search
2015-03-01
A large experimental program, consisting of the design, construction, curing, exposure, and structural load : testing of 16 large-scale column specimens with a critical lap splice region that were influenced by varying : stages of alkali-silica react...
NASA Astrophysics Data System (ADS)
Guervilly, C.; Cardin, P.
2017-12-01
Convection is the main heat transport process in the liquid cores of planets. The convective flows are thought to be turbulent and constrained by rotation (corresponding to high Reynolds numbers Re and low Rossby numbers Ro). Under these conditions, and in the absence of magnetic fields, the convective flows can produce coherent Reynolds stresses that drive persistent large-scale zonal flows. The formation of large-scale flows has crucial implications for the thermal evolution of planets and the generation of large-scale magnetic fields. In this work, we explore this problem with numerical simulations using a quasi-geostrophic approximation to model convective and zonal flows at Re ~ 10^4 and Ro ~ 10^-4 for Prandtl numbers relevant for liquid metals (Pr ~ 0.1). The formation of intense multiple zonal jets strongly affects the convective heat transport, leading to the formation of a mean temperature staircase. We also study the generation of magnetic fields by the quasi-geostrophic flows at low magnetic Prandtl numbers.
Economically viable large-scale hydrogen liquefaction
NASA Astrophysics Data System (ADS)
Cardella, U.; Decker, L.; Klein, H.
2017-02-01
The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.
The Structural Heat Intercept-Insulation-Vibration Evaluation Rig (SHIVER)
NASA Technical Reports Server (NTRS)
Johnson, W. L.; Zoeckler, J. G.; Best-Ameen, L. M.
2015-01-01
NASA is currently investigating methods to reduce the boil-off rate on large cryogenic upper stages. Two such methods to reduce the total heat load on existing upper stages are vapor cooling of the cryogenic tank support structure and integration of thick multilayer insulation (MLI) systems into the upper stage of a launch vehicle. Previous efforts have flown a 2-layer MLI blanket and shown improved thermal performance, and other efforts have ground-tested blankets up to 70 layers thick on tanks with diameters between 2 and 3 meters. However, thick multilayer insulation installation and testing in both thermal and structural modes has not been completed on a large-scale tank. Similarly, multiple vapor-cooled shields are commonplace on science payload helium dewars; however, minimal effort has gone into intercepting heat on large structural surfaces associated with rocket stages. A majority of the vapor cooling effort focuses on metallic cylinders called skirts, which are the most common structural components for launch vehicles. In order to provide test data for comparison with analytical models, a representative test tank is currently being designed to include skirt structural systems with integral vapor cooling. The tank is 4 m in diameter and 6.8 m tall to contain 5000 kg of liquid hydrogen. A multilayer insulation system will be designed to insulate the tank and structure while being installed in a representative manner that can be extended to tanks up to 10 meters in diameter. In order to prove that the insulation system and vapor cooling attachment methods are structurally sound, acoustic testing will also be performed on the system. The test tank with insulation and vapor-cooled shield installed will be tested thermally in the B2 test facility at NASA's Plum Brook Station both before and after being vibration tested at Plum Brook's Space Power Facility.
A Meta-Analysis of Growth Trends from Vertically Scaled Assessments
ERIC Educational Resources Information Center
Dadey, Nathan; Briggs, Derek C.
2012-01-01
A vertical scale, in principle, provides a common metric across tests with differing difficulties (e.g., spanning multiple grades) so that statements of "absolute" growth can be made. This paper compares 16 states' 2007-2008 effect size growth trends on vertically scaled reading and math assessments across grades 3 to 8. Two patterns…
ERIC Educational Resources Information Center
Tindal, Gerald; Lee, Daesik; Geller, Leanne Ketterlin
2008-01-01
In this paper we review different methods for teachers to recommend accommodations in large scale tests. Then we present data on the stability of their judgments on variables relevant to this decision-making process. The outcomes from the judgments support the need for a more explicit model. Four general categories are presented: student…
High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing
Hu, Chenyuan; Bai, Wei
2018-01-01
A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 μs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging. PMID:29495263
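The central-wavelength calculation at the heart of FBG demodulation is often done with a power-weighted centroid of the reflected peak. The paper's FPGA pipeline is not public here, so the following is only a sketch of that common approach, applied to a synthetic Gaussian reflection peak:

```python
import numpy as np

# Centroid-based FBG peak (central-wavelength) demodulation sketch.
# The spectrum is synthetic: a Gaussian reflection peak at 1550 nm.

def centroid_wavelength(wavelengths_nm, intensities):
    """Power-weighted centroid of a reflected FBG spectrum."""
    w = np.asarray(wavelengths_nm, dtype=float)
    p = np.asarray(intensities, dtype=float)
    return float(np.sum(w * p) / np.sum(p))

wl = np.linspace(1549.0, 1551.0, 401)             # nm, sampling grid
spectrum = np.exp(-((wl - 1550.0) / 0.1) ** 2)    # Gaussian peak at 1550 nm
peak = centroid_wavelength(wl, spectrum)          # recovers ~1550.0 nm
```

A strain- or temperature-induced Bragg shift then appears directly as a change in this centroid, which is why sub-picometre stability of the estimate matters.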
DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.
Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
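Each individual flux balance analysis is a linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. This is not DistributedFBA.jl's API (which is Julia); it is a minimal Python sketch of the underlying optimization on a made-up three-reaction toy network:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1: -> A (uptake), R2: A -> B, R3: B -> (biomass, objective).
# Rows of S are metabolites, columns are reactions.
S = np.array([[1.0, -1.0,  0.0],    # metabolite A balance
              [0.0,  1.0, -1.0]])   # metabolite B balance
c = np.array([0.0, 0.0, -1.0])      # linprog minimizes, so negate to maximize v3
bounds = [(0, 10), (0, 10), (0, 10)]  # flux bounds for each reaction

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
optimal_flux = res.x  # at the optimum, all three fluxes equal the uptake limit
```

Solving many such LPs independently, one per reaction or per condition, is what makes the problem embarrassingly parallel across threads and nodes.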
NASA Astrophysics Data System (ADS)
Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.
2013-12-01
A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high-quality electron beams over short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over one order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance over ~2 PFlops are demonstrated, opening the way for large-scale modelling of LWFA scenarios.
Utilization of Large Scale Surface Models for Detailed Visibility Analyses
NASA Astrophysics Data System (ADS)
Caha, J.; Kačmařík, M.
2017-11-01
This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds, which aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the Opava River near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, the extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
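The Boolean visibility test underlying any viewshed reduces to a line-of-sight check: a target cell is visible if no intermediate cell subtends a larger vertical angle from the observer. A sketch over a one-dimensional height profile (the grid and heights are invented for illustration; real viewsheds sweep such rays over a 2D raster):

```python
# 1-D line-of-sight over a surface profile: the core of a Boolean viewshed.

def line_of_sight(heights, observer_idx, observer_eye, target_idx):
    """True if the target cell is visible from the observer's eye height."""
    z0 = heights[observer_idx] + observer_eye
    step = 1 if target_idx > observer_idx else -1
    max_tan = float("-inf")                      # steepest angle seen so far
    for i in range(observer_idx + step, target_idx + step, step):
        dist = abs(i - observer_idx)
        tan_angle = (heights[i] - z0) / dist     # tangent of vertical angle
        if i == target_idx:
            return tan_angle >= max_tan          # visible iff it tops the horizon
        max_tan = max(max_tan, tan_angle)

profile = [100, 101, 105, 102, 103]              # a ridge at index 2
visible_ridge = line_of_sight(profile, 0, 1.7, 2)   # ridge itself is visible
hidden_cell = line_of_sight(profile, 0, 1.7, 3)     # cell behind the ridge is not
```

The "extended viewshed" quantities mentioned above (e.g., angle above the local horizon) fall out of the same loop by recording `tan_angle - max_tan` instead of a Boolean.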
Weak gravitational lensing due to large-scale structure of the universe
NASA Technical Reports Server (NTRS)
Jaroszynski, Michal; Park, Changbom; Paczynski, Bohdan; Gott, J. Richard, III
1990-01-01
The effect of the large-scale structure of the universe on the propagation of light rays is studied. The development of the large-scale density fluctuations in the omega = 1 universe is calculated within the cold dark matter scenario using a smooth particle approximation. The propagation of about 10^6 random light rays between the redshift z = 5 and the observer was followed. It is found that the effect of shear is negligible, and the amplification of single images is dominated by the matter in the beam. The spread of amplifications is very small. Therefore, the filled-beam approximation is very good for studies of strong lensing by galaxies or clusters of galaxies. In the simulation, the column density was averaged over a comoving area of approximately (1/h Mpc)-squared. No case of strong gravitational lensing was found, i.e., no 'over-focused' image that would suggest that a few images might be present. Therefore, the large-scale structure of the universe as it is presently known does not produce multiple images with gravitational lensing on a scale larger than clusters of galaxies.
Theodore Weller
2008-01-01
Regional conservation plans are increasingly used to plan for and protect biodiversity at large spatial scales; however, the means of quantitatively evaluating their effectiveness are rarely specified. Multiple-species approaches, particularly those which employ site-occupancy estimation, have been proposed as robust and efficient alternatives for assessing the status of...
ERIC Educational Resources Information Center
Stephens, Keri K.; Barrett, Ashley K.; Mahometa, Michael J.
2013-01-01
This study relies on information theory, social presence, and source credibility to uncover what best helps people grasp the urgency of an emergency. We surveyed a random sample of 1,318 organizational members who received multiple notifications about a large-scale emergency. We found that people who received 3 redundant messages coming through at…
ERIC Educational Resources Information Center
Hodkowski, Nicola M.; Gardner, Amber; Jorgensen, Cody; Hornbein, Peter; Johnson, Heather L.; Tzur, Ron
2016-01-01
In this paper we examine the application of Tzur's (2007) fine-grained assessment to the design of an assessment measure of a particular multiplicative scheme so that non-interview, good enough data can be obtained (on a large scale) to infer into elementary students' reasoning. We outline three design principles that surfaced through our recent…
NASA Astrophysics Data System (ADS)
Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.
2015-05-01
Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region is comprised of two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, as well as to present strengths and weaknesses of integrated modeling at such a large scale, along with how it can be improved on the basis of the current modeling structure and results.
The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential Uncertainty Fitting algorithm (SUFI-2) and the SWAT-CUP interface, followed by a manual water quality calibration on a monthly basis. The refined modeling approach developed in this study led to successful predictions across most parts of the Corn Belt region and can be used for testing pollution mitigation measures and agricultural economic scenarios, providing useful information to policy makers and recommendations on similar efforts at the regional scale.
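Hydrologic calibration of this kind is typically judged by goodness-of-fit statistics; the one most often reported for SWAT flow calibration is the Nash-Sutcliffe efficiency (NSE), where 1 is a perfect fit and 0 means the model does no better than the observed mean. The paper's exact criteria are not reproduced here; the data below are invented for illustration:

```python
# Nash-Sutcliffe efficiency: 1 - SSE / variance of observations about their mean.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Invented monthly streamflows (e.g., m^3/s) at one gauge:
obs = [10.0, 12.0, 8.0, 15.0, 11.0]
sim = [9.5, 12.5, 8.5, 14.0, 11.5]
nse = nash_sutcliffe(obs, sim)   # close to 1 for this near-perfect fit
```

In multi-site calibrations such as the one described above, a statistic like this is evaluated at every gauge and constituent, which is what makes manual water quality calibration so labor-intensive.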
A multi-species framework for landscape conservation planning
Schwenk, W. Scott; Donovan, Therese
2011-01-01
Rapidly changing landscapes have spurred the need for quantitative methods for conservation assessment and planning that encompass large spatial extents. We devised and tested a multispecies framework for conservation planning to complement single-species assessments and ecosystem-level approaches. Our framework consisted of 4 elements: sampling to effectively estimate population parameters, measuring how human activity affects landscapes at multiple scales, analyzing the relation between landscape characteristics and individual species occurrences, and evaluating and comparing the responses of multiple species to landscape modification. We applied the approach to a community of terrestrial birds across 25,000 km2 with a range of intensities of human development. Human modification of land cover, road density, and other elements of the landscape, measured at multiple spatial extents, had large effects on occupancy of the 67 species studied. Forest composition within 1 km of points had a strong effect on occupancy of many species and a range of negative, intermediate, and positive associations. Road density within 1 km of points, percent evergreen forest within 300 m, and distance from patch edge were also strongly associated with occupancy for many species. We used the occupancy results to group species into 11 guilds that shared patterns of association with landscape characteristics. Our multispecies approach to conservation planning allowed us to quantify the trade-offs of different scenarios of land-cover change in terms of species occupancy.
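Site-occupancy estimation of the kind used above rests on a likelihood that separates occupancy probability (psi) from per-visit detection probability (p); an all-zero detection history can mean either "occupied but never detected" or "truly absent". A hedged sketch of the standard single-season per-site likelihood (the paper's exact model, with landscape covariates, is not reproduced here):

```python
# Single-season occupancy model, per-site likelihood contribution.
# history: list of 0/1 detections over repeat visits to one site.

def site_likelihood(history, psi, p):
    detected = sum(history)
    visits = len(history)
    if detected > 0:
        # Site must be occupied; detections are Bernoulli(p) given occupancy.
        return psi * (p ** detected) * ((1 - p) ** (visits - detected))
    # Never detected: occupied-but-missed, or genuinely unoccupied.
    return psi * (1 - p) ** visits + (1 - psi)

# Three visits each; site A detected once, site B never detected:
lik_a = site_likelihood([0, 1, 0], psi=0.6, p=0.4)
lik_b = site_likelihood([0, 0, 0], psi=0.6, p=0.4)
```

Maximizing the product of these terms over all sites, with psi modeled as a function of covariates such as road density or forest composition, yields the species-landscape associations described above.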
UTILIZATION OF TREATABILITY AND PILOT TESTS TO PREDICT CAH BIOREMEDIATION
Multiple tools have been suggested to help in the design of enhanced anaerobic bioremediation systems for CAHs:
- Extensive high quality microcosm testing followed by small-scale, thoroughly observed field pilot tests (i.e., RABITT Protocol, Morse 1998)
- More limited ...
Dafforn, Katherine A; Kelaher, Brendan P; Simpson, Stuart L; Coleman, Melinda A; Hutchings, Pat A; Clark, Graeme F; Knott, Nathan A; Doblin, Martina A; Johnston, Emma L
2013-01-01
Ecological communities are increasingly exposed to multiple chemical and physical stressors, but distinguishing anthropogenic impacts from other environmental drivers remains challenging. Rarely are multiple stressors investigated in replicated studies over large spatial scales (>1000 km) or supported with manipulations that are necessary to interpret ecological patterns. We measured the composition of sediment infaunal communities in relation to anthropogenic and natural stressors at multiple sites within seven estuaries. We observed increases in the richness and abundance of polychaete worms in heavily modified estuaries with severe metal contamination, but no changes in the diversity or abundance of other taxa. Estuaries in which toxic contaminants were elevated also showed evidence of organic enrichment. We hypothesised that the observed response of polychaetes was not a 'positive' response to toxic contamination or a reduction in biotic competition, but due to high levels of nutrients in heavily modified estuaries driving productivity in the water column and enriching the sediment over large spatial scales. We deployed defaunated field-collected sediments from the surveyed estuaries in a small scale experiment, but observed no effects of sediment characteristics (toxic or enriching). Furthermore, invertebrate recruitment instead reflected the low diversity and abundance observed during field surveys of this relatively 'pristine' estuary. This suggests that differences observed in the survey are not a direct consequence of sediment characteristics (even severe metal contamination) but are related to parameters that covary with estuary modification such as enhanced productivity from nutrient inputs and the diversity of the local species pool. This has implications for the interpretation of diversity measures in large-scale monitoring studies in which the observed patterns may be strongly influenced by many factors that covary with anthropogenic modification.
Robbins, Blaine
2013-01-01
Sociologists, political scientists, and economists all suggest that culture plays a pivotal role in the development of large-scale cooperation. In this study, I used generalized trust as a measure of culture to explore if and how culture impacts intentional homicide, my operationalization of cooperation. I compiled multiple cross-national data sets and used pooled time-series linear regression, single-equation instrumental-variables linear regression, and fixed- and random-effects estimation techniques on an unbalanced panel of 118 countries and 232 observations spread over a 15-year time period. Results suggest that culture and large-scale cooperation form a tenuous relationship, while economic factors such as development, inequality, and geopolitics appear to drive large-scale cooperation. PMID:23527211
Large Scale GW Calculations on the Cori System
NASA Astrophysics Data System (ADS)
Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven
The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.
Lagrangian space consistency relation for large scale structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horn, Bart; Hui, Lam; Xiao, Xiao
Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias & Riotto and Peloso & Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. Furthermore, the simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.
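The Lagrangian-space statement in the abstract can be written schematically as follows (the notation is assumed, not taken from the paper):

```latex
% With \delta_L the long-wavelength mode in Lagrangian space, P_L(q) its power
% spectrum, and O any product of N short-wavelength observables, the suitably
% normalized squeezed correlator vanishes:
\lim_{q \to 0}
  \frac{\big\langle \delta_L(\mathbf{q})\,
        O(\mathbf{k}_1, \dots, \mathbf{k}_N) \big\rangle}{P_L(q)} = 0 ,
% independent of whether the observables are evaluated at equal times and of
% whether multiple streams have formed.
```

This is the sense in which the consistency relation becomes a simple physical statement: a long-wavelength perturbation acts as a uniform displacement, which drops out of Lagrangian-space correlations.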
Web tools for large-scale 3D biological images and atlases
2012-01-01
Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data, delivering compressed tiled images that enable users to browse through very large volume data in the context of a standard web-browser. The system provides an interactive visualisation for grey-level and colour 3D images including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D and we have implemented a matching server to deliver the protocol and a series of Ajax/Javascript client codes that will run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole image download and client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296
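Tiled delivery works because the client only ever fetches the tiles that intersect its current viewport, regardless of total volume size. The IIP3D request syntax is not reproduced here; this is only a sketch of the tiling arithmetic a client or server performs, with an assumed 256-pixel tile size:

```python
# Which tiles cover a requested viewport? Tile indices are (col, row) on a
# fixed grid of tile-by-tile squares over the section image.

TILE = 256  # pixels per tile edge, a typical choice

def tiles_for_viewport(x0, y0, x1, y1, tile=TILE):
    """Tile indices covering the half-open viewport [x0, x1) x [y0, y1)."""
    cols = range(x0 // tile, (x1 - 1) // tile + 1)
    rows = range(y0 // tile, (y1 - 1) // tile + 1)
    return [(c, r) for r in rows for c in cols]

# A 300x300 viewport at (200, 200) spans a 2x2 block of 256-pixel tiles:
needed = tiles_for_viewport(200, 200, 500, 500)
```

Because the number of fetched tiles depends only on viewport size, response times stay independent of the overall image scale, matching the behaviour reported above.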
Organizing "mountains of words" for data analysis, both qualitative and quantitative.
Johnson, Bruce D; Dunlap, Eloise; Benoit, Ellen
2010-04-01
Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semiquantitative coding and analysis. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Field expenditures are linked to units of work so productivity is measured, many staff in various locations have access to use and analyze the data, quantitative data can be derived from data that is primarily qualitative, and improved efficiencies of resources are developed.