Vázquez Peña, Fernando; Harzheim, Erno; Terrasa, Sergio; Berra, Silvina
2017-02-01
To validate the Brazilian short version of the PCAT for adult patients in Spanish. Analysis of secondary data from studies conducted to validate the extended version of the PCAT questionnaire. City of Córdoba, Argentina. Primary health care. The sample consisted of parents (46%) whose children were enrolled in secondary education at three institutes in the city of Córdoba; the remaining 54% were adult users of the National University of Córdoba Health Insurance. Pearson's correlation coefficient comparing the extended and short versions. Goodness-of-fit indices in confirmatory factor analysis, composite reliability, average variance extracted, and Cronbach's alpha values were used to assess the construct validity and reliability of the short version. Pearson's correlation coefficient between the short and long versions was high (.818; P<.001), implying very good criterion validity. The global goodness-of-fit indicators in the confirmatory factor analysis were good. Composite reliability was good (.802), but the average variance extracted was low (.3306), since 3 variables had weak factor loadings. Cronbach's alpha was acceptable (.85). The short version of the PCAT-users developed in Brazil showed acceptable psychometric performance in Spanish as a quick assessment tool, in a comparative study with the extended version. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
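For readers unfamiliar with the reliability statistic reported above, Cronbach's alpha can be computed directly from an item-score matrix. The following Python sketch uses invented toy data (not the study's), purely to illustrate the formula:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented toy data: three items rated by five respondents.
scores = np.array([
    [4, 4, 5],
    [2, 3, 2],
    [5, 5, 5],
    [1, 2, 1],
    [3, 3, 4],
], dtype=float)
print(round(cronbach_alpha(scores), 3))  # → 0.959
```

High alpha here simply reflects that the toy items rise and fall together across respondents.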
Undergraduate Performance in Solving Ill-Defined Biochemistry Problems
Sensibaugh, Cheryl A.; Madrid, Nathaniel J.; Choi, Hye-Jeong; Anderson, William L.; Osgood, Marcy P.
2017-01-01
With growing interest in promoting skills related to the scientific process, we studied performance in solving ill-defined problems demonstrated by graduating biochemistry majors at a public, minority-serving university. As adoption of techniques for facilitating the attainment of higher-order learning objectives broadens, so too does the need to appropriately measure and understand student performance. We extended previous validation of the Individual Problem Solving Assessment (IPSA) and administered multiple versions of the IPSA across two semesters of biochemistry courses. A final version was taken by majors just before program exit, and student responses on that version were analyzed both quantitatively and qualitatively. This mixed-methods study quantifies student performance in scientific problem solving, while probing the qualitative nature of unsatisfactory solutions. Of the five domains measured by the IPSA, we found that average graduates were only successful in two areas: evaluating given experimental data to state results and reflecting on performance after the solution to the problem was provided. The primary difficulties in each domain were quite different. The most widespread challenge for students was to design an investigation that rationally aligned with a given hypothesis. We also extend the findings into pedagogical recommendations. PMID:29180350
Di Riso, Daniela; Salcuni, Silvia; Lis, Adriana; Delvecchio, Elisa
2017-01-01
Affect in Play Scale-Preschool (APS-P) is one of the few standardized tools for measuring pretend play. The APS-P is an effective measure of symbolic play, able to detect both the cognitive and affective dimensions that classically characterize play in children but are often evaluated separately and scarcely integrated. The scale uses a 5-min standardized play task with a set of toys. Recently the scale was extended to children aged 6 to 10 years and validated in Italian preschool and school-aged children. Some of the main limitations of this measure are that it requires videotaping, verbatim transcripts, and extensive scoring training, which could compromise its clinical utility. For these reasons, a Brief version of the measure was developed by the original authors. This paper focuses on the APS-P Brief Version and its Extended Version across ages (6–10 years), which consists of "in vivo" coding. This study aimed to evaluate the construct and external validity of the APS-P Brief Version and its Extended Version in a sample of 538 Italian children aged 4 to 10 years. Confirmatory factor analysis yielded a two-correlated-factor structure including an affective and a cognitive factor. APS-P-BR and its Extended Version factor scores were strongly related to APS-P Extended Version factor scores. Significant relationships were found with a divergent thinking task. Results suggest that the APS-P-BR and its Extended Version are an encouraging brief measure of pretend play using toys. They could easily substitute for the APS-P and its Extended Version in clinical and research settings, reducing time and difficulty in scoring procedures while maintaining the same strengths. PMID:28553243
On the Inclusion of Externally Controlled Actions in Action Planning
ERIC Educational Resources Information Center
Tsai, Jessica Chia-Chin; Knoblich, Gunther; Sebanz, Natalie
2011-01-01
According to ideomotor theories, perceiving action effects produced by others triggers corresponding action representations in the observer. We tested whether this principle extends to actions performed by externally controlled limbs and tools. Participants performed a go-no-go version of a spatial compatibility task in which their own actions…
Extended spectrum SWIR camera with user-accessible Dewar
NASA Astrophysics Data System (ADS)
Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva
2017-02-01
Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to 3 microns cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented along with images captured with bandpass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.
A novel tool for evaluating children's musical abilities across age and culture
Peretz, Isabelle; Gosselin, Nathalie; Nan, Yun; Caron-Caplette, Emilie; Trehub, Sandra E.; Béland, Renée
2013-01-01
The present study introduces a novel tool for assessing musical abilities in children: The Montreal Battery of Evaluation of Musical Abilities (MBEMA). The battery, which comprises tests of memory, scale, contour, interval, and rhythm, was administered to 245 children in Montreal and 91 in Beijing (Experiment 1), and an abbreviated version was administered to an additional 85 children in Montreal (in less than 20 min; Experiment 2). All children were 6–8 years of age. Their performance indicated that both versions of the MBEMA are sensitive to individual differences and to musical training. The sensitivity of the tests extends to Mandarin-speaking children despite the fact that they show enhanced performance relative to French-speaking children. Because this Chinese advantage is not limited to musical pitch but extends to rhythm and memory, it is unlikely that it results from early exposure to a tonal language. In both cultures and versions of the tests, amount of musical practice predicts performance. Thus, the MBEMA can serve as an objective, short and up-to-date test of musical abilities in a variety of situations, from the identification of children with musical difficulties to the assessment of the effects of musical training in typically developing children of different cultures. PMID:23847479
Analysis of Rosen piezoelectric transformers with a varying cross-section.
Xue, H; Yang, J; Hu, Y
2008-07-01
We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.
EPANET is a Windows program that performs extended period simulation of hydraulic and water-quality behavior within pressurized pipe networks. A network can consist of pipes, nodes (pipe junctions), pumps, valves and storage tanks or reservoirs. EPANET tracks the flow of water in...
A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.
Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao
2011-08-01
The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
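The two-opt local search used in HACOA can be illustrated independently of the CARP setting. The following Python sketch (a toy Euclidean tour, not the paper's benchmark instances or its actual implementation) repeatedly reverses a segment of a closed tour whenever doing so shortens it, which removes crossing edges:

```python
import math
from itertools import combinations

def tour_length(tour, pts):
    """Total length of a closed tour over point indices."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Reverse segments between pairs of edges while the tour keeps shortening."""
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(tour)), 2):
            if j - i < 2:
                continue  # reversing a single element changes nothing
            candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
            if tour_length(candidate, pts) < tour_length(tour, pts) - 1e-12:
                tour, improved = candidate, True
    return tour

pts = [(0, 0), (0, 1), (1, 0), (1, 1)]  # unit square
crossed = [0, 3, 2, 1]                  # tour whose diagonals cross
print(tour_length(crossed, pts))                 # ≈ 4.83
print(tour_length(two_opt(crossed, pts), pts))   # 4.0, the square perimeter
```

In the hybrid algorithm this kind of move is applied to routes sampled by the ant colony, trading a little extra computation for noticeably shorter solutions.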
NASA Technical Reports Server (NTRS)
Lamar, J. E.; Herbert, H. E.
1982-01-01
The latest production version, MARK IV, of the NASA-Langley vortex lattice computer program is summarized. All viable subcritical aerodynamic features of previous versions were retained. This version extends the previously documented program capabilities to four planforms, 400 panels, and enables the user to obtain vortex-flow aerodynamics on cambered planforms, flowfield properties off the configuration in attached flow, and planform longitudinal load distributions.
Investigating Underlying Components of the ICT Indicators Measurement Scale: The Extended Version
ERIC Educational Resources Information Center
Akbulut, Yavuz
2009-01-01
This study aimed to investigate the underlying components constituting the extended version of the ICT Indicators Measurement Scale (ICTIMS), which was developed in 2007, and extended in the current study through the addition of 34 items. New items addressing successful ICT integration at education faculties were identified through the examination…
Study of Microburst Detection Performance during 1985 in Memphis, Tennessee.
1987-08-05
downburst into two categories depending on the outburst's horizontal scale: 1) macroburst - a large downburst with its outburst winds extending in... Macroburst. University of Chicago, 122 pp. Merritt, M.W., 1987: Microburst Divergent Outflow Algorithm, Version 2. MIT Lincoln Laboratory Weather Radar
GDF v2.0, an enhanced version of GDF
NASA Astrophysics Data System (ADS)
Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos
2007-12-01
An improved version of the function estimation program GDF is presented. The main enhancements of the new version include: multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them in 5 (out of 10) datasets. Program summary: Title of program: GDF v2.0 Catalogue identifier: ADXC_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 98 147 No. of bytes in distributed program, including test data, etc.: 2 040 684 Distribution format: tar.gz Programming language: GNU C++ Computer: The program is designed to be portable to all systems running the GNU C++ compiler Operating system: Linux, Solaris, FreeBSD RAM: 200000 bytes Classification: 4.9 Does the new version supersede the previous version?: Yes Nature of problem: The technique of function estimation tries to discover from a series of input data a functional form that best describes them. This can be performed with the use of parametric models, whose parameters can adapt according to the input data. Solution method: Functional forms are created by genetic programming as approximations for the symbolic regression problem. Reasons for new version: The GDF package was extended in order to be more flexible and user-customizable than the old package. The user can extend the package by defining his own error functions, and he can extend the grammar of the package by adding new functions to the function repertoire.
Also, the new version can perform function estimation of multi-output functions and it can be used for classification problems. Summary of revisions: The following features have been added to the package GDF: Multi-output function approximation. The package can now approximate any function f:R^n→R^m. This feature also gives the package the capability of performing classification, not only regression. User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package. This feature is limited to 3 functions, but this number can easily be increased. Capability of selecting the error function. The package now offers the user, apart from the mean square error, other error functions such as the mean absolute square error and the maximum square error. Also, user-defined error functions can be added to the set of error functions. More verbose output. The main program displays more information to the user, as well as the default values of the parameters. The package also gives the user the capability to define an output file, where the output of the gdf program for the testing set will be stored after the termination of the process. Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code. Running time: Depends on the training data.
Ito, Shinya; Hansen, Michael E.; Heiland, Randy; Lumsdaine, Andrew; Litke, Alan M.; Beggs, John M.
2011-01-01
Transfer entropy (TE) is an information-theoretic measure which has received recent attention in neuroscience for its potential to identify effective connectivity between neurons. Calculating TE for large ensembles of spiking neurons is computationally intensive, and has caused most investigators to probe neural interactions at only a single time delay and at a message length of only a single time bin. This is problematic, as synaptic delays between cortical neurons, for example, range from one to tens of milliseconds. In addition, neurons produce bursts of spikes spanning multiple time bins. To address these issues, here we introduce a free software package that allows TE to be measured at multiple delays and message lengths. To assess performance, we applied these extensions of TE to a spiking cortical network model (Izhikevich, 2006) with known connectivity and a range of synaptic delays. For comparison, we also investigated single-delay TE, at a message length of one bin (D1TE), and cross-correlation (CC) methods. We found that D1TE could identify 36% of true connections when evaluated at a false positive rate of 1%. For extended versions of TE, this dramatically improved to 73% of true connections. In addition, the connections correctly identified by extended versions of TE accounted for 85% of the total synaptic weight in the network. Cross correlation methods generally performed more poorly than extended TE, but were useful when data length was short. A computational performance analysis demonstrated that the algorithm for extended TE, when used on currently available desktop computers, could extract effective connectivity from 1 hr recordings containing 200 neurons in ∼5 min. We conclude that extending TE to multiple delays and message lengths improves its ability to assess effective connectivity between spiking neurons. These extensions to TE soon could become practical tools for experimentalists who record hundreds of spiking neurons. PMID:22102894
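The single-delay, single-bin transfer entropy (D1TE) evaluated above can be illustrated with a minimal plug-in estimator for binary spike trains. This Python sketch implements the textbook definition for history length 1, not the optimized multi-delay package described in the abstract; the spike trains are synthetic:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, delay=1):
    """Plug-in transfer entropy (bits) from binary train x to y,
    history length 1, at the given source-target delay."""
    # Triples (y_{t+1}, y_t, x_{t+1-delay}) over all valid time bins t.
    triples = [(y[t + 1], y[t], x[t + 1 - delay])
               for t in range(delay - 1, len(y) - 1)]
    n = len(triples)
    joint = Counter(triples)                       # counts of (y', y, x)
    c_yy = Counter((a, b) for a, b, _ in triples)  # counts of (y', y)
    c_yx = Counter((b, c) for _, b, c in triples)  # counts of (y, x)
    c_y = Counter(b for _, b, _ in triples)        # counts of y
    te = 0.0
    for (a, b, c), cnt in joint.items():
        # p(y'|y,x) / p(y'|y), written with raw counts (the n's cancel)
        te += (cnt / n) * log2(cnt * c_y[b] / (c_yy[(a, b)] * c_yx[(b, c)]))
    return te

# y copies x with a one-bin lag, so x predicts y beyond y's own past.
x = [0, 1, 0, 0, 1, 1, 0, 1] * 25
y = [0] + x[:-1]
print(transfer_entropy(x, y, delay=1))             # clearly positive
print(transfer_entropy([0] * len(x), y, delay=1))  # 0.0: constant source adds nothing
```

Scanning `delay` over a range of values is exactly the extension the abstract argues for, since a fixed delay can miss connections with longer synaptic latencies.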
Study of fault tolerant software technology for dynamic systems
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Zacharias, G. L.
1985-01-01
The major aim of this study is to investigate the feasibility of using systems-based failure detection isolation and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. The feasibility of using fault-tolerant software in flight software is then investigated; in particular, possible system and version instabilities, and functional performance degradation that may occur in N-Version programming applications to flight software, are illustrated. Finally, a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
A new version of Visual tool for estimating the fractal dimension of images
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.
2010-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points stored in bitmap files. The application was extended to work also with comma-separated values files and three-dimensional images. New version program summary: Program title: Fractal Analysis v02 Catalogue identifier: AEEG_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 9999 No. of bytes in distributed program, including test data, etc.: 4 366 783 Distribution format: tar.gz Programming language: MS Visual Basic 6.0 Computer: PC Operating system: MS Windows 98 or later RAM: 30 M Classification: 14 Catalogue identifier of previous version: AEEG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999 Does the new version supersede the previous version?: Yes Nature of problem: Estimating the fractal dimension of 2D and 3D images. Solution method: Optimized implementation of the box-counting algorithm. Reasons for new version: The previous version was limited to bitmap image files. The new application was extended in order to work with objects stored in comma-separated values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text file format); fewer resources consumed and improved performance (only the information of interest, the "black points", is stored); higher resolution (the point coordinates are loaded into Visual Basic double variables [2]); and the possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket).
In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case. Summary of revisions: The application interface was changed from SDI (single-document interface) to MDI (multi-document interface). One form was added in order to provide a graphical user interface for the new functionalities (fractal analysis of 2D and 3D images stored in csv files). Additional comments: User-friendly graphical interface; easy deployment mechanism. Running time: To a first approximation, the algorithm is linear. References: [1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001. [2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
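The box-counting method underlying the program can be sketched in a few lines. The following Python example is an independent illustration (not the Visual Basic code, and unoptimized): it generates a Sierpinski gasket with the chaos game and estimates its dimension as the slope of log N(eps) versus log(1/eps):

```python
import numpy as np

def box_counting_dimension(points, eps_list):
    """Slope of log N(eps) versus log(1/eps), where N(eps) is the number
    of grid boxes of side eps occupied by at least one point."""
    points = np.asarray(points, dtype=float)
    counts = [len({tuple(ix) for ix in np.floor(points / eps).astype(int)})
              for eps in eps_list]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_list)), np.log(counts), 1)
    return slope

# Sierpinski gasket via the chaos game; its dimension is log 3 / log 2 ≈ 1.585.
rng = np.random.default_rng(0)
v = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
p = np.zeros(2)
pts = []
for _ in range(20000):
    p = (p + v[rng.integers(3)]) / 2  # jump halfway toward a random vertex
    pts.append(p.copy())
dim = box_counting_dimension(pts, [1/4, 1/8, 1/16, 1/32])
print(round(dim, 2))  # close to 1.585
```

Extending this to 3D, as the new program version does, only changes the dimensionality of the points; the box-index set construction is unchanged.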
Kennedy, David O; Scholey, Andrew B
2004-06-01
Effects of a combination of caffeine and glucose were assessed in two double-blind, placebo-controlled, cross-over studies during extended performance of cognitively demanding tasks. In the first study, 30 participants received two drinks containing carbohydrate and caffeine (68 g/38 mg; 68 g/46 mg, respectively) and a placebo drink, in counter-balanced order, on separate days. In the second study 26 participants received a drink containing 60 g of carbohydrate and 33 mg of caffeine and a placebo drink. In both studies, participants completed a 10-min battery of tasks comprising 2-min versions of Serial 3s and Serial 7s subtraction tasks and a 5-min version of the Rapid Visual Information Processing task (RVIP), plus a rating of 'mental fatigue', once before the drink and six times in succession commencing 10 min after its consumption. In comparison to placebo, all three active drinks improved the accuracy of RVIP performance, and both the drink with the higher level of caffeine in the first study and the active drink in the second study resulted in lower ratings of mental fatigue. These results indicate that a combination of caffeine and glucose can ameliorate deficits in cognitive performance and subjective fatigue during extended periods of cognitive demand.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian; Brightwell, Ronald B.; Grant, Ryan
This report presents a specification for the Portals 4 network programming interface. Portals 4 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4 is well suited to massively parallel processing and embedded systems. Portals 4 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
The Portals 4.0 network programming interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin
2012-11-01
This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
Stimulus discriminability may bias value-based probabilistic learning.
Schutte, Iris; Slagter, Heleen A; Collins, Anne G E; Frank, Michael J; Kenemans, J Leon
2017-01-01
Reinforcement learning tasks are often used to assess participants' tendency to learn more from the positive or more from the negative consequences of their actions. However, this assessment often requires comparison of learning performance across different task conditions, which may differ in the relative salience or discriminability of the stimuli associated with more and less rewarding outcomes, respectively. To address this issue, in a first set of studies, participants were subjected to two versions of a common probabilistic learning task. The two versions differed with respect to the stimulus (Hiragana) characters associated with reward probability. The assignment of characters to reward probabilities was fixed within version but reversed between versions. We found that performance was highly influenced by task version, which could be explained by the relative perceptual discriminability of characters assigned to high or low reward probabilities, as assessed by a separate discrimination experiment. Participants were more reliable in selecting rewarding characters that were more discriminable, leading to differences in learning curves and their sensitivity to reward probability. This difference in experienced reinforcement history was accompanied by performance biases in a test phase assessing the ability to learn from positive vs. negative outcomes. In a subsequent large-scale web-based experiment, this impact of task version on learning and test measures was replicated and extended. Collectively, these findings imply a key role for perceptual factors in guiding reward learning and underscore the need to control stimulus discriminability when making inferences about individual differences in reinforcement learning.
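Learning asymmetries of the kind probed in the test phase are commonly modeled with separate learning rates for positive and negative prediction errors. The following Python sketch is a generic illustration under that assumption, not the task or model used in the study; the reward probabilities and greedy choice rule are invented for the example:

```python
import random

def run_agent(alpha_gain, alpha_loss, p_reward=(0.8, 0.2), trials=500, seed=1):
    """Greedy Q-learner on a two-armed task with separate learning rates
    for positive and negative reward prediction errors."""
    rng = random.Random(seed)
    q = [0.5, 0.5]
    picks_best = 0
    for _ in range(trials):
        a = 0 if q[0] >= q[1] else 1                 # greedy choice; arm 0 pays off more
        r = 1.0 if rng.random() < p_reward[a] else 0.0
        delta = r - q[a]                             # reward prediction error
        q[a] += (alpha_gain if delta > 0 else alpha_loss) * delta
        picks_best += (a == 0)
    return q, picks_best / trials

q, frac = run_agent(alpha_gain=0.1, alpha_loss=0.1)
print(frac)  # the agent mostly settles on the rewarding arm
```

Fitting `alpha_gain` and `alpha_loss` separately to choice data is one standard way to quantify a bias toward learning from positive vs. negative outcomes; the study's point is that stimulus discriminability can masquerade as such a bias.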
Identifying Shortcomings in the Measurement of Service Quality.
ERIC Educational Resources Information Center
Fogarty, Gerard; Catts, R.; Forlin, C.
2000-01-01
Studied the use of SERVPERF, the performance component of the Service Quality Scale (SERVQUAL), in 2 studies involving 113 and 212 customers of businesses in Australia and investigated a revised (extended) version of SERVPERF with 122 customers. Results suggest that SERVPERF items are too easy to rate highly, and that the revisions did not overcome…
Modified NASA-Lewis chemical equilibrium code for MHD applications
NASA Technical Reports Server (NTRS)
Sacks, R. A.; Geyer, H. K.; Grammel, S. J.; Doss, E. D.
1979-01-01
A substantially modified version of the NASA-Lewis Chemical Equilibrium Code was recently developed. The modifications were designed to extend the power and convenience of the Code as a tool for performing combustor analysis for MHD systems studies. The effect of the programming details is described from a user point of view.
The Portals 4.0.1 network programming interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin
2013-04-01
This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaption of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.
Extended version of the "Sniffin' Sticks" identification test: test-retest reliability and validity.
Sorokowska, A; Albrecht, E; Haehner, A; Hummel, T
2015-03-30
The extended, 32-item version of the Sniffin' Sticks identification test was developed in order to create a precise tool enabling repeated, longitudinal testing of individual olfactory subfunctions. Odors of the previous test version had to be changed for technical reasons, and the odor identification test needed re-investigation in terms of reliability, validity, and normative values. In our study we investigated olfactory abilities of a group of 100 patients with olfactory dysfunction and 100 controls. We reconfirmed the high test-retest reliability of the extended version of the Sniffin' Sticks identification test and high correlations between the new and the original part of this tool. In addition, we confirmed the validity of the test as it discriminated clearly between controls and patients with olfactory loss. The additional set of 16 odor identification sticks can be either included in the current olfactory test, thus creating a more detailed diagnosis tool, or it can be used separately, enabling to follow olfactory function over time. Additionally, the normative values presented in our paper might provide useful guidelines for interpretation of the extended identification test results. The revised version of the Sniffin' Sticks 32-item odor identification test is a reliable and valid tool for the assessment of olfactory function. Copyright © 2015 Elsevier B.V. All rights reserved.
2008-07-07
analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5, and it yields interactive performance on a... script and captures output from MATLAB's "regress" and "stepwisefit" utilities, which perform simple and stepwise regression, respectively. The MATLAB... Statistical Association, vol. 85, no. 411, pp. 664-675, 1990. [9] H. Hauser, F. Ledermann, and H. Doleisch, "Angular brushing of extended parallel coordinates
ERIC Educational Resources Information Center
Spielman, Jennifer; Ramig, Lorraine O.; Mahler, Leslie; Halpern, Angela; Gavin, William J.
2007-01-01
Purpose: The present study examined vocal SPL, voice handicap, and speech characteristics in Parkinson's disease (PD) following an extended version of the Lee Silverman Voice Treatment (LSVT), to help determine whether current treatment dosages can be altered without compromising clinical outcomes. Method: Twelve participants with idiopathic PD…
Regional yield predictions of malting barley by remote sensing and ancillary data
NASA Astrophysics Data System (ADS)
Weissteiner, Christof J.; Braun, Matthias; Kuehbauch, Walter
2004-02-01
Yield forecasts are of high interest to the malting and brewing industry, since they allow the most convenient purchasing policy for raw materials. In this investigation, malting barley (Hordeum vulgare L.) yield forecasts were performed for typical growing regions in South-Western Germany. Multisensor and multitemporal remote sensing data on the one hand, and ancillary meteorological, agrostatistical, topographical and pedological data on the other, were used as input to prediction models based on an empirical-statistical modeling approach. Since spring barley production depends on both acreage and yield per area, classification is needed; this was performed with a supervised multitemporal classification algorithm using optical remote sensing data (LANDSAT TM/ETM+). A comparison between a pixel-based and an object-oriented classification algorithm was carried out. The basic version of the yield estimation model was built by linear correlation of remote sensing data (NOAA-AVHRR NDVI), CORINE land cover data and agrostatistical data. In an extended version, meteorological data (temperature, precipitation, etc.) and soil data were incorporated. Both the basic and the extended prediction systems led to feasible results, depending on the selection of the time span for NDVI accumulation.
Self-consistency in the phonon space of the particle-phonon coupling model
NASA Astrophysics Data System (ADS)
Tselyaev, V.; Lyutorovich, N.; Speth, J.; Reinhard, P.-G.
2018-04-01
This paper presents a nonlinear generalization of the time blocking approximation (TBA). The TBA is one of the versions of the extended random-phase approximation (RPA) developed within the Green-function method and the particle-phonon coupling model. In the generalized version of the TBA, the self-consistency principle is extended onto the phonon space of the model. Numerical examples show that this nonlinear version of the TBA leads to convergence of results with respect to enlarging the phonon space of the model.
libgapmis: extending short-read alignments
2013-01-01
Background A wide variety of short-read alignment programmes have been published recently to tackle the problem of mapping millions of short reads to a reference genome, focusing on different aspects of the procedure such as time and memory efficiency, sensitivity, and accuracy. These tools allow for a small number of mismatches in the alignment; however, their ability to allow for gaps varies greatly, with many performing poorly or not allowing them at all. The seed-and-extend strategy is applied in most short-read alignment programmes. After aligning a substring of the reference sequence against the high-quality prefix of a short read--the seed--an important problem is to find the best possible alignment between the substring of the reference sequence succeeding the seed and the remaining low-quality suffix of the read--the extend step. The fact that the reads are rather short and that the gap occurrence frequency observed in various studies is rather low suggests that aligning (parts of) those reads with a single gap is in fact desirable. Results In this article, we present libgapmis, a library for extending pairwise short-read alignments. Apart from the standard CPU version, it includes ultrafast SSE- and GPU-based implementations. libgapmis is based on an algorithm computing a modified version of the traditional dynamic-programming matrix for sequence alignment. Extensive experimental results demonstrate that the functions of the CPU version provided in this library accelerate the computations by a factor of 20 compared to other programmes. The analogous SSE- and GPU-based implementations accelerate the computations by a factor of 6 and 11, respectively, compared to the CPU version. The library also provides the user the flexibility to split the read into fragments, based on the observed gap occurrence frequency and the length of the read, thereby allowing for a variable, but bounded, number of gaps in the alignment. 
Conclusions We present libgapmis, a library for extending pairwise short-read alignments. We show that libgapmis is better-suited and more efficient than existing algorithms for this task. The importance of our contribution is underlined by the fact that the provided functions may be seamlessly integrated into any short-read alignment pipeline. The open-source code of libgapmis is available at http://www.exelixis-lab.org/gapmis. PMID:24564250
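The extend step described above, alignment with at most one bounded gap, can be illustrated with a brute-force sketch. libgapmis itself uses a dynamic-programming formulation with SSE/GPU kernels; this toy version only models a gap in the read (handling a gap in the reference is symmetric), and all names are hypothetical:

```python
def extend_one_gap(ref, read, max_gap):
    """Best alignment of `read` against a prefix of `ref` allowing at most
    one contiguous gap of length <= max_gap (here: a deletion in the read,
    modeled by skipping reference characters). Returns (mismatches, gap_len).
    Brute force for clarity, not speed; overhangs are simply ignored."""
    def mismatches(a, b):
        return sum(x != y for x, y in zip(a, b))

    best = (mismatches(ref, read), 0)            # candidate with no gap at all
    for g in range(1, max_gap + 1):
        for pos in range(len(read) + 1):
            # place a gap of length g at `pos`: skip g reference characters
            aligned_ref = ref[:pos] + ref[pos + g:]
            best = min(best, (mismatches(aligned_ref[:len(read)], read), g))
    return best
```

For example, aligning the read "ACGACGT" against the reference "ACGTTACGT" finds a perfect match once a 2-character gap is opened after position 3.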
Schema Versioning for Multitemporal Relational Databases.
ERIC Educational Resources Information Center
De Castro, Cristina; Grandi, Fabio; Scalas, Maria Rita
1997-01-01
Investigates new design options for extended schema versioning support for multitemporal relational databases and discusses the improved functionalities they may provide. Outlines options and basic motivations for the new design solutions, as well as techniques for the management of the proposed schema versioning solutions, including algorithms and…
A New Improved and Extended Version of the Multicell Bacterial Simulator gro.
Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso
2017-08-18
gro is a cell programming language developed in the Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy-to-compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented have made gro a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.
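The Monod-based growth that CellNutrient implements can be illustrated with a minimal Euler-step sketch. The parameter names and update rule below are generic textbook Monod kinetics, not gro's actual API:

```python
def monod_step(biomass, nutrient, mu_max, ks, yield_coeff, dt):
    """One Euler step of Monod-limited growth with nutrient uptake.
    mu = mu_max * S / (Ks + S); uptake is growth scaled by a yield
    coefficient. Illustrative only, not CellNutrient's interface."""
    mu = mu_max * nutrient / (ks + nutrient)   # Monod specific growth rate
    growth = mu * biomass * dt
    biomass += growth
    nutrient = max(0.0, nutrient - growth / yield_coeff)
    return biomass, nutrient
```

When nutrient is plentiful the rate saturates at mu_max; when nutrient is exhausted, growth stops, which is the behavior a nutrient-uptake library needs to reproduce.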
Motivation and Design of the Sirocco Storage System Version 1.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, Matthew Leon; Ward, H. Lee; Danielson, Geoffrey Charles
Sirocco is a massively parallel, high performance storage system for the exascale era. It emphasizes client-to-client coordination, low server-side coupling, and free data movement to improve resilience and performance. Its architecture is inspired by peer-to-peer and victim-cache architectures. By leveraging these ideas, Sirocco natively supports several media types, including RAM, flash, disk, and archival storage, with automatic migration between levels. Sirocco also includes storage interfaces and support that are more advanced than typical block storage. Sirocco enables clients to efficiently use key-value storage or block-based storage with the same interface. It also provides several levels of transactional data updates within a single storage command, including full ACID-compliant updates. This transaction support extends to updating several objects within a single transaction. Further support is provided for concurrency control, enabling greater performance for workloads while providing safe concurrent modification. By pioneering these and other technologies and techniques in the storage system, Sirocco is poised to fulfill the need for a massively scalable, write-optimized storage system for exascale systems. This is version 1.0 of a document reflecting the current and planned state of Sirocco. Further versions of this document will be accessible at http://www.cs.sandia.gov/Scalable_IO/sirocco.
This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM − KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used t...
NASA Astrophysics Data System (ADS)
Monson, D. J.; Seegmiller, H. L.; McConnaughey, P. K.
1990-06-01
In this paper, experimental measurements are compared with Navier-Stokes calculations using seven different turbulence models for the internal flow in a two-dimensional U-duct. The configuration is representative of many internal flows of engineering interest that experience strong curvature. In an effort to improve agreement, this paper tests several versions of the two-equation k-epsilon turbulence model, including the standard version, an extended version with a production range time scale, and a version that includes curvature time scales. Each is tested in its high and low Reynolds number formulations. Calculations using these new models and the original mixing length model are compared here with measurements of mean and turbulence velocities, static pressure, and skin friction in the U-duct at two Reynolds numbers. The comparisons show that only the low Reynolds number version of the extended k-epsilon model does a reasonable job of predicting the important features of this flow at both Reynolds numbers tested.
On Hunting Animals of the Biometric Menagerie for Online Signature.
Houmani, Nesma; Garcia-Salicetti, Sonia
2016-01-01
Individuals behave differently with respect to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept has been illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database.
Functional Extended Redundancy Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Suk, Hye Won; Lee, Jang-Han; Moskowitz, D. S.; Lim, Jooseop
2012-01-01
We propose a functional version of extended redundancy analysis that examines directional relationships among several sets of multivariate variables. As in extended redundancy analysis, the proposed method posits that a weighted composite of each set of exogenous variables influences a set of endogenous variables. It further considers endogenous…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabello, Adan
We introduce an extended version of a previous all-versus-nothing proof of the impossibility of Einstein-Podolsky-Rosen local elements of reality for two photons entangled both in polarization and path degrees of freedom (A. Cabello, quant-ph/0507259), which leads to a Bell inequality where the classical bound is 8 and the quantum prediction is 16. A simple estimation of the detection efficiency required to close the detection loophole using this extended version gives η > 0.69. This efficiency is lower than that required for previous proposals.
NASA Technical Reports Server (NTRS)
Abramson, N.
1974-01-01
The Aloha system was studied, developed, and extended to advanced forms of computer communications networks. Theoretical and simulation studies of Aloha-type radio channels for use in packet-switched communications networks were performed. Improved versions of the Aloha communications techniques and their extensions were tested experimentally. A packet radio repeater suitable for use with the operational Aloha system network was developed. General studies of the organization of multiprocessor systems centered on the development of the BCC 500 computer were concluded.
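The classical throughput results for Aloha-type random-access channels can be sketched directly; these are the textbook formulas associated with this line of work, not figures taken from the report itself:

```python
import math

def pure_aloha_throughput(G):
    """Expected successful traffic per packet time for pure ALOHA under
    Poisson offered load G: the classical S = G * e^(-2G), peaking at
    1/(2e) ~ 0.184 when G = 0.5."""
    return G * math.exp(-2 * G)

def slotted_aloha_throughput(G):
    """Slotted ALOHA: S = G * e^(-G), peaking at 1/e ~ 0.368 when G = 1."""
    return G * math.exp(-G)
```

The factor-of-two improvement from slotting is the motivation behind many of the "improved versions of the Aloha communications techniques" the abstract refers to.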
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seyong; Vetter, Jeffrey S
Computer architecture experts expect that non-volatile memory (NVM) hierarchies will play a more significant role in future systems, including mobile, enterprise, and HPC architectures. With this expectation in mind, we present NVL-C: a novel programming system that facilitates the efficient and correct programming of NVM main memory systems. The NVL-C programming abstraction extends C with a small set of intuitive language features that target NVM main memory, and can be combined directly with traditional C memory model features for DRAM. We have designed these new features to enable compiler analyses and run-time checks that can improve performance and guard against a number of subtle programming errors, which, when left uncorrected, can corrupt NVM-stored data. Moreover, to enable recovery of data across application or system failures, these NVL-C features include a flexible directive for specifying NVM transactions. So that our implementation might be extended to other compiler front ends and languages, the majority of our compiler analyses are implemented in an extended version of LLVM's intermediate representation (LLVM IR). We evaluate NVL-C on a number of applications to show its flexibility, performance, and correctness.
Gobba, F; Ghersi, R; Martinelli, Simona; Richeldi, Arianna; Clerici, Piera; Grazioli, P
2008-01-01
Data on self-reported symptoms and/or functional impairments are important in research on work-related musculoskeletal disorders (WRMSDs). The availability of international standardized questionnaires is therefore extremely important, since they permit comparison of studies performed in different countries. The aim was the translation into Italian and validation of the Nordic Musculoskeletal Questionnaire (NMQ), a tool widely used in studies on WRMSDs in the international scientific literature. The extended Canadian version of the NMQ was translated into Italian. As per current guidelines, the cross-cultural adaptation was performed by translation of the items from French, back-translation by independent mother-tongue translators, and committee review. The resulting version of the questionnaire underwent pre-testing in 3 independent groups of subjects. Comprehensibility, reliability (internal consistency and reproducibility), and sensitivity were evaluated. After translation/back-translation and review of the items, the comprehensibility of the Italian version of the questionnaire was judged good in a group of 40 workers. Internal consistency was evaluated using Cronbach's alpha in the same group and in another 98 engineering workers: the results were on the whole acceptable. Reproducibility, tested with Cohen's kappa in the 40 workers, was good to excellent. In a preliminary evaluation, performed in 30 outpatients of a Rehabilitation Service, sensitivity was very good. The results show that the Italian version of the Nordic Musculoskeletal Questionnaire is valid for self-administration and can be applied in field studies on self-reported musculoskeletal symptoms and functional impairments in groups of workers.
Management of Object Histories in the SWALLOW Repository,
1980-07-01
time of this future version. Since the end time of the current version should not be automatically extended up to the start time of the token until...and T is determined by the speed with which the available online version storage fills up. Unfortunately, since versions of different objects are...of these images is accessible by following the chain of pointers in the object history. The other images use up storage, but do not have an adverse
Implementing Journaling in a Linux Shared Disk File System
NASA Technical Reports Server (NTRS)
Preslan, Kenneth W.; Barry, Andrew; Brassow, Jonathan; Cattelan, Russell; Manthei, Adam; Nygaard, Erling; VanOort, Seth; Teigland, David; Tilstra, Mike; O'Keefe, Matthew;
2000-01-01
In computer systems today, speed and responsiveness are often determined by network and storage subsystem performance. Faster, more scalable networking interfaces like Fibre Channel and Gigabit Ethernet provide the scaffolding from which higher performance computer systems implementations may be constructed, but new thinking is required about how machines interact with network-enabled storage devices. In this paper we describe how we implemented journaling in the Global File System (GFS), a shared-disk, cluster file system for Linux. Our previous three papers on GFS at the Mass Storage Symposium discussed our first three GFS implementations, their performance, and the lessons learned. Our fourth paper describes, appropriately enough, the evolution of GFS version 3 to version 4, which supports journaling and recovery from client failures. In addition, GFS scalability tests extending to 8 machines accessing 8 4-disk enclosures were conducted: these tests showed good scaling. We describe the GFS cluster infrastructure, which is necessary for proper recovery from machine and disk failures in a collection of machines sharing disks using GFS. Finally, we discuss the suitability of Linux for handling the big data requirements of supercomputing centers.
NASA Astrophysics Data System (ADS)
Miyakawa, Tomoki
2017-04-01
The global cloud/cloud-system resolving model NICAM and its new fully-coupled version NICOCO are run on one of the world's top-tier supercomputers, the K computer. NICOCO couples the full-3D ocean component COCO of the general circulation model MIROC using the general-purpose coupler Jcup. We carried out multiple MJO simulations using NICAM and the new ocean-coupled version NICOCO to examine their extended-range MJO prediction skills and the impact of ocean coupling. NICAM performs excellently in terms of MJO prediction, maintaining a valid skill up to 27 days after the model is initialized (Miyakawa et al. 2014). As is the case in most global models, ocean coupling frees the model from being anchored by the observed SST and allows the model climate to drift further from reality compared to the atmospheric version of the model. Thus, it is important to evaluate the model bias, and in an initial value problem such as seasonal extended-range prediction, it is essential to be able to distinguish the actual signal from the early transition of the model from the observed state to its own climatology. Since NICAM is a highly resource-demanding model, evaluation and tuning of the model climatology (on the order of years) is challenging. Here we focus on the initial 100 days to estimate the early drift of the model, and subsequently evaluate the MJO prediction skills of NICOCO. Results show that in the initial 100 days, NICOCO forms a La Niña-like SST bias relative to observations, with a warmer Maritime Continent warm pool and a cooler equatorial central Pacific. The enhanced convection over the Maritime Continent associated with this bias projects onto the real-time multivariate MJO indices (RMM; Wheeler and Hendon 2004) and contaminates the MJO skill score. However, the bias does not appear to severely degrade the MJO signal. 
The model maintains a valid MJO prediction skill up to nearly 4 weeks when evaluated after linearly removing the early drift component estimated from the 54 simulations. Furthermore, NICOCO outperforms NICAM by far if we focus on events associated with large oceanic signals.
BRST symmetry for a torus knot
NASA Astrophysics Data System (ADS)
Pandey, Vipul Kumar; Prasad Mandal, Bhabani
2017-08-01
We develop BRST symmetry for the first time for a particle on the surface of a torus knot by analyzing the constraints of the system. The theory contains second-class constraints and has been extended by introducing the Wess-Zumino term to convert it into a theory with first-class constraints. BFV analysis of the extended theory is performed to construct BRST/anti-BRST symmetries for the particle on a torus knot. The nilpotent BRST/anti-BRST charges which generate such symmetries are constructed explicitly. The states annihilated by these nilpotent charges constitute the physical Hilbert space. We indicate how various effective theories on the surface of the torus knot are related through the generalized version of the BRST transformation with finite-field-dependent parameters.
Estimation and enhancement of real-time software reliability through mutation analysis
NASA Technical Reports Server (NTRS)
Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.
1992-01-01
A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.
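The idea of estimating N-version reliability under correlated failures can be sketched with a toy Monte Carlo model. The coupling via a shared "difficulty" draw below is an illustrative stand-in for the paper's correlated sampling of module execution times in a stochastic Petri net; all parameters are hypothetical:

```python
import random

def simulate_nversion(n_trials, p_base, rho, seed=0):
    """Monte Carlo reliability of a 3-version majority voter.
    Each trial draws a shared 'difficulty' that couples the three
    versions' failure probabilities; rho in [0, 1] sets the coupling.
    Illustrative model only, not the paper's Petri-net formulation."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_trials):
        difficulty = rng.random()            # common input difficulty
        fails = 0
        for _ in range(3):
            # blend an independent failure rate with a shared one
            p = (1 - rho) * p_base + rho * (p_base * 2 * difficulty)
            if rng.random() < p:
                fails += 1
        if fails <= 1:                       # majority vote still correct
            successes += 1
    return successes / n_trials
```

Running it with rho = 0 recovers the independent-failures estimate; raising rho makes simultaneous failures more likely, which is exactly the effect that undermines the "independence among N versions" assumption discussed in the abstract.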
An all-FORTRAN version of NASTRAN for the VAX
NASA Technical Reports Server (NTRS)
Purves, L.
1981-01-01
An all-FORTRAN version of the NASA structural analysis program NASTRAN has been implemented on the DEC VAX-series computer. Applications of NASTRAN extend to almost every type of linear structure and construction. Two special features are available in the VAX version: the program can be executed from a terminal in a manner permitting use of the VAX interactive debugger, and links can be interactively restarted when desired by first making a copy of all NASTRAN work files.
FUB at TREC 2008 Relevance Feedback Track: Extending Rocchio with Distributional Term Analysis
2008-11-01
starting point is the improved version [Salton and Buckley 1990] of the original Rocchio formula [Rocchio 1971]: newQ = α · origQ + (β/|R|) ∑_{r∈R} r − γ...earlier studies about the low effect of the main relevance feedback parameters on retrieval performance (e.g., Salton and Buckley 1990), while they seem...Relevance feedback in information retrieval. In The SMART retrieval system - experiments in automatic document processing, Salton, G., Ed., Prentice Hall
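The Rocchio update quoted above can be sketched directly. The weights and the non-negativity clipping below are conventional textbook defaults, not the FUB system's tuned values:

```python
def rocchio(orig_q, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classical Rocchio feedback: newQ = alpha*origQ
    + (beta/|R|) * sum of relevant vectors
    - (gamma/|NR|) * sum of non-relevant vectors.
    Vectors are plain lists of term weights."""
    def centroid(vectors, dim):
        if not vectors:
            return [0.0] * dim
        return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

    dim = len(orig_q)
    rel_c = centroid(relevant, dim)
    non_c = centroid(nonrelevant, dim)
    # negative term weights are conventionally clipped to zero
    return [max(0.0, alpha * q + beta * r - gamma * s)
            for q, r, s in zip(orig_q, rel_c, non_c)]
```

The distributional term analysis in the track submission replaces or reweights the relevant-centroid term; the skeleton of the update stays the same.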
NASA Technical Reports Server (NTRS)
Craidon, C. B.
1983-01-01
A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero-lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept. Highly accurate wave drag analyses can now be performed, because complex geometries can be represented accurately and no longer have to be modified to meet the requirements of a restricted input format.
2012-12-01
month no-cost extension for this study was approved on 7 November 2012, extending study activities through December 2013. A modified statement of work...approved as part of the no-cost extension and currently pending approval by the USARIEM HURC (Amendment #14), is presented in Table 2...time. SAS 9.2 was used to perform a mixed model analysis with a random individual intercept to account for
Luck, Tobias; Pabst, Alexander; Rodriguez, Francisca S; Schroeter, Matthias L; Witte, Veronica; Hinz, Andreas; Mehnert, Anja; Engel, Christoph; Loeffler, Markus; Thiery, Joachim; Villringer, Arno; Riedel-Heller, Steffi G
2018-05-01
To provide new age-, sex-, and education-specific reference values for an extended version of the well-established Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological Assessment Battery (CERAD-NAB) that additionally includes the Trail Making Test and the Verbal Fluency Test-S-Words. Norms were calculated based on the cognitive performances of n = 1,888 dementia-free participants (60-79 years) from the population-based German LIFE-Adult-Study. Multiple regressions were used to examine the association of the CERAD-NAB scores with age, sex, and education. In order to calculate the norms, quantile and censored quantile regression analyses were performed estimating marginal means of the test scores at 2.28, 6.68, 10, 15.87, 25, 50, 75, and 90 percentiles for age-, sex-, and education-specific subgroups. Multiple regression analyses revealed that younger age was significantly associated with better cognitive performance in 15 CERAD-NAB measures and higher education with better cognitive performance in all 17 measures. Women performed significantly better than men in 12 measures and men than women in four measures. The determined norms indicate ceiling effects for the cognitive performances in the Boston Naming, Word List Recognition, Constructional Praxis Copying, and Constructional Praxis Recall tests. The new norms for the extended CERAD-NAB will be useful for evaluating dementia-free German-speaking adults in a broad variety of relevant cognitive domains. The extended CERAD-NAB follows more closely the criteria for the new DSM-5 Mild and Major Neurocognitive Disorder. Additionally, it could be further developed to include a test for social cognition. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
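Percentile-based normative tables of this kind can be sketched with a plain empirical-percentile computation over one demographic cell. This ignores the paper's regression on age, sex, and education (and the censoring correction) and is purely illustrative:

```python
# The percentile grid used for the normative tables in the study
PERCENTILES = [2.28, 6.68, 10, 15.87, 25, 50, 75, 90]

def percentile(sorted_scores, p):
    """Linear-interpolation percentile over an ascending-sorted list."""
    k = (len(sorted_scores) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(sorted_scores) - 1)
    return sorted_scores[lo] + (sorted_scores[hi] - sorted_scores[lo]) * (k - lo)

def subgroup_norms(scores):
    """Empirical norms for one age/sex/education cell: a plain-percentile
    stand-in for the paper's (censored) quantile regressions."""
    s = sorted(scores)
    return {p: percentile(s, p) for p in PERCENTILES}
```

A raw test score for a new patient is then located against the table for the matching demographic cell; the regression approach in the paper does the same thing while borrowing strength across cells.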
Applied estimation for hybrid dynamical systems using perceptional information
NASA Astrophysics Data System (ADS)
Plotnik, Aaron M.
This dissertation uses the motivating example of robotic tracking of mobile deep ocean animals to present innovations in robotic perception and estimation for hybrid dynamical systems. An approach to estimation for hybrid systems is presented that utilizes uncertain perceptional information about the system's mode to improve tracking of its mode and continuous states. This results in significant improvements in situations where previously reported methods of estimation for hybrid systems perform poorly due to poor distinguishability of the modes. The specific application that motivates this research is an automatic underwater robotic observation system that follows and films individual deep ocean animals. A first version of such a system has been developed jointly by the Stanford Aerospace Robotics Laboratory and Monterey Bay Aquarium Research Institute (MBARI). This robotic observation system is successfully fielded on MBARI's ROVs, but agile specimens often evade the system. When a human ROV pilot performs this task, one advantage that he has over the robotic observation system in these situations is the ability to use visual perceptional information about the target, immediately recognizing any changes in the specimen's behavior mode. With the approach of the human pilot in mind, a new version of the robotic observation system is proposed which is extended to (a) derive perceptional information (visual cues) about the behavior mode of the tracked specimen, and (b) merge this dissimilar, discrete and uncertain information with more traditional continuous noisy sensor data by extending existing algorithms for hybrid estimation. These performance enhancements are enabled by integrating techniques in hybrid estimation, computer vision and machine learning. First, real-time computer vision and classification algorithms extract a visual observation of the target's behavior mode. 
Existing hybrid estimation algorithms are extended to admit this uncertain but discrete observation, complementing the information available from more traditional sensors. State tracking is achieved using a new form of Rao-Blackwellized particle filter called the mode-observed Gaussian Particle Filter. Performance is demonstrated using data from simulation and data collected on actual specimens in the ocean. The framework for estimation using both traditional and perceptional information is easily extensible to other stochastic hybrid systems with mode-related perceptional observations available.
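Folding an uncertain discrete mode observation into particle weights can be sketched as a likelihood reweighting through a classifier confusion matrix. This is a generic illustration of the idea, not the dissertation's mode-observed Gaussian Particle Filter; all names are hypothetical:

```python
def reweight_by_mode_obs(particles, weights, observed_mode, confusion):
    """Multiply each particle's weight by P(observed_mode | particle's mode),
    read from a classifier confusion matrix, then renormalize.
    `particles` are dicts with a "mode" key; `confusion[true][observed]`
    gives the classifier's observation likelihood."""
    new_w = [w * confusion[p["mode"]][observed_mode]
             for p, w in zip(particles, weights)]
    total = sum(new_w)
    # if the observation is impossible under every particle, keep old weights
    return [w / total for w in new_w] if total > 0 else list(weights)
```

Particles whose hypothesized behavior mode agrees with the visual cue gain weight, which is what improves mode tracking when the continuous dynamics alone leave the modes poorly distinguishable.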
70 Years of Making the World Safer: Extended
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Extended version with narration. This video shows our roles in making the world safer — working to end World War II, providing stable isotopes for research, providing unique precision manufacturing capabilities, and meeting nonproliferation and global security missions.
Toward extending the educational interpreter performance assessment to cued speech.
Krause, Jean C; Kegl, Judy A; Schick, Brenda
2008-01-01
The Educational Interpreter Performance Assessment (EIPA) is an important research tool for examining the quality of interpreters who use American Sign Language or a sign system in classroom settings, but it is not currently applicable to educational interpreters who use Cued Speech (CS). In order to determine the feasibility of extending the EIPA to include CS, a pilot EIPA test was developed and administered to 24 educational CS interpreters. Fifteen of the interpreters' performances were evaluated two to three times in order to assess reliability. Results show that the instrument has good construct validity and test-retest reliability. Although more interrater reliability data are needed, intrarater reliability was quite high (0.9), suggesting that the pilot test can be rated as reliably as signing versions of the EIPA. Notably, only 48% of interpreters who formally participated in pilot testing performed at a level that could be considered minimally acceptable. In light of similar performance levels previously reported for interpreters who sign (e.g., Schick, Williams, & Kupermintz, 2006), these results suggest that interpreting services for deaf and hard-of-hearing students, regardless of the communication option used, are often inadequate and could seriously hinder access to the classroom environment.
AN OVERVIEW OF EPANET VERSION 3.0
EPANET is a widely used public domain software package for modeling the hydraulic and water quality behavior of water distribution systems over an extended period of time. The last major update to the code was version 2.0 released in 2000 (Rossman, 2000). Since that time there ha...
NASA Astrophysics Data System (ADS)
Bird, Adam; Murphy, Christophe; Dobson, Geoff
2017-09-01
RANKERN 16 is the latest version of the point-kernel gamma radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS Software Service. RANKERN is well established in the UK shielding community for radiation shielding and dosimetry assessments. Many important developments have been made available to users in this latest release of RANKERN. The existing general 3D geometry capability has been extended to include import of CAD files in the IGES format providing efficient full CAD modelling capability without geometric approximation. Import of tetrahedral mesh and polygon surface formats has also been provided. An efficient voxel geometry type has been added suitable for representing CT data. There have been numerous input syntax enhancements and an extended actinide gamma source library. This paper describes some of the new features and compares the performance of the new geometry capabilities.
CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, III, F. G.
2016-07-29
One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results (the code developers have provided validation test results as part of their code QA documentation); and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes.
Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.
NASA Technical Reports Server (NTRS)
Conger, A. M.; Hancock, D. W.; Hayne, G. S.; Brooks, R. L.
2008-01-01
The purpose of this document is to present and document GEOSAT Follow-On (GFO) performance analyses and results. This is the eighth Assessment Report since the initial report. This report extends the performance assessment since acceptance to 27 December 2007. Since launch, a variety of GFO performance studies have been performed; Appendix A provides a cumulative index of those studies. We began the inclusion of analyses of the JASON altimeter after the end of the Topographic Experiment (TOPEX) mission. Prior to this, JASON and TOPEX were compared during our assessment of the TOPEX altimeter. With the end of the TOPEX mission, we developed methods to report on JASON as it relates to GFO.
Light transport feature for SCINFUL.
Etaati, G R; Ghal-Eh, N
2008-03-01
An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.
Damarell, Raechel A; Tieman, Jennifer J; Sladek, Ruth M
2013-07-02
PubMed translations of OvidSP Medline search filters offer searchers improved ease of access. They may also facilitate access to PubMed's unique content, including citations for the most recently published biomedical evidence. Retrieving this content requires a search strategy comprising natural language terms ('textwords'), rather than Medical Subject Headings (MeSH). We describe a reproducible methodology that uses a validated PubMed search filter translation to create a textword-only strategy to extend retrieval to PubMed's unique heart failure literature. We translated an OvidSP Medline heart failure search filter for PubMed and established version equivalence in terms of indexed literature retrieval. The PubMed version was then run within PubMed to identify citations retrieved by the filter's MeSH terms (Heart failure, Left ventricular dysfunction, and Cardiomyopathy). It was then rerun with the same MeSH terms restricted to searching on title and abstract fields (i.e. as 'textwords'). Citations retrieved by the MeSH search but not the textword search were isolated. Frequency analysis of their titles/abstracts identified natural language alternatives for those MeSH terms that performed less effectively as textwords. These terms were tested in combination to determine the best performing search string for reclaiming this 'lost set'. This string, restricted to searching on PubMed's unique content, was then combined with the validated PubMed translation to extend the filter's performance in this database. The PubMed heart failure filter retrieved 6829 citations. Of these, 834 (12%) failed to be retrieved when MeSH terms were converted to textwords. Frequency analysis of the 834 citations identified five high frequency natural language alternatives that could improve retrieval of this set (cardiac failure, cardiac resynchronization, left ventricular systolic dysfunction, left ventricular diastolic dysfunction, and LV dysfunction). 
Together these terms reclaimed 157/834 (18.8%) of lost citations. MeSH terms facilitate precise searching in PubMed's indexed subset. They may, however, work less effectively as search terms prior to subject indexing. A validated PubMed search filter can be used to develop a supplementary textword-only search strategy to extend retrieval to PubMed's unique content. A PubMed heart failure search filter is available on the CareSearch website (http://www.caresearch.com.au) providing access to both indexed and non-indexed heart failure evidence.
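As an informal sketch (not the authors' actual search code) of how such a textword-only strategy can be assembled: the term lists below are taken from the abstract, and [tiab] is PubMed's real title/abstract field tag.

```python
# Sketch of the textword-only strategy described above: the filter's MeSH
# concepts are searched as title/abstract phrases ([tiab]) and OR-ed with the
# five high-frequency natural-language alternatives the study identified.

mesh_as_textwords = [
    "heart failure",
    "left ventricular dysfunction",
    "cardiomyopathy",
]
natural_language_terms = [
    "cardiac failure",
    "cardiac resynchronization",
    "left ventricular systolic dysfunction",
    "left ventricular diastolic dysfunction",
    "LV dysfunction",
]

def tiab_clause(terms):
    """OR together phrase searches restricted to title/abstract fields."""
    return " OR ".join(f'"{t}"[tiab]' for t in terms)

query = f"({tiab_clause(mesh_as_textwords)}) OR ({tiab_clause(natural_language_terms)})"
print(query)
```

The resulting string can be pasted directly into the PubMed search box or passed to an E-utilities `esearch` call.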
Software for Automation of Real-Time Agents, Version 2
NASA Technical Reports Server (NTRS)
Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steve; Chouinard, Caroline; Engelhardt, Barbara; Wilklow, Colette; Mutz, Darren; Knight, Russell; Rabideau, Gregg;
2005-01-01
Version 2 of Closed Loop Execution and Recovery (CLEaR) has been developed. CLEaR is an artificial intelligence computer program for use in planning and execution of actions of autonomous agents, including, for example, Deep Space Network (DSN) antenna ground stations, robotic exploratory ground vehicles (rovers), robotic aircraft (UAVs), and robotic spacecraft. CLEaR automates the generation and execution of command sequences, monitoring the sequence execution, and modifying the command sequence in response to execution deviations and failures as well as new goals for the agent to achieve. The development of CLEaR has focused on the unification of planning and execution to increase the ability of the autonomous agent to perform under tight resource and time constraints, coupled with uncertainty in how much time and how many resources will be required to perform a task. This unification is realized by extending the traditional three-tier robotic control architecture, increasing the interaction between the software components that perform deliberative and reactive functions. The increase in interaction reduces the need to replan, enables earlier detection of the need to replan, and enables replanning to occur before an agent enters a state of failure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, Heather; Flach, Greg; Smith, Frank
2014-01-10
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic, and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms, including sulfate attack, carbonation, and constituent leaching. The sulfate attack analysis predicted the extent of damage that sulfate ingress will have on concrete vaults over extended time (i.e., > 1000 years), and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release, “Version 2.0”, includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version is a dual-regime module allowing evaluation of contaminant release in two regimes – both fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been performed to provide data to calibrate the models, improving the credibility of the analysis and reducing the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal.
The CBP Software Toolbox is and will continue to produce tangible benefits to the working DOE Performance Assessment (PA) community.
VizieR Online Data Catalog: Zwicky Galaxy Catalog (Zwicky+ 1968)
NASA Astrophysics Data System (ADS)
Zwicky, F.; et al.
1996-03-01
This document describes a computer version of that part of the CGCG (Zwicky et al. 1961-68) containing all the alphanumeric information for galaxies. All known errors found by Zwicky and many others are corrected, as well as erroneous quotations from other catalogs (Shapley & Ames 1932, Bigay 1951, Pettit 1954, Humason et al. 1956, Holmberg 1958). It would be an illusion, however, to assume that all errors have been found; there are misprints even in the most extended list of misprints (Paturel et al. 1991). We have compiled two files: zwigal.ori and zwigal.add. The first contains the original information from the CGCG for galaxies. The second contains the data from the above-mentioned other catalogs given in the CGCG. We have made no attempt to supply the catalog with any new information. A detailed comparison with the machine-readable version of Zwicky galaxies prepared by R.S. Hill (NSSDC ADC #7049 or CDS VII/49) was performed. Our version contains more data on individual galaxies: designation, description, magnitudes, and velocity. All galaxies in the Coma center are included. However, Hill's version contains data for Zwicky fields, the Palomar Sky Survey plate number, as well as the Mead-Luyten-Palomar number. There are 27837 different galaxies and 29418 entries in the CGCG. (2 data files).
NASA Astrophysics Data System (ADS)
Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.
2005-05-01
Until recently, scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technologic advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, not much larger than a watch, which can assess simple and choice reaction time, vigilance, and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables, including temperature, humidity, sound, and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps Officer training course, and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity, and examining the effects of nutritional manipulations on warfighter performance.
Robot education peers in a situated primary school study: Personalisation promotes child learning.
Baxter, Paul; Ashurst, Emily; Read, Robin; Kennedy, James; Belpaeme, Tony
2017-01-01
The benefit of social robots to support child learning in an educational context over an extended period of time is evaluated. Specifically, the effect of personalisation and adaptation of robot social behaviour is assessed. Two autonomous robots were embedded within two matched classrooms of a primary school for a continuous two week period without experimenter supervision to act as learning companions for the children for familiar and novel subjects. Results suggest that while children in both personalised and non-personalised conditions learned, there was increased child learning of a novel subject exhibited when interacting with a robot that personalised its behaviours, with indications that this benefit extended to other class-based performance. Additional evidence was obtained suggesting that there is increased acceptance of the personalised robot peer over a non-personalised version. These results provide the first evidence in support of peer-robot behavioural personalisation having a positive influence on learning when embedded in a learning environment for an extended period of time.
Taking Proof-Based Verified Computation a Few Steps Closer to Practicality (Extended Version)
2012-06-27
[Garbled table fragment from extraction. Recoverable content: a table of V's per-instance CPU costs, including issuing commit queries, (e + 2c) · n/β, and processing commit responses, d; a symbol legend (β: batch size in instances, §2.3; e: cost of encrypting an element in F; d: cost of decrypting an encrypted element; f: cost of multiplying in F; h: cost of ...); and truncated text on compiling constraints over a domain D (such as the integers, Z, or the rationals, Q) to equivalent constraints over a finite field.]
Documentation for the machine-readable version of the Henry Draper Catalogue (edition 1985)
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.
1985-01-01
An updated, corrected, and extended machine-readable version of the catalog is described. Published and unpublished errors discovered in the previous version were corrected; letters indicating supplemental stars in the BD have been moved to a new byte to distinguish them from double-star components; and the machine-readable portion of The Henry Draper Extension (HDE) (HA 100) was converted to the same format as the main catalog, with additional data added as necessary.
CSMP Mathematics for Kindergarten, Teacher's Guide [and] Worksheets. Final Experimental Version.
ERIC Educational Resources Information Center
Vandeputte, Christiane
This guide represents the final experimental version of an extended pilot project which was conducted in the United States between 1973 and 1976. The manner of presentation and the pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings…
USSAERO computer program development, versions B and C
NASA Technical Reports Server (NTRS)
Woodward, F. A.
1980-01-01
Versions B and C of the unified subsonic and supersonic aerodynamic analysis program, USSAERO, are described. Version B incorporates a new symmetrical singularity method to provide improved surface pressure distributions on wings in subsonic flow. Version C extends the range of application of the program to include the analysis of multiple engine nacelles or finned external stores. In addition, nonlinear compressibility effects in high subsonic and supersonic flows are approximated using a correction based on the local Mach number at panel control points. Several examples are presented comparing the results of these programs with other panel methods and experimental data.
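The abstract describes the compressibility correction only qualitatively. As a hedged illustration of the general idea, here is the classic Prandtl-Glauert rule; the local-Mach-number correction USSAERO-C actually applies may differ.

```python
import math

def prandtl_glauert(cp_incompressible, mach):
    """Classic subsonic compressibility correction: Cp = Cp0 / sqrt(1 - M^2).

    Shown only as an illustration of correcting pressure coefficients for
    compressibility; USSAERO-C's correction is based on the local Mach number
    at each panel control point and is not necessarily this formula.
    """
    if not 0.0 <= mach < 1.0:
        raise ValueError("correction is valid only for subsonic Mach numbers")
    return cp_incompressible / math.sqrt(1.0 - mach ** 2)

# Example: an incompressible Cp of -0.5 at Mach 0.6 scales to -0.625.
print(prandtl_glauert(-0.5, 0.6))
```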
PalymSys (TM): An extended version of CLIPS for construction and reasoning using blackboards
NASA Technical Reports Server (NTRS)
Bryson, Travis; Ballard, Dan
1994-01-01
This paper describes PalymSys(TM) -- an extended version of the CLIPS language that is designed to facilitate the implementation of blackboard systems. The paper first describes the general characteristics of blackboards and shows how a control blackboard architecture can be used by AI systems to examine their own behavior and adapt to real-time problem-solving situations by striking a balance between domain and control reasoning. The paper then describes the use of PalymSys in the development of a situation assessment subsystem for use aboard Army helicopters. This system performs real-time inferencing about the current battlefield situation using multiple domain blackboards as well as a control blackboard. A description of the control and domain blackboards and their implementation is presented. The paper also describes modifications made to the standard CLIPS 6.02 language in PalymSys(TM) 2.0. These include: (1) a dynamic Dempster-Shafer belief network whose structure is completely specifiable at run-time in the consequent of a PalymSys(TM) rule, (2) extension of the run command including a continuous run feature that enables the system to run even when the agenda is empty, and (3) a built-in communications link that uses shared memory to communicate with other independent processes.
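For readers unfamiliar with the belief-network machinery mentioned above, the following is a minimal sketch of Dempster's rule of combination, the core fusion operation in Dempster-Shafer reasoning. The hypothesis names and mass values are invented for illustration and are not taken from PalymSys(TM).

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.

    Each mass function maps frozenset hypotheses to a belief mass; masses
    over all focal elements of a source sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Invented example: two sensors report on whether a contact is friend or foe.
friend, foe = frozenset({"friend"}), frozenset({"foe"})
either = friend | foe
m1 = {friend: 0.6, either: 0.4}
m2 = {foe: 0.5, either: 0.5}
fused = combine(m1, m2)
```

After fusion the conflicting mass (0.3 here) is renormalized away, so the fused masses again sum to one.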
BIOLOGICAL NETWORK EXPLORATION WITH CYTOSCAPE 3
Su, Gang; Morris, John H.; Demchak, Barry; Bader, Gary D.
2014-01-01
Cytoscape is one of the most popular open-source software tools for the visual exploration of biomedical networks composed of protein, gene and other types of interactions. It offers researchers a versatile and interactive visualization interface for exploring complex biological interconnections supported by diverse annotation and experimental data, thereby facilitating research tasks such as predicting gene function and pathway construction. Cytoscape provides core functionality to load, visualize, search, filter and save networks, and hundreds of Apps extend this functionality to address specific research needs. The latest generation of Cytoscape (version 3.0 and later) has substantial improvements in function, user interface and performance relative to previous versions. This protocol aims to jump-start new users with specific protocols for basic Cytoscape functions, such as installing Cytoscape and Cytoscape Apps, loading data, visualizing and navigating the network, visualizing network associated data (attributes) and identifying clusters. It also highlights new features that benefit experienced users. PMID:25199793
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-10-25
The purpose of the eXtended MetaData Registry (XMDR) prototype is to demonstrate the feasibility and utility of constructing an extended metadata registry, i.e., one which encompasses richer classification support, facilities for including terminologies, and better support for formal specification of semantics. The prototype registry will also serve as a reference implementation for the revised versions of ISO 11179, Parts 2 and 3 to help guide production implementations.
Mapping the Martian Meteorology
NASA Technical Reports Server (NTRS)
Allison, M.; Ross, J. D.; Solomon, N.
1999-01-01
The Mars-adapted version of the NASA/GISS general circulation model (GCM) has been applied to the hourly/daily simulation of the planet's meteorology over several seasonal orbits. The current running version of the model includes a diurnal solar cycle, CO2 sublimation, and a mature parameterization of upper-level wave drag, with a vertical domain extending from the surface up to the 6-μbar level. The benchmark simulations provide a four-dimensional archive for the comparative evaluation of various schemes for the retrieval of winds from anticipated polar orbiter measurements of temperatures by the Pressure Modulator Infrared Radiometer. Additional information is contained in the original extended abstract.
NASA Technical Reports Server (NTRS)
Kazerooni, H.
1991-01-01
A human's ability to perform physical tasks is limited, not only by his intelligence, but by his physical strength. If, in an appropriate environment, a machine's mechanical power is closely integrated with a human arm's mechanical power under the control of the human intellect, the resulting system will be superior to a loosely integrated combination of a human and a fully automated robot. Therefore, we must develop a fundamental solution to the problem of 'extending' human mechanical power. The work presented here defines 'extenders' as a class of robot manipulators worn by humans to increase human mechanical strength, while the wearer's intellect remains the central control system for manipulating the extender. The human, in physical contact with the extender, exchanges power and information signals with the extender. The aim is to determine the fundamental building blocks of an intelligent controller, a controller which allows interaction between humans and a broad class of computer-controlled machines via simultaneous exchange of both power and information signals. The prevalent trend in automation has been to physically separate the human from the machine so the human must always send information signals via an intermediary device (e.g., joystick, pushbutton, light switch). Extenders, however, are perfect examples of self-powered machines that are built and controlled for the optimal exchange of power and information signals with humans. The human wearing the extender is in physical contact with the machine, so power transfer is unavoidable and information signals from the human help to control the machine. Commands are transferred to the extender via the contact forces and the EMG signals between the wearer and the extender. The extender augments human motor ability without accepting any explicit commands: it accepts the EMG signals and the contact force between the person's arm and the extender, and the extender 'translates' them into a desired position.
In this unique configuration, mechanical power transfer between the human and the extender occurs because the human is pushing against the extender. The extender transfers to the human's hand, in feedback fashion, a scaled-down version of the actual external load which the extender is manipulating. This natural feedback force on the human's hand allows him to 'feel' a modified version of the external forces on the extender. The information signals from the human (e.g., EMG signals) to the computer reflect human cognitive ability, and the power transfer between the human and the machine (e.g., physical interaction) reflects human physical ability. Thus the information transfer to the machine augments cognitive ability, and the power transfer augments motor ability. These two actions are coupled through the human cognitive/motor dynamic behavior. The goal is to derive the control rules for a class of computer-controlled machines that augment human physical and cognitive abilities in certain manipulative tasks.
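The command translation and scaled force feedback described above can be caricatured as a pair of mappings. The linear forms and the gain values below are invented for illustration; they are not the controller actually derived in this work.

```python
def desired_position(contact_force_n, emg_level, k_force=0.002, k_emg=0.01):
    """Hypothetical 'translation' of contact force (N) and rectified EMG level
    into a desired extender position (m); gains are made-up illustration values."""
    return k_force * contact_force_n + k_emg * emg_level

def feedback_force(external_load_n, scale=10.0):
    """Wearer feels a 1/scale copy of the actual external load on the extender."""
    return external_load_n / scale

# Invented example: a 120 N external load is reflected to the hand as 12 N.
print(feedback_force(120.0))
```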
Psychometric evaluation of the English version of the Extended Post-event Processing Questionnaire.
Wong, Quincy J J
2015-01-01
The importance of post-event processing (PEP) in prominent models of social anxiety disorder has led to the development of measures that tap this cognitive construct. The 17-item Extended Post-event Processing Questionnaire (E-PEPQ) is one of the most comprehensive measures of PEP developed to date. However, the measure was developed in German and the psychometric properties of the English version of the E-PEPQ have not yet been examined. The current study examined the factor structure, internal consistency, and construct validity of the English version of the E-PEPQ. English-speaking participants (N = 560) completed the English version of the E-PEPQ, a measure of social anxiety and a measure of depression. A 15-item version of the E-PEPQ with a correlated three-factor structure (referred to as the E-PEPQ-15) emerged as the best fitting model using confirmatory factor analyses, and the E-PEPQ-15 and its subscales demonstrated good internal consistency. The E-PEPQ-15 and two of its three subscales also had significantly stronger positive associations with the social anxiety measure than with the depression measure. The psychometric properties of the E-PEPQ-15 obtained in the current study justify the use of the measure in research, particularly in the domain of social anxiety.
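Internal consistency in studies like this one is conventionally quantified with Cronbach's alpha. A minimal sketch of the computation follows; the item scores used in the example are invented, not E-PEPQ data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale given as a list of item-score columns.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
    with k items; population variances are used throughout.
    """
    k = len(items)
    n = len(items[0])

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(pvar(col) for col in items) / pvar(totals))

# Invented example: two perfectly redundant items give alpha = 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```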
Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.
Tran, Ngoc Tam L; Huang, Chun-Hsi
2017-05-01
We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool is an extended version of our web-based tool, version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable, drawing on expandable AWS resources.
STEVE -- User Guide and Reference Manual
NASA Astrophysics Data System (ADS)
Fish, Adrian
This document describes an extended version of the EVE editor that has been tailored to the general Starlink user's requirements. This extended editor is STarlink Eve, or STEve, and this document (along with its introductory companion SUN/125) describes the editor and offers additional help, advice, and tips on general EVE usage.
A new version of code Java for 3D simulation of the CCA model
NASA Astrophysics Data System (ADS)
Zhang, Kebo; Xiong, Hailing; Li, Chao
2016-07-01
In this paper we present a new version of the program for the CCA model. To benefit from the latest technologies, we migrated the running environment from JDK 1.6 to JDK 1.7, and the old program was restructured into a new framework, improving its extensibility.
ERIC Educational Resources Information Center
Sohlberg, Karl; Liu, Xiang
2013-01-01
Herein, a slightly enhanced version of extended Huckel molecular orbital theory is applied to demonstrate the spontaneous distortion of 1,3,5,7-cyclooctatetraene from a perfect octagon, a consequence of the Jahn-Teller effect. The exercise is accessible to students who have been introduced to basic quantum mechanics and extended Huckel molecular…
The X-33 Extended Flight Test Range
NASA Technical Reports Server (NTRS)
Mackall, Dale A.; Sakahara, Robert; Kremer, Steven E.
1998-01-01
Development of an extended test range, with range instrumentation providing continuous vehicle communications, is required to flight-test the X-33, a scaled version of a reusable launch vehicle. The extended test range provides vehicle communications coverage from California to landing at Montana or Utah. This paper provides an overview of the approaches used to meet X-33 program requirements, including using multiple ground stations, and methods to reduce problems caused by reentry plasma radio frequency blackout. The advances used to develop the extended test range show other hypersonic and access-to-space programs can benefit from the development of the extended test range.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi
Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensionalmore » gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.« less
2016-06-01
ERIC Educational Resources Information Center
Kaufman, Burt; And Others
This guide represents the final experimental version of an extended pilot project which was conducted in the United States between 1973 and 1976. The manner of presentation and pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings ("papygrams")…
ERIC Educational Resources Information Center
CEMREL, Inc., St. Ann, MO.
This guide represents the final experimental version of an extended pilot project which was conducted in the United States between 1973 and 1976. The manner of presentation and pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings ("papygrams")…
ERIC Educational Resources Information Center
Huang, Francis L.; Cornell, Dewey G.
2016-01-01
Although school climate has long been recognized as an important factor in the school improvement process, there are few psychometrically supported measures based on teacher perspectives. The current study replicated and extended the factor structure, concurrent validity, and test-retest reliability of the teacher version of the Authoritative…
ERIC Educational Resources Information Center
Lin, Yueh-Hsien; Su, Chwen-Yng; Guo, Wei-Yuan; Wuang, Yee-Pay
2012-01-01
The Hooper Visual Organization Test (HVOT) is a measure of visuosynthetic ability. Previously, the psychometric properties of the HVOT have been evaluated for Chinese-speaking children aged 5-11 years. This study reports development and further evidence of reliability and validity for a second version involving an extended age range of healthy…
Does Extended Telephone Callback Counselling Prevent Smoking Relapse?
ERIC Educational Resources Information Center
Segan, C. J.; Borland, R.
2011-01-01
This randomized controlled trial tested whether extended callback counselling that proactively engaged ex-smokers with the task of embracing a smoke-free lifestyle (four to six calls delivered 1-3 months after quitting, i.e. when craving levels and perceived need for help had declined) could reduce relapse compared with a revised version of…
An improved architecture for video rate image transformations
NASA Technical Reports Server (NTRS)
Fisher, Timothy E.; Juday, Richard D.
1989-01-01
Geometric image transformations are of interest to pattern recognition algorithms for their use in simplifying some aspects of the pattern recognition process. Examples include reducing sensitivity to rotation, scale, and perspective of the object being recognized. The NASA Programmable Remapper can perform a wide variety of geometric transforms at full video rate. An architecture is proposed that extends its abilities and alleviates many of the first version's shortcomings. The need for the improvements is discussed in the context of the initial Programmable Remapper and the benefits and limitations it has delivered. The implementation and capabilities of the proposed architecture are discussed.
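As a toy illustration of the kind of remapping such hardware accelerates, the sketch below implements an inverse-mapped rotation with nearest-neighbour sampling in Python (the Remapper itself is dedicated hardware; image size and the zero fill value are arbitrary choices here):

```python
import numpy as np

def rotate_nn(img, theta_deg):
    """Geometric remap by inverse mapping: for each output pixel, sample the
    source image at the inversely rotated coordinate (rotation about the
    image centre) with nearest-neighbour interpolation; out-of-range -> 0."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    t = np.deg2rad(theta_deg)
    cos_t, sin_t = np.cos(t), np.sin(t)
    out = np.zeros_like(img)
    for y in range(n):
        for x in range(n):
            xs = cos_t * (x - c) + sin_t * (y - c) + c
            ys = -sin_t * (x - c) + cos_t * (y - c) + c
            xi, yi = int(round(xs)), int(round(ys))
            if 0 <= xi < n and 0 <= yi < n:
                out[y, x] = img[yi, xi]
    return out
```

For 90-degree multiples the grid maps onto itself, so rotating by 90 and then by -90 reproduces the input exactly; for general angles the resampling is lossy, which is one reason video-rate hardware implementations invest in better interpolation.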
Consideration of computer limitations in implementing on-line controls. M.S. Thesis
NASA Technical Reports Server (NTRS)
Roberts, G. K.
1976-01-01
A formal statement of the optimal control problem is formulated that includes the discretization interval as an optimization parameter, and this is extended to include selection of a control algorithm as part of the optimization procedure. The performance of a scalar linear system is shown to depend on the discretization interval. Discrete-time versions of the output feedback regulator and an optimal compensator are developed, and these results are used to present an example of a system for which fast partial-state-feedback control minimizes a quadratic cost better than either full-state-feedback control or a compensator.
Stone, Amanda E; Roper, Jaimie A; Herman, Daniel C; Hass, Chris J
2018-05-01
Persons with anterior cruciate ligament reconstruction (ACLR) show deficits in gait and neuromuscular control following rehabilitation. This altered behavior extends to locomotor adaptation and learning; however, the contributing factors to this observed behavior have yet to be investigated. The purpose of this study was to assess differences in locomotor adaptation and learning between ACLR individuals and controls, and to identify underlying contributors to motor adaptation in these individuals. Twenty ACLR individuals and 20 healthy controls (CON) agreed to participate in this study. Participants performed four cognitive and dexterity tasks (a local version of the Trail Making Test, a reaction time test, the electronic pursuit rotor test, and the Purdue Pegboard). Three-dimensional kinematics were also collected while participants walked on a split-belt treadmill. ACLR individuals completed the local versions of Trails A and Trails B significantly faster than CON. During split-belt walking, ACLR individuals demonstrated smaller step length asymmetry during EARLY and LATE adaptation, smaller double support asymmetry during MID adaptation, and larger stance time asymmetry during DE-ADAPT compared with CON. ACLR individuals performed better during tasks that required visual attention and task switching, and were less perturbed during split-belt walking compared with controls. Persons with ACLR may use different strategies than controls, cognitive or otherwise, to adapt locomotor patterns.
Conceptual Comparison of Population Based Metaheuristics for Engineering Problems
Adekanmbi, Oluwole; Green, Paul
2015-01-01
Metaheuristic algorithms are well-known optimization tools which have been employed for solving a wide range of optimization problems. Several extensions of differential evolution have been adopted in solving constrained and nonconstrained multiobjective optimization problems, but in this study, the third version of generalized differential evolution (GDE) is used for solving practical engineering problems. GDE3 metaheuristic modifies the selection process of the basic differential evolution and extends DE/rand/1/bin strategy in solving practical applications. The performance of the metaheuristic is investigated through engineering design optimization problems and the results are reported. The comparison of the numerical results with those of other metaheuristic techniques demonstrates the promising performance of the algorithm as a robust optimization tool for practical purposes. PMID:25874265
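For readers unfamiliar with the base strategy, here is a compact Python sketch of DE/rand/1/bin, the scheme GDE3 modifies (population size, F, and CR are typical textbook settings, not values from this study):

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, gens=100, seed=1):
    """Basic DE/rand/1/bin: rand/1 mutation, binomial crossover,
    greedy one-to-one selection. Minimizes f over box constraints."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # mutation: base vector plus scaled difference of two others
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # binomial crossover with at least one mutant gene
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # greedy selection (the step GDE3 generalizes for constraints
            # and multiple objectives)
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

On a simple sphere function this reliably drives the objective toward zero; GDE3 replaces the scalar greedy comparison with constraint-domination and Pareto-based pruning.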
Raytheon Stirling/pulse Tube Cryocooler Development
NASA Astrophysics Data System (ADS)
Kirkconnell, C. S.; Hon, R. C.; Kesler, C. H.; Roberts, T.
2008-03-01
The first generation flight-design Stirling/pulse tube "hybrid" two-stage cryocooler has entered initial performance and environmental testing. The status and early results of the testing are presented. Numerous improvements have been implemented as compared to the preceding brassboard versions to improve performance, extend life, and enhance launch survivability. This has largely been accomplished by incorporating successful flight-design features from the Raytheon Stirling one-stage cryocooler product line. These design improvements are described. In parallel with these mechanical cryocooler development efforts, a third generation electronics module is being developed that will support hybrid Stirling/pulse tube and Stirling cryocoolers. Improvements relative to the second generation design relate to improved radiation hardness, reduced parts count, and improved vibration cancellation capability. Progress on the electronics is also presented.
Six and Three-Hourly Meteorological Observations From 223 Former U.S.S.R. Stations (NPD-048)
Razuvaev, V. N. [All-Russian Research Institute of Hydrometeorological Information, World Data Center, Russia; Apasova, E. B. [All-Russian Research Institute of Hydrometeorological Information, World Data Center, Russia; Martuganov, R. A. [All-Russian Research Institute of Hydrometeorological Information, World Data Center, Russia; Kaiser, D. P. [CDIAC, Oak Ridge National Laboratory; Marino, G. P. [CDIAC, Oak Ridge National Laboratory
2007-11-01
This database contains 6- and 3-hourly meteorological observations from a 223-station network of the former Soviet Union. These data have been made available through cooperation between the two principal climate data centers of the United States and Russia: the National Climatic Data Center (NCDC) in Asheville, North Carolina, and the All-Russian Research Institute of Hydrometeorological Information-World Data Centre (RIHMI-WDC) in Obninsk, Russia. The first version of this database extended through the mid-1980s (ending year dependent upon station) and was made available in 1995 by the Carbon Dioxide Information Analysis Center (CDIAC) as NDP-048. A second version of the database extended the data records through 1990. This third and current version of the database includes data through 2000 for over half of the stations (mainly for Russia), whereas the remainder of the stations have records extending through various years of the 1990s. Because of the breakup of the Soviet Union in 1991, and since RIHMI-WDC is a Russian institution, only Russian stations are generally available through 2000. The non-Russian station records in this database typically extend through 1991. Station records consist of 6- and 3-hourly observations of some 24 meteorological variables, including temperature, past and present weather type, precipitation amount, cloud amount and type, sea level pressure, relative humidity, and wind direction and speed. The 6-hourly observations extend from 1936 through 1965; the 3-hourly observations extend from 1966 through 2000 (or through the latest year available). These data have undergone extensive quality assurance checks by RIHMI-WDC, NCDC, and CDIAC. The database represents a wealth of meteorological information for a large and climatologically important portion of the earth's land area, and should prove extremely useful for a wide variety of regional climate change studies.
Analysis of rotor vibratory loads using higher harmonic pitch control
NASA Technical Reports Server (NTRS)
Quackenbush, Todd R.; Bliss, Donald B.; Boschitsch, Alexander H.; Wachspress, Daniel A.
1992-01-01
Experimental studies of isolated rotors in forward flight have indicated that higher harmonic pitch control can reduce rotor noise. These tests also show that such pitch inputs can generate substantial vibratory loads. This paper summarizes the modification of the RotorCRAFT (Computation of Rotor Aerodynamics in Forward flighT) analysis of isolated rotors to study the vibratory loading generated by high-frequency pitch inputs. The original RotorCRAFT code was developed for use in the computation of such loading, and uses a highly refined rotor wake model to facilitate this task. The extended version of RotorCRAFT incorporates a variety of new features, including: arbitrary periodic root pitch control; computation of blade stresses and hub loads; improved modeling of near-wake unsteady effects; and preliminary implementation of a coupled prediction of rotor airloads and noise. Correlation studies are carried out with existing blade stress and vibratory hub load data to assess the performance of the extended code.
Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.
2013-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended for working with four-dimensional objects stored in comma separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.
New version program summary
Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
Catalogue identifier: AEEG_v3_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 745761
No. of bytes in distributed program, including test data, etc.: 12544491
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 100M
Classification: 14
Catalogue identifier of previous version: AEEG_v2_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
Does the new version supersede the previous version?: Yes
Nature of problem: Estimating the fractal dimension of 4D images.
Solution method: Optimized implementation of the 4D box-counting algorithm.
Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1,2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects stored in comma separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df = ln(5)/ln(2) ≅ 2.32 (Fig. 1). The algorithm could be extended, with minimum effort, to a higher number of dimensions. Easy integration with other applications by using the very simple comma separated values file format for storing multi-dimensional images. Implementation of the χ2 test as a criterion for deciding whether an object is fractal or not. User-friendly graphical interface.
Running time: In a first approximation, the algorithm is linear [2].
References:
[1] I.V. Grossu, D. Felea, C. Besliu, Al. Jipa, C.C. Bordeianu, E. Stan, T. Esanu, Computer Physics Communications 181 (2010) 831-832.
[2] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Computer Physics Communications 180 (2009) 1999-2001.
[3] J. Ruiz de Miras, J. Navas, P. Villoslada, F.J. Esteban, Computer Methods and Programs in Biomedicine 104(3) (2011) 452-460.
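The box-counting idea generalizes to any dimension by flooring coordinates onto a grid and counting occupied cells at several scales. A small Python sketch (illustrative only; the published program is Visual Basic and includes the χ2 fractality test, which is omitted here):

```python
import numpy as np

def box_count(points, eps):
    """Number of occupied boxes of side eps covering a point cloud
    of any dimension (rows = points, columns = coordinates)."""
    return len(np.unique(np.floor(points / eps).astype(np.int64), axis=0))

def fractal_dimension(points, epsilons):
    """Box-counting estimate: slope of log N(eps) against log(1/eps)."""
    logs = np.log(1.0 / np.asarray(epsilons))
    counts = np.log([box_count(points, e) for e in epsilons])
    slope, _ = np.polyfit(logs, counts, 1)
    return slope

# Demo on the 2D Sierpinski gasket built by the chaos game
# (true dimension ln 3 / ln 2, analogous to the 4D hypertetrahedron test).
rng = np.random.default_rng(0)
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
p, pts = np.array([0.1, 0.1]), []
for _ in range(40000):
    p = (p + verts[rng.integers(3)]) / 2.0
    pts.append(p.copy())
pts = np.array(pts)
```

Running fractal_dimension(pts, [1/4, 1/8, 1/16, 1/32]) on the generated gasket gives an estimate close to ln 3 / ln 2 ≈ 1.585, while a uniformly filled square comes out near 2.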
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.
1984-01-01
An updated, corrected and extended machine readable version of the Smithsonian Astrophysical Observatory star catalog (SAO) is described. Published and unpublished errors discovered in the previous version have been corrected, and multiple star and supplemental BD identifications added to stars where more than one SAO entry has the same Durchmusterung number. Henry Draper Extension (HDE) numbers have been added for stars found in both volumes of the extension. Data for duplicate SAO entries (those referring to the same star) have been blanked out, but the records themselves have been retained and flagged so that sequencing and record count are identical to the published catalog.
Documentation for the machine-readable version of the SAO-HD-GC-DM cross index version 1983
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.; Schofield, N., Jr.
1983-01-01
An updated and extended machine readable version of the Smithsonian Astrophysical Observatory star catalog (SAO) is described. All errors found since preparation of the original catalog have been corrected; these resulted from misidentifications, omissions of components in multiple star systems, and missing Durchmusterung numbers (the common identifier) in the SAO Catalog. Component identifications from the Index of Visual Double Stars (IDS) are appended to all multiple SAO entries with the same DM numbers, and lower-case letter identifiers for supplemental BD stars are added. A total of 11,398 individual corrections and data additions is incorporated into the present version of the cross index.
2013-01-01
Background PubMed translations of OvidSP Medline search filters offer searchers improved ease of access. They may also facilitate access to PubMed’s unique content, including citations for the most recently published biomedical evidence. Retrieving this content requires a search strategy comprising natural language terms (‘textwords’), rather than Medical Subject Headings (MeSH). We describe a reproducible methodology that uses a validated PubMed search filter translation to create a textword-only strategy to extend retrieval to PubMed’s unique heart failure literature. Methods We translated an OvidSP Medline heart failure search filter for PubMed and established version equivalence in terms of indexed literature retrieval. The PubMed version was then run within PubMed to identify citations retrieved by the filter’s MeSH terms (Heart failure, Left ventricular dysfunction, and Cardiomyopathy). It was then rerun with the same MeSH terms restricted to searching on title and abstract fields (i.e. as ‘textwords’). Citations retrieved by the MeSH search but not the textword search were isolated. Frequency analysis of their titles/abstracts identified natural language alternatives for those MeSH terms that performed less effectively as textwords. These terms were tested in combination to determine the best performing search string for reclaiming this ‘lost set’. This string, restricted to searching on PubMed’s unique content, was then combined with the validated PubMed translation to extend the filter’s performance in this database. Results The PubMed heart failure filter retrieved 6829 citations. Of these, 834 (12%) failed to be retrieved when MeSH terms were converted to textwords. Frequency analysis of the 834 citations identified five high frequency natural language alternatives that could improve retrieval of this set (cardiac failure, cardiac resynchronization, left ventricular systolic dysfunction, left ventricular diastolic dysfunction, and LV dysfunction). 
Together these terms reclaimed 157/834 (18.8%) of lost citations. Conclusions MeSH terms facilitate precise searching in PubMed’s indexed subset. They may, however, work less effectively as search terms prior to subject indexing. A validated PubMed search filter can be used to develop a supplementary textword-only search strategy to extend retrieval to PubMed’s unique content. A PubMed heart failure search filter is available on the CareSearch website (http://www.caresearch.com.au) providing access to both indexed and non-indexed heart failure evidence. PMID:23819658
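The frequency-analysis step lends itself to a few lines of Python. The citations below are invented stand-ins for the 834-citation lost set; only the five candidate phrases come from the study:

```python
from collections import Counter

def rank_candidate_textwords(lost_set, candidates):
    """Count how many 'lost' citations (retrieved by MeSH but missed by
    textwords) each candidate natural-language phrase would reclaim."""
    freq = Counter()
    for citation in lost_set:
        text = citation.lower()
        for phrase in candidates:
            if phrase in text:
                freq[phrase] += 1
    return freq.most_common()

# Hypothetical titles standing in for the real lost set.
lost_set = [
    "Outcomes of cardiac failure managed in primary care",
    "Cardiac resynchronization therapy in chronic cardiac failure",
    "Left ventricular systolic dysfunction after infarction",
    "Prognosis of LV dysfunction in elderly patients",
]
candidates = ["cardiac failure", "cardiac resynchronization",
              "left ventricular systolic dysfunction", "lv dysfunction"]
```

The ranked phrases can then be ORed into the textword strategy, for example each restricted to title/abstract fields with PubMed's [tiab] field tag.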
Salomon-Ferrer, Romelia; Götz, Andreas W; Poole, Duncan; Le Grand, Scott; Walker, Ross C
2013-09-10
We present an implementation of explicit solvent all atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA-enabled GPUs. First released publicly in April 2010 as part of version 11 of the AMBER MD package, and further improved and optimized over the last two years, this implementation supports the three most widely used statistical mechanical ensembles (NVE, NVT, and NPT), uses particle mesh Ewald (PME) for the long-range electrostatics, and runs entirely on CUDA-enabled NVIDIA graphics processing units (GPUs), providing results that are statistically indistinguishable from the traditional CPU version of the software, with performance that exceeds that achievable by the CPU version of AMBER running on all conventional CPU-based clusters and supercomputers. We briefly discuss three different precision models developed specifically for this work (SPDP, SPFP, and DPDP) and highlight the technical details of the approach as it extends beyond previously reported work [Götz et al., J. Chem. Theory Comput. 2012, DOI: 10.1021/ct200909j; Le Grand et al., Comp. Phys. Comm. 2013, DOI: 10.1016/j.cpc.2012.09.022]. We highlight the substantial improvements in performance that are seen over traditional CPU-only machines and provide validation of our implementation and precision models. We also provide evidence supporting our decision to deprecate the previously described fully single precision (SPSP) model from the latest release of the AMBER software package.
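The core SPFP idea of accumulating single-precision contributions into 64-bit fixed point can be illustrated in a few lines of Python (the scale factor here is an arbitrary stand-in, not AMBER's actual constant):

```python
import numpy as np

FIXED_SCALE = 2 ** 40   # illustrative scale; not AMBER's actual SPFP constant

def fixed_point_sum(contribs):
    """Accumulate float contributions as scaled integers. Integer addition
    is exact and associative, so the result is independent of summation
    order: the property behind deterministic SPFP-style GPU reductions."""
    acc = 0
    for c in contribs:
        acc += int(round(float(c) * FIXED_SCALE))
    return acc / FIXED_SCALE

rng = np.random.default_rng(42)
forces = rng.standard_normal(10000).astype(np.float32)
```

Because the accumulator is an integer, summing the same contributions in any order yields a bitwise-identical result, whereas naive float32 accumulation is order-dependent.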
Correcting the extended-source calibration for the Herschel-SPIRE Fourier-transform spectrometer
NASA Astrophysics Data System (ADS)
Valtchanov, I.; Hopwood, R.; Bendo, G.; Benson, C.; Conversi, L.; Fulton, T.; Griffin, M. J.; Joubaud, T.; Lim, T.; Lu, N.; Marchili, N.; Makiwa, G.; Meyer, R. A.; Naylor, D. A.; North, C.; Papageorgiou, A.; Pearson, C.; Polehampton, E. T.; Scott, J.; Schulz, B.; Spencer, L. D.; van der Wiel, M. H. D.; Wu, R.
2018-03-01
We describe an update to the Herschel-Spectral and Photometric Imaging Receiver (SPIRE) Fourier-transform spectrometer (FTS) calibration for extended sources, which incorporates a correction for the frequency-dependent far-field feedhorn efficiency, ηff. This significant correction affects all FTS extended-source calibrated spectra in sparse or mapping mode, regardless of the spectral resolution. Line fluxes and continuum levels are underestimated by factors of 1.3-2 in the spectrometer long wavelength band (447-1018 GHz; 671-294 μm) and 1.4-1.5 in the spectrometer short wavelength band (944-1568 GHz; 318-191 μm). The correction was implemented in the FTS pipeline version 14.1 and has also been described in the SPIRE Handbook since 2017 February. Studies based on extended-source calibrated spectra produced prior to this pipeline version should be critically reconsidered using the current products available in the Herschel Science Archive. Once the extended-source calibrated spectra are corrected for ηff, the synthetic photometry and the broad-band intensities from SPIRE photometer maps agree within 2-4 per cent - similar levels to the comparison of point-source calibrated spectra and photometry from point-source calibrated maps. The two calibration schemes for the FTS are now self-consistent: the conversion between the corrected extended-source and point-source calibrated spectra can be achieved with the beam solid angle and a gain correction that accounts for the diffraction loss.
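The correction itself amounts to a per-frequency division. A minimal numpy sketch, with invented ηff values standing in for the real tables (the actual factors are distributed with the SPIRE calibration tree):

```python
import numpy as np

# Illustrative efficiency curve only: frequency (GHz) vs assumed eta_ff.
freq_ghz = np.array([450.0, 600.0, 800.0, 1000.0])
eta_ff   = np.array([0.55, 0.62, 0.70, 0.75])

def correct_extended_spectrum(nu, intensity):
    """Divide an extended-source calibrated spectrum by the
    frequency-dependent far-field feedhorn efficiency, interpolated
    onto the spectral axis."""
    eta = np.interp(nu, freq_ghz, eta_ff)
    return intensity / eta
```

Consistent with the factors quoted above, corrected intensities come out roughly 1.3-1.8 times higher than uncorrected ones wherever ηff lies between about 0.55 and 0.75.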
Off-design performance loss model for radial turbines with pivoting, variable-area stators
NASA Technical Reports Server (NTRS)
Meitner, P. L.; Glassman, A. J.
1980-01-01
An off-design performance loss model was developed for variable-stator (pivoted-vane) radial turbines through analytical modeling and experimental data analysis. Stator loss is determined by a viscous loss model; stator vane end-clearance leakage effects are determined by a clearance flow model. Rotor loss coefficients were obtained by analyzing the experimental data from a turbine rotor previously tested with six stators having throat areas from 20 to 144 percent of the design area, and were correlated with stator-to-rotor throat area ratio. An incidence loss model was selected to obtain the best agreement with experimental results. Predicted turbine performance is compared with experimental results for the design rotor as well as with results for extended and cutback versions of the rotor. Sample calculations were made to show the effects of stator vane end-clearance leakage.
Investigation of nickel hydrogen battery technology for the RADARSAT spacecraft
NASA Technical Reports Server (NTRS)
Mccoy, D. A.; Lackner, J. L.
1986-01-01
The low Earth orbit (LEO) operations of the RADARSAT spacecraft require high performance batteries to provide energy to the payload and platform during eclipse periods. Nickel Hydrogen cells are currently competing with the more traditional Nickel Cadmium cells for high performance spacecraft applications at geostationary Earth orbit (GEO) and LEO. Nickel Hydrogen cells appear better suited for high power applications where high currents and high depths of discharge are required. Although a number of GEO missions have flown with Nickel Hydrogen batteries, it is not readily apparent that the LEO version of the Nickel Hydrogen cell is able to withstand the extended cycle lifetime (5 years) of the RADARSAT mission. The problems associated with Nickel Hydrogen cells are discussed in the context of the RADARSAT mission, and a test program designed to characterize cell performance is presented.
Recent Updates to the CFD General Notation System (CGNS)
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Wedan, Bruce; Hauser, Thomas; Poinot, Marc
2012-01-01
The CFD General Notation System (CGNS), a general, portable, and extensible standard for the storage and retrieval of computational fluid dynamics (CFD) analysis data, has been in existence for more than a decade (Version 1.0 was released in May 1998). Both structured and unstructured CFD data are covered by the standard, and CGNS can be easily extended to cover any sort of data imaginable, while retaining backward compatibility with existing CGNS data files and software. Although originally designed for CFD, it is readily extendable to any field of computational analysis. In early 2011, CGNS Version 3.1 was released, which added significant capabilities. This paper describes these recent enhancements and highlights the continued usefulness of the CGNS methodology.
Invariance levels across language versions of the PISA 2009 reading comprehension tests in Spain.
Elosua Oliden, Paula; Mujika Lizaso, Josu
2013-01-01
The PISA project provides the basis for studying curriculum design and for comparing factors associated with school effectiveness. These studies are only valid if the different language versions are equivalent to each other. In Spain, the application of PISA in autonomous regions with their own languages means that equivalency must also be extended to the Spanish, Galician, Catalan and Basque versions of the test. The aim of this work was to analyse the equivalence among the four language versions of the Reading Comprehension Test (PISA 2009). After defining the testlet as the unit of analysis, equivalence among the language versions was analysed using two invariance testing procedures: multiple-group mean and covariance structure analyses for ordinal data and ordinal logistic regression. The procedures yielded concordant results supporting metric equivalence across all four language versions: Spanish, Basque, Galician and Catalan. The equivalence supports the estimated reading literacy score comparability among the language versions used in Spain.
Multiphase model for transformation induced plasticity. Extended Leblond's model
NASA Astrophysics Data System (ADS)
Weisz-Patrault, Daniel
2017-09-01
Transformation induced plasticity (TRIP) classically refers to plastic strains observed during phase transitions that occur under mechanical loads (which can be lower than the yield stress). A theoretical approach based on homogenization is proposed to deal with multiphase changes and to extend the validity of the well known and widely used model proposed by Leblond (1989). The approach is similar, but several product phases are considered instead of one, and several assumptions have been relaxed. Thus, besides the generalization to several phases, one can mention three main improvements in the calculation of the local equivalent plastic strain: the deviatoric part of the phase transformation is taken into account, both parent and product phases are elastic-plastic with linear isotropic hardening, and the applied stress is considered. Results show that the classical singularities arising in Leblond's model (corrected by ad hoc numerical functions or thresholding) are resolved in this contribution, except when the applied equivalent stress reaches the yield stress. Indeed, in this situation the parent phase is entirely plastic as soon as the phase transformation begins, and the same singularity as in Leblond's model arises. A physical explanation of the cutoff function is introduced in order to regularize the singularity. Furthermore, experiments extracted from the literature dealing with multiphase transitions and multiaxial loads are compared with the original Leblond model and the proposed extended version. For the extended version, very good agreement is observed without any fitting procedure (i.e., material parameters are extracted from other dedicated experiments), and for the original version the results are more qualitative.
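As background, the single-product-phase law being generalized can be written in the commonly quoted form of Leblond's 1989 model (reproduced from the general literature, not from this paper, so treat the exact form as indicative):

```latex
% Commonly quoted Leblond (1989) TRIP flow rule, one product phase:
%   z            : product-phase volume fraction
%   \mathbf{s}   : deviatoric part of the applied stress
%   h(\cdot)     : stress-correction function (close to 1 for low stress)
\dot{\boldsymbol{\varepsilon}}^{\mathrm{tp}}
  = -\frac{3\,\Delta\varepsilon_{\mathrm{th}}^{1\to 2}}{\sigma_{1}^{y}}\,
    h\!\left(\frac{\sigma_{\mathrm{eq}}}{\sigma^{y}}\right)
    \ln(z)\,\dot{z}\,\mathbf{s}
```

Here Δε_th^{1→2} is the transformation strain mismatch between parent and product phases and σ_1^y the parent yield stress. The ln(z) factor diverges as z → 0, which is the singularity referred to in the abstract; the original remedy was a threshold on z or an ad hoc cutoff function, for which this paper offers a physical justification.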
Flens, Gerard; Smits, Niels; Terwee, Caroline B; Dekker, Joost; Huijbrechts, Irma; de Beurs, Edwin
2017-03-01
We developed a Dutch-Flemish version of the patient-reported outcomes measurement information system (PROMIS) adult V1.0 item bank for depression as input for computerized adaptive testing (CAT). As the item bank, we used the Dutch-Flemish translation of the original PROMIS item bank (28 items) and additionally translated 28 U.S. depression items that failed to make the final U.S. item bank. Through psychometric analysis of a combined clinical and general population sample (N = 2,010), 8 added items were removed. With the final item bank, we performed several CAT simulations to assess the efficiency of the extended (48 items) and the original item bank (28 items), using various stopping rules. Both item banks resulted in highly efficient and precise measurement of depression and showed high similarity between the CAT simulation scores and the full item bank scores. We discuss the implications of using each item bank and stopping rule for further CAT development.
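To make the CAT mechanics concrete, here is a toy simulation loop in Python under a 2PL IRT model. The item parameters are randomly generated stand-ins (not the PROMIS depression bank), and the stopping rule is the common "posterior SE below a threshold, or a maximum number of items" scheme:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 2PL item bank: discrimination a, difficulty b (illustrative).
N_ITEMS = 48
a = rng.uniform(1.0, 2.5, N_ITEMS)
b = rng.uniform(-2.5, 2.5, N_ITEMS)

GRID = np.linspace(-4, 4, 161)      # latent-trait grid for EAP scoring
PRIOR = np.exp(-0.5 * GRID ** 2)    # standard-normal prior (unnormalized)

def p_endorse(theta, i):
    return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

def item_information(theta, i):
    p = p_endorse(theta, i)
    return a[i] ** 2 * p * (1 - p)

def simulate_cat(true_theta, se_stop=0.4, max_items=20):
    """Administer items by maximum Fisher information until the posterior
    SE drops below se_stop or max_items is reached."""
    post = PRIOR.copy()
    used, est, se = [], 0.0, GRID.std()
    while len(used) < max_items and se >= se_stop:
        # pick the unused item most informative at the current estimate
        cand = [i for i in range(N_ITEMS) if i not in used]
        nxt = max(cand, key=lambda i: item_information(est, i))
        used.append(nxt)
        resp = rng.random() < p_endorse(true_theta, nxt)  # simulated answer
        like = p_endorse(GRID, nxt) if resp else 1 - p_endorse(GRID, nxt)
        post = post * like
        post = post / post.sum()
        est = float((GRID * post).sum())                     # EAP estimate
        se = float(np.sqrt(((GRID - est) ** 2 * post).sum()))  # posterior SD
    return est, se, used
```

Tightening se_stop or raising max_items trades test length against precision, which is exactly the dimension along which the study compares its stopping rules and the two bank sizes.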
NASA Technical Reports Server (NTRS)
Roman, Nancy G.; Warren, Wayne H., Jr.
1989-01-01
An updated, corrected, and extended machine readable version of the catalog is described. Published and unpublished errors discovered in the previous version were corrected, and multiple star and supplemental BD identifications were added to stars where more than one SAO entry has the same Durchmusterung number. Henry Draper Extension (HDE) numbers were added for stars found in both volumes of the extension. Data for duplicate SAO entries (those referring to the same star) were flagged. J2000 positions in usual units and in radians were added.
ERIC Educational Resources Information Center
CEMREL, Inc., St. Ann, MO.
This guide represents the final experimental version of an extended pilot project which was conducted in the United States between 1973 and 1976. The manner of presentation and the pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings…
Thermosphere-Ionosphere-Mesosphere Modeling Using the TIME-GCM
2014-09-30
respectively. The CCM3 is the NCAR Community Climate Model, Version 3.6, a GCM of the troposphere and stratosphere. All models include self-consistent...middle atmosphere version of the NCAR Community Climate Model, (2) the NCAR TIME-GCM, and (3) the Model for Ozone and Related Chemical Tracers (MOZART...troposphere, but the impacts of such events extend well into the mesosphere. The coupled NCAR thermosphere-ionosphere-mesosphere-electrodynamics general
NASA Astrophysics Data System (ADS)
Nageswararao, M. M.; Mohanty, U. C.; Nair, Archana; Ramakrishna, S. S. V. S.
2016-06-01
The precipitation during winter (December through February) over India is highly variable in time and space. Maximum precipitation occurs over the Himalaya region, and it is important for the water resources and agriculture sectors over the region and for the economy of the country. Therefore, in the present global warming era, realistic prediction of winter precipitation over India is important for planning and implementing agriculture and water management strategies. The National Centers for Environmental Prediction (NCEP) has issued operational predictions of climatic variables on monthly to seasonal scales since 2004, using their first version of a fully coupled global climate model known as the Climate Forecast System (CFSv1). In 2011, a new version of CFS (CFSv2) was introduced, incorporating significant changes to the older version (CFSv1). The new version of CFS needs to be compared in detail with the older version in the context of simulating winter precipitation over India. Therefore, the current study presents a detailed analysis of the performance of CFSv2 as compared to CFSv1 for winter precipitation over India. The hindcast runs of both CFS versions from 1982 to 2008 with November initial conditions are used, and the models' precipitation is evaluated against India Meteorological Department (IMD) observations. The simulated wind and geopotential height are evaluated against the NCEP-National Center for Atmospheric Research (NCEP-NCAR) reanalysis-2 (NNRP2), and the remote response patterns of SST against the Extended Reconstructed Sea Surface Temperatures version 3b (ERSSTv3b), for the same period. The analyses of winter precipitation reveal that both models are able to replicate the patterns of observed climatology, interannual variability, and coefficient of variation.
However, the magnitude is smaller than the IMD observations, which can be attributed to the models' inability to simulate the observed remote response of sea surface temperatures to all-India winter precipitation. Of the two, CFSv1 is better at capturing year-to-year variations in observed winter precipitation, while CFSv2 fails to do so. CFSv1 shows smaller mean bias and RMSE, along with better correlations and indices of agreement, than CFSv2 for predicting winter precipitation over India. In addition, CFSv1 also has a higher probability of detection in predicting the different categories (normal, excess, and deficit) of observed winter precipitation over India.
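Verification measures of the kind used here (mean bias, RMSE, correlation, index of agreement, coefficient of variation) can be computed as below; this is a generic sketch of the standard formulas (Willmott's form of the index of agreement is assumed), not the study's verification code:

```python
import numpy as np

def verify(sim, obs):
    """Standard forecast verification statistics for paired series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    bias = (sim - obs).mean()
    rmse = np.sqrt(((sim - obs) ** 2).mean())
    corr = np.corrcoef(sim, obs)[0, 1]
    # Willmott's index of agreement: 1 = perfect, 0 = no skill
    denom = ((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2).sum()
    ioa = 1.0 - ((sim - obs) ** 2).sum() / denom
    cv = obs.std() / obs.mean()  # coefficient of variation of observations
    return {"bias": bias, "rmse": rmse, "corr": corr, "ioa": ioa, "cv": cv}
```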
Free energy landscapes of a highly structured β-hairpin peptide and its single mutant
NASA Astrophysics Data System (ADS)
Kim, Eunae; Yang, Changwon; Jang, Soonmin; Pak, Youngshang
2008-10-01
We investigated the free energy landscapes of a highly structured β-hairpin peptide (MBH12) and a less structured peptide with a single mutation of Tyr6 to Asp6 (MBH10). For the free energy mapping, starting from an extended conformation, replica exchange molecular dynamics simulations of the two β-hairpins were performed using a modified version of an all-atom force field employing an implicit solvation model (param99MOD5/GBSA). With the present simulation approach, we demonstrated that the detailed stability changes associated with the sequence modification from MBH12 to MBH10 are quantitatively well predicted at the all-atom level.
Meisel, Susanne F; Freeman, Maddie; Waller, Jo; Fraser, Lindsay; Gessler, Sue; Jacobs, Ian; Kalsi, Jatinderpal; Manchanda, Ranjit; Rahman, Belinda; Side, Lucy; Wardle, Jane; Lanceley, Anne; Sanderson, Saskia C
2017-11-16
Risk stratification using genetic and other types of personal information could improve current best available approaches to ovarian cancer risk reduction, improving identification of women at increased risk of ovarian cancer and reducing unnecessary interventions for women at lower risk. Amounts of information given to women may influence key informed decision-related outcomes, e.g. knowledge. The primary aim of this study was to compare informed decision-related outcomes between women given one of two versions (gist vs. extended) of a decision aid about stratified ovarian cancer risk-management. This was an experimental survey study comparing the effects of brief (gist) information with lengthier, more detailed (extended) information on cognitions relevant to informed decision-making about participating in risk-stratified ovarian cancer screening. Women with no personal history of ovarian cancer were recruited through an online survey company and randomised to view the gist (n = 512) or extended (n = 519) version of a website-based decision aid and completed an online survey. Primary outcomes were knowledge and intentions. Secondary outcomes included attitudes (values) and decisional conflict. There were no significant differences between the gist and extended conditions in knowledge about ovarian cancer (time*group interaction: F = 0.20, p = 0.66) or intention to participate in ovarian cancer screening based on genetic risk assessment (t(1029) = 0.43, p = 0.67). There were also no between-groups differences in secondary outcomes. In the sample overall (n = 1031), knowledge about ovarian cancer increased from before to after exposure to the decision aid (from 5.71 to 6.77 out of a possible 10: t = 19.04, p < 0.001), and 74% of participants said that they would participate in ovarian cancer screening based on genetic risk assessment. 
No differences in knowledge or intentions were found between women who viewed the gist version and women who viewed the extended version of a decision aid about risk-stratified ovarian cancer screening. Knowledge increased for women in both decision aid groups. Further research is needed to determine the ideal volume and type of content for decision aids about stratified ovarian cancer risk-management. This study was registered with the ISRCTN registry; registration number: ISRCTN48627877 .
A New Architecture for Extending the Capabilities of the Copernicus Trajectory Optimization Program
NASA Technical Reports Server (NTRS)
Williams, Jacob
2015-01-01
This paper describes a new plugin architecture developed for the Copernicus spacecraft trajectory optimization program. Details of the software architecture design and development are described, as well as examples of how the capability can be used to extend the tool in order to expand the type of trajectory optimization problems that can be solved. The inclusion of plugins is a significant update to Copernicus, allowing user-created algorithms to be incorporated into the tool for the first time. The initial version of the new capability was released to the Copernicus user community with version 4.1 in March 2015, and additional refinements and improvements were included in the recent 4.2 release. It is proving quite useful, enabling Copernicus to solve problems that it was not able to solve before.
Cheng, Eddie W L; Chu, Samuel K W
2016-08-01
Given the increasing use of web technology for teaching and learning, this study developed and examined an extended version of the theory of planned behaviour (TPB) model, which explained students' intention to collaborate online for their group projects. Results indicated that past experience predicted the three antecedents of intention, while past behaviour was predictive of subjective norm and perceived behavioural control. Moreover, the three antecedents (attitude towards e-collaboration, subjective norm and perceived behavioural control) were found to significantly predict e-collaborative intention. This study explored the use of the "remember" type of awareness (i.e. past experience) and evaluated the value of the "know" type of awareness (i.e. past behaviour) in the TPB model. © 2015 International Union of Psychological Science.
Schönfeld, Sabine; Ehlers, Anke
2006-11-01
Individuals with posttraumatic stress disorder (PTSD) show overgeneral memory (OGM) when retrieving autobiographical memories to word cues. We investigated whether OGM extends to picture cues and whether it is related to PTSD symptoms and cognitions. Trauma survivors with (n = 29) and without (n = 26) PTSD completed the standard Autobiographical Memory Test (AMT) and a novel picture version. Compared to the no-PTSD group, the PTSD group showed OGM in both test versions. Pictures facilitated specific memory retrieval, but this effect was no longer significant when verbal intelligence or depressive symptoms were controlled. OGM correlated with PTSD symptoms and perceived self-change, as well as with intrusive memories, their perceived "nowness," responses to intrusions (thought suppression, rumination, dissociation), and negative interpretations of symptoms. Copyright 2006 APA, all rights reserved.
Assessment of the simulated climate in two versions of the RegT-Band
NASA Astrophysics Data System (ADS)
da Rocha, Rosmeri; Reboita, Michelle; Llopart, Marta
2017-04-01
This study evaluates two simulations carried out with the tropical band version of the Regional Climate Model (RegT-Band). The purpose was to compare the performance of the RegCM 4.4.5 and 4.6 versions (RegT4.4.5 and RegT4.6). The domain used in the simulations extends from 45° S to 45° N and covers all tropical longitudes, with a grid spacing of 39 km and 18 sigma-pressure vertical levels. The initial and boundary conditions for the simulations were provided by the ERA-Interim reanalysis, and the analyzed period is from January 2005 to December 2008. Regarding the physical parameterization schemes, the Emanuel scheme was used for cumulus convection and the Community Land Model version 4.5 (CLM4.5) for surface-atmosphere interactions. Seasonal simulated precipitation was compared with the Global Precipitation Climatology Project (GPCP), while 2-meter air temperature was compared with the ERA-Interim reanalysis. The main results of this study are that RegT4.6 reduces the wet bias over the oceans and the cold bias over the continents compared with RegT4.4.5. In austral summer, RegT4.6 improves the simulation by reducing the precipitation amounts mainly over the Indian Ocean, Indonesia, and eastern northeastern Brazil. However, both versions underestimate the precipitation over the South Atlantic Convergence Zone (SACZ). During austral winter, RegT4.6 simulates precipitation similar to GPCP over India and reduces the cold bias over this country compared with RegT4.4.5. However, over southern Africa, Australia, and central-southeast South America, RegT4.6 exhibits a strong warm bias.
Cholinesterase Inhibitors Improve Both Memory and Complex Learning in Aged Beagle Dogs
Araujo, Joseph A.; Greig, Nigel H.; Ingram, Donald K.; Sandin, Johan; de Rivera, Christina; Milgram, Norton W.
2016-01-01
Similar to patients with Alzheimer’s disease (AD), dogs exhibit age-dependent cognitive decline, amyloid-β (Aβ) pathology, and evidence of cholinergic hypofunction. The present study sought to further investigate the role of cholinergic hypofunction in the canine model by examining the effect of the cholinesterase inhibitors phenserine and donepezil on performance of two tasks, a delayed non-matching-to-position task (DNMP) designed to assess working memory, and an oddity discrimination learning task designed to assess complex learning, in aged dogs. Phenserine (0.5 mg/kg; PO) significantly improved performance on the DNMP at the longest delay compared to wash-out and partially attenuated scopolamine-induced deficits (15 μg/kg; SC). Phenserine also improved learning on a difficult version of an oddity discrimination task compared to placebo, but had no effect on an easier version. We also examined the effects of three doses of donepezil (0.75, 1.5, and 6 mg/kg; PO) on performance of the DNMP. Similar to the results with phenserine, 1.5 mg/kg of donepezil improved performance at the longest delay compared to baseline and wash-out, indicative of memory enhancement. These results further extend the findings of cholinergic hypofunction in aged dogs and provide pharmacological validation of the canine model with a cholinesterase inhibitor approved for use in AD. Collectively, these studies support utilizing the aged dog in future screening of therapeutics for AD, as well as for investigating the links among cholinergic function, Aβ pathology, and cognitive decline. PMID:21593569
Managing the Fruit Fly Experiment.
ERIC Educational Resources Information Center
Jeszenszky, Arleen W.
1997-01-01
Describes a sophisticated version of the fruit fly experiment for teaching concepts about genetics to biology students. Provides students with the opportunity to work with live animals over an extended period. (JRH)
Alkozei, Anna; Smith, Ryan; Dailey, Natalie S; Bajaj, Sahil; Killgore, William D S
2017-01-01
Acute exposure to light within the blue wavelengths has been shown to enhance alertness and vigilance and to lead to improved speed on reaction time tasks, possibly due to activation of the noradrenergic system. It remains unclear, however, whether the effects of blue light extend beyond simple alertness processes to also enhance other aspects of cognition, such as memory performance. The aim of this study was to investigate the effects of a thirty-minute pulse of blue light versus placebo (amber light) exposure in healthy, normally rested individuals in the morning during verbal memory consolidation (i.e., 1.5 hours after memory acquisition), using an abbreviated version of the California Verbal Learning Test (CVLT-II). At delayed recall, individuals who received blue light (n = 12) during the consolidation period showed significantly better long-delay verbal recall than individuals who received amber light exposure (n = 18), while controlling for the effects of general intelligence, depressive symptoms, and habitual wake time. These findings extend previous work demonstrating the effect of blue light on brain activation and alertness by further demonstrating its effectiveness at facilitating better memory consolidation and subsequent retention of verbal material. Although preliminary, these findings point to a potential application of blue wavelength light to optimize memory performance in healthy populations. It remains to be determined whether blue light exposure may also enhance performance in clinical populations with memory deficits.
NREL Fuels and Engines R&D Revs Up Vehicle Efficiency, Performance (Text Version)
SCIATRAN 3.1: A new radiative transfer model and retrieval package
NASA Astrophysics Data System (ADS)
Rozanov, Alexei; Rozanov, Vladimir; Kokhanovsky, Alexander; Burrows, John P.
The SCIATRAN 3.1 package is the result of further development of the SCIATRAN 2.X software family which, like previous versions, comprises a radiative transfer model and a retrieval block. After the implementation of the vector radiative transfer model in SCIATRAN 3.0, the spectral range covered by the model has been extended into the thermal infrared, ranging to approximately 40 micrometers. Another major improvement has been made in accounting for underlying surface effects. Among others, a sophisticated representation of the water surface with a bidirectional reflectance distribution function (BRDF) has been implemented, accounting for the Fresnel reflection of polarized light and for the effect of foam. A newly developed representation for a snow surface allows radiative transfer calculations to be performed within an unpolluted or soiled snow layer. Furthermore, a new approach has been implemented allowing radiative transfer calculations to be performed for a coupled atmosphere-ocean system. This means that the underlying ocean is no longer considered a purely reflecting surface. Instead, full radiative transfer calculations are performed within the water, allowing the user to simulate the radiance within both the atmosphere and the ocean. As in previous versions, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR-TIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer location within or outside the Earth's atmosphere, including underwater observations. As with the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new features of the radiative transfer model, is given, including remarks on its availability to the scientific community.
Furthermore, some application examples of the radiative transfer model are shown.
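As an illustration of the surface treatment mentioned above, the simplest limiting case of a water surface, the unpolarized Fresnel reflectance of a flat, foam-free dielectric interface, can be sketched as follows (SCIATRAN's actual BRDF additionally handles polarization, wind roughening, and foam; this is only the textbook formula):

```python
import numpy as np

def fresnel_reflectance(theta_i, n1=1.0, n2=1.33):
    """Unpolarized Fresnel reflectance at a flat dielectric interface.
    theta_i is the incidence angle in radians; n1/n2 are refractive
    indices of the incident and transmitting media (air/water defaults)."""
    sin_t = n1 * np.sin(theta_i) / n2                 # Snell's law
    cos_i, cos_t = np.cos(theta_i), np.sqrt(1.0 - sin_t ** 2)
    rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)  # s-pol amplitude
    rp = (n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)  # p-pol amplitude
    return 0.5 * (rs ** 2 + rp ** 2)                  # average of both polarizations
```

At normal incidence this reduces to ((n1-n2)/(n1+n2))^2, about 2% for an air-water interface, and the reflectance grows toward unity at grazing angles.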
Optical Breath Gas Extravehicular Activity Sensor for the Advanced Portable Life Support System
NASA Technical Reports Server (NTRS)
Wood, William R.; Casias, Miguel E.; Pilgrim, Jeffrey S.; Chullen, Cinda; Campbell, Colin
2016-01-01
The function of the infrared gas transducer used during extravehicular activity (EVA) in the current space suit is to measure and report the concentration of carbon dioxide (CO2) in the ventilation loop. The next generation portable life support system (PLSS) requires highly accurate CO2 sensing technology with performance beyond that presently in use on the International Space Station extravehicular mobility unit (EMU). Further, that accuracy needs to be provided over the full operating pressure range of the suit (3 to 25 psia). Accommodation within space suits demands that optical sensors meet stringent size, weight, and power requirements. A laser diode (LD) sensor based on infrared absorption spectroscopy is being developed for this purpose by Vista Photonics, Inc. Version 1.0 prototype devices were delivered to NASA Johnson Space Center (JSC) in September 2011. The prototypes were upgraded with more sophisticated communications and faster response times to version 2.0 and delivered to JSC in July 2012. The sensors incorporate a laser diode based CO2 channel that also includes an incidental water vapor (humidity) measurement. The prototypes are controlled digitally with a field-programmable gate array (FPGA) microcontroller architecture. Based on the results of the iterative instrument development, further prototype development and testing were performed, leveraging the lessons learned where feasible. The present development extends and upgrades the earlier hardware for the advanced PLSS 2.5 prototypes for testing at JSC. The prototypes provide significantly enhanced accuracy for water vapor measurement and eliminate the wavelength drift affecting the earlier versions. Various improvements to the electronics and gas sampling are currently being advanced, including the companion development of engineering development units that will ultimately be capable of radiation tolerance.
The combination of low power electronics with the performance of a long wavelength laser spectrometer enables multi-gas sensors with significantly increased performance over that presently offered in the EMU.
NASA Technical Reports Server (NTRS)
Klinar, Walter J.; Healy, Frederick M.
1955-01-01
An investigation of a 0.034-scale model of the production version of the Chance Vought F7U-3 airplane has been conducted in the Langley 20-foot free-spinning tunnel. The inverted and erect spin and recovery characteristics of the model were determined for the combat loading with the model in the clean condition, and the effect of extending the slats was investigated. A brief investigation of pilot ejection was also performed. The results indicate that the inverted spin-recovery characteristics of the airplane will be satisfactory with full rudder reversal. If the rudders can only be neutralized because of high pedal forces in the inverted spins, satisfactory recovery will be obtained if the auxiliary rudders can be moved to neutral or against the spin, provided the stick is held full forward. The optimum control technique for satisfactory recovery from erect spins will be full rudder reversal in conjunction with aileron movement to full with the spin (stick right in a right spin). Extension of the slats will have a slightly adverse effect on recoveries from inverted spins but will have a favorable effect on recoveries from erect spins. The results of brief tests indicate that if a pilot is ejected during a spin while a spin-recovery parachute is extended and fully inflated, he will probably clear the tail parachute.
Investigation of the Finite Element Software Packages at KSC
NASA Technical Reports Server (NTRS)
Lu, Chu-Ho
1991-01-01
The useful and powerful features of NASTRAN are discussed, along with three real-world problems used to test the capabilities of different NASTRAN versions. The test problems involve direct transient analysis, nonlinear analysis, and static analysis. Experiences in using graphics software packages are also discussed. It was found that MSC/XL would be more useful if it could be improved to generate picture files of the analysis results and if its capabilities were extended to support finite element codes other than MSC/NASTRAN. It was also found that the current version of SDRC/I-DEAS (version VI) may have bugs in the 'Data Loader' module.
NASA Technical Reports Server (NTRS)
Rui, Hualan; Vollmer, B.; Teng, W.; Beaudoing, H.; Rodell, M.; Silberstein, D.
2015-01-01
GLDAS-2.0 data have been reprocessed with updated Princeton meteorological forcing data within the Land Information System (LIS) Version 7, and the temporal coverage has been extended to 1948-2012. The Global Land Data Assimilation System Version 2 (GLDAS-2) has two components: GLDAS-2.0, entirely forced with the Princeton meteorological forcing data, and GLDAS-2.1, forced with atmospheric analysis and observation-based data after 2001. In order to create more climatologically consistent data sets, NASA GSFC's Hydrological Sciences Laboratory (HSL) has recently reprocessed GLDAS-2.0 using the updated Princeton meteorological forcing data within LIS Version 7. GLDAS-2.0 data and data services are provided at the NASA GES DISC Hydrology Data and Information Services Center (HDISC), in collaboration with HSL.
Rahimpour, M; Mohammadzadeh Asl, B
2016-07-01
Monitoring atrial activity via P waves is an important feature of the arrhythmia detection procedure. The aim of this paper is to present an algorithm for P wave detection in normal and some abnormal records by improving existing methods in the field of signal processing. In contrast to classical approaches, which are completely blind to signal dynamics, our proposed method uses the extended Kalman filter, EKF25, to estimate the state variables of the equations modeling the dynamics of an ECG signal. This method is a modified version of the nonlinear dynamical model previously introduced for the generation of synthetic ECG signals and fiducial point extraction in normal ones. It is capable of estimating the separate types of activity of the heart with reasonable accuracy and performs well in the presence of morphological variations in the waveforms and ectopic beats. The MIT-BIH Arrhythmia and QT databases have been used to evaluate the performance of the proposed method. The results show that this method achieves Se = 98.38% and Pr = 96.74% over all records (considering normal and abnormal rhythms).
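The estimation machinery underlying the method is the standard extended Kalman filter predict/update cycle. A generic sketch is given below; the paper's EKF25 uses a specific 25-state nonlinear ECG dynamical model, whereas here `f`, `h`, and their Jacobians are caller-supplied placeholders:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One extended Kalman filter iteration.
    x, P: state estimate and covariance; z: new measurement;
    f/h: nonlinear dynamics and observation functions;
    F_jac/H_jac: their Jacobians; Q/R: process and measurement noise."""
    # Predict: propagate the state through the nonlinear dynamics
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the measurement
    H = H_jac(x_pred)
    y = z - h(x_pred)                       # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the P wave application, the filtered state trajectory separates the atrial and ventricular components of the ECG, from which the fiducial points are extracted.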
Enabling devices, empowering people: the design and evaluation of Trackball EdgeWrite.
Wobbrock, Jacob O; Myers, Brad A
2008-01-01
To describe the research and development that led to Trackball EdgeWrite, a gestural text entry method that improves desktop input for some people with motor impairments. To compare the character-level version of this technique with a new word-level version. Further, to compare the technique with competitor techniques that use on-screen keyboards. A rapid and iterative design-and-test approach was used to generate working prototypes and elicit quantitative and qualitative feedback from a veteran trackball user. In addition, theoretical modelling based on the Steering law was used to compare competing designs. One result is a refined software artifact, Trackball EdgeWrite, which represents the outcome of this investigation. A theoretical result shows the speed benefit of word-level stroking compared to character-level stroking, which resulted in a 45.0% improvement. Empirical results of a trackball user with a spinal cord injury indicate a peak performance of 8.25 wpm with the character-level version of Trackball EdgeWrite and 12.09 wpm with the word-level version, a 46.5% improvement. Log file analysis of extended real-world text entry shows stroke savings of 43.9% with the word-level version. Both versions of Trackball EdgeWrite were better than on-screen keyboards, particularly regarding user preferences. Follow-up correspondence shows that the veteran trackball user with a spinal cord injury still uses Trackball EdgeWrite on a daily basis 2 years after his initial exposure to the software. Trackball EdgeWrite is a successful new method for desktop text entry and may have further implications for able-bodied users of mobile technologies. Theoretical modelling is useful in combination with empirical testing to explore design alternatives. Single-user lab and field studies can be useful for driving a rapid iterative cycle of innovation and development.
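The reported 46.5% empirical improvement follows directly from the two peak entry rates:

```python
# Peak text-entry rates reported for the participant (words per minute)
char_wpm, word_wpm = 8.25, 12.09
improvement = (word_wpm - char_wpm) / char_wpm  # relative speed-up of word-level entry
```

The same calculation applied to the Steering-law model's predicted rates yields the 45.0% theoretical improvement, so theory and empirical data agree closely here.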
Lieberman, Harris R; Kramer, F Matthew; Montain, Scott J; Niro, Philip
2007-05-01
Limited opportunities exist to study human cognitive performance in non-laboratory, ambulatory situations. However, advances in technology make it possible to extend behavioral assessments to the field. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device acquires minute-by-minute information on an individual's physical activity and can distinguish sleep from waking, the most basic aspect of behavior. Our laboratory developed a series of wrist-worn devices, not much larger than a watch, which assess reaction time, vigilance, and memory. The devices concurrently assess motor activity with greater temporal resolution than standard actigraphs. They also continuously monitor multiple environmental variables, including temperature, humidity, sound, and light. These monitors have been employed during training and simulated military operations to collect behavioral and environmental information that would typically be unavailable under such circumstances. The development of the vigilance monitor is described, along with how each successive version extended the capabilities of the device. Data from several studies are presented, including studies conducted in harsh field environments during a simulated infantry assault and an officer training course. The monitors simultaneously documented environmental conditions, patterns of sleep and activity, and the effects of nutritional manipulations on cognitive performance. They provide a new method to relate cognitive performance to real-world environmental conditions and to assess the effects of various interventions on human behavior in the field. They can also monitor cognitive performance in real time and, if it is degraded, attempt to intervene to maintain
Development and Characterization of High-Efficiency, High-Specific Impulse Xenon Hall Thrusters
NASA Technical Reports Server (NTRS)
Hofer, Richard R.; Jacobson, David (Technical Monitor)
2004-01-01
This dissertation presents research aimed at extending the efficient operation of 1600 s specific impulse Hall thruster technology to the 2000 to 3000 s range. Motivated by previous industry efforts and mission studies, the aim of this research was to develop and characterize xenon Hall thrusters capable of both high-specific-impulse and high-efficiency operation. During the development phase, the laboratory-model NASA-173M Hall thrusters were designed and their performance and plasma characteristics were evaluated. Experiments with the NASA-173M version 1 (v1) validated the plasma lens magnetic field design. Experiments with the NASA-173M version 2 (v2) showed there was a minimum current density and an optimum magnetic field topography at which efficiency monotonically increased with voltage. Comparison of the thrusters showed that efficiency can be optimized for specific impulse by varying the plasma lens. During the characterization phase, additional plasma properties of the NASA-173Mv2 were measured and a performance model was derived. Results from the model and experimental data showed how efficient operation at high specific impulse is enabled through regulation of the electron current with the magnetic field. The electron Hall parameter was approximately constant with voltage, which confirmed that efficient operation can be realized only over a limited range of Hall parameters.
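The performance quantities discussed, specific impulse and thrust efficiency, relate thrust, mass flow rate, and power through the standard definitions Isp = T/(mdot g0) and eta = T^2/(2 mdot P). A sketch with illustrative numbers (not measured NASA-173M data):

```python
G0 = 9.80665  # standard gravity, m/s^2

def hall_performance(thrust_N, mdot_kg_s, power_W):
    """Specific impulse (s) and thrust efficiency from thrust, propellant
    mass flow rate, and discharge power, using the standard definitions."""
    isp = thrust_N / (mdot_kg_s * G0)
    efficiency = thrust_N ** 2 / (2.0 * mdot_kg_s * power_W)
    return isp, efficiency

# Hypothetical operating point: 96 mN thrust, 5 mg/s xenon, 3 kW discharge
isp, eff = hall_performance(0.096, 5.0e-6, 3000.0)
```

With these illustrative inputs, Isp lands near 2000 s, the lower end of the range targeted by the dissertation, showing why raising Isp at fixed efficiency requires higher discharge voltage (exhaust velocity) per unit mass flow.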
Improved methods for predicting peptide binding affinity to MHC class II molecules.
Jensen, Kamilla Kjaergaard; Andreatta, Massimo; Marcatili, Paolo; Buus, Søren; Greenbaum, Jason A; Yan, Zhen; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten
2018-07-01
Major histocompatibility complex class II (MHC-II) molecules are expressed on the surface of professional antigen-presenting cells where they display peptides to T helper cells, which orchestrate the onset and outcome of many host immune responses. Understanding which peptides will be presented by the MHC-II molecule is therefore important for understanding the activation of T helper cells and can be used to identify T-cell epitopes. We here present updated versions of two MHC-II-peptide binding affinity prediction methods, NetMHCII and NetMHCIIpan. These were constructed using an extended data set of quantitative MHC-peptide binding affinity data obtained from the Immune Epitope Database covering HLA-DR, HLA-DQ, HLA-DP and H-2 mouse molecules. We show that training with this extended data set improved the performance for peptide binding predictions for both methods. Both methods are publicly available at www.cbs.dtu.dk/services/NetMHCII-2.3 and www.cbs.dtu.dk/services/NetMHCIIpan-3.2. © 2018 John Wiley & Sons Ltd.
Extended behavioural device modelling and circuit simulation with Qucs-S
NASA Astrophysics Data System (ADS)
Brinson, M. E.; Kuznetsov, V.
2018-03-01
Current trends in circuit simulation suggest a growing interest in open source software that allows access to more than one simulation engine while simultaneously supporting schematic drawing tools, behavioural Verilog-A and XSPICE component modelling, and output data post-processing. This article introduces a number of new features recently implemented in the 'Quite universal circuit simulator - SPICE variant' (Qucs-S), including its structure and fundamental schematic capture algorithms, while highlighting their use in behavioural semiconductor device modelling. Particular importance is placed on the interaction between Qucs-S schematics, equation-defined devices, SPICE B behavioural sources, and hardware description language (HDL) scripts. The multi-simulator version of Qucs is a freely available tool that offers extended modelling and simulation features compared to those provided by legacy circuit simulators. The performance of a number of Qucs-S modelling extensions is demonstrated with a GaN HEMT compact device model and data obtained from tests using the Qucs-S/Ngspice/Xyce/SPICE OPUS multi-engine circuit simulator.
NASA Astrophysics Data System (ADS)
Lazonder, Ard W.; Wiskerke-Drost, Sjanou
2015-02-01
Several studies found that direct instruction and task structuring can effectively promote children's ability to design unconfounded experiments. The present study examined whether the impact of these interventions extends to other scientific reasoning skills by comparing the inquiry activities of 55 fifth-graders randomly assigned to one of three conditions. Children in the control condition investigated a four-variable inquiry task without additional support. Performance of this task in the direct instruction condition was preceded by a short training in experimental design, whereas children in the task structuring condition, who did not receive the introductory training, were given a version of the task that addressed the four variables one at a time. Analysis of children's experimentation behavior confirmed that direct instruction and task structuring are equally effective and superior to unguided inquiry. Both interventions also evoked more determinate predictions and valid inferences. These findings demonstrate that the effect of short-term interventions designed to promote unconfounded experimentation extends beyond the control of variables.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. 
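The forward-chaining, rule-based paradigm described above can be illustrated with a toy production system. The sketch below is a Python analogue for illustration only; it is not CLIPS syntax, and real CLIPS matches rules far more efficiently (via the Rete algorithm) than this naive loop:

```python
# A toy forward-chaining production system: each rule is a pair
# (set-of-condition-facts, fact-to-conclude). Rules fire repeatedly
# until no rule can add a new fact (a fixed point is reached).

def run_rules(facts, rules):
    """Naive forward chaining: return the closure of `facts` under `rules`."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            # A rule fires when all its condition facts are present
            # and its conclusion is not yet known.
            if condition <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative "rules of thumb" (hypothetical example facts)
rules = [
    ({"has-feathers"}, "is-bird"),
    ({"is-bird", "can-fly"}, "nests-in-trees"),
]
```

Chaining works as expected: starting from `{"has-feathers", "can-fly"}`, the first rule derives `is-bird`, which then enables the second rule.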
CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. 
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. 
Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or Microsoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4 MB 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4 MB Macintosh format diskettes, and requires System 6.0.5 or higher and 1 MB RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
A method to perform a fast Fourier transform with primitive image transformations.
Sheridan, Phil
2007-05-01
The Fourier transform is one of the most important transformations in image processing. A major component of this influence comes from the ability to implement it efficiently on a digital computer. This paper describes a new methodology to perform a fast Fourier transform (FFT). This methodology emerges from considerations of the natural physical constraints imposed by image capture devices (camera/eye). The novel aspects of the specific FFT method described include: 1) a bit-wise reversal re-grouping operation of the conventional FFT is replaced by the use of lossless image rotation and scaling and 2) the usual arithmetic operations of complex multiplication are replaced with integer addition. The significance of the FFT presented in this paper is introduced by extending a discrete and finite image algebra, named Spiral Honeycomb Image Algebra (SHIA), to a continuous version, named SHIAC.
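For orientation, the bit-wise reversal regrouping of a conventional radix-2 FFT (the step the paper replaces with lossless image rotation and scaling) can be sketched as follows; this is a generic illustration, not the SHIA-based method itself:

```python
def bit_reverse_permute(x):
    """Reorder a sequence into bit-reversed index order.

    This is the regrouping step of a conventional radix-2 FFT, shown
    here only to illustrate what the SHIA-based method replaces.
    """
    n = len(x)
    assert n and (n & (n - 1)) == 0, "length must be a power of two"
    bits = n.bit_length() - 1
    out = [0] * n
    for i in range(n):
        # reverse the 'bits' low-order bits of index i
        rev = int(format(i, f'0{bits}b')[::-1], 2)
        out[rev] = x[i]
    return out

# For n = 8, index 1 (001) maps to 4 (100), index 3 (011) to 6 (110), etc.
print(bit_reverse_permute(list(range(8))))  # → [0, 4, 2, 6, 1, 5, 3, 7]
```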
Blaser, R E; Wilber, Julie
2013-11-01
Performance on a typical pen-and-paper (figural) version of the Traveling Salesman Problem was compared to performance on a room-sized navigational version of the same task. Nine configurations were designed to examine the use of the nearest-neighbor (NN), cluster approach, and convex-hull strategies. Performance decreased with an increasing number of nodes internal to the hull, and improved when the NN strategy produced the optimal path. There was no overall difference in performance between figural and navigational task modalities. However, there was an interaction between modality and configuration, with evidence that participants relied more heavily on the NN strategy in the figural condition. Our results suggest that participants employed similar, but not identical, strategies when solving figural and navigational versions of the problem. Surprisingly, there was no evidence that participants favored global strategies in the figural version and local strategies in the navigational version.
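The nearest-neighbor strategy examined in the study can be sketched as follows; the coordinates are hypothetical and this is not the experimental task software:

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Greedy nearest-neighbor heuristic: from the current node,
    repeatedly visit the closest unvisited node."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Collinear points: the NN strategy produces the optimal left-to-right path
print(nearest_neighbor_tour([(0, 0), (1, 0), (2, 0), (3, 0)]))  # → [0, 1, 2, 3]
```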
Pezzuti, Lina; Mastrantonio, Elisa; Orsini, Arturo
2013-01-01
The goal of this project was to construct and validate an ecological version of the Wisconsin Card Sorting Test (WCST) aimed at the elderly. This was accomplished by replacing the geometric stimuli of the traditional version with stimuli belonging to the semantic category of transport vehicles, and by elimination of the color yellow. The results showed the ecological WCST version was preferred over the traditional version and older people felt less tired during test performance. In the two independent normal elderly groups, all pairs of scores that can be derived from the WCST correlated significantly with each other. Six of 11 outcome measures of the traditional WCST-128 (long) version were significantly influenced by age. By contrast, in the WCST-64 (short) version and in the ecological WCST-54 version only one measure was affected by the age variable. No significant effect of education level or gender emerged from the results in any WCST version. Again, the subjects with cognitive deterioration had lower performance in the ecological WCST-54 version than in the two traditional WCST versions. It seems reasonable to assume that the ecological version of WCST is more discriminating and has more advantages than the traditional versions. Further research on individual differences in the performance on this task will increase understanding of the components of the test, and of the variety of factors and possible deficits that could lead to an impaired performance of the test.
Use of the focusing multi-slit ion optical system at RUssian Diagnostic Injector (RUDI)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Listopad, A.; Davydenko, V.; Ivanov, A.
2012-02-15
The upgrade of the diagnostic neutral beam injector RUDI in 2010 was performed to increase the beam density at the focal plane in accordance with the requirements of charge-exchange recombination spectroscopy diagnostics. A new focusing ion-optical system (IOS) with slit beamlets and an enlarged aperture was optimized for 50% higher nominal beam current and reduced angular divergence with respect to the previous multi-aperture IOS version. The upgraded injector provides a beam current of up to 3 A; the measured beam divergence in the direction along the slits is 0.35 deg. Additionally, the plasma generator was modified to extend the beam pulse to 8 s.
A Generalized Approach for Measuring Relationships Among Genes.
Wang, Lijun; Ahsan, Md Asif; Chen, Ming
2017-07-21
Several methods for identifying relationships among pairs of genes have been developed. In this article, we present a generalized approach for measuring relationships between any pair of genes, based on statistical prediction. We derive two particular versions of the generalized approach: least squares estimation (LSE) and nearest neighbors prediction (NNP). By mathematical proof, LSE is equivalent to the methods based on correlation, and NNP approximates one popular method, the maximal information coefficient (MIC), according to performance in simulations and on a real dataset. Moreover, the approach based on statistical prediction can be extended from two-gene relationships to multi-gene relationships, which would help to identify relationships among multiple genes.
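A minimal sketch of the two predictors follows (hypothetical data, not the authors' implementation): LSE reduces to a Pearson-style linear score, while NNP predicts each value from the sample whose value in the other gene is nearest.

```python
def lse_r2(x, y):
    """Least-squares relationship score; equals the squared Pearson correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def nnp_score(x, y):
    """Nearest-neighbor prediction: predict y[i] from the y of x's closest
    other sample; score as 1 - (squared error / total variance)."""
    preds = []
    for i in range(len(x)):
        j = min((k for k in range(len(x)) if k != i), key=lambda k: abs(x[k] - x[i]))
        preds.append(y[j])
    my = sum(y) / len(y)
    sse = sum((p - t) ** 2 for p, t in zip(preds, y))
    sst = sum((t - my) ** 2 for t in y)
    return 1 - sse / sst

x = [1, 2, 3, 4, 5]
print(lse_r2(x, [2 * v + 1 for v in x]))  # → 1.0 for an exact linear relation
```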
2004-04-15
STEP will carry concentric test masses to Earth orbit to test a fundamental assumption underlying Einstein's theory of general relativity: that gravitational mass is equivalent to inertial mass. STEP is a 21st-century version of the test that Galileo is said to have performed by dropping a cannon ball and a musket ball simultaneously from the top of the Leaning Tower of Pisa to compare their accelerations. During the STEP experiment, four pairs of test masses will be falling around the Earth, and their accelerations will be measured by superconducting quantum interference devices (SQUIDs). The extended time sensitivity of the instruments will allow the measurements to be a million times more accurate than those made in modern ground-based tests.
Version pressure feedback mechanisms for speculative versioning caches
Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong
2013-03-12
Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
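The feedback loop described can be sketched abstractly as follows; the function name, statistic, and threshold are hypothetical, since the patent text does not specify an implementation:

```python
def version_pressure_feedback(samples, threshold=0.75):
    """Sketch of the described loop: summarize raw per-thread version-pressure
    samples into a statistical measure and decide whether to modify the
    system's operation (here, throttling speculation) based on it."""
    if not samples:
        return {"mean_pressure": 0.0, "throttle_speculation": False}
    mean_pressure = sum(samples) / len(samples)
    # modify operation only when the statistic exceeds the threshold
    return {"mean_pressure": mean_pressure,
            "throttle_speculation": mean_pressure > threshold}

print(version_pressure_feedback([0.9, 0.8, 0.85]))
```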
Micijevic, Esad; Morfitt, Ron
2010-01-01
Systematic characterization and calibration of the Landsat sensors and the assessment of image data quality are performed using the Image Assessment System (IAS). The IAS was first introduced as an element of the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) ground segment and recently extended to Landsat 4 (L4) and 5 (L5) Thematic Mappers (TM) and Multispectral Sensors (MSS) on-board the Landsat 1-5 satellites. In preparation for the Landsat Data Continuity Mission (LDCM), the IAS was developed for the Earth Observer 1 (EO-1) Advanced Land Imager (ALI) with a capability to assess pushbroom sensors. This paper describes the LDCM version of the IAS and how it relates to unique calibration and validation attributes of its on-board imaging sensors. The LDCM IAS system will have to handle a significantly larger number of detectors and the associated database than the previous IAS versions. An additional challenge is that the LDCM IAS must handle data from two sensors, as the LDCM products will combine the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) spectral bands.
Tensor Fukunaga-Koontz transform for small target detection in infrared images
NASA Astrophysics Data System (ADS)
Liu, Ruiming; Wang, Jingzhuo; Yang, Huizhen; Gong, Chenglong; Zhou, Yuanshen; Liu, Lipeng; Zhang, Zhen; Shen, Shuli
2016-09-01
Infrared small target detection plays a crucial role in warning and tracking systems. Some novel methods based on pattern recognition technology have caught the attention of researchers. However, those classic methods must reshape images into high-dimensional vectors, and vectorizing breaks the natural structure and correlations in the image data. Image representation based on tensors treats images as matrices and can preserve this natural structure and correlation information, so tensor algorithms achieve better classification performance than vector algorithms. The Fukunaga-Koontz transform is a classification algorithm, but in its original form it is a vector method and shares the disadvantages of all vector algorithms. In this paper, we first extended the Fukunaga-Koontz transform into its tensor version, the tensor Fukunaga-Koontz transform. We then designed a detection method based on this transform and used it to detect small targets in infrared images. The experimental results, compared through signal-to-clutter ratio, signal-to-clutter gain, and background suppression factor, validate the advantage of target detection based on the tensor Fukunaga-Koontz transform over that based on the original Fukunaga-Koontz transform.
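The vector Fukunaga-Koontz transform that the paper generalizes can be sketched as follows (a standard two-class formulation on synthetic data, assuming NumPy is available; not the authors' code). After whitening the summed scatter matrix, the two classes share an eigenbasis, and their eigenvalues pair up as lam and 1 - lam:

```python
import numpy as np

def fukunaga_koontz(X1, X2):
    """Vector Fukunaga-Koontz transform for two classes.

    X1, X2: (n_samples, n_features) arrays, one per class.
    Whitening S1 + S2 makes the classes share eigenvectors, with
    eigenvalue pairs (lam, 1 - lam): directions most discriminative
    for one class are least discriminative for the other.
    """
    S1 = X1.T @ X1 / len(X1)   # class scatter (second-moment) matrices
    S2 = X2.T @ X2 / len(X2)
    vals, vecs = np.linalg.eigh(S1 + S2)
    P = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T   # whitening transform
    lam, V = np.linalg.eigh(P @ S1 @ P)
    return lam, P @ V   # eigenvalues in [0, 1], and the shared basis

rng = np.random.default_rng(0)
lam, basis = fukunaga_koontz(rng.normal(size=(100, 4)), rng.normal(size=(100, 4)))
print(np.all((lam > -1e-9) & (lam < 1 + 1e-9)))  # eigenvalues lie in [0, 1] up to round-off
```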
MPI-Defrost: Extension of Defrost to MPI-based Cluster Environment
NASA Astrophysics Data System (ADS)
Amin, Mustafa A.; Easther, Richard; Finkel, Hal
2011-06-01
MPI-Defrost extends Frolov’s Defrost to an MPI-based cluster environment. This version has been restricted to a single field. Restoring two-field support should be straightforward, but will require some code changes. Some output options may also not be fully supported under MPI. This code was produced to support our own work, and has been made available for the benefit of anyone interested in either oscillon simulations or an MPI capable version of Defrost, and it is provided on an "as-is" basis. Andrei Frolov is the primary developer of Defrost and we thank him for placing his work under the GPL (GNU Public License), and thus allowing us to distribute this modified version.
CERT Resilience Management Model - Mail-Specific Process Areas: Mail Revenue Assurance (Version 1.0)
2014-08-01
Revenue Assurance (MRA), Mail Transportation (MT), and Mail Delivery (MD)—were accepted by the USPIS, as well as an initial draft of the MRA PA ... versions of two complete PAs, MI [Allen 2014b] and MRA, were accepted by the USPIS. Following this initial effort, the USPIS asked CERT to extend the ... Revenue Assurance (MRA) is to ensure that the USPS is compensated for all mail that is accepted, transported, and delivered. Outline MRA:SG1
Development of an Automatic Differentiation Version of the FPX Rotor Code
NASA Technical Reports Server (NTRS)
Hu, Hong
1996-01-01
The ADIFOR2.0 automatic differentiator is applied to the FPX rotor code along with the grid generator GRGN3. The FPX is an eXtended Full-Potential CFD code for rotor calculations. The automatic differentiation version of the code is obtained, which provides both non-geometry and geometry sensitivity derivatives. The sensitivity derivatives via automatic differentiation are presented and compared with divided difference generated derivatives. The study shows that automatic differentiation method gives accurate derivative values in an efficient manner.
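The kind of check described, comparing automatically generated derivatives against divided-difference derivatives, can be illustrated with a minimal forward-mode sketch using dual numbers (a generic stand-in, not ADIFOR output):

```python
class Dual:
    """Forward-mode AD value: carries a function value and its derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2, so f'(2) = 14

x0, h = 2.0, 1e-6
exact = f(Dual(x0, 1.0)).der        # automatic differentiation
divided = (f(x0 + h) - f(x0)) / h   # divided-difference approximation
print(exact)                        # → 14.0
print(abs(exact - divided) < 1e-4)  # → True
```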
Batel, Susana; Castro, Paula
2018-06-28
The theory of social representations (TSR) and discursive psychology (DP) originated as different social psychological approaches and have at times been presented as incompatible. However, along the years convergence has also been acknowledged, and, lately, most of all, practised. With this paper, we discuss how versions of TSR focusing on self-other relations for examining cultural meaning systems in/through communication, and versions of DP focusing on discourse at cultural, ideological, and interactional levels, can come together. The goal is to help forge a stronger social-psychological exploration of how meaning is constructed and transformed in and through language, discourse, and communication, thus extending current understanding of social change. After presenting a theoretical proposal for integrating those versions of TSR and DP, we offer also an integrated analytical strategy. We suggest that together these proposals can, on one hand, help TSR systematize analyses of social change that are both more critical and better grounded in theorizations of language use, and, on the other, provide DP with analytical tools able to better examine both the relational contexts where the construction and transformation of meaning are performed and their effects on discourse. Finally, we give some illustrations of the use of this analytical strategy. © 2018 The British Psychological Society.
Evaluation of Extended-Wear Hearing Technology for Children with Hearing Loss.
Wolfe, Jace; Schafer, Erin; Martella, Natalie; Morais, Mila; Mann, Misty
2015-01-01
Research shows that many older children and teenagers who have mild to moderately severe sensorineural hearing loss do not use their hearing instruments during all waking hours. A variety of reasons may contribute toward this problem, including concerns about cosmetics associated with hearing aid use and the inconvenience of daily maintenance associated with hearing instruments. Extended-wear hearing instruments are inserted into the wearer's ear canal by an audiologist and are essentially invisible to outside observers. The goal of this study was to evaluate the potential benefits and limitations associated with use of extended-wear hearing instruments in a group of children with hearing loss. A two-way repeated measures design was used to examine performance differences obtained with the participants' daily-wear hearing instruments versus that obtained with extended-wear hearing instruments. Sixteen children, ages 10-17 yr old, with sensorineural hearing loss ranging from mild to moderately severe. Probe microphone measures were completed to evaluate the aided output of device. Behavioral test measures included word recognition in quiet, sentence recognition in noise, aided warble-tone thresholds, and psychophysical loudness scaling. Questionnaires were also administered to evaluate subjective performance with each hearing technology. Data logging suggested that many participants were not using their daily-wear hearing instruments during all waking hours (mean use was less than 6 h/day). Real ear probe microphone measurements indicated that a closer fit to the Desired Sensation Level Version 5 prescriptive targets was achieved with the children's daily-wear instruments when compared to the extended-wear instruments. There was no statistically significant difference in monosyllabic word recognition at 50 or 60 dBA obtained with the two hearing technologies. 
Sentence recognition in noise obtained with use of the extended-wear devices was, however, significantly better than what was obtained with the daily-wear hearing aids. Aided warble-tone thresholds indicated significantly better audibility for low-level sounds with use of the daily-wear hearing technology, but loudness scaling results produced mixed results. Specifically, the participants generally reported greater loudness perception with use of their daily-wear hearing aids at 2000 Hz, but use of the extended-wear hearing technology provided greater loudness perception at 4000 Hz. Finally, the participants reported higher levels of subjective performance with use of the extended-wear hearing instruments. Although some measures suggested that daily-wear hearing instruments provided better audibility than the extended-wear hearing devices, word recognition in quiet was similar with use of the two technologies, and sentence recognition in noise was better with the extended-wear hearing technology. In addition, the participants in this study reported better subjective benefit associated with the use of extended-wear hearing technology. Collectively, the results of this study suggest that extended-wear hearing technology is a viable option for older children and teenagers with mild to moderately severe hearing loss. American Academy of Audiology.
Network-based de-noising improves prediction from microarray data.
Kato, Tsuyoshi; Murata, Yukio; Miura, Koh; Asai, Kiyoshi; Horton, Paul B; Tsuda, Koji; Fujibuchi, Wataru
2006-03-20
Prediction of human cell response to anti-cancer drugs (compounds) from microarray data is a challenging problem, due to the noise properties of microarrays as well as the high variance of living cell responses to drugs. Hence there is a strong need for more practical and robust methods than standard methods for real-value prediction. We devised an extended version of the off-subspace noise-reduction (de-noising) method to incorporate heterogeneous network data such as sequence similarity or protein-protein interactions into a single framework. Using that method, we first de-noise the gene expression data for training and test data and also the drug-response data for training data. Then we predict the unknown responses of each drug from the de-noised input data. For ascertaining whether de-noising improves prediction or not, we carry out 12-fold cross-validation for assessment of the prediction performance. We use the Pearson's correlation coefficient between the true and predicted response values as the prediction performance. De-noising improves the prediction performance for 65% of drugs. Furthermore, we found that this noise reduction method is robust and effective even when a large amount of artificial noise is added to the input data. We found that our extended off-subspace noise-reduction method combining heterogeneous biological data is successful and quite useful to improve prediction of human cell cancer drug responses from microarray data.
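The evaluation protocol, k-fold cross-validation scored by Pearson's correlation between true and predicted responses, can be sketched as follows; the data and the simple linear stand-in predictor are hypothetical:

```python
import statistics

def pearson(a, b):
    """Pearson's correlation coefficient between two sequences."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def kfold_pearson(xs, ys, k, fit_predict):
    """k-fold CV: fit on k-1 folds, predict the held-out fold,
    score all out-of-fold predictions by Pearson's r."""
    preds = [None] * len(xs)
    for fold in range(k):
        train = [i for i in range(len(xs)) if i % k != fold]
        model = fit_predict([xs[i] for i in train], [ys[i] for i in train])
        for i in range(len(xs)):
            if i % k == fold:
                preds[i] = model(xs[i])
    return pearson(preds, ys)

def linear_fit(xs, ys):
    """Least-squares line fit, a hypothetical stand-in predictor."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return lambda x: my + b * (x - mx)

xs = list(range(24))
ys = [2 * x + 1 for x in xs]
print(round(kfold_pearson(xs, ys, 12, linear_fit), 6))  # → 1.0 on noiseless linear data
```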
Heimbauer, Lisa A; Antworth, Rebecca L; Owren, Michael J
2012-01-01
Nonhuman primates appear to capitalize more effectively on visual cues than corresponding auditory versions. For example, studies of inferential reasoning have shown that monkeys and apes readily respond to seeing that food is present ("positive" cuing) or absent ("negative" cuing). Performance is markedly less effective with auditory cues, with many subjects failing to use this input. Extending recent work, we tested eight captive tufted capuchins (Cebus apella) in locating food using positive and negative cues in visual and auditory domains. The monkeys chose between two opaque cups to receive food contained in one of them. Cup contents were either shown or shaken, providing location cues from both cups, positive cues only from the baited cup, or negative cues from the empty cup. As in previous work, subjects readily used both positive and negative visual cues to secure reward. However, auditory outcomes were both similar to and different from those of earlier studies. Specifically, all subjects came to exploit positive auditory cues, but none responded to negative versions. The animals were also clearly different in visual versus auditory performance. Results indicate that a significant proportion of capuchins may be able to use positive auditory cues, with experience and learning likely playing a critical role. These findings raise the possibility that experience may be significant in visually based performance in this task as well, and highlight that coming to grips with evident differences between visual versus auditory processing may be important for understanding primate cognition more generally.
Reliability and Validity of the Turkish Version of the Job Performance Scale Instrument.
Harmanci Seren, Arzu Kader; Tuna, Rujnan; Eskin Bacaksiz, Feride
2018-02-01
Objective measurement of the job performance of nursing staff using valid and reliable instruments is important in the evaluation of healthcare quality. A current, valid, and reliable instrument that specifically measures the performance of nurses is required for this purpose. The aim of this study was to determine the validity and reliability of the Turkish version of the Job Performance Instrument. This study used a methodological design and a sample of 240 nurses working at different units in four hospitals in Istanbul, Turkey. A descriptive data form, the Job Performance Scale, and the Employee Performance Scale were used to collect data. Data were analyzed using IBM SPSS Statistics Version 21.0 and LISREL Version 8.51. On the basis of the data analysis, the instrument was revised. Some items were deleted, and subscales were combined. The Turkish version of the Job Performance Instrument was determined to be valid and reliable to measure the performance of nurses. The instrument is suitable for evaluating current nursing roles.
NASA Technical Reports Server (NTRS)
Conger, A. M.; Hancock, D. W., III; Hayne, G. S.; Brooks, R. L.
2006-01-01
The purpose of this document is to present and document GFO performance analyses and results. This is the fifth Assessment Report since the initial report. This report extends the performance assessment since acceptance to 26 December 2005. The initial GFO Altimeter Engineering Assessment Report, March 2001 (NASA/TM-2001-209984/Ver.1/Vol.1) covered the GFO performance from Launch to Acceptance (10 February 1998 to 29 November 2000). The second of the series covered the performance from Acceptance to the end of Cycle 20 (29 November 2000 to 21 November 2001). The third of the series covered the performance from Acceptance to the end of Cycle 42 (29 November 2000 to 30 November 2002). The fourth of the series covered the performance from Acceptance to the end of Cycle 64 (29 November 2000 to 17 December 2003). The fifth of the series covered performance from Acceptance to the end of Cycle 86 (29 November 2000 to 17 December 2004). Since launch, we have performed a variety of GFO performance studies; an accumulative index of those studies is provided in Appendix A.
Multistage Planetary Power Transmissions
NASA Technical Reports Server (NTRS)
Hadden, G. B.; Dyba, G. J.; Ragen, M. A.; Kleckner, R. J.; Sheynin, L.
1986-01-01
PLANETSYS simulates the thermomechanical performance of multistage planetary power transmissions. Two versions of the code were developed: an SKF version and a NASA version. The major function of the program is to compute the performance characteristics of planet bearings for any of six kinematic inversions. PLANETSYS solves heat-balance equations for either steady-state or transient thermal conditions, and produces temperature maps for the mechanical system.
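A steady-state heat-balance solve of the general kind described can be sketched on a toy thermal network (a generic Gauss-Seidel illustration; the node values and conductances are hypothetical, not PLANETSYS internals):

```python
def heat_balance(G, q, t_fixed, iters=200):
    """Solve a steady-state nodal heat balance by Gauss-Seidel iteration.

    G[i][j]: thermal conductance between nodes i and j (symmetric, W/K),
    q[i]: heat generated at node i (W), t_fixed: {node: boundary temp (K)}.
    At steady state, sum_j G[i][j]*(T[j] - T[i]) + q[i] = 0 at each free node.
    """
    n = len(q)
    T = [t_fixed.get(i, 0.0) for i in range(n)]
    for _ in range(iters):
        for i in range(n):
            if i in t_fixed:
                continue
            g = sum(G[i])
            T[i] = (sum(G[i][j] * T[j] for j in range(n)) + q[i]) / g
    return T

# Three-node chain: node 0 held at 300 K, 10 W injected at node 2,
# unit conductance between neighbors (all values hypothetical)
G = [[0.0, 1.0, 0.0],
     [1.0, 0.0, 1.0],
     [0.0, 1.0, 0.0]]
q = [0.0, 0.0, 10.0]
print([round(t, 3) for t in heat_balance(G, q, {0: 300.0})])  # → [300.0, 310.0, 320.0]
```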
Adaptive multi-GPU Exchange Monte Carlo for the 3D Random Field Ising Model
NASA Astrophysics Data System (ADS)
Navarro, Cristóbal A.; Huang, Wei; Deng, Youjin
2016-08-01
This work presents an adaptive multi-GPU Exchange Monte Carlo approach for the simulation of the 3D Random Field Ising Model (RFIM). The design is based on a two-level parallelization. The first level, spin-level parallelism, maps the parallel computation as optimal 3D thread-blocks that simulate blocks of spins in shared memory with minimal halo surface, assuming a constant block volume. The second level, replica-level parallelism, uses multi-GPU computation to handle the simulation of an ensemble of replicas. CUDA's concurrent kernel execution feature is used in order to fill the occupancy of each GPU with many replicas, providing a performance boost that is more pronounced at the smallest values of L. In addition to the two-level parallel design, the work proposes an adaptive multi-GPU approach that dynamically builds a proper temperature set free of exchange bottlenecks. The strategy is based on mid-point insertions at the temperature gaps where the exchange rate is most compromised. The extra work generated by the insertions is balanced across the GPUs independently of where the mid-point insertions were performed. Performance results show that spin-level performance is approximately two orders of magnitude faster than a single-core CPU version and one order of magnitude faster than a parallel multi-core CPU version running on 16 cores. Multi-GPU performance is highly convenient under a weak scaling setting, reaching up to 99% efficiency as long as the number of GPUs and L increase together. The combination of the adaptive approach with the parallel multi-GPU design has extended our simulations to sizes of L = 32, 64 for a workstation with two GPUs. Sizes beyond L = 64 can eventually be studied using larger multi-GPU systems.
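The mid-point insertion strategy can be sketched as follows; the exchange rates and the acceptance threshold are hypothetical, and this is not the paper's CUDA implementation:

```python
def insert_midpoints(temps, exchange_rates, min_rate=0.2):
    """Insert mid-point temperatures into gaps whose replica-exchange rate
    falls below min_rate, i.e. the bottleneck gaps of the temperature set.

    temps: sorted temperature set; exchange_rates[i] is the measured swap
    acceptance rate between temps[i] and temps[i+1].
    """
    out = [temps[0]]
    for t_lo, t_hi, rate in zip(temps, temps[1:], exchange_rates):
        if rate < min_rate:
            out.append((t_lo + t_hi) / 2)  # relieve the bottleneck
        out.append(t_hi)
    return out

temps = [1.0, 2.0, 4.0, 8.0]
rates = [0.5, 0.1, 0.3]   # the 2.0-4.0 gap is a bottleneck
print(insert_midpoints(temps, rates))  # → [1.0, 2.0, 3.0, 4.0, 8.0]
```

In practice the loop would be repeated with fresh rate measurements until no gap falls below the threshold.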
Recent Development on the NOAA's Global Surface Temperature Dataset
NASA Astrophysics Data System (ADS)
Zhang, H. M.; Huang, B.; Boyer, T.; Lawrimore, J. H.; Menne, M. J.; Rennie, J.
2016-12-01
Global Surface Temperature (GST) is one of the most widely used indicators for climate trend and extreme analyses. A widely used GST dataset is the NOAA merged land-ocean surface temperature dataset known as NOAAGlobalTemp (formerly MLOST). NOAAGlobalTemp was recently updated from version 3.5.4 to version 4. The update includes a significant improvement in the ocean surface component (Extended Reconstructed Sea Surface Temperature, or ERSST, from version 3b to version 4), which resulted in increased temperature trends in recent decades. Since then, advancements in both the ocean component (ERSST) and the land component (GHCN-Monthly) have been made, including the inclusion of Argo float SSTs and expanded EOT modes in ERSST, and the use of the ISTI databank in GHCN-Monthly. In this presentation, we describe the impact of those improvements on the merged global temperature dataset, in terms of global trends and other aspects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiland, T.; Bartsch, M.; Becker, U.
1997-02-01
MAFIA Version 4.0 is an almost completely new version of the general-purpose electromagnetic simulator known for 13 years. The major improvements concern the new graphical user interface, based on state-of-the-art technology, as well as a series of new solvers for new physics problems. MAFIA now covers heat distribution, electro-quasistatics, S-parameters in the frequency domain, particle beam tracking in linear accelerators, acoustics, and even elastodynamics. The solvers that were available in earlier versions have also been improved and/or extended, as for example the complex eigenmode solver and the 2D-3D coupled PIC solvers. Time domain solvers have new waveguide boundary conditions with extremely low reflection even near the cutoff frequency; concentrated elements are available, as well as a variety of signal processing options. Probably the most valuable addition is the recursive sub-grid capability that enables modeling of very small details in large structures. © 1997 American Institute of Physics.
2013 Vehicle Theft Prevention Quick Reference Guide for the Law Enforcement Community
DOT National Transportation Integrated Search
2013-08-01
"This and future versions of the Vehicle Theft Prevention Quick Reference Guide for the Law Enforcement Community will provide comprehensive information for vehicle lines. The parts-marking requirements have been extended to include: all passe...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-16
... announced in the Federal Register at a later date. FOR FURTHER INFORMATION CONTACT: Tameka Cooper, GSP... is Tameka_Cooper@ustr.eop.gov. Public versions of the petitions submitted will be available for...
Relic gravitational waves and extended inflation
NASA Technical Reports Server (NTRS)
Turner, Michael S.; Wilczek, Frank
1990-01-01
In extended inflation, a new version of inflation where the transition from an inflationary to a radiation-dominated universe is accomplished by bubble nucleation, bubble collisions supply a potent - and potentially detectable - source of gravitational waves. The energy density in relic gravitons from bubble collisions is expected to be about 0.00005 of closure density. Their characteristic wavelength depends on the reheating temperature. If black holes are produced by bubble collisions, they will evaporate, producing shorter-wavelength gravitons.
DCU@TRECMed 2012: Using Ad-Hoc Baselines for Domain-Specific Retrieval
2012-11-01
description to extend the query, for example: Patients with complicated GERD who receive endoscopy will be extended with Gastroesophageal reflux disease ... Diseases and Related Health Problems, version 9) for the patient's admission or discharge status [1, 5]; treating negation (e.g. negative test results or ... codes were mapped to a description of the code, usually a short phrase/sentence. For instance, the ICD9 code 253.5 corresponds to the disease Diabetes
Agent based reasoning for the non-linear stochastic models of long-range memory
NASA Astrophysics Data System (ADS)
Kononovicius, A.; Gontis, V.
2012-02-01
We extend Kirman's model by introducing a variable event time scale. The proposed flexible time scale is equivalent to the variable trading activity observed in financial markets. The stochastic version of the extended Kirman's agent-based model is compared to the non-linear stochastic models of long-range memory in financial markets. The agent-based model, providing a matching macroscopic description, serves as a microscopic justification of the earlier proposed stochastic model exhibiting power-law statistics.
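The original Kirman herding dynamic underlying the extended model can be sketched as follows (standard two-state formulation; the parameter values are hypothetical):

```python
import random

def kirman_step(n_a, n_total, eps, h, rng):
    """One event of Kirman's two-state herding model (the model the paper extends).

    A randomly met agent switches opinion with probability
    eps (idiosyncratic switching) + h * (share holding the opposite opinion).
    """
    n_b = n_total - n_a
    if rng.random() < n_a / n_total:           # met an A-agent
        p = eps + h * n_b / (n_total - 1)
        return n_a - 1 if rng.random() < p else n_a
    p = eps + h * n_a / (n_total - 1)          # met a B-agent
    return n_a + 1 if rng.random() < p else n_a

rng = random.Random(42)
n_a, N = 50, 100
for _ in range(10_000):
    n_a = kirman_step(n_a, N, eps=0.01, h=0.5, rng=rng)
print(0 <= n_a <= N)  # → True; the A-population always stays within bounds
```

The paper's extension amounts to letting the time between such events vary with trading activity rather than being constant.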
NASA Technical Reports Server (NTRS)
Carpenter, M. H.
1988-01-01
The generalized chemistry version of the computer code SPARK is extended to include two higher-order numerical schemes, yielding fourth-order spatial accuracy for the inviscid terms. The new and old formulations are used to study the influences of finite rate chemical processes on nozzle performance. A determination is made of the computationally optimum reaction scheme for use in high-enthalpy nozzles. Finite rate calculations are compared with the frozen and equilibrium limits to assess the validity of each formulation. In addition, the finite rate SPARK results are compared with the constant ratio of specific heats (gamma) SEAGULL code, to determine its accuracy in variable gamma flow situations. Finally, the higher-order SPARK code is used to calculate nozzle flows having species stratification. Flame quenching occurs at low nozzle pressures, while for high pressures, significant burning continues in the nozzle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Reshma; Ravache, Baptiste; Sartor, Dale
India launched the Energy Conservation Building Code (ECBC) in 2007, and a revised version in 2017 as ambitious first steps towards promoting energy efficiency in the building sector. Pioneering early adopters—building owners, A&E firms, and energy consultants—have taken the lead to design customized solutions for their energy-efficient buildings. This Guide offers a synthesizing framework, critical lessons, and guidance to meet and exceed ECBC. Its whole-building lifecycle assurance framework provides a user-friendly methodology to achieve high performance in terms of energy, environmental, and societal impact. Class A offices are selected as a target typology, being a high-growth sector, with significant opportunities for energy savings. The practices may be extrapolated to other commercial building sectors, as well as extended to other regions with similar cultural, climatic, construction, and developmental contexts.
Improving Students’ Evaluation of Informal Arguments
LARSON, AARON A.; BRITT, M. ANNE; KURBY, CHRISTOPHER A.
2010-01-01
Evaluating the structural quality of arguments is a skill important to students’ ability to comprehend the arguments of others and produce their own. The authors examined college and high school students’ ability to evaluate the quality of 2-clause (claim-reason) arguments and tested a tutorial to improve this ability. These experiments indicated that college and high school students had difficulty evaluating arguments on the basis of their quality. Experiments 1 and 2 showed that a tutorial explaining skills important to overall argument evaluation increased performance but that immediate feedback during training was necessary for teaching students to evaluate the claim-reason connection. Using a Web-based version of the tutorial, Experiment 3 extended this finding to the performance of high-school students. The study suggests that teaching the structure of an argument and teaching students to pay attention to the precise message of the claim can improve argument evaluation. PMID:20174611
Extending HPF for advanced data parallel applications
NASA Technical Reports Server (NTRS)
Chapman, Barbara; Mehrotra, Piyush; Zima, Hans
1994-01-01
The stated goal of High Performance Fortran (HPF) was to 'address the problems of writing data parallel programs where the distribution of data affects performance'. After examining the current version of the language we are led to the conclusion that HPF has not fully achieved this goal. While the basic distribution functions offered by the language - regular block, cyclic, and block cyclic distributions - can support regular numerical algorithms, advanced applications such as particle-in-cell codes or unstructured mesh solvers cannot be expressed adequately. We believe that this is a major weakness of HPF, significantly reducing its chances of becoming accepted in the numeric community. The paper discusses the data distribution and alignment issues in detail, points out some flaws in the basic language, and outlines possible future paths of development. Furthermore, we briefly deal with the issue of task parallelism and its integration with the data parallel paradigm of HPF.
Development and verification of NRC's single-rod fuel performance codes FRAPCON-3 and FRAPTRAN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyer, C.E.; Cunningham, M.E.; Lanning, D.D.
1998-03-01
The FRAPCON and FRAP-T code series, developed in the 1970s and early 1980s, are used by the US Nuclear Regulatory Commission (NRC) to predict fuel performance during steady-state and transient power conditions, respectively. Both code series are now being updated by Pacific Northwest National Laboratory to improve their predictive capabilities at high burnup levels. The newest versions of the codes are called FRAPCON-3 and FRAPTRAN. The updates to fuel property and behavior models are focusing on providing best estimate predictions under steady-state and fast transient power conditions up to extended fuel burnups (> 55 GWd/MTU). Both codes will be assessed against a data base independent of the data base used for code benchmarking, and an estimate of code predictive uncertainties will be made based on comparisons to the benchmark and independent data bases.
Characterization of impulse noise and analysis of its effect upon correlation receivers
NASA Technical Reports Server (NTRS)
Houts, R. C.; Moore, J. D.
1971-01-01
A noise model is formulated to describe the impulse noise in many digital systems. A simplified model, which assumes that each noise burst contains a randomly weighted version of the same basic waveform, is used to derive the performance equations for a correlation receiver. The expected number of bit errors per noise burst is expressed as a function of the average signal energy, signal-set correlation coefficient, bit time, noise-weighting-factor variance and probability density function, and a time range function which depends on the cross-correlation of the signal-set basis functions and the noise waveform. A procedure is established for extending the results for the simplified noise model to the general model. Unlike the performance results for Gaussian noise, it is shown that for impulse noise the error performance is affected by the choice of signal-set basis functions and that orthogonal signaling is not equivalent to on-off signaling with the same average energy.
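The simplified model (each burst is a randomly weighted copy of one fixed basic waveform) lends itself to a small Monte Carlo sketch of a correlation receiver. The waveform shapes, burst probability, and weight distribution below are illustrative assumptions, not the paper's analytical setup:

```python
import random, math

def correlation_receiver_errors(n_bits=2000, burst_prob=0.05, seed=7):
    """Monte Carlo sketch of a correlation receiver under simplified
    impulse noise: a burst, when present, adds a randomly weighted copy
    of one fixed basic waveform to the bit interval."""
    rng = random.Random(seed)
    n_samp = 8
    s1 = [1.0] * n_samp                    # template for bit 1
    s0 = [-1.0] * n_samp                   # antipodal template for bit 0
    # fixed burst shape; its cross-correlation with the templates
    # determines how damaging a burst is
    burst = [math.sin(math.pi * k / n_samp) for k in range(n_samp)]
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        rx = list(s1 if bit else s0)
        if rng.random() < burst_prob:      # a burst hits this interval
            w = rng.gauss(0.0, 3.0)        # random burst weight
            rx = [r + w * b for r, b in zip(rx, burst)]
        corr1 = sum(r * t for r, t in zip(rx, s1))
        errors += (1 if corr1 > 0 else 0) != bit
    return errors
```

Swapping in a different `burst` shape (e.g., one orthogonal to the templates) changes the error count, illustrating the paper's point that impulse-noise performance, unlike Gaussian-noise performance, depends on the choice of signal-set basis functions.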
Analyzing the association between functional connectivity of the brain and intellectual performance
Pamplona, Gustavo S. P.; Santos Neto, Gérson S.; Rosset, Sara R. E.; Rogers, Baxter P.; Salmon, Carlos E. G.
2015-01-01
Measurements of functional connectivity support the hypothesis that the brain is composed of distinct networks with anatomically separated nodes but common functionality. A few studies have suggested that intellectual performance may be associated with greater functional connectivity in the fronto-parietal network and enhanced global efficiency. In this fMRI study, we performed an exploratory analysis of the relationship between the brain's functional connectivity and intelligence scores derived from the Portuguese language version of the Wechsler Adult Intelligence Scale (WAIS-III) in a sample of 29 people, born and raised in Brazil. We examined functional connectivity between 82 regions, including graph theoretic properties of the overall network. Some previous findings were extended to the Portuguese-speaking population, specifically the presence of small-world organization of the brain and relationships of intelligence with connectivity of frontal, pre-central, parietal, occipital, fusiform and supramarginal gyrus, and caudate nucleus. Verbal comprehension was associated with global network efficiency, a new finding. PMID:25713528
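Global network efficiency, one of the graph-theoretic properties examined, is the average inverse shortest-path length over all node pairs. A minimal pure-Python version for an unweighted adjacency list might look like this (an illustrative sketch, not the authors' implementation):

```python
from collections import deque

def global_efficiency(adj):
    """Global efficiency of an unweighted graph: mean of 1/d(u, v) over
    all ordered node pairs, where d is shortest-path length and
    disconnected pairs contribute 0."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for src in nodes:
        # breadth-first search from src to get shortest-path lengths
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for d in dist.values() if d > 0)
    return total / (n * (n - 1))
```

A fully connected network has efficiency 1; sparser or more path-like networks score lower, which is why the measure is used as a proxy for integrated information transfer across the whole brain network.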
Extended inflation from higher dimensional theories
NASA Technical Reports Server (NTRS)
Holman, Richard; Kolb, Edward W.; Vadas, Sharon L.; Wang, Yun
1990-01-01
The possibility is considered that higher dimensional theories may, upon reduction to four dimensions, allow extended inflation to occur. Two separate models are analyzed. One is a very simple toy model consisting of higher dimensional gravity coupled to a scalar field whose potential allows for a first-order phase transition. The other is a more sophisticated model incorporating the effects of non-trivial field configurations (monopole, Casimir, and fermion bilinear condensate effects) that yield a non-trivial potential for the radius of the internal space. It was found that extended inflation does not occur in these models. It was also found that the bubble nucleation rate in these theories is time dependent, unlike the case in the original version of extended inflation.
Zhang, Yao; Huang, Jingfeng; Wang, Fumin; Blackburn, George Alan; Zhang, Hankui K; Wang, Xiuzhen; Wei, Chuanwen; Zhang, Kangyu; Wei, Chen
2017-07-25
The PROSPECT leaf optical model has, to date, successfully separated the effects of total chlorophyll and carotenoids on leaf reflectance and transmittance in the 400-800 nm region. Considering variations in the chlorophyll a:b ratio with leaf age and physiological stress, a further separation of total chlorophyll into chlorophyll a and chlorophyll b is necessary for advanced monitoring of plant growth. In this study, we present an extended version of the PROSPECT model (hereafter referred to as PROSPECT-MP) that combines the effects of chlorophyll a, chlorophyll b and carotenoids on leaf directional hemispherical reflectance and transmittance (DHR and DHT) in the 400-800 nm region. The LOPEX93 dataset was used to evaluate the capabilities of PROSPECT-MP for spectral modelling and pigment retrieval. The results show that PROSPECT-MP can simultaneously retrieve leaf chlorophyll a and b, and performs better than PROSPECT-5 in retrieving carotenoid concentrations. As for the simulation of DHR and DHT, the performance of PROSPECT-MP is similar to that of PROSPECT-5. This study demonstrates the potential of PROSPECT-MP for improving the capabilities of remote sensing of leaf photosynthetic pigments (chlorophyll a, chlorophyll b and carotenoids) and for providing a framework for future refinements in the modelling of leaf optical properties.
Jiang, Xuejun; Guo, Xu; Zhang, Ning; Wang, Bo
2018-01-01
This article presents and investigates the performance of a series of robust multivariate nonparametric tests for detection of location shift between two multivariate samples in randomized controlled trials. The tests are built upon robust estimators of distribution locations (medians, Hodges-Lehmann estimators, and an extended U statistic) with both unscaled and scaled versions. The nonparametric tests are robust to outliers and do not assume that the two samples are drawn from multivariate normal distributions. Bootstrap and permutation approaches are introduced for determining the p-values of the proposed test statistics. Simulation studies are conducted and numerical results are reported to examine the performance of the proposed statistical tests. The numerical results demonstrate that the robust multivariate nonparametric tests constructed from the Hodges-Lehmann estimators are more efficient than those based on medians and the extended U statistic. The permutation approach can provide a more stringent control of Type I error and is generally more powerful than the bootstrap procedure. The proposed robust nonparametric tests are applied to detect multivariate distributional difference between the intervention and control groups in the Thai Healthy Choices study and examine the intervention effect of a four-session motivational interviewing-based intervention developed in the study to reduce risk behaviors among youth living with HIV. PMID:29672555
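The univariate building blocks described, a two-sample Hodges-Lehmann shift estimator and a permutation p-value, can be sketched as follows. This is a generic illustration, not the authors' multivariate test (which extends these ideas to vector-valued observations):

```python
import itertools, random, statistics

def hodges_lehmann_shift(x, y):
    """Two-sample Hodges-Lehmann estimator of location shift:
    the median of all pairwise differences y_j - x_i."""
    return statistics.median(b - a for a, b in itertools.product(x, y))

def permutation_pvalue(x, y, n_perm=2000, seed=3):
    """Permutation p-value for a location shift between two samples,
    using |HL shift| as the test statistic."""
    rng = random.Random(seed)
    observed = abs(hodges_lehmann_shift(x, y))
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                 # relabel under the null
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(hodges_lehmann_shift(px, py)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)       # add-one correction
```

Because the statistic is a median of pairwise differences rather than a difference of means, a few outliers move it very little, which is the robustness property the paper exploits.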
Engineering Risk Assessment of Space Thruster Challenge Problem
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie
2014-01-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
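Monte Carlo estimation of a loss-of-mission probability can be illustrated with a toy model. The phase durations, failure rate, and thruster-redundancy requirements below are invented for illustration and are not the challenge problem's actual parameters:

```python
import random

def estimate_lom(n_trials=20000, n_thrusters=4, fail_rate=2e-5, seed=11):
    """Monte Carlo sketch of a loss-of-mission (LOM) estimate for a
    notional ion-propulsion system: the mission is lost if fewer than
    the required number of thrusters survive to the end of any phase.
    All numbers here are illustrative assumptions."""
    rng = random.Random(seed)
    # (phase duration in hours, thrusters required at end of phase)
    phases = [(1000.0, 3), (5000.0, 2), (2000.0, 3)]
    failures = 0
    for _ in range(n_trials):
        # sample an exponential failure time (hours) for each thruster
        fail_times = [rng.expovariate(fail_rate) for _ in range(n_thrusters)]
        elapsed = 0.0
        for duration, needed in phases:
            elapsed += duration
            alive = sum(t > elapsed for t in fail_times)
            if alive < needed:
                failures += 1
                break
    return failures / n_trials
```

Convergence can be checked exactly as the paper describes: rerun with increasing `n_trials` and watch the spread of the LOM estimate shrink roughly as one over the square root of the trial count.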
Time-Extended Payoffs for Collectives of Autonomous Agents
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian K.
2002-01-01
A collective is a set of self-interested agents which try to maximize their own utilities, along with a well-defined, time-extended world utility function which rates the performance of the entire system. In this paper, we use the theory of collectives to design time-extended payoff utilities for agents that are both aligned with the world utility and "learnable", i.e., the agents can readily see how their behavior affects their utility. We show that in systems where each agent aims to optimize such payoff functions, coordination arises as a byproduct of the agents selfishly pursuing their own goals. A game theoretic analysis shows that such payoff functions have the net effect of aligning the Nash equilibrium, Pareto optimal solution and world utility optimum, thus eliminating undesirable behavior such as agents working at cross-purposes. We then apply collective-based payoff functions to a token collection gridworld problem where agents need to optimize the aggregate value of tokens collected across an episode of finite duration (i.e., an abstracted version of rovers on Mars collecting scientifically interesting rock samples, subject to power limitations). We show that, regardless of the initial token distribution, reinforcement learning agents using collective-based payoff functions significantly outperform both natural extensions of single agent algorithms and global reinforcement learning solutions based on "team games".
Scalable smoothing strategies for a geometric multigrid method for the immersed boundary equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhalla, Amneet Pal Singh; Knepley, Matthew G.; Adams, Mark F.
2016-12-20
The immersed boundary (IB) method is a widely used approach to simulating fluid-structure interaction (FSI). Although explicit versions of the IB method can suffer from severe time step size restrictions, these methods remain popular because of their simplicity and generality. In prior work (Guy et al., Adv Comput Math, 2015), some of us developed a geometric multigrid preconditioner for a stable semi-implicit IB method under Stokes flow conditions; however, this solver methodology used a Vanka-type smoother that presented limited opportunities for parallelization. This work extends this Stokes-IB solver methodology by developing smoothing techniques that are suitable for parallel implementation. Specifically, we demonstrate that an additive version of the Vanka smoother can yield an effective multigrid preconditioner for the Stokes-IB equations, and we introduce an efficient Schur complement-based smoother that is also shown to be effective for the Stokes-IB equations. We investigate the performance of these solvers for a broad range of material stiffnesses, both for Stokes flows and flows at nonzero Reynolds numbers, and for thick and thin structural models. We show here that linear solver performance degrades with increasing Reynolds number and material stiffness, especially for thin interface cases. Nonetheless, the proposed approaches promise to yield effective solution algorithms, especially at lower Reynolds numbers and at modest-to-high elastic stiffnesses.
Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.
Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie
2011-12-01
A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.
Aust, Ulrike; Braunöder, Elisabeth
2015-02-01
The present experiment investigated pigeons' and humans' processing styles (local or global) in an exemplar-based visual categorization task, in which category membership of every stimulus had to be learned individually, and in a rule-based task, in which category membership was defined by a perceptual rule. Group Intact was trained with the original pictures (providing both intact local and global information), Group Scrambled was trained with scrambled versions of the same pictures (impairing global information), and Group Blurred was trained with blurred versions (impairing local information). Subsequently, all subjects were tested for transfer to the 2 untrained presentation modes. Humans outperformed pigeons regarding learning speed and accuracy as well as transfer performance and showed good learning irrespective of group assignment, whereas the pigeons of Group Blurred needed longer to learn the training tasks than the pigeons of Groups Intact and Scrambled. Also, whereas humans generalized equally well to any novel presentation mode, pigeons' transfer from and to blurred stimuli was impaired. Both species showed faster learning and, for the most part, better transfer in the rule-based than in the exemplar-based task, but there was no evidence that the processing mode used depended on the type of task (exemplar- or rule-based). Whereas pigeons relied on local information throughout, humans did not show a preference for either processing level. Additional tests with grayscale versions of the training stimuli, with versions that were both blurred and scrambled, and with novel instances of the rule-based task confirmed and further extended these findings. PsycINFO Database Record (c) 2015 APA, all rights reserved.
Parks, Donovan H.; Mankowski, Timothy; Zangooei, Somayyeh; Porter, Michael S.; Armanini, David G.; Baird, Donald J.; Langille, Morgan G. I.; Beiko, Robert G.
2013-01-01
GenGIS is free and open source software designed to integrate biodiversity data with a digital map and information about geography and habitat. While originally developed with microbial community analyses and phylogeography in mind, GenGIS has been applied to a wide range of datasets. A key feature of GenGIS is the ability to test geographic axes that can correspond to routes of migration or gradients that influence community similarity. Here we introduce GenGIS version 2, which extends the linear gradient tests introduced in the first version to allow comprehensive testing of all possible linear geographic axes. GenGIS v2 also includes a new plugin framework that supports the development and use of graphically driven analysis packages: initial plugins include implementations of linear regression and the Mantel test, calculations of alpha-diversity (e.g., Shannon Index) for all samples, and geographic visualizations of dissimilarity matrices. We have also implemented a recently published method for biomonitoring reference condition analysis (RCA), which compares observed species richness and diversity to predicted values to determine whether a given site has been impacted. The newest version of GenGIS supports vector data in addition to raster files. We demonstrate the new features of GenGIS by performing a full gradient analysis of an Australian kangaroo apple data set, by using plugins and embedded statistical commands to analyze human microbiome sample data, and by applying RCA to a set of samples from Atlantic Canada. GenGIS release versions, tutorials and documentation are freely available at http://kiwi.cs.dal.ca/GenGIS, and source code is available at https://github.com/beiko-lab/gengis. PMID:23922841
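The alpha-diversity calculation mentioned (Shannon index) reduces to a few lines. This is the standard natural-log form, shown for illustration rather than as GenGIS's exact code:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over the proportions
    p_i of each species in a sample; zero counts are skipped."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log(p)
    return h
```

For a sample with four equally abundant species the index is ln 4 (about 1.386); a sample dominated by a single species scores 0, so higher values indicate both richer and more even communities.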
The Herschel-SPIRE Point Source Catalog Version 2
NASA Astrophysics Data System (ADS)
Schulz, Bernhard; Marton, Gábor; Valtchanov, Ivan; María Pérez García, Ana; Pintér, Sándor; Appleton, Phil; Kiss, Csaba; Lim, Tanya; Lu, Nanyao; Papageorgiou, Andreas; Pearson, Chris; Rector, John; Sánchez Portal, Miguel; Shupe, David; Tóth, Viktor L.; Van Dyk, Schuyler; Varga-Verebélyi, Erika; Xu, Kevin
2018-01-01
The Herschel-SPIRE instrument mapped about 8% of the sky in submillimeter broad-band filters centered at 250, 350, and 500 microns (1199, 857, and 600 GHz) with spatial resolutions of 17.9", 24.2", and 35.4", respectively. We present here the second version of the SPIRE Point Source Catalog (SPSC). Stacking on WISE 22-micron catalog sources led to the identification of 108 maps, out of 6878, with astrometry offsets greater than 5". After fixing these deviations and re-deriving all affected map mosaics, we repeated the systematic and homogeneous source extraction performed on all maps, using an improved version of the 4 different photometry extraction methods already employed in the generation of the first version of the catalog. Only regions affected by strong Galactic emission, mostly in the Galactic Plane, were excluded, as they exceeded the limits of the available source extraction methods. Although aimed primarily at point sources, which allow for the best photometric accuracy, the catalog also contains a significant fraction of slightly extended sources. With most SPIRE maps being confusion limited, uncertainties in flux densities were established as a function of structure noise and flux density, based on the results of artificial-source insertion experiments into real data over a range of celestial backgrounds. Sources that do not pass the imposed SNR threshold have been rejected, especially at flux densities approaching the extragalactic confusion limit. A range of additional flags provides information on the reliability of the flux information, as well as on the spatial extent and orientation of a source. The catalog should be particularly helpful for determining cold dust content in extragalactic and Galactic sources with low to moderate background confusion. We present an overview of catalog construction, detailed content, and validation results, with a focus on the improvements achieved in the second version, which is soon to be released.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
KIVA-hpFE is high-performance computer software for solving the physics of multi-species and multiphase turbulent reactive flow in complex geometries having immersed moving parts. The code is written in Fortran 90/95 and can be used on any computer platform with any popular compiler. The code comes in two versions: a serial version and a parallel version utilizing an MPICH2-type Message Passing Interface (MPI or Intel MPI) for solving distributed domains. The parallel version is at least 30x faster than the serial version and faster, by many factors, than our previous generation of parallel engine modeling software. The 5th-generation algorithm construction is a Galerkin-type Finite Element Method (FEM) solving conservative momentum, species, and energy transport equations, along with a two-equation k-ω Reynolds-Averaged Navier-Stokes (RANS) turbulence model and a Vreman-type dynamic Large Eddy Simulation (LES) method. The LES method is capable of modeling transitional flow from laminar to fully turbulent; therefore, this LES method does not require special hybrid treatments or blending to walls. The FEM projection method also uses a Petrov-Galerkin (P-G) stabilization along with pressure stabilization. We employ hierarchical basis sets, constructed on the fly, with enrichment in areas associated with relatively larger error as determined by error estimation methods. In addition, when not using the hp-adaptive module, the code employs Lagrangian basis or shape functions. The shape functions are constructed for hexahedral, prismatic, and tetrahedral elements. The software is designed to solve many types of reactive flow problems, from burners to internal combustion engines and turbines. In addition, the formulation allows for direct integration of solid bodies (conjugate heat transfer), as in heat transfer through housings, parts, and cylinders. It can also easily be extended to stress modeling of solids, as used in fluid-structure interaction problems, solidification, porous media modeling, and magnetohydrodynamics.
NASA Astrophysics Data System (ADS)
Sepehri, Alireza; Ghaffary, Tooraj; Naimi, Yaghoob
2018-03-01
We obtain the action of Moffat's Modified Gravity (MOG), a scalar-tensor-vector theory of gravitation, by generalizing the Horava-Witten mechanism to fourteen dimensions. We show that the resulting theory is anomaly-free. We propose an extended version of MOG that includes fermionic fields.
Physical Projections in BRST Treatments of Reparametrization Invariant Theories
NASA Astrophysics Data System (ADS)
Marnelius, Robert; Sandström, Niclas
Any regular quantum mechanical system may be cast into an Abelian gauge theory by simply reformulating it as a reparametrization invariant theory. We present a detailed study of the BRST quantization of such reparametrization invariant theories within a precise operator version of BRST which is related to the conventional BFV path integral formulation. Our treatments lead us to propose general rules for how physical wave functions and physical propagators are to be projected from the BRST singlets and propagators in the ghost extended BRST theory. These projections are performed by boundary conditions which are specified by the ingredients of the BRST charge and precisely determined by the operator BRST. We demonstrate explicitly the validity of these rules for the considered class of models.
The Satellite Test of the Equivalence Principle (STEP)
NASA Technical Reports Server (NTRS)
2004-01-01
STEP will carry concentric test masses to Earth orbit to test a fundamental assumption underlying Einstein's theory of general relativity: that gravitational mass is equivalent to inertial mass. STEP is a 21st-century version of the test that Galileo is said to have performed by dropping a cannon ball and a musket ball simultaneously from the top of the Leaning Tower of Pisa to compare their accelerations. During the STEP experiment, four pairs of test masses will be falling around the Earth, and their accelerations will be measured by superconducting quantum interference devices (SQUIDs). The extended time sensitivity of the instruments will allow the measurements to be a million times more accurate than those made in modern ground-based tests.
NASA Technical Reports Server (NTRS)
Voigt, Kerstin
1992-01-01
We present MENDER, a knowledge-based system that implements software design techniques specialized to automatically compile generate-and-patch problem solvers for global resource assignment problems. We provide empirical evidence of the superior performance of generate-and-patch over generate-and-test, even with constrained generation, for a global constraint in the domain of 2D floorplanning. For a second constraint in 2D floorplanning we show that even when it is possible to incorporate the constraint into a constrained generator, a generate-and-patch problem solver may satisfy the constraint more rapidly. We also briefly summarize how an extended version of our system applies to a constraint in the domain of multiprocessor scheduling.
A review of Reynolds stress models for turbulent shear flows
NASA Technical Reports Server (NTRS)
Speziale, Charles G.
1995-01-01
A detailed review of recent developments in Reynolds stress modeling for incompressible turbulent shear flows is provided. The mathematical foundations of both two-equation models and full second-order closures are explored in depth. It is shown how these models can be systematically derived for two-dimensional mean turbulent flows that are close to equilibrium. A variety of examples are provided to demonstrate how well properly calibrated versions of these models perform for such flows. However, substantial problems remain for the description of more complex turbulent flows where there are large departures from equilibrium. Recent efforts to extend Reynolds stress models to nonequilibrium turbulent flows are discussed briefly along with the major modeling issues relevant to practical naval hydrodynamics applications.
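As a concrete instance of the two-equation closures reviewed, the standard k-ε model (one well-known member of this class; the review covers many variants) expresses the Reynolds stress through an eddy viscosity:

```latex
% Standard k-epsilon closure: an illustrative member of the
% two-equation class, not the review's only model
-\overline{u'_i u'_j} \;=\; 2\,\nu_t\, S_{ij} \;-\; \tfrac{2}{3}\,k\,\delta_{ij},
\qquad
\nu_t \;=\; C_\mu \,\frac{k^2}{\varepsilon},
\qquad
S_{ij} \;=\; \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right),
```

where k is the turbulent kinetic energy, ε its dissipation rate, and C_μ ≈ 0.09 the usual calibration constant. Full second-order closures instead solve transport equations for each Reynolds stress component, which is why they can capture departures from equilibrium that eddy-viscosity models miss.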
Diagonal couplings of quantum Markov chains
NASA Astrophysics Data System (ADS)
Kümmerer, Burkhard; Schwieger, Kay
2016-05-01
In this paper we extend the coupling method from classical probability theory to quantum Markov chains on atomic von Neumann algebras. In particular, we establish a coupling inequality, which allows us to estimate convergence rates by analyzing couplings. For a given tensor dilation we construct a self-coupling of a Markov operator. It turns out that the coupling is a dual version of the extended dual transition operator studied by Gohm et al. We deduce that this coupling is successful if and only if the dilation is asymptotically complete.
Responsive Image Inline Filter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, Ian
2016-10-20
RIIF is a contributed module for the Drupal php web application framework (drupal.org). It is written as a helper or sub-module of code that is part of version 8 "core Drupal" and is intended to extend its functionality. It allows Drupal to resize images uploaded through the user-facing text editor within the Drupal GUI (a.k.a. "inline images") for various browser widths. This resizing is already done for other images through the parent "Responsive Image" core module. This code extends that functionality to inline images.
Episodic foresight deficits in regular, but not recreational, cannabis users.
Mercuri, Kimberly; Terrett, Gill; Henry, Julie D; Curran, H Valerie; Elliott, Morgan; Rendell, Peter G
2018-06-01
Cannabis use is associated with a range of neurocognitive deficits, including impaired episodic memory. However, no study to date has assessed whether these difficulties extend to episodic foresight, a core component of which is the ability to mentally travel into one's personal future. This is a particularly surprising omission given that episodic memory is considered to be critical to engage episodic foresight. In the present study, we provide the first test of how episodic foresight is affected in the context of differing levels of cannabis use, and the degree to which performance on a measure of this construct is related to episodic memory. Fifty-seven cannabis users (23 recreational, 34 regular) and 57 controls were assessed using an adapted version of the Autobiographical Interview. The results showed that regular users exhibited greater impairment of episodic foresight and episodic memory than both recreational users and cannabis-naïve controls. These data therefore show for the first time that cannabis-related disruption of cognitive functioning extends to the capacity for episodic foresight, and they are discussed in relation to their potential implications for functional outcomes in this group.
Fast interrupt platform for extended DOS
NASA Technical Reports Server (NTRS)
Duryea, T. W.
1995-01-01
Extended DOS offers the unique combination of a simple operating system which allows direct access to the interrupt tables, 32 bit protected mode access to 4096 MByte address space, and the use of industry standard C compilers. The drawback is that fast interrupt handling requires both 32 bit and 16 bit versions of each real-time process interrupt handler to avoid mode switches on the interrupts. A set of tools has been developed which automates the process of transforming the output of a standard 32 bit C compiler to 16 bit interrupt code which directly handles the real mode interrupts. The entire process compiles one set of source code via a make file, which boosts productivity by making the management of the compile-link cycle very simple. The software components are in the form of classes written mostly in C. A foreground process written as a conventional application which can use the standard C libraries can communicate with the background real-time classes via a message passing mechanism. The platform thus enables the integration of high performance real-time processing into a conventional application framework.
Fraenkel, Liana; Stolar, Marilyn; Swift, Sarah; Street, Richard L.; Chowdhary, Harjinder; Peters, Ellen
2016-01-01
Background: Order and amount of information influence patients' risk perceptions, but most studies have evaluated patients' reactions to written materials. The objective of this study was to examine the effect of four communication strategies, varying in their order and/or amount of information, on judgments related to an audible description of a new medication among patients who varied in subjective numeracy. Methods: We created five versions of a hypothetical scenario describing a new medication. The versions were composed to elucidate whether the order and/or amount of the information describing benefits and adverse events influenced how subjects valued a new medication. After listening to a randomly assigned version, perceived medication value was measured by asking subjects to choose one of the following statements: the risks outweigh the benefits, the risks and benefits are equally balanced, or the benefits outweigh the risks. Results: Of the 432 patients contacted, 389 participated in the study. Listening to a brief description of benefits followed by an extended description of adverse events resulted in a greater likelihood of perceiving that the medication's benefits outweighed the risks compared to: 1) presenting the extended adverse events description before the benefits, 2) giving a greater amount of information related to benefits, and 3) sandwiching the adverse events between benefits. These associations were only observed among subjects with average or higher subjective numeracy. Conclusion: If confirmed in future studies, our results suggest that, for patients with average or better subjective numeracy, perceived medication value is highest when a brief presentation of benefits is followed by an extended description of adverse events. PMID:27216580
Interactive Supercomputing’s Star-P Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelman, Alan; Husbands, Parry; Leibman, Steve
2006-09-19
The thesis of this extended abstract is simple: high productivity comes from high-level infrastructures. To measure this, we introduce a methodology that goes beyond the tradition of timing software in serial and tuned parallel modes. We perform a classroom productivity study involving 29 students who have written a homework exercise in a low-level language (MPI message passing) and a high-level language (Star-P with MATLAB client). Our conclusions indicate what perhaps should be of little surprise: (1) the high-level language is always far easier on the students than the low-level language; (2) the early versions of the high-level language perform inadequately compared to the tuned low-level language, but later versions substantially catch up. Asymptotically, the analogy must hold that message passing is to high-level parallel programming as assembler is to high-level environments such as MATLAB, Mathematica, Maple, or even Python. We follow the Kepner method, which correctly recognizes that traditional speedup numbers without some discussion of the human cost of reaching them can fail to reflect the true human productivity cost of high-performance computing. Traditional data compares low-level message passing with serial computation. With the benefit of a high-level language system in place, in our case Star-P running with MATLAB client, and with the benefit of a large data pool (29 students, each running the same code ten times on three evolutions of the same platform), we can methodically demonstrate the productivity gains. To date we are not aware of any high-level system as extensive and interoperable as Star-P, nor are we aware of an experiment of this kind performed with this volume of data.
Callahan, Clara A; Hojat, Mohammadreza; Veloski, Jon; Erdmann, James B; Gonnella, Joseph S
2010-06-01
The Medical College Admission Test (MCAT) has undergone several revisions for content and validity since its inception. With another comprehensive review pending, this study examines changes in the predictive validity of the MCAT's three recent versions. Study participants were 7,859 matriculants in 36 classes entering Jefferson Medical College between 1970 and 2005; 1,728 took the pre-1978 version of the MCAT; 3,032 took the 1978-1991 version, and 3,099 took the post-1991 version. MCAT subtest scores were the predictors, and performance in medical school, attrition, scores on the medical licensing examinations, and ratings of clinical competence in the first year of residency were the criterion measures. No significant improvement in validity coefficients was observed for performance in medical school or residency. Validity coefficients for all three versions of the MCAT in predicting Part I/Step 1 remained stable (in the mid-0.40s, P < .01). A systematic decline was observed in the validity coefficients of the MCAT versions in predicting Part II/Step 2. It started at 0.47 for the pre-1978 version, decreased to between 0.42 and 0.40 for the 1978-1991 versions, and to 0.37 for the post-1991 version. Validity coefficients for the MCAT versions in predicting Part III/Step 3 remained near 0.30. These were generally larger for women than men. Although the findings support the short- and long-term predictive validity of the MCAT, opportunities to strengthen it remain. Subsequent revisions should increase the test's ability to predict performance on United States Medical Licensing Examination Step 2 and must minimize the differential validity for gender.
Distributed MRI reconstruction using Gadgetron-based cloud computing.
Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S
2015-03-01
To expand the open-source Gadgetron reconstruction framework to support distributed computing, and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms, ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed-sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high-temporal-resolution real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed-computing-enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed-sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
Comparison of LEWICE and GlennICE in the SLD Regime
NASA Technical Reports Server (NTRS)
Wright, William B.; Potapczuk, Mark G.; Levinson, Laurie H.
2008-01-01
A research project is underway at the NASA Glenn Research Center (GRC) to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from two different computer programs. The first program, LEWICE version 3.2.2, has been reported on previously. The second program is GlennICE version 0.1. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the GRC Icing Research Tunnel (IRT) has also been performed, including additional data taken to extend the database in the Super-cooled Large Drop (SLD) regime. This paper will show the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data. This report will also provide a description of both programs. Comparisons are then made to recent additions to the SLD database and selected previous cases. Quantitative comparisons are shown for horn height, horn angle, icing limit, area, and leading edge thickness. The results show that the predicted results for both programs are within the accuracy limits of the experimental data for the majority of cases.
Cultural Connectedness and Its Relation to Mental Wellness for First Nations Youth.
Snowshoe, Angela; Crooks, Claire V; Tremblay, Paul F; Hinson, Riley E
2017-04-01
We explored the interrelationships among components of cultural connectedness (i.e., identity, traditions, and spirituality) and First Nations youth mental health using a brief version of the original Cultural Connectedness Scale. Participants included 290 First Nations youth (M age = 14.4) who were recruited from both urban and rural school settings in Saskatchewan and Southwestern Ontario. We performed a confirmatory factor analysis of the Cultural Connectedness Scale-Short Version (CCS-S) items to investigate the factor stability of the construct in our sample. We examined the relationships between the CCS-S subscales and self-efficacy, sense of self (present and future), school connectedness, and life satisfaction using hierarchical multiple linear regression analyses to establish the validity of the abbreviated measure. The results revealed that cultural connectedness, as measured by the 10-item CCS-S, had strong associations with the mental health indicators assessed and, in some cases, was associated with First Nations youth mental health above and beyond other social determinants of health. Our results extend findings from previous research on cultural connectedness by elucidating the meaning of its components and demonstrate the importance of culture for positive youth development.
Indexing molecules with chemical graph identifiers.
Gregori-Puigjané, Elisabet; Garriga-Sust, Rut; Mestres, Jordi
2011-09-01
Fast and robust algorithms for indexing molecules have been historically considered strategic tools for the management and storage of large chemical libraries. This work introduces a modified and further extended version of the molecular equivalence number naming adaptation of the Morgan algorithm (J Chem Inf Comput Sci 2001, 41, 181-185) for the generation of a chemical graph identifier (CGI). This new version corrects for the collisions recognized in the original adaptation and includes the ability to deal with graph canonicalization, ensembles (salts), and isomerism (tautomerism, regioisomerism, optical isomerism, and geometrical isomerism) in a flexible manner. Validation of the current CGI implementation was performed on the open NCI database and the drug-like subset of the ZINC database containing 260,071 and 5,348,089 structures, respectively. The results were compared with those obtained with some of the most widely used indexing codes, such as the CACTVS hash code and the new InChIKey. The analyses emphasize the fact that compound management activities, like duplicate analysis of chemical libraries, are sensitive to the exact definition of compound uniqueness and thus still depend, to a minor extent, on the type and flexibility of the molecular index being used. Copyright © 2011 Wiley Periodicals, Inc.
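The core of a Morgan-style identifier is iterative refinement of node invariants from neighbor invariants. The sketch below is a toy illustration of that idea, not the paper's CGI: it seeds invariants with vertex degrees, refines them until the partition stabilizes, and hashes the sorted multiset of final invariants. Like the original algorithm, this simplification can collide for some non-isomorphic graphs (e.g. regular graphs of equal degree), which is exactly the weakness the paper's extended version corrects.

```python
import hashlib

def morgan_identifier(adjacency):
    """Toy Morgan-style graph identifier: refine node invariants from
    neighbor invariants, then hash the sorted final invariants.
    `adjacency` maps each node to a list of its neighbors."""
    inv = {n: len(nbrs) for n, nbrs in adjacency.items()}  # seed with degrees
    for _ in range(len(adjacency)):  # refinement stabilizes within n rounds
        keys = {n: (inv[n], tuple(sorted(inv[m] for m in adjacency[n])))
                for n in adjacency}
        ranks = {k: i for i, k in enumerate(sorted(set(keys.values())))}
        inv = {n: ranks[keys[n]] for n in adjacency}
    # the identifier depends only on the invariant multiset, not on labels
    return hashlib.sha1(repr(sorted(inv.values())).encode()).hexdigest()[:12]
```

Relabeling a graph leaves the identifier unchanged, which is what makes such codes usable for duplicate analysis of chemical libraries.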
The bright-star masks for the HSC-SSP survey
NASA Astrophysics Data System (ADS)
Coupon, Jean; Czakon, Nicole; Bosch, James; Komiyama, Yutaka; Medezinski, Elinor; Miyazaki, Satoshi; Oguri, Masamune
2018-01-01
We present the procedure to build and validate the bright-star masks for the Hyper-Suprime-Cam Strategic Subaru Proposal (HSC-SSP) survey. To identify and mask the saturated stars in the full HSC-SSP footprint, we rely on the Gaia and Tycho-2 star catalogues. We first assemble a pure star catalogue down to G_Gaia < 18 after removing ~1.5% of sources that appear extended in the Sloan Digital Sky Survey (SDSS). We perform visual inspection on the early data from the S16A internal release of HSC-SSP, finding that our star catalogue is 99.2% pure down to G_Gaia < 18. Second, we build the mask regions in an automated way using stacked detected source measurements around bright stars binned per G_Gaia magnitude. Finally, we validate those masks by visual inspection and comparison with the literature on galaxy number counts and angular two-point correlation functions. This version (Arcturus) supersedes the previous version (Sirius) used in the S16A internal and DR1 public releases. We publicly release the full masks and tools to flag objects in the entire footprint of the planned HSC-SSP observations at "ftp://obsftp.unige.ch/pub/coupon/brightStarMasks/HSC-SSP/".
Lin, Na; Chen, Hanning; Jing, Shikai; Liu, Fang; Liang, Xiaodan
2017-03-01
In recent years, symbiosis, as a rich source of potential engineering applications and computational models, has attracted more and more attention in the adaptive complex systems and evolutionary computing domains. Inspired by different symbiotic coevolution forms in nature, this paper proposes a series of multi-swarm particle swarm optimizers called PS2Os, which extend the single-population particle swarm optimization (PSO) algorithm to an interacting multi-swarm model by constructing hierarchical interaction topologies and enhanced dynamical update equations. According to different symbiotic interrelationships, four versions of PS2O are initiated to mimic the mutualism, commensalism, predation, and competition mechanisms, respectively. In experiments with five benchmark problems, the proposed algorithms are shown to have considerable potential for solving complex optimization problems. The coevolutionary dynamics of symbiotic species in each PS2O version are also studied to demonstrate how the heterogeneity of the different symbiotic interrelationships affects the algorithm's performance. Then PS2O is used to solve the radio frequency identification (RFID) network planning (RNP) problem with a mixture of discrete and continuous variables. Simulation results show that the proposed algorithm outperforms the reference algorithms for planning RFID networks, in terms of optimization accuracy and computational robustness.
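The multi-swarm idea can be sketched in a few lines. The toy below runs two cooperating PSO swarms on a test function; besides the usual personal-best and swarm-best attraction terms, each particle also feels a pull toward the partner swarm's best, a crude stand-in for the paper's mutualism coupling. All coefficients and the test function are illustrative assumptions, not the paper's settings.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def two_swarm_pso(f=sphere, dim=5, size=10, iters=300, seed=1):
    """Minimal two-swarm PSO with a mutualism-style coupling term:
    velocity update adds attraction to the partner swarm's best."""
    rng = random.Random(seed)
    w, c1, c2, c3 = 0.72, 1.2, 1.2, 0.6  # inertia, cognitive, social, symbiotic
    swarms = []
    for _ in range(2):
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(size)]
        s = {"pos": pos, "vel": [[0.0] * dim for _ in range(size)],
             "pbest": [p[:] for p in pos], "pval": [f(p) for p in pos]}
        i = min(range(size), key=lambda k: s["pval"][k])
        s["gbest"], s["gval"] = s["pbest"][i][:], s["pval"][i]
        swarms.append(s)
    for _ in range(iters):
        for s, partner in ((swarms[0], swarms[1]), (swarms[1], swarms[0])):
            for i in range(size):
                for d in range(dim):
                    s["vel"][i][d] = (w * s["vel"][i][d]
                        + c1 * rng.random() * (s["pbest"][i][d] - s["pos"][i][d])
                        + c2 * rng.random() * (s["gbest"][d] - s["pos"][i][d])
                        + c3 * rng.random() * (partner["gbest"][d] - s["pos"][i][d]))
                    s["pos"][i][d] += s["vel"][i][d]
                val = f(s["pos"][i])
                if val < s["pval"][i]:
                    s["pbest"][i], s["pval"][i] = s["pos"][i][:], val
                    if val < s["gval"]:
                        s["gbest"], s["gval"] = s["pos"][i][:], val
    return min(s["gval"] for s in swarms)
```

Swapping the sign or target of the coupling term gives the flavor of the other interrelationships (commensalism, predation, competition).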
A Flight Training Simulator for Instructing the Helicopter Autorotation Maneuver (Enhanced Version)
NASA Technical Reports Server (NTRS)
Rogers, Steven P.; Asbury, Charles N.
2000-01-01
Autorotation is a maneuver that permits a safe helicopter landing when the engine loses power. A catastrophe may occur if the pilot's control inputs are incorrect, insufficient, excessive, or poorly timed. Due to the danger involved, full-touchdown autorotations are very rarely practiced. Because in-flight autorotation training is risky, time-consuming, and expensive, the objective of the project was to develop the first helicopter flight simulator expressly designed to train students in this critical maneuver. A central feature of the project was the inclusion of an enhanced version of the Pilot-Rotorcraft Intelligent Symbology Management Simulator (PRISMS), a virtual-reality system developed by Anacapa Sciences and Thought Wave. A task analysis was performed to identify the procedural steps in the autorotation, to inventory the information needed to support student task performance, to identify typical errors, and to structure the simulator's practice environment. The system provides immediate knowledge of results, extensive practice of perceptual-motor skills, part-task training, and augmented cueing in a realistic cockpit environment. Additional work, described in this report, extended the capabilities of the simulator in three areas: 1. Incorporation of visual training aids to assist the student in learning the proper appearance of the visual scene when the maneuver is being properly performed; 2. Introduction of the requirement to land at a particular spot, as opposed to the wide, flat open field initially used, and development of appropriate metrics of success; and 3. Inclusion of wind speed and wind direction settings (and random variability settings) to add a more realistic challenge in "hitting the spot."
Real-time text extraction based on the page layout analysis system
NASA Astrophysics Data System (ADS)
Soua, M.; Benchekroun, A.; Kachouri, R.; Akil, M.
2017-05-01
Several approaches have been proposed to extract text from scanned documents. However, text extraction in heterogeneous documents is still a real challenge. Indeed, text extraction in this context is a difficult task because of the variation of the text due to differences in sizes, styles, and orientations, as well as the complexity of the document region background. Recently, we proposed the improved hybrid binarization based on K-means method (I-HBK) to extract text suitably from heterogeneous documents. In this method, the Page Layout Analysis (PLA), part of the Tesseract OCR engine, is used to identify text and image regions. Afterwards, our hybrid binarization is applied separately to each kind of region. On one side, gamma correction is employed before processing image regions. On the other side, binarization is performed directly on text regions. Then, a foreground and background color study is performed to correct inverted region colors. Finally, characters are located in the binarized regions using the PLA algorithm. In this work, we extend the integration of the PLA algorithm within the I-HBK method. In addition, to speed up the text/image separation step, we employ an efficient GPU acceleration. Through the performed experiments, we demonstrate the high F-measure accuracy of the PLA algorithm, reaching 95% on the LRDE dataset. In addition, we compare the sequential and parallel PLA versions. The obtained results give a speedup of 3.7x when comparing the parallel PLA implementation on a GPU GTX 660 to the CPU version.
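The K-means step at the heart of such a binarization can be illustrated with a minimal two-class version on pixel intensities: alternate assigning each pixel to the nearer of two centroids and re-estimating the centroids, then threshold at the centroid midpoint. This is a generic sketch of K-means thresholding, not the I-HBK pipeline itself (which adds gamma correction, color-inversion handling, and PLA region separation).

```python
import numpy as np

def kmeans_binarize(gray, iters=20):
    """Two-class K-means on intensities; returns 1 for bright (background)
    pixels and 0 for dark (text) pixels."""
    pixels = gray.astype(float).ravel()
    c0, c1 = float(pixels.min()), float(pixels.max())  # seed the two centroids
    for _ in range(iters):
        dark = np.abs(pixels - c0) <= np.abs(pixels - c1)  # nearest-centroid step
        if dark.all() or (~dark).all():
            break  # degenerate (uniform) image
        c0, c1 = pixels[dark].mean(), pixels[~dark].mean()  # update step
    threshold = (c0 + c1) / 2.0
    return (gray > threshold).astype(np.uint8)
```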
Extended quantification of the generalized recurrence plot
NASA Astrophysics Data System (ADS)
Riedl, Maik; Marwan, Norbert; Kurths, Jürgen
2016-04-01
The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its applications span the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. But it is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence-plot-based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering additional symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns, reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values consistent with the original recurrence plot approach. Furthermore, the proposed method allows a split of the determinism into parts based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences, especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction between phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights into other systems with turbulent dynamics in climatology, biology, ecology, and the social sciences, for example.
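For readers unfamiliar with the base measure, the classic (translation-symmetry) determinism that the paper extends can be sketched for a 1-D series: build the recurrence matrix, then take the fraction of recurrence points lying on diagonal lines of length at least lmin, excluding the trivial line of identity. Threshold and minimum line length below are illustrative choices.

```python
import numpy as np

def determinism(x, eps, lmin=2):
    """Recurrence-plot determinism (DET) of a 1-D series: fraction of
    recurrence points that belong to diagonal lines of length >= lmin."""
    n = len(x)
    R = np.abs(x[:, None] - x[None, :]) < eps  # recurrence matrix
    np.fill_diagonal(R, False)                 # drop the line of identity
    total = R.sum()
    if total == 0:
        return 0.0
    on_lines = 0
    for k in range(-(n - 1), n):               # every diagonal of R
        run = 0
        for hit in list(np.diagonal(R, k)) + [False]:
            if hit:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run            # count points in long runs
                run = 0
    return on_lines / total
```

A regular signal concentrates its recurrence points on long diagonals (high DET), while noise scatters them into isolated points.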
An Automatic Measure of Cross-Language Text Structures
ERIC Educational Resources Information Center
Kim, Kyung
2018-01-01
In order to further validate and extend the application of "GIKS" (Graphical Interface of Knowledge Structure) beyond English, this investigation applies the "GIKS" to capture, visually represent, and compare text structures inherent in two "contrasting" languages. The English and parallel Korean versions of 50…
Model Identification of Integrated ARMA Processes
ERIC Educational Resources Information Center
Stadnytska, Tetiana; Braun, Simone; Werner, Joachim
2008-01-01
This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
AmeriFlux US-A32 ARM-SGP Medford hay pasture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kueppers, Lara; Torn, Margaret; Biraud, Sebastien
This is the AmeriFlux version of the carbon flux data for the site US-A32 ARM-SGP Medford hay pasture. Site Description - This site is located at the ARM SGP Extended Facility E32, 8 km West of Medford, OK
Readability and Recall of Short Prose Passages: A Theoretical Analysis.
ERIC Educational Resources Information Center
Miller, James R.; Kintsch, Walter
1980-01-01
To support the view of readability as an interaction between a text and the reader's prose-processing capabilities, this article applies an extended and formalized version of the Kintsch and van Dijk prose-processing model to 20 texts of varying readability. (Author/GSK)
Extending the capability of GYRE to calculate tidally forced stellar oscillations
NASA Astrophysics Data System (ADS)
Guo, Zhao; Gies, Douglas R.
2016-01-01
Tidally forced oscillations have been observed in many eccentric binary systems, such as KOI-54 and many other 'heartbeat stars'. The tidal response of the star can be calculated by solving a revised set of stellar oscillation equations. The open-source stellar oscillation code GYRE (Townsend & Teitler 2013) can be used to solve the free stellar oscillation equations in both adiabatic and non-adiabatic cases. It uses a novel matrix exponential method which avoids many difficulties of the classical shooting and relaxation methods. The new version also includes the effect of rotation in the traditional approximation. After showing the code flow of GYRE, we revise its subroutines and extend its capability to calculate tidally forced oscillations in both adiabatic and non-adiabatic cases, following the procedure in the CAFein code (Valsecchi et al. 2013). In the end, we compare the tidal eigenfunctions with those calculated from CAFein. More details of the revision and a simple version of the code in MATLAB can be obtained upon request.
Brunet, Jean-François; Dagenais, Dominique; Therrien, Marc; Gartenberg, Daniel; Forest, Geneviève
2017-08-01
Despite its high sensitivity and validity in the context of sleep loss, the Psychomotor Vigilance Test (PVT) could be improved. The aim of the present study was to validate a new smartphone PVT-type application called sleep-2-Peak (s2P) by determining its ability to assess fatigue-related changes in alertness in a context of extended wakefulness. Short 3-min versions of s2P and of the classic PVT were administered at every even hour during a 35-h total sleep deprivation protocol. In addition, subjective measures of sleepiness were collected. The outcomes on these tests were then compared using Pearson product-moment correlations, t tests, and repeated measures within-groups analyses of variance. The results showed that both tests significantly correlated on all outcome variables, that both significantly distinguished between the alert and sleepy states in the same individual, and that both varied similarly through the sleep deprivation protocol as sleep loss accumulated. All outcome variables on both tests also correlated significantly with the subjective measures of sleepiness. These results suggest that a 3-min version of s2P is a valid tool for differentiating alert from sleepy states and is as sensitive as the PVT for tracking fatigue-related changes during extended wakefulness and sleep loss. Unlike the PVT, s2P does not provide feedback to subjects on each trial. We discuss how this feature of s2P raises the possibility that the performance results measured by s2P could be less impacted by motivational confounds, giving this tool added value in particular clinical and/or research settings.
Söderlind, Erik; Abrahamsson, Bertil; Erlandsson, Fredrik; Wanke, Christoph; Iordanov, Ventzeslav; von Corswant, Christian
2015-11-10
A clinical study was conducted to validate the in vivo drug release performance of IntelliCap® CR capsules. 12 healthy, male volunteers were administered IntelliCap® CR capsules, filled with metoprolol as a BCS 1 model drug, and programmed to release the drug with 3 different release profiles (2 linear profiles extending over 6h and 14h, respectively, and a pulsed profile with two equal pulses separated by 5h) using a cross-over design. An oral metoprolol solution was included as a reference. Standard bioavailability variables were determined. In vivo drug release-time profiles for the IntelliCap® CR capsules were calculated from the plasma drug concentrations by deconvolution, and they were subsequently compared with the in vitro drug release profiles including assessment of level A in vitro/in vivo correlation (IVIVC). The relative bioavailability for the linear, extended release profiles was about 85% which is similar to other extended release administrations of metoprolol. There was an excellent agreement between the predetermined release profiles and the in vivo release for these two administrations. For IntelliCap® CR capsules programmed to deliver 2 distinct and equal drug pulses, the first pulse was delivered as expected whereas only about half of the second dose was released. Thus, it is concluded that the IntelliCap® system is well suited for the fast and reliable generation of in vivo pharmacokinetic data for extended release drug profiles, e.g. in context of regional drug absorption investigations. For immediate release pulses delivered in the distal GI tract this version of the device appears however less suitable. Copyright © 2015 Elsevier B.V. All rights reserved.
Aono, Masashi; Kim, Song-Ju; Hara, Masahiko; Munakata, Toshinori
2014-03-01
The true slime mold Physarum polycephalum, a single-celled amoeboid organism, is capable of efficiently allocating a constant amount of intracellular resource to its pseudopod-like branches that best fit the environment where dynamic light stimuli are applied. Inspired by this resource allocation process, the authors formulated a concurrent search algorithm, called the Tug-of-War (TOW) model, for maximizing the profit in the multi-armed Bandit Problem (BP). A player (gambler) of the BP should decide as quickly and accurately as possible which of the N slot machines to invest in, and faces an "exploration-exploitation dilemma": a trade-off between the speed and the accuracy of the decision making, which are conflicting objectives. The TOW model maintains a constant intracellular resource volume while collecting environmental information by concurrently expanding and shrinking its branches. The conservation law entails a nonlocal correlation among the branches, i.e., a volume increment in one branch is immediately compensated by volume decrement(s) in the other branch(es). Owing to this nonlocal correlation, the TOW model can efficiently manage the dilemma. In this study, we extend the TOW model to apply it to a stretched variant of the BP, the Extended Bandit Problem (EBP), which is the problem of selecting the best M-tuple of the N machines. We demonstrate that the extended TOW model exhibits better performance for 2-tuple-3-machine and 2-tuple-4-machine instances of EBP compared with extended versions of well-known algorithms for the BP, the ϵ-Greedy and SoftMax algorithms, particularly in terms of its short-term decision-making capability, which is essential for the survival of the amoeba in a hostile environment. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
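To make the Extended Bandit Problem concrete, here is a minimal sketch of one of the comparison baselines the abstract names, an ϵ-Greedy player extended to select an M-tuple of distinct arms per round (the TOW model itself is not reproduced here; its update rules are specific to the paper). Arm probabilities, ϵ, and round counts below are illustrative.

```python
import random

def extended_epsilon_greedy(probs, m=2, rounds=5000, eps=0.1, seed=0):
    """ϵ-Greedy for the Extended Bandit Problem: each round play an
    M-tuple of distinct Bernoulli arms (random tuple with prob. eps,
    otherwise the current top-M by estimated mean). Returns the final
    top-M arm indices."""
    rng = random.Random(seed)
    n = len(probs)
    counts, means = [0] * n, [0.0] * n
    for _ in range(rounds):
        if rng.random() < eps:
            chosen = rng.sample(range(n), m)   # explore a random M-tuple
        else:                                  # exploit the current best M-tuple
            chosen = sorted(range(n), key=lambda k: means[k], reverse=True)[:m]
        for k in chosen:
            reward = 1.0 if rng.random() < probs[k] else 0.0
            counts[k] += 1
            means[k] += (reward - means[k]) / counts[k]  # running average
    return sorted(range(n), key=lambda k: means[k], reverse=True)[:m]
```

The paper's point is that such baselines need many plays to resolve the dilemma, whereas the conservation-law coupling of the TOW model improves short-term decisions.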
SARAH 4: A tool for (not only SUSY) model builders
NASA Astrophysics Data System (ADS)
Staub, Florian
2014-06-01
We present the new version of the Mathematica package SARAH, which provides the same features for a non-supersymmetric model as previous versions did for supersymmetric models. This includes an easy and straightforward definition of the model, and the calculation of all vertices, mass matrices, tadpole equations, and self-energies. The two-loop renormalization group equations for a general gauge theory are now also included and have been validated against the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD, and the UFO format can be written, and source code for SPheno can be generated for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version, and by linking Susyno the handling of Lie groups has been improved and extended.
Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.
Kwak, Nojun
2016-05-20
Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of the coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally, because an ever-increasing kernel matrix must be treated as additional training samples are introduced. In this paper, an incremental version of the NPT (INPT) is proposed, based on the observation that the centering step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can be used directly in any incremental method to implement a kernel version of that method. The effectiveness of the INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are applied to problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.
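The batch form of the trick is easy to state: factor the positive-semidefinite kernel matrix as K = YᵀY and use the columns of Y as explicit feature-space coordinates, so any linear algorithm run on Y becomes the kernel version of that algorithm. The sketch below shows this batch step only (the paper's contribution, the incremental update, is not reproduced here).

```python
import numpy as np

def npt_coordinates(K, tol=1e-10):
    """Batch nonlinear projection trick: given a PSD kernel matrix K,
    return Y of shape (rank, n_samples) with Y.T @ Y == K, via an
    eigendecomposition of K."""
    w, U = np.linalg.eigh(K)   # K symmetric PSD; eigenvalues ascending
    keep = w > tol             # drop numerically zero directions
    return (U[:, keep] * np.sqrt(w[keep])).T
```

Running, say, ordinary PCA on these coordinates reproduces kernel PCA without ever invoking the kernel trick inside the algorithm.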
Mapping the Martian Meteorology
NASA Technical Reports Server (NTRS)
Allison, Michael; Ross, J. D.; Soloman, N.
1999-01-01
The Mars-adapted version of the NASA/GISS general circulation model (GCM) has been applied to the hourly/daily simulation of the planet's meteorology over several seasonal orbits. The current running version of the model includes a diurnal solar cycle, CO2 sublimation, and a mature parameterization of upper level wave drag with a vertical domain extending from the surface up to the 6 μbar level. The benchmark simulations provide a four-dimensional archive for the comparative evaluation of various schemes for the retrieval of winds from anticipated polar orbiter measurements of temperatures by the Pressure Modulator Infrared Radiometer.
GAP: yet another image processing system for solar observations.
NASA Astrophysics Data System (ADS)
Keller, C. U.
GAP is a versatile, interactive image processing system for analyzing solar observations, in particular extended time sequences, and for preparing publication-quality figures. It consists of an interpreter based on a language with control flow similar to that of PASCAL and C. The interpreter may be accessed from a command line editor and from user-supplied functions, procedures, and command scripts. GAP is easily expandable via external FORTRAN programs that are linked to the GAP interface routines. The current version of GAP runs on VAX, DECstation, Sun, and Apollo computers. Versions for MS-DOS and OS/2 are in preparation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Jing-Jy; Flood, Paul E.; LePoire, David
In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in a Microsoft Access database, and by re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. First, all nuclide-specific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences can be attributed to differences in numerical precision between Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. Results in SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD-RDD version 2.01 correctly reports calculation results in the unit specified in the GUI.
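A verification of this kind reduces to comparing two result sets within a numerical-precision tolerance. A minimal sketch (the function name, data layout, and tolerance are our assumptions, not part of RESRAD-RDD):

```python
def verify_versions(results_old, results_new, rel_tol=1e-6):
    """Compare per-radionuclide results from two code versions.

    Returns the entries whose relative difference exceeds rel_tol,
    which would indicate a genuine discrepancy rather than
    floating-point noise between the two implementations.
    """
    mismatches = []
    for nuclide, v_old in results_old.items():
        v_new = results_new[nuclide]
        denom = max(abs(v_old), abs(v_new), 1e-30)  # guard against zero
        if abs(v_old - v_new) / denom > rel_tol:
            mismatches.append((nuclide, v_old, v_new))
    return mismatches
```

In this style of check, tiny differences attributable to Excel versus Visual Basic.NET arithmetic pass, while any real divergence is flagged.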
Ng, Raymond; Lee, Chun Fan; Wong, Nan Soon; Luo, Nan; Yap, Yoon Sim; Lo, Soo Kien; Chia, Whay Kuang; Yee, Alethea; Krishna, Lalit; Goh, Cynthia; Cheung, Yin Bun
2012-01-01
The objective of the study was to examine the measurement properties of, and comparability between, the English and Chinese versions of the Functional Assessment of Cancer Therapy-Breast (FACT-B) in breast cancer patients in Singapore. This is an observational study of 271 Singaporean breast cancer patients. The known-group validity of the FACT-B total score and Trial Outcome Index (TOI) was assessed in relation to performance status, evidence of disease, and treatment status cross-sectionally; responsiveness to change was assessed in relation to change in performance status longitudinally. Internal consistency and test-retest reliability were evaluated by Cronbach's alpha and the intraclass correlation coefficient (ICC), respectively. Multiple regression analyses were performed to compare the scores on the two language versions, adjusting for covariates. The FACT-B total score and TOI demonstrated known-group validity in differentiating patients with different clinical status. They showed high internal consistency and test-retest reliability, with Cronbach's alpha ranging from 0.87 to 0.91 and ICC ranging from 0.82 to 0.89. The English version was responsive to change in performance status. The Chinese version was shown to be responsive to decline in performance status, but the sample size of Chinese-speaking patients who improved in performance status was too small (N = 6) for conclusive analysis of responsiveness to improvement. Two items concerning sexuality had a high item non-response rate (50.2% and 14.4%). No practically significant difference was found in the total score and TOI between the two language versions despite minor differences in two of the 37 items. The English and Chinese versions of the FACT-B are valid, responsive, and reliable instruments for assessing health-related quality of life in breast cancer patients in Singapore.
Data collected from the English and Chinese versions can be pooled and either version could be used for bilingual patients.
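Since the reliability analysis rests on Cronbach's alpha, a minimal self-contained sketch of the statistic may be useful (the function and data layout are illustrative, not taken from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns:
    items[k][r] = score of respondent r on item k.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[r] for col in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly correlated items yield alpha = 1; values around 0.87-0.91, as reported above, indicate high internal consistency.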
Mashnik, Stepan Georgievich; Kerby, Leslie Marie; Gudima, Konstantin K.; ...
2017-03-23
We extend the cascade-exciton model (CEM) and the Los Alamos version of the quark-gluon string model (LAQGSM), event generators of the Monte Carlo N-particle transport code version 6 (MCNP6), to describe the production of energetic light fragments (LF) heavier than 4He from various nuclear reactions induced by particles and nuclei at energies up to about 1 TeV/nucleon. In these models, energetic LF can be produced via Fermi breakup, preequilibrium emission, and coalescence of cascade particles. Initially, we study several variations of the Fermi breakup model and choose the best option for these models. Then, we extend the modified exciton model (MEM) used by these codes to account for the possibility of multiple emission of up to 66 types of particles and LF (up to 28Mg) at the preequilibrium stage of reactions. We then expand the coalescence model to allow coalescence of LF from nucleons emitted at the intranuclear cascade stage of reactions and from lighter clusters, up to fragments with mass numbers A ≤ 7 in the case of CEM and A ≤ 12 in the case of LAQGSM. Next, we modify MCNP6 to allow calculating and outputting spectra of LF and heavier products with arbitrary mass and charge numbers. The improved version of CEM is implemented into MCNP6. Lastly, we test the improved versions of CEM, LAQGSM, and MCNP6 on a variety of measured nuclear reactions. The modified codes give an improved description of energetic LF from particle- and nucleus-induced reactions, showing good agreement with a variety of available experimental data. They have improved predictive power compared to the previous versions and can be used as reliable tools in simulating applications involving such types of reactions.
NASA Astrophysics Data System (ADS)
Stelea, Cristian; Dariescu, Marina-Aura; Dariescu, Ciprian
2018-05-01
We extend a known solution-generating technique for isotropic fluids in order to construct more general models of anisotropic stars with poloidal magnetic fields. In particular, we discuss the magnetized versions of some well-known exact solutions describing anisotropic stars and dark energy stars, and we describe some of their properties.
CHANGES TO THE CHEMICAL MECHANISMS FOR HAZARDOUS AIR POLLUTANTS IN CMAQ VERSION 4.6
The extended abstract describes a presentation to the 2006 conference of the Community Modeling and Analysis System. The presentation introduces two new mechanisms for the atmospheric photochemistry of Hazardous Air Pollutants (HAPs) to be used in regional air quality models. It ...
2006-10-01
measuring unit and the control computer have been flight tested using both a small UAV and the PRP-560 “Ranger” patrol and rescue hovercraft, which...version based on the G9-Galaxy ram air parachute. Recently, in an effort to further extend the system’s payload capacity, developmental flight tests
NASA Astrophysics Data System (ADS)
Bytev, Vladimir V.; Kniehl, Bernd A.
2016-09-01
We present a further extension of the HYPERDIRE project, which is devoted to the creation of a set of Mathematica-based program packages for manipulations with Horn-type hypergeometric functions on the basis of differential equations. Specifically, we present the implementation of the differential reduction for the Lauricella function FC of three variables. Catalogue identifier: AEPP_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPP_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 243461 No. of bytes in distributed program, including test data, etc.: 61610782 Distribution format: tar.gz Programming language: Mathematica. Computer: All computers running Mathematica. Operating system: Operating systems running Mathematica. Classification: 4.4. Does the new version supersede the previous version?: No, it significantly extends the previous version. Nature of problem: Reduction of hypergeometric function FC of three variables to a set of basis functions. Solution method: Differential reduction. Reasons for new version: The extension package allows the user to handle the Lauricella function FC of three variables. Summary of revisions: The previous version remains unchanged. Running time: Depends on the complexity of the problem.
RSM 1.0 - A RESUPPLY SCHEDULER USING INTEGER OPTIMIZATION
NASA Technical Reports Server (NTRS)
Viterna, L. A.
1994-01-01
RSM, the Resupply Scheduling Modeler, is a fully menu-driven program that uses integer programming techniques to determine an optimum schedule for replacing components on or before the end of a fixed replacement period. Although written to analyze the electrical power system on the Space Station Freedom, RSM is quite general and can be used to model the resupply of almost any system subject to user-defined resource constraints. RSM is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more computationally intensive, integer programming was required for accuracy when modeling systems with small quantities of components. Input values for component life can be real numbers; RSM converts them to integers by dividing the lifetime by the period duration, then reducing the result to the next lowest integer. For each component, there is a set of constraints that ensure it is replaced before its lifetime expires. RSM includes user-defined constraints such as transportation mass and volume limits, as well as component life, available repair crew time, and assembly sequences. A weighting factor allows the program to minimize factors such as cost. The program then performs an iterative analysis, which is displayed during processing. On each iteration, a message gives the first period in which resources are exceeded. If the scheduling problem is infeasible, the final message will also indicate the first period in which resources were exceeded. RSM is written in APL2 for IBM PC series computers and compatibles. A stand-alone executable version of RSM is provided; however, this is a "packed" version of RSM which can only utilize the memory within the 640K DOS limit. This executable requires at least 640K of memory and DOS 3.1 or higher. Source code for an APL2/PC workspace version is also provided.
This version of RSM can make full use of any installed extended memory but must be run with the APL2 interpreter; it requires an 80486-based microcomputer, or an 80386-based microcomputer with an 80387 math coprocessor, at least 2Mb of extended memory, and DOS 3.3 or higher. The standard distribution medium for this package is one 5.25 inch 360K MS-DOS format diskette. RSM was developed in 1991. APL2 and IBM PC are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
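The lifetime-to-integer conversion described above amounts to floor division. A hypothetical sketch in Python (RSM itself is written in APL2; the function name is ours):

```python
import math

def lifetime_in_periods(lifetime, period_duration):
    """Convert a real-valued component lifetime into a whole number of
    replacement periods by dividing the lifetime by the period duration
    and reducing the result to the next lowest integer, as RSM does."""
    if lifetime <= 0 or period_duration <= 0:
        raise ValueError("lifetime and period duration must be positive")
    return math.floor(lifetime / period_duration)
```

For example, a component with a 10.5-month lifetime and 3-month resupply periods must be replaced within 3 periods; one with a 2.9-month lifetime cannot survive even a single full period.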
Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.1
The AMAD will perform two CMAQ model simulations, one with the current publicly available version of the CMAQ model (v5.0.2) and the other with the new version of the CMAQ model (v5.1). The results of each model simulation are compared to observations and the performance of t...
Review and verification of CARE 3 mathematical model and code
NASA Technical Reports Server (NTRS)
Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.
1983-01-01
This report documents the CARE III mathematical model and code verification performed by Boeing Computer Services. The mathematical model was verified for permanent and intermittent faults; the transient fault model was not addressed. The code verification was performed on CARE III Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.
Maximum work extraction and implementation costs for nonequilibrium Maxwell's demons.
Sandberg, Henrik; Delvenne, Jean-Charles; Newton, Nigel J; Mitter, Sanjoy K
2014-10-01
We determine the maximum amount of work extractable in finite time by a demon performing continuous measurements on a quadratic Hamiltonian system subjected to thermal fluctuations, in terms of the information extracted from the system. The maximum-work demon is found to apply high-gain continuous feedback involving a Kalman-Bucy estimate of the system state, and operates in nonequilibrium. A simple and concrete electrical implementation of the feedback protocol is proposed, which allows for analytic expressions of the flows of energy, entropy, and information inside the demon. This lets us show that any implementation of the demon must necessarily include an external power source, which we prove both from classical thermodynamics arguments and from a version of Landauer's memory erasure argument extended to nonequilibrium linear systems.
Do as I … Did! Long-term memory of imitative actions in dogs (Canis familiaris).
Fugazza, Claudia; Pogány, Ákos; Miklósi, Ádám
2016-03-01
This study demonstrates long-term declarative memory of imitative actions in a non-human animal species. We tested 12 pet dogs for their ability to imitate human actions after retention intervals ranging from 1 to 24 h. For comparison, another 12 dogs were tested for the same actions without delay between demonstration and recall. Our test consisted of a modified version of the Do as I Do paradigm, combined with the two-action procedure to control for non-imitative processes. Imitative performance of dogs remained consistently high independent of increasing retention intervals, supporting the idea that dogs are able to retain mental representations of human actions for an extended period of time. The ability to imitate after such delays supports the use of long-term declarative memory.
Embedding Quantum Mechanics Into a Broader Noncontextual Theory: A Conciliatory Result
NASA Astrophysics Data System (ADS)
Garola, Claudio; Sozzo, Sandro
2010-12-01
The extended semantic realism (ESR) model embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here an improved version of this model and show that it predicts that, whenever idealized measurements are performed, a modified Bell-Clauser-Horne-Shimony-Holt (BCHSH) inequality holds if one takes into account all individual systems that are prepared, standard quantum predictions hold if one considers only the individual systems that are detected, and a standard BCHSH inequality holds at a microscopic (purely theoretical) level. These results admit an intuitive explanation in terms of an unconventional kind of unfair sampling and constitute a first example of the unified perspective that can be attained by adopting the ESR model.
Specifying real-time systems with interval logic
NASA Technical Reports Server (NTRS)
Rushby, John
1988-01-01
Pure temporal logic makes no reference to time. An interval temporal logic and an extension of that logic that includes real-time constraints are described. The application of this logic is demonstrated by giving a specification for the well-known lift (elevator) example. It is shown how interval logic can be extended to include a notion of process. How the specification language and verification environment of EHDM could be enhanced to support this logic is described. A specification of the alternating bit protocol in this extended version of the specification language of EHDM is given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayashi, A.; Hashimoto, T.; Horibe, M.
The quantum color coding scheme proposed by Korff and Kempe [e-print quant-ph/0405086] is easily extended so that the color coding quantum system is allowed to be entangled with an extra auxiliary quantum system. It is shown that in the extended scheme we need only ≈2√N quantum colors to order N objects in the large-N limit, whereas ≈N/e quantum colors are required in the original nonextended version. The maximum success probability has asymptotics expressed by the Tracy-Widom distribution of the largest eigenvalue of a random Gaussian unitary ensemble (GUE) matrix.
Extending the Stabilized Supralinear Network model for binocular image processing.
Selby, Ben; Tripp, Bryan
2017-06-01
The visual cortex is both extensive and intricate. Computational models are needed to clarify the relationships between its local mechanisms and high-level functions. The Stabilized Supralinear Network (SSN) model was recently shown to account for many receptive field phenomena in V1, and also to predict subtle receptive field properties that were subsequently confirmed in vivo. In this study, we performed a preliminary exploration of whether the SSN is suitable for incorporation into large, functional models of the visual cortex, considering both its extensibility and computational tractability. First, whereas the SSN receives abstract orientation signals as input, we extended it to receive images (through a linear-nonlinear stage), and found that the extended version behaved similarly. Secondly, whereas the SSN had previously been studied in a monocular context, we found that it could also reproduce data on interocular transfer of surround suppression. Finally, we reformulated the SSN as a convolutional neural network, and found that it scaled well on parallel hardware. These results provide additional support for the plausibility of the SSN as a model of lateral interactions in V1, and suggest that the SSN is well suited as a component of complex vision models. Future work will use the SSN to explore relationships between local network interactions and sophisticated vision processes in large networks. Copyright © 2017 Elsevier Ltd. All rights reserved.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is two-fold: first, to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem; second, to work with the KSC GPSS development team and provide enhancements to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering group as a building block, we have implemented the resource deconfliction portion of GPSS in Common LISP using its object-oriented features. The prototype corrects and extends some of the deficiencies of the current production version, and it uses and builds on the classes from the development team's profile prototype.
NASA Astrophysics Data System (ADS)
Roggemann, M.; Soehnel, G.; Archer, G.
Atmospheric turbulence degrades the resolution of images of space objects far beyond that predicted by diffraction alone. Adaptive optics telescopes have been widely used for compensating these effects, but as users seek to extend the envelopes of operation of adaptive optics telescopes to more demanding conditions, such as daylight operation, and operation at low elevation angles, the level of compensation provided will degrade. We have been investigating the use of advanced wave front reconstructors and post detection image reconstruction to overcome the effects of turbulence on imaging systems in these more demanding scenarios. In this paper we show results comparing the optical performance of the exponential reconstructor, the least squares reconstructor, and two versions of a reconstructor based on the stochastic parallel gradient descent algorithm in a closed loop adaptive optics system using a conventional continuous facesheet deformable mirror and a Hartmann sensor. The performance of these reconstructors has been evaluated under a range of source visual magnitudes and zenith angles ranging up to 70 degrees. We have also simulated satellite images, and applied speckle imaging, multi-frame blind deconvolution algorithms, and deconvolution algorithms that presume the average point spread function is known to compute object estimates. Our work thus far indicates that the combination of adaptive optics and post detection image processing will extend the useful envelope of the current generation of adaptive optics telescopes.
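As background, the stochastic parallel gradient descent idea mentioned above can be sketched on a toy quadratic metric: every actuator channel is perturbed simultaneously by a random ±δ, the resulting change in the scalar metric is measured, and all channels are updated in parallel. This illustrative code is ours, not the authors' reconstructor, and all names are assumed:

```python
import random

def spgd(metric, u, gain=1.0, perturb=0.1, iters=500, seed=1):
    """Minimize a scalar metric over actuator commands u via SPGD."""
    rng = random.Random(seed)
    u = list(u)
    for _ in range(iters):
        # Simultaneous random +/- perturbation of every channel
        delta = [perturb * rng.choice((-1.0, 1.0)) for _ in u]
        j_plus = metric([ui + di for ui, di in zip(u, delta)])
        j_minus = metric([ui - di for ui, di in zip(u, delta)])
        dj = j_plus - j_minus
        # Parallel update: move each channel against the measured change
        u = [ui - gain * dj * di for ui, di in zip(u, delta)]
    return u

# Toy "wavefront error" metric: quadratic bowl with its minimum at `target`
target = [0.4, -0.2, 0.1]
metric = lambda u: sum((ui - ti) ** 2 for ui, ti in zip(u, target))
u_opt = spgd(metric, [0.0, 0.0, 0.0])
```

The appeal of SPGD in adaptive optics is that it needs only a single scalar performance metric per measurement, not a per-channel gradient.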
Cross-language Babel structs—making scientific interfaces more efficient
NASA Astrophysics Data System (ADS)
Prantl, Adrian; Ebner, Dietmar; Epperly, Thomas G. W.
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly.
Peyrard, N; Dieckmann, U; Franc, A
2008-05-01
Models of infectious diseases are characterized by a phase transition between extinction and persistence. A challenge in contemporary epidemiology is to understand how the geometry of a host's interaction network influences disease dynamics close to the critical point of such a transition. Here we address this challenge with the help of moment closures. Traditional moment closures, however, do not provide satisfactory predictions close to such critical points. We therefore introduce a new method for incorporating longer-range correlations into existing closures. Our method is technically simple, remains computationally tractable, and significantly improves the approximation's performance. Our extended closures thus provide an innovative tool for quantifying the influence of interaction networks on spatially or socially structured disease dynamics. In particular, we examine the effects of a network's clustering coefficient, as well as of new geometrical measures, such as a network's square clustering coefficients. We compare the relative performance of different closures from the literature, with or without our long-range extension. In this way, we demonstrate that the normalized version of the Bethe approximation, extended to incorporate long-range correlations according to our method, is an especially good candidate for studying influences of network structure. Our numerical results highlight the importance of the clustering coefficient and the square clustering coefficient for predicting disease dynamics at low and intermediate values of the transmission rate, and demonstrate the significance of path redundancy for disease persistence.
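For reference, the clustering coefficient discussed above measures how many of a node's neighbours are themselves connected; it can be computed directly from an adjacency structure. A minimal sketch (function name and data layout assumed):

```python
def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph,
    given as a dict mapping node -> set of neighbours.

    For a node of degree k, the local coefficient is the number of
    links among its neighbours divided by k*(k-1)/2.
    Nodes must be orderable (e.g. integers) for the u < v pair scan.
    """
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)
```

A triangle has coefficient 1 (all neighbour pairs connected), while a simple path has coefficient 0; epidemic models on networks with high clustering behave very differently from those on tree-like graphs.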
Copeland, Jan; Rooke, Sally; Rodriquez, Dan; Norberg, Melissa M; Gibson, Lisa
2017-05-01
Previous studies have shown brief online self-help interventions to be a useful method of treating cannabis use and related problems; however, no studies have compared the effects of brief versus extended feedback for online brief intervention programs. The current study was a two-arm randomised trial aimed at testing the short-term effectiveness of brief and extended feedback versions of Grassessment, a brief online intervention for cannabis users that provides individualised feedback regarding use, motives, and harms. Participants (n=287) reporting at least one symptom of DSM-IV cannabis abuse or dependence were recruited using online and offline advertising methods. Participants were randomised to receive either the brief or the extended feedback version of the Grassessment program and were required to complete a one-month follow-up questionnaire. One hundred and ninety-four participants completed the one-month follow-up. Wilcoxon analyses showed a significant decrease in past-month quantity and frequency of cannabis use (ps<0.001; r=-0.41 and -0.40, respectively) and lower severity of dependence scores (p=0.002; r=-0.31) among those in the brief feedback condition. Participants in the extended feedback group also demonstrated significant decreases in patterns of use (ps<0.002; r=-0.39 and -0.33) but not in severity of dependence (p=0.09; r=0.18). A Generalized Estimating Equation (GEE) analysis showed no significant interaction between length of feedback received and past-month cannabis use frequency (p=0.78), quantity (p=0.73), or severity of dependence (p=0.47). This study adds support for the use of brief online self-complete interventions to reduce cannabis use and related problems in the short term. The findings suggest that in the case of the brief online screening and feedback program Grassessment, extended feedback does not lead to superior outcomes over brief feedback. Copyright © 2017 Elsevier Inc. All rights reserved.
Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search
NASA Astrophysics Data System (ADS)
Nakamura, Katsuhiko; Hoshina, Akemi
This paper discusses recent improvements and extensions to the Synapse system for inductive inference of context-free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search over rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A→βγ to Extended Chomsky Normal Form, which also includes A→B, where each of β and γ is either a terminal or a nonterminal symbol. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm of the previous version of Synapse, the improved version uses a novel rule generation method, called "bridging," which bridges the missing part of the derivation tree for a positive string. The improved version also employs a novel search strategy, called serial search, in addition to minimum rule set search. The synthesis of grammars by serial search is faster than by minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than that produced by minimum set search, and the system can find no appropriate grammar for some CFLs by serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and search strategies.
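For context, bottom-up parsing of a grammar in extended Chomsky Normal Form, of the kind Synapse builds on, can be sketched with a CYK-style recognizer that also closes each table cell under unit rules A→B. This is an illustrative sketch, not Synapse code; all names are assumed:

```python
def cyk_parse(s, terminal_rules, binary_rules, unit_rules, start="S"):
    """CYK recognizer for a grammar in extended Chomsky Normal Form:
    A -> a (terminal_rules), A -> B C (binary_rules), A -> B (unit_rules)."""
    n = len(s)
    table = {}  # table[i, l] = nonterminals deriving the substring s[i:i+l]

    def close(cell):
        # Repeatedly apply unit rules A -> B until the cell stops growing
        changed = True
        while changed:
            changed = False
            for a, b in unit_rules:
                if b in cell and a not in cell:
                    cell.add(a)
                    changed = True
        return cell

    for i, ch in enumerate(s):
        table[i, 1] = close({a for a, t in terminal_rules if t == ch})
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            cell = set()
            for split in range(1, length):
                left = table[i, split]
                right = table[i + split, length - split]
                cell |= {a for a, b, c in binary_rules
                         if b in left and c in right}
            table[i, length] = close(cell)
    return start in table[0, n]
```

A small grammar for the language a^n b^n (S→AB | AY, Y→SB, A→a, B→b) illustrates the recognizer; adding a unit rule T→S makes T an alternative start symbol.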
Paats, A; Alumäe, T; Meister, E; Fridolin, I
2018-04-30
The aim of this study was to analyze retrospectively the influence of different acoustic and language models in order to determine the most important effects on the clinical performance of an Estonian-language, non-commercial, radiology-oriented automatic speech recognition (ASR) system. An ASR system was developed for the Estonian language in the radiology domain by utilizing open-source software components (Kaldi toolkit, Thrax). The ASR system was trained with real radiology text reports and dictations collected during the development phases. The final version of the ASR system was tested by 11 radiologists who dictated 219 reports in total, in a spontaneous manner in a real clinical environment. The audio files collected in the final phase were used to measure the performance of different versions of the ASR system retrospectively. ASR system versions were evaluated by word error rate (WER) for each speaker and modality, and by the WER difference between the first and the last version of the ASR system. The total average WER for the final version across all material improved from 18.4% for the first version (v1) to 5.8% for the last version (v8), which corresponds to a relative improvement of 68.5%. WER improvement was strongly related to modality and radiologist. In summary, the performance of the final ASR system version was close to optimal, delivering similar results for all modalities and being independent of the user, the complexity of the radiology reports, user experience, and speech characteristics.
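Word error rate as used above is the word-level Levenshtein distance (substitutions + insertions + deletions) divided by the reference length. A minimal sketch, not the evaluation code used in the study:

```python
def word_error_rate(reference, hypothesis):
    """WER = word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

The relative improvement reported above follows as (18.4 - 5.8) / 18.4 ≈ 68.5%.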
Modeling the fate and transport of bacteria in agricultural and pasture lands using APEX
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy/Environmental eXtender (APEX) model is a whole farm to small watershed scale continuous simulation model developed for evaluating various land management strategies. The current version, APEX0806, does not have the modeling capacity for fecal indicator bacteria fate and trans...
Regional Impacts of extending inorganic and organic cloud chemistry with AQCHEM-KMT
Starting with CMAQ version 5.1, AQCHEM-KMT has been offered as a readily expandable option for cloud chemistry via application of the Kinetic PreProcessor (KPP). AQCHEM-KMT treats kinetic mass transfer between the gas and aqueous phases, ionization, chemical kinetics, droplet sc...
Multiple-Object Tracking in Children: The "Catch the Spies" Task
ERIC Educational Resources Information Center
Trick, L.M.; Jaspers-Fayer, F.; Sethi, N.
2005-01-01
Multiple-object tracking involves simultaneously tracking positions of a number of target-items as they move among distractors. The standard version of the task poses special challenges for children, demanding extended concentration and the ability to distinguish targets from identical-looking distractors, and may thus underestimate children's…
Concerns for Minority Groups in Communication Disorders. ASHA Reports No. 16.
ERIC Educational Resources Information Center
Bess, Fred H., Ed.; And Others
This monograph addresses topical issues in training, service delivery, and research for minorities in communication disorders. It presents extended versions of papers that were delivered at the conference, "Concerns for Minority Groups in Communication Disorders," held in Nashville, Tennessee on September 17-19, 1984. Papers include: "The First…
Factors Influencing Physical Activity among Postpartum Iranian Women
ERIC Educational Resources Information Center
Roozbahani, Nasrin; Ghofranipour, Fazlollah; Eftekhar Ardabili, Hassan; Hajizadeh, Ebrahim
2014-01-01
Background: Postpartum women are a population at risk for sedentary living. Physical activity (PA) prior to pregnancy may be effective in predicting similar behaviour in the postpartum period. Objective: To test a composite version of the extended transtheoretical model (TTM) by adding "past behaviour" in order to predict PA behaviour…
Extending the Theory and Practice of Education for Cosmopolitan Citizenship
ERIC Educational Resources Information Center
Osler, Audrey; Starkey, Hugh
2018-01-01
In 2003, citizenship education had recently been introduced to the national curriculum for England, and the model adopted was proving to be influential in a variety of settings worldwide. We sought to challenge a nationalist version of citizenship education by proposing "education for cosmopolitan citizenship" arguing for citizenship…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... available a revised Word version, an Excel version, and a version on HUD's Energy and Performance... recipient may elect to submit to HUD either the Word, Excel, or EPIC versions; however, the Excel and EPIC versions are preferred because of their automated capabilities and reduced burden. The Word, Excel, and...
Impact of data layouts on the efficiency of GPU-accelerated IDW interpolation.
Mei, Gang; Tian, Hong
2016-01-01
This paper evaluates the impact of different data layouts on the computational efficiency of a GPU-accelerated Inverse Distance Weighting (IDW) interpolation algorithm. First, we redesign and improve our previous GPU implementation, which exploited the CUDA dynamic parallelism (CDP) feature. We then implement three GPU versions, i.e., the naive version, the tiled version, and the improved CDP version, on top of five data layouts: the Structure of Arrays (SoA), the Array of Structures (AoS), the Array of aligned Structures (AoaS), the Structure of Arrays of aligned Structures (SoAoS), and a Hybrid layout. We also carry out several groups of experimental tests to evaluate the impact. Experimental results show that the layouts AoS and AoaS achieve better performance than SoA for both the naive and tiled versions, while SoA is the best choice for the improved CDP version. We also observe that the two combined layouts (SoAoS and Hybrid) yield no notable performance gains over the other three basic layouts. We recommend that, in practical applications, the layout AoaS be used, since the tiled version is the fastest of the three versions. The source code of all implementations is publicly available.
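The layout distinction can be illustrated on the CPU with NumPy: a structured array interleaves fields (AoS), while separate contiguous arrays keep each field together (SoA), which is what enables coalesced reads on a GPU. This is a host-side sketch with made-up field names, not the paper's CUDA kernels.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Array of Structures (AoS): x, y, w interleaved in memory, one record per point.
aos = np.zeros(n, dtype=[("x", np.float32), ("y", np.float32), ("w", np.float32)])
aos["x"] = rng.random(n, dtype=np.float32)
aos["y"] = rng.random(n, dtype=np.float32)
aos["w"] = rng.random(n, dtype=np.float32)

# Structure of Arrays (SoA): each field is its own contiguous array, so
# neighbouring GPU threads would read neighbouring addresses (coalescing).
soa = {f: aos[f].copy() for f in ("x", "y", "w")}

# The same IDW-style distance computation works on either layout:
qx, qy = np.float32(0.5), np.float32(0.5)
d_aos = np.hypot(aos["x"] - qx, aos["y"] - qy)
d_soa = np.hypot(soa["x"] - qx, soa["y"] - qy)
assert np.allclose(d_aos, d_soa)  # identical values; only memory layout differs
print(aos.dtype.itemsize, soa["x"].strides)  # 12-byte records vs 4-byte stride
```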
PERI - Auto-tuning Memory Intensive Kernels for Multicore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H; Williams, Samuel; Datta, Kaushik
2008-06-24
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to Sparse Matrix Vector Multiplication (SpMV), the explicit heat equation PDE on a regular grid (Stencil), and a lattice Boltzmann application (LBMHD). We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon Clovertown, AMD Opteron Barcelona, Sun Victoria Falls, and the Sony-Toshiba-IBM (STI) Cell. Rather than hand-tuning each kernel for each system, we develop a code generator for each kernel that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned kernel applications often achieve a better than 4X improvement compared with the original code. Additionally, we analyze a Roofline performance model for each platform to reveal hardware bottlenecks and software challenges for future multicore systems and applications.
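The search-based tuning idea reduces to a simple loop: generate code variants (here, just one blocking parameter), time each variant on the target machine, and keep the fastest. The following is a hypothetical miniature of that idea; the paper's code generators explore far larger parameter spaces.

```python
import time

def matvec_blocked(a, x, n, block):
    """Dense n-by-n mat-vec computed in column blocks; `block` is the tuning knob."""
    y = [0.0] * n
    for j0 in range(0, n, block):
        jend = min(j0 + block, n)
        for i in range(n):
            row, acc = a[i], 0.0
            for j in range(j0, jend):
                acc += row[j] * x[j]
            y[i] += acc
    return y

def autotune(n=256, candidates=(8, 16, 32, 64, 128)):
    """Empirically time each candidate block size and return the fastest."""
    a = [[(i + j) % 7 * 0.5 for j in range(n)] for i in range(n)]
    x = [1.0] * n
    timings = {}
    for block in candidates:
        t0 = time.perf_counter()
        matvec_blocked(a, x, n, block)
        timings[block] = time.perf_counter() - t0
    return min(timings, key=timings.get)   # fastest variant on this machine wins

best = autotune()
print("selected block size:", best)
```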
Exploring synchronisation in nonlinear data assimilation
NASA Astrophysics Data System (ADS)
Rodrigues-Pinheiro, Flavia; van Leeuwen, Peter Jan
2016-04-01
Present-day data assimilation methods are based on linearizations and face serious problems in strongly nonlinear cases such as convection. A promising solution to this problem is the particle filter, which provides a representation of the model probability density function (pdf) by a discrete set of model states, or particles. The basic particle filter uses Bayes's theorem directly, but does not work in high-dimensional cases. The performance can be improved by exploiting the proposal density freedom. This allows one to change the model equations to bring the particles closer to the observations, resulting in very efficient update schemes at observation times, but extending these schemes between observation times is computationally expensive. Simple solutions like nudging have been shown not to be powerful enough. A potential solution might be synchronization, in which one tries to synchronise the model of a system with the true evolution of the system via the observations. In practice this means that an extra term is added to the model equations that hampers the growth of instabilities on the synchronization manifold. Especially the delayed versions, in which observations are allowed to influence the state in the past, have shown some remarkable successes. Unfortunately, all efforts so far ignore errors in the observations, and as soon as these are introduced the performance degrades considerably. There is a close connection between time-delayed synchronization and a Kalman smoother, which does allow for observational (and other) errors. In this presentation we will explore this connection in full, with a view to extending synchronization to more realistic settings. Specifically, how well information spreads from observed to unobserved variables is studied in detail. The results indicate that this extended synchronisation is a promising tool to steer the model states towards the observations efficiently.
If time permits, we will show initial results of embedding the new synchronization method into a particle filter.
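A minimal illustration of the synchronization-by-nudging idea described above, on the Lorenz-63 system: an extra relaxation term -K(x_model - x_obs) is added to the model's x equation, pulling the model onto the truth trajectory. Observation errors are ignored here, as in the basic setup the abstract criticises; all values are illustrative, not the presenters' actual scheme.

```python
import math

def lorenz_step(state, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

truth = (1.0, 1.0, 1.0)
model = (8.0, -3.0, 25.0)   # badly initialised model state
K, dt = 20.0, 0.01
for _ in range(5000):
    truth = lorenz_step(truth, dt)
    x_obs = truth[0]                          # perfect observation of x only
    x, y, z = lorenz_step(model, dt)
    model = (x - dt * K * (x - x_obs), y, z)  # nudge x toward the observation

err = math.dist(truth, model)
print("final state error:", err)
```

Because the x-driven (y, z) subsystem of Lorenz-63 is conditionally stable, nudging only the observed component is enough to synchronize the full state.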
NASA Astrophysics Data System (ADS)
Novelli, Antonio; Aguilar, Manuel A.; Nemmaoui, Abderrahim; Aguilar, Fernando J.; Tarantino, Eufemia
2016-10-01
This paper presents the first comparison between data from the Sentinel-2 (S2) Multi Spectral Instrument (MSI) and the Landsat 8 (L8) Operational Land Imager (OLI) aimed at greenhouse detection. Two scenes closely related in time, one for each sensor, were classified using Object-Based Image Analysis and Random Forest (RF). The RF input consisted of several object-based features computed from the spectral bands, including mean values, spectral indices, and textural features. The S2 and L8 comparisons were also extended using a common segmentation dataset extracted from VHR WorldView-2 (WV2) imagery, to isolate differences due only to their specific spectral contribution. The best band combinations for segmentation were found through a modified version of the Euclidean Distance 2 index. Four different RF classification schemes were considered, achieving best overall accuracies of 89.1%, 91.3%, 90.9%, and 93.4%, respectively, evaluated over the whole study area.
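As a toy stand-in for the Random Forest step above, the sketch below trains an ensemble of bootstrap-fitted, single-feature decision stumps with majority voting on synthetic "object" features (mean NIR, mean red, NDVI). Everything here is invented for illustration; it is not the paper's RF implementation or data.

```python
import random

def fit_stump(X, y, feat):
    """Best threshold and orientation on one feature by training error."""
    best_err, best = len(y) + 1, None
    for t in {x[feat] for x in X}:
        for above in (0, 1):  # class predicted when x[feat] > t
            err = sum((above if x[feat] > t else 1 - above) != yi
                      for x, yi in zip(X, y))
            if err < best_err:
                best_err, best = err, (t, above)
    return best

def fit_forest(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]  # bootstrap sample
        feat = rng.randrange(len(X[0]))                       # random feature per tree
        forest.append((feat, *fit_stump([X[i] for i in idx], [y[i] for i in idx], feat)))
    return forest

def predict(forest, x):
    votes = sum(above if x[feat] > t else 1 - above for feat, t, above in forest)
    return 1 if 2 * votes >= len(forest) else 0

# Synthetic "objects": [mean NIR, mean red, NDVI]; label 1 = greenhouse.
rng = random.Random(42)
X, y = [], []
for _ in range(300):
    label = int(rng.random() < 0.5)
    nir = rng.gauss(0.45 if label else 0.35, 0.05)
    red = rng.gauss(0.20, 0.05)
    X.append([nir, red, (nir - red) / (nir + red)])
    y.append(label)
forest = fit_forest(X[:200], y[:200])
acc = sum(predict(forest, x) == yi for x, yi in zip(X[200:], y[200:])) / 100
print("overall accuracy:", acc)
```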
The Reactome Pathway Knowledgebase
Jupe, Steven; Matthews, Lisa; Sidiropoulos, Konstantinos; Gillespie, Marc; Garapati, Phani; Haw, Robin; Jassal, Bijay; Korninger, Florian; May, Bruce; Milacic, Marija; Roca, Corina Duenas; Rothfels, Karen; Sevilla, Cristoffer; Shamovsky, Veronica; Shorser, Solomon; Varusai, Thawfeek; Viteri, Guilherme; Weiser, Joel
2018-01-01
Abstract The Reactome Knowledgebase (https://reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism, and other cellular processes as an ordered network of molecular transformations—an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression profiles or somatic mutation catalogues from tumor cells. To support the continued brisk growth in the size and complexity of Reactome, we have implemented a graph database, improved performance of data analysis tools, and designed new data structures and strategies to boost diagram viewer performance. To make our website more accessible to human users, we have improved pathway display and navigation by implementing interactive Enhanced High Level Diagrams (EHLDs) with an associated icon library, and subpathway highlighting and zooming, in a simplified and reorganized web site with adaptive design. To encourage re-use of our content, we have enabled export of pathway diagrams as ‘PowerPoint’ files. PMID:29145629
Status of parallel Python-based implementation of UEDGE
NASA Astrophysics Data System (ADS)
Umansky, M. V.; Pankin, A. Y.; Rognlien, T. D.; Dimits, A. M.; Friedman, A.; Joseph, I.
2017-10-01
The tokamak edge transport code UEDGE has long used the code-development and run-time framework Basis. However, with support for Basis expected to terminate in the coming years, and with the advent of the modern numerical language Python, it has become desirable to move UEDGE to Python to ensure its long-term viability. Our new Python-based UEDGE implementation takes advantage of the portable build system developed for FACETS. The new implementation gives access to Python's graphical libraries and numerical packages for pre- and post-processing, and support for HDF5 simplifies data exchange. The older serial version of UEDGE used the Newton-Krylov solver NKSOL for time-stepping. The renovated implementation uses backward Euler discretization with nonlinear solvers from PETSc, which promises to significantly improve UEDGE's parallel performance. We will report on an assessment of some of the extended UEDGE capabilities emerging in the new implementation and will discuss future directions. Work performed for U.S. DOE by LLNL under contract DE-AC52-07NA27344.
Dumont, Cyrielle; Lestini, Giulia; Le Nagard, Hervé; Mentré, France; Comets, Emmanuelle; Nguyen, Thu Thuy; for the PFIM Group
2018-03-01
Nonlinear mixed-effect models (NLMEMs) are increasingly used for the analysis of longitudinal studies during drug development. When designing these studies, the expected Fisher information matrix (FIM) can be used instead of performing time-consuming clinical trial simulations. The function PFIM is the first tool for design evaluation and optimization that has been developed in R. In this article, we present an extended version, PFIM 4.0, which includes several new features. Compared with version 3.0, PFIM 4.0 includes a more complete pharmacokinetic/pharmacodynamic library of models and accommodates models including additional random effects for inter-occasion variability as well as discrete covariates. A new input method has been added to specify user-defined models through an R function. Optimization can be performed assuming some fixed parameters or some fixed sampling times. New outputs have been added regarding the FIM, such as eigenvalues, condition numbers, and the option of saving the matrix obtained after evaluation or optimization. Previously obtained results, which are summarized in a FIM, can be taken into account in evaluation or optimization of one-group protocols. This feature enables the use of PFIM for adaptive designs. The Bayesian individual FIM has been implemented, taking into account the a priori distribution of random effects. Designs for maximum a posteriori Bayesian estimation of individual parameters can now be evaluated or optimized, and the predicted shrinkage is also reported. It is also possible to visualize the graphs of the model and the sensitivity functions without performing evaluation or optimization.
The usefulness of these approaches and the simplicity of use of PFIM 4.0 are illustrated by two examples: (i) an example of designing a population pharmacokinetic study accounting for previous results, which highlights the advantage of adaptive designs; (ii) an example of Bayesian individual design optimization for a pharmacodynamic study, showing that the Bayesian individual FIM can be a useful tool in therapeutic drug monitoring, allowing efficient prediction of estimation precision and shrinkage for individual parameters. PFIM 4.0 is a useful tool for design evaluation and optimization of longitudinal studies in pharmacometrics and is freely available at http://www.pfim.biostat.fr. Copyright © 2018 Elsevier B.V. All rights reserved.
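PFIM evaluates the expected FIM for full nonlinear mixed-effect models; the simplified fixed-effects sketch below shows the underlying idea for a single subject with known residual error. The model y = A·exp(-k·t) + ε, the sampling times, and the parameter values are all illustrative, not PFIM output.

```python
import numpy as np

def fim(times, A, k, sigma):
    """Expected FIM for y = A*exp(-k t) + eps, eps ~ N(0, sigma^2):
    FIM = (1/sigma^2) * sum_t grad f(t) grad f(t)^T, gradient in (A, k)."""
    F = np.zeros((2, 2))
    for t in times:
        g = np.array([np.exp(-k * t), -A * t * np.exp(-k * t)])  # df/dA, df/dk
        F += np.outer(g, g)
    return F / sigma**2

times = [0.5, 1.0, 2.0, 4.0, 8.0]
F = fim(times, A=10.0, k=0.8, sigma=0.5)
# Predicted standard errors (Cramer-Rao lower bound) from the inverse FIM:
se = np.sqrt(np.diag(np.linalg.inv(F)))
print("SE(A), SE(k):", se)
```

Design evaluation then amounts to comparing such predicted standard errors (or a criterion like the FIM determinant) across candidate sets of sampling times.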
Rational extended thermodynamics of a rarefied polyatomic gas with molecular relaxation processes
NASA Astrophysics Data System (ADS)
Arima, Takashi; Ruggeri, Tommaso; Sugiyama, Masaru
2017-10-01
We present a more refined version of rational extended thermodynamics of rarefied polyatomic gases in which molecular rotational and vibrational relaxation processes are treated individually. In this case, we need a triple hierarchy of the moment system, and the system of balance equations is closed via the maximum entropy principle. Three different types of production terms in the system, suggested by a generalized BGK-type collision term in the Boltzmann equation, are adopted. In particular, the rational extended thermodynamic theory with seven independent fields (ET7) is analyzed in detail. Finally, the dispersion relation for ultrasonic waves derived from the ET7 theory is confirmed by experimental data for CO2, Cl2, and Br2 gases.
GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions
Banta, Edward R.; Ahlfeld, David P.
2013-01-01
Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
Prevost, Marie; Carrier, Marie-Eve; Chowne, Gabrielle; Zelkowitz, Phyllis; Joseph, Lawrence; Gold, Ian
2014-01-01
The first aim of our study was to validate the French version of the Reading the Mind in the Eyes test, a theory of mind test. The second aim was to test whether cultural differences modulate performance on this test. A total of 109 participants completed the original English version and 97 participants completed the French version. Another group of 30 participants completed the French version twice, one week apart. We report a similar overall distribution of scores in both versions and no differences in mean scores between them. However, 2 items in the French version did not collect a majority of responses, which differed from the results for the English version. Test-retest showed good stability of the French version. As expected, participants who do not speak French or English at home, and those born in Asia, performed worse than North American participants and those who speak English or French at home. We report a French version with acceptable validity and good stability. The cultural differences observed support the idea that Asian cultures do not rely on theory of mind to explain people's behaviours as much as North Americans do.
Computer versus paper--does it make any difference in test performance?
Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin
2015-01-01
CONSTRUCT: In this study, we examine differences in test performance between the paper-based and the computer-based versions of the Berlin formative Progress Test. In this context, it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff benefit largely from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable to the 4th year of the North American medical school schedule) participated in the study (paper = 132, computer = 134). Allocation to test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. Both groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more than low-performing students using the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time with the paper-pencil version might be due to the time needed to write the answer down and to check that it was transferred correctly.
It is still not known why students using the computer version (particularly low-performing students) guess at a higher rate. Further studies are necessary to understand this finding.
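The matched-pair allocation described above can be sketched as: sort participants by prior score, pair adjacent ones, and randomise within each pair. The helper below is a hypothetical illustration of that design, not the study's actual procedure or data.

```python
import random

def matched_pair_assign(scores, seed=0):
    """scores: {participant_id: prior score}. Sort by prior score, pair adjacent
    participants, and randomly assign one of each pair to 'paper' and the other
    to 'computer'. Hypothetical sketch of a randomized matched-pair design."""
    rng = random.Random(seed)
    order = sorted(scores, key=scores.get)    # participant ids, low to high score
    groups = {}
    for i in range(0, len(order) - 1, 2):
        a, b = order[i], order[i + 1]
        if rng.random() < 0.5:                # randomise within the pair
            a, b = b, a
        groups[a], groups[b] = "paper", "computer"
    if len(order) % 2:                        # odd participant out: random group
        groups[order[-1]] = rng.choice(["paper", "computer"])
    return groups

prior = {f"s{i}": s for i, s in enumerate([55, 71, 63, 80, 58, 67])}
print(matched_pair_assign(prior))
```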
Gawlik, Stephanie; Müller, Mitho; Hoffmann, Lutz; Dienes, Aimée; Reck, Corinna
2015-01-01
A validated questionnaire for assessing fathers' experiences during childbirth is lacking in routine clinical practice. Salmon's Item List is a short, validated instrument used for the assessment of birth experience in mothers in both English- and German-speaking communities. With little to no validated data available for fathers, this pilot study aimed to assess the applicability of the German version of Salmon's Item List, including a multidimensional birth experience concept, in fathers. Longitudinal study; data were collected by questionnaires. University hospital in Germany. The birth experiences of 102 fathers were assessed four to six weeks post partum using the German version of Salmon's Item List. Construct validity testing with exploratory factor analysis, using principal component analysis with varimax rotation, was performed to identify the dimensions of childbirth experiences. Internal consistency was also analysed. Factor analysis yielded a four-factor solution comprising 17 items that accounted for 54.5% of the variance. The main domain was 'fulfilment', and the secondary domains were 'emotional distress', 'physical discomfort' and 'emotional adaptation'. For fulfilment, Cronbach's α met conventional reliability standards (0.87). Salmon's Item List is an appropriate instrument for assessing birth experience in fathers in terms of fulfilment. Larger samples need to be examined in order to confirm the stability of the factor structure before use can be extended to routine clinical assessment. A reduced version of Salmon's Item List may be useful as a screening tool for general assessment. Copyright © 2014 Elsevier Ltd. All rights reserved.
Wong, P K S; Wong, D F K; Zhuang, X Y; Liu, Y
2017-03-01
The construct of self-determination has received considerable attention in the international field of intellectual disabilities (ID). Recently, there has been rapid development of this construct in Chinese societies, including Hong Kong. However, there is no locally validated instrument to measure self-determination in people with ID. This article explains the validation process of the AIR Self-Determination Scale - Chinese version (AIR SDS-C), adapted from the 24-item AIR Self-Determination Scale developed by Wolman and his colleagues, which is used in school settings. People with mild/moderate ID aged 15 years or above were recruited from special schools and social services units in different regions of Hong Kong. Factor analysis and reliability tests were conducted. Data for a total of 356 participants were used for the analysis. A confirmatory factor analysis was performed to test the factorial construct, and Mplus 7.0 was used for the analysis. The factor structure proposed in the original English version was supported by the data, and all factor loadings were between 0.42 and 0.76. The whole scale achieved good reliability (Cronbach's α = 0.88 and ω = 0.90). The AIR SDS-C appears to be a valid and reliable scale. This study examined adult groups as well as student groups; the application of the scale can thus be extended to a wider population. The implications for theory building and practice are discussed. © 2016 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.
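Cronbach's α, reported here and in several of the studies above, has a simple closed form: α = k/(k-1) · (1 - Σ item variances / variance of the total score). A minimal sketch with illustrative data (not any study's dataset):

```python
def cronbach_alpha(items):
    """items: one list of scores per item; each list has one entry per respondent."""
    k, n = len(items), len(items[0])
    def var(xs):                      # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three perfectly correlated items give alpha = 1.0:
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```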
Stojmenova, Kristina; Sodnik, Jaka
2018-07-04
There are 3 standardized versions of the Detection Response Task (DRT), 2 using visual stimuli (remote DRT and head-mounted DRT) and 1 using tactile stimuli. In this article, we present a study that proposes and validates a type of auditory signal to be used as a DRT stimulus and evaluates the proposed auditory version of the method by comparing it with the standardized visual and tactile versions. This was a within-subject design study performed in a driving simulator with 24 participants. Each participant performed 8 2-min-long driving sessions in which they had to perform 3 different tasks: driving, responding to DRT stimuli, and performing a cognitive task (n-back task). Presence of additional cognitive load and type of DRT stimuli were defined as independent variables. DRT response times and hit rates, n-back task performance, and pupil size were observed as dependent variables. Significant changes in pupil size for trials with a cognitive task compared to trials without showed that cognitive load was induced properly. Each DRT version showed a significant increase in response times and a decrease in hit rates for trials with a secondary cognitive task compared to trials without. Similar, and significantly better, results in response times and hit rates were obtained for the auditory and tactile versions compared to the visual version. There were no significant differences in performance rate between trials without DRT stimuli and trials with them, or among trials with different DRT stimulus modalities. The results from this study show that the auditory DRT version, using the signal implementation suggested in this article, is sensitive to the effects of cognitive load on driver attention and is significantly better than the remote visual and tactile versions for auditory-vocal cognitive (n-back) secondary tasks.
NASA Astrophysics Data System (ADS)
Butykai, A.; Domínguez-García, P.; Mor, F. M.; Gaál, R.; Forró, L.; Jeney, S.
2017-11-01
The present document is an update of the previously published MatLab code for the calibration of optical tweezers in the high-resolution detection of the Brownian motion of non-spherical probes [1]. In this instance, an alternative version of the original code, based on the same physical theory [2] but focused on automating the calibration of measurements using spherical probes, is outlined. The newly added code is useful for high-frequency microrheology studies, where the probe radius is known but the viscosity of the surrounding fluid may not be. This extended calibration methodology is automatic, without the need for a user interface. A code for calibration by means of thermal noise analysis [3] is also included; this is a method that can be applied when using viscoelastic fluids if the trap stiffness has previously been estimated [4]. The new code can be executed in MatLab and in GNU Octave. Program Files doi:http://dx.doi.org/10.17632/s59f3gz729.1 Licensing provisions: GPLv3 Programming language: MatLab 2016a (MathWorks Inc.) and GNU Octave 4.0 Operating system: Linux and Windows. Supplementary material: A new document README.pdf includes basic running instructions for the new code. Journal reference of previous version: Computer Physics Communications, 196 (2015) 599 Does the new version supersede the previous version?: No. It adds alternative but compatible code while providing similar calibration factors. Nature of problem (approx. 50-250 words): The original code uses a MatLab-provided user interface, which is not available in GNU Octave, and cannot be used outside of proprietary software such as MatLab. In addition, calibration with spherical probes needs an automatic method when processing large amounts of data for microrheology. Solution method (approx. 50-250 words): The new code can be executed in the latest version of MatLab and in GNU Octave, a free and open-source alternative to MatLab.
This code implements an automatic calibration process that requires only writing the input data in the main script. Additionally, we include a calibration method based on thermal noise statistics, which can be used with viscoelastic fluids if the trap stiffness has previously been estimated. Reasons for the new version: This version extends the functionality of PFMCal to the particular case of spherical probes and unknown fluid viscosities. The extended code is automatic, works on different operating systems, and is compatible with GNU Octave. Summary of revisions: The original MatLab program of the previous version, which is executed by PFMCal.m, is not changed. Here, we have added two additional main scripts, named PFMCal_auto.m and PFMCal_histo.m, which implement, respectively, automatic calculation of the calibration process and calibration through Boltzmann statistics. The process of calibration using this code for spherical beads is described in the README.pdf file provided with the new code submission. Here, we obtain different calibration factors, β (given in μm/V), according to [2], related to two statistical quantities: the mean-squared displacement (MSD), βMSD, and the velocity autocorrelation function (VAF), βVAF. Using that methodology, the trap stiffness, k, and the zero-shear viscosity of the fluid, η, can be calculated if the value of the particle's radius, a, is known beforehand. For comparison, we include in the extended code the method of calibration using the corner frequency of the power-spectral density (PSD) [5], providing a calibration factor βPSD. Besides, with a prior estimate of the trap stiffness, along with the known value of the particle's radius, we can use thermal noise statistics to obtain calibration factors, β, according to the quadratic form of the optical potential, βE, and related to the Gaussian distribution of the bead's positions, βσ2.
This method has been demonstrated to be applicable to the calibration of optical tweezers when using non-Newtonian viscoelastic polymeric liquids [4]. An example of the results using this calibration process is summarized in Table 1. Using the data provided in the new code submission, for water and acetone fluids, we calculate all the calibration factors by using the original PFMCal.m and by the new non-GUI code PFMCal_auto.m and PFMCal_histo.m. Regarding the new code, PFMCal_auto.m returns η, k, βMSD, βVAF and βPSD, while PFMCal_histo.m provides βσ2 and βE. Table 1 shows how we obtain the expected viscosity of the two fluids at this temperature and how the different methods provide good agreement between trap stiffnesses and calibration factors. Additional comments including Restrictions and Unusual features (approx. 50-250 words): The original code, PFMCal.m, runs under MatLab using the Statistics Toolbox. The extended code, PFMCal_auto.m and PFMCal_histo.m, can be executed without modification using MatLab or GNU Octave. The code has been tested in Linux and Windows operating systems.
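The βσ2-style route mentioned above rests on equipartition: for a bead in a harmonic trap, k·var(x) = kB·T, so with a detector signal v (volts) and x = β·v, the factor is β = sqrt(kB·T / (k·var(v))). The sketch below is a Python illustration with synthetic data and β in m/V (the text quotes μm/V); it is not PFMCal code.

```python
import math
import random

kB = 1.380649e-23  # Boltzmann constant, J/K

def beta_from_equipartition(volts, k_trap, T=293.15):
    """Position-calibration factor beta (m/V) from the Gaussian spread of the
    detector signal, via equipartition: k_trap * var(beta*v) = kB * T."""
    m = sum(volts) / len(volts)
    var_v = sum((v - m) ** 2 for v in volts) / (len(volts) - 1)
    return math.sqrt(kB * T / (k_trap * var_v))

# Synthetic detector trace: true beta = 1e-7 m/V, trap stiffness 1e-6 N/m.
random.seed(1)
k_trap, beta_true = 1e-6, 1e-7
sigma_x = math.sqrt(kB * 293.15 / k_trap)               # thermal position spread (m)
volts = [random.gauss(0.0, sigma_x / beta_true) for _ in range(20000)]
beta = beta_from_equipartition(volts, k_trap)
print(beta)  # recovers a value close to beta_true
```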
Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide
NASA Technical Reports Server (NTRS)
Bartrand, Timothy A.; Willis, Edward A.
1993-01-01
This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.
An extension of fracture mechanics/technology to larger and smaller cracks/defects
Abé, Hiroyuki
2009-01-01
Fracture mechanics/technology is a key science and technology for the design and integrity assessment of engineering structures. However, conventional fracture mechanics has mostly targeted a limited range of crack/defect sizes, roughly from several hundred microns to several tens of centimeters. The author and his group have tried to extend that limited range and establish new versions of fracture technology, for the very large cracks used in geothermal energy extraction and for the very small cracks/defects or damage that often appear in combined mechanical and electronic components of engineering structures. Those new versions are reviewed in this paper. PMID:19907123
DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.
Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques
2008-09-08
Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
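The master-worker pattern DOVIS uses to spread ligands across cluster nodes can be caricatured with a thread pool; the docking call below is a placeholder, not AutoDock's API:

```python
from concurrent.futures import ThreadPoolExecutor

def dock_score(ligand):
    """Placeholder for a per-ligand docking call (DOVIS delegates this
    step to AutoDock); the score computed here is purely illustrative."""
    return ligand, -0.1 * len(ligand)

# Master-worker screening of a small hypothetical library: the pool
# hands out ligands to idle workers, which is the essence of the
# load balancing the paper automates at cluster scale.
ligands = [f"lig_{i:04d}" for i in range(100)]
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = dict(pool.map(dock_score, ligands))
print(len(scores))  # 100 ligands scored
```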
Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier
2010-05-01
PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.
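PhyML's use of parsimony to pre-filter SPR candidates relies on fast small-parsimony scoring; the classic Fitch algorithm is sketched below (a generic illustration, not PhyML's code):

```python
def fitch_parsimony(tree, site):
    """Fitch small-parsimony score for one alignment site on a rooted
    binary tree given as nested 2-tuples, with leaves as taxon names.
    Returns the minimum number of state changes required."""
    def post(node):
        if isinstance(node, str):          # leaf: singleton state set
            return {site[node]}, 0
        (sl, cl), (sr, cr) = post(node[0]), post(node[1])
        inter = sl & sr
        if inter:                          # children agree: no new change
            return inter, cl + cr
        return sl | sr, cl + cr + 1        # disagreement: one change

    return post(tree)[1]

tree = ((('A', 'B'), 'C'), 'D')
site = {'A': 'G', 'B': 'G', 'C': 'T', 'D': 'T'}
print(fitch_parsimony(tree, site))  # 1: a single G<->T change suffices
```

Scores like this are cheap enough to rank many candidate topology moves before paying for full likelihood evaluations, which is the spirit of the filtering described above.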
Lee, Chun Fan; Ng, Raymond; Luo, Nan; Wong, Nan Soon; Yap, Yoon Sim; Lo, Soo Kien; Chia, Whay Kuang; Yee, Alethea; Krishna, Lalit; Wong, Celest; Goh, Cynthia; Cheung, Yin Bun
2013-01-01
To examine the measurement properties of and comparability between the English and Chinese versions of the five-level EuroQoL Group's five-dimension questionnaire (EQ-5D) in breast cancer patients in Singapore. This is an observational study of 269 patients. Known-group validity and responsiveness of the EQ-5D utility index and visual analog scale (VAS) were assessed in relation to various clinical characteristics and longitudinal change in performance status, respectively. Convergent and divergent validity was examined by correlation coefficients between the EQ-5D and a breast cancer-specific instrument. Test-retest reliability was evaluated. The two language versions were compared by multiple regression analyses. For both English and Chinese versions, the EQ-5D utility index and VAS demonstrated known-group validity and convergent and divergent validity, and presented sufficient test-retest reliability (intraclass correlation = 0.72 to 0.83). The English version was responsive to changes in performance status. The Chinese version was responsive to decline in performance status, but there was no conclusive evidence about its responsiveness to improvement in performance status. In the comparison analyses of the utility index and VAS between the two language versions, borderline results were obtained, and equivalence cannot be definitely confirmed. The five-level EQ-5D is valid, responsive, and reliable in assessing health outcome of breast cancer patients. The English and Chinese versions provide comparable measurement results.
Andrews, Suzanne; Leeman, Lawrence; Yonke, Nicole
2017-09-01
Breech presentation affects 3-4% of pregnancies at term and malpresentation is the primary indication for 10-15% of cesarean deliveries. External cephalic version is an effective intervention that can decrease the need for cesarean delivery; however, timely identification of breech presentation is required. We hypothesized that women with a fetus in a breech presentation that is diagnosed after 38 weeks' estimated gestational age have a decreased likelihood of external cephalic version attempted and an increased likelihood of cesarean delivery. This was a retrospective cohort study. A chart review was performed for 251 women with breech presentation at term presenting to our tertiary referral university hospital for external cephalic version, cesarean for breech presentation, or vaginal breech delivery. Vaginal delivery was significantly more likely (31.1% vs 12.5%; P<.01) in women with breech presentation diagnosed before 38 weeks' estimated gestational age as external cephalic version was offered, and subsequently attempted in a greater proportion of women diagnosed before 38 weeks. External cephalic version was more successful when performed by physicians with greater procedural volume during the 3.5 year period of the study (59.1% for providers performing at least 10 procedures vs 31.3% if performing fewer than 10 procedures, P<.01). Results support the need for interventions to increase timely diagnosis of breech presentation as well as improved patient counseling and use of experienced providers for external cephalic version. © 2017 Wiley Periodicals, Inc.
Testing and Validating Gadget2 for GPUs
NASA Astrophysics Data System (ADS)
Wibking, Benjamin; Holley-Bockelmann, K.; Berlind, A. A.
2013-01-01
We are currently upgrading a version of Gadget2 (Springel et al., 2005) that is optimized for NVIDIA's CUDA GPU architecture (Frigaard, unpublished) to work with the latest libraries and graphics cards. Preliminary tests of its performance indicate a ~40x speedup in the particle force tree approximation calculation, with overall speedup of 5-10x for cosmological simulations run with GPUs compared to running on the same CPU cores without GPU acceleration. We believe this speedup can be reasonably increased by an additional factor of two with further optimization, including overlap of computation on CPU and GPU. Tests of single-precision GPU numerical fidelity currently indicate accuracy of the mass function and the spectral power density to within a few percent of extended-precision CPU results with the unmodified form of Gadget. Additionally, we plan to test and optimize the GPU code for Millennium-scale "grand challenge" simulations of >10^9 particles, a scale that has been previously untested with this code, with the aid of the NSF XSEDE flagship GPU-based supercomputing cluster codenamed "Keeneland." Current work involves additional validation of numerical results, extending the numerical precision of the GPU calculations to double precision, and evaluating performance/accuracy tradeoffs. We believe that this project, if successful, will yield substantial computational performance benefits to the N-body research community as the next generation of GPU supercomputing resources becomes available, both increasing the electrical power efficiency of ever-larger computations (making simulations possible a decade from now at scales and resolutions unavailable today) and accelerating the pace of research in the field.
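The particle-force calculation being accelerated is, at its core, a softened pairwise gravity sum that the tree code approximates; a direct O(N^2) reference version (illustrative, not Gadget2's implementation):

```python
def accelerations(pos, mass, eps):
    """Direct-summation softened gravitational accelerations (G = 1),
    the kernel a tree code approximates and the part that maps well to
    GPUs. Plummer softening eps avoids the r -> 0 singularity."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = d[0] ** 2 + d[1] ** 2 + d[2] ** 2 + eps ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += mass[j] * d[k] * inv_r3
    return acc

# Two unit masses separated by 2 units (eps = 0): |a| = m / r^2 = 0.25.
a = accelerations([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]], [1.0, 1.0], eps=0.0)
print(a[0][0])  # 0.25
```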
Motivational and metacognitive feedback in SQL-Tutor*
NASA Astrophysics Data System (ADS)
Hull, Alison; du Boulay, Benedict
2015-04-01
Motivation and metacognition are strongly intertwined, with learners high in self-efficacy more likely to use a variety of self-regulatory learning strategies, as well as to persist longer on challenging tasks. The aim of the research was to improve the learner's focus on the process and experience of problem-solving while using an Intelligent Tutoring System (ITS) and including motivational and metacognitive feedback based on the learner's past states and experiences. An existing ITS, SQL-Tutor, was used with first-year undergraduates studying a database module. The study used two versions of SQL-Tutor: the Control group used a base version providing domain feedback and the Study group used an extended version that also provided motivational and metacognitive feedback. This paper summarises the pre- and post-process results. Comparisons between groups showed some differing trends both in learning outcomes and behaviour in favour of the Study group.
FRAMES Metadata Reporting Templates for Ecohydrological Observations, version 1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christianson, Danielle; Varadharajan, Charuleka; Christoffersen, Brad
FRAMES is a set of Excel metadata files and package-level descriptive metadata designed to facilitate and improve the capture of desired metadata for ecohydrological observations. The metadata are bundled with data files into a data package and submitted to a data repository (e.g., the NGEE Tropics Data Repository) via a web form. FRAMES standardizes reporting of diverse ecohydrological and biogeochemical data for synthesis across a range of spatiotemporal scales and incorporates many best data science practices. This version of FRAMES supports observations for primarily automated measurements collected by permanently located sensors, including sap flow (tree water use), leaf surface temperature, soil water content, dendrometry (stem diameter growth increment), and solar radiation. Version 1.1 extends the controlled vocabulary and incorporates functionality to facilitate programmatic use of data and FRAMES metadata (R code available at the NGEE Tropics Data Repository).
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. Similar as in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
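The deterministic TTB baseline against which the probabilistic variant is compared can be stated in a few lines: inspect cues in descending validity and let the first discriminating cue decide. A sketch with hypothetical cue names:

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Deterministic take-the-best (TTB): inspect binary cues in
    descending validity; the first cue that discriminates between the
    two options decides. Returns 'A', 'B', or 'guess' if no cue
    discriminates. (The probabilistic version discussed above would
    additionally attach an error probability to each cue.)"""
    for cue in validity_order:
        a, b = cues_a[cue], cues_b[cue]
        if a != b:
            return 'A' if a > b else 'B'
    return 'guess'

# Hypothetical city-size inference with three binary cues.
order = ['capital', 'airport', 'university']
a = {'capital': 0, 'airport': 1, 'university': 1}
b = {'capital': 0, 'airport': 0, 'university': 1}
print(take_the_best(a, b, order))  # 'A': 'airport' is the first discriminating cue
```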
Comparing Student Performance on the Old vs New Versions of the NAPLEX.
Welch, Adam C; Karpen, Samuel C
2018-04-01
Objective. To determine whether the new 2016 version of the North American Pharmacist Licensure Examination (NAPLEX) affected scores when controlling for student performance on other measures, using data from one institution. Methods. There were 201 records from the classes of 2014-2016. Doubly robust estimation using weighted propensity scores was used to compare NAPLEX scaled scores and pass rates while accounting for student performance on other measures. Of the potential predictors of student performance considered (Pharmacy Curricular Outcomes Assessment (PCOA) scores, scaled composite scores from the Pharmacy College Admission Test (PCAT), and P3 grade point average (GPA)), only PCOA and P3 GPA were found to be appropriate for propensity scoring. Results. The weighted NAPLEX scaled scores did not drop significantly from the old (2014-2015) to the new (2016) version of the NAPLEX. The change in pass rates between the new and old versions was also non-significant. Conclusion. Using data from one institution, the new version of the NAPLEX itself did not have a significant effect on NAPLEX scores or first-time pass rates when controlling for student performance on other measures. Colleges are encouraged to repeat this analysis with pooled data and larger sample sizes.
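Doubly robust estimation pairs an outcome model with propensity weighting; the weighting half alone can be sketched as follows (toy data, not the study's records):

```python
def ipw_mean_difference(scores, treated, propensity):
    """Inverse-propensity-weighted (Hajek) difference in mean scores
    between a 'new version' group (treated = 1) and an 'old version'
    group (treated = 0). Propensities are assumed to be estimated
    elsewhere, e.g. from PCOA and P3 GPA; values here are illustrative."""
    y_t = [s / p for s, t, p in zip(scores, treated, propensity) if t]
    w_t = [1.0 / p for s, t, p in zip(scores, treated, propensity) if t]
    y_c = [s / (1 - p) for s, t, p in zip(scores, treated, propensity) if not t]
    w_c = [1.0 / (1 - p) for s, t, p in zip(scores, treated, propensity) if not t]
    return sum(y_t) / sum(w_t) - sum(y_c) / sum(w_c)

# With constant propensities (0.5) this reduces to the raw mean difference.
d = ipw_mean_difference([100, 90, 80, 70], [1, 1, 0, 0], [0.5] * 4)
print(d)  # 20.0
```

A full doubly robust estimator would add regression-based outcome predictions; the weighting term above is what the propensity scores feed.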
Grau-Guinea, L; Pérez-Enríquez, C; García-Escobar, G; Arrondo-Elizarán, C; Pereira-Cutiño, B; Florido-Santiago, M; Piqué-Candini, J; Planas, A; Paez, M; Peña-Casanova, J; Sánchez-Benavides, G
2018-05-08
The Free and Cued Selective Reminding Test (FCSRT) is widely used for the assessment of verbal episodic memory, mainly in patients with Alzheimer disease. A Spanish version of the FCSRT and normative data were developed within the NEURONORMA project. Availability of alternative, equivalent versions is useful for following patients up in clinical settings. This study aimed to develop an alternative version of the original FCSRT (version B) and to study its equivalence to the original Spanish test (version A), and its performance in a sample of healthy individuals, in order to develop reference data. We evaluated 232 healthy participants of the NEURONORMA-Plus project, aged between 18 and 90. Thirty-three participants were assessed with both versions using a counterbalanced design. High intra-class correlation coefficients (between 0.8 and 0.9) were observed in the equivalence study. While no significant differences in performance were observed in total recall scores, free recall scores were significantly lower for version B. These preliminary results suggest that the newly developed FCSRT version B is equivalent to version A in the main variables tested. Further studies are necessary to ensure interchangeability between versions. We provide normative data for the new version. Copyright © 2018 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Dual PECCS: a cognitive system for conceptual representation and categorization
NASA Astrophysics Data System (ADS)
Lieto, Antonio; Radicioni, Daniele P.; Rho, Valentina
2017-03-01
In this article we present an advanced version of Dual-PECCS, a cognitively inspired knowledge representation and reasoning system aimed at extending the capabilities of artificial systems in conceptual categorization tasks. It combines different sorts of common-sense categorization (prototype- and exemplar-based categorization) with standard monotonic categorization procedures. These different types of inferential procedures are reconciled according to the tenets of the dual process theory of reasoning. From a representational perspective, the system relies on the hypothesis of conceptual structures represented as heterogeneous proxytypes. Dual-PECCS has been experimentally assessed in a conceptual categorization task in which a target concept, illustrated by a simple common-sense linguistic description, had to be identified by resorting to a mix of categorization strategies, and its output has been compared to human responses. The obtained results suggest that our approach can be beneficial for improving the representational and reasoning conceptual capabilities of standard cognitive artificial systems and, in addition, that it may plausibly be applied to different general computational models of cognition. The current version of the system extends our previous work in that Dual-PECCS is now integrated and tested in two cognitive architectures, ACT-R and CLARION, which implement different assumptions about the underlying invariant structures governing human cognition. This integration allowed us to extend our previous evaluation.
The Affective Establishment and Maintenance of Vygotsky's Zone of Proximal Development
ERIC Educational Resources Information Center
Levykh, Michael G.
2008-01-01
Many recent articles, research papers, and conference presentations about Lev Vygotsky's zone of proximal development (ZPD) emphasize the "extended" version of the ZPD that reflects human emotions and desires. In this essay, Michael G. Levykh expands on the extant literature on the ZPD through developing several new ideas. First, he maintains that…
Unit: Science and Safety, Inspection Set, National Trial Print.
ERIC Educational Resources Information Center
Australian Science Education Project, Toorak, Victoria.
This unit, a trial version prepared by the Australian Science Education Project, is intended to create in students an awareness of the potential hazards of a science room, to help build confidence by teaching safe techniques of apparatus manipulation, and to demonstrate the utility of planning work thoroughly. The safety principles are extended to…
Toward an Intercultural Stance: Teaching German and English through Telecollaboration
ERIC Educational Resources Information Center
Ware, Paige D.; Kramsch, Claire
2005-01-01
We discuss the challenges of Web-based teaching for language teachers and then describe in detail an extended episode of misunderstanding that occurred between 2 students discussing their versions of history during a classroom-based, asynchronous telecollaborative project between learners of German in the United States and learners of English in…
3D Numerical Simulation of Turbulent Buoyant Flow and Heat Transport in a Curved Open Channel
USDA-ARS?s Scientific Manuscript database
A three-dimensional buoyancy-extended version of the kappa-epsilon turbulence model was developed for simulating turbulent flow and heat transport in a curved open channel. The density-induced buoyant force was included in the model, and the influence of temperature stratification on the flow field was...
Subjective Vitality and Patterns of Acculturation: Four Cases
ERIC Educational Resources Information Center
Ehala, Martin; Vedernikova, Elena
2015-01-01
The article presents a comparative analysis of the subjective vitalities (SVs) of the minority groups of Latvia (Russian-speakers), Lithuania (Russian-speakers and Poles) and Mari El (Maris) in the Russian Federation, with a particular focus on the Mari case. The same extended version of the SV questionnaire was used in quantitative surveys in all…
The Shock and Vibration Digest. Volume 15, Number 11
1983-11-01
concept of eigenstrain and the extended version of Eshelby's method of equivalent inclusion. The displacement field due to the presence of the inhomogeneity is given in terms of the eigenstrains. Dept. of Mech. Engrg., Univ. of Washington, Seattle, WA, Rept. No. UWA... (AD-A126 444). Key Words: Crack propagation, Crack branching
ESP v2.0: Improved method for projecting U.S. GHG and air pollution emissions through 2055
This product includes both a presentation and an extended abstract. We describe the Emission Scenario Projection (ESP) method, version 2.0. ESP is used to develop multi-decadal projections of U.S. greenhouse gas (GHG) and criteria pollutant emissions. The resulting future-year em...
From 2002-2017, to What Extent has Turkish Security Policy Been Effective
2017-04-13
and equal to those of the Foreign, Energy, and Culture and Tourism ministries combined. Turkey's use of overseas aid, export of its version of...with about 57 percent of its natural gas. Economic ties also extended to nuclear power, construction, tourism, and other sectors as well.
Evaluating real-time Java for mission-critical large-scale embedded systems
NASA Technical Reports Server (NTRS)
Sharp, D. C.; Pla, E.; Luecke, K. R.; Hassan, R. J.
2003-01-01
This paper describes benchmarking results on an RT JVM. This paper extends previously published results by including additional tests, by being run on a recently available pre-release version of the first commercially supported RTSJ implementation, and by assessing results based on our experience with avionics systems in other languages.
Resolving Controlled Vocabulary in DITA Markup: A Case Example in Agroforestry
ERIC Educational Resources Information Center
Zschocke, Thomas
2012-01-01
Purpose: This paper aims to address the issue of matching controlled vocabulary on agroforestry from knowledge organization systems (KOS) and incorporating these terms in DITA markup. The paper is an extended version of a contribution selected from MTSR'11. Design/methodology/approach: After a general description of the steps taken to harmonize controlled…
Developing Global Competences by Extended Chemistry Concept Maps
ERIC Educational Resources Information Center
Celestino, Teresa; Piumetti, Marco
2015-01-01
This work focuses on a possible teaching approach to promote chemistry learning for students during the first two years of technical high school in Italy (age 14-15). Critical thinking skills can be developed by integrating two different curriculum designs, converging in a novel didactic approach, a modified version of the Systemic Approach to…
Terminology of European Education and Training Policy: A Selection of 130 Key Terms. Second Edition
ERIC Educational Resources Information Center
Cedefop - European Centre for the Development of Vocational Training, 2014
2014-01-01
This multilingual glossary defines 130 key terms used in European education and training policy. It is an extended and updated version of "Terminology of European education and training policy" (2008) and "Terminology of vocational training policy" (2004). It considers new priorities of European union policy, mainly in skills…
ERIC Educational Resources Information Center
Yang, Lan; Arens, A. Katrin; Watkins, David A.
2016-01-01
In order to extend previous research on the twofold multidimensionality of academic self-concept (i.e. its domain-specific structure and its separation into competence and affect components), the present study tests its generalisability among vocational students from mainland China. A Chinese version of the Self-Description Questionnaire I was…
Framework to parameterize and validate APEX to support deployment of the nutrient tracking tool
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy Environmental eXtender (APEX) model is the scientific basis for the Nutrient Tracking Tool (NTT). NTT is an enhanced version of the Nitrogen Trading Tool, a user-friendly web-based computer program originally developed by the USDA. NTT was developed to estimate reductions in...
Elmer, Lawrence W; Juncos, Jorge L; Singer, Carlos; Truong, Daniel D; Criswell, Susan R; Parashos, Sotirios; Felt, Larissa; Johnson, Reed; Patni, Rajiv
2018-04-01
An Online First version of this article was made available online at http://link.springer.com/journal/40263/onlineFirst/page/1 on 12 March 2018. An error was subsequently identified in the article, and the following correction should be noted.
Full-f version of GENE for turbulence in open-field-line systems
NASA Astrophysics Data System (ADS)
Pan, Q.; Told, D.; Shi, E. L.; Hammett, G. W.; Jenko, F.
2018-06-01
Unique properties of plasmas in the tokamak edge, such as large amplitude fluctuations and plasma-wall interactions in the open-field-line regions, require major modifications of existing gyrokinetic codes originally designed for simulating core turbulence. To this end, the global version of the 3D2V gyrokinetic code GENE, so far employing a δf-splitting technique, is extended to simulate electrostatic turbulence in straight open-field-line systems. The major extensions are the inclusion of the velocity-space nonlinearity, the development of a conducting-sheath boundary, and the implementation of the Lenard-Bernstein collision operator. With these developments, the code can be run as a full-f code and can handle particle loss to and reflection from the wall. The extended code is applied to modeling turbulence in the Large Plasma Device (LAPD), with a reduced mass ratio and a much lower collisionality. Similar to turbulence in a tokamak scrape-off layer, LAPD turbulence involves collisions, parallel streaming, cross-field turbulent transport with steep profiles, and particle loss at the parallel boundary.
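A commonly used form of the Lenard-Bernstein model operator implemented above (written here for the parallel velocity only; normalization conventions vary between codes) relaxes the distribution toward a Maxwellian:

```latex
\mathcal{C}[f] \;=\; \nu \,\frac{\partial}{\partial v_\parallel}
\!\left( v_\parallel\, f \;+\; v_t^{2}\,\frac{\partial f}{\partial v_\parallel} \right),
\qquad
\mathcal{C}[f_M] = 0 \;\;\text{for}\;\;
f_M \propto \exp\!\left(-\frac{v_\parallel^{2}}{2 v_t^{2}}\right),
```

where ν is the collision frequency and v_t the thermal speed. The drag term pulls velocities toward zero while the diffusion term spreads them, and the two balance exactly on the Maxwellian.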
Shohaimi, Shamarina; Wei, Wong Yoke; Shariff, Zalilah Mohd
2014-01-01
Comprehensive feeding practices questionnaire (CFPQ) is an instrument specifically developed to evaluate parental feeding practices. It has been confirmed among children in America and applied to populations in France, Norway, and New Zealand. In order to extend the application of CFPQ, we conducted a factor structure validation of the translated version of CFPQ (CFPQ-M) using confirmatory factor analysis among mothers of primary school children (N = 397) in Malaysia. Several items were modified for cultural adaptation. Of 49 items, 39 items with loading factors >0.40 were retained in the final model. The confirmatory factor analysis revealed that the final model (twelve-factor model with 39 items and 2 error covariances) displayed the best fit for our sample (Chi-square = 1147; df = 634; P < 0.05; CFI = 0.900; RMSEA = 0.045; SRMR = 0.0058). The instrument with some modifications was confirmed among mothers of school children in Malaysia. The present study extends the usability of the CFPQ and enables researchers and parents to better understand the relationships between parental feeding practices and related problems such as childhood obesity.
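The reported RMSEA can be reproduced from the chi-square statistic, degrees of freedom, and sample size via the standard point estimate:

```python
import math

def rmsea(chi_sq, df, n):
    """Point estimate of RMSEA from a chi-square model test:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))

# Values reported above: chi-square = 1147, df = 634, N = 397.
print(round(rmsea(1147, 634, 397), 3))  # 0.045, matching the abstract
```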
SDA 7: A modular and parallel implementation of the simulation of diffusional association software
Martinez, Michael; Romanowska, Julia; Kokh, Daria B.; Ozboyaci, Musa; Yu, Xiaofeng; Öztürk, Mehmet Ali; Richter, Stefan
2015-01-01
The simulation of diffusional association (SDA) Brownian dynamics software package has been widely used in the study of biomacromolecular association. Initially developed to calculate bimolecular protein–protein association rate constants, it has since been extended to study electron transfer rates, to predict the structures of biomacromolecular complexes, to investigate the adsorption of proteins to inorganic surfaces, and to simulate the dynamics of large systems containing many biomacromolecular solutes, allowing the study of concentration‐dependent effects. These extensions have led to a number of divergent versions of the software. In this article, we report the development of the latest version of the software (SDA 7). This release was developed to consolidate the existing codes into a single framework, while improving the parallelization of the code to better exploit modern multicore shared memory computer architectures. It is built using a modular object‐oriented programming scheme, to allow for easy maintenance and extension of the software, and includes new features, such as adding flexible solute representations. We discuss a number of application examples, which describe some of the methods available in the release, and provide benchmarking data to demonstrate the parallel performance. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26123630
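The elementary move in Brownian dynamics of the Ermak-McCammon type (without hydrodynamic interactions) is a drift term plus Gaussian noise; a one-dimensional sketch (not SDA's code), checked against equipartition:

```python
import random

def bd_step(x, force, D, dt, kT, rng):
    """One Brownian-dynamics step without hydrodynamic interactions:
    dx = (D / kT) * F(x) * dt + sqrt(2 * D * dt) * N(0, 1)."""
    return x + (D / kT) * force(x) * dt + (2.0 * D * dt) ** 0.5 * rng.gauss(0.0, 1.0)

# Toy check with a harmonic spring F = -k x: for small dt, the long-run
# mean-square displacement should approach kT / k (equipartition).
rng = random.Random(1)
k, kT, D, dt = 2.0, 1.0, 1.0, 1e-3
spring = lambda y: -k * y
x, total, count = 0.0, 0.0, 0
for i in range(200_000):
    x = bd_step(x, spring, D, dt, kT, rng)
    if i >= 20_000:  # discard burn-in before accumulating statistics
        total += x * x
        count += 1
mean_sq = total / count
print(mean_sq)  # close to kT/k = 0.5
```

SDA propagates many interacting solutes in 3D with rotational moves and reaction criteria on top of this basic step; the sketch only shows the core update.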
A framework for expanding aqueous chemistry in the ...
This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM-KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM-KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from bio…
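The reason cloud chemistry needs a stiff (Rosenbrock-type) solver is that fast and slow reactions coexist, so explicit schemes are unstable at practical step sizes. A minimal illustration with backward Euler on a linear stiff system (a generic stand-in for the KPP-generated Rodas3 solver, not CMAQ code):

```python
def backward_euler_2x2(A, y0, dt, n_steps):
    """Backward-Euler integration of the linear system y' = A y:
    each step solves (I - dt*A) y_new = y_old, here by Cramer's rule.
    Implicit linear-algebra steps like this are what let stiff solvers
    stay stable where explicit schemes would need tiny time steps."""
    (a, b), (c, d) = A
    m11, m12 = 1.0 - dt * a, -dt * b
    m21, m22 = -dt * c, 1.0 - dt * d
    det = m11 * m22 - m12 * m21
    y1, y2 = y0
    for _ in range(n_steps):
        y1, y2 = ((m22 * y1 - m12 * y2) / det,
                  (m11 * y2 - m21 * y1) / det)
    return y1, y2

# Stiff toy: a fast mode (rate 1000) slaved to a slow one (rate 1).
# Exact solution from y0 = (1, 1) is y1 = y2 = exp(-t).
A = ((-1000.0, 999.0), (0.0, -1.0))
y = backward_euler_2x2(A, (1.0, 1.0), dt=0.01, n_steps=100)  # t = 1
print(y)  # both components close to exp(-1) = 0.368
```

At dt = 0.01 the fast mode gives dt*lambda = 10, where forward Euler would diverge; backward Euler tracks the slow solution without shrinking the step.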
Hybrid Parallel Contour Trees, Version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sewell, Christopher; Fasel, Patricia; Carr, Hamish
A common operation in scientific visualization is to compute and render a contour of a data set. Given a function of the form f : R^d -> R, a level set is defined as an inverse image f^-1(h) for an isovalue h, and a contour is a single connected component of a level set. The Reeb graph can then be defined as the result of contracting each contour to a single point, and is well defined for Euclidean spaces or for general manifolds. For simple domains, the graph is guaranteed to be a tree, and is called the contour tree. Analysis can then be performed on the contour tree in order to identify isovalues of particular interest, based on various metrics, and render the corresponding contours, without having to know such isovalues a priori. This code is intended to be the first data-parallel algorithm for computing contour trees. Our implementation will use the portable data-parallel primitives provided by Nvidia's Thrust library, allowing us to compile our same code for both GPUs and multi-core CPUs. Native OpenMP and purely serial versions of the code will likely also be included. It will also be extended to provide a hybrid data-parallel / distributed algorithm, allowing scaling beyond a single GPU or CPU.
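A building block of merge-tree (and hence contour-tree) construction is tracking connected components of superlevel sets as the isovalue sweeps downward; at a fixed isovalue this is a union-find pass (a serial sketch, not the data-parallel Thrust implementation):

```python
def count_superlevel_components(grid, isovalue):
    """Count connected components of the superlevel set {f >= h} on a
    2D grid (4-connectivity) using union-find. Sweeping h over all data
    values and recording component births and merges is the essence of
    merge-tree construction."""
    rows, cols = len(grid), len(grid[0])
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(rows):
        for j in range(cols):
            if grid[i][j] >= isovalue:
                parent[(i, j)] = (i, j)
    for (i, j) in list(parent):
        for nbr in ((i + 1, j), (i, j + 1)):
            if nbr in parent:
                union((i, j), nbr)
    return len({find(p) for p in parent})

# Two separate "peaks" at h = 2 merge into one blob at h = 1.
g = [[0, 2, 0, 0],
     [1, 2, 1, 2],
     [0, 1, 1, 2]]
print(count_superlevel_components(g, 2))  # 2
print(count_superlevel_components(g, 1))  # 1
```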
Developmental Progression of Looking and Reaching Performance on the A-not-B Task
Cuevas, Kimberly; Bell, Martha Ann
2013-01-01
From a neuropsychological perspective, the cognitive skills of working memory, inhibition, and attention and the maturation of the frontal lobe are requisites for successful A-not-B performance on both the looking and reaching versions of the task. This study used a longitudinal design to examine the developmental progression of infants’ performance on the looking and reaching versions of the A-not-B task. Twenty infants were tested on both versions of the task once a month from 5 to 10 months of age. Infants had higher object permanence scores on the looking version of the task from 5 to 8 months, with comparable performance across response modalities at 9 and 10 months. The same pattern of performance was found on nonreversal (A) trials: Infants performed better on looking trials from 5 to 7 months and they performed equally on both response trials from 8 to 10 months. Overall, infants performed better on looking reversal (B) trials than reaching reversal trials. These data suggest that performance differences between response modalities early in development can be attributed to major differences in the maturation of brain circuitry associated with the actual task response. PMID:20822245
Unstructured Euler flow solutions using hexahedral cell refinement
NASA Technical Reports Server (NTRS)
Melton, John E.; Cappuccio, Gelsomina; Thomas, Scott D.
1991-01-01
An attempt is made to extend grid refinement into three dimensions by using unstructured hexahedral grids. The flow solver is developed using the TIGER (Topologically Independent Grid, Euler Refinement) code as the starting point. The program uses an unstructured hexahedral mesh and a modified version of the Jameson four-stage, finite-volume Runge-Kutta algorithm for integration of the Euler equations. The unstructured mesh allows for local refinement appropriate for each freestream condition, thereby concentrating mesh cells in the regions of greatest interest. This increases the computational efficiency because the refinement is not required to extend throughout the entire flow field.
The finite cell method for polygonal meshes: poly-FCM
NASA Astrophysics Data System (ADS)
Duczek, Sascha; Gabbert, Ulrich
2016-10-01
In the current article, we extend the two-dimensional version of the finite cell method (FCM), which has so far only been used for structured quadrilateral meshes, to unstructured polygonal discretizations. Therefore, the adaptive quadtree-based numerical integration technique is reformulated and the notion of generalized barycentric coordinates is introduced. We show that the resulting polygonal (poly-)FCM approach retains the optimal rates of convergence if and only if the geometry of the structure is adequately resolved. The main advantage of the proposed method is that it inherits the ability of polygonal finite elements for local mesh refinement and for the construction of transition elements (e.g. conforming quadtree meshes without hanging nodes). These properties along with the performance of the poly-FCM are illustrated by means of several benchmark problems for both static and dynamic cases.
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2006-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2004-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
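The scheme described in these two records can be mocked up in a few lines: two waveform states with identical power but different probability density functions, decided purely from accumulated sample statistics. The pdf pair and the kurtosis threshold below are illustrative choices, not taken from the NASA reports:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_symbol(bit, n=4000):
    """Two waveform states with equal (unit) power but different pdfs:
    bit 0 -> uniform samples, bit 1 -> Gaussian samples."""
    if bit == 0:
        return rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n)  # variance 1
    return rng.normal(0.0, 1.0, n)

def detect(symbol):
    """Decide the bit from sample statistics alone: the excess kurtosis
    of a uniform pdf is -1.2, of a Gaussian 0; threshold halfway."""
    m2 = np.mean(symbol**2)
    m4 = np.mean(symbol**4)
    excess_kurtosis = m4 / m2**2 - 3.0
    return 0 if excess_kurtosis < -0.6 else 1

bits = [0, 1, 1, 0, 1]
decoded = [detect(make_symbol(b)) for b in bits]
print(decoded == bits)  # → True
```

With 4000 samples per symbol, the sampling error of the kurtosis estimate is tiny compared with the 1.2 separation between states, which is why simple statistical filtering suffices at the receiver.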
Solving free-plasma-boundary problems with the SIESTA MHD code
NASA Astrophysics Data System (ADS)
Sanchez, R.; Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Tribaldos, V.; Geiger, J.; Hirshman, S. P.; Cianciosa, M.
2017-10-01
SIESTA is a recently developed MHD equilibrium code designed to perform fast and accurate calculations of ideal MHD equilibria for 3D magnetic configurations. It is an iterative code that uses the solution obtained by the VMEC code to provide a background coordinate system and an initial guess of the solution. The final solution that SIESTA finds can exhibit magnetic islands and stochastic regions. In its original implementation, SIESTA addressed only fixed-boundary problems. This fixed boundary condition somewhat restricts its possible applications. In this contribution we describe a recent extension of SIESTA that enables it to address free-plasma-boundary situations, opening up the possibility of investigating problems with SIESTA in which the plasma boundary is perturbed either externally or internally. As an illustration, the extended version of SIESTA is applied to a configuration of the W7-X stellarator.
Fault tolerance in an inner-outer solver: A GVR-enabled case study
Zhang, Ziming; Chien, Andrew A.; Teranishi, Keita
2015-04-18
Resilience is a major challenge for large-scale systems. It is particularly important for iterative linear solvers, since they account for much of the run time of many scientific applications. We show that single-bit-flip errors in the Flexible GMRES iterative linear solver can lead to high computational overhead or even failure to converge to the right answer. Informed by these results, we design and evaluate several strategies for fault tolerance in both inner and outer solvers, appropriate across a range of error rates. We implement them by extending Trilinos' solver library with the Global View Resilience (GVR) programming model, which provides multi-stream snapshots and multi-version data structures with portable, rich error checking and recovery. Finally, experimental results validate correct execution with low performance overhead under varied error conditions.
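Why a single bit flip can derail a solver is easy to demonstrate: in IEEE-754 double precision, the damage depends entirely on which bit flips. The helper below is a hypothetical illustration, not part of Trilinos or GVR:

```python
import math
import struct

def flip_bit(x, k):
    """Flip bit k (0 = least significant) of a float64's IEEE-754
    representation, mimicking a single-event-upset bit flip."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    bits ^= 1 << k
    (y,) = struct.unpack("<d", struct.pack("<Q", bits))
    return y

# A flip in the low mantissa barely perturbs the value; a flip in the
# exponent's most significant bit is catastrophic (1.0 becomes +inf).
print(flip_bit(1.0, 2))                 # ~1.0 + 9e-16
print(math.isinf(flip_bit(1.0, 62)))    # → True
```

This is why undetected flips in solver state can either pass unnoticed, silently corrupt the answer, or blow up convergence entirely, motivating the checkpoint/recovery strategies the record describes.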
Systems of fuzzy equations in structural mechanics
NASA Astrophysics Data System (ADS)
Skalna, Iwona; Rama Rao, M. V.; Pownuk, Andrzej
2008-08-01
Systems of linear and nonlinear equations with fuzzy parameters are relevant to many practical problems arising in structural mechanics, electrical engineering, finance, economics and physics. In this paper three methods for solving such equations are discussed: a method for the outer interval solution of systems of linear equations depending linearly on interval parameters, the fuzzy finite element method proposed by Rama Rao, and a sensitivity analysis method. The performance and advantages of the presented methods are described with illustrative examples. An extended version of the present paper can be downloaded from the web page of the UTEP [I. Skalna, M.V. Rama Rao, A. Pownuk, Systems of fuzzy equations in structural mechanics, The University of Texas at El Paso, Department of Mathematical Sciences Research Reports Series,
Solving large scale traveling salesman problems by chaotic neurodynamics.
Hasegawa, Mikio; Ikeguch, Tohru; Aihara, Kazuyuki
2002-03-01
We propose a novel approach for solving large scale traveling salesman problems (TSPs) by chaotic dynamics. First, we realize the tabu search on a neural network, by utilizing the refractory effects as the tabu effects. Then, we extend it to a chaotic neural network version. We propose two types of chaotic searching methods, which are based on two different tabu searches. While the first one requires neurons of the order of n² for an n-city TSP, the second one requires only n neurons. Moreover, an automatic parameter tuning method of our chaotic neural network is presented for easy application to various problems. Finally, we show that our method with n neurons is applicable to large TSPs such as an 85,900-city problem and exhibits better performance than the conventional stochastic searches and the tabu searches.
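The tabu mechanism that this record realizes with chaotic refractory dynamics can be sketched in conventional form as a 2-opt tabu search. This is a plain illustration of the search idea, not the authors' chaotic neural network, and all names below are illustrative:

```python
import itertools
import math
import random

def tour_length(tour, dist):
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def tabu_2opt(dist, iters=200, tenure=5, seed=1):
    """Best-candidate tabu search over 2-opt moves. The tabu tenure
    plays the role the paper assigns to neural refractory effects:
    a recently used move is temporarily forbidden, forcing exploration."""
    n = len(dist)
    rng = random.Random(seed)
    tour = list(range(n)); rng.shuffle(tour)
    best, best_len = list(tour), tour_length(tour, dist)
    tabu = {}  # move -> last iteration at which it is still forbidden
    for it in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(n), 2):
            if tabu.get((i, j), -1) >= it:
                continue  # refractory period: skip tabu moves
            new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            candidates.append((tour_length(new, dist), (i, j), new))
        length, move, tour = min(candidates)
        tabu[move] = it + tenure
        if length < best_len:
            best_len, best = length, list(tour)
    return best, best_len

# 8 cities on a unit circle: the optimal tour is the octagon perimeter.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
best, best_len = tabu_2opt(dist)
print(round(best_len, 3))  # perimeter of the regular octagon, ~6.123
```

For points in convex position, every non-crossing tour is the perimeter tour, so 2-opt with tabu diversification reliably finds the optimum on this toy instance; the paper's contribution is making such search scale to instances five orders of magnitude larger.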
A finite difference Hartree-Fock program for atoms and diatomic molecules
NASA Astrophysics Data System (ADS)
Kobus, Jacek
2013-03-01
The newest version of the two-dimensional finite difference Hartree-Fock program for atoms and diatomic molecules is presented. This is an updated and extended version of the program published in this journal in 1996. It can be used to obtain reference, Hartree-Fock limit values of total energies and multipole moments for a wide range of diatomic molecules and their ions in order to calibrate existing and develop new basis sets, calculate (hyper)polarizabilities (αzz, βzzz, γzzzz, Az,zz, Bzz,zz) of atoms, homonuclear and heteronuclear diatomic molecules and their ions via the finite field method, perform DFT-type calculations using LDA or B88 exchange functionals and LYP or VWN correlation ones or the self-consistent multiplicative constant method, perform one-particle calculations with (smooth) Coulomb and Kramers-Henneberger potentials and take account of finite nucleus models. The program is easy to install and compile (tarball+configure+make) and can be used to perform calculations within double- or quadruple-precision arithmetic. Catalogue identifier: ADEB_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADEB_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 2 No. of lines in distributed program, including test data, etc.: 171196 No. of bytes in distributed program, including test data, etc.: 9481802 Distribution format: tar.gz Programming language: Fortran 77, C. Computer: any 32- or 64-bit platform. Operating system: Unix/Linux. RAM: Case dependent, from a few MB to many GB Classification: 16.1. Catalogue identifier of previous version: ADEB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 98(1996)346 Does the new version supersede the previous version?: Yes Nature of problem: The program finds virtually exact solutions of the Hartree-Fock and density functional theory type equations for atoms, diatomic molecules and their ions. 
The lowest energy eigenstates of a given irreducible representation and spin can be obtained. The program can be used to perform one-particle calculations with (smooth) Coulomb and Kramers-Henneberger potentials and also DFT-type calculations using LDA or B88 exchange functionals and LYP or VWN correlation ones or the self-consistent multiplicative constant method. Solution method: Single-particle two-dimensional numerical functions (orbitals) are used to construct an antisymmetric many-electron wave function of the restricted open-shell Hartree-Fock model. The orbitals are obtained by solving the Hartree-Fock equations as coupled two-dimensional second-order (elliptic) partial differential equations (PDEs). The Coulomb and exchange potentials are obtained as solutions of the corresponding Poisson equations. The PDEs are discretized by the eighth-order central difference stencil on a two-dimensional single grid, and the resulting large and sparse system of linear equations is solved by the (multicolour) successive overrelaxation ((MC)SOR) method. The self-consistent-field iterations are interwoven with the (MC)SOR ones and orbital energies and normalization factors are used to monitor the convergence. The accuracy of solutions depends mainly on the grid and the system under consideration, which means that within double precision arithmetic one can obtain orbitals and energies having up to 12 significant figures. If more accurate results are needed, quadruple-precision floating-point arithmetic can be used. Reasons for new version: Additional features, many modifications and corrections, improved convergence rate, overhauled code and documentation. Summary of revisions: see ChangeLog found in tar.gz archive Restrictions: The present version of the program is restricted to 60 orbitals. The maximum grid size is determined at compilation time. Unusual features: The program uses two C routines for allocating and deallocating memory. 
Several BLAS (Basic Linear Algebra System) routines are emulated by the program. When possible they should be replaced by their library equivalents. Additional comments: automake and autoconf tools are required to build and compile the program; checked with f77, gfortran and ifort compilers Running time: Very case dependent - from a few CPU seconds for the H2 defined on a small grid up to several weeks for the Hartree-Fock-limit calculations for 40-50 electron molecules.
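The successive overrelaxation (SOR) iteration at the heart of the solution method above can be illustrated on a one-dimensional model problem. The program itself uses an eighth-order stencil on two-dimensional grids, so this is only the same idea in miniature, with illustrative parameter choices:

```python
import numpy as np

def sor_poisson_1d(f, h, omega=1.95, tol=1e-10, max_iter=20000):
    """Successive over-relaxation for -u'' = f on (0,1), u(0)=u(1)=0,
    discretized with the standard 3-point stencil. Each sweep updates
    u[i] toward its Gauss-Seidel value, over-relaxed by omega."""
    n = len(f)
    u = np.zeros(n)
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, n - 1):
            gauss_seidel = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
            change = omega * (gauss_seidel - u[i])
            u[i] += change
            max_change = max(max_change, abs(change))
        if max_change < tol:  # convergence monitored via update size
            break
    return u

# -u'' = pi^2 sin(pi x) has exact solution u = sin(pi x).
n = 201
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
u = sor_poisson_1d(np.pi**2 * np.sin(np.pi * x), h)
print(np.max(np.abs(u - np.sin(np.pi * x))) < 5e-4)  # → True
```

With omega near its optimal value (close to 2 for fine grids), SOR converges in far fewer sweeps than plain Gauss-Seidel; the multicolour variant mentioned in the record reorders the updates so they can run in parallel.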
How Do You Play? A Comparison among Children Aged 4–10
Delvecchio, Elisa; Li, Jian-Bin; Pazzagli, Chiara; Lis, Adriana; Mazzeschi, Claudia
2016-01-01
Pretend play has a central role for children's development and psychological well-being. However, there is a paucity of standardized and valid measures specifically devoted to assess the core domains involved in play activities in preschool and primary school children. The Affect in Play Scale-Preschool (4–5 years) and the Affect in Play Scale-Preschool Extended Version (6–10 years) are semi-structured parallel tools designed to explore child's cognitive and affective processes using a standardized play task. The current study administered this 5-min play task to 538 Italian children aged 4–10. The purposes were to compare play abilities in boys vs. girls and in preschool vs. primary school children, to correlate pretend play with divergent thinking and to evaluate the structural validity of the measure along the considered age span. No differences, except for Organization, were found between boys and girls, whereas school age children reported higher play abilities than the younger ones. External validity was assessed using correlational analysis with the divergent thinking task (the Alternate Uses Test) for preschoolers and primary school-aged children, in line with findings from MANOVA. Construct validity, assessed through the Confirmatory Factor Analysis, showed good fits for the two-factor model with cognitive and affective factor for both the Affect in Play Scale-Preschool and its Extended Version. A multi-group factor analysis suggested a partial invariance of the two-factor model across preschool (4–5 years old) and primary school-aged (6–10 years old) children. Results supported the use of the Affect in Play Scale-Preschool and its Extended Version as adequate measures to assess the interplay of cognitive and affective skills in preschool and school age children. 
The discussion highlights clinical and research implications linked to the possibility to have a unique play task able to assess child's affective and cognitive abilities throughout a quite wide life span (from 4 to 10 years old). PMID:27909423
Weiniger, Carolyn F; Ginosar, Yehuda; Elchalal, Uriel; Sharon, Einav; Nokrian, Malka; Ezra, Yossef
2007-12-01
To compare the success of external cephalic version using spinal analgesia with no analgesia among nulliparas. A prospective randomized controlled trial was performed in a tertiary referral center delivery suite. Nulliparous women at term requesting external cephalic version for breech presentation were randomized to receive spinal analgesia (7.5 mg bupivacaine) or no analgesia before the external cephalic version. An experienced obstetrician performed the external cephalic version. Primary outcome was successful conversion to vertex presentation. Seventy-four women were enrolled, and 70 analyzed (36 spinal, 34 no analgesia). Successful external cephalic version occurred among 24 of 36 (66.7%) women randomized to receive spinal analgesia compared with 11 of 34 (32.4%) without, P=.004 (95% confidence interval [CI] of the difference: 0.0954-0.5513). External cephalic version with spinal analgesia resulted in a lower visual analog pain score, 1.76±2.74 compared with 6.84±3.08 without, P<.001. A secondary analysis logistic regression model demonstrated that the odds of external cephalic version success were 4.0-fold higher when performed with spinal analgesia, P=.02 (95% CI, odds ratio [OR] 1.2-12.9). Complete breech presentation before attempting external cephalic version increased the odds of success 8.2-fold, P=.001 (95% CI, OR 2.2-30.3). Placental position, estimated fetal weight, and maternal weight did not contribute to the success rate when spinal analgesia was used. There were no cases of placental abruption or fetal distress. Administration of spinal analgesia significantly increases the success rate of external cephalic version among nulliparous women at term, which allows possible normal vaginal delivery. ClinicalTrials.gov, www.clinicaltrials.gov, NCT00119184 I.
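The unadjusted odds ratio implied by the counts reported above is easy to verify by hand; the abstract's 4.0-fold figure comes from the adjusted logistic regression model, so the raw value differs slightly:

```python
# Success counts reported in the abstract:
# 24/36 with spinal analgesia, 11/34 without.
a, b = 24, 36 - 24   # spinal group: successes, failures
c, d = 11, 34 - 11   # control group: successes, failures

# Unadjusted odds ratio = (a/b) / (c/d) = (a*d) / (b*c)
odds_ratio = (a / b) / (c / d)
print(round(odds_ratio, 2))  # → 4.18
```

The closeness of the crude value (4.18) to the adjusted estimate (4.0) suggests the covariates in the secondary model changed the effect estimate only modestly.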
Elias, Liana R; Köhler, Cristiano A; Stubbs, Brendon; Maciel, Beatriz R; Cavalcante, Lígia M; Vale, Antonio M O; Gonda, Xénia; Quevedo, João; Hyphantis, Thomas N; Soares, Jair C; Vieta, Eduard; Carvalho, André F
2017-04-01
The assessment of affective temperaments has provided useful insights for the psychopathological understanding of affective disorders and for the conceptualization of bipolar spectrum disorders. The Temperament Evaluation of Memphis, Pisa, Paris and San Diego (TEMPS) instrument has been widely used in research, yet its psychometric properties and optimal factor structure are unclear. The PubMed/MEDLINE, PsycINFO, and EMBASE electronic databases were searched from inception until March 15th, 2016. Validation peer-reviewed studies of different versions of the TEMPS performed in adult samples were considered for inclusion. Twenty-seven studies (N=20,787) met inclusion criteria. Several versions of the TEMPS have been validated in 14 languages across 15 countries. The 110-item self-reported version of the TEMPS has been the most studied version. Most studies (50%) supported a five factor solution although few studies performed confirmatory factor analyses. A five-factor solution has consistently been reported for the 39-item version of the TEMPS-A. Overall, evidence indicates that different versions of the TEMPS have adequate internal consistency reliability, while the TEMPS-A-110 version has acceptable test-retest reliability. The methodological quality of included studies varied. A meta-analysis could not be performed due to the heterogeneity of settings and versions of the TEMPS utilized. Different versions of the TEMPS have been validated across different cultures. The short 39-item version of the TEMPS-A holds promise and merits further investigation. Culture-bound factors may influence the expression and/or assessment of affective temperaments with the TEMPS. Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Penttinen, Marjaana; Huovinen, Erkki; Ylitalo, Anna-Kaisa
2015-01-01
In the present study, education majors minoring in music education (n = 24) and music performance majors (n = 14) read and performed the original version and melodically altered versions of a simple melody in a given tempo. Eye movements during music reading and piano performances were recorded. Errorless trials were analyzed to explore the…
An update to the Surface Ocean CO2 Atlas (SOCAT version 2)
NASA Astrophysics Data System (ADS)
Bakker, D. C. E.; Pfeil, B.; Smith, K.; Hankin, S.; Olsen, A.; Alin, S. R.; Cosca, C.; Harasawa, S.; Kozyr, A.; Nojiri, Y.; O'Brien, K. M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N.; Boutin, J.; Cai, W.-J.; Castle, R. D.; Chavez, F. P.; Chen, L.; Chierici, M.; Currie, K.; de Baar, H. J. W.; Evans, W.; Feely, R. A.; Fransson, A.; Gao, Z.; Hales, B.; Hardman-Mountford, N.; Hoppema, M.; Huang, W.-J.; Hunt, C. W.; Huss, B.; Ichikawa, T.; Johannessen, T.; Jones, E. M.; Jones, S. D.; Jutterström, S.; Kitidis, V.; Körtzinger, A.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Manke, A. B.; Mathis, J. T.; Merlivat, L.; Metzl, N.; Murata, A.; Newberger, T.; Ono, T.; Park, G.-H.; Paterson, K.; Pierrot, D.; Ríos, A. F.; Sabine, C. L.; Saito, S.; Salisbury, J.; Sarma, V. V. S. S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K.; Sun, H.; Sutton, A. J.; Suzuki, T.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; Tsurushima, N.; van Heuven, S. M. A. C.; Vandemark, D.; Vlahos, P.; Wallace, D. W. R.; Wanninkhof, R.; Watson, A. J.
2013-08-01
The Surface Ocean CO2 Atlas (SOCAT) is an effort by the international marine carbon research community. It aims to improve access to carbon dioxide measurements in the surface oceans by regular releases of quality controlled and fully documented synthesis and gridded fCO2 (fugacity of carbon dioxide) products. SOCAT version 2 presented here extends the data set for the global oceans and coastal seas by four years and has 10.1 million surface water fCO2 values from 2660 cruises between 1968 and 2011. The procedures for creating version 2 have been comparable to those for version 1. The SOCAT website (http://www.socat.info/) provides access to the individual cruise data files, as well as to the synthesis and gridded data products. Interactive online tools allow visitors to explore the richness of the data. Scientific users can also retrieve the data as downloadable files or via Ocean Data View. Version 2 enables carbon specialists to expand their studies until 2011. Applications of SOCAT include process studies, quantification of the ocean carbon sink and its spatial, seasonal, year-to-year and longer-term variation, as well as initialisation or validation of ocean carbon models and coupled-climate carbon models.
Landsat Pathfinder tropical forest information management system
NASA Technical Reports Server (NTRS)
Salas, W.; Chomentowski, W.; Harville, J.; Skole, D.; Vellekamp, K.
1994-01-01
A Tropical Forest Information Management System (TFIMS) has been designed to fulfill the needs of the HTFIP in such a way that it tracks all aspects of the generation and analysis of the raw satellite data and the derived deforestation dataset. The system is broken down into four components: satellite image selection, processing, data management and archive management. However, as we began to think of how the TFIMS could also be used to make the data readily accessible to all user communities we realized that the initial system was too project oriented and could only be accessed locally. The new system needed development in the areas of data ingest and storage, while at the same time being implemented on a server environment with a network interface accessible via Internet. This paper summarizes the overall design of the existing prototype (version 0) information management system and then presents the design of the new system (version 1). The development of version 1 of the TFIMS is ongoing. There are no current plans for a gradual transition from version 0 to version 1 because the significant changes are in how the data within the HTFIP will be made accessible to the extended community of scientists, policy makers, educators, and students and not in the functionality of the basic system.
A generalized approach to complex networks
NASA Astrophysics Data System (ADS)
Costa, L. Da F.; da Rocha, L. E. C.
2006-03-01
This work describes how the formalization of complex network concepts in terms of discrete mathematics, especially mathematical morphology, allows a series of generalizations and important results ranging from new measurements of the network topology to new network growth models. First, the concepts of node degree and clustering coefficient are extended in order to characterize not only specific nodes, but any generic subnetwork. Second, the consideration of distance transform and rings are used to further extend those concepts in order to obtain a signature, instead of a single scalar measurement, ranging from the single node to whole graph scales. The enhanced discriminative potential of such extended measurements is illustrated with respect to the identification of correspondence between nodes in two complex networks, namely a protein-protein interaction network and a perturbed version of it.
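The ring (distance-transform) construction that this record uses to turn single scalar measurements into signatures can be sketched for an arbitrary seed subnetwork; the function name and the tiny path graph below are illustrative:

```python
from collections import deque

def rings(adj, seed):
    """Distance transform from a seed subnetwork: ring r holds the nodes
    at shortest-path distance r from the seed set. The sequence of ring
    sizes is a signature rather than a single scalar, in the spirit of
    the paper's extended degree/clustering measurements."""
    dist = {v: 0 for v in seed}
    q = deque(seed)
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    out = {}
    for v, d in dist.items():
        out.setdefault(d, set()).add(v)
    return out

# Path graph 0-1-2-3-4, seed subnetwork {2}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
r = rings(adj, {2})
print([sorted(r[d]) for d in sorted(r)])  # → [[2], [1, 3], [0, 4]]
```

The size of ring 1 generalizes the node degree to the seed subnetwork (here 2), and comparing ring-size signatures between two graphs is one way to seek node correspondences as described above.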
Hoffecker, Lilian; Abbey, Dana
2017-01-01
The research demonstrates that a conference slide presentation translated into non-English languages reaches significantly larger and different audiences than an English presentation alone. The slides of a presentation from the Medical Library Association annual meeting were translated from English to Chinese, Japanese, and Russian and posted along with the English version to SlideShare, an open slide-hosting website. View counts, traffic sources, and geographic origins of the traffic for each language version were tracked over a twenty-two-month period. Total view counts for all 4 language versions amounted to 3,357 views, with the Chinese version accounting for 71% of the total views. The trends in view counts over time for the Japanese, Russian, and English versions were similar, with high interest at the beginning and a rapid drop and low level of viewing activity thereafter. The pattern of view counts for the Chinese version departed considerably from the other language versions, with very low activity at the beginning but a sharp rise 10 months later. This increase in activity was related to access to the presentations via a Taiwanese website that embedded the SlideShare website code. Language translation can be a difficult and time-consuming task. However, translation of a conference slide presentation with limited text is an achievable activity and engages an international audience for information that is often not noticed or lost. Although English is by far the primary language of science and other disciplines, it is not necessarily the first or preferred language of global researchers. By offering appropriate language versions, the authors of presentations can expand the reach of their work.
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
Extended and refined multi sensor reanalysis of total ozone for the period 1970-2012
NASA Astrophysics Data System (ADS)
van der A, R. J.; Allaart, M. A. F.; Eskes, H. J.
2015-07-01
The ozone multi-sensor reanalysis (MSR) is a multi-decadal ozone column data record constructed using all available ozone column satellite data sets, surface Brewer and Dobson observations and a data assimilation technique with detailed error modelling. The result is a high-resolution time series of 6-hourly global ozone column fields and forecast error fields that may be used for ozone trend analyses as well as detailed case studies. The ozone MSR is produced in two steps. First, the latest reprocessed versions of all available ozone column satellite data sets are collected and then are corrected for biases as a function of solar zenith angle (SZA), viewing zenith angle (VZA), time (trend), and stratospheric temperature using surface observations of the ozone column from Brewer and Dobson spectrophotometers from the World Ozone and Ultraviolet Radiation Data Centre (WOUDC). Subsequently the de-biased satellite observations are assimilated within the ozone chemistry and data assimilation model TMDAM. The MSR2 (MSR version 2) reanalysis upgrade described in this paper consists of an ozone record for the 43-year period 1970-2012. The chemistry transport model and data assimilation system have been adapted to improve the resolution, error modelling and processing speed. Backscatter ultraviolet (BUV) satellite observations have been included for the period 1970-1977. The total record is extended by 13 years compared to the first version of the ozone multi sensor reanalysis, the MSR1. The latest total ozone retrievals of 15 satellite instruments are used: BUV-Nimbus4, TOMS-Nimbus7, TOMS-EP, SBUV-7, -9, -11, -14, -16, -17, -18, -19, GOME, SCIAMACHY, OMI and GOME-2. The resolution of the model runs, assimilation and output is increased from 2° × 3° to 1° × 1°. The analysis is driven by 3-hourly meteorology from the ERA-Interim reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF) starting from 1979, and ERA-40 before that date. 
The chemistry parameterization has been updated. The performance of the MSR2 analysis is studied with the help of observation-minus-forecast (OmF) departures from the data assimilation, by comparisons with the individual station observations and with ozone sondes. The OmF statistics show that the mean bias of the MSR2 analyses is less than 1 % with respect to de-biased satellite observations after 1979.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerby, Leslie Marie
Emission of light fragments (LF) from nuclear reactions is an open question. Different reaction mechanisms contribute to their production; the relative roles of each, and how they change with incident energy, mass number of the target, and the type and emission energy of the fragments is not completely understood. None of the available models are able to accurately predict emission of LF from arbitrary reactions. However, the ability to describe production of LF (especially at energies ≳ 30 MeV) from many reactions is important for different applications, such as cosmic-ray-induced Single Event Upsets (SEUs), radiation protection, and cancer therapy with proton and heavy-ion beams, to name just a few. The Cascade-Exciton Model (CEM) version 03.03 and the Los Alamos version of the Quark-Gluon String Model (LAQGSM) version 03.03 event generators in Monte Carlo N-Particle Transport Code version 6 (MCNP6) describe quite well the spectra of fragments with sizes up to ⁴He across a broad range of target masses and incident energies (up to ~ 5 GeV for CEM and up to ~ 1 TeV/A for LAQGSM). However, they do not predict the high energy tails of LF spectra heavier than ⁴He well. Most LF with energies above several tens of MeV are emitted during the precompound stage of a reaction. The current versions of the CEM and LAQGSM event generators do not account for precompound emission of LF larger than ⁴He. The aim of our work is to extend the precompound model in them to include such processes, leading to an increase of predictive power of LF-production in MCNP6. This entails upgrading the Modified Exciton Model currently used at the preequilibrium stage in CEM and LAQGSM. It also includes expansion and examination of the coalescence and Fermi break-up models used in the precompound stages of spallation reactions within CEM and LAQGSM. 
Extending our models to include emission of fragments heavier than ⁴He at the precompound stage has indeed provided results that have much better agreement with experimental data.« less
NASA Astrophysics Data System (ADS)
Zhang, Yu; Seo, Dong-Jun
2017-03-01
This paper presents novel formulations of mean field bias (MFB) and local bias (LB) correction schemes that incorporate a conditional bias (CB) penalty. These schemes are based on the operational MFB and LB algorithms in the National Weather Service (NWS) Multisensor Precipitation Estimator (MPE). By incorporating the CB penalty in the cost function of the exponential smoothers, we are able to derive augmented versions of the recursive estimators of MFB and LB. Two extended versions of the MFB algorithm are presented, one incorporating spatial variation of gauge locations only (MFB-L), and the second integrating both gauge locations and the CB penalty (MFB-X). These two MFB schemes and the extended LB scheme (LB-X) are assessed relative to the original MFB and LB algorithms (referred to as MFB-O and LB-O, respectively) through a retrospective experiment over a radar domain in north-central Texas, and through a synthetic experiment over the Mid-Atlantic region. The outcome of the former experiment indicates that introducing the CB penalty to the MFB formulation leads to small but consistent improvements in bias and CB, while its impacts on hourly correlation and root mean square error (RMSE) are mixed. Incorporating the CB penalty in the LB formulation tends to improve the RMSE at high rainfall thresholds, but its impacts on bias are also mixed. The synthetic experiment suggests that the beneficial impacts are more conspicuous at low gauge density (9 per 58,000 km²), and tend to diminish at higher gauge density. The improvement at high rainfall intensity is partly an outcome of the conservativeness of the extended LB scheme. This conservativeness arises in part from the more frequent presence of negative eigenvalues in the extended covariance matrix, which leads to no, or smaller, incremental changes to the smoothed rainfall amounts.
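The recursive estimators referred to above can be pictured, in heavily simplified form, as exponential smoothers of gauge and radar accumulations. The sketch below is an illustrative assumption about the general shape of such an estimator (the function name, the smoothing form, and the parameter `alpha` are all invented here); it does not reproduce the operational MPE algorithms or the CB-penalized variants.

```python
# Minimal sketch of a recursive mean field bias (MFB) estimator: the
# bias is the exponentially smoothed ratio of gauge to radar rainfall
# accumulated over all collocated gauge-radar pairs in the domain.
# Form and names are illustrative assumptions, not the MPE code.

def update_mfb(prev_gauge_sum, prev_radar_sum, gauge_obs, radar_obs, alpha=0.95):
    """One smoothing step; alpha is the memory of the exponential smoother."""
    g = alpha * prev_gauge_sum + (1 - alpha) * sum(gauge_obs)
    r = alpha * prev_radar_sum + (1 - alpha) * sum(radar_obs)
    bias = g / r if r > 0 else 1.0
    return g, r, bias

g, r = 1.0, 1.0
g, r, bias = update_mfb(g, r, [2.0, 3.0], [1.5, 2.5], alpha=0.9)
# the bias multiplies the radar field so it matches the gauges in the mean
```

The CB-penalized versions described in the paper modify the cost function this smoother minimizes, which is where the extended covariance matrix mentioned above enters.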
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustoni, Arnold L.
A laser safety and hazard analysis is presented for the Coherent® driven Acculite® laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform which is used to perform laser interaction experiments and tests at various national test sites. The trailer-based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum optical density (ODmin) necessary for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and entered the laser's NHZ during testing outside the trailer.
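Two of the quantities named above follow from standard ANSI Z136.1 relations: the minimum eyewear optical density is the base-10 logarithm of the ratio of the worst-case exposure to the MPE, and the NOHD is the range at which a diverging beam's irradiance falls to the MPE. The sketch below uses these textbook formulas with invented input values; it is not the SRSS analysis itself.

```python
import math

# Standard laser-safety relations (ANSI Z136.1 style); all inputs
# below are illustrative toy values, not SRSS measurements.

def od_min(exposure, mpe):
    """Minimum eyewear optical density: OD = log10(H0 / MPE)."""
    return math.log10(exposure / mpe)

def nohd(power_w, mpe_w_cm2, divergence_rad, aperture_cm=0.0):
    """Distance beyond which beam irradiance drops below the MPE."""
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm) / divergence_rad

od = od_min(exposure=25.0, mpe=2.5e-3)              # 4 orders of magnitude -> OD 4
dist = nohd(power_w=1.0, mpe_w_cm2=2.5e-3, divergence_rad=1e-3)   # distance in cm
```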
López, Enrique; Steiner, Alexander J; Hardy, David J; IsHak, Waguih W; Anderson, W Brantley
2016-01-01
This study explored within-subjects differences in the performance of 40 bilingual participants on the English and Spanish versions of the Wechsler Adult Intelligence Scale (WAIS) Digit Span task. To test the linguistic hypothesis that individuals would perform worse in Spanish because of its syllabic demand, we compared the number of syllables correctly recalled by each participant for every correct trial. Our analysis of the correct number of syllables remembered per trial showed that participants performed significantly better (i.e., recalling more syllables) in Spanish than in English on the total score. Findings suggest that, under traditional scoring methods, the Spanish version of the Digit Span (total score) was significantly more difficult than the English version. Moreover, the Forward Trial, rather than the Backward Trial, was more likely to show group differences between the two language versions. Additionally, the Spanish trials of the Digit Span were correlated with language comprehension and verbal episodic memory measures, whereas the English trials of the Digit Span were correlated with confrontational naming and verbal fluency tasks. The results suggest that more research is necessary to further investigate other cognitive factors, beyond syllabic demand, that might contribute to performance and outcome differences on the WAIS Digit Span in Spanish-English bilinguals.
IR DirectFET Extreme Environments Evaluation Final Report
NASA Technical Reports Server (NTRS)
Burmeister, Martin; Mottiwala, Amin
2008-01-01
In 2007, International Rectifier (IR) introduced a new version of its DirectFET metal oxide semiconductor field effect transistor (MOSFET) packaging. The new version (referred to as 'Version 2') enhances device moisture resistance, makes surface mount (SMT) assembly of these devices to printed wiring boards (PWBs) more repeatable, and makes subsequent assembly inspection simpler. In the present study, the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL), in collaboration with Stellar Microelectronics (Stellar), continued an evaluation of the DirectFET that they started together in 2006. The present study focused on comparing the two versions of the DirectFET and examining the suitability of the DirectFET devices for space applications. This study evaluated both versions of two DirectFET packaged devices that had both been shown in the 2006 study to have the best electrical and thermal properties: the IRF6635 and IRF6644. The present study evaluated (1) the relative electrical and thermal performance of both versions of each device, (2) the performance through high reliability testing, and (3) the performance of these devices in combination with a range of alternate solder alloys in the extreme thermal environments of deep space....
From the Campus to the Cloud: The Online Peer Assisted Learning Scheme
ERIC Educational Resources Information Center
Beaumont, Tim J.; Mannion, Aaron P.; Shen, Brice O.
2012-01-01
This paper reports on an online version of Peer Assisted Study Sessions (PASS), also known as Supplemental Instruction (SI), which was trialled in two subjects in the University of Melbourne in 2011. The program, named the Online Peer Assisted Learning (OPAL) scheme, was implemented with the aims of extending the benefits of a successful peer…
Digitizing a Heritage of Faded Memories: A Case Study on Extending Historical Research Capabilities
ERIC Educational Resources Information Center
Branting, Steven D.
2009-01-01
A historical fact is like a fata morgana, "always less than what really happened." Even consensus does not establish truth; otherwise history is merely the version of the past that people agree to accept. The students who participated in the acclaimed 5th Street Cemetery Necrogeographical Study innocently found themselves clashing with…
ERIC Educational Resources Information Center
Saunders, Kevin; Schweitzer, Janis
This Teacher's Guide for first-grade mathematics is an outgrowth of an extended pilot project conducted nationwide between 1973 and 1976. The manner of presentation and the pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings ("papygrams") and…
ERIC Educational Resources Information Center
Saunders, Kevin; And Others
This Teacher's Guide for first-grade mathematics is an outgrowth of an extended pilot project conducted nationwide between 1973 and 1976. The manner of presentation and the pedagogical ideas and tools are based on the works of Georges and Frederique Papy. They are recognized as having introduced colored arrow drawings ("papygrams") and…
ERIC Educational Resources Information Center
Kikas, Eve; Peets, Katlin; Tropp, Kristiina; Hinn, Maris
2009-01-01
The purpose of the present study was to examine the impact of sex, verbal reasoning, and normative beliefs on direct and indirect forms of aggression. Three scales from the Peer Estimated Conflict Behavior Questionnaire, Verbal Reasoning tests, and an extended version of Normative Beliefs About Aggression Scale were administered to 663 Estonian…
Assessing Mediation in Dyadic Data Using the Actor-Partner Interdependence Model
ERIC Educational Resources Information Center
Ledermann, Thomas; Macho, Siegfried; Kenny, David A.
2011-01-01
The assessment of mediation in dyadic data is an important issue if researchers are to test process models. Using an extended version of the actor-partner interdependence model the estimation and testing of mediation is complex, especially when dyad members are distinguishable (e.g., heterosexual couples). We show how the complexity of the model…
ERIC Educational Resources Information Center
Wingo, Nancy Pope; Ivankova, Nataliya V.; Moss, Jacqueline A.
2017-01-01
Academic leaders can better implement institutional strategic plans to promote online programs if they understand faculty perceptions about teaching online. An extended version of a model for technology acceptance, or TAM2 (Venkatesh & Davis, 2000), provided a framework for surveying and organizing the research literature about factors that…
ERIC Educational Resources Information Center
Randall, David
2016-01-01
This document extends the National Association of Scholars' (NAS's) critique of the College Board from Advanced Placement U.S. History (APUSH) to Advanced Placement European History (APEH). The College Board distorts APEH in the same way that it distorted the first version of APUSH. The traditional history of Europe tells how Europeans, uniquely,…
ERIC Educational Resources Information Center
Falleur, David M.
This presentation describes SuperPILOT, an extended version of Apple PILOT, a programming language for developing computer-assisted instruction (CAI) with the Apple II computer that includes the features of its early PILOT (Programmed Inquiry, Learning or Teaching) ancestors together with new features that make use of the Apple computer's advanced…
Learning in First-Year Biology: Approaches of Distance and On-Campus Students
ERIC Educational Resources Information Center
Quinn, Frances Catherine
2011-01-01
This paper aims to extend previous research into learning of tertiary biology, by exploring the learning approaches adopted by two groups of students studying the same first-year biology topic in either on-campus or off-campus "distance" modes. The research involved 302 participants, who responded to a topic-specific version of the Study Process…
A Service for Emotion Management: Turkish Version of the Adolescent Anger Rating Scale (AARS)
ERIC Educational Resources Information Center
Aslan, A. Esra; Sevincler-Togan, Seyhan
2009-01-01
An individual's activities are closely related with his/her communication abilities. One's awareness of his feelings and needs, and to what extent he can control such feelings, are the key factors which affect communication abilities. Webster (1996) defines anger as, "a strong emotion; a feeling that is oriented toward some real or supposed…
NASA Astrophysics Data System (ADS)
Batool, Fiza; Akram, Ghazala
2018-05-01
An improved (G'/G)-expansion method is proposed for extracting more general solitary wave solutions of the nonlinear fractional Cahn-Allen equation. The temporal fractional derivative is taken in the sense of Jumarie's fractional derivative. The results of this article are a generalized and extended version of previously reported solutions.
On the Certain Topological Indices of Titania Nanotube TiO2[m, n
NASA Astrophysics Data System (ADS)
Javaid, M.; Liu, Jia-Bao; Rehman, M. A.; Wang, Shaohui
2017-07-01
A numeric quantity that characterises the whole structure of a molecular graph is called a topological index; it predicts the physical features, chemical reactivities, and boiling points of the chemical compound involved in the molecular graph. In this article, we give new mathematical expressions for the multiple Zagreb indices, the generalised Zagreb index, the fourth version of the atom-bond connectivity (ABC4) index, and the fifth version of the geometric-arithmetic (GA5) index of TiO2[m, n]. In addition, we compute the recently developed topological index called the Sanskruti index. At the end, a comparison is also included to estimate the efficiency of the computed indices. Our results extend some known conclusions.
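As a concrete illustration of the simplest degree-based indices mentioned above (not the closed-form TiO2[m, n] expressions derived in the article), the first Zagreb index sums the squared vertex degrees and the second sums the degree products over edges:

```python
# First and second Zagreb indices of a molecular graph given as an
# edge list. Standard definitions: M1 = sum of deg(v)^2 over vertices,
# M2 = sum of deg(u)*deg(v) over edges.
from collections import defaultdict

def zagreb_indices(edges):
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    m1 = sum(d * d for d in deg.values())
    m2 = sum(deg[u] * deg[v] for u, v in edges)
    return m1, m2

# Path graph P4: degrees 1, 2, 2, 1 -> M1 = 10, M2 = 2 + 4 + 2 = 8
m1, m2 = zagreb_indices([(1, 2), (2, 3), (3, 4)])
```

The ABC4 and GA5 indices are computed the same way but use neighbor-degree sums in place of the degrees themselves.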
MetAlign 3.0: performance enhancement by efficient use of advances in computer hardware.
Lommen, Arjen; Kools, Harrie J
2012-08-01
A new, multi-threaded version of the GC-MS and LC-MS data processing software, metAlign, has been developed which is able to utilize multiple cores on one PC. This new version was tested using three different multi-core PCs with different operating systems. The performance of noise reduction, baseline correction and peak-picking was 8-19 fold faster compared to the previous version on a single core machine from 2008. The alignment was 5-10 fold faster. Factors influencing the performance enhancement are discussed. Our observations show that performance scales with the increase in processor core numbers we currently see in consumer PC hardware development.
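The kind of multi-core decomposition described above can be sketched as farming independent mass traces out to a worker pool. This is an assumed design for illustration, not metAlign's actual code; note also that CPython threads only accelerate native or I/O-bound work, so a real implementation relies on native threading as metAlign does.

```python
# Illustrative sketch of per-trace parallelism: each mass chromatogram
# is processed independently, so noise reduction / peak picking can be
# dispatched one trace per worker. Toy smoothing stands in for the
# real signal processing.
from concurrent.futures import ThreadPoolExecutor

def smooth(trace, width=3):
    """Toy moving-average 'noise reduction' for one mass trace."""
    return [sum(trace[max(0, i - width):i + width + 1]) /
            len(trace[max(0, i - width):i + width + 1])
            for i in range(len(trace))]

traces = [[float(i % 7) for i in range(50)] for _ in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    smoothed = list(pool.map(smooth, traces))
```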
SMOS and AMSR-2 soil moisture evaluation using representative monitoring sites in southern Australia
NASA Astrophysics Data System (ADS)
Walker, J. P.; Mei Sun, M. S.; Rudiger, C.; Parinussa, R.; Koike, T.; Kerr, Y. H.
2016-12-01
The performance of soil moisture products from AMSR-2 and SMOS was evaluated against representative surface soil moisture stations within the Yanco study area in the Murrumbidgee Catchment, in southeast Australia. AMSR-2 Level 3 (L3) soil moisture products retrieved from two sets of brightness temperatures using the Japan Aerospace Exploration Agency (JAXA) and the Land Parameter Retrieval Model (LPRM) algorithms were included. For the LPRM algorithm, two different parameterization methods were applied. In the case of SMOS, two versions of the SMOS L3 soil moisture product were assessed. Results based on using "random" and representative stations to evaluate the products were contrasted. The latest versions of the JAXA (JX2) and LPRM (LP3) products were found to perform better than the earlier versions (JX1, LP1 and LP2). Moreover, the improvement in soil moisture retrieval from the latest version of the brightness temperatures and parameterization scheme was larger for C-band observations than for the X-band data; even so, X-band retrievals were still found to perform better than C-band. Inter-comparing AMSR-2 X-band products from different acquisition times showed a better performance for 1:30 pm overpasses, whereas SMOS 6:00 am retrievals were found to perform the best. The mean absolute error (MAE) goal accuracy of the AMSR-2 mission (MAE < 0.08 m³/m³) was met by both versions of the JAXA products, the LPRM X-band products retrieved from the reprocessed version of the brightness temperatures, and both versions of the SMOS products. Nevertheless, none of the products achieved the SMOS target accuracy of 0.04 m³/m³. Finally, product performance depended on the statistics used in the evaluation; based on temporal and absolute accuracy JX2 is recommended, whereas LP3 X-band 1:30 pm and SMOS2 6:00 am are recommended based on temporal accuracy alone.
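The accuracy criterion used above is straightforward to state in code. A minimal sketch with toy numbers (not the Yanco station data):

```python
# Mean absolute error of a retrieved soil moisture series against
# in situ observations, checked against the 0.08 m3/m3 AMSR-2 goal
# accuracy. All values below are invented for illustration.

def mean_abs_error(retrieved, observed):
    return sum(abs(r - o) for r, o in zip(retrieved, observed)) / len(observed)

obs = [0.10, 0.20, 0.30, 0.25]   # volumetric soil moisture, m3/m3
ret = [0.14, 0.15, 0.33, 0.27]
mae = mean_abs_error(ret, obs)
meets_goal = mae < 0.08          # AMSR-2 mission goal accuracy
```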
Efficient self-consistency for magnetic tight binding
NASA Astrophysics Data System (ADS)
Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.
2011-06-01
Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO.
Program summary. Program title: PLATO. Catalogue identifier: AEFC_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 228 747. No. of bytes in distributed program, including test data, etc.: 1 880 369. Distribution format: tar.gz. Programming language: C and PERL. Computer: Apple Macintosh, PC, Unix machines. Operating system: Unix, Linux, Mac OS X, Windows XP. Has the code been vectorised or parallelised?: Yes, up to 256 processors tested. RAM: up to 2 Gbytes per processor. Classification: 7.3. External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW. Catalogue identifier of previous version: AEFC_v1_0. Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616. Does the new version supersede the previous version?: Yes.
Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very difficult; our existing schemes failed altogether, or were very slow. Solution method: A new scheme for achieving self-consistency in orthogonal tight binding has been introduced that explicitly evaluates the first and second derivatives of the energy with respect to input charge and spin, and then uses these to search for stationary values of the energy. Reasons for new version: Bug fixes and new functionality. Summary of revisions: New charge and spin mixing scheme for orthogonal tight binding; numerous small bug fixes. Restrictions: The new mixing scheme scales poorly with system size; in particular the memory usage scales as the number of atoms to the power 4. It is restricted to systems with about 200 atoms or fewer. Running time: Test cases will run in a few minutes; large calculations may run for several days.
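The self-consistency search described above can be illustrated as root finding on the input-output charge residual. The toy linear response below stands in for the real electronic-structure step; it is an invented model, not PLATO's Harris-Foulkes machinery.

```python
# Schematic self-consistency loop: the output charge q_out depends on
# the input charge q, and Newton-Raphson iterates on the residual
# f(q) = q_out(q) - q, with a numerically estimated derivative standing
# in for the analytic second derivatives used by PLATO.

def q_out(q):
    """Toy charge response of the electronic-structure step (invented)."""
    return 0.3 * q + 0.7          # linear model with fixed point q = 1

def newton_scf(q, tol=1e-10, h=1e-6, max_iter=50):
    for _ in range(max_iter):
        f = q_out(q) - q
        if abs(f) < tol:
            break
        df = (q_out(q + h) - q_out(q - h)) / (2 * h) - 1.0   # f'(q)
        q -= f / df
    return q

q_scf = newton_scf(0.0)           # converges to the self-consistent charge
```

Simple linear mixing would converge here too, but slowly when the response is close to the identity; the Newton step is what makes the scheme in the paper efficient.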
NASA Astrophysics Data System (ADS)
Saltos, Andrea
In efforts to perform accurate dosimetry, Oakes et al. [Nucl. Instrum. Methods (2013)] introduced a new portable solid state neutron rem meter based on an adaptation of the Bonner sphere and the position-sensitive long counter. The system utilizes high thermal efficiency neutron detectors to generate a linear combination of measurement signals that are used to estimate the incident neutron spectra. The inversion problem of deducing dose from the counts in individual detector elements is addressed by applying a cross-correlation method which allows estimation of dose with average errors of less than 15%. In this work, an evaluation of the performance of this system was extended to take into account new correlation techniques and the neutron scattering contribution. To test the effectiveness of the correlations, the distance correlation, the Pearson product-moment correlation, and their weighted versions were computed between measured spatial detector responses obtained from nine different test spectra and the spatial responses of library functions generated by MCNPX. Results indicate that there is no advantage in using the distance correlation over the Pearson correlation, and that the weighted versions of these correlations do not increase their performance in evaluating dose. Both correlations were shown to work well even at low integrated doses measured over short periods of time. To evaluate the contribution produced by room-return neutrons on the dosimeter response, MCNPX was used to simulate dosimeter responses for five isotropic neutron sources placed inside different sizes of rectangular concrete rooms. Results show that the contribution of scattered neutrons to the response of the dosimeter can be significant, so that in most cases the dose is overpredicted, with errors as large as 500%. A possible method to correct for the contribution of room-return neutrons is also assessed and can be used as a good initial estimate of how to approach the problem.
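The correlation-based matching described above reduces to scoring the measured spatial response against each library response and keeping the best match. A minimal sketch with invented detector responses (the source names and values are illustrative, not the MCNPX library):

```python
# Template matching by Pearson product-moment correlation: the
# measured spatial detector response is compared to each library
# response and the best-correlated entry is selected. Data invented.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

library = {
    "Cf-252": [5.0, 8.0, 6.0, 3.0, 1.0],   # hypothetical spatial responses
    "AmBe":   [2.0, 4.0, 7.0, 9.0, 6.0],
}
measured = [4.8, 8.3, 5.7, 3.1, 0.9]
best = max(library, key=lambda k: pearson(measured, library[k]))
```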
An update to the Surface Ocean CO2 Atlas (SOCAT version 2)
NASA Astrophysics Data System (ADS)
Bakker, D. C. E.; Pfeil, B.; Smith, K.; Hankin, S.; Olsen, A.; Alin, S. R.; Cosca, C.; Harasawa, S.; Kozyr, A.; Nojiri, Y.; O'Brien, K. M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N. R.; Boutin, J.; Bozec, Y.; Cai, W.-J.; Castle, R. D.; Chavez, F. P.; Chen, L.; Chierici, M.; Currie, K.; de Baar, H. J. W.; Evans, W.; Feely, R. A.; Fransson, A.; Gao, Z.; Hales, B.; Hardman-Mountford, N. J.; Hoppema, M.; Huang, W.-J.; Hunt, C. W.; Huss, B.; Ichikawa, T.; Johannessen, T.; Jones, E. M.; Jones, S. D.; Jutterström, S.; Kitidis, V.; Körtzinger, A.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Manke, A. B.; Mathis, J. T.; Merlivat, L.; Metzl, N.; Murata, A.; Newberger, T.; Omar, A. M.; Ono, T.; Park, G.-H.; Paterson, K.; Pierrot, D.; Ríos, A. F.; Sabine, C. L.; Saito, S.; Salisbury, J.; Sarma, V. V. S. S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K. F.; Sun, H.; Sutton, A. J.; Suzuki, T.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; Tsurushima, N.; van Heuven, S. M. A. C.; Vandemark, D.; Vlahos, P.; Wallace, D. W. R.; Wanninkhof, R.; Watson, A. J.
2014-03-01
The Surface Ocean CO2 Atlas (SOCAT), an activity of the international marine carbon research community, provides access to synthesis and gridded fCO2 (fugacity of carbon dioxide) products for the surface oceans. Version 2 of SOCAT is an update of the previous release (version 1) with more data (increased from 6.3 million to 10.1 million surface water fCO2 values) and extended data coverage (from 1968-2007 to 1968-2011). The quality control criteria, while identical in both versions, have been applied more strictly in version 2 than in version 1. The SOCAT website (http://www.socat.info/) has links to quality control comments, metadata, individual data set files, and synthesis and gridded data products. Interactive online tools allow visitors to explore the richness of the data. Applications of SOCAT include process studies, quantification of the ocean carbon sink and its spatial, seasonal, year-to-year and longer-term variation, as well as initialisation or validation of ocean carbon models and coupled climate-carbon models. Data coverage Repository-References: Individual data set files and synthesis product: doi:10.1594/PANGAEA.811776. Gridded products: doi:10.3334/CDIAC/OTG.SOCAT_V2_GRID. Available at: http://www.socat.info/. Coverage: 79° S to 90° N; 180° W to 180° E. Location Name: Global Oceans and Coastal Seas. Date/Time Start: 16 November 1968. Date/Time End: 26 December 2011.
NASA Astrophysics Data System (ADS)
Rundle, J.; Rundle, P.; Donnellan, A.; Li, P.
2003-12-01
We consider the problem of the complex dynamics of earthquake fault systems, and whether numerical simulations can be used to define an ensemble forecasting technology similar to that used in weather and climate research. To effectively carry out such a program, we need 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike slip faults extending throughout California, from the Mexico-California border to the Mendocino Triple Junction. We use the historic data set of earthquakes of magnitude M > 6 to define the frictional properties of all 654 fault segments (degrees of freedom) in the model. Previous versions of Virtual California had used only 215 fault segments to model the strike slip faults in southern California. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a small Beowulf cluster consisting of 10 CPUs. We are also planning to run the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution, and to assess scaling properties of the code. We present results of simulations both as static images and as mpeg movies, so that the dynamical aspects of the computation can be assessed by the viewer. We also compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
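One of the statistics mentioned above, the magnitude-frequency relation, is commonly summarized by the Gutenberg-Richter b-value. A minimal sketch using Aki's maximum-likelihood estimator on a synthetic catalog (invented here for illustration, not Virtual California output):

```python
# Gutenberg-Richter relation log10 N(>=M) = a - b*M, with the b-value
# estimated by Aki's maximum-likelihood formula
#   b = log10(e) / (mean(M) - M_min).
import math

def b_value(mags, m_min):
    """Aki (1965) maximum-likelihood b-value estimate."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# two-event synthetic catalog crafted so that mean(M) - M_min = log10(e),
# which makes the estimate come out to exactly b = 1
catalog = [6.0, 6.0 + 2 * math.log10(math.e)]
b = b_value(catalog, m_min=6.0)
```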
NASA Technical Reports Server (NTRS)
Reed, Evan; Pellish, Jonathan
2016-01-01
In the space surrounding Earth there exists an active radiation environment consisting mostly of electrons and protons that have been trapped by Earth's magnetic field. This radiation, also known as the Van Allen Belts, has the potential to damage man-made satellites in orbit; thus, proper precautions must be taken to shield NASA assets from this phenomenon. Data on the Van Allen Belts have been collected continuously by a multitude of space-based instruments since the beginning of space exploration. Subsequently, using theory to fill in the gaps in the collected data, computer models have been developed that take in the orbital information of a hypothetical mission and output the expected particle fluence and flux for that orbit. However, as new versions of the modeling system are released, users are left wondering how the new version differs from the old. Therefore, we performed a comparison of three different editions of the modeling system: AE8/AP8 (legacy), which is included in the model 9 graphical user interface as an option for one's calculations, AE9/AP9, and the Space Environment Information System (SPENVIS), which is an online-based form of AE8/AP8 developed by NASA and the European Space Agency that changed the code to allow the program to extrapolate data to predict fluence and flux at higher energies. Although this evaluation is still ongoing, it is predicted that the model 8 (legacy) and SPENVIS versions will have identical outputs with the exception of the extended energy levels from SPENVIS, while model 9 will provide different fluences than model 8 based on additional magnetic field descriptions and on-orbit data.
NASA Astrophysics Data System (ADS)
Fahey, Kathleen M.; Carlton, Annmarie G.; Pye, Havala O. T.; Baek, Jaemeen; Hutzell, William T.; Stanier, Charles O.; Baker, Kirk R.; Wyat Appel, K.; Jaoui, Mohammed; Offenberg, John H.
2017-04-01
This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM-KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM-KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from biogenic epoxides (AQCHEM-KMTI), normalized mean error and bias statistics are slightly improved for 2-methyltetrols and 2-methylglyceric acid at the Research Triangle Park measurement site in North Carolina during the Southern Oxidant and Aerosol Study (SOAS) period. The added in-cloud chemistry leads to a monthly average increase of 11-18% in cloud SOA at the surface in the eastern United States for June 2013.
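The need for a stiff (Rosenbrock-type) implicit solver can be seen on the simplest possible example: for fast aqueous kinetics, an implicit step remains stable at step sizes where an explicit step diverges. This is a schematic illustration, not the KPP-generated Rodas3 code:

```python
# Stiff decay dy/dt = -k*y with large rate k. Backward Euler,
# y_{n+1} = y_n / (1 + k*dt), is unconditionally stable; forward
# Euler, y_{n+1} = y_n * (1 - k*dt), diverges once k*dt > 2.

def backward_euler(y0, k, dt, steps):
    y = y0
    for _ in range(steps):
        y = y / (1.0 + k * dt)
    return y

def forward_euler(y0, k, dt, steps):
    y = y0
    for _ in range(steps):
        y = y * (1.0 - k * dt)
    return y

k, dt = 1e4, 1e-3          # k*dt = 10: far outside explicit stability
stable = backward_euler(1.0, k, dt, 20)      # decays toward zero
unstable = abs(forward_euler(1.0, k, dt, 20))  # grows without bound
```

Rosenbrock methods such as Rodas3 generalize this implicit idea to coupled nonlinear mechanisms by linearizing with the Jacobian at each step.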
NASA Technical Reports Server (NTRS)
Tirres, Lizet
1991-01-01
An evaluation of the aerodynamic performance of the solid version of an Allison-designed cooled radial turbine was conducted at NASA Lewis' Warm Turbine Test Facility. The resulting pressure and temperature measurements are used to calculate vane, rotor, and overall stage performance. These performance results are then compared to the analytical results obtained by using NASA's MTSB (MERIDL-TSONIC-BLAYER) code.
Building validation tools for knowledge-based systems
NASA Technical Reports Server (NTRS)
Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.
1987-01-01
The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (a higher-order language) is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.
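Checks of the Structure Check / Omission Check variety can be illustrated on a toy rule base: find conditions that nothing supplies and conclusions that nothing consumes. The rule format and the example rules below are invented for illustration; they are not EVA's meta-language.

```python
# Two classic completeness checks on a rule base: a condition is
# unreachable if no rule concludes it and no input supplies it; a
# conclusion is a dead end if no rule uses it and no goal names it.

rules = [
    (["fever", "cough"], "flu"),     # IF fever AND cough THEN flu
    (["flu"], "rest"),
    (["rash"], "allergy"),
]
inputs, goals = {"fever", "cough"}, {"rest"}

concluded = {c for _, c in rules}
used = {p for ps, _ in rules for p in ps}
unreachable = used - concluded - inputs   # 'rash' is never supplied
dead_ends = concluded - used - goals      # 'allergy' leads nowhere
```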
NASA Astrophysics Data System (ADS)
Ardenghi, Juan S.; Castagnino, M.; Campoamor-Stursberg, R.
2009-10-01
The nonrelativistic limit of the centrally extended Poincaré group is considered and its consequences for the modal Hamiltonian interpretation of quantum mechanics are discussed [O. Lombardi and M. Castagnino, Stud. Hist. Philos. Mod. Phys. 39, 380 (2008); J. Phys.: Conf. Ser. 128, 012014 (2008)]. Through the assumption that in quantum field theory the Casimir operators of the Poincaré group actualize, the nonrelativistic limit of the latter group leads to the actualization of the Casimir operators of the Galilei group, which is in agreement with the actualization rule of previous versions of the modal Hamiltonian interpretation [Ardenghi et al., Found. Phys. (submitted)].
How drug life-cycle management patent strategies may impact formulary management.
Berger, Jan; Dunn, Jeffrey D; Johnson, Margaret M; Karst, Kurt R; Shear, W Chad
2016-10-01
Drug manufacturers may employ various life-cycle management patent strategies, which may impact managed care decision making regarding formulary planning and management strategies when single-source, branded oral pharmaceutical products move to generic status. Passage of the Hatch-Waxman Act enabled more rapid access to generic medications through the abbreviated new drug application process. Patent expirations of small-molecule medications and approvals of generic versions have led to substantial cost savings for health plans, government programs, insurers, pharmacy benefits managers, and their customers. However, considering that the cost of developing a single medication is estimated at $2.6 billion (2013 dollars), pharmaceutical patent protection enables companies to recoup investments, creating an incentive for innovation. Under current law, patent protection holds for 20 years from time of patent filing, although much of this time is spent in product development and regulatory review, leaving an effective remaining patent life of 7 to 10 years at the time of approval. To extend the product life cycle, drug manufacturers may develop variations of originator products and file for patents on isomers, metabolites, prodrugs, new drug formulations (eg, extended-release versions), and fixed-dose combinations. These additional patents and the complexities surrounding the timing of generic availability create challenges for managed care stakeholders attempting to gauge when generics may enter the market. An understanding of pharmaceutical patents and how intellectual property protection may be extended would benefit managed care stakeholders and help inform decisions regarding benefit management.
catsHTM: A Tool for Fast Accessing and Cross-matching Large Astronomical Catalogs
NASA Astrophysics Data System (ADS)
Soumagnac, Maayane T.; Ofek, Eran O.
2018-07-01
Fast access to large catalogs is required for some astronomical applications. Here we introduce the catsHTM tool, which consists of several large catalogs reformatted into an HDF5-based file format that can be downloaded and used locally. To allow fast access, the catalogs are partitioned into hierarchical triangular meshes and stored in HDF5 files. Several tools are provided to perform efficient cone searches at resolutions spanning from a few arcseconds to degrees, within a few milliseconds. The first released version includes the following catalogs (in alphabetical order): 2MASS, 2MASS extended sources, AKARI, APASS, Cosmos, DECaLS/DR5, FIRST, GAIA/DR1, GAIA/DR2, GALEX/DR6Plus7, HSC/v2, IPHAS/DR2, NED redshifts, NVSS, Pan-STARRS1/DR1, PTF photometric catalog, ROSAT faint source, SDSS sources, SDSS/DR14 spectroscopy, SkyMapper, Spitzer/SAGE, Spitzer/IRAC galactic center, UCAC4, UKIDSS/DR10, VST/ATLAS/DR3, VST/KiDS/DR3, WISE, and XMM. We provide Python code for performing cone searches, as well as MATLAB code for performing cone searches, catalog cross-matching, and general searches, and for loading and creating these catalogs.
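The kind of query catsHTM serves can be illustrated with a minimal pure-Python cone search over a toy in-memory catalog. This is a sketch of the concept only, not the catsHTM API; the toy coordinates and source IDs are invented for illustration.

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation (degrees) between two sky positions via the
    spherical law of cosines. All inputs in degrees."""
    r1, d1, r2, d2 = (math.radians(v) for v in (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    # Clamp to [-1, 1] to guard against floating-point overshoot.
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

def cone_search(catalog, ra0, dec0, radius_arcsec):
    """Return all catalog rows within radius_arcsec of (ra0, dec0).
    catalog: list of (ra_deg, dec_deg, source_id) tuples."""
    radius_deg = radius_arcsec / 3600.0
    return [row for row in catalog
            if angular_sep_deg(row[0], row[1], ra0, dec0) <= radius_deg]

# Toy catalog: (RA, Dec, id) in degrees.
toy = [(10.000, 41.000, "a"), (10.001, 41.001, "b"), (11.000, 41.000, "c")]
print([r[2] for r in cone_search(toy, 10.0, 41.0, 10.0)])  # → ['a', 'b']
```

A real HTM-partitioned catalog would first narrow the candidate list to the few triangular meshes that intersect the cone, which is what makes the search fast at catalog scale.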
NASA Astrophysics Data System (ADS)
Sigaut, Lorena; Villarruel, Cecilia; Ponce, María Laura; Ponce Dawson, Silvina
2017-06-01
Many cell signaling pathways involve the diffusion of messengers that bind and unbind to and from intracellular components. Quantifying their net transport rate under different conditions then requires separate estimates of their free diffusion coefficient and binding and unbinding rates. In this paper, we show how, by performing sets of fluorescence correlation spectroscopy (FCS) experiments under different conditions, it is possible to quantify the free diffusion coefficients and on and off rates of reaction-diffusion systems. We develop the theory and present a practical implementation for the case of the universal second messenger, calcium (Ca2+), and single-wavelength dyes that increase their fluorescence upon Ca2+ binding. We validate the approach with experiments performed in aqueous solutions containing Ca2+ and Fluo4 dextran (in both its high- and low-affinity versions). Performing FCS experiments with tetramethylrhodamine-dextran in Xenopus laevis oocytes, we infer the corresponding free diffusion coefficients in the cytosol of these cells. Our approach can be extended to other physiologically relevant reaction-diffusion systems to quantify biophysical parameters that determine the dynamics of various variables of interest.
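As a toy illustration of extracting a diffusion coefficient from an FCS curve, the sketch below generates a standard 3D free-diffusion autocorrelation and recovers the diffusion coefficient by a crude grid search. The beam-waist and aspect-ratio values are assumptions for illustration, not parameters from the paper.

```python
# Illustrative confocal-volume parameters (assumed, not from the paper):
W = 0.25  # lateral beam waist, micrometers
S = 5.0   # axial-to-lateral aspect ratio

def g_free_diffusion(tau, n_mean, diff_coeff):
    """Standard 3D free-diffusion FCS autocorrelation G(tau) for a mean
    particle number n_mean and diffusion coefficient diff_coeff (um^2/s)."""
    tau_d = W ** 2 / (4.0 * diff_coeff)  # characteristic diffusion time (s)
    return ((1.0 / n_mean)
            * (1.0 + tau / tau_d) ** -1.0
            * (1.0 + tau / (S ** 2 * tau_d)) ** -0.5)

def fit_tau_d(taus, gs):
    """Crude one-parameter grid search for tau_d, fixing n_mean from
    G(0) = 1/N (a stand-in for a proper least-squares fit)."""
    n_mean = 1.0 / gs[0]
    best, best_err = None, float("inf")
    for k in range(1, 2001):
        cand = k * 1e-6  # candidate tau_d in seconds
        err = sum(
            (g - (1.0 / n_mean)
                 * (1.0 + t / cand) ** -1.0
                 * (1.0 + t / (S ** 2 * cand)) ** -0.5) ** 2
            for t, g in zip(taus, gs))
        if err < best_err:
            best, best_err = cand, err
    return best

taus = [i * 1e-5 for i in range(100)]   # lag times, seconds
true_d = 25.0                           # um^2/s, illustrative value
gs = [g_free_diffusion(t, 2.0, true_d) for t in taus]
tau_d_fit = fit_tau_d(taus, gs)
print(W ** 2 / (4.0 * tau_d_fit))       # recovered D, close to 25 um^2/s
```

In the paper's setting, several such fits under different buffer conditions are combined to separate free diffusion from the binding and unbinding contributions.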
NASA Astrophysics Data System (ADS)
Leonard, William H.; Cavana, Gordon R.; Lowery, Lawrence F.
Discretion, the exercise of independent judgment, was observed to be lacking in most commercially available laboratory investigations for high school biology. An Extended Discretion (ED) laboratory approach was developed and tested experimentally against the BSCS Green Version laboratory program, using ten classes of 10th-grade biology in a suburban California high school. Five teachers were each assigned one experimental and one control group. The primary differences between the two approaches were that the BSCS was more prescriptive and directive than the ED approach, and that the ED approach increased discretionary demands upon the student over the school year. A treatment verification procedure showed statistically significant differences between the two approaches. The hypothesis under test was that when high school biology students are taught laboratory concepts under comparatively high discretionary demands, they perform as well as or better than a similar group of students taught with BSCS Green Version investigations. A second hypothesis was that teachers would prefer to use the ED approach over the BSCS approach for their future classes. A t-test comparison between the experimental and control groups was made for each teacher. There were significant differences in favor of the ED group on laboratory report scores for three teachers and no differences for two teachers. There were significant differences in favor of the ED group on laboratory concepts quiz scores for three teachers, no differences for one teacher, and significant differences in favor of the BSCS group for only one teacher. A t-test analysis of teacher evaluations of the two approaches showed a significant overall teacher preference for the ED approach. Both experimental hypotheses were accepted.
The ED approach was observed to be difficult for students at first, but it proved to be a workable and productive means of teaching laboratory concepts in biology, one that also required extensive use of individual student discretion.
NASA Astrophysics Data System (ADS)
Blum, Mirjam; Rozanov, Vladimir; Bracher, Astrid; Burrows, John P.
The radiative transfer model SCIATRAN [V. V. Rozanov et al., 2002; A. Rozanov et al., 2005, 2008] has been developed to model atmospheric radiative transfer. This model is mainly applied to improve the analysis of highly spectrally resolved satellite data such as, for instance, data from the SCIAMACHY instrument (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY) onboard the ENVISAT satellite. Within the present study, SCIATRAN has been extended to take into account radiative processes both at the atmosphere-water interface and within the water, caused by water itself and its constituents. Comparisons of this extended version of SCIATRAN against in-situ data and MERIS satellite observations yield first results, which will be shown. It is expected that the new version of SCIATRAN, including the coupling of atmospheric and oceanic radiative transfer, will widen the use of highly spectrally resolved data by enabling new findings, such as information about ocean bio-optics and biogeochemistry, for example, biomass of different phytoplankton groups or CDOM fluorescence. In addition, it is expected that the new version will improve the retrieval of atmospheric trace gases above oceanic waters. References: 1. V. V. Rozanov, M. Buchwitz, K.-U. Eichmann, R. de Beek, and J. P. Burrows. SCIATRAN - a new radiative transfer model for geophysical applications in the 240-2400 nm spectral region: the pseudo-spherical version. Adv. Space Res. 29, 1831-1835 (2002) 2. A. Rozanov, V. V. Rozanov, M. Buchwitz, A. Kokhanovsky, and J. P. Burrows. SCIATRAN 2.0 - A new radiative transfer model for geophysical applications in the 175-2400 nm spectral region. Adv. Space Res. 36, 1015-1019 (2005) 3. A. Rozanov. SCIATRAN 2.X: Radiative transfer model and retrieval software package. URL = http://www.iup.physik.uni-bremen.de/sciatran (2008)
Amnioinfusion to facilitate external cephalic version after initial failure.
Adama van Scheltema, P N; Feitsma, A H; Middeldorp, J M; Vandenbussche, F P H A; Oepkes, D
2006-09-01
To evaluate the effectiveness of antepartum transabdominal amnioinfusion to facilitate external cephalic version after initial failure. Women with a structurally normal fetus in breech lie at term, with a failed external cephalic version and an amniotic fluid index (AFI) less than 15 cm, were asked to participate in our study. After tocolysis with indomethacin, a transabdominal amnioinfusion was performed with an 18G spinal needle. Lactated Ringer's solution was infused until the AFI reached 15 cm, with a maximum of 1 L. External cephalic version was performed directly afterward. Seven women participated in the study. The gestational age of the women was between 36(+4) and 38(+3) weeks, and three women were primiparous. The AFI ranged from 4 cm to 13 cm. A median amount of 1,000 mL Ringer's solution (range 700-1,000 mL) was infused per procedure. The repeat external cephalic versions after amnioinfusion were not successful in any of the patients. In our experience, amnioinfusion does not facilitate external cephalic version.
Design versions of HTS three-phase cables with the minimized value of AC losses
NASA Astrophysics Data System (ADS)
Altov, V. A.; Balashov, N. N.; Degtyarenko, P. N.; Ivanov, S. S.; Kopylov, S. I.; Lipa, DA; Samoilenkov, S. V.; Sytnikov, V. E.; Zheltov, V. V.
2018-03-01
Design versions of HTS three-phase cables consisting of 2G HTS tapes have been investigated by numerical simulation with the aim of minimizing AC losses. Two design versions of cables with coaxial and extended rectangular cross-section shapes are considered: non-sectioned and sectioned. In the latter, each cable phase consists of sections connected in parallel. The optimal dimensions of the sections and the order of their alternation are chosen by appropriate calculations. The model used takes into account the current distribution between the sections and its non-uniformity within each single HTS tape as well. The following characteristics are varied: design version, dimensions, positioning of an extra copper layer in the cable, and the design of the HTS tapes themselves and their mutual position. The dependence of AC losses on the latter two characteristics is considered in detail, and examples of cable designs optimized over the full set of characteristics for the medium voltage class (10-60 kV) are given. At a critical current JC = 5.1 kA per phase and current amplitudes below 0.85JC, the level of total AC losses does not exceed the natural cryostat heat losses.
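The paper's loss model is not reproduced here, but a common analytic baseline for transport AC loss in a thin superconducting strip is the Norris formula. The sketch below evaluates it at the abstract's operating point (Ic = 5.1 kA per phase, amplitudes up to 0.85 Ic), treating the whole phase as a single idealized strip, an oversimplification that real multi-tape sectioned cables are designed to improve upon.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def norris_strip_loss(i_peak, i_c):
    """Norris AC transport loss per cycle per unit length (J/m/cycle)
    for a thin superconducting strip carrying sinusoidal current with
    peak i_peak and critical current i_c (requires i_peak < i_c)."""
    f = i_peak / i_c
    return (MU0 * i_c ** 2 / math.pi) * (
        (1 - f) * math.log(1 - f) + (1 + f) * math.log(1 + f) - f ** 2)

def ac_loss_power(i_peak, i_c, freq_hz):
    """Dissipated power per unit length (W/m) at frequency freq_hz."""
    return freq_hz * norris_strip_loss(i_peak, i_c)

# Illustrative numbers in the spirit of the paper: Ic = 5.1 kA per phase,
# operation at 0.85*Ic, 50 Hz grid frequency.
i_c = 5100.0
print(ac_loss_power(0.85 * i_c, i_c, 50.0))
```

The strong growth of this loss as the amplitude approaches Ic is one reason sectioning and tape positioning, which flatten the current distribution, pay off.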
NASA Technical Reports Server (NTRS)
Bosilovich, Michael G.; Suarez, Max J. (Editor); Schubert, Siegfried D.
1998-01-01
First ISLSCP Field Experiment (FIFE) observations have been used to validate the near-surface properties of various versions of the Goddard Earth Observing System (GEOS) Data Assimilation System. The site-averaged FIFE data set extends from May 1987 through November 1989, allowing the investigation of several time scales, including the annual cycle, daily means, and diurnal cycles. Furthermore, the development of the daytime convective planetary boundary layer is presented for several days. Monthly variations of the surface energy budget during the summer of 1988 demonstrate the effect of the prescribed surface soil wetness boundary conditions. GEOS data come from the first frozen version of the assimilation system (GEOS-1 DAS) and two experimental versions of GEOS (v. 2.0 and 2.1) with substantially greater vertical resolution and other changes that influence the boundary layer. This report provides a baseline for future versions of the GEOS data assimilation system that will incorporate a state-of-the-art land surface parameterization. Several suggestions are proposed to improve the generality of future comparisons. These include the use of more diverse field experiment observations and an estimate of gridpoint heterogeneity from the new land surface parameterization.
Super Cooled Large Droplet Analysis of Several Geometries Using LEWICE3D Version 3
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.
2011-01-01
Super Cooled Large Droplet (SLD) collection efficiency calculations were performed for several geometries using the LEWICE3D Version 3 software. The computations were performed using the NASA Glenn Research Center SLD splashing model, which has been incorporated into the LEWICE3D Version 3 software. Comparisons to experiment were made where available. The geometries included two straight wings, a swept 64A008 wing tip, two high lift geometries, and the generic commercial transport DLR-F4 wing body configuration. In general, the LEWICE3D Version 3 computations compared well with the 2D LEWICE 3.2.2 results and with experimental data where available.
Aeroelastic Optimization of Generalized Tube and Wing Aircraft Concepts Using HCDstruct Version 2.0
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
Major enhancements were made to the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool developed at NASA Langley Research Center (LaRC). Whereas previous versions were limited to hybrid wing body (HWB) configurations, the current version of HCDstruct now supports the analysis of generalized tube and wing (TW) aircraft concepts. Along with significantly enhanced user input options for all aircraft configurations, these enhancements represent HCDstruct version 2.0. Validation was performed using a Boeing 737-200 aircraft model, for which primary structure weight estimates agreed well with available data. Additionally, preliminary analysis of the NASA D8 (ND8) aircraft concept was performed, highlighting several new features of the tool.
Extended Duration Orbiter (EDO) Improved Waste Collection System (IWCS)
NASA Technical Reports Server (NTRS)
1992-01-01
This overall front view shows the Extended Duration Orbiter (EDO) Waste Collection System (WCS) scheduled to fly aboard NASA's Endeavour, Orbiter Vehicle (OV) 105, for the STS-54 mission. Detailed Test Objective 662, Extended duration orbiter WCS evaluation, will verify the design of the new EDO WCS under microgravity conditions for a prolonged period. OV-105 has been modified with additional structures in the waste management compartment (WMC) and additional avionics to support/restrain the EDO WCS. Among the advantages the new IWCS is hoped to have over the current WCS are greater dependability, better hygiene, virtually unlimited capacity, and more efficient preparation between shuttle missions. Unlike the previous WCS, the improved version will not have to be removed from the spacecraft to be readied for the next flight. The WCS was documented in JSC's Crew Systems Laboratory Bldg 7.
Vakil, Eli; Bloch, Ayala; Cohen, Haggar
2017-03-01
The serial reaction time (SRT) task has generated a very large amount of research. Nevertheless, the debate continues as to the exact cognitive processes underlying implicit sequence learning. Thus, the first goal of this study is to elucidate the underlying cognitive processes enabling sequence acquisition. We therefore compared reaction time (RT) in sequence learning in a standard manual-activated (MA) version of the task to that in an ocular-activated (OA) version, within a single experimental setting. The second goal is to use eye movement measures to compare anticipation, as an additional indication of sequence learning, between the two versions of the SRT. Performance of the group given the MA version of the task (n = 29) was compared with that of the group given the OA version (n = 30). The results showed that although, overall, RT was faster for the OA group, the rate of sequence learning was similar to that of the MA group performing the standard version of the SRT. Because the stimulus-response association is automatic and exists prior to training in the OA task, the decreased reaction time in this version of the task reflects a purer measure of the sequence learning that occurs in the SRT task. The results of this study show that eye-tracking anticipation can be measured directly and can serve as a direct measure of sequence learning. Finally, using the OA version of the SRT to study sequence learning makes a significant methodological contribution by enabling sequence-learning studies among populations that struggle to perform manual responses.
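Sequence learning in SRT studies is commonly quantified as the RT cost of switching from the trained sequence to a random transfer block. The sketch below computes such a learning index from hypothetical per-trial RTs; the scoring rule and numbers are illustrative, not taken from this study.

```python
def srt_learning_index(sequence_rts, random_rts):
    """Sequence-learning index: mean RT in the random (transfer) block
    minus mean RT in the preceding sequence block. Positive values
    indicate sequence-specific learning (illustrative scoring sketch)."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(random_rts) - mean(sequence_rts)

# Hypothetical per-trial RTs (ms) for one participant:
seq_block = [412, 398, 405, 390, 401]
rand_block = [455, 470, 448, 462, 451]
print(srt_learning_index(seq_block, rand_block))  # a positive index, ~56 ms
```

An analogous index can be computed from anticipatory eye movements in the OA version, where fixating the next target location before it appears plays the role of a fast "response."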
Identification of nonclassical properties of light with multiplexing layouts
NASA Astrophysics Data System (ADS)
Sperling, J.; Eckstein, A.; Clements, W. R.; Moore, M.; Renema, J. J.; Kolthammer, W. S.; Nam, S. W.; Lita, A.; Gerrits, T.; Walmsley, I. A.; Agarwal, G. S.; Vogel, W.
2017-07-01
In Sperling et al. [Phys. Rev. Lett. 118, 163602 (2017), 10.1103/PhysRevLett.118.163602], we introduced and applied a detector-independent method to uncover nonclassicality. Here, we extend those techniques and give more details on the performed analysis. We derive a general theory of the positive-operator-valued measure that describes multiplexing layouts with arbitrary detectors. From the resulting quantum version of a multinomial statistics, we infer nonclassicality probes based on a matrix of normally ordered moments. We discuss these criteria and apply the theory to our data which are measured with superconducting transition-edge sensors. Our experiment produces heralded multiphoton states from a parametric down-conversion light source. We show that the known notions of sub-Poisson and sub-binomial light can be deduced from our general approach, and we establish the concept of sub-multinomial light, which is shown to outperform the former two concepts of nonclassicality for our data.
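The sub-binomial criterion mentioned above can be made concrete: for N multiplexed on/off detectors, classical light yields a click-number variance at or above the binomial bound, so the parameter Q_B below is negative only for nonclassical states [Sperling, Vogel, and Agarwal, Phys. Rev. Lett. 109, 093601 (2012)]. A minimal sketch with analytically constructed click distributions:

```python
from math import comb

def q_binomial(click_probs):
    """Sub-binomial parameter Q_B = N*Var(c)/(<c>*(N - <c>)) - 1 for a
    click-counting distribution over N multiplexed on/off detectors.
    click_probs[k] = P(k clicks), k = 0..N. Q_B < 0 certifies
    nonclassical (sub-binomial) light; classical states give Q_B >= 0."""
    n = len(click_probs) - 1
    mean = sum(k * p for k, p in enumerate(click_probs))
    var = sum((k - mean) ** 2 * p for k, p in enumerate(click_probs))
    return n * var / (mean * (n - mean)) - 1.0

N, p = 8, 0.3
# Coherent-like light: each detector clicks independently, so the click
# number is exactly binomial and Q_B = 0.
coherent_like = [comb(N, k) * p ** k * (1 - p) ** (N - k) for k in range(N + 1)]
print(abs(q_binomial(coherent_like)) < 1e-9)  # → True

# A heralded single photon split over N detectors (efficiency eta):
# at most one click, giving a sub-binomial distribution.
eta = 0.6
single_photon = [1 - eta, eta] + [0.0] * (N - 1)
print(q_binomial(single_photon) < 0)          # → True: nonclassical
```

The sub-multinomial criterion of the paper generalizes this idea from the click-number distribution to the full matrix of normally ordered moments across the multiplexing bins.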
Filtering as a reasoning-control strategy: An experimental assessment
NASA Technical Reports Server (NTRS)
Pollack, Martha E.
1994-01-01
In dynamic environments, optimal deliberation about what actions to perform is impossible. Instead, it is sometimes necessary to trade potential decision quality for decision timeliness. One approach to achieving this trade-off is to endow intelligent agents with meta-level strategies that provide them guidance about when to reason (and what to reason about) and when to act. We describe our investigations of a particular meta-level reasoning strategy, filtering, in which an agent commits to the goals it has already adopted, and then filters from consideration new options that would conflict with the successful completion of existing goals. To investigate the utility of filtering, a series of experiments was conducted using the Tileworld testbed. Previous experiments conducted by Kinny and Georgeff used an earlier version of the Tileworld to demonstrate the feasibility of filtering. Results are presented that replicate and extend those of Kinny and Georgeff and demonstrate some significant environmental influences on the value of filtering.
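The filtering strategy described above can be sketched in a few lines: commit to adopted goals and screen out any new option whose demands conflict with them. The conflict test here (overlapping time windows on a shared resource) is a toy stand-in, not the Tileworld implementation.

```python
def compatible(option, goals):
    """A new option passes the filter only if its resource/time demands
    do not conflict with any already-adopted goal (toy conflict test:
    overlapping time windows on the same resource)."""
    for goal in goals:
        if (option["resource"] == goal["resource"]
                and option["start"] < goal["end"]
                and goal["start"] < option["end"]):
            return False
    return True

def filter_options(options, goals):
    """Filtering as a meta-level strategy: commit to existing goals and
    drop new options that would conflict with their completion."""
    return [o for o in options if compatible(o, goals)]

goals = [{"resource": "gripper", "start": 0, "end": 5}]
options = [
    {"name": "push-tile", "resource": "gripper", "start": 3, "end": 7},  # conflicts
    {"name": "scan-grid", "resource": "camera", "start": 2, "end": 4},   # passes
]
print([o["name"] for o in filter_options(options, goals)])  # → ['scan-grid']
```

The experimental question in the paper is when this cheap screen beats full deliberation over every new option, which depends on how often the environment makes abandoned commitments worthwhile.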
Reading Aloud: Discrete Stage(s) Redux
Robidoux, Serje; Besner, Derek
2017-01-01
Interactive activation accounts of processing have had a broad and deep influence on cognitive psychology, particularly so in the context of computational accounts of reading aloud at the single word level. Here we address the issue of whether such a framework can simulate the joint effects of stimulus quality and word frequency (which have been shown to produce both additive and interactive effects depending on the context). We extend previous work on this question by considering an alternative implementation of a stimulus quality manipulation, and the role of interactive activation. Simulations with a version of the Dual Route Cascaded model (a model with interactive activation dynamics along the lexical route) demonstrate that the model is unable to simulate the entire pattern seen in human performance. We discuss how a hybrid interactive activation model that includes some context dependent staged processing could accommodate these data. PMID:28289395
NNEPEQ: Chemical equilibrium version of the Navy/NASA Engine Program
NASA Technical Reports Server (NTRS)
Fishbach, Laurence H.; Gordon, Sanford
1988-01-01
The Navy NASA Engine Program, NNEP, currently is in use at a large number of government agencies, commercial companies and universities. This computer code has been used extensively to calculate the design and off-design (matched) performance of a broad range of turbine engines, ranging from subsonic turboprops to variable cycle engines for supersonic transports. Recently, there has been increased interest in applications for which NNEP was not capable of simulating, namely, high Mach applications, alternate fuels including cryogenics, and cycles such as the gas generator air-turbo-rocket (ATR). In addition, there is interest in cycles employing ejectors such as for military fighters. New engine component models had to be created for incorporation into NNEP, and it was found necessary to include chemical dissociation effects of high temperature gases. The incorporation of these extended capabilities into NNEP is discussed and some of the effects of these changes are illustrated.
Design, Fabrication, and Testing of an Auxiliary Cooling System for Jet Engines
NASA Technical Reports Server (NTRS)
Leamy, Kevin; Griffiths, Jim; Andersen, Paul; Joco, Fidel; Laski, Mark; Balser, Jeffrey (Technical Monitor)
2001-01-01
This report summarizes the technical effort of the Active Cooling for Enhanced Performance (ACEP) program sponsored by NASA. It covers the design, fabrication, and integrated systems testing of a jet engine auxiliary cooling system, or turbocooler, that significantly extends the use of conventional jet fuel as a heat sink. The turbocooler is designed to provide subcooled cooling air to the engine exhaust nozzle system or engine hot section. The turbocooler consists of three primary components: (1) a high-temperature air cycle machine driven by engine compressor discharge air, (2) a fuel/ air heat exchanger that transfers energy from the hot air to the fuel and uses a coating to mitigate fuel deposits, and (3) a high-temperature fuel injection system. The details of the turbocooler component designs and results of the integrated systems testing are documented. Industry Version-Data and information deemed subject to Limited Rights restrictions are omitted from this document.
NNEPEQ - Chemical equilibrium version of the Navy/NASA Engine Program
NASA Technical Reports Server (NTRS)
Fishbach, L. H.; Gordon, S.
1989-01-01
The Navy NASA Engine Program, NNEP, currently is in use at a large number of government agencies, commercial companies and universities. This computer code has been used extensively to calculate the design and off-design (matched) performance of a broad range of turbine engines, ranging from subsonic turboprops to variable cycle engines for supersonic transports. Recently, there has been increased interest in applications for which NNEP was not capable of simulating, namely, high Mach applications, alternate fuels including cryogenics, and cycles such as the gas generator air-turbo-rocket (ATR). In addition, there is interest in cycles employing ejectors such as for military fighters. New engine component models had to be created for incorporation into NNEP, and it was found necessary to include chemical dissociation effects of high temperature gases. The incorporation of these extended capabilities into NNEP is discussed and some of the effects of these changes are illustrated.
Identification of nonclassical properties of light with multiplexing layouts
Sperling, J.; Eckstein, A.; Clements, W. R.; Moore, M.; Renema, J. J.; Kolthammer, W. S.; Nam, S. W.; Lita, A.; Gerrits, T.; Walmsley, I. A.; Agarwal, G. S.; Vogel, W.
2018-01-01
In Sperling et al. [Phys. Rev. Lett. 118, 163602 (2017)], we introduced and applied a detector-independent method to uncover nonclassicality. Here, we extend those techniques and give more details on the performed analysis. We derive a general theory of the positive-operator-valued measure that describes multiplexing layouts with arbitrary detectors. From the resulting quantum version of a multinomial statistics, we infer nonclassicality probes based on a matrix of normally ordered moments. We discuss these criteria and apply the theory to our data which are measured with superconducting transition-edge sensors. Our experiment produces heralded multiphoton states from a parametric down-conversion light source. We show that the known notions of sub-Poisson and sub-binomial light can be deduced from our general approach, and we establish the concept of sub-multinomial light, which is shown to outperform the former two concepts of nonclassicality for our data. PMID:29670949
Finite cover method with mortar elements for elastoplasticity problems
NASA Astrophysics Data System (ADS)
Kurumatani, M.; Terada, K.
2005-06-01
The finite cover method (FCM) is extended to elastoplasticity problems. The FCM, which was originally developed under the name of the manifold method, has recently been recognized as one of the generalized versions of the finite element method (FEM). Since the mesh for the FCM can be regular and square regardless of the geometry of the structures to be analyzed, structural analysts are released from the burdensome task of generating meshes conforming to physical boundaries. Numerical experiments are carried out to assess the performance of the FCM with such discretization in elastoplasticity problems. In particular, to achieve this accurately, so-called mortar elements are introduced to impose displacement boundary conditions on the essential boundaries, and displacement compatibility conditions on material interfaces of two-phase materials or on joint surfaces between mutually incompatible meshes. The validity of the mortar approximation is also demonstrated in the elastic-plastic FCM.
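The mortar-style imposition of a compatibility condition can be illustrated in one dimension: two independently meshed spring subdomains glued by the constraint u1 = u2, enforced with a Lagrange multiplier in a saddle-point (KKT) system. All numbers are illustrative; a real mortar method weights the constraint over an interface, but the algebraic structure is the same.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting (dense, pure Python)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Two 1D spring subdomains with independent meshes, glued by the
# constraint u1 = u2 through a Lagrange multiplier L (the 1D analogue
# of a mortar tying condition; illustrative stiffnesses and load).
k1, k2, P = 2.0, 4.0, 8.0
# Unknowns: [u1, u2, u3, L]; saddle-point (KKT) system:
A = [[k1, 0.0, 0.0, 1.0],    # k1*u1 + L      = 0
     [0.0, k2, -k2, -1.0],   # k2*(u2-u3) - L = 0
     [0.0, -k2, k2, 0.0],    # k2*(u3-u2)     = P
     [1.0, -1.0, 0.0, 0.0]]  # u1 - u2        = 0  (mortar constraint)
rhs = [0.0, 0.0, P, 0.0]
u1, u2, u3, lam = solve(A, rhs)
print(u1, u2, u3)  # u1 == u2 (interface compatibility), u3 = u2 + P/k2
```

The zero block on the multiplier diagonal is what makes these systems saddle-point problems; in the FCM context the same structure arises whether the constraint glues incompatible meshes or imposes essential boundary conditions on non-conforming covers.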
Semi-active suspension for automotive application
NASA Astrophysics Data System (ADS)
Venhovens, Paul J. T.; Devlugt, Alex R.
The theoretical considerations for semi-active damping system evaluation, with respect to semi-active suspension and Kalman filtering, are discussed in terms of the software. Some prototype hardware developments are proposed. A significant improvement in ride comfort performance can be obtained, as indicated by root mean square body acceleration values and frequency responses, using a switchable damper system with two settings. Nevertheless, the improvement is accompanied by an increase in dynamic tire load variations. The main benefit of semi-active suspensions is the potential for changing the low-frequency section of the transfer function. In practice this will support the impression of extra driving stability. It is advisable to apply an adaptive control strategy, like the (extended) skyhook version, that switches to the 'comfort' setting for straight running on smooth or moderately rough roads and to the 'road holding' setting for handling maneuvers and possibly rough roads and discrete, severe events like potholes.
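The two-setting switchable damper described above follows the classic two-state skyhook law: select the firm setting when body velocity and relative (damper) velocity share a sign, so the damper can extract energy from body motion, and the soft setting otherwise. A minimal sketch with illustrative coefficients (not from the paper):

```python
def skyhook_damping(v_body, v_rel, c_high, c_low):
    """Two-state skyhook switching law for a semi-active damper: use the
    firm setting when body velocity and relative velocity point the
    same way (the damper then opposes body motion), soft otherwise."""
    return c_high if v_body * v_rel > 0.0 else c_low

def damper_force(v_body, v_wheel, c_high=3000.0, c_low=800.0):
    """Damper force (N) for given body and wheel vertical velocities
    (m/s); the coefficients are illustrative, not from the paper."""
    v_rel = v_body - v_wheel
    return skyhook_damping(v_body, v_rel, c_high, c_low) * v_rel

print(round(damper_force(0.5, 0.1)))  # → 1200 (firm setting engaged)
print(round(damper_force(0.5, 0.9)))  # → -320 (soft setting engaged)
```

An adaptive strategy of the kind the abstract recommends would, in addition, bias the switching thresholds or coefficients toward 'comfort' or 'road holding' depending on the detected road and maneuver conditions.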
5D Tempest simulations of kinetic edge turbulence
NASA Astrophysics Data System (ADS)
Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.; Umansky, M. V.; Qin, H.
2006-10-01
Results are presented from the development and application of TEMPEST, a nonlinear five-dimensional (3d2v) gyrokinetic continuum code. The simulation results and theoretical analysis include studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry and its relationship to plasma flow generation with zero external momentum input, including the important orbit-squeezing effect due to the large electric-field flow shear in the edge. In order to extend the code to 5D, we have formulated a set of fully nonlinear electrostatic gyrokinetic equations and a fully nonlinear gyrokinetic Poisson's equation which is valid for both neoclassical and turbulence simulations. Our 5D gyrokinetic code is built on the 4D version of the Tempest neoclassical code, extended with a fifth dimension in the binormal direction. The code is able to simulate either a full torus or a toroidal segment. Progress on performing 5D turbulence simulations will be reported.
Bauer, A S; Timpe, J; Edmonds, E C; Bechara, A; Tranel, D; Denburg, N L
2013-02-01
It has been shown that older adults perform less well than younger adults on the Iowa Gambling Task (IGT), a real-world type decision-making task that factors together reward, punishment, and uncertainty. To explore the reasons behind this age-related decrement, we administered to an adult life span sample of 265 healthy participants (Mdn age = 62.00 +/- 16.17 years; range [23-88]) 2 versions of the IGT, which have different contingencies for successful performance: A'B'C'D' requires choosing lower immediate reward (paired with lower delayed punishment); E'F'G'H' requires choosing higher immediate punishment (paired with higher delayed reward). There was a significant negative correlation between age and performance on the A'B'C'D' version of the IGT (r = -.16, p = .01), while there was essentially no correlation between age and performance on the E'F'G'H' version (r = -.07, p = .24). In addition, the rate of impaired performance in older participants was significantly higher for the A'B'C'D' version (23%) compared with the E'F'G'H' version (13%). A parsimonious account of these findings is an age-related increase in hypersensitivity to reward, whereby the decisions of older adults are disproportionately influenced by prospects of receiving reward, irrespective of the presence or degree of punishment. PsycINFO Database Record (c) 2013 APA, all rights reserved.
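IGT performance is conventionally summarized by per-block net scores, advantageous minus disadvantageous selections; the sketch below computes them for a hypothetical pick record. The deck labels and block size follow the common convention (C and D advantageous in the A'B'C'D' contingencies), not data from this study.

```python
def igt_net_scores(picks, block_size=20, advantageous=("C", "D")):
    """Per-block IGT net scores: (advantageous - disadvantageous) picks
    in consecutive blocks. Positive scores indicate learning to favor
    decks with the better long-run payoff."""
    scores = []
    for start in range(0, len(picks), block_size):
        block = picks[start:start + block_size]
        good = sum(1 for d in block if d in advantageous)
        scores.append(good - (len(block) - good))
    return scores

# Hypothetical 40-trial record: mostly bad decks early, good decks late.
picks = ["A", "B"] * 8 + ["C"] * 4 + ["C", "D"] * 9 + ["A", "B"]
print(igt_net_scores(picks))  # → [-12, 16]
```

The age effect reported above would show up as flatter net-score trajectories on the A'B'C'D' version for older participants, with the E'F'G'H' version (advantageous decks defined by its reversed contingencies) largely spared.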
[Fetal version as ambulatory intervention].
Nohe, G; Hartmann, W; Klapproth, C E
1996-06-01
The external cephalic version (ECV) of the fetus at term reduces the maternal and fetal risks of intrapartum breech presentation and Caesarean delivery. Since 1986, over 800 external cephalic versions have been performed in the outpatient Department of Obstetrics and Gynaecology of the Städtische Frauenklinik Stuttgart; 60.5% were successful. No severe complications occurred. Sufficient amniotic fluid and the mobility of the fetal breech are major criteria for the success of the ECV. Management requires a technique that is safe for mother and fetus. This includes ultrasonography, electronic fetal monitoring, and the ability to perform immediate caesarean delivery, as well as the performance of ECV without analgesics and sedatives. More than 70% of the ECVs were successful without tocolysis. In unsuccessful cases, the additional use of tocolysis improves the success rate only slightly. Therefore, routine use of tocolysis does not appear necessary. External cephalic version can be recommended as an outpatient treatment without tocolysis.
NASA Technical Reports Server (NTRS)
Womble, M. E.; Potter, J. E.
1975-01-01
A prefiltering version of the Kalman filter is derived for both discrete and continuous measurements. The derivation consists of determining a single discrete measurement that is equivalent to either a time segment of continuous measurements or a set of discrete measurements. This prefiltering version of the Kalman filter easily handles numerical problems associated with rapid transients and ill-conditioned Riccati matrices. Therefore, the derived technique for extrapolating the Riccati matrix from one time to the next constitutes a new set of integration formulas which alleviate ill-conditioning problems associated with continuous Riccati equations. Furthermore, since a time segment of continuous measurements is converted into a single discrete measurement, Potter's square root formulas can be used to update the state estimate and its error covariance matrix. Therefore, if having the state estimate and its error covariance matrix at discrete times is acceptable, the prefilter extends square root filtering with all its advantages, to continuous measurement problems.
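The core idea of the prefilter, many measurements in, one equivalent measurement out, is easy to show in the scalar case: information-weighted fusion of a batch of measurements followed by a single Kalman update gives the same posterior as processing them one at a time. A minimal sketch (not the paper's continuous-time derivation):

```python
def fuse_measurements(zs, rs):
    """Collapse a set of scalar measurements z_i = x + v_i, with noise
    variances r_i, into one equivalent measurement: the information-
    weighted average, with the combined (smaller) variance."""
    info = sum(1.0 / r for r in rs)
    z_eq = sum(z / r for z, r in zip(zs, rs)) / info
    return z_eq, 1.0 / info

def kalman_update(x, p, z, r):
    """Standard scalar Kalman measurement update."""
    k = p / (p + r)
    return x + k * (z - x), (1.0 - k) * p

# Three raw measurements fused, then a single filter update:
z_eq, r_eq = fuse_measurements([9.8, 10.2, 10.0], [1.0, 1.0, 0.5])
x, p = kalman_update(x=0.0, p=100.0, z=z_eq, r=r_eq)

# Equivalent to processing the three measurements one at a time:
x2, p2 = 0.0, 100.0
for z, r in zip([9.8, 10.2, 10.0], [1.0, 1.0, 0.5]):
    x2, p2 = kalman_update(x2, p2, z, r)
print(abs(x - x2) < 1e-9, abs(p - p2) < 1e-9)  # → True True
```

The payoff in the paper's setting is numerical: the single equivalent measurement can then be applied with Potter's square root update, extending square root filtering to continuous measurement streams.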
Enhancements to the IBM version of COSMIC/NASTRAN
NASA Technical Reports Server (NTRS)
Brown, W. Keith
1989-01-01
Major improvements were made to the IBM version of COSMIC/NASTRAN by RPK Corporation under contract to IBM Corporation. These improvements will become part of COSMIC's IBM version and will be available in the second quarter of 1989. The first improvement is the inclusion of code to take advantage of IBM's new Vector Facility (VF) on its 3090 machines. The remaining improvements are modifications that will benefit all users as a result of the extended addressing capability provided by the MVS/XA operating system. These improvements include the availability of an in-memory data base that potentially eliminates the need for I/O to the PRIxx disk files. Another improvement is the elimination of the multiple load modules that had to be loaded for every link switch within NASTRAN. The last improvement allows NASTRAN to execute above the 16-megabyte line, giving it access to 2 gigabytes of memory for open core and the in-memory data base.
Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.
2000-01-01
This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims to describe the translation work and the difficulties encountered in delivering a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both the computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, a common methodology for developing non-English versions of SNOMED is suggested. PMID:11079973
Adaptive partially hidden Markov models with application to bilevel image coding.
Forchhammer, S; Rasmussen, T S
1999-01-01
Partially hidden Markov models (PHMMs) have previously been introduced. The transition and emission/output probabilities from hidden states, as known from the HMMs, are conditioned on the past. This way, the HMM may be applied to images introducing the dependencies of the second dimension by conditioning. In this paper, the PHMM is extended to multiple sequences with a multiple token version and adaptive versions of PHMM coding are presented. The different versions of the PHMM are applied to lossless bilevel image coding. To reduce and optimize the model cost and size, the contexts are organized in trees and effective quantization of the parameters is introduced. The new coding methods achieve results that are better than the JBIG standard on selected test images, although at the cost of increased complexity. By the minimum description length principle, the methods presented for optimizing the code length may apply as guidance for training (P)HMMs for, e.g., segmentation or recognition purposes. Thereby, the PHMM models provide a new approach to image modeling.
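The conditioning idea can be illustrated with a toy adaptive context model for bilevel images: each pixel is predicted from its causal neighbours (left and above) using Laplace-smoothed counts, and the ideal code length is accumulated. This is a minimal stand-in for context-conditioned coding, not the PHMM itself; all names are ours:

```python
import math

def adaptive_code_length(image):
    """Ideal code length (bits) of a bilevel image under an adaptive
    model conditioned on the causal (left, above) neighbours."""
    counts = {}                      # context -> (zeros seen, ones seen)
    bits = 0.0
    for r, row in enumerate(image):
        for c, pixel in enumerate(row):
            ctx = (row[c - 1] if c else 0,
                   image[r - 1][c] if r else 0)
            n0, n1 = counts.get(ctx, (0, 0))
            p1 = (n1 + 1) / (n0 + n1 + 2)   # Laplace estimate of P(1 | ctx)
            bits -= math.log2(p1 if pixel else 1.0 - p1)
            counts[ctx] = (n0, n1 + 1) if pixel else (n0 + 1, n1)
    return bits
```

On structured images the adaptive contexts quickly become near-deterministic, so the code length falls far below the one bit per pixel of a memoryless model.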
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House.
This report from the United States House of Representatives presents the complete amended version of the House bill to extend for 5 years the authorizations of appropriations for the programs under the Elementary and Secondary Education Act. The current reauthorization bill is known as the "Improving America's Schools Act." The first…
ERIC Educational Resources Information Center
Laner, S.; And Others
This report is a critical evaluation based on extended field trials and theoretical analysis of the time-span technique of measuring level of work in organizational hierarchies. It is broadly concluded that the technique does possess many of the desirable features claimed by its originator, but that earlier, less highly structured versions based…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-21
... proposing to amend its fees schedule as it relates to PULSe workstations. The text of the proposed rule... is to introduce fees for an on-floor version of the PULSe workstation and to extend the waiver of the PULSe Routing Intermediary fee. By way of background, the PULSe workstation is a front-end order entry...
A revised managers handbook for red pine in the North Central Region
Daniel W. Gilmore; Brian J. Palik
2006-01-01
This new version of the Red Pine Managers Guide gathers up-to-date information from many disciplines to address a wide range of red pine management issues. It provides guidance on managing red pine on extended rotations with a focus on landscape-scale objectives along with the traditional forest management tools focusing on production silviculture. The insect and...
Lee, Hang Wai; Chan, Albert S C; Kwong, Fuk Yee
2007-07-07
A rhodium-(S)-xyl-BINAP complex-catalyzed tandem formate decarbonylation and [2 + 2 + 1] carbonylative cyclization is described; this cooperative process utilizes formate as a condensed CO source, and the newly developed cascade protocol can be extended to its enantioselective version, providing up to 94% ee of the cyclopentenone adducts.
ERIC Educational Resources Information Center
Cai, Li
2013-01-01
Lord and Wingersky's (1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined…
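At a single quadrature point the recursion is straightforward: given each item's correct-response probability, the summed-score distribution is built up one item at a time. A minimal sketch for dichotomous items (our naming, not Cai's code):

```python
def summed_score_distribution(p_correct):
    """Lord-Wingersky recursion: probability of each summed score,
    given per-item correct-response probabilities at a fixed theta."""
    dist = [1.0]                       # score 0 before any item is added
    for p in p_correct:
        new = [0.0] * (len(dist) + 1)
        for score, mass in enumerate(dist):
            new[score] += mass * (1.0 - p)     # item answered incorrectly
            new[score + 1] += mass * p         # item answered correctly
        dist = new
    return dist
```

Marginalizing these conditional distributions over the quadrature weights yields the summed-score likelihoods and posteriors used in scoring.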
ERIC Educational Resources Information Center
Gaudino, James L.; Harris, Allen C.
To extend the resistance-to-persuasion literature into a context relevant to public relations, a study examined the reactions of 147 Michigan State University students after viewing edited versions of President Ronald Reagan's televised address of February 26, 1986. Reagan's address, concerning his request for public support of an increase in…
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.
2015-01-01
The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…
Anthro-Centric Multisensory Interfaces for Sensory Augmentation of Telesurgery
2011-06-01
compares favorably to standing astride an operating table using laparoscopic instruments, the most favorable ergonomics would facilitate free movement...either through direct contact with the tissues or indirect contact via rigid laparoscopic instruments), opportunities now exist to utilize other...tele-surgical methods. Laparoscopic instruments were initially developed as extended versions of their counterparts used in open procedures (e.g
The Development of Bilingual Children's Early Spelling in English
ERIC Educational Resources Information Center
Liow, Susan J. Rickard; Lau, Lily H.-S.
2006-01-01
By using an extended version of R. Treiman, M. Cassar, and A. Zukowski's (1994) flaps spelling task (wa_er, is it t or d in water?), the authors investigated the metalinguistic awareness of 6-year-old bilingual children from 3 different language backgrounds (LBs): English-LB (English-L1, Mandarin-L2), Chinese-LB (Mandarin-L1, English L2), and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... reporting year and thereafter, while the current reporting, OMB Approval Number 0985-0008, will be extended to the end of the FY 2010 reporting cycle. The proposed FY 2011 version may be found on the AoA Web site link entitled Draft State Reporting Tool for Review available at http://www.aoa.gov/AoARoot...
ERIC Educational Resources Information Center
Zucker, Marla; Spinazzola, Joseph; Pollack, Amie Alley; Pepe, Lauren; Barry, Stephanie; Zhang, Lynda; van der Kolk, Bessel
2010-01-01
This study replicated and extended our previous evaluation of Urban Improv (UI), a theater-based youth violence prevention (YVP) program developed for urban youth. It assessed the replicability of positive program impacts when implemented by nonprogram originators, as well as the utility of a comprehensive version of the UI program that included a…
NASA Glenn Steady-State Heat Pipe Code Users Manual, DOS Input. Version 2
NASA Technical Reports Server (NTRS)
Tower, Leonard K.
2000-01-01
The heat pipe code LERCHP has been revised, corrected, and extended. New features include provisions for pipes with curvature and bends in "G" fields. Heat pipe limits are examined in detail and limit envelopes are shown for some sodium and lithium-filled heat pipes. Refluxing heat pipes and gas-loaded or variable conductance heat pipes were not considered.
ERIC Educational Resources Information Center
Warwick, Paul; Shaw, Stuart; Johnson, Martin
2015-01-01
The Assessment for Learning in International Contexts (ALIC) project sought to extend knowledge around teachers' understandings of Assessment for Learning (AfL). Using a modified version of a survey item devised by James and Pedder for use with teachers in England, evidence was gathered about the assessment practices that were highly valued by…
ERIC Educational Resources Information Center
Heagle, Amie I.; Rehfeldt, Ruth Anne
2006-01-01
Perspective-taking is an ability that requires a child to emit a selection response of informational states in himself or herself and in others. This study used an extended version of the Barnes-Holmes protocol developed in a series of studies by McHugh, Barnes-Holmes, and Barnes-Holmes (2004) to teach typically developing children between the…
Convergence acceleration of viscous flow computations
NASA Technical Reports Server (NTRS)
Johnson, G. M.
1982-01-01
A multiple-grid convergence acceleration technique introduced for application to the solution of the Euler equations by means of Lax-Wendroff algorithms is extended to treat compressible viscous flow. Computational results are presented for the solution of the thin-layer version of the Navier-Stokes equations using the explicit MacCormack algorithm, accelerated by a convective coarse-grid scheme. Extensions and generalizations are mentioned.
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Background Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. Conclusions CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. 
Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343
Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo
2018-06-25
The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows extending peeling-based models to include many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate, we create an extended five-gene version of BRCAPRO, a widely used model for estimating the carrier probabilities of BRCA1 and BRCA2 risk alleles, and assess its computational properties. © 2018 WILEY PERIODICALS, INC.
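For intuition, the peeling step can be sketched on a minimal pedigree, two unobserved parents and one child, at a single biallelic locus. This toy collapses the full Elston-Stewart machinery to a single sum over parental genotypes; the allele frequency, penetrances, and names are illustrative, not BRCAPRO's parameters:

```python
from itertools import product

def peel_trio(q, child_affected, penetrance):
    """Posterior over the child's genotype (risk-allele count 0/1/2)
    given only the child's phenotype, peeling out the two unobserved
    parents under Hardy-Weinberg priors and Mendelian transmission."""
    p = 1.0 - q
    prior = {0: p * p, 1: 2 * p * q, 2: q * q}
    posterior = {0: 0.0, 1: 0.0, 2: 0.0}
    for gm, gf in product(prior, repeat=2):
        tm, tf = gm / 2.0, gf / 2.0        # P(parent transmits risk allele)
        trans = {0: (1 - tm) * (1 - tf),
                 1: tm * (1 - tf) + (1 - tm) * tf,
                 2: tm * tf}
        for gc, pt in trans.items():
            lik = penetrance[gc] if child_affected else 1.0 - penetrance[gc]
            posterior[gc] += prior[gm] * prior[gf] * pt * lik
    z = sum(posterior.values())
    return {g: v / z for g, v in posterior.items()}
```

The exponential blow-up the paper addresses appears when this sum must range over joint genotypes at many loci at once, which is what the paring approximation truncates.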
Shohaimi, Shamarina; Yoke Wei, Wong; Mohd Shariff, Zalilah
2014-01-01
The Comprehensive Feeding Practices Questionnaire (CFPQ) is an instrument specifically developed to evaluate parental feeding practices. It has been validated among children in America and applied to populations in France, Norway, and New Zealand. In order to extend the application of the CFPQ, we conducted a factor structure validation of the translated version of the CFPQ (CFPQ-M) using confirmatory factor analysis among mothers of primary school children (N = 397) in Malaysia. Several items were modified for cultural adaptation. Of 49 items, 39 with factor loadings >0.40 were retained in the final model. The confirmatory factor analysis revealed that the final model (a twelve-factor model with 39 items and 2 error covariances) displayed the best fit for our sample (Chi-square = 1147; df = 634; P < 0.05; CFI = 0.900; RMSEA = 0.045; SRMR = 0.0058). The instrument, with some modifications, was validated among mothers of school children in Malaysia. The present study extends the usability of the CFPQ and enables researchers and parents to better understand the relationships between parental feeding practices and related problems such as childhood obesity. PMID:25538958
NASA Astrophysics Data System (ADS)
Eliseev, A. V.; Mokhov, I. I.; Chernokulsky, A. V.
2017-01-01
A module for simulating natural fires (NFs) in the climate model of the A.M. Obukhov Institute of Atmospheric Physics, Russian Academy of Sciences (IAP RAS CM), is extended with respect to the influence of lightning activity and population density on ignition frequency and fire suppression. The IAP RAS CM is used to perform numerical experiments in accordance with the conditions of the climate model intercomparison project CMIP5 (Coupled Model Intercomparison Project, phase 5). The frequency of lightning flashes was assigned in accordance with the LIS/OTD satellite data. In the calculations performed, anthropogenic ignitions play an important role in NF occurrences, except for regions at subpolar latitudes and, to a lesser degree, tropical and subtropical regions. Taking into account the dependence of fire frequency on lightning activity and population density intensifies the influence of natural-fire characteristics on climate change in the tropics and subtropics, as compared to the version of the IAP RAS CM that does not take the influence of ignition sources on the large-scale characteristics of NFs into consideration.
Lattice Boltzmann Simulation Optimization on Leading Multicore Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Samuel; Carter, Jonathan; Oliker, Leonid
2008-02-01
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Clovertown, AMD Opteron X2, Sun Niagara2, STI Cell, as well as the single-core Intel Itanium2. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 14x improvement compared with the original code. Additionally, we present a detailed analysis of each optimization, revealing surprising hardware bottlenecks and software challenges for future multicore systems and applications.
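The search-based tuning loop itself is simple to sketch: time each candidate parameterization of a kernel and keep the fastest. The kernel and parameter names below are illustrative, not the LBMHD code generator:

```python
import time

def timed_run(kernel, data, params):
    """Wall-clock time of one kernel invocation."""
    start = time.perf_counter()
    kernel(data, **params)
    return time.perf_counter() - start

def autotune(kernel, data, candidates, repeats=3):
    """Empirical search: time kernel(data, **params) for each candidate
    parameter set and return the fastest configuration."""
    best_params, best_time = None, float("inf")
    for params in candidates:
        t = min(timed_run(kernel, data, params) for _ in range(repeats))
        if t < best_time:
            best_params, best_time = params, t
    return best_params

def blocked_sum(data, block=64):
    """Toy tunable kernel: sum a list in blocks of a given size."""
    return sum(sum(data[i:i + block]) for i in range(0, len(data), block))
```

A real auto-tuner generates the candidate variants (loop orderings, blockings, SIMD-ization) rather than enumerating a fixed list, but the measure-and-select loop is the same.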
Accelerated Dimension-Independent Adaptive Metropolis
Chen, Yuxin; Keyes, David E.; Law, Kody J.; ...
2016-10-27
This work describes improvements from algorithmic and architectural means to black-box Bayesian inference over high-dimensional parameter spaces. The well-known adaptive Metropolis (AM) algorithm [33] is extended herein to scale asymptotically uniformly with respect to the underlying parameter dimension for Gaussian targets, by respecting the variance of the target. The resulting algorithm, referred to as the dimension-independent adaptive Metropolis (DIAM) algorithm, also shows improved performance with respect to adaptive Metropolis on non-Gaussian targets. This algorithm is further improved, and the possibility of probing high-dimensional (dimension d ≥ 1000) targets is enabled, via GPU-accelerated numerical libraries and periodically synchronized concurrent chains (justified a posteriori). Asymptotically in dimension, this GPU implementation exhibits a factor of four improvement versus a competitive CPU-based Intel MKL parallel version alone. Strong scaling to concurrent chains is exhibited, through a combination of longer time per sample batch (weak scaling) and yet fewer necessary samples to convergence. The algorithm performance is illustrated on several Gaussian and non-Gaussian target examples, in which the dimension may be in excess of one thousand.
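The baseline being extended, Haario-style adaptive Metropolis, can be sketched in one dimension: the proposal variance tracks the running empirical variance of the chain, scaled by the classic 2.38²/d factor. This is the plain AM baseline, not the DIAM variant:

```python
import math
import random

def adaptive_metropolis(log_target, x0, n_steps, eps=1e-6, seed=0):
    """1-D adaptive Metropolis: Gaussian random walk whose proposal
    variance adapts to the running variance of the chain (Welford)."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    chain = [x]
    mean, m2 = x, 0.0               # running mean / sum of squared deviations
    for t in range(1, n_steps + 1):
        if t < 50:
            scale = 1.0             # short non-adaptive warm-up
        else:
            scale = math.sqrt(2.38 ** 2 * (m2 / t) + eps)   # d = 1
        y = x + rng.gauss(0.0, scale)
        ly = log_target(y)
        if math.log(rng.random()) < ly - lp:
            x, lp = y, ly           # accept
        chain.append(x)
        delta = x - mean            # Welford update of mean and variance
        mean += delta / (t + 1)
        m2 += delta * (x - mean)
    return chain
```

DIAM's contribution is to make this adaptation scale uniformly with dimension for Gaussian targets; the 1-D sketch only shows the adaptation mechanism itself.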
A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.
Read, S; Bath, P A; Willett, P; Maheswaran, R
2013-08-30
The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
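The Gumbel technique itself is easy to sketch: fit a Gumbel distribution to the Monte Carlo replicates of the maximum scan statistic by the method of moments, then read the p-value from its upper tail. This is a generic sketch; SaTScan's internals may differ:

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_p_value(observed, null_maxima):
    """Upper-tail p-value for an observed scan statistic, using a
    Gumbel distribution moment-fitted to simulated null maxima."""
    n = len(null_maxima)
    mean = sum(null_maxima) / n
    var = sum((x - mean) ** 2 for x in null_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi       # scale parameter
    mu = mean - EULER_GAMMA * beta              # location parameter
    return 1.0 - math.exp(-math.exp(-(observed - mu) / beta))
```

Because the fitted tail is continuous, this yields p-values below the 1/(R+1) floor of the rank-based Monte Carlo test, which is why the authors argue it is preferable for very small p-values, even though the Bernoulli statistic itself is discrete.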
Gozzi, Marta; Cherubini, Paolo; Papagno, Costanza; Bricolo, Emanuela
2011-05-01
Previous studies found mixed results concerning the role of working memory (WM) in the gambling task (GT). Here, we aimed at reconciling inconsistencies by showing that the standard version of the task can be solved using intuitive strategies operating automatically, while more complex versions require analytic strategies drawing on executive functions. In Study 1, where good performance on the GT could be achieved using intuitive strategies, participants performed well both with and without a concurrent WM load. In Study 2, where analytical strategies were required to solve a more complex version of the GT, participants without WM load performed well, while participants with WM load performed poorly. In Study 3, where the complexity of the GT was further increased, participants in both conditions performed poorly. In addition to the standard performance measure, we used participants' subjective expected utility, showing that it differs from the standard measure in some important aspects.
Gagne, Joshua J; Polinski, Jennifer M; Jiang, Wenlei; Dutcher, Sarah K; Xie, Jing; Lii, Joyce; Fulchino, Lisa A; Kesselheim, Aaron S
2016-08-01
US Food and Drug Administration approval for generic drugs relies on demonstrating pharmaceutical equivalence and bioequivalence; however, some drug products have unique attributes that necessitate product-specific approval pathways. We evaluated rates of patients' switching back to brand-name versions from generic versions of four drugs approved via such approaches. We used data from the Optum LifeSciences Research Database to identify patients using a brand-name version of a study drug (acarbose tablets, salmon calcitonin nasal spray, enoxaparin sodium injection, and venlafaxine extended release tablets) or a control drug. We followed patients to identify switching to generic versions and then followed those who switched to identify whether they switched back to brand-name versions. We calculated switch and switch-back rates and used Kaplan-Meier and log-rank tests to compare rates between study and control drugs. Our cohort included 201 959 eligible patients. Brand-to-generic switch rates ranged from 66 to 106 switches per 100 person-years for study drugs and 80 to 110 for control drugs. Rates of switch-back to brand-name versions ranged from 5 to 37 among study drugs and 3 to 53 among control drugs. Switch-back rates were higher for venlafaxine vs. sertraline (p < 0.01) and calcitonin vs. alendronate (p = 0.01). Switch-back rates were lower for venlafaxine vs. paroxetine (p < 0.01) and acarbose vs. nateglinide (p < 0.01). Rates were similar for acarbose vs. glimepiride (p = 0.97) and for enoxaparin vs. fondaparinux (p = 0.11). As compared to control drugs, patients were not more likely to systematically switch back from generic to brand-name versions of the four study drugs. Copyright © 2016 John Wiley & Sons, Ltd.
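The time-to-switch-back comparison rests on the Kaplan-Meier estimator, which can be sketched directly. This is the generic survival estimate, not the study's analysis code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: (time, S(t)) at each distinct event
    time. events[i] is True if subject i had the event (e.g. switched
    back to the brand-name version) and False if censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n_t = 0                       # events and subjects at time t
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]
            n_t += 1
            i += 1
        if d:
            surv *= 1.0 - d / at_risk     # multiply in the survival factor
            curve.append((t, surv))
        at_risk -= n_t                    # events and censorings leave the risk set
    return curve
```

The log-rank test then compares two such curves by pooling the at-risk counts at each event time.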
High Performance Analytics with the R3-Cache
NASA Astrophysics Data System (ADS)
Eavis, Todd; Sayeed, Ruhan
Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. The R3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.
Conservation Reasoning Ability and Performance on BSCS Blue Version Examinations
ERIC Educational Resources Information Center
Lawson, Anton E.; Nordland, Floyd H.
1977-01-01
Twenty-three high school biology students using the Biological Sciences Curriculum Study (BSCS) Blue Textbook were administered a weight conservation and two volume conservation tasks. A majority performed below the formal-operational level, indicating that these students would be likely to encounter difficulty with BSCS Blue Version materials. (MLH)
Stelmokas, Julija; Yassay, Lance; Giordani, Bruno; Dodge, Hiroko H.; Dinov, Ivo D.; Bhaumik, Arijit; Sathian, K.; Hampstead, Benjamin M.
2018-01-01
NeuroQuant (NQ) is a fully-automated program that overcomes several existing limitations in the clinical translation of MRI-derived volumetry. The current study characterized differences between the original (NQ1) and an updated NQ version (NQ2) by (i) replicating previously identified relationships between neuropsychological test performance and medial temporal lobe volumes, (ii) evaluating the level of agreement between NQ versions, and (iii) determining if the addition of NQ2 age-/sex-based z-scores hold greater clinical utility for prediction of memory impairment than standard percent of intracranial volume (%ICV) values. Sixty-seven healthy older adults and 65 MCI patients underwent structural MRI and completed cognitive testing, including the Immediate and Delayed Memory indices from the RBANS. Results generally replicated previous relationships between key medial temporal lobe regions and memory test performance, though comparison of NQ regions revealed statistically different values that were biased toward one version or the other depending on the region. NQ2 hippocampal z-scores explained additional variance in memory performance relative to %ICV values. Findings indicate that NQ1/2 medial temporal lobe volumes, especially age- and sex-based z-scores, hold clinical value, though caution is warranted when directly comparing volumes across NQ versions. PMID:29060939
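The two normalizations compared in the study can be expressed directly. The normative values in any real use would come from NeuroQuant's proprietary age- and sex-stratified reference data; the numbers below are placeholders:

```python
def percent_icv(volume_cm3, icv_cm3):
    """Regional volume expressed as a percentage of intracranial volume."""
    return 100.0 * volume_cm3 / icv_cm3

def normative_z(volume_cm3, norm_mean, norm_sd):
    """Age- and sex-based z-score against a normative mean and SD."""
    return (volume_cm3 - norm_mean) / norm_sd
```

The study's finding is that the z-score form, which conditions on age and sex, explains additional variance in memory performance beyond the %ICV form.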
Televised Dance: Evaluation of Three Approaches.
ERIC Educational Resources Information Center
Oglesbee, Frank W.
A study was conducted to determine whether dance-trained, television-trained, and regular television viewing audiences would evaluate different approaches to televising dance differently. Three versions of a dance performance were videotaped: (1) version A, a one-camera, one-shot recording; (2) version B, a two-camera, real-time-edited approach,…
Evolution of the Data Access Protocol in Response to Community Needs
NASA Astrophysics Data System (ADS)
Gallagher, J.; Caron, J. L.; Davis, E.; Fulker, D.; Heimbigner, D.; Holloway, D.; Howe, B.; Moe, S.; Potter, N.
2012-12-01
Under the aegis of the OPULS (OPeNDAP-Unidata Linked Servers) Project, funded by NOAA, version 2 of OPeNDAP's Data Access Protocol (DAP2) is being updated to version 4. DAP4 is the first major upgrade in almost two decades and will embody three main areas of advancement. First, the data-model extensions developed by the OPULS team focus on three areas: better support for coverages, access to HDF5 files, and access to relational databases. DAP2 support for coverages (defined as sampled functions) was limited to simple rectangular coverages that work well for (some) model outputs and processed satellite data but that cannot represent trajectories or satellite swath data, for example. We have extended the coverage concept in DAP4 to remove these limitations. These changes are informed by work at Unidata on the Common Data Model and also by the OGC's abstract coverages specification. In a similar vein, we have extended DAP2's support for relations by including the concept of foreign keys, so that tables can be explicitly related to one another. Second, the web interfaces - web services - that provide access to data via DAP will be more clearly defined and will use other, orthogonal standards where they are appropriate. An important case is the XML interface, which provides a cleaner way to build other response media types such as JSON and RDF (for metadata) and to build support for Atom, thus simplifying the integration of DAP servers with tools that support OpenSearch. Input from the ESIP federation and work performed with IOOS have informed our choices here. Last, DAP4-compliant servers will support richer data-processing capabilities than DAP2, enabling a wider array of server functions that manipulate data before returning values. Two projects are currently exploring just what can be done even with DAP2's server-function model: the MIIC project at LARC and OPULS itself (with work performed at the University of Washington).
Both projects have demonstrated that server functions can be used to perform operations on large volumes of data and return results that are far smaller than would be required to achieve the same outcomes via client-side processing. We are using information from these efforts to inform the design of server functions in DAP4. Each of the three areas of DAP4 advancement is being guided by input from a number of community members, including an OPULS Advisory Committee.
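The server-function idea described above can be sketched as follows. Note that the constraint-expression syntax and the `mean` function name are illustrative placeholders, not the actual DAP4 grammar; the point is that the client ships a small expression and receives a small reduced result instead of the full variable.

```python
# Illustrative sketch (not the real DAP4 API): compose a DAP-style request
# URL whose constraint expression asks the server to subset and reduce the
# data before transmission, so the response is far smaller than shipping
# the whole variable for client-side processing.

def dap_request_url(base_url, variable, index_range=None, server_function=None):
    """Build a DAP-style URL with an optional hyperslab and server function.

    The `.dods` suffix, bracketed slice, and function-call syntax here are
    hypothetical stand-ins for whatever grammar a real server defines.
    """
    expr = variable
    if index_range is not None:
        start, stop = index_range
        expr += f"[{start}:{stop}]"          # request only a slice of the data
    if server_function is not None:
        expr = f"{server_function}({expr})"  # ask the server to reduce it
    return f"{base_url}.dods?{expr}"

url = dap_request_url("http://example.org/data/sst", "sea_surface_temp",
                      index_range=(0, 99), server_function="mean")
# A server honoring this request would evaluate the mean of 100 samples
# and return a single value rather than the samples themselves.
```

The design point is that the reduction happens where the data lives; only the expression travels upstream and only the result travels back.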
Sato, Tatsuhiko
2015-01-01
By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by the Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R^2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
Health Impact of Domestic Violence against Saudi Women: Cross Sectional Study.
Al Dosary, Ahmad Hamad
2016-04-01
Domestic violence is a major public health problem. A wide range of health hazards result from violence against women, either directly or through its long-term consequences. The objective of this study is to determine the health-related consequences of domestic violence against women. A community-based cross-sectional study was carried out through an online survey; a convenience sample was taken during the period between December 2013 and February 2014. The survey was completed by 421 women who met the inclusion criteria and were willing to take part in the study. The data were collected through an online survey website. A validated Arabic version of the NorVold Domestic Abuse Questionnaire (NOVAQ) was used as a tool to assess domestic violence in the study sample. Analysis was performed using SPSS, version 18.0. A total of 421 women participated in the survey. There was no significant correlation between socio-demographic characteristics and abuse status. However, further analysis found more sexual abuse among non-working women (P=.048). There was a significant correlation between being abused and general health status, doctor visits, depression, insomnia, and somatic symptoms. The consequences of abuse are profound, extending beyond the health of the individual to affect the well-being of the entire community. We therefore recommend increasing community awareness through a national awareness campaign, conducting a national prevalence survey of domestic violence, and training health professionals to assess domestic violence cases.
Identifying shortcomings in the measurement of service quality.
Fogarty, G; Catts, R; Forlin, C
2000-01-01
SERVPERF, the performance component of the Service Quality Scale (SERVQUAL), has been shown to measure five underlying dimensions corresponding to Tangibles, Reliability, Responsiveness, Assurance, and Empathy (Parasuraman, Zeithaml, & Berry, 1988). This paper describes three separate studies employing SERVPERF in an Australian context. In the first of these studies (N = 113), a shortened 15-item version of the SERVPERF scale (SERVPERF-R) was found to be suitable for use in an Australian small business setting. A five-factor structure was identifiable but the factors were highly correlated, suggesting that they were not clearly distinct. The tendency for marked negative skewness observed by other researchers was also noted here. A follow-up study involving three other small businesses (N = 212) used Rasch analysis to test assumptions about the spread of items on the underlying continuum. These analyses indicated that there is an even, though narrow, spread of items across the continuum. The Rasch analysis suggested that the items in both SERVPERF and SERVPERF-R are too easy to rate highly and that more "difficult" items need to be added to the scale. The third study (N = 122) was conducted using a version of SERVPERF-R that included seven new items intended to extend the range of the scale. The new items, however, did not achieve this desirable outcome. The implications for service quality assessment are discussed.
Files synchronization from a large number of insertions and deletions
NASA Astrophysics Data System (ADS)
Ellappan, Vijayan; Kumari, Savera
2017-11-01
Synchronization between different versions of files is becoming a major issue that most applications face. To make applications more efficient, an economical algorithm is developed from the previously used "File Loading Algorithm". We extend this algorithm in three ways: first, it deals with non-binary files; second, a backup is generated for uploaded files; and lastly, each file is synchronized across insertions and deletions. A user can reconstruct a file from a former version while minimizing the error, and interactive communication is provided without any disturbance. The drawback of the previous system is overcome by using synchronization, in which multiple copies of each file/record are created and stored in a backup database and efficiently restored in case of any unwanted deletion or loss of data. That is, we introduce a protocol that user B may use to reconstruct file X from file Y with suitably low probability of error. Synchronization algorithms find numerous areas of use, including data storage, file sharing, source code control systems, and cloud applications. For example, cloud storage services such as Dropbox synchronize between local copies and cloud backups each time users make changes to local versions. Similarly, synchronization tools are necessary in mobile devices. Specialized synchronization algorithms are used for video and sound editing. Synchronization tools are also capable of performing data duplication.
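The reconstruct-from-an-older-version idea can be sketched with a line-based delta; this is a minimal stand-in for the protocol described above, using Python's difflib rather than whatever encoding the authors actually use.

```python
# Minimal sketch of delta-based file synchronization: user B rebuilds the
# new file X from an old file Y plus a small patch of copy/insert
# operations, instead of transferring X in full.
import difflib

def make_delta(old_lines, new_lines):
    """Encode new_lines as edit operations against old_lines."""
    sm = difflib.SequenceMatcher(a=old_lines, b=new_lines)
    delta = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            delta.append(("copy", i1, i2))              # reuse old content
        else:
            delta.append(("insert", new_lines[j1:j2]))  # ship only new content
    return delta

def apply_delta(old_lines, delta):
    """Rebuild the new file from the old file and the delta."""
    out = []
    for op in delta:
        if op[0] == "copy":
            out.extend(old_lines[op[1]:op[2]])
        else:
            out.extend(op[1])
    return out

old = ["a\n", "b\n", "c\n"]
new = ["a\n", "B\n", "c\n", "d\n"]
delta = make_delta(old, new)
assert apply_delta(old, delta) == new  # round-trip reconstruction
```

Only the "insert" payloads carry data; "copy" operations reference content the receiver already holds, which is why the patch stays small when the versions are similar.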
NASA Astrophysics Data System (ADS)
Jung, Seongmoon; Sung, Wonmo; Lee, Jaegi; Ye, Sung-Joon
2018-01-01
Emerging radiological applications of gold nanoparticles demand low-energy electron/photon transport calculations that include the details of the atomic relaxation process. Recently, MCNP® version 6.1 (MCNP6.1) was released with extended cross-sections for low-energy electrons/photons, subshell photoelectric cross-sections, and more detailed atomic relaxation data than previous versions. However, the atomic relaxation process of MCNP6.1 has not yet been fully tested with its new physics library (eprdata12), which is based on the Evaluated Atomic Data Library (EADL). In this study, MCNP6.1 was compared with GATEv7.2, PENELOPE2014, and EGSnrc, which have often been used to simulate low-energy atomic relaxation processes. The simulations were performed to acquire both photon and electron spectra produced by interactions of 15 keV electrons or photons with a 10-nm-thick gold nano-slab. The photon-induced fluorescence X-rays from MCNP6.1 agreed fairly well with those from GATEv7.2 and PENELOPE2014, while the electron-induced fluorescence X-rays of the four codes showed some discrepancies. Good agreement was observed in the photon-induced Auger electrons simulated by MCNP6.1 and GATEv7.2. The recent release of MCNP6.1 with eprdata12 can be used to simulate the photon-induced atomic relaxation.
The Invar tensor package: Differential invariants of Riemann
NASA Astrophysics Data System (ADS)
Martín-García, J. M.; Yllanes, D.; Portugal, R.
2008-10-01
The long-standing problem of the relations among the scalar invariants of the Riemann tensor is computationally solved for all 6·10^5 objects with up to 12 derivatives of the metric. This covers cases ranging from products of up to 6 undifferentiated Riemann tensors to cases with up to 10 covariant derivatives of a single Riemann. We extend our computer algebra system Invar to produce within seconds a canonical form for any of those objects in terms of a basis. The process is as follows: (1) an invariant is converted in real time into a canonical form with respect to the permutation symmetries of the Riemann tensor; (2) Invar reads a database of more than 6·10^5 relations and applies those coming from the cyclic symmetry of the Riemann tensor; (3) it then applies the relations coming from the Bianchi identity, (4) the relations coming from commutations of covariant derivatives, (5) the dimensionally-dependent identities for dimension 4, and finally (6) it simplifies invariants that can be expressed as products of dual invariants. Invar runs on top of the tensor computer algebra systems xTensor (for Mathematica) and Canon (for Maple). Program summary: Program title: Invar Tensor Package v2.0. Catalogue identifier: ADZK_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZK_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3 243 249. No. of bytes in distributed program, including test data, etc.: 939. Distribution format: tar.gz. Programming language: Mathematica and Maple. Computer: Any computer running Mathematica versions 5.0 to 6.0 or Maple versions 9 and 11. Operating system: Linux, Unix, Windows XP, MacOS. RAM: 100 Mb. Word size: 64 or 32 bits. Supplementary material: The new database of relations is much larger than that for the previous version and therefore has not been included in the distribution. To obtain the Mathematica and Maple database files click on this link. Classification: 1.5, 5. Does the new version supersede the previous version?: Yes. The previous version (1.0) only handled algebraic invariants. The current version (2.0) has been extended to cover differential invariants as well. Nature of problem: Manipulation and simplification of scalar polynomial expressions formed from the Riemann tensor and its covariant derivatives. Solution method: Algorithms of computational group theory to simplify expressions with tensors that obey permutation symmetries. Tables of syzygies of the scalar invariants of the Riemann tensor. Reasons for new version: With this new version, the user can manipulate differential invariants of the Riemann tensor. Differential invariants are required in many physical problems in classical and quantum gravity. Summary of revisions: The database of syzygies has been expanded by a factor of 30. New commands were added in order to deal with the enlarged database and to manipulate the covariant derivative. Restrictions: The present version only handles scalars, and not expressions with free indices. Additional comments: The distribution file for this program is over 53 Mbytes and therefore is not delivered directly when download or Email is requested. Instead, an html file giving details of how the program can be obtained is sent. Running time: One second to fully reduce any monomial of the Riemann tensor up to degree 7 or order 10 in terms of independent invariants. The Mathematica notebook included in the distribution takes approximately 5 minutes to run.
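In miniature, the canonicalization step can be illustrated for a single undifferentiated Riemann component using only the monoterm symmetries R_abcd = -R_bacd = -R_abdc = R_cdab. This is a toy sketch of the idea, not the Invar algorithm itself, which handles full products and covariant derivatives via computational group theory.

```python
# Toy canonicalization of one Riemann tensor component: map any index
# tuple to a unique orbit representative plus an overall sign, so that
# components equal up to symmetry can be recognized by direct comparison.

def canonical_riemann(indices):
    a, b, c, d = indices
    sign = 1
    if a > b:
        a, b, sign = b, a, -sign          # antisymmetry in the first pair
    if c > d:
        c, d, sign = d, c, -sign          # antisymmetry in the second pair
    if (a, b) > (c, d):
        (a, b), (c, d) = (c, d), (a, b)   # symmetry under pair exchange
    return (a, b, c, d), sign

# R_2103 = -R_1203 (swap first pair) = -R_0312 (exchange pairs):
rep, sign = canonical_riemann((2, 1, 0, 3))
assert (rep, sign) == ((0, 3, 1, 2), -1)
```

Full canonicalization must additionally account for the cyclic (first Bianchi) identity, which relates different orbits; that multiterm step is exactly what the database of syzygies handles in Invar.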
[Linguistic adaptation of the Russian version of the Short-form McGill Pain Questionnaire-2].
Bakhtadze, M A; Bolotov, D A; Kuzminov, K O; Padun, M P; Zakharova, O B
Linguistic adaptation of the Russian version of the Short-form McGill Pain Questionnaire-2 (SF-MPQ-2), conceptually equivalent to the original questionnaire. The adaptation of the Russian version of the SF-MPQ-2 was performed in several stages in accordance with established rules: translation by two independent translators and development of a consensus Russian version, followed by back-translation by two independent translators and development of a consensus English version. The final Russian SF-MPQ-2 version was then created. The Russian version of the Short-form McGill Pain Questionnaire-2 (SF-MPQ-2-RU) was generated based on the established rules. This version was legally registered by the rights holder, Mapi Research Trust, and is recommended for research in the Russian Federation.
New version: GRASP2K relativistic atomic structure package
NASA Astrophysics Data System (ADS)
Jönsson, P.; Gaigalas, G.; Bieroń, J.; Fischer, C. Froese; Grant, I. P.
2013-09-01
A revised version of GRASP2K [P. Jönsson, X. He, C. Froese Fischer, I.P. Grant, Comput. Phys. Commun. 177 (2007) 597] is presented. It supports earlier non-block and block versions of codes as well as a new block version in which the njgraf library module [A. Bar-Shalom, M. Klapisch, Comput. Phys. Commun. 50 (1988) 375] has been replaced by the librang angular package developed by Gaigalas based on the theory of [G. Gaigalas, Z.B. Rudzikas, C. Froese Fischer, J. Phys. B: At. Mol. Phys. 30 (1997) 3747, G. Gaigalas, S. Fritzsche, I.P. Grant, Comput. Phys. Commun. 139 (2001) 263]. Tests have shown that errors encountered by njgraf do not occur with the new angular package. The three versions are denoted v1, v2, and v3, respectively. In addition, in v3, the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Changes in v2 include minor improvements. For example, the new version of rci2 may be used to compute quantum electrodynamic (QED) corrections only from selected orbitals. In v3, a new program, jj2lsj, reports the percentage composition of the wave function in LSJ and the program rlevels has been modified to report the configuration state function (CSF) with the largest coefficient of an LSJ expansion. The bioscl2 and bioscl3 application programs have been modified to produce a file of transition data with one record for each transition in the same format as in ATSP2K [C. Froese Fischer, G. Tachiev, G. Gaigalas, M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. All versions of the codes have been adapted for 64-bit computer architecture. 
Program Summary: Program title: GRASP2K, version 1_1 Catalogue identifier: ADZL_v1_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZL_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 730252 No. of bytes in distributed program, including test data, etc.: 14808872 Distribution format: tar.gz Programming language: Fortran. Computer: Intel Xeon, 2.66 GHz. Operating system: Suse, Ubuntu, and Debian Linux 64-bit. RAM: 500 MB or more Classification: 2.1. Catalogue identifier of previous version: ADZL_v1_0 Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 597 Does the new version supersede the previous version?: Yes Nature of problem: Prediction of atomic properties (atomic energy levels, oscillator strengths, radiative decay rates, hyperfine structure parameters, Landé gJ-factors, and specific mass shift parameters) using a multiconfiguration Dirac-Hartree-Fock approach. Solution method: The computational method is the same as in the previous GRASP2K [1] version except that for v3 codes the njgraf library module [2] for recoupling has been replaced by librang [3,4]. Reasons for new version: New angular libraries with improved performance are available. Also methodology for transforming from jj- to LSJ-coupling has been developed. Summary of revisions: New angular libraries where the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Inclusion of a new program jj2lsj, which reports the percentage composition of the wave function in LSJ. Transition programs have been modified to produce a file of transition data with one record for each transition in the same format as ATSP2K [C. Froese Fischer, G. Tachiev, G. Gaigalas and M.R. Godefroid, Comput. Phys. Commun.
176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. Updated to 64-bit architecture. A comprehensive user manual in pdf format for the program package has been added. Restrictions: The packing algorithm restricts the maximum number of orbitals to be ≤214. The tables of reduced coefficients of fractional parentage used in this version are limited to subshells with j≤9/2 [5]; occupied subshells with j>9/2 are, therefore, restricted to a maximum of two electrons. Some other parameters, such as the maximum number of subshells of a CSF outside a common set of closed shells are determined by a parameter.def file that can be modified prior to compile time. Unusual features: The bioscl3 program reports transition data in the same format as in Atsp2K [6], and the data processing program tables of the latter package can be used. The tables program takes a name.lsj file, usually a concatenated file of all the .lsj transition files for a given atom or ion, and finds the energy structure of the levels and the multiplet transition arrays. The tables posted at the website http://atoms.vuse.vanderbilt.edu are examples of tables produced by the tables program. With the extension of coefficients of fractional parentage to j=9/2, calculations for the lanthanides and actinides become possible. Running time: CPU time required to execute test cases: 70.5 s.
International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting.
Biering-Sørensen, F; DeVivo, M J; Charlifue, S; Chen, Y; New, P W; Noonan, V; Post, M W M; Vogel, L
2017-08-01
The study design includes expert opinion, feedback, revisions and final consensus. The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the adjustments made in Version 2.0, including standardization of data reporting. International. Comments received from the SCI community were discussed in a working group (WG); suggestions from the WG were reviewed and revisions were made. All suggested revisions were considered, and a final version was circulated for final approval. The International SCI Core Data Set (Version 2.0) consists of 25 variables. Changes made to this version include the deletion of one variable 'Total Days Hospitalized' and addition of two variables 'Date of Rehabilitation Admission' and 'Date of Death.' The variable 'Injury Etiology' was extended with six non-traumatic categories, and corresponding 'Date of Injury' for non-traumatic cases, was defined as the date of first physician visit for symptoms related to spinal cord dysfunction. A category reflecting transgender was added. A response category was added to the variable on utilization of ventilatory assistance to document the use of continuous positive airway pressure for sleep apnea. Other clarifications were made to the text. The reporting of the pediatric SCI population was updated as age groups 0-5, 6-12, 13-14, 15-17 and 18-21. Collection of the core data set should be a basic requirement of all studies of SCI to facilitate accurate descriptions of patient populations and comparison of results across published studies from around the world.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NSTec Environmental Management
2011-07-20
Results for Version 4.110 of the Area 5 Radioactive Waste Management Site (RWMS) performance assessment (PA) model are summarized. Version 4.110 includes the fiscal year (FY) 2010 inventory estimate, including a future inventory estimate. Version 4.110 was implemented in GoldSim 10.11(SP4). The following changes have been implemented since the last baseline model, Version 4.105: (1) updated the inventory and disposal unit configurations with data through the end of FY 2010; (2) implemented Federal Guidance Report 13 Supplemental CD dose conversion factors (U.S. Environmental Protection Agency, 1999). Version 4.110 PA results comply with air pathway and all-pathways annual total effective dose (TED) performance objectives (Tables 2 and 3, Figures 1 and 2). Air pathway results decrease moderately for all scenarios. The time of the maximum for the air pathway open rangeland scenario shifts from 1,000 to 100 years (y). All-pathways annual TED increases for all scenarios except the resident scenario. The maximum member-of-public all-pathways dose occurs at 1,000 y for the resident farmer scenario. The resident farmer dose was predominantly due to technetium-99 (Tc-99) (82 percent) and lead-210 (Pb-210) (13 percent). Pb-210 present at 1,000 y is produced predominantly by radioactive decay of uranium-234 (U-234) present at the time of disposal. All results for the postdrilling and intruder-agriculture scenarios comply with the performance objectives (Tables 4 and 5, Figures 3 and 4). The postdrilling intruder results are similar to Version 4.105 results. The intruder-agriculture results are similar to Version 4.105, except for the Pit 6 Radium Disposal Unit (RaDU). The intruder-agriculture result for the Shallow Land Burial (SLB) disposal units is a significant fraction of the performance objective and exceeds the performance objective at the 95th percentile. The intruder-agriculture dose is due predominantly to Tc-99 (75 percent) and U-238 (9.5 percent).
The acute intruder scenario results comply with all performance objectives (Tables 6 and 7, Figures 5 and 6). The acute construction result for the SLB disposal units decreases significantly with this version. The maximum acute intruder dose occurs at 1,000 y for the SLB disposal units under the acute construction scenario. The acute intruder dose is caused by multiple radionuclides including U-238 (31 percent), Th-229 (28 percent), plutonium-239 (8.6 percent), U-233 (7.8 percent), and U-234 (6.7 percent). All results for radon-222 (Rn-222) flux density comply with the performance objective (Table 8, Figure 7). The mean Pit 13 RaDU flux density is close to the 0.74 Bq m^-2 s^-1 limit.
Second ROSAT all-sky survey (2RXS) source catalogue
NASA Astrophysics Data System (ADS)
Boller, Th.; Freyberg, M. J.; Trümper, J.; Haberl, F.; Voges, W.; Nandra, K.
2016-04-01
Aims: We present the second ROSAT all-sky survey source catalogue, hereafter referred to as the 2RXS catalogue. This is the second publicly released ROSAT catalogue of point-like sources obtained from the ROSAT all-sky survey (RASS) observations performed with the position-sensitive proportional counter (PSPC) between June 1990 and August 1991, and is an extended and revised version of the bright and faint source catalogues. Methods: We used the latest version of the RASS processing to produce overlapping X-ray images of 6.4° × 6.4° sky regions. To create a source catalogue, a likelihood-based detection algorithm was applied to these, which accounts for the variable point-spread function (PSF) across the PSPC field of view. Improvements in the background determination compared to 1RXS were also implemented. X-ray control images showing the source and background extraction regions were generated, which were visually inspected. Simulations were performed to assess the spurious source content of the 2RXS catalogue. X-ray spectra and light curves were extracted for the 2RXS sources, with spectral and variability parameters derived from these products. Results: We obtained about 135 000 X-ray detections in the 0.1-2.4 keV energy band down to a likelihood threshold of 6.5, as adopted in the 1RXS faint source catalogue. Our simulations show that the expected spurious content of the catalogue is a strong function of detection likelihood, and the full catalogue is expected to contain about 30% spurious detections. A more conservative likelihood threshold of 9, on the other hand, yields about 71 000 detections with a 5% spurious fraction. We recommend thresholds appropriate to the scientific application. X-ray images and overlaid X-ray contour lines provide an additional user product to evaluate the detections visually, and we performed our own visual inspections to flag uncertain detections. 
Intra-day variability in the X-ray light curves was quantified based on the normalised excess variance and a maximum amplitude variability analysis. X-ray spectral fits were performed using three basic models, a power law, a thermal plasma emission model, and black-body emission. Thirty-two large extended regions with diffuse emission and embedded point sources were identified and excluded from the present analysis. Conclusions: The 2RXS catalogue provides the deepest and cleanest X-ray all-sky survey catalogue in advance of eROSITA. The catalogue is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/588/A103
NASA Astrophysics Data System (ADS)
George, Atanasiu Catalin; Chiru, Anghel
2014-06-01
This paper presents a comparison between a turbocharged engine and a pressure-wave-charged engine. The comparison was accomplished using the engine simulation software AVL Boost, version 2010. The graphs were extracted using AVL Impress, version 2010. The performance increase is limited by the mechanical side of the simulated engine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke
2012-03-01
This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to provemore » that the team has completed milestone 4467-Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE RD in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight LWKs, virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications-Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point as more than one issue was found with the two codes. 
The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance will be needed to identify the source of an issue, in strong combination with knowledge of system software and application source code.
Carswell, C Melody; Lio, Cindy H; Grant, Russell; Klein, Martina I; Clarke, Duncan; Seales, W Brent; Strup, Stephen
2010-12-01
Subjective workload measures are usually administered in a visual-manual format, either electronically or by paper and pencil. However, vocal responses to spoken queries may sometimes be preferable, for example when experimental manipulations require continuous manual responding or when participants have certain sensory/motor impairments. In the present study, we evaluated the acceptability of the hands-free administration of two subjective workload questionnaires - the NASA Task Load Index (NASA-TLX) and the Multiple Resources Questionnaire (MRQ) - in a surgical training environment where manual responding is often constrained. Sixty-four undergraduates performed fifteen 90-s trials of laparoscopic training tasks (five replications of 3 tasks - cannulation, ring transfer, and rope manipulation). Half of the participants provided workload ratings using a traditional paper-and-pencil version of the NASA-TLX and MRQ; the remainder used a vocal (hands-free) version of the questionnaires. A follow-up experiment extended the evaluation of the hands-free version to actual medical students in a Minimally Invasive Surgery (MIS) training facility. The NASA-TLX was scored in 2 ways - (1) the traditional procedure using participant-specific weights to combine its 6 subscales, and (2) a simplified procedure - the NASA Raw Task Load Index (NASA-RTLX) - using the unweighted mean of the subscale scores. Comparison of the scores obtained from the hands-free and written administration conditions yielded coefficients of equivalence of r=0.85 (NASA-TLX) and r=0.81 (NASA-RTLX). Equivalence estimates for the individual subscales ranged from r=0.78 ("mental demand") to r=0.31 ("effort"). Both administration formats and scoring methods were equally sensitive to task and repetition effects. For the MRQ, the coefficient of equivalence for the hands-free and written versions was r=0.96 when tested on undergraduates. 
However, the sensitivity of the hands-free MRQ to task demands (partial η² = 0.138) was substantially less than that for the written version (partial η² = 0.252). This potential shortcoming of the hands-free MRQ did not seem to generalize to medical students, who showed robust task effects when using the hands-free MRQ (partial η² = 0.396). A detailed analysis of the MRQ subscales also revealed differences that may be attributable to a "spillover" effect in which participants' judgments about the demands of completing the questionnaires contaminated their judgments about the primary surgical training tasks. Vocal versions of the NASA-TLX are acceptable alternatives to standard written formats when researchers wish to obtain global workload estimates. However, care should be used when interpreting the individual subscales if the object is to make comparisons between studies or conditions that use different administration modalities. For the MRQ, the vocal version was less sensitive to experimental manipulations than its written counterpart; however, when medical students rather than undergraduates used the vocal version, the instrument's sensitivity increased well beyond that obtained with any other combination of administration modality and instrument in this study. Thus, the vocal version of the MRQ may be an acceptable workload assessment technique for selected populations, and it may even be a suitable substitute for the NASA-TLX. Copyright © 2010 Elsevier Ltd. All rights reserved.
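The two scoring procedures and the equivalence check described above can be sketched as follows. The subscale names follow the NASA-TLX; all ratings, weights, and sample series are invented for illustration, not drawn from the study.

```python
# Sketch of the two NASA-TLX scoring procedures described above, plus the
# Pearson "coefficient of equivalence" used to compare administration formats.
# All ratings and weights below are invented for illustration.
from math import sqrt

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted(ratings, weights):
    """Traditional NASA-TLX: weighted mean of the 6 subscale ratings.

    The weights come from the participant's 15 pairwise comparisons
    and sum to 15.
    """
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / sum(weights.values())

def tlx_raw(ratings):
    """NASA-RTLX: simple unweighted mean of the 6 subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def pearson_r(x, y):
    """Pearson correlation, used here as a coefficient of equivalence
    between two administration formats scored on the same participants."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ratings = dict(zip(SUBSCALES, [60, 30, 55, 40, 70, 25]))
weights = dict(zip(SUBSCALES, [5, 1, 3, 2, 4, 0]))  # pairwise-comparison tallies
print(tlx_raw(ratings))                # 46.67: unweighted RTLX score
print(tlx_weighted(ratings, weights))  # 57.0: weighted TLX score
```

The RTLX simplification drops the pairwise weighting step entirely, which is why the paper can compare the two scoring methods on the same subscale data.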
O'Brien, Ed
2015-06-01
Who do we see when envisioning our "past self" and "future self"? Extant research finds a motivation to perceive improvement over time, such that past selves are seen as worse versions, and future selves as better versions, of current selves. However, the broader components comprising "worse" or "better" beyond domain-specific achievement (e.g., "Last year I failed at dieting, but next year I'll succeed") are less well understood. Are there more general qualities ascribed to the person we recall versus imagine being? Six studies suggest so, extending the 2-dimensional mind perception framework to the self: Past selves seem to possess highly emotional but not very rational minds, whereas future selves seem to possess highly rational but not very emotional minds (Studies 1a, 1b, 1c). Consistent with motivated improvement, this asymmetry does not emerge in evaluating others and applies uniquely to self-judgment (Study 2). Thus, our pervasive belief in changing for the "better" specifically means becoming more rational types of people. This observation has asymmetric consequences. Participants who brought to mind future selves sought intellectual enrichment (Study 3) and performed better on a self-control task (Study 4); however, participants who brought to mind past selves sought emotional enrichment and performed better on the same task when allegedly measuring enjoyment. These findings build a bridge between mind perception and intertemporal dynamics, raising novel implications for the present. Thinking about the future may not uniformly "improve" decisions and behaviors; rather, it mostly facilitates rational-related pursuits, whereas thinking about the past may enhance feeling-related experiences. (c) 2015 APA, all rights reserved.
Model-based damage evaluation of layered CFRP structures
NASA Astrophysics Data System (ADS)
Munoz, Rafael; Bochud, Nicolas; Rus, Guillermo; Peralta, Laura; Melchor, Juan; Chiachío, Juan; Chiachío, Manuel; Bond, Leonard J.
2015-03-01
An ultrasonic evaluation technique for damage identification of layered CFRP structures is presented. This approach relies on a model-based estimation procedure that combines experimental data and simulation of ultrasonic damage-propagation interactions. The CFRP structure, a [0/90]4s lay-up, has been tested in an immersion through-transmission experiment, where a scan has been performed on a damaged specimen. Most ultrasonic techniques in industrial practice consider only a few features of the received signals, namely, time of flight, amplitude, attenuation, frequency contents, and so forth. In this case, once signals are captured, an algorithm is used to reconstruct the complete signal waveform and extract the unknown damage parameters by means of modeling procedures. A linear version of the data processing has been performed, where only Young's modulus has been monitored, and, in a second nonlinear version, the first-order nonlinear coefficient β was incorporated to test the possibility of detecting early damage. The aforementioned physical simulation models are solved by the Transfer Matrix formalism, which has been extended from the linear case to the nonlinear harmonic generation technique. The damage parameter search strategy is based on minimizing the mismatch between the captured and simulated signals in the time domain in an automated way using Genetic Algorithms. By processing all scanned locations, a C-scan of the parameter of each layer can be reconstructed, obtaining the information describing the state of each layer and each interface. Damage can be located and quantified in terms of changes in the selected parameter with a measurable extension. In the case of the first-order nonlinear coefficient, evidence of higher sensitivity to damage than imaging the linearly estimated Young's modulus is provided.
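The model-based parameter search can be illustrated with a minimal sketch: a forward model maps a candidate stiffness to a simulated waveform, and the estimate is the candidate minimizing the time-domain mismatch with the captured signal. The toy forward model, the grid search (standing in for the Genetic Algorithm), and all numbers here are invented stand-ins, not the Transfer Matrix simulation used in the study.

```python
# Minimal sketch of model-based parameter estimation by waveform mismatch,
# standing in for the Transfer-Matrix + Genetic-Algorithm scheme in the text.
# forward() is a toy surrogate, not an actual ultrasonic propagation model.
import math

def forward(young_modulus, times):
    """Toy forward model: wave speed (hence arrival phase) depends on stiffness."""
    speed = math.sqrt(young_modulus)          # c ~ sqrt(E/rho), rho folded in
    return [math.sin(2 * math.pi * (t - 1.0 / speed)) for t in times]

def mismatch(sim, captured):
    """Time-domain least-squares mismatch between two signals."""
    return sum((s - c) ** 2 for s, c in zip(sim, captured))

times = [i * 0.01 for i in range(200)]
captured = forward(9.0, times)                # "measured" signal, E_true = 9.0

# Grid search as a stand-in for the Genetic Algorithm optimizer.
candidates = [4.0 + 0.5 * k for k in range(21)]   # E in [4.0, 14.0]
best = min(candidates, key=lambda e: mismatch(forward(e, times), captured))
print(best)   # recovers E = 9.0 on this noiseless toy problem
```

Repeating this search at every scanned location is what produces the C-scan of the estimated parameter described in the abstract.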
Development and Assessment of CTF for Pin-resolved BWR Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Wysocki, Aaron J; Collins, Benjamin S
2017-01-01
CTF is the modernized and improved version of the subchannel code COBRA-TF. It has been adopted by the Consortium for Advanced Simulation of Light Water Reactors (CASL) for subchannel analysis applications and thermal-hydraulic feedback calculations in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS). CTF is now jointly developed by Oak Ridge National Laboratory and North Carolina State University. Until now, CTF has been used for pressurized water reactor modeling and simulation in CASL, but in the future it will be extended to boiling water reactor designs. This required development activities to integrate the code into the VERA-CS workflow and to make it more efficient for full-core, pin-resolved simulations. Additionally, there is a significant emphasis on producing high-quality tools that follow a regimented software quality assurance plan in CASL. Part of this plan involves performing validation and verification assessments on the code that are easily repeatable and tied to specific code versions. This work has resulted in the CTF validation and verification matrix being expanded to include several two-phase flow experiments, including the General Electric 3×3 facility and the BWR Full-Size Fine Mesh Bundle Tests (BFBT). Comparisons with both experimental databases are reasonable, but the BFBT analysis reveals a tendency of CTF to overpredict void, especially in the slug flow regime. The execution of these tests is fully automated, the analysis is documented in the CTF Validation and Verification manual, and the tests have become part of the CASL continuous regression testing system. This paper will summarize these recent developments and some of the two-phase assessments that have been performed on CTF.
Formality Theorem for Hochschild Cochains via Transfer
NASA Astrophysics Data System (ADS)
Dolgushev, Vasily
2011-08-01
We construct a 2-colored operad Ger⁺∞ which, on the one hand, extends the operad Ger∞ governing homotopy Gerstenhaber algebras and, on the other hand, extends the 2-colored operad governing open-closed homotopy algebras. We show that Tamarkin's Ger∞-structure on the Hochschild cochain complex C•(A, A) of an A∞-algebra A extends naturally to a Ger⁺∞-structure on the pair (C•(A, A), A). We show that a formality quasi-isomorphism for the Hochschild cochains of the polynomial algebra can be obtained via transfer of this Ger⁺∞-structure to the cohomology of the pair (C•(A, A), A). We show that Ger⁺∞ is a sub DG operad of the first sheet E₁(SC) of the homology spectral sequence for the Fulton-MacPherson version SC of Voronov's Swiss Cheese operad. Finally, we prove that the DG operads Ger⁺∞ and E₁(SC) are non-formal.
Extended Duration Orbiter (EDO) Improved Waste Collection System (IWCS)
NASA Technical Reports Server (NTRS)
1992-01-01
This high angle overall view shows the top side components of the Extended Duration Orbiter (EDO) Waste Collection System (WCS) scheduled to fly aboard NASA's Endeavour, Orbiter Vehicle (OV) 105, for the STS-54 mission. Detailed Test Objective 662, Extended Duration Orbiter WCS Evaluation, will verify the design of the new EDO WCS under microgravity conditions for a prolonged period. OV-105 has been modified with additional structures in the waste management compartment (WMC) and additional avionics to support/restrain the EDO WCS. Among the advantages the new IWCS is hoped to have over the current WCS are greater dependability, better hygiene, virtually unlimited capacity, and more efficient preparation between shuttle missions. Unlike the previous WCS, the improved version will not have to be removed from the spacecraft to be readied for the next flight. The WCS was documented in JSC's Crew Systems Laboratory, Bldg 7.
NDL-v2.0: A new version of the numerical differentiation library for parallel architectures
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Voglis, C.; Papageorgiou, D. G.; Lagaris, I. E.
2014-07-01
We present a new version of the numerical differentiation library (NDL) used for the numerical estimation of first and second order partial derivatives of a function by finite differencing. In this version we have restructured the serial implementation of the code so as to achieve optimal task-based parallelization. The pure shared-memory parallelization of the library has been based on the lightweight OpenMP tasking model, allowing for the full extraction of the available parallelism and efficient scheduling of multiple concurrent library calls. On multicore clusters, parallelism is exploited by means of TORC, an MPI-based multi-threaded tasking library. The new MPI implementation of NDL provides optimal performance in terms of function calls and, furthermore, supports asynchronous execution of multiple library calls within legacy MPI programs. In addition, a Python interface has been implemented for all cases, exporting the functionality of our library to sequential Python codes. Catalog identifier: AEDG_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 63036 No. of bytes in distributed program, including test data, etc.: 801872 Distribution format: tar.gz Programming language: ANSI Fortran-77, ANSI C, Python. Computer: Distributed systems (clusters), shared memory systems. Operating system: Linux, Unix. Has the code been vectorized or parallelized?: Yes. RAM: The library uses O(N) internal storage, N being the dimension of the problem. It can use up to O(N²) internal storage for Hessian calculations, if a task throttling factor has not been set by the user. Classification: 4.9, 4.14, 6.5. Catalog identifier of previous version: AEDG_v1_0 Journal reference of previous version: Comput. Phys. Comm.
180 (2009) 1404. Does the new version supersede the previous version?: Yes. Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, and sensitivity analysis. For a large number of scientific and engineering applications, the underlying functions correspond to simulation codes for which analytical estimation of derivatives is difficult or almost impossible. A parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems. Solution method: Finite differencing is used with a carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries. Reasons for new version: The updated version was motivated by our endeavors to extend a parallel Bayesian uncertainty quantification framework [1] by incorporating higher order derivative information, as in most state-of-the-art stochastic simulation methods such as Stochastic Newton MCMC [2] and Riemannian Manifold Hamiltonian MC [3]. The function evaluations are simulations with significant time-to-solution, which also varies with the input parameters, as in [1, 4]. The runtime of the N-body type of problem changes considerably with the introduction of a longer cut-off between the bodies. In the first version of the library, the OpenMP-parallel subroutines spawn a new team of threads and distribute the function evaluations with a PARALLEL DO directive. This limits the functionality of the library, as multiple concurrent calls require nested parallelism support from the OpenMP environment. Therefore, either their function evaluations will be serialized, or processor oversubscription is likely to occur due to the increased number of OpenMP threads.
In addition, the Hessian calculations include two explicit parallel regions that compute first the diagonal and then the off-diagonal elements of the array. Due to the barrier between the two regions, the parallelism of the calculations is not fully exploited. These issues have been addressed in the new version by first restructuring the serial code and then running the function evaluations in parallel using OpenMP tasks. Although the MPI-parallel implementation of the first version is capable of fully exploiting the task parallelism of the PNDL routines, it does not utilize the caching mechanism of the serial code and, therefore, performs some redundant function evaluations in the Hessian and Jacobian calculations. This can lead to: (a) higher execution times if the number of available processors is lower than the total number of tasks, and (b) significant energy consumption due to wasted processor cycles. Overcoming these drawbacks, which become critical as the time of a single function evaluation increases, was the primary goal of this new version. Due to the code restructure, the MPI-parallel implementation (and the OpenMP-parallel in accordance) avoids redundant calls, providing optimal performance in terms of the number of function evaluations. Another limitation of the library was that the library subroutines were collective and synchronous calls. In the new version, each MPI process can issue any number of subroutines for asynchronous execution. We introduce two library calls that provide global and local task synchronizations, similarly to the BARRIER and TASKWAIT directives of OpenMP. The new MPI-implementation is based on TORC, a new tasking library for multicore clusters [5-7]. TORC improves the portability of the software, as it relies exclusively on the POSIX-Threads and MPI programming interfaces. 
It allows MPI processes to utilize multiple worker threads, offering a hybrid programming and execution environment similar to MPI+OpenMP, in a completely transparent way. Finally, to further improve the usability of our software, a Python interface has been implemented on top of both the OpenMP and MPI versions of the library. This allows sequential Python codes to exploit shared and distributed memory systems. Summary of revisions: The revised code improves the performance of both parallel (OpenMP and MPI) implementations. The functionality and the user interface of the MPI-parallel version have been extended to support the asynchronous execution of multiple PNDL calls, issued by one or multiple MPI processes. A new underlying tasking library increases portability and allows MPI processes to have multiple worker threads. For both implementations, an interface to the Python programming language has been added. Restrictions: The library uses only double precision arithmetic. The MPI implementation assumes the homogeneity of the execution environment provided by the operating system. Specifically, the processes of a single MPI application must have identical address spaces, and a user function must reside at the same virtual address. In addition, address space layout randomization should not be used for the application. Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and, given the level of the desired accuracy, the proper formula is automatically employed. Running time: Running time depends on the function's complexity. The test run took 23 ms for the serial distribution, 25 ms for the OpenMP version with 2 threads, and 53 ms and 1.01 s for the MPI parallel distribution using 2 threads and 2 processes, respectively, with the yield-time for idle workers equal to 10 ms. References: [1] P. Angelikopoulos, C. Papadimitriou, P.
Koumoutsakos, Bayesian uncertainty quantification and propagation in molecular dynamics simulations: a high performance computing framework, J. Chem. Phys. 137 (14). [2] H.P. Flath, L.C. Wilcox, V. Akcelik, J. Hill, B. van Bloemen Waanders, O. Ghattas, Fast algorithms for Bayesian uncertainty quantification in large-scale linear inverse problems based on low-rank partial Hessian approximations, SIAM J. Sci. Comput. 33 (1) (2011) 407-432. [3] M. Girolami, B. Calderhead, Riemann manifold Langevin and Hamiltonian Monte Carlo methods, J. R. Stat. Soc. Ser. B (Stat. Methodol.) 73 (2) (2011) 123-214. [4] P. Angelikopoulos, C. Papadimitriou, P. Koumoutsakos, Data driven, predictive molecular dynamics for nanoscale flow simulations under uncertainty, J. Phys. Chem. B 117 (47) (2013) 14808-14816. [5] P.E. Hadjidoukas, E. Lappas, V.V. Dimakopoulos, A runtime library for platform-independent task parallelism, in: PDP, IEEE, 2012, pp. 229-236. [6] C. Voglis, P.E. Hadjidoukas, D.G. Papageorgiou, I. Lagaris, A parallel hybrid optimization algorithm for fitting interatomic potentials, Appl. Soft Comput. 13 (12) (2013) 4481-4492. [7] P.E. Hadjidoukas, C. Voglis, V.V. Dimakopoulos, I. Lagaris, D.G. Papageorgiou, Supporting adaptive and irregular parallelism for non-linear numerical optimization, Appl. Math. Comput. 231 (2014) 544-559.
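The step-choice idea in the solution method above (a step "that minimizes the sum of the truncation and round-off errors") can be sketched for a first derivative by central differencing. The step rule h ≈ (3ε)^(1/3)·max(1, |x|) used below is the standard textbook heuristic for the central formula, not NDL's exact formula.

```python
# Sketch of first-derivative estimation by central differencing with a step
# chosen to balance truncation error (~h^2) against round-off error (~eps/h).
# The step rule below is a textbook heuristic, not NDL's exact formula.
import math
import sys

EPS = sys.float_info.epsilon   # unit round-off for double precision

def central_diff(f, x):
    """First derivative of f at x via the O(h^2) central formula."""
    h = (3.0 * EPS) ** (1.0 / 3.0) * max(1.0, abs(x))
    # Use the actually representable points to reduce cancellation error.
    x_plus, x_minus = x + h, x - h
    return (f(x_plus) - f(x_minus)) / (x_plus - x_minus)

# Example: d/dx exp(x) at x = 1 should be e = 2.71828...
approx = central_diff(math.exp, 1.0)
print(abs(approx - math.e))    # error well below 1e-9
```

With this balanced step the total error scales like ε^(2/3), which is why a careless step (too small or too large) loses several digits of accuracy.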
Prediction of Success in External Cephalic Version under Tocolysis: Still a Challenge.
Vaz de Macedo, Carolina; Clode, Nuno; Mendes da Graça, Luís
2015-01-01
External cephalic version is a procedure of fetal rotation to a cephalic presentation through manoeuvres applied to the maternal abdomen. Several prognostic factors for external cephalic version success have been described in the literature and prediction scores have been proposed, but their true value in clinical practice is controversial. We aim to identify possible factors that could contribute to the success of an external cephalic version attempt in our population. We retrospectively examined 207 consecutive external cephalic version attempts under tocolysis conducted between January 1997 and July 2012. We consulted the department's database for the following variables: race, age, parity, maternal body mass index, gestational age, estimated fetal weight, breech category, placental location, and amniotic fluid index. We performed descriptive and analytical statistics for each variable, as well as binary logistic regression. External cephalic version was successful in 46.9% of cases (97/207). None of the included variables was associated with the outcome of external cephalic version attempts after adjustment for confounding factors. We present a success rate similar to what has been previously described in the literature. However, in contrast to previous authors, we could not associate any of the analysed variables with the success of the external cephalic version attempt. We believe this discrepancy is partly related to the type of statistical analysis performed. Even though numerous prognostic factors for success in external cephalic version have been identified, care must be taken when counselling and selecting patients for this procedure. The data obtained suggest that external cephalic version should continue being offered to all eligible patients regardless of prognostic factors for success.
JPSS-1 VIIRS Version 2 At-Launch Relative Spectral Response Characterization and Performance
NASA Technical Reports Server (NTRS)
Moeller, Chris; Schwarting, Thomas; McIntire, Jeff; Moyer, Dave; Zeng, Jinan
2017-01-01
The relative spectral response (RSR) characterization of the JPSS-1 VIIRS spectral bands achieved at-launch status in the VIIRS Data Analysis Working Group February 2016 Version 2 RSR release. The Version 2 release improves upon the June 2015 Version 1 release by including December 2014 NIST T-SIRCUS spectral measurements of the VIIRS VisNIR bands in the analysis and by correcting the CO2 influence on the band M13 RSR. The T-SIRCUS based characterization is merged with the summer 2014 SpMA based characterization of the VisNIR bands (Version 1 release) to yield a fused RSR for these bands, combining the strengths of the T-SIRCUS and SpMA measurement systems. The M13 RSR is updated by applying a model-based correction to mitigate CO2 attenuation of the SpMA source signal that occurred during the M13 spectral measurements. The Version 2 release carries forward the Version 1 RSR for those bands that were not updated (M8-M12, M14-M16AB, I3-I5, DNBMGS). The Version 2 release includes band-average (over all detectors and subsamples) RSR plus supporting RSR for each detector and subsample. The at-launch band-average RSR have been used to populate Look-Up Tables supporting the sensor data record and environmental data record at-launch science products. Spectral performance metrics show that the JPSS-1 VIIRS RSR are compliant with specifications, with a few minor exceptions. The Version 2 release, which replaces the Version 1 release, is currently available on the password-protected NASA JPSS-1 eRooms under EAR99 control.
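The band-average (over all detectors and subsamples) product mentioned above can be illustrated with a minimal sketch: per-detector RSR curves on a shared wavelength grid are averaged and renormalized to unit peak. The detector values below are invented; the actual Version 2 processing (fusing T-SIRCUS and SpMA measurements) is far more involved.

```python
# Sketch of forming a band-average relative spectral response (RSR) from
# per-detector RSR curves on a common wavelength grid, rescaled to unit peak.
# Detector values are invented; real VIIRS processing is far more involved.
def band_average_rsr(per_detector_rsr):
    """Mean RSR across detectors at each wavelength, scaled so the peak is 1."""
    n_det = len(per_detector_rsr)
    n_wl = len(per_detector_rsr[0])
    mean = [sum(det[i] for det in per_detector_rsr) / n_det for i in range(n_wl)]
    peak = max(mean)
    return [v / peak for v in mean]

# Three hypothetical detectors sampled at five wavelengths:
rsr = [
    [0.02, 0.48, 1.00, 0.51, 0.03],
    [0.01, 0.50, 0.98, 0.49, 0.02],
    [0.03, 0.52, 0.96, 0.47, 0.04],
]
print(band_average_rsr(rsr))   # peaks at 1.0 at the central wavelength
```

Keeping the per-detector curves alongside the band average, as the Version 2 release does, lets users quantify detector-to-detector spectral nonuniformity.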
Correlated Encounter Model for Cooperative Aircraft in the National Airspace System; Version 2.0
2018-05-08
Lincoln Laboratory, Massachusetts Institute of Technology, Lexington, Massachusetts. Project Report ATC-440, Correlated Encounter Model for Cooperative Aircraft in the National Airspace System, Version 2.0. This report extends the 2008 Correlated Encounter Model for Cooperative Aircraft (CEM); the extended version is subsequently referred to as the Extended Correlated Encounter Model (ECEM).
NASA Astrophysics Data System (ADS)
Boiti, M.; Pempinelli, F.; Pogrebkov, A.
1997-06-01
We consider, in the framework of the inverse scattering method, the solution of the Kadomtsev-Petviashvili equation in its version called KPI. The spectral theory is extended to the case in which the initial data are not vanishing along a finite number of directions at large distances on the plane.
ERIC Educational Resources Information Center
Roose, Annelore; Bijttebier, Patricia; Decoene, Stefaan; Claes, Laurence; Frick, Paul J.
2010-01-01
To provide an extended assessment of the affective features of psychopathy, Frick developed the Inventory of Callous and Unemotional Traits (ICU), which is a multi-informant questionnaire. Previous studies have provided initial support for the self-report version. The aim of the present study is to investigate the validity of self- as well as…
Simple Model with Time-Varying Fine-Structure ``Constant''
NASA Astrophysics Data System (ADS)
Berman, M. S.
2009-10-01
Extending the original version written in collaboration with L.A. Trevisan, we study the generalisation of Dirac's Large Numbers Hypothesis (LNH), so that time-variation of the fine-structure constant, due to varying electric and magnetic permittivities, is included along with other variations (of the cosmological and gravitational ``constants''), etc. We consider the present Universe, and also an inflationary scenario. Rotation of the Universe is a possibility in this model.
Grothendieck-Verdier duality patterns in quantum algebra
NASA Astrophysics Data System (ADS)
Manin, Yu I.
2017-08-01
After a brief survey of the basic definitions of Grothendieck-Verdier categories and dualities, I consider in this context dualities introduced earlier in the categories of quadratic algebras and operads, largely motivated by the theory of quantum groups. Finally, I argue that Dubrovin's `almost duality' in the theory of Frobenius manifolds and quantum cohomology must also fit a (possibly extended) version of Grothendieck-Verdier duality.
Gaussian operations and privacy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Navascues, Miguel; Acin, Antonio
2005-07-15
We consider the possibilities offered by Gaussian states and operations for two honest parties, Alice and Bob, to obtain privacy against a third eavesdropping party, Eve. We first extend the security analysis of the protocol proposed in [Navascues et al. Phys. Rev. Lett. 94, 010502 (2005)]. Then, we prove that a generalized version of this protocol does not allow one to distill a secret key out of bound entangled Gaussian states.
ERIC Educational Resources Information Center
Francis, Andrea Ploucher
2010-01-01
With the increased availability of technology to teachers, it becomes important for researchers and educators alike to understand why teachers choose to use technology for educational purposes. In this study, I use a weak version of the Computers as Social Actors (CASA) hypothesis (Reeves and Nass, 1996; Nass and Moon, 2000) to extend the concept…
ERIC Educational Resources Information Center
Wilding, John; Burke, Kate
2006-01-01
This study aimed to extend earlier work (Wilding, Munir, & Cornish, 2001; Wilding, 2003) which showed that children (aged 6-15) who were rated by their teachers as having poor attentional ability made more errors on a visual search task than children rated as having good attentional ability. The present study used a simpler version of the search…
Design and Implementation of Online Communities
2001-09-01
online community. In reality, the first online community predates even the ARPANET by over 100 years. The first “netizens” communicated on the 19th ...Century’s version of the Internet, which Tom Standage calls “The Victorian Internet” (Standage, 1998). The Victorian Internet was actually the...network of networks extending throughout much of the world. The similarities don’t stop there. The Victorian Internet spawned an extensive online
Realizing the Living Paper using the ProvONE Model for Reproducible Research
NASA Astrophysics Data System (ADS)
Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.
2015-12-01
Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software, giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise.
The Living Paper provides detailed metadata for properly interpreting and verifying individual research findings, for tracing the origin of ideas, for launching new lines of inquiry, and for implementing transitive credit for research and engineering.
Interregional migration in an extended input-output model.
Madden, M; Trigg, A B
1990-01-01
"This article develops a two-region version of an extended input-output model that disaggregates consumption among employed, unemployed, and inmigrant households, and which explicitly models the influx into a region of migrants to take up a proportion of any jobs created in the regional economy. The model is empirically tested using real data for the Scotland (UK) regions of Strathclyde and Rest-of-Scotland. Sets of interregional economic, demographic, demo-economic, and econo-demographic multipliers are developed and discussed, and the effects of a range of economic and demographic impacts are modeled. The circumstances under which Hawkins-Simon conditions for non-negativity are breached are identified, and the limits of the model discussed." excerpt
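The multiplier machinery behind such extended input-output models rests on the Leontief inverse, x = (I - A)^(-1) f, with the Hawkins-Simon conditions guaranteeing non-negative outputs. The two-sector coefficients below are invented purely to illustrate the arithmetic, not drawn from the Strathclyde or Rest-of-Scotland data.

```python
# Leontief-inverse multipliers for a toy two-sector economy, illustrating the
# arithmetic behind extended input-output models. Coefficients are invented.
def leontief_inverse_2x2(A):
    """(I - A)^(-1) for a 2x2 technical-coefficient matrix A."""
    a, b = 1.0 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1.0 - A[1][1]
    det = a * d - b * c          # Hawkins-Simon: leading minors must be positive
    if a <= 0 or det <= 0:
        raise ValueError("Hawkins-Simon conditions breached: no viable economy")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[0.2, 0.3],     # inter-sector technical coefficients (hypothetical)
     [0.1, 0.4]]
L = leontief_inverse_2x2(A)

# Gross output needed to satisfy a final-demand injection f = (100, 50):
f = [100.0, 50.0]
x = [L[0][0] * f[0] + L[0][1] * f[1],
     L[1][0] * f[0] + L[1][1] * f[1]]
print(x)   # each gross output exceeds the direct demand (multipliers > 1)
```

Extended models like the one in the article enlarge A with household and migration rows and columns, so the same inverse then yields demo-economic multipliers as well.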
OXlearn: a new MATLAB-based simulation tool for connectionist models.
Ruh, Nicolas; Westermann, Gert
2009-11-01
OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.
COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement
NASA Technical Reports Server (NTRS)
Moas, E. (Editor)
1997-01-01
The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.
Brazilian cross-cultural adaptation of “Return-to-work self-efficacy” questionnaire
Silva, João Silvestre; Griep, Rosane Härter; Lagerveld, Suzanne E; Fischer, Frida Marina
2017-01-01
OBJECTIVE To describe the translation and early stages of cross-cultural adaptation of the questionnaire Verwachtingen over werken (or “Return-to-work self-efficacy”) for workers on sick leave due to mental disorders, from the original in Dutch to the Brazilian Portuguese language. METHODS A panel of experts was convened to determine the conceptual and item equivalence of the questionnaire. For semantic equivalence, the Dutch-Brazilian Portuguese translations were consolidated and consensus meetings were held to structure versions of the instrument. Each version was back-translated from Brazilian Portuguese to Dutch and evaluated by one of the authors of the original version. The final version was submitted to two pre-tests for operational equivalence. RESULTS The original questionnaire in Dutch was translated twice into Brazilian Portuguese. During the process, four consensus meetings of the experts’ panel were held to create the versions. Each version was back-translated to Dutch. One of the authors of the original questionnaire evaluated the first three versions until the final one was defined, which was titled Expectativas sobre o trabalho (Expectations about work). Pre-test participants did not report problems in filling out the questionnaire. CONCLUSIONS Results indicate that the Brazilian Portuguese cross-culturally adapted version maintains the original meaning of the questionnaire, while including characteristics peculiar to the Brazilian reality. Measurement and functional equivalence of this version must still be evaluated before its application can be recommended for workers who have been absent from work due to mental disorders. PMID:28273232
Piredda, Giulia
2017-01-01
Clark and Chalmers (1998) introduced the extended mind hypothesis, according to which some mental states can be realized by non-biological external resources. A lively debate has flourished around this hypothesis, connected with the issues of embodiment, embeddedness, situatedness and enaction (cf. Clark, 2008; Menary, 2010; Shapiro, 2011). Two of the main criticisms addressed to the functionalist version of the extended mind thesis have been the so-called “coupling-constitution fallacy” and the alleged lack of a mark of the cognitive (Adams and Aizawa, 2001, 2005, 2009, 2010a,b). According to Adams and Aizawa, extended cognition is a logical possibility, but is not instantiated in our world. Following this view, they defend a “contingent intracranialism,” based on a specific mark of the cognitive that they propose. In this paper I intend to show that neither criticism is effective against the extended cognition thesis. In particular: the mark of the cognitive proposed by Adams and Aizawa does not secure contingent intracranialism;the coupling-constitution fallacy criticizes extended cognition on precisely the point the theory was intended to defend: namely, that the best way to individuate cognitive systems, given a minimal mark of the cognitive, is to rely on coupling relations between agents and environmental resources. PMID:29234294
Neuraxial blockade for external cephalic version: Cost analysis.
Yamasato, Kelly; Kaneshiro, Bliss; Salcedo, Jennifer
2015-07-01
Neuraxial blockade (epidural or spinal anesthesia/analgesia) with external cephalic version increases the external cephalic version success rate. Hospitals and insurers may affect access to neuraxial blockade for external cephalic version, but the costs to these institutions remain largely unstudied. The objective of this study was to perform a cost analysis of neuraxial blockade use during external cephalic version from hospital and insurance payer perspectives. Secondarily, we estimated the effect of neuraxial blockade on cesarean delivery rates. A decision-analysis model was developed using costs and probabilities occurring prenatally through the delivery hospital admission. Model inputs were derived from the literature, national databases, and local supply costs. Univariate and bivariate sensitivity analyses and Monte Carlo simulations were performed to assess model robustness. Neuraxial blockade was cost saving to both hospitals ($30 per delivery) and insurers ($539 per delivery) using baseline estimates. From both perspectives, however, the model was sensitive to multiple variables. Monte Carlo simulation indicated neuraxial blockade to be more costly in approximately 50% of scenarios. The model demonstrated that routine use of neuraxial blockade during external cephalic version, compared to no neuraxial blockade, prevented 17 cesarean deliveries for every 100 external cephalic versions attempted. Neuraxial blockade is associated with minimal hospital and insurer cost changes in the setting of external cephalic version, while reducing the cesarean delivery rate. © 2015 The Authors. Journal of Obstetrics and Gynaecology Research © 2015 Japan Society of Obstetrics and Gynecology.
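The decision-analytic structure described can be sketched as a small Monte Carlo simulation over uncertain inputs. All dollar figures and probability ranges below are invented placeholders, not the study's actual model inputs.

```python
# Toy Monte Carlo cost sketch for an external cephalic version (ECV) strategy:
# a successful ECV leads to vaginal delivery, a failure to cesarean delivery.
# Uncertain inputs are drawn from hypothetical ranges to mimic the kind of
# sensitivity analysis the study performs.
import random

def mean_cost(p_lo, p_hi, extra_fee, n=20_000, seed=0):
    """Average per-delivery cost when ECV success probability is drawn
    uniformly from [p_lo, p_hi]. Delivery costs are invented constants."""
    rng = random.Random(seed)
    vaginal, cesarean = 6_000.0, 11_000.0  # hypothetical per-delivery costs
    total = 0.0
    for _ in range(n):
        p = rng.uniform(p_lo, p_hi)
        total += extra_fee + p * vaginal + (1 - p) * cesarean
    return total / n

# Hypothetical scenario: neuraxial blockade adds an $800 fee but widens the
# ECV success range upward, shifting deliveries away from cesarean.
no_block = mean_cost(0.35, 0.55, extra_fee=0.0)
with_block = mean_cost(0.50, 0.70, extra_fee=800.0)
print(round(no_block), round(with_block))
```

With these made-up numbers the two strategies land within a few percent of each other, which mirrors the study's finding that cost differences are small and sensitive to the input assumptions.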
NASA Astrophysics Data System (ADS)
Kuilman, Maartje; Karlsson, Bodil; Benze, Susanne; Megner, Linda
2017-11-01
Ice particles in the summer mesosphere - such as those connected to noctilucent clouds and polar mesospheric summer echoes - have since their discovery contributed to the uncovering of atmospheric processes on various scales ranging from interactions on molecular levels to global scale circulation patterns. While there are numerous model studies on mesospheric ice microphysics and how the clouds relate to the background atmosphere, there are at this point few studies using comprehensive global climate models to investigate the observed variability and climatology of noctilucent clouds (NLCs). This study explores to what extent the large-scale inter-annual characteristics of noctilucent clouds are captured in a 30-year run - extending from 1979 to 2009 - of the nudged and extended version of the Canadian Middle Atmosphere Model (CMAM30). To construct and investigate zonal mean inter-seasonal variability in noctilucent cloud occurrence frequency and ice mass density in both hemispheres, a simple cloud model is applied in which it is assumed that the ice content is solely controlled by the local temperature and water vapor volume mixing ratio. The model results are compared to satellite observations, each having an instrument-specific sensitivity when it comes to detecting noctilucent clouds. It is found that the model is able to capture the onset dates of the NLC seasons in both hemispheres as well as the hemispheric differences in NLCs, such as weaker NLCs in the SH than in the NH and differences in cloud height. We conclude that the observed cloud climatology and zonal mean variability are well captured by the model.
NASA Astrophysics Data System (ADS)
Nageswararao, M. M.; Mohanty, U. C.; Kiran Prasad, S.; Osuri, Krishna K.; Ramakrishna, S. S. V. S.
2016-11-01
The surface air temperature during the winter season (December-February) in India adversely affects agriculture as well as day-to-day life. Therefore, accurate prediction of winter temperature on the extended range is of utmost importance. The National Center for Environmental Prediction (NCEP) has been providing climatic variables from a fully coupled global climate model, known as Climate Forecast System version 1 (CFSv1), on monthly to seasonal scales since 2004, and it was upgraded to CFSv2 in 2011. In the present study, the performance of CFSv1 and CFSv2 in simulating the winter 2 m maximum, minimum, and mean temperatures (Tmax, Tmin, and Tmean, respectively) over India is evaluated with respect to India Meteorological Department (IMD) 1° × 1° observations. The hindcast data obtained from both versions of CFS from 1982 to 2009 (27 years) with November initial conditions (lead-1) are used. The analyses of winter Tmax, Tmin, and Tmean revealed that CFSv1 and CFSv2 are able to replicate the patterns of observed climatology, interannual variability, and coefficient of variation with a slight negative bias. Of the two, CFSv2 better captures the observed increasing trends of winter temperatures. The Tmax, Tmin, and Tmean correlations from CFSv2 are significantly high (0.35, 0.53, and 0.51, respectively), while the CFSv1 correlations are lower (0.29, 0.15, and 0.12) and statistically insignificant. This performance of CFSv2 may be due to the better estimation of surface heat budget terms and a realistic CO2 concentration, which were absent in CFSv1. CFSv2 proved to have a high probability of detection in predicting different categories (below, near, and above normal) of winter Tmin over north India, which are required for crop yield and public utility services.
Stelmokas, Julija; Yassay, Lance; Giordani, Bruno; Dodge, Hiroko H; Dinov, Ivo D; Bhaumik, Arijit; Sathian, K; Hampstead, Benjamin M
2017-01-01
NeuroQuant (NQ) is a fully-automated program that overcomes several existing limitations in the clinical translation of MRI-derived volumetry. The current study characterized differences between the original (NQ1) and an updated NQ version (NQ2) by 1) replicating previously identified relationships between neuropsychological test performance and medial temporal lobe volumes, 2) evaluating the level of agreement between NQ versions, and 3) determining if the addition of NQ2 age-/sex-based z-scores hold greater clinical utility for prediction of memory impairment than standard percent of intracranial volume (% ICV) values. Sixty-seven healthy older adults and 65 mild cognitive impairment patients underwent structural MRI and completed cognitive testing, including the Immediate and Delayed Memory indices from the Repeatable Battery for the Assessment of Neuropsychological Status. Results generally replicated previous relationships between key medial temporal lobe regions and memory test performance, though comparison of NQ regions revealed statistically different values that were biased toward one version or the other depending on the region. NQ2 hippocampal z-scores explained additional variance in memory performance relative to % ICV values. Findings indicate that NQ1/2 medial temporal lobe volumes, especially age- and sex-based z-scores, hold clinical value, though caution is warranted when directly comparing volumes across NQ versions.
FastDart: a fast, accurate and friendly version of DART code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Taboada, H.
2000-11-08
A new enhanced, visual version of the DART code is presented. DART is a mechanistic-model-based code developed for the performance calculation and assessment of aluminum dispersion fuel. The major features of this new version are a new, time-saving calculation routine able to be run on a PC, a friendly visual input interface, and a plotting facility. This version, available for silicide and U-Mo fuels, adds faster execution and visual interfaces to the classical accuracy of the DART models for fuel performance prediction. It is part of a collaboration agreement between ANL and CNEA in the area of Low Enriched Uranium Advanced Fuels, held by the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy.
An experimental investigation of fault tolerant software structures in an avionics application
NASA Technical Reports Server (NTRS)
Caglayan, Alper K.; Eckhardt, Dave E., Jr.
1989-01-01
The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
The Hydrologic Evaluation of Landfill Performance (HELP) computer program is a quasi-two-dimensional hydrologic model of water movement across, into, through and out of landfills. The model accepts weather, soil and design data. Landfill systems including various combinations o...
Performance Validation of Version 152.0 ANSER Control Laws for the F-18 HARV
NASA Technical Reports Server (NTRS)
Messina, Michael D.
1996-01-01
The Actuated Nose Strakes for Enhanced Rolling (ANSER) Control Laws were modified as a result of Phase 3 F/A-18 High Alpha Research Vehicle (HARV) flight testing. The control law modifications for the next software release were designated version 152.0. The Ada implementation was tested in the Hardware-In-the-Loop (HIL) simulation and results were compared to those obtained with the NASA Langley batch Fortran implementation of the control laws which are considered the 'truth model.' This report documents the performance validation test results between these implementations for ANSER control law version 152.0.
Preliminary Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.1
The AMAD will perform two annual CMAQ model simulations, one with the current publically available version of the CMAQ model (v5.0.2) and the other with the beta version of the new model (v5.1). The results of each model simulation will then be compared to observations and the pe...
Excoffier, Laurent; Lischer, Heidi E L
2010-05-01
We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.
Sveen, Unni; Andelic, Nada; Bautz-Holter, Erik; Røe, Cecilie
2015-01-01
To evaluate the psychometric properties of the Norwegian version of the Patient Competency Rating Scale (PCRS) in patients with traumatic brain injury (TBI) at 12 months post-injury. Demographic and injury-related data were registered upon admission to the hospital in 148 patients with mild, moderate, or severe TBI. At 12 months post-injury, competency in activities and global functioning were measured using the PCRS patient version and the Glasgow Outcome Scale-Extended (GOSE). Descriptive reliability statistics, factor analysis, and Rasch modeling were applied to explore the psychometric properties of the PCRS. External validity was evaluated using the GOSE. The PCRS can be divided into three subscales that reflect interpersonal/emotional, cognitive, and activities of daily living competency. The three-factor solution explained 56.6% of the variance in functioning. The internal consistency was very good, with a Cronbach's α of 0.95. Item 30, "controlling my laughter", did not load above 0.40 on any factor and did not fit the Rasch model. The external validity of the subscales was acceptable, with correlations between 0.50 and 0.52 with the GOSE. The Norwegian version of the PCRS is reliable, has acceptable construct and external validity, and can be recommended for use during the later phases of TBI.
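Cronbach's alpha, the internal-consistency statistic reported here (and in the PCAT validation above), can be computed directly from item scores. A minimal sketch with made-up item data purely to exercise the formula:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance(totals)).
# The item scores below are invented for illustration, not study data.

def cronbach_alpha(items):
    """items: list of per-item score lists, all over the same respondents."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(variance(it) for it in items)
    totals = [sum(it[r] for it in items) for r in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

scores = [[3, 4, 5, 2, 4],   # item 1, five respondents
          [3, 5, 4, 2, 5],   # item 2
          [2, 4, 5, 3, 4]]   # item 3
print(round(cronbach_alpha(scores), 3))  # → 0.886
```

Values near 0.95, as found for the PCRS, indicate that the items covary strongly and the scale is internally consistent.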
Assessing recent warming using instrumentally homogeneous sea surface temperature records.
Hausfather, Zeke; Cowtan, Kevin; Clarke, David C; Jacobs, Peter; Richardson, Mark; Rohde, Robert
2017-01-01
Sea surface temperature (SST) records are subject to potential biases due to changing instrumentation and measurement practices. Significant differences exist between commonly used composite SST reconstructions from the National Oceanic and Atmospheric Administration's Extended Reconstruction Sea Surface Temperature (ERSST), the Hadley Centre SST data set (HadSST3), and the Japanese Meteorological Agency's Centennial Observation-Based Estimates of SSTs (COBE-SST) from 2003 to the present. The update from ERSST version 3b to version 4 resulted in an increase in the operational SST trend estimate during the last 19 years from 0.07° to 0.12°C per decade, indicating a higher rate of warming in recent years. We show that ERSST version 4 trends generally agree with largely independent, near-global, and instrumentally homogeneous SST measurements from floating buoys, Argo floats, and radiometer-based satellite measurements that have been developed and deployed during the past two decades. We find a large cooling bias in ERSST version 3b and smaller but significant cooling biases in HadSST3 and COBE-SST from 2003 to the present, with respect to most series examined. These results suggest that reported rates of SST warming in recent years have been underestimated in these three data sets.
Student perceptions of secondary science: A performance technology application
NASA Astrophysics Data System (ADS)
Small, Belinda Rusnak
The primary purpose of this study was to identify influences blocking or promoting science performance in the lived K-12 classroom experience. Human Performance Technology protocols were used to understand factors promoting or hindering science performance. The goal was to gain information from the individual student's perspective to enhance opportunities for stakeholders to improve the current state of performance in science education. Individual perspectives of 10 secondary science students were examined using grounded theory protocols. Findings indicate that students' science learning behaviors are influenced by two major themes: environmental supports and individual learning behaviors. The three environmental support factors identified are the methods by which students receive instruction, students' opportunities to access informal help apart from formal instruction, and students' feelings of teacher likability. Findings also include three major factors that lead individual learners to generate knowledge in science: personalizing information to transform data into knowledge, customizing learning opportunities to maximize peak performance, and tapping motivational opportunities to persevere through complex concepts. The emergent theory postulated is that if a performance problem exists in an educational setting, then integrating student perspectives into the cause analysis opens the opportunity to align interventions for influencing student performance outcomes. An adapted version of Gilbert's Behavioral Engineering Model is presented as an organizational tool to display the findings. The boundaries of this Performance Technology application do not extend to the identification, selection, design, or implementation of solutions for improved science performance. However, as stakeholders begin to understand learner perspectives, aligned decisions may be made to support learners of science in a direct, cost-effective manner.
Schunck, N.; Dobaczewski, J.; Satuła, W.; ...
2017-03-27
Here, we describe the new version (v2.73y) of the code hfodd which solves the nuclear Skyrme Hartree–Fock or Skyrme Hartree–Fock–Bogolyubov problem by using the Cartesian deformed harmonic-oscillator basis. In the new version, we have implemented the following new features: (i) full proton–neutron mixing in the particle–hole channel for Skyrme functionals, (ii) the Gogny force in both particle–hole and particle–particle channels, (iii) linear multi-constraint method at finite temperature, (iv) fission toolkit including the constraint on the number of particles in the neck between two fragments, calculation of the interaction energy between fragments, and calculation of the nuclear and Coulomb energy of each fragment, (v) the new version 200d of the code hfbtho, together with an enhanced interface between HFBTHO and HFODD, (vi) parallel capabilities, significantly extended by adding several restart options for large-scale jobs, (vii) the Lipkin translational energy correction method with pairing, (viii) higher-order Lipkin particle-number corrections, (ix) interface to a program plotting single-particle energies or Routhians, (x) strong-force isospin-symmetry-breaking terms, and (xi) the Augmented Lagrangian Method for calculations with 3D constraints on angular momentum and isospin. Finally, an important bug related to the calculation of the entropy at finite temperature and several other minor errors of the previously published version were corrected.
NASA Astrophysics Data System (ADS)
Huang, B.; Thorne, P.; Banzon, P. V. F.; Chepurin, G. A.; Lawrimore, J. H.; Menne, M. J.; Vose, R. S.; Smith, T. M.; Zhang, H. M.
2017-12-01
The monthly global 2°×2° Extended Reconstructed Sea Surface Temperature (ERSST) has been revised and updated from version 4 to version 5. This update incorporates a new release of ICOADS R3.0, a decade of near-surface data from Argo floats, and a new estimate of centennial sea-ice from HadISST2. A number of choices in aspects of quality control, bias adjustment and interpolation have been substantively revised. The resulting ERSST estimates have more realistic spatio-temporal variations, better representation of high latitude SSTs, and ship SST biases are now calculated relative to more accurate buoy measurements, while the global long-term trend remains about the same. Progressive experiments have been undertaken to highlight the effects of each change in data source and analysis technique upon the final product. The reconstructed SST is systematically decreased by 0.077°C, as the reference data source is switched from ship SST in v4 to modern buoy SST in v5. Furthermore, high latitude SSTs are decreased by 0.1°-0.2°C by using sea-ice concentration from HadISST2 over HadISST1. Changes arising from remaining innovations are mostly important at small space and time scales, primarily having an impact where and when input observations are sparse. Cross-validations and verifications with independent modern observations show that the updates incorporated in ERSSTv5 have improved the representation of spatial variability over the global oceans, the magnitude of El Niño and La Niña events, and the decadal nature of SST changes over 1930s-40s when observation instruments changed rapidly. Both long (1900-2015) and short (2000-2015) term SST trends in ERSSTv5 remain significant as in ERSSTv4.
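The switch of reference from ship to buoy SSTs amounts, at its simplest, to estimating a mean ship-minus-buoy offset from collocated pairs and adjusting accordingly. A toy sketch with invented values (the operational ERSSTv5 adjustment is far more elaborate, varying in space and time):

```python
# Toy ship-to-buoy SST bias adjustment: estimate the mean ship-minus-buoy
# offset from collocated measurement pairs, then remove it from ship readings.
# All temperatures below are invented for illustration.

ship = [18.42, 19.10, 17.95, 20.31, 18.77]   # deg C, collocated pairs
buoy = [18.30, 19.02, 17.88, 20.20, 18.70]

offset = sum(s - b for s, b in zip(ship, buoy)) / len(ship)
adjusted_ship = [s - offset for s in ship]

print(round(offset, 3))  # mean warm bias of ship SSTs relative to buoys
```

Referencing the analysis to buoys rather than ships shifts the absolute SST level (the abstract's systematic 0.077°C decrease) while leaving long-term trends essentially unchanged, since a constant offset cancels in anomalies.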
Improving fast generation of halo catalogues with higher order Lagrangian perturbation theory
NASA Astrophysics Data System (ADS)
Munari, Emiliano; Monaco, Pierluigi; Sefusatti, Emiliano; Castorina, Emanuele; Mohammad, Faizan G.; Anselmi, Stefano; Borgani, Stefano
2017-03-01
We present the latest version of PINOCCHIO, a code that generates catalogues of dark matter haloes in an approximate but fast way with respect to an N-body simulation. This code version implements a new on-the-fly production of halo catalogues on the past light cone with continuous time sampling, and the computation of particle and halo displacements is extended up to third-order Lagrangian perturbation theory (LPT), in contrast with previous versions, which used the Zel'dovich approximation. We run PINOCCHIO on the same initial configuration as a reference N-body simulation, so that the comparison extends to the object-by-object level. We consider haloes at redshifts 0 and 1, using different LPT orders either for halo construction or to compute halo final positions. We compare the clustering properties of PINOCCHIO haloes with those from the simulation by computing the power spectrum and two-point correlation function in real and redshift space (monopole and quadrupole), the bispectrum, and the phase difference of halo distributions. We find that 2LPT and 3LPT give noticeable improvement. 3LPT provides the best agreement with N-body when it is used to displace haloes, while 2LPT gives better results for constructing haloes. At the highest orders, linear bias is typically recovered at the few per cent level. In Fourier space and using 3LPT for halo displacements, the halo power spectrum is recovered to within 10 per cent up to kmax ∼ 0.5 h Mpc-1. The results presented in this paper have interesting implications for the generation of large ensembles of mock surveys for the scientific exploitation of data from big surveys.
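The LPT displacements underpinning PINOCCHIO can be illustrated at first (Zel'dovich) order, where particles at Lagrangian positions q move as x = q + D·ψ(q); higher orders add further growth-factor-weighted terms. A toy one-dimensional sketch with an invented displacement field:

```python
# Schematic 1D Zel'dovich (first-order LPT) mapping from Lagrangian to
# Eulerian positions. The displacement field and growth factor are toy
# values for illustration, not a cosmological realization.
import math

def zeldovich(q, D, psi):
    """x = q + D * psi(q) for each Lagrangian coordinate q."""
    return [qi + D * psi(qi) for qi in q]

def psi(q):
    """Toy sinusoidal displacement field on a unit box."""
    return 0.05 * math.sin(2 * math.pi * q)

q = [i / 8 for i in range(8)]          # uniform Lagrangian grid
x = zeldovich(q, D=1.0, psi=psi)       # displaced (Eulerian) positions
print([round(v, 3) for v in x])
```

Particles drain away from the field's node at q = 0.5 more slowly than a full N-body evolution would move them; 2LPT and 3LPT correct exactly this kind of error in the final positions, which is why the higher orders improve the recovered clustering.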
NASA Astrophysics Data System (ADS)
Murray, Natalie; Bourne, Neil; Field, John
1997-07-01
Brar and Bless pioneered the use of plate impact upon bars as a technique for investigating the 1D stress loading of glass. We wish to extend this technique by applying VISAR and embedded stress gauge measurements to a symmetrical version of the test. In this configuration two rods impact one upon the other in a symmetrical version of the Taylor test geometry in which the impact is perfectly rigid in the centre of mass frame. Previous work in the laboratory has characterised the three glass types (float, borosilicate and a high density lead glass). These experiments will identify the 1D stress failure mechanisms from high-speed photography, and the stress and particle velocity histories will be interpreted in the light of these results. The differences in response of the three glasses will be highlighted.
Shielding from space radiations
NASA Technical Reports Server (NTRS)
Chang, C. Ken; Badavi, Forooz F.; Tripathi, Ram K.
1993-01-01
This Progress Report, covering the period of December 1, 1992 to June 1, 1993, presents the development of an analytical solution to the heavy ion transport equation in terms of Green's function formalism. The mathematical development results are recast into a highly efficient computer code for space applications. The efficiency of this algorithm is accomplished by a nonperturbative technique of extending the Green's function over the solution domain. The code may also be applied to accelerator boundary conditions to allow code validation in laboratory experiments. Results from the isotopic version of the code, with 59 isotopes present, for a single-layer target material are presented for the case of an iron beam projectile at 600 MeV/nucleon in water. A listing of the single-layer isotopic version of the code is included.
Strehl-constrained reconstruction of post-adaptive optics data and the Software Package AIRY, v. 6.1
NASA Astrophysics Data System (ADS)
Carbillet, Marcel; La Camera, Andrea; Deguignet, Jérémy; Prato, Marco; Bertero, Mario; Aristidi, Éric; Boccacci, Patrizia
2014-08-01
We first briefly present the latest version of the Software Package AIRY, version 6.1, a CAOS-based tool which includes various deconvolution methods, accelerations, regularizations, super-resolution, boundary effects reduction, point-spread function extraction/extrapolation, stopping rules, and constraints in the case of iterative blind deconvolution (IBD). Then, we focus on a new formulation of our Strehl-constrained IBD, here quantitatively compared to the original formulation for simulated near-infrared data of an 8-m class telescope equipped with adaptive optics (AO), showing their equivalence. Next, we extend the application of the original method to the visible domain with simulated data of an AO-equipped 1.5-m telescope, also testing the robustness of the method with respect to the Strehl ratio estimation.
Families in Bollywood cinema: changes and context.
Deakin, Nicholas; Bhugra, Dinesh
2012-04-01
With increasing and rapid urbanization and population changes in India, a growing number of people are migrating from rural areas to urban areas, which brings about major changes in support systems. As a result, the portrayal of families has also changed in Hindi cinema over the last 50 years. Recent family melodramas have focused on an idealized version of joint and extended families. In this paper we use some key Hindi films of the 1960s and of the last two decades to compare how films have changed and how, in view of changing audiences, they have created a version of the family which is far from real. Clinicians need to be aware of these changes while dealing with patients and their families (the latter may have unrealistic expectations of their own family members).
Bao, Yijun; Gaylord, Thomas K
2016-11-01
Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.
Introducing MCgrid 2.0: Projecting cross section calculations on grids
NASA Astrophysics Data System (ADS)
Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen
2015-11-01
MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting-edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0, which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximate treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also now supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.
Dai, Meiling; Guo, Hongbo; Dortmans, Jos C. F. M.; Dekkers, Jojanneke; Nordholm, Johan; Daniels, Robert; van Kuppeveld, Frank J. M.; de Vries, Erik
2016-01-01
ABSTRACT Influenza A virus (IAV) attachment to and release from sialoside receptors is determined by the balance between hemagglutinin (HA) and neuraminidase (NA). The molecular determinants that mediate the specificity and activity of NA are still poorly understood. In this study, we aimed to design the optimal recombinant soluble NA protein to identify residues that affect NA enzymatic activity. To this end, recombinant soluble versions of four different NA proteins from H5N1 viruses were compared with their full-length counterparts. The soluble NA ectodomains were fused to three commonly used tetramerization domains. Our results indicate that the particular oligomerization domain used does not affect the Km value but may affect the specific enzymatic activity. This particularly holds true when the stalk domain is included and for NA ectodomains that display a low intrinsic ability to oligomerize. NA ectodomains extended with a Tetrabrachion domain, which forms a nearly parallel four-helix bundle, better mimicked the enzymatic properties of full-length proteins than when other coiled-coil tetramerization domains were used, which probably distort the stalk domain. Comparison of different NA proteins and mutagenic analysis of recombinant soluble versions thereof resulted in the identification of several residues that affected oligomerization of the NA head domain (position 95) and therefore the specific activity or sialic acid binding affinity (Km value; positions 252 and 347). This study demonstrates the potential of using recombinant soluble NA proteins to reveal determinants of NA assembly and enzymatic activity. IMPORTANCE The IAV HA and NA glycoproteins are important determinants of host tropism and pathogenicity. However, NA is relatively understudied compared to HA. Analysis of soluble versions of these glycoproteins is an attractive way to study their activities, as they are easily purified from cell culture media and applied in downstream assays. 
In the present study, we analyzed the enzymatic activity of different NA ectodomains with three commonly used tetramerization domains and compared them with full-length NA proteins. By performing a mutagenic analysis, we identified several residues that affected NA assembly, activity, and/or substrate binding. In addition, our results indicate that the design of the recombinant soluble NA protein, including the particular tetramerization domain, is an important determinant for maintaining the enzymatic properties within the head domain. NA ectodomains extended with a Tetrabrachion domain better mimicked the full-length proteins than when the other tetramerization domains were used. PMID:27512075
Evaluation of Aquarius Version-5 Sea Surface Salinity on various spatial and temporal scales
NASA Astrophysics Data System (ADS)
Lee, T.
2017-12-01
Sea surface salinity (SSS) products from Aquarius have had three public releases with progressive improvements in data quality: Versions 2, 3, and 4, with the last released in October 2015. A systematic assessment of the Version-4, Level-3 Aquarius SSS product was performed on various spatial and temporal scales by comparing it with gridded Argo products (Lee 2016, Geophys. Res. Lett.). The comparison showed that the consistency of Aquarius Version-4 SSS with gridded Argo products is comparable to that between two different gridded Argo products. However, significant seasonal biases remain in high-latitude oceans. Further improvements are being made by the Aquarius team. Aquarius Version-5 SSS is scheduled to be released in October 2017 as the final version of the Aquarius Project. This presentation provides a similar evaluation of the Version-5 SSS as reported by Lee (2016) and contrasts it with the current Version-4 SSS.
Hruban, L; Janků, P; Jordánová, K; Gerychová, R; Huser, M; Ventruba, P; Roztočil, A
2017-01-01
Evaluation of the success rate and safety of external cephalic version after 36 weeks of gestation. Retrospective analysis. Department of Obstetrics and Gynecology, Masaryk University, University Hospital Brno. A retrospective analysis of external cephalic version attempts performed on a group of 638 singleton breech pregnancies after 36 weeks of gestation in the years 2003-2016 at the Department of Gynecology and Obstetrics, Masaryk University, Brno. The effectiveness, number and type of complications, mode of delivery and perinatal outcome were observed. The effectiveness of external cephalic version from breech to head presentation was 47.8% (305 cases). After a successful external cephalic version, 238 patients (78.0%) gave birth vaginally. After an unsuccessful version, 130 patients (39.0%) gave birth vaginally. The number of serious complications did not exceed 0.9% and did not affect perinatal outcomes. External cephalic version-related emergency cesarean deliveries occurred in 6 cases (2 placental abruptions, 4 abnormal cardiotocography findings). The fetal outcome was good in all these cases. No fetal death in connection with the external version occurred in our cohort. Spontaneous discharge of amniotic fluid within 24 hours after the procedure occurred in 5 cases (0.8%). Spontaneous onset of labor within 24 hours of the procedure occurred in 5 cases (0.8%). An umbilical artery pH value < 7.00 occurred in 2 cases in the group with a successful external version and in 9 cases in the group with an unsuccessful version. An Apgar score < 5 at the 5th minute occurred in 1 case in each of the successful and unsuccessful groups. External cephalic version of the fetus in the case of breech presentation after the 36th week of pregnancy is an effective and safe alternative for women who fear vaginal breech delivery. Performing external cephalic version can reduce the rate of elective caesarean sections due to breech presentation at term.
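The reported rates follow directly from the counts given in the abstract; a quick arithmetic check (counts taken from the text above):

```python
# Counts reported in the abstract
total = 638                  # singleton breech pregnancies
successful = 305             # versions turned to cephalic presentation
vaginal_after_success = 238  # vaginal births after a successful version
vaginal_after_failure = 130  # vaginal births after an unsuccessful version

print(round(successful / total * 100, 1))                        # 47.8
print(round(vaginal_after_success / successful * 100, 1))        # 78.0
print(round(vaginal_after_failure / (total - successful) * 100, 1))  # 39.0
```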
JPSS-1 VIIRS version 2 at-launch relative spectral response characterization and performance
NASA Astrophysics Data System (ADS)
Moeller, Chris; Schwarting, Tom; McIntire, Jeff; Moyer, David I.; Zeng, Jinan
2016-09-01
The relative spectral response (RSR) characterization of the JPSS-1 VIIRS spectral bands achieved "at-launch" status in the VIIRS Data Analysis Working Group February 2016 Version 2 RSR release. The Version 2 release improves upon the June 2015 Version 1 release by including December 2014 NIST T-SIRCUS spectral measurements of the VIIRS VisNIR bands in the analysis and by correcting the CO2 influence on the band M13 RSR. The T-SIRCUS based characterization is merged with the summer 2014 SpMA based characterization of the VisNIR bands (Version 1 release) to yield a "fused" RSR for these bands, combining the strengths of the T-SIRCUS and SpMA measurement systems. The M13 RSR is updated by applying a model-based correction to mitigate CO2 attenuation of the SpMA source signal that occurred during the M13 spectral measurements. The Version 2 release carries forward the Version 1 RSR for those bands that were not updated (M8-M12, M14-M16A/B, I3-I5, DNBMGS). The Version 2 release includes band-average (over all detectors and subsamples) RSR plus supporting RSR for each detector and subsample. The at-launch band-average RSR have been used to populate Look-Up Tables supporting the sensor data record and environmental data record at-launch science products. Spectral performance metrics show that the JPSS-1 VIIRS RSR are compliant with specifications, with a few minor exceptions. The Version 2 release, which replaces the Version 1 release, is currently available on the password-protected NASA JPSS-1 eRooms under EAR99 control.
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasiński, Piotr; Linczuk, Paweł; Poźniak, Krzysztof T.; Chernyshova, Maryna; Kasprowicz, Grzegorz; Wojeński, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Paweł
2016-09-01
This article is an overview of what has been implemented in the process of development and testing the GEM detector based acquisition system in terms of post-processing algorithms. Information is given on mex functions for extended statistics collection, unified hex topology and optimized S-DAQ algorithm for splitting overlapped signals. Additional discussion on bottlenecks and major factors concerning optimization is presented.
ERIC Educational Resources Information Center
Lee, Chung Gun
2014-01-01
This study consists of three sub-studies. Sub-study 1 and 2 attempted to incorporate environmental variables as precursor background variables of the theory of planned behavior (TPB) to predict quitting-related intentions among Texas adult smokers and university student smokers, respectively. Sub-study 1 and 2 analyzed different data sets and were…
Sugimoto, Mikio; Takegami, Misa; Suzukamo, Yoshimi; Fukuhara, Shunichi; Kakehi, Yoshiyuki
2008-06-01
To evaluate health-related quality of life (HRQOL) using the Medical Outcomes Study 8-item Short Form Health Survey (SF-8) questionnaire in Japanese patients with early prostate cancer. A cross-sectional analysis was done in 457 patients with prostate cancer treated with radical prostatectomy, external beam radiotherapy, brachytherapy, androgen deprivation therapy, watchful waiting, or a combination of these therapies. General HRQOL was measured using the Japanese version of the SF-8 questionnaire, and disease-specific HRQOL was assessed using the Japanese version of the Extended Prostate Cancer Index Composite. The external beam radiotherapy group reported significantly lower values for the physical health component summary score (PCS) than the radical prostatectomy and brachytherapy groups (P < 0.05). In the analysis of both the PCS and the mental health component summary score (MCS) over time after treatment, higher scores with time were found in the radical prostatectomy group. No significant change over time in the PCS was found after androgen deprivation therapy. In contrast, the MCS was found to deteriorate in the early period, then showed a significant increase over time. The SF-8 in combination with the Extended Prostate Cancer Index Composite has been shown to be a helpful tool in the HRQOL assessment of Japanese patients treated for localized prostate cancer.
Javitt, Daniel C; Rabinowicz, Esther; Silipo, Gail; Dias, Elisa C
2007-03-01
Deficits in working memory performance are among the most widely replicated findings in schizophrenia. The respective roles of encoding and memory retention in working memory remain unresolved. The present study evaluated working memory performance in schizophrenia using an AX-type continuous performance test (AX-CPT) paradigm. Participants included 48 subjects with schizophrenia and 27 comparison subjects. Behavior was obtained in 3 versions of the task, which differed in ease of cue interpretability. In a simple cue version of the task, cue letters were replaced with red or green circles. In the complex cue version, letter/color conjunctions served as cues. In the base version of the task, patients showed increased rates of false alarms to invalidly cued targets, similar to prior reports. However, when the cue stimuli were replaced with green or red circles to ease interpretation, patients showed false alarm rates similar to controls. When feature conjunction cues were used, patients were disproportionately affected relative to controls. No significant group by interstimulus interval interaction effects were observed in either the simple or complex cue conditions, suggesting normal retention of information even in the presence of overall performance decrements. These findings suggest, first, that cue manipulation disproportionately affects AX-CPT performance in schizophrenia and, second, that substantial behavioral deficits may be observed on working memory tasks even in the absence of disturbances in mnemonic retention.
This Validations Summary Report (VSR) summarizes the results and conclusions of validation testing performed on the HARRIS Ada Compiler, Version 1.0...at compile time, at link time, or during execution. On-site testing was performed 28 APR 1986 through 30 APR 1986 at Harris Corporation, Ft. Lauderdale
Caporali, Priscila Faissola; Caporali, Sueli Aparecida; Bucuvic, Érika Cristina; Vieira, Sheila de Souza; Santos, Zeila Maria; Chiari, Brasília Maria
2016-01-01
Translation and cross-cultural adaptation of the instrument Hearing Implant Sound Quality Index (HISQUI19), and characterization of the target population and auditory performance in Cochlear Implant (CI) users through the application of a synthesis version of this tool. Evaluations of conceptual, item, semantic, and operational equivalences were performed. The synthesis version was applied as a pre-test to 33 individuals, whose results characterized the final sample and the performance of the questionnaire. The results were analyzed statistically. The final translation (FT) was back-translated and compared with the original version, revealing minimal differences between items. The changes observed between the FT and the synthesis version were characterized by the use of simplified, everyday vocabulary. In the pre-test, the average score of the interviewees was 90.2, and a high level of reliability was achieved (0.83). The translation and cross-cultural adaptation of the HISQUI19 questionnaire showed suitability for conceptual, item, semantic, and operational equivalences. In the sample characterization, sound quality was classified as good, with better performance in the categories of location and distinction of sounds/voices.
Balestrieri, M; Giaroli, G; Mazzi, M; Bellantuono, C
2006-05-01
Several studies indicate that subjective experience toward antipsychotic drugs (APs) in schizophrenic patients is a key factor in ensuring a smooth recovery from the illness. The principal aim of this study was to establish the psychometric performance of the Italian version of the Subjective Well-being Under Neuroleptic (SWN) scale and to assess, through the SWN scale, the subjective experience of stabilized psychotic outpatients on maintenance treatment with APs. The original short version of the SWN, consisting of 20 items, was back-translated, and a focus group was conducted to improve the comprehension of the scale. The results showed a good performance of the Italian version of the SWN, as documented by its internal consistency (Cronbach's alpha: 0.85). A satisfactory subjective experience was reported in the sample of schizophrenic outpatients interviewed (SWN mean total score: 84.95, SD: 17.5). The performance of the SWN scale in the present study was very similar to that reported by Naber et al. in the original validation study. Large multi-center studies are needed to better establish differences in the subjective experience of schizophrenic patients treated with first- and second-generation APs.
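Cronbach's alpha, the internal-consistency statistic reported above, is computed from the item variances and the variance of the total score. A minimal sketch on a toy item-response matrix (hypothetical data, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance per item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 4 respondents x 3 items
scores = np.array([[3, 4, 3], [2, 2, 3], [4, 5, 4], [1, 2, 2]])
print(round(cronbach_alpha(scores), 2))  # 0.94
```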
Heuristics in Problem Solving: The Role of Direction in Controlling Search Space
ERIC Educational Resources Information Center
Chu, Yun; Li, Zheng; Su, Yong; Pizlo, Zygmunt
2010-01-01
Isomorphs of a puzzle called m+m resulted in faster solution times and an easily reproduced solution path in a labeled version of the problem compared to a more difficult binary version. We conjecture that performance is related to a type of heuristic called direction that not only constrains search space in the labeled version, but also…
ERIC Educational Resources Information Center
Taha, Haitham
2017-01-01
The current research examined how Arabic diglossia affects verbal learning memory. Thirty native Arab college students were tested using auditory verbal memory test that was adapted according to the Rey Auditory Verbal Learning Test and developed in three versions: Pure spoken language version (SL), pure standard language version (SA), and…
Terrier, A; Ston, J; Larrea, X; Farron, A
2014-04-01
The three-dimensional (3D) correction of glenoid erosion is critical to the long-term success of total shoulder replacement (TSR). In order to characterise the 3D morphology of eroded glenoid surfaces, we looked for a set of morphological parameters useful for TSR planning. We defined a scapular coordinates system based on non-eroded bony landmarks. The maximum glenoid version was measured and specified in 3D by its orientation angle. Medialisation was considered relative to the spino-glenoid notch. We analysed regular CT scans of 19 normal (N) and 86 osteoarthritic (OA) scapulae. When the maximum version of OA shoulders was higher than 10°, the orientation was not only posterior, but extended in postero-superior (35%), postero-inferior (6%) and anterior sectors (4%). The medialisation of the glenoid was higher in OA than normal shoulders. The orientation angle of maximum version appeared as a critical parameter to specify the glenoid shape in 3D. It will be very useful in planning the best position for the glenoid in TSR.
Modeling the Office of Science Ten Year FacilitiesPlan: The PERI Architecture Tiger Team
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, B R; Alam, S R; Bailey, D H
2009-05-27
The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort to the optimization of key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.
Modeling the Office of Science Ten Year Facilities Plan: The PERI Architecture Tiger Team
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, Bronis R.; Alam, Sadaf; Bailey, David H.
2009-06-26
The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort optimizing key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.
Modeling the Office of Science Ten Year Facilities Plan: The PERI Architecture Team
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, Bronis R.; Alam, Sadaf R; Bailey, David
2009-01-01
The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort optimizing key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.
Detecting cheaters without thinking: testing the automaticity of the cheater detection module.
Van Lier, Jens; Revlin, Russell; De Neys, Wim
2013-01-01
Evolutionary psychologists have suggested that our brain is composed of evolved mechanisms. One extensively studied mechanism is the cheater detection module. This module would make people very good at detecting cheaters in a social exchange. A vast amount of research has illustrated performance facilitation on social contract selection tasks. This facilitation is attributed to the alleged automatic and isolated operation of the module (i.e., independent of general cognitive capacity). This study, using the selection task, tested the critical automaticity assumption in three experiments. Experiments 1 and 2 established that performance on social contract versions did not depend on cognitive capacity or age. Experiment 3 showed that experimentally burdening cognitive resources with a secondary task had no impact on performance on the social contract version. However, in all experiments, performance on a non-social contract version did depend on available cognitive capacity. Overall, findings validate the automatic and effortless nature of social exchange reasoning.
NASA Technical Reports Server (NTRS)
Lam, David W.
1995-01-01
The transonic performance of a dual-throat, single-expansion-ramp nozzle (SERN) was investigated with a PARC computational fluid dynamics (CFD) code, an external flow Navier-Stokes solver. The nozzle configuration was from a conceptual Mach 5 cruise aircraft powered by four air-breathing turboramjets. Initial test cases used the two-dimensional version of PARC in Euler mode to investigate the effect of geometric variation on transonic performance. Additional cases used the two-dimensional version in viscous mode and the three-dimensional version in both Euler and viscous modes. Results of the analysis indicate low nozzle performance and a highly three-dimensional nozzle flow at transonic conditions. In another comparative study using the PARC code, a single-throat SERN configuration for which experimental data were available at transonic conditions was used to validate the results of the over/under turboramjet nozzle.
Methadone disrupts performance on the working memory version of the Morris water task.
Hepner, Ilana J; Homewood, Judi; Taylor, Alan J
2002-05-01
The aim of the study was to examine whether administration of the mu-opiate agonist methadone hydrochloride resulted in deficits in performance on the Morris water task, a widely used test of spatial cognition. To this end, after initial training on the task, Long-Evans rats were administered saline or methadone at 1.25, 2.5 or 5 mg/kg ip 15 min prior to testing. The performance of the highest-dose methadone group was inferior to that of the controls on the working memory version of the Morris task. There were also differences between the groups on the reference memory version of the task, but this result cannot be considered reliable. These data show that methadone has its most profound effect on cognition in rats when efficient performance on the task requires attention to and retention of new information, in this case, the relationship between platform location and the extramaze cues.
The role of attention during retrieval in working-memory span: a dual-task study.
Healey, M Karl; Miyake, Akira
2009-04-01
We tested the hypothesis that retrieving target words in operation span (OSpan) involves attention-demanding processes. Participants completed the standard OSpan task and a modified version in which all equations preceded all target words. Recall took place under either full attention or easy versus hard divided-attention conditions. Recall suffered under divided attention with the recall decrement being greater for the hard secondary task. Moreover, secondary-task performance was disrupted more by the standard OSpan task than by the modified version with the hard secondary task showing the larger decrement. Finally, the time taken to start recalling the first word was considerably longer for the standard version than for the modified version. These results are consistent with the proposal that successful OSpan task performance in part involves the attention-demanding retrieval of targets from long-term memory.
Integration of nonthematic details in pictures and passages.
Viera, C L; Homa, D L
1991-01-01
Nonthematic details in naturalistic scenes were manipulated to produce four stimulus versions: color photos, black-and-white copies, and elaborated and unelaborated line drawings (Experiment 1); analogous verbal descriptions of each visual version were produced for Experiment 2. In Experiment 1, two or three different versions of a scene were presented in the mixed condition; the same version of the scene was repeated either two or three times in the same condition, and a one-presentation control condition was also included. In Experiment 2, the same presentation conditions were used across different groups of subjects who either viewed the pictures or heard the descriptions. An old/new recognition test was given in which the nonstudied versions of the studied items were used as foils. Higher false recognition rates for the mixed condition were found for the visual materials in both experiments, and in the second experiment the verbal materials produced equivalently high levels of false recognition for both the same and mixed conditions. Additionally, in Experiment 2 the patterns of performance across material conditions were differentially affected by the manipulation of detail in the four stimulus versions. These differences across materials suggest that the integration of semantically consistent details across temporally separable presentations is facilitated when the stimuli do not provide visual/physical attributes to enhance discrimination of different presentations. Further, the evidence derived from the visual scenes in both experiments indicates that the semantic schema abstracted from a picture is not the sole mediator of recognition performance.
Al Zoubi, Fadi M; Eilayyan, Owis; Mayo, Nancy E; Bussières, André E
2017-10-01
The purpose of this systematic review was to investigate the extent to which the STarT Back Screening Tool (SBST) has been evaluated for (1) the quality of translation of evidence for cross-cultural adaptation and (2) the measurement properties in languages other than English. A systematic search of 8 databases, including Medline, Embase, CINAHL, PsycINFO, AMED, Scopus, PubMed, and Web of Science, was performed for the period between 2008 and December 27, 2016. We included studies related to cross-cultural adaptation, including translation and assessment of the measurement properties of SBST. Study selection, translation, methodologic and quality assessments, and data extraction were performed independently by 2 reviewers. Of the 1566 citations retrieved, 17 studies were eligible, representing 11 different SBST versions in 10 languages. The quadratic weighted κ statistics between the 2 reviewers for the translation, methodologic, and quality assessments were 0.85, 0.76, and 0.83, respectively. For translation, only 2 versions (Belgian-French and Mandarin) fulfilled all requirements. No version had been tested for all the measurement properties, and the tests that were performed were found to have been conducted inadequately. With regard to quality assessment, the included versions overall had a "Poor" total summary score, except for 2 (Persian and Swiss-German), which were rated as "Fair." Few versions fully met the standard criteria for valid translation, and none of the versions tested all the measurement properties. There is a clear need for more accurate cross-cultural adaptation of SBST and greater attention to the quality of psychometric evaluation of the adapted versions of SBST. At this time, caution is recommended when using SBST in languages other than English. Copyright © 2017. Published by Elsevier Inc.
Chromosome aberrations and cell death by ionizing radiation: Evolution of a biophysical model
NASA Astrophysics Data System (ADS)
Ballarini, Francesca; Carante, Mario P.
2016-11-01
The manuscript summarizes and discusses the various versions of a radiation damage biophysical model, implemented as a Monte Carlo simulation code, originally developed for chromosome aberrations and subsequently extended to cell death. This extended version has been called BIANCA (BIophysical ANalysis of Cell death and chromosome Aberrations). According to the basic assumptions, complex double-strand breaks (called "Cluster Lesions", or CLs) produce independent chromosome free-ends, mis-rejoining within a threshold distance d (or un-rejoining) leads to chromosome aberrations, and "lethal aberrations" (i.e., dicentrics plus rings plus large deletions) lead to clonogenic cell death. The mean number of CLs per Gy and per cell is an adjustable parameter. While in BIANCA the threshold distance d was the second parameter, in a subsequent version, called BIANCA II, d has been fixed as the mean distance between two adjacent interphase chromosome territories, and a new parameter, f, has been introduced to represent the chromosome free-end un-rejoining probability. Simulated dose-response curves for chromosome aberrations and cell survival obtained by the various model versions were compared with literature experimental data. Such comparisons provided indications on some open questions, including the role of energy deposition clustering at the nm and the μm level, the probability for a chromosome free-end to remain un-rejoined, and the relationship between chromosome aberrations and cell death. Although both BIANCA and BIANCA II provided cell survival curves in general agreement with human and hamster fibroblast survival data, BIANCA II allowed for a better reproduction of dicentrics, rings and deletions considered separately. Furthermore, the approach adopted in BIANCA II for d is more consistent with estimates reported in the literature.
After testing against aberration and survival data, BIANCA II was applied to investigate the depth-dependence of the radiation effectiveness for a proton SOBP used to treat eye melanoma in Catania, Italy. The survival of AG01522 cells at different depths was reproduced, and the survival of V79 cells was predicted. For both cell lines, the simulations also predicted yields of chromosome aberrations, some of which can be regarded as indicators of the risk to normal tissues.
Extended capability of the integrated transport analysis suite, TASK3D-a, for LHD experiment
NASA Astrophysics Data System (ADS)
Yokoyama, M.; Seki, R.; Suzuki, C.; Sato, M.; Emoto, M.; Murakami, S.; Osakabe, M.; Tsujimura, T. Ii.; Yoshimura, Y.; Ido, T.; Ogawa, K.; Satake, S.; Suzuki, Y.; Goto, T.; Ida, K.; Pablant, N.; Gates, D.; Warmer, F.; Vincenzi, P.; Simulation Reactor Research Project, Numerical; LHD Experiment Group
2017-12-01
The integrated transport analysis suite, TASK3D-a (Analysis), has been developed to enable routine whole-discharge analyses of plasmas confined in three-dimensional (3D) magnetic configurations such as the LHD. Routine dynamic energy balance analysis for NBI-heated plasmas was made possible in the first version, released in September 2012. The suite has since been extended by implementing additional modules for neoclassical transport and ECH deposition in 3D configurations. A module has also been added for creating systematic data for the International Stellarator-Heliotron Confinement and Profile Database. Improved neutral beam injection modules for multiple-ion-species plasmas and loose coupling with a large simulation code are further highlights of recent developments.
NASA Astrophysics Data System (ADS)
Leetmaa, Mikael; Skorodumova, Natalia V.
2015-11-01
We present a revised version, v1.1, of the KMCLib general framework for kinetic Monte Carlo (KMC) simulations. Random number generation in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11 random number generators: the Mersenne Twister, the 24- and 48-bit RANLUX generators, and a 'minimal standard' PRNG. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support an extended range of operating systems and compilers.
Data-driven reconstruction of directed networks
NASA Astrophysics Data System (ADS)
Hempel, Sabrina; Koseska, Aneta; Nikoloski, Zoran
2013-06-01
We investigate the properties of a recently introduced asymmetric association measure, called inner composition alignment (IOTA), aimed at inferring regulatory links (couplings). We show that the measure can be used to determine the direction of coupling, detect superfluous links, and account for autoregulation. In addition, the measure can be extended to infer the type of regulation (positive or negative). The capabilities of IOTA to correctly infer couplings together with their directionality are compared against Kendall's rank correlation for time series of different lengths, focusing particularly on biological examples. We demonstrate that an extended version of the measure, bidirectional inner composition alignment (biIOTA), increases the accuracy of network reconstruction for short time series. Finally, we discuss the applicability of the measure to infer couplings in chaotic systems.
Held, Elizabeth; Cape, Joshua; Tintle, Nathan
2016-01-01
Machine learning methods continue to show promise in the analysis of data from genetic association studies because of the high number of variables relative to the number of observations. However, few best practices exist for the application of these methods. We extend a recently proposed supervised machine learning approach for predicting disease risk from genotypes to incorporate gene expression data and rare variants. We then apply 2 versions of the approach (radial and linear support vector machines) to simulated data from Genetic Analysis Workshop 19 and compare their performance to logistic regression. Performance was not radically different across the 3 methods, although the linear support vector machine tended to show small gains in predictive ability relative to the radial support vector machine and logistic regression. Importantly, as the number of genes in the models increased, even when those genes contained causal rare variants, predictive ability decreased significantly for both the radial support vector machine and logistic regression. The linear support vector machine was more robust to the inclusion of additional genes. Further work is needed to evaluate machine learning approaches on larger samples and to evaluate the relative improvement in model prediction from the incorporation of gene expression data.